Some tests failing depending on where they are run

Speedy993

I have a web app for which I have created a large number of SOAtest tests. When I run the tests locally on my PC, they all run successfully. When I log onto the application server and run them there, they also run successfully. However, when they are kicked off in Jenkins via a script that runs the command line interface (with Edge in InPrivate mode), the tests fail. Sometimes a large number of them fail and sometimes just a few. When I go into SOAtest 2025.2 on the application server, I see the errors in the Console and Quality Tasks, so I know it is running the correct version of SOAtest. I can then run the tests for the application successfully. In all cases, I am launching SOAtest with the command line parameter to run Microsoft Edge in InPrivate mode.
Is there something I should look at to determine the cause of this? Is there a configuration setting to check or something else? I noticed that when it is going to fail, the tests can take a very long time to run.
Thanks!

Answers

  • benken_parasoft

    You would have to make sense of the failure messages themselves. I would look at the first test failure that happened in the run, study the error message to understand what happened, and then consider what may have caused it and how to resolve it. There are a lot of different things that can cause tests to fail, including unstable locators, insufficient wait conditions, or inconsistent behavior of the web application itself. When elements are not found on the page, the tests take longer to execute because it can take a while for wait conditions to time out.

    One thing that can be useful for troubleshooting failures is enabling screenshots in the HTML report. You can do this by passing -J-Dcom.parasoft.tool.screenshot.Location=<embed> to soatestcli. See "Screenshots on failure for browser tests using Selenium WebDriver" in the SOAtest documentation.
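
    For reference, a minimal sketch of what such a soatestcli call could look like (the workspace path, resource path, and test configuration name are placeholders rather than values from this thread, and the screenshot property is quoted so a Windows shell does not treat the angle brackets as redirection):

    soatestcli -data C:/workspace ^
        -resource /MyProject/MyTests.tst ^
        -config "user://Example Configuration" ^
        -report C:/reports/devreport.htm ^
        "-J-Dcom.parasoft.tool.screenshot.Location=<embed>"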

  • Speedy993

    The errors that occur seem to be somewhat random. When running the tests via the command line, the actual steps of the tests that fail seem to change with each run. Sometimes all of the tests run fine, but most of the time there are errors. They are mostly locator errors, but not always. I am using the Microsoft Edge browser.
    I am also having trouble getting the screenshots of the failures to appear in the HTML report. Among the parameters I pass when calling soatestcli.exe are:
    -J-Dcom.parasoft.tool.screenshot.Location=<embed>

    -J-Dcom.parasoft.browser.BrowserPropertyOptions.EDGE_ARGUMENTS=-inprivate

    -report C:/workspace/Doc_Tests/build/reports/soatest/DocTest/devreport.htm
    Could this '-report' be causing the lack of screenshots?

    Also, many of the errors are this now: (Edge 142.0.3595.53) User action failed: Edge test framework did not start properly. I have made sure that the versions of Edge and the Edge web driver match. Is there something else that may be causing this error? Should I open a new discussion for this question?

    Thanks!

  • benken_parasoft

    I am also having trouble getting the screenshots of the failures to appear in the HTML report

    Aside from -J-Dcom.parasoft.tool.screenshot.Location=<embed> you should also pass -property report.test_suites_only=false. If you are already passing a -settings argument then you can instead put report.test_suites_only=false directly in your settings file.
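
    If you go the settings-file route, a minimal sketch of the relevant entry (the file name and path are placeholders; settings files use plain Java-properties syntax):

    # settings.properties, passed via: soatestcli -settings C:/path/to/settings.properties
    # Include per-test details (not only test suite summaries) in the report so that
    # embedded screenshots for failing tests appear in the HTML report.
    report.test_suites_only=false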

    They are mostly locator errors, but not always

    Screenshots should help you understand why elements are not being found on the page. The page may not be showing the expected content, for example if the web application is showing an error or if the page is loading extra slowly.

    Also, many of the errors are this now: (Edge 142.0.3595.53) User action failed: Edge test framework did not start properly

    Something may have prevented the browser from starting, or prevented it from starting in a timely manner. Low system resources or poor disk performance can be culprits. You would have to observe the system as it is running the tests for clues (Task Manager). Windows machines can be unstable because of background activity consuming resources at unexpected times, like downloading and installing updates in the background. Some malware/antivirus scanners can also significantly affect overall system performance. If this is a virtual machine, it may not be configured for optimal performance, resulting in poor disk and network performance.

    As an aside, screenshots wouldn't help for this type of error. The screenshot is something returned by the browser. If the browser failed to start then there is no screenshot to take.

    I have made sure that the versions of Edge and the Edge web driver match

    In recent versions of SOAtest, webdrivers are downloaded from the Internet automatically. This is a feature of Selenium, which SOAtest uses to drive browser playback. If you have been copying an msedgedriver executable into the SOAtest installation, you can consider omitting this step.

  • Speedy993

    Thanks benken_parasoft! I was not aware that SOAtest was checking for updates. Is this something that is configurable as to how often it checks for an update? Just curious as to how it works.

    Anyway, a little more on the issues I am currently having. The errors that occur during the regression test runs had been happening to one of the several applications we test, but just yesterday they happened to another application as well. As I mentioned above, they are seemingly random locator errors in tests that ran without issue the last time they were executed, and they change with each run. The majority of the errors, and sometimes the only errors, are the framework error: User action failed: Edge test framework did not start properly. One thing I had not mentioned earlier is that SOAtest is run from the command line via an ANT script on a server running Windows Server 2019. We will be updating this to Windows Server 2025 at some point in the not-too-distant future, but budget constraints have resulted in this delay. Is there a version of Windows that is not supported for SOAtest? I am getting screenshots when the locator errors occur now, so that parameter is working correctly.

    And just to add a little fun to this discussion, I just got the Edge framework error running SOAtest on my development virtual computer. This is on a Citrix VM running Windows 11 (updated in-place from Windows 10). This is the first time I have seen this error running SOAtest on my development VM. Further runs of these tests have not resulted in the Edge error again. I don't know if the virtual computer configuration may have resulted in Windows being unstable or low on resources (1 TB virtual disk, 16 GB virtual RAM). I am continuing to try different things to resolve this, but if you have any other suggestions, I will certainly try them, if I am able. I have no real control over how the Windows Server is configured but may be able to request some changes.
    Thanks!

  • benken_parasoft

    Thanks benken_parasoft! I was not aware that SOAtest was checking for updates. Is this something that is configurable as to how often it checks for an update? Just curious as to how it works.

    I was referring to the Windows Operating System and Windows-related software components, not Parasoft tooling. Microsoft's "Patch Tuesday" is a good example, where Windows machines everywhere start downloading and installing updates in the background, slowing down the system and contributing to instability of any automated tests that you may be running on the machine at the time.

    Is there a version of Windows that is not supported for SOAtest?

    We currently support Windows Server 2022 and 2025. However, the errors you describe suggest performance issues on the host where you are running the tests, where Edge isn't starting or is taking too long to start. It is msedgedriver returning an error back to Selenium, which SOAtest uses for browser playback.

    I am getting screenshots when the locator errors occur now so that parameter is working correctly.

    If you are getting an error about an element not being found within some amount of time then the locator that's configured in the test indeed didn't work. If you see the element on the page in the screenshot then the locator is likely unstable, where it may not be sufficient to uniquely or consistently identify the element.

    Two of the most common issues with UI testing are unstable locators and insufficient wait conditions. Either the locator needs to be improved or the wait conditions need to be improved, sometimes both. It is common for testers to write UI tests, watch them pass on their local machine, then run them on a slower machine somewhere else and see them fail, or fail intermittently. If the machine is having performance issues, or is just performing the actions in the browser too slowly, then you may need to adjust the wait conditions, like increasing the maximum wait time or adding additional wait conditions.

    just got the Edge framework error running SOAtest on my development virtual computer

    Virtual machines can perform very poorly for a variety of different reasons.

  • Speedy993

    One more thing I apparently forgot to include is this warning message I am getting when the tests run:
    org.openqa.selenium.devtools.CdpVersionFinder findNearestMatch
    [exec] WARNING: Unable to find an exact match for CDP version 142, returning the closest version; found: 138; Please update to a Selenium version that supports CDP version 142

    I suspect that this warning is causing the locator errors, where the application is not at the page it should be when the particular test step is executed.
    Is this complaining about Chrome or Edge or both? Edge and its webdriver are both current. Does the Selenium webdriver need to be updated separately? How is this updated? Is there a jar file or something that I can update?
    Thanks for your assistance and patience with me in this discussion.

  • tony

    I suspect that this warning is causing the locator errors, where the application is not at the page it should be when the particular test step is executed.
    Is this complaining about Chrome or Edge or both? Edge and its webdriver are both current. Does the Selenium webdriver need to be updated separately? How is this updated? Is there a jar file or something that I can update?

    This message does not affect locators. It indicates that the Selenium version in SOAtest is not up-to-date with the latest browser release. This happens frequently because of the rapid pace of browser releases. However, the APIs used by Selenium WebDriver to communicate with the browsers are very stable by design, so this does not cause issues. It is separate from the webdriver executables you may have updated and installed, and it cannot be updated without updating SOAtest.

    From looking at the conversation in this thread, it looks like the tests are failing due to running in a slower environment. This can usually be handled by adjusting wait times. You can adjust default wait times under Parasoft > Preferences > Browser > Wait Condition Default Timeout Settings. You can also adjust individual wait times at steps that seem most affected by adjusting the settings under the "Wait Conditions" tab of the Browser Playback Tool.

    You can also try disabling the fetching of browser contents, which can add some overhead to browser playback tests, by passing -J-Dwebtool.browsercontroller.webdriver.thirdparty.GeneralOptions.BROWSER_CONTENTS_ENABLED=false
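
    Putting the flags mentioned in this thread together, a rough sketch of a complete soatestcli call might look like the following (the workspace, resource, and test configuration values are placeholders, and the screenshot property is quoted so a Windows shell does not treat the angle brackets as redirection):

    soatestcli -data C:/workspace ^
        -resource /Doc_Tests ^
        -config "user://Example Configuration" ^
        -report C:/workspace/Doc_Tests/build/reports/soatest/DocTest/devreport.htm ^
        -property report.test_suites_only=false ^
        "-J-Dcom.parasoft.tool.screenshot.Location=<embed>" ^
        -J-Dcom.parasoft.browser.BrowserPropertyOptions.EDGE_ARGUMENTS=-inprivate ^
        -J-Dwebtool.browsercontroller.webdriver.thirdparty.GeneralOptions.BROWSER_CONTENTS_ENABLED=false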

    If certain test suites are particularly error prone, you can also try configuring the parent test suite to retry the suite after a failure. This can be configured by double-clicking the test suite > Execution Options > Test Flow Logic and changing the Flow type to "While pass/fail." Specify the maximum number of times you want to retry under "Maximum number of loops" and set "Loop until" to "Every test" "Succeeds."

    You might also consider what factors may be causing the tests to run slowly, such as competing processes on the virtual machine or virtual machine host. As mentioned earlier in the thread, virtual machines typically run more slowly than non-virtual machines.