Multiple Regression Controls for use with Multiple Rows in an input data source

LegacyForum Posts: 1,669 ✭✭
edited December 2016 in SOAtest
How do I ignore arrays of elements?
I have a test that runs 4 iterations and returns arrays of data. In one case I expect to verify 5 different arrays of data, and in another case I only need to verify three. How can I configure the test to verify up to 5 arrays and ignore arrays that are not returned?

For example: I have a search test that does 4 searches. Search 1 returns 3 arrays, Search 2 returns 4 arrays, Search 3 returns 5 arrays, and Search 4 returns nothing. I am using an Excel table that has columns for the search value and the return values (up to 5 return values). I created a test, added the multiple regression controls, and mapped the data table columns to the appropriate return values. Now when I run the test, any search with fewer than 5 expected return values fails, e.g. Search 1 fails and it says "Output:Deleted XPath.....[4]" and "Output:Deleted XPath.....[5]"
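To illustrate the matching rule being asked for here, independent of SOAtest itself, the following is a minimal Python sketch of a check that compares each search's actual results against its expected-value columns while ignoring columns left blank. The table layout, values, and function name are hypothetical, purely an illustration of "verify up to 5, ignore the rest":

```python
# Sketch of the desired check (hypothetical data, not SOAtest code):
# compare each search's actual results against its expected-value
# columns, ignoring expected columns that were left blank.

def verify_row(expected, actual):
    """Pass when every non-blank expected value matches the actual
    results in order, and all remaining expected slots are blank."""
    filled = [e for e in expected if e != ""]
    return filled == actual

# Excel-style rows: search key, then up to 5 expected return values.
table = [
    ("search1", ["a", "b", "c", "", ""]),  # expects only 3 arrays
    ("search4", ["", "", "", "", ""]),     # expects nothing back
]

# Hypothetical actual responses keyed by search.
results = {"search1": ["a", "b", "c"], "search4": []}

for key, expected in table:
    assert verify_row(expected, results[key]), key
```

Under this rule a search that returns fewer than 5 arrays passes as long as the unused expected columns are blank, which is the behavior the deleted-XPath failures above are preventing.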
Comments

  • LegacyForum Posts: 1,669 ✭✭
    If you selected the Create Regression Control->Multiple Controls option, then SOAtest will create a Diff tool and use the results of the test as the expected values. The Diff tool should have a row selector labeled "Data source values used:". Clicking Form XML shows the XML view of the row currently selected in the Literal XML. Running the test while in Form XML mode causes SOAtest to check the responses from all iterations of the test run against that single regression control.

    If you set the Diff tool to Literal XML mode, then each iteration will be checked with the corresponding row in the Diff tool. We are considering adding a feature to make the Form XML check multiple iterations.
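    As a rough illustration of the difference between the two modes described above (this is only a sketch of the matching behavior, with made-up payloads, not SOAtest code):

```python
# Sketch (assumed behavior, hypothetical payloads): Literal XML mode
# checks iteration i against expected row i; Form XML mode checks
# every iteration against the one row currently selected.

expected_rows = ["<r>1</r>", "<r>2</r>", "<r>3</r>"]  # one row per iteration
responses     = ["<r>1</r>", "<r>2</r>", "<r>3</r>"]  # actual test responses

# Literal XML mode: row-by-row comparison.
literal_ok = all(e == a for e, a in zip(expected_rows, responses))

# Form XML mode: the single selected row is compared to every
# iteration, so in this sketch it only passes when all iterations
# return the same payload as the selected row.
selected_row = expected_rows[0]
form_ok = all(a == selected_row for a in responses)
```

    Here `literal_ok` is true while `form_ok` is not, which is why differing iterations need Literal XML mode today.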

    Is there a reason you are using data sources to contain the expected values instead of the Diff tool? The text in the Literal XML mode can be changed. If you put the expected values in there, then you don't need to parameterize data sources for the Diff tool.
  • LegacyForum Posts: 1,669 ✭✭
    I am using a data source instead of the Literal XML used in the Diff tool for a variety of reasons.

    1. The Diff tool is really nice; however, if someone accidentally updates the regression control for a test and the response results are not what is expected, the test might pass when it should fail. I've done this by accident numerous times already, so I can easily prevent myself from making this mistake, but anyone else accessing the test could potentially cause this problem unbeknownst to them.

    2. If I use a data source, then the source can only be updated by the tester, which prevents such accidents from happening.

    3. If I test the same operation in a different version of the web service, I can just point the tests to the new data source instead of re-verifying the Literal XML for each test.

    4. In the case where there are similar operations, like createUser and getUser, it would be beneficial to use the same data source to verify the create by calling the get. This can already be done, but if the XML output between test runs does not match the original, then I am forced to create multiple tests and recreate the regression controls for my data sources.


    In my opinion it would be awesome if the Form XML were a little more flexible when checking multiple iterations. This would save users a ton of time setting up tests, since they would not have to create multiple tests and point them all at parameters in different data tables. Right now, I feel like I am doing the same thing over and over and over, instead of doing it once.

    Thanks for your assistance. Do you know if this enhancement is slated for the next SOAtest release?
  • LegacyForum Posts: 1,669 ✭✭
    Responded to email.