Correlate in LoadRunner with different types of response body - performance-testing

While correlating in LoadRunner, the response to the same request is sometimes received in JSON format and at other times in XML. For example:
JSON response:
{"VerificationId": "xyzabc123567"}
XML response:
<VerificationId>abcuwn274637</VerificationId>
Is there any way to correlate this?

Set up two correlations into two different variables, such as VerifXML and VerifJSON; the one that fails will be empty. Set the error condition to a warning [ "NotFound=warning" ] so a failed correlation does not abort the iteration.
Then assign the non-empty capture to a standard LoadRunner variable [ lr_save_string(....) ] and continue, as in the sketch below.
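A minimal sketch of that approach; the left/right boundaries are assumptions based on the two sample responses, so adjust them to the real payloads:

// register both captures before the request; whichever format does not
// arrive will simply leave its parameter unresolved (NotFound=warning)
web_reg_save_param("VerifJSON",
    "LB=\"VerificationId\": \"", "RB=\"", "NotFound=warning", LAST);
web_reg_save_param("VerifXML",
    "LB=<VerificationId>", "RB=</VerificationId>", "NotFound=warning", LAST);

// ... the request that returns either JSON or XML goes here ...

// an unresolved parameter evaluates to its own name ("{VerifJSON}"),
// so pick whichever capture succeeded and save it under one name
if (strcmp(lr_eval_string("{VerifJSON}"), "{VerifJSON}") != 0)
    lr_save_string(lr_eval_string("{VerifJSON}"), "VerificationId");
else
    lr_save_string(lr_eval_string("{VerifXML}"), "VerificationId");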

Related

How to convert a REST API request body of String datatype into a JSON object in Node JS and test it in Postman?

I have a Node JS application running with Express and MongoDB. I have a use case where my device sends data to my server. It might be JSON data or a comma-separated string (only a string, not a CSV file). I need to check which type of data is coming in and convert it to JSON if the request body is a string. When I try to display the type of the data being sent to the server, it displays "object" even after giving a string as input. The operation succeeds, but the data is not inserted into the database. Can anyone help me resolve this issue?
A sample payload (request.body) would be:
"heat:22,humidity:36,deviceId:sb-0001"
Expected response is,
{
"heat": "22",
"humidity": "36",
"deviceId": "sb-0001"
}
@Aravind Actually, typeof returns "string" if the operand is a string. So please check whether the string is actually arriving in the body, because if it is null then typeof will return "object".
I need to check which type of data is coming and manipulate that to JSON ...
HTTP is built upon the media-type concept, which defines the syntax used in a payload of that type as well as the semantics of the elements used within that payload. You might take a look at how HTML defines forms to get a sense of what a media-type specification should include. Through content negotiation, client and server agree on a common media type, and thus a representation format supported by both for exchanging payloads. This increases interoperability between the participants, as they exchange data in well-defined, hopefully standardized, representation formats that both understand and support.
Through Accept headers a receiver can state preferences for which types to receive, including a weighting scheme to indicate that a certain representation format is preferred over another while the recipient is still fine with the alternative, whereas a Content-Type header indicates the actual representation format being sent.
RFC 7111 defines text/csv for CSV-based representations and RFC 8259 specifies application/json for JSON payloads. As the sender hopefully knows what kind of document it sends to the receiver, you can use this information to distinguish the payload on the receiver side. Note, however, that according to Fielding, true REST APIs must use representation formats that support hypertext-driven interaction flows, allowing clients to take further actions upon the received payload without having to consult external documentation. Both JSON and CSV by default don't support such an interaction model, which is abbreviated by the term HATEOAS. For your simple scenario, though, content-type negotiation might be sufficient to solve your needs.
In terms of processing CSV and converting the data to JSON, this is simply a matter of splitting the CSV string on the delimiter symbol (, in the sample) into key-value pairs, splitting each pair further on the : symbol, and adding the results to a new JSON object, as sketched below. There is also a csvtojson library available on NPM that you might use in such a case.
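For instance, a minimal Express sketch, assuming the device sets Content-Type correctly; the /readings route name is illustrative:

// accept either application/json or a comma-separated string body
const express = require('express');
const app = express();

app.use(express.json());                      // application/json -> object
app.use(express.text({ type: 'text/csv' }));  // text/csv -> plain string

app.post('/readings', (req, res) => {
  let data = req.body;
  if (typeof data === 'string') {
    // "heat:22,humidity:36,deviceId:sb-0001" -> { heat: '22', ... }
    data = Object.fromEntries(
      data.split(',').map(pair => pair.split(':').map(s => s.trim()))
    );
  }
  res.json(data); // now always a JSON object
});

app.listen(3000);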

Customizing Validate Node Error Message in OSB 12c

When we add the Validate node in OSB 12c for validating the incoming request against an XSD and the validation fails, some fault messages display the name of the field that caused the validation error. But for decimal values only, the fault message just says "Invalid decimal value" with no mention of the field the error was thrown from. Can we overcome this issue?
I am not sure this is a direct solution, but there is a workaround which may suit your need:
Create an XQuery which validates the payload and throws custom error messages.
e.g., for an XML element abc which should contain a decimal value:
(: raise a named error when abc does not hold a valid decimal; note the
   error code must be a valid QName, so it cannot contain spaces :)
if ($abc castable as xs:decimal)
then ()
else fn:error(xs:QName('YourErrorCode'), 'your error message')
This is a suitable method if the payload is small.
https://gibaholms.wordpress.com/2013/09/24/osb-throw-exception-in-xquery1
If the payload is large:
Identify the fields which tend to have this type of issue.
Create an XQuery that validates these fields with custom error messages (a sketch follows this list).
Use the Validate node inside a stage, with a stage error handler.
Validate the payload using the XQuery inside the stage error handler.
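A sketch of such a field-level check; the payload variable, the field names (amount, price), and the error code are placeholders for your own schema:

(: validate only the decimal-typed fields and name the offending one :)
declare variable $payload as element() external;

for $f in ($payload/amount, $payload/price)
return
    if ($f castable as xs:decimal) then ()
    else fn:error(xs:QName('VAL-DEC-001'),
                  concat('Invalid decimal value in field: ', local-name($f)))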

Loopback 3 discards error information on multiple validation errors, turning 422 into 500; how can I solve that?

I'm migrating from Loopback 2 to 3.
I currently have an issue with validation errors and strong-error-handler.
When I post a bulk create which results in multiple validation errors, those get returned as an array of ValidationErrors.
Those errors get grouped by strong-error-handler into a 500 Internal Server Error, which is how it was before, but the details of the errors get discarded when debug is set to false.
In my example I upload an array of tags, but for each tag a uniqueness validation is executed. When 2 or more tags are already in the database, I get an array of errors instead of a single validation error.
I need a way to determine on the client side why the validation failed, but the details of the errors are discarded now.
Am I doing something wrong here, or should this be considered a bug?
From the strong-error-handler documentation in LoopBack:
In production mode, strong-error-handler omits details from error responses to prevent leaking sensitive information:
For 5xx errors, the output contains only the status code and the status name from the HTTP specification.
For 4xx errors, the output contains the full error message (error.message) and the contents of the details property (error.details) that ValidationError typically uses to provide machine-readable details about validation problems. It also includes error.code to allow a machine-readable error code to be passed through which could be used, for example, for translation.
Am I doing something wrong here, or should this be considered as a bug?
No, this is the intended behaviour.
Safe error fields
You can mark fields such as the stack trace as safe ("safeFields") so that they are still included in responses in production.
For example, the stack field is not displayed by default if you run LoopBack in production mode.
If you still want to display the stack field, change the config in server/middleware.json:
"final:after": {
"strong-error-handler": {
"params": {
"safeFields": ["stack"]
}
}
}

How to verify whether values are updated by an API using Groovy in SoapUI

I am using SoapUI and Groovy for API automation and assertions.
I have one API which updates user profile data, i.e. it updates username, firstname, lastname, etc.
What is the best way to verify that the data was updated after running the update API? In Groovy, is there any way to store the previous data from the API response, then run the update API, check the response again, and finally compare the previous response with the latest one?
What I have tried is comparing the values I am about to send via the API with the values the API returns. If both are equal, I assume the values were updated. But this does not seem like a reliable way to check the update function.
Define a test-case-level custom property, say DEPARTMENT_NAME, with the value as needed.
Add a Script Assertion for the same request test step with the below script:
//Check if the response is received
assert context.response, 'Response is null or empty'
//Parse text to json
def json = new groovy.json.JsonSlurper().parseText(context.response)
log.info "Department name from response ${json.data.name}"
assert json.data.name == context.expand('${#TestCase#DEPARTMENT_NAME}'), 'Department name is not matched'
You may also edit the request and send ${#TestCase#DEPARTMENT_NAME} instead of the current fixed value XAPIAS Department. That way you can change the department name in a single place (the test case property), and the same value is both sent in the request and verified in the response.
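If you specifically want the store-then-compare flow from the question, a Groovy Script test step placed before the update call can stash the current value in a test case property first; the step name Get Profile and the JSON path below are illustrative:

// Groovy Script test step executed BEFORE the update API step
import groovy.json.JsonSlurper

// read the response of an earlier GET step ('Get Profile' is a placeholder)
def oldJson = new JsonSlurper().parseText(context.expand('${Get Profile#Response}'))

// stash the pre-update value as a test case property
testRunner.testCase.setPropertyValue('OLD_NAME', oldJson.data.name as String)

After the update step runs, a Script Assertion can compare json.data.name against context.expand('${#TestCase#OLD_NAME}') to confirm the value actually changed.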
Use a JDBC test step to run a query directly against the database, then use XPath assertions to validate your update API:
Assertion 1
XPath: /Results/ResultSet[1]/Row[1]/FirstName
Expected result: Updated FirstName
Assertion 2
XPath: /Results/ResultSet[1]/Row[1]/LastName
Expected result: Updated Last Name
In our project we do it this way:
First we execute all the APIs.
Then we validate all the new/updated data in the database in a DB validation test case.
This works well in a highly integrated environment as well.

Beanstream errorType values

According to their API documentation, the errorType field can return N, S, or U. I assume N = None, because that's what is returned upon success.
What do S and U mean?
From a Beanstream integration document:
The errorType response variable will indicate “U” if a form field error occurs. The errorFields variable will contain a list of fields that failed validation. errorMessage will contain descriptive text that may be displayed to customers if desired.
And:
System generated errors can be identified in a Server to Server integration by a response message “errorType=S” in the Beanstream response string. If a system generated error occurs, validate your integration and website setup.
