For automation purposes, I need to find a reporter for Newman which displays the following: Request headers, request body, status code, and error message.
Can anyone recommend a decent reporter? Even if it does not display everything I mentioned above, I just need a reporter with more info than simply "your test failed".
It needs to be in one of the following formats so I can use "Publish Test Results".
I LOVE newman-reporter-htmlextra. The layout gives you everything you could ask for, and you can even provide a custom stylesheet if you really want to.
https://www.npmjs.com/package/newman-reporter-htmlextra
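If you drive Newman from Node.js, a minimal sketch of wiring up htmlextra alongside the built-in junit reporter (whose XML output is the kind of file "Publish Test Results" consumes) could look like the following; the collection, environment, and output paths are placeholders:
// run-collection.js -- sketch of a Newman run with the htmlextra and junit reporters
const newman = require('newman');
newman.run({
  collection: require('./collection.json'),       // your exported Postman collection (placeholder path)
  environment: require('./environment.json'),     // optional environment file (placeholder path)
  reporters: ['cli', 'htmlextra', 'junit'],
  reporter: {
    htmlextra: { export: './newman/report.html' }, // rich HTML report with request/response detail
    junit: { export: './newman/results.xml' }      // JUnit XML suitable for "Publish Test Results"
  }
}, function (err) {
  if (err) { throw err; }
  console.log('Collection run complete.');
});
The equivalent CLI flags are --reporter-htmlextra-export and --reporter-junit-export if you prefer to call newman run directly.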
I am already using error-level logging in logback-test.xml, and I am also using a log modifier to hide secret keys in the request and response.
Everything in the Karate report looks good and all data is masked as expected; for example, the "Authorization" header is masked in the Karate report (its "authToken" value is passed in from karate-config.js). The same header is masked in the Cucumber report too, but the Cucumber report shows additional information: it displays the contents of karate-config.js, which contains all the data. What can I do to hide this information in the Cucumber report?
I have tried the configuration below as well; it stops the request and response from being displayed, but it still prints all the contents of karate-config.js in the Cucumber report.
This only happens for the first feature file: when the project executes, it displays the contents of karate-config.js.
// Configure reports to not show raw HTTP requests/responses, and to skip non-BDD (asterisk) steps
karate.configure('report', { showLog: false, showAllSteps: false })
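For context, the masking described in the question is usually wired up in karate-config.js next to the report settings. A minimal sketch, assuming a hypothetical Java class my.project.MaskingLogModifier that implements Karate's HttpLogModifier and exposes a singleton INSTANCE field:
// karate-config.js (sketch) -- report settings plus a log modifier for masking
function fn() {
  var config = { baseUrl: 'https://example.com' }; // placeholder project config
  // hide raw HTTP traffic and non-BDD (asterisk) steps in the Karate HTML report
  karate.configure('report', { showLog: false, showAllSteps: false });
  // register the log modifier so sensitive headers are masked in the logs;
  // my.project.MaskingLogModifier is a placeholder for your own implementation
  var LM = Java.type('my.project.MaskingLogModifier');
  karate.configure('logModifier', LM.INSTANCE);
  return config;
}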
EDITED
Thanks Peter, at least now I understand the issue. We have to call one feature file from karate-config.js only once for the whole project. To do that, we are doing the following:
var sample = karate.callSingle('classpath:sample/test.feature#test1',config);
Since we are calling this from karate-config.js, the Cucumber report shows the complete list of variables and then calls this feature file once. These variables contain very sensitive data which we can't show in the report. I tried adding the @report=false annotation to the scenario in test.feature#test1, but the report still shows the variable list. Because we pass the arguments as config in callSingle, the list of variables appears in the report. Please guide me on how to tackle this issue.
Please read the section on "Log Masking Caveats". If you use a call in any form, parameters will be printed to the log by default.
https://github.com/karatelabs/karate#log-masking-caveats
Note that you can call a feature file that has the @report=false annotation, and that might be the easy solution.
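A rough sketch of what that could look like, using a hypothetical wrapper feature; whether this fully suppresses the callSingle argument dump may depend on the fix tracked in the issue below:
// karate-config.js (sketch) -- point callSingle at a thin wrapper feature
// instead of calling the sensitive set-up feature directly
var sample = karate.callSingle('classpath:sample/mask-wrapper.feature', config);
// mask-wrapper.feature (sketch, shown here as a comment since it is Gherkin):
//
//   Feature: wrapper that keeps the sensitive call out of the report
//
//   @report=false
//   Scenario: delegate to the real set-up feature
//     * call read('classpath:sample/test.feature')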
EDIT: A feature request has been logged: https://github.com/karatelabs/karate/issues/1837
The link above also provides a workaround you can use until the feature is released.
I want to have an option on the cucumber report to mute/hide scenarios with a given tag from the results and numbers.
We have a Bamboo build that runs our Karate repository of features and scenarios. At the end it produces nice Cucumber HTML reports. On the "overview-features.html" page, I would like an option added to the top right (alongside "Features", "Tags", "Steps" and "Failures") that says "Excluded Fails" or something like that. When clicked, it would provide exactly the same information that overview-features.html does, except that any scenario tagged with a special tag, for example @bug=abc-12345, is removed from the report and excluded from the numbers.
Why do I need this? We have some existing scenarios that fail. They fail due to defects in our own software that might not get fixed for 6 months to a year. We've tagged them with a specific tag, "@bug=abc-12345". I want them muted/excluded from the Cucumber report that's produced at the end of the Bamboo build for Karate, so I can quickly look at the number of passed features/scenarios and see whether it's 100% or not. If it is, great, that build is good. If not, I need to look into it further as we appear to have some regression. Without excluding these scenarios, which are expected to fail and will continue to fail until they're resolved, it is very tedious and time-consuming to go through all the individual feature file reports, look at the failing scenarios, and then investigate why. I don't want them removed completely, because when they start to pass I need to know so I can go back and remove the tag from the scenario.
Any ideas on how to accomplish this?
Karate 1.0 has overhauled the reporting system with the following key changes:
after the Runner completes you can massage the results and even re-try some tests
you can inject a custom HTML report renderer
This will require you to get into the details (some of this is not documented yet) and write some Java code. If that is not an option, you have to consider that what you are asking for is not supported by Karate.
If you are willing to go down that path, here are the links you need to get started.
a) Example of how to "post process" result-data before rendering a report: RetryTest.java and also see https://stackoverflow.com/a/67971681/143475
b) The code responsible for "pluggable" reports, where you can implement a new SuiteReports in theory. And in the Runner, there is a suiteReports() method you can call to provide your implementation.
Also note that there is an experimental "doc" keyword, by which you can inject custom HTML into a test-report: https://twitter.com/getkarate/status/1338892932691070976
Also see: https://twitter.com/KarateDSL/status/1427638609578967047
I need a sample file for the latest scenarios of AATS testing. We generated the code for this and want to check it against an accepted one.
"Latest scenario" is always changing. More importantly, why don't you just submit your XML and see what response you get from the IRS? If they accept it, there's no need to do anything else. If they reject it, they will indicate where it failed. Either way, there's no need to see anybody else's XML.
You can find the current scenarios and answer keys for the scenarios for AATS testing at the following link:
https://www.irs.gov/for-tax-pros/software-developers/information-returns/affordable-care-act-assurance-testing-system-information
Edit:
Be sure to review the "AATS Scenario Known Issues and Solutions" link. As of this post, that document was last updated on 4/26/2016 and mentions some issues that have been reported and solutions for them.
I am refactoring large sets of tests in SoapUI.
Is there a way to automate creation and renaming of test cases/test steps through Groovy?
Thanks.
Possibly not what you're looking for, but I've had some success manually editing the test suite XML using find and replace in a text editor. You need to be careful and make sure to back up a copy first.
You can run Groovy code within SoapUI using the SoapUI Groovy Console plugin. This way, you can change any property you want programmatically (within the constraints of the API, of course).
As for technical details on how to solve your actual problem, I can only refer you to this blog post and SoapUI's javadoc. Based on the blog post, you need to figure out what's given to you, and based on the API, you need to figure out how to achieve what you need.
In my case, I started with my project being bound as a project variable, and moved on from there.
There is really not much information in the question about what you need to achieve, but given the little you provided, one way to go would be to directly modify the SoapUI project XML file. I have done this with some success in the past. The last time I used SoapUI, its UI did not come with extensive refactoring functionality.