Is there a way to write to a text file using Karate DSL

In my Karate tests I need to write response IDs to txt files (or any other format such as JSON). I was wondering if Karate has any capability to do this; I haven't seen it in the documentation. If not, is there a simple JavaScript function to do so?

Try the karate.write(value, filename) API, but we don't encourage it. Also note that the file will be written only to the current "build" directory, which is target for Maven projects / the stand-alone JAR.
value can be any data type, and Karate will write the bytes (or plain text) out. There is no built-in support for any other format.
Here is an example.
EDIT: for others coming across this answer in the future, the right things to do are:
don't write files in the first place; you never need to do this, and this question is typically asked by inexperienced folks who for some reason think that the only way to "save" a response before validation is to write it to a file. No, please don't waste your time - just match against the response. You can save it (or parts of it) to variables while you make other HTTP requests. And do not write your tests so that scenarios (or features) depend on other scenarios; this is a very bad practice. Also note that by default, Karate will dump all HTTP requests and responses into the log file (typically target/karate.log) and also into the HTML report.
see if karate.write() works for you as per this answer
write a custom Java helper (or a JS function that uses the JVM) to do what you want using Java interop - see the sketch after this list
Also note that you can use karate.toCsv() to convert JSON into CSV if needed.
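As a sketch of the Java-interop option above: a small helper class along the lines of the hypothetical one below could append IDs to a file. The package, class name, and paths are illustrative assumptions, not part of Karate.

```java
// Hypothetical helper for appending text to a file, to be called from a
// Karate feature via Java interop. Package and class names are illustrative.
package demo.util;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class FileUtil {

    // Appends a single line to the given file, creating the file if it does not exist.
    public static void appendLine(String path, String line) throws IOException {
        Path target = Paths.get(path);
        Files.write(target, (line + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}
```

A feature file could then call it along the lines of `* def FileUtil = Java.type('demo.util.FileUtil')` followed by `* FileUtil.appendLine('target/ids.txt', response.id + '')` - but again, prefer matching against the response instead of writing files.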

My justification for writing to a file is different. I am using Karate explicitly to implement a mock. I want to expose an endpoint to which the upstream system sends some basic data as a JSON payload via POST/PUT; Karate then constructs the resulting payload file and stores it in a specific folder, and this newly created payload file is exposed through another GET call.
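For that mock scenario, one way to avoid writing files altogether is to keep the received payloads in memory and serve them back from the GET handler. A minimal, hypothetical Java helper that the mock feature could call through Java interop (the class and method names are assumptions):

```java
// Hypothetical in-memory payload store for a Karate mock, used via Java interop
// instead of writing payload files to a folder. Names are illustrative.
package demo.mock;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class PayloadStore {

    private static final Map<String, String> PAYLOADS = new ConcurrentHashMap<>();

    // Called from the POST/PUT handler: remember the payload under its id.
    public static void save(String id, String payload) {
        PAYLOADS.put(id, payload);
    }

    // Called from the GET handler: return the previously stored payload (or null).
    public static String load(String id) {
        return PAYLOADS.get(id);
    }
}
```

The POST/PUT scenario in the mock would call PayloadStore.save(id, payload) and the GET scenario would return PayloadStore.load(id), so no files ever need to be written to disk.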

Related

How to append into txt file through karate.write(value,file.txt) function? [duplicate]


Preview transformation in vscode using javascript/typescript

We are consuming APIs returning JSON in our projects. The JSON from those APIs can contain rather large structures which need to be mapped into other large structures (usually JSON, but rarely XML or CSV).
We used to use DataWeave (from MuleSoft) to do that, and if you're not familiar with DataWeave, it's pretty good at that sort of mapping. It lets you define a sample input, and while editing the DataWeave it shows you a preview of the result in a separate pane.
For some APIs we switched to using Node.js (because it offers better control and debugging than Mule, long story). But I'd really like the same mapping experience as DataWeave.
So I guess the question is: can I use vscode to define an input file in a directory, a transformation file in javascript and have the resulting mapped output display in a pane which is updated live?
Is there some plugin offering that? Couldn't find it.
My understanding is the following:
You have a Mule workflow which needs to read a file (the file you edited in VS Code) and execute server-side JavaScript (Node.js) to transform it, and after the result is obtained, the mapped result will be pushed to a web page, right?
It all happens within a given Mule workflow, and you are wondering whether there is any Mule connector to do this, right?

How to convert CSV to JSON using template via Azure Logic App

Is it possible to convert CSV to JSON in an Azure Logic App using a built-in/managed/3rd-party template, without using an Azure Function?
The links below use an Azure Function, which is generated automatically. However, I cannot find the link they mention. Ideally, no Azure Function should be required.
http://blogs.recneps.org/post/Processing-a-flat-file-with-Azure-Logic-Apps
https://social.msdn.microsoft.com/Forums/en-US/e0ea1adc-1979-44df-a4d1-52290338bc78/transform-csv-in-logic-app?forum=azurelogicapps
Below, no CSV-to-JSON transform is available.
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-liquid-transform
I will admit this is not my proudest work, but it seems to work fairly well. I was able to turn a CSV file from my OneDrive into JSON objects.
Update: fewer variables and fewer split, set, and replace actions.
Input
Output (the second object; the first object and the last one need to be purged)
How? There are a lot of steps that could possibly be removed or merged. Using split and replace actions I could single out each line and, further down, create a JSON object. I first aimed for an array, but in the end it was not that hard to turn it into a JSON object. I am not entirely sure how it behaves with null values.
This is probably not the best way to handle this. The drawbacks are that it uses a lot of actions, the first object contains the headers and needs to be removed, and there will also be a very last object that is just null (which is fine).
Entire schema: concurrency is set to 1 here.

Handling Excel Spreadsheets with Cucumber

I am planning to work on a Cucumber feature file with Groovy code (Katalon Studio) for the step definitions. I want to use an Excel file from the Cucumber feature file, or to see whether there is any other option for using it.
I have not yet tried any other option. I am thinking of just writing the Cucumber step without any parameter and then using the Excel file within the step definition, accessing the file there and getting the corresponding value.
I see there is a post in this forum suggesting the QMetry Automation Framework for this type of question, but it does not look like that will help here. Or should I pass the row index from the Cucumber file and retrieve the value based on that? Please advise.
Handling excel spreadsheets with Cucumber Scenario Outline
You should know that this is not supported by Cucumber.
As specified in the FAQ:
"We advise you not to use Excel or csv files to define your test cases; using Excel or csv files is considered an anti-pattern.
One of the goals of Cucumber is to have executable specifications. This means your feature files should contain just the right level of information to document the expected behaviour of the system. If your test cases are kept in separate files, how would you be able to read the documentation?
This also means you shouldn’t have too many details in your feature file. If you do, you might consider moving them to your step definitions or helper methods. For instance, if you have a form where you need to populate lots of different fields, you might use the Builder pattern to do so."
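If you do keep the data in a spreadsheet and pass only a row index from the feature file (as suggested in the question), the lookup can live in a step definition or helper method. Below is a minimal sketch in Java using Apache POI; Groovy step definitions can call the same classes. The file path, sheet layout, and step wording are assumptions:

```java
// Hypothetical Cucumber step definition that reads test data from an Excel
// sheet by row index using Apache POI. Path and column layout are assumptions.
import io.cucumber.java.en.Given;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

import java.io.File;

public class ExcelSteps {

    @Given("the user from row {int} of the test data sheet")
    public void loadUserFromRow(int rowIndex) throws Exception {
        try (Workbook workbook = WorkbookFactory.create(new File("src/test/resources/testdata.xlsx"))) {
            Sheet sheet = workbook.getSheetAt(0);
            Row row = sheet.getRow(rowIndex);
            String username = row.getCell(0).getStringCellValue();
            String password = row.getCell(1).getStringCellValue();
            // keep the values on a shared test context (or instance fields) for later steps
        }
    }
}
```

A Scenario Outline would then only carry row indices in its Examples table, which keeps the feature file readable while the bulky data stays in the spreadsheet.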
If you are using Cucumber-Java 5+ you can add the qaf-cucumber dependency. It should work with Groovy as well. It enables examples from external sources like CSV, XML, JSON, Excel, and databases.

p:remoteCommand to upload files

I would like to upload files programmatically to my JSF application. The user should select a directory on their system, and a JS script should loop over every file in the directory and send each one to the server-side listener.
I cannot use FileUpload because it cannot select a whole directory with thousands of files, so I was thinking of using jQuery and sending the file to a remoteCommand, but I have no clue how to send the file itself (normally I pass just strings).
so I was thinking of using jQuery and sending the file to a remoteCommand, but I have no clue how to send the file itself
Don't go there. It is a bad attempt at a workaround for a bad design choice. You'd most likely run into similar problems, and what about the user having to select a lot of files a second time if it fails halfway? It might become slow, and might run into browser limits (search for uploading multiple files in plain HTML)...
If you still want to do it via a web browser (which, according to one of your other questions, you do not want to), maybe try something like https://webdeltasync.github.io/ (disclaimer: I did not use it myself, and there might be similar ones (https://www.google.com/search?q=browser+based+rsync); it is just a hint in which direction to find a real solution).
