Getting all classes in a Python project and assigning dummy values to each attribute depending on the field type, for test purposes - python-3.x

I have a FastAPI application that I need to test with test data. I already have a shell script that loads the data from a JSON file and runs the tests, and it works perfectly; I just need a way to derive the JSON test file automatically. The JSON file contains something like this:
{
  "name": "registration-test",
  "endpoint": "/user/register",
  "method": "post",
  "request": {
    "name": "hello ",
    "email": "foo334#gnuze.org",
    "password": "123456"
  },
  "expression": "email",
  "expected": "foo334#gnuze.org"
}
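(For context, this is not the poster's shell script, just a minimal Python illustration of how a test case in that shape might be consumed; the app import path, the file name, and the dotted-path reading of "expression" are assumptions.)

import json
from fastapi.testclient import TestClient
from app.main import app          # hypothetical import path for the FastAPI app

def run_case(path: str) -> None:
    with open(path) as f:
        case = json.load(f)
    client = TestClient(app)
    response = client.request(case["method"], case["endpoint"], json=case["request"])
    # "expression" is treated here as a key (or dotted path) into the response body.
    value = response.json()
    for key in case["expression"].split("."):
        value = value[key]
    assert value == case["expected"], f'{case["name"]}: got {value!r}'

run_case("registration-test.json")     # hypothetical file name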
To generate the request data, I need a way to find all the classes in the project, assign dummy values to their fields, and return the result in JSON format.
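One way to sketch this, assuming the request bodies are Pydantic models (as is usual in FastAPI); the package name "app", the DUMMIES table, and the direct-subclass lookup are all assumptions to adapt, and nested models or inherited fields are not handled here:

import importlib
import json
import pkgutil
from typing import get_type_hints

from pydantic import BaseModel

# Dummy value per field type; anything not listed falls back to None.
DUMMIES = {str: "hello", int: 1, float: 1.0, bool: True}

def import_all(package_name: str) -> None:
    """Import every module in the package so all model classes get defined."""
    package = importlib.import_module(package_name)
    for info in pkgutil.walk_packages(package.__path__, package.__name__ + "."):
        importlib.import_module(info.name)

def dummy_payload(model: type) -> dict:
    """Map each field declared on the model to a dummy value by its type."""
    hints = get_type_hints(model)
    declared = getattr(model, "__annotations__", {})
    return {name: DUMMIES.get(hints.get(name)) for name in declared}

import_all("app")                          # hypothetical package name
for model in BaseModel.__subclasses__():   # direct subclasses of BaseModel only
    print(model.__name__, json.dumps(dummy_payload(model)))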

Related

Logic App - Expression to get a particular substring from a json

I have output from one of the tasks in a Logic App:
{
  "headers": {
    "Connection": "close",
    "Content-Type": "application/json"
  },
  "body": {
    "systemAlertId": "....",
    "endTimeUtc": null,
    "entities": [
      {
        "$id": "us_1",
        "hostName": "...",
        "azureID": "someID",
        "type": "host"
      },
      {
        "$id": "us_2",
        "address": "fwdedwedwedwed",
        "location": {
          "countryCode": ""
        },
        "type": "ip"
      }
    ]
  }
}
I need to initialize a variable named resourceID that contains the value someID, which is read from the example above.
The value someID will always be found in the first member of the entities array, so I guess I need to use the first() function.
Any idea what the expression for the Initialize variable action should look like?
Thanks
Considering the data you are receiving from the HTTP trigger, I used a Parse JSON action to get at the inner values of that JSON. Here is how you can do it:
Now you can initialize resourceID using the 'Initialize variable' connector and set its value to azureID as per your requirement.
Have a look at the Parse JSON action.
To reference or access properties in JavaScript Object Notation (JSON) content, you can create user-friendly fields or tokens for those properties by using the Parse JSON action. That way, you can select those properties from the dynamic content list when you specify inputs for your logic app. For this action, you can either provide a JSON schema or generate a JSON schema from your sample JSON content or payload.
With the information in the JSON available in an object, you can more easily access it.
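If it helps, once a Parse JSON action (assumed here to be named 'Parse_JSON') has run over that payload, an Initialize variable expression along the lines of first(body('Parse_JSON')?['body']?['entities'])?['azureID'] should pull someID out of the first entity; drop the ['body'] segment if your Parse JSON schema covers only the body. The action name and property path are assumptions, so adjust them to match your workflow.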

Convert to JSON and map to new JSON object in Alteryx

I am using Alteryx to take an Excel file and convert it to JSON. The JSON output I'm getting looks different from what I expected, and the objects start with "JSON":, which I don't want. I would also like to know which components I would use to map fields to specific JSON fields instead of key-value pairs, if I need to later in the flow.
I have attached my sample workflow and Excel file, which are:
Excel screenshot
Alteryx test flow
JSON output I am seeing:
[
  {
    "JSON": "{\"email\":\"test123#test.com\",\"startdate\":\"2020-12-01\",\"isEnabled\":\"0\",\"status\":\"active\"}"
  },
  {
    "JSON": "{\"email\":\"myemail#emails.com\",\"startdate\":\"2020-12-02\",\"isEnabled\":\"1\",\"status\":\"active\"}"
  }
]
What I expected:
[
  {
    "email": "test123#test.com",
    "startdate": "2020-12-01",
    "isEnabled": "0",
    "status": "active"
  },
  {
    "email": "myemail#emails.com",
    "startdate": "2020-12-02",
    "isEnabled": "1",
    "status": "active"
  }
]
Also, which component would I use if I wanted to map the structure above to another JSON structure similar to this one:
[
  {
    "name": "MyName",
    "accounType": "array",
    "contactDetails": {
      "email": "test123#test.com",
      "startDate": "2020-12-01"
    }
  }
]
Thanks
In the workflow you have built, you are essentially creating the JSON twice. The JSON Build tool creates the JSON structure, so if you then want to output it, select your file to output and change the dropdown to CSV with delimiter \0 and no headers.
Alternatively, try putting an Output Tool straight after your Excel file and outputting to JSON; the Output Tool will build the JSON for you.
In answer to your second question, build the JSON for contact details first as a field (remember to rename JSON to contactDetails), then build from there with one of the options above.
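Alteryx components aside, here is a rough Python/pandas sketch of the same two reshapings, just to make the target structures concrete; the column names are taken from the JSON shown above, while the file name and the literal "MyName" mapping are placeholders.

import json
import pandas as pd

# Read the spreadsheet; dtype=str keeps values like "0"/"1" as strings,
# matching the expected output above.
df = pd.read_excel("input.xlsx", dtype=str)   # hypothetical file name

# Flat structure the question expected: one JSON object per row.
flat = df.to_dict(orient="records")

# Nested structure from the second part of the question.
nested = [
    {
        "name": "MyName",                 # placeholder value
        "accounType": "array",
        "contactDetails": {
            "email": row["email"],
            "startDate": row["startdate"],
        },
    }
    for row in flat
]

print(json.dumps(flat, indent=2))
print(json.dumps(nested, indent=2))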

How can I write or push json to a json file?

So I am building a React app, where I have to create a JSON structure and then add it to a JSON file.
Here is the structure I have to build:
{
  "name": {
    "label": "Name",
    "type": "text",
    "operators": ["equal", "not_equal"],
    "defaultOperator": "not_equal"
  },
  "age": {
    "label": "Age",
    "type": "number",
    "operators": [
      "equal",
      "not_equal",
      "less",
      "less_or_equal",
      "greater",
      "greater_or_equal",
      "between",
      "not_between",
      "is_empty",
      "is_not_empty"
    ]
  },
  "gender": {
    "label": "Gender",
    "type": "select",
    "listValues": {
      "male": "Male",
      "female": "Female"
    }
  }
}
After finishing the JSON structure, I want to push it to a JSON file, which is the configuration file for a React library (react-awesome-query-builder). How can I write to a JSON file using JS?
I know that I can use Node.js and fs for this, but I am not sure how to use that in React. Perhaps there is a library I can use to do this?
Can someone point me in the right direction?
You can write it exactly as a traditional text file (because it is basically text). Just don't forget to give your file the .json extension!
It is only when you open it afterwards that you will have to parse it as JSON, which is well handled by plenty of libraries.
Well... I guess I figured it out. Instead of using an external JSON file, I kept the config inside the app as a variable and it works. No need to use read/write from Node.js or any of that.

JSONata - JSON to JSON Transformation in Nodejs API

I need to write a REST API in Node.js for JSON-to-JSON transformation.
There are many libraries, and I shortlisted JSONata.
Please find a simple JSONata sample here.
The challenge is that the API receives JSON containing both data and map, but JSONata requires the map values without quotes.
{
  "data": {
    "title": "title1",
    "description": "description1",
    "blog": "This is a blog.",
    "date": "11/4/2013"
  },
  "map": {
    "name": "title",
    "info": "description",
    "data": {
      "text": "blog",
      "date": "date"
    }
  }
}
But the map object expected by JSONata looks like the one below:
{
  "name": title,
  "info": description,
  "some": {
    "text": blog,
    "date": date
  }
}
In the JSON above, the keys are in quotes but the values are not.
Please find the NodeJS API code.
app.post('/JSONTransform', function (req, res, next) {
  const data = req.body.data;
  // Note: jsonata() expects the transformation expression as a string;
  // if req.body.map arrives as a parsed JSON object, this call will fail.
  const map = req.body.map;
  const expression = jsonata(map);
  const result = expression.evaluate(data);
  res.send(result);
});
I could write a simple function to remove the quotes, but the map above is only a simple example: it can have any number of child objects, and the values may contain special characters, including quotes.
I would prefer an npm library or a standard way to remove the quotes, or a way to configure JSONata to accept quoted values.
I would also appreciate suggestions for any other library or option.
This Node.js API is called from an ASP.NET Core Web API, which gets the data and map from a database and passes them as a single JSON to the Node.js API.
Please suggest a solution to this problem or the best alternative.
Thanks
Raj
I found a solution to this problem.
Pass a single JSON that has both data and map. Since the map is not valid JSON, I made the entire map a string and escaped the double quotes inside it.
Please find the sample.
{
  "map": "{ \"name\": title, \"info\": description, \"data\": { \"text\": blog, \"date\": date }}",
  "data": {
    "title": "title1",
    "description": "description1",
    "blog": "This is a blog.",
    "date": "11/4/2013"
  }
}

Azure Logic App: How to save HTTP Connector's body content to OneDrive file?

I have a simple Azure Logic App with the following components:
Recurrence
HTTP Get from HTTPS url
I've tried to configure the next component to save the HTTP response body to OneDrive with OneDrive Connector configured as follows:
FilePath: ApiTest/test.json
Content: #{body('http')}
Content Transfer Encoding: None
This gives the following error:
{"code":"InvalidTemplate","message":"Unable to process template language expressions in action 'microsoftonedriveconnector' inputs at line '1' and column '11': 'Template language expression cannot be evaluated: one of string interpolation segment value has unsupported type 'Object'. Please convert the value to string using the 'string()' function.'."}
If I then use #{string(body('http'))} I get:
{"code":"InvalidTemplate","message":"Unable to process template language expressions in action 'microsoftonedriveconnector' inputs at line '1' and column '11': 'The template language function 'string' was invoked with an invalid parameter. The value cannot be converted to the target type.'."}
How can I use the body of HTTP Connector and save it to One Drive?
I don't have an answer, but I am wrestling with this very issue right now. I will post if I find a solution, and I would be very interested if anyone else finds one. I do find one thing interesting: there is a header and body option on the consuming end of my HTTP Connector, which works when it runs. The error (as the above poster notes) occurs when passing that value to the next card. But when I look at the output link on the run tab, I see a JSON value for the header and for the body. Does this need to be wrapped in a JSON parser?
This works for me just fine and generated the file at /random/test.txt; however, I believe the problem in your case is that you are downloading a JSON file, which causes it to be interpreted as an object by the engine.
I'll follow up with the team and maybe we need a "ToJson" call, or a way to "Passthrough", or make "String" more "flexible" (though that could be weird).
"fileContent": {
"Content": "#{body('http')}",
"ContentTransferEncoding": "None"
},
The actions in code view look like:
"http": {
"type": "Http",
"inputs": {
"method": "GET",
"uri": "http://www.carlosag.net/"
},
"conditions": []
},
"microsoftonedriveconnector": {
"type": "ApiApp",
"inputs": {
"apiVersion": "2015-01-14",
"host": {
"id": "/subscriptions/xxx/resourceGroups/zzz/providers/Microsoft.AppService/apiApps/MicrosoftOneDriveConnector",
"gateway": "https://yyy.azurewebsites.net"
},
"operation": "UploadFile",
"parameters": {
"filePath": "random/test.json",
"fileContent": {
"Content": "#{body('http')}",
"ContentTransferEncoding": "None"
},
"overwrite": true
},
"authentication": {
"type": "Raw",
"scheme": "Zumo",
"parameter": "#parameters('/subscriptions/...')"
}
},
You should try "#body('http')". I believe this will work. "#{body('http')}" is a form of string interpolation: the expected output value is a string, not JSON.
I initialized a String variable 'Content' and set the value to the body of the HTTP.
#{body('HTTP')}
And then I convert the String to JSON to read the desired node with an expression.
json(variables('Content'))?['ExcelFile']
I went a step further and base64-decoded it to binary in order to upload to SharePoint. Going to OneDrive would likely be the same.
base64ToBinary(json(variables('Content'))?['ExcelFile'])
