POST request with encrypted JSON data (several keys) and one image file as multipart, Android - Retrofit2

I have to pass this JSON to the server. The encrypted params will be created after building the JSON, and it will return the same JSON with three parameters, but I am not able to do it with image or file data.
After encrypting the data, these params (param1, param2, param3) are automatically created.
I tried using @PartMap with a Map, @Part, and @Multipart.
{
  "fullname": "hello",
  "dob": "20-12-1992",
  "locale": {
    "language": "en",
    "version": 2
  },
  "media": {
    "media": {
      "name": "",
      "size": ""
    }
  },
  "param1": "",
  "param2": "",
  "param3": ""
}

Related

Getting all classes in a Python project and assigning dummy values to each attribute depending on the field type for test purposes

I have a FastAPI application. I need to test it using test data. I have a shell script that loads the data from a JSON file and runs the tests, and it works perfectly. I just need a way to automatically derive the JSON test file. The JSON file contains something like this:
{
  "name": "registration-test",
  "endpoint": "/user/register",
  "method": "post",
  "request": {
    "name": "hello ",
    "email": "foo334#gnuze.org",
    "password": "123456"
  },
  "expression": "email",
  "expected": "foo334#gnuze.org"
}
To get the request data, I need a way to get all the classes in the project, assign dummy values to their attributes, and return them in JSON format.

How to insert an image into CouchDB

I'm trying to figure out how to insert an image into CouchDB using the node-CouchDB library found here: https://www.npmjs.com/package/node-couchdb
Here's what I've done:
const fs = require('fs');

fs.readFile('download.jpeg', (err, data) => {
  // data is already a Buffer containing the raw image bytes
  const binary_data = Buffer.from(data);
  couch.insertAttachment("node_db", doc_number, "download.jpeg", binary_data, rev_number)
    .then(({data, headers, status}) => {
    }, err => {
      console.log("ERROR" + err.code);
    });
});
The result is that CouchDB stores this in the document format like such:
{
  "_id": "2741d6f37d61d6bbdf63df3be5000504",
  "_rev": "22-bfdbe6db35c7d9873a2cc8a38afb2833",
  "_attachments": {
    "attachment": {
      "content_type": "application/json",
      "revpos": 22,
      "digest": "md5-on0A+d7045WPI6FyS1ut4g==",
      "length": 22482,
      "stub": true
    }
  }
}
//This is what the data looks like in CouchDB using the View Attachment Function through the interface:
{"type":"Buffer","data":[255,216,255,224,0,16,74,70,73,70,0,1,1,0,0,1,0,1,0,0,255,219,0,132,0,9,6,7,18,18,18,21,18,19,19,22,21,21,23,23,23,24,21,21,21,23,23,21,21,24,21,21,21,23,22,22,21,21,22,24,29,40,32,24,26,37,29,21,21,33,49,33,37,41,43,46,46,46,23,31,51,56,51,45,55,40,45,46,43,1,10,10,10,14,13,14,26,16,16,26,45,37,29,37,45,45,45,45,45,45,45,241,...]
I then tried changing the Content-Type attribute to "image/jpeg" in the header of the request resulting in:
{
  "_id": "2741d6f37d61d6bbdf63df3be5000504",
  "_rev": "23-cf8c2076b43082fdfe605cad68ef2355",
  "_attachments": {
    "attachment": {
      "content_type": "image/jpeg",
      "revpos": 23,
      "digest": "md5-SaekQP37DCCeGX2M8UVeGQ==",
      "length": 22482,
      "stub": true
    }
  }
}
However, this still results in an image that isn't viewable from the CouchDB interface (clicking View Attachments). The image in this case is only 6,904 bytes, but it's being stored with a length of ~22k (inflating the size in CouchDB), so I'm assuming I'm not passing the correct representation (encoding) of the image to CouchDB.
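As a side note on the size mismatch: a Node.js Buffer that gets JSON-serialized turns every raw byte into decimal text, which is exactly the shape of the "View Attachment" dump above. A quick illustration (not the library's internals):

const buf = Buffer.from([255, 216, 255, 224]); // the first few bytes of a JPEG
console.log(JSON.stringify(buf));
// -> {"type":"Buffer","data":[255,216,255,224]}
// Stored this way, a 6,904-byte image easily grows to ~22k of text.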
You can encode your image data as a base64 string and save it, although I would not recommend it at all. What I would do is upload the file to an object storage service like AWS S3 or its open-source alternative MinIO, and then save just a reference to the file in the DB (e.g. an image URL).
P.S.: I'm sorry about the lack of links and references in my answer, I'm writing it on my phone. I can edit it and include references as soon as I'm home.
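If the image does end up in CouchDB, the standard inline-attachment format (a base64 string in a data field under _attachments) is one way to store it so the interface can render it. A minimal sketch, assuming Node 18+ for the global fetch, a local unauthenticated CouchDB, and the database name and document id from the question:

const fs = require('fs');

async function saveImageInline() {
  const image = fs.readFileSync('download.jpeg');              // raw image bytes as a Buffer
  const rev_number = '23-cf8c2076b43082fdfe605cad68ef2355';    // current _rev of the doc (placeholder)
  const doc = {
    _rev: rev_number,                  // required when updating an existing document
    _attachments: {
      'download.jpeg': {
        content_type: 'image/jpeg',
        data: image.toString('base64') // CouchDB expects base64 for inline attachments
      }
    }
  };
  // Note: this PUT replaces the document body with what is sent here.
  const res = await fetch('http://localhost:5984/node_db/2741d6f37d61d6bbdf63df3be5000504', {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(doc)
  });
  console.log(await res.json());       // { ok: true, id: ..., rev: ... } on success
}

saveImageInline();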

JSONata - JSON to JSON Transformation in Nodejs API

I need to write a REST API in Node.js for JSON to JSON transformation.
There are many libraries and I shortlisted JSONata.
Please find a simple JSONata sample here.
The challenge is that the API receives JSON which has both data and map, but JSONata requires the map values without quotes.
{
  "data": {
    "title": "title1",
    "description": "description1",
    "blog": "This is a blog.",
    "date": "11/4/2013"
  },
  "map": {
    "name": "title",
    "info": "description",
    "data": {
      "text": "blog",
      "date": "date"
    }
  }
}
but the map object expected by JSONata is like below.
{
  "name": title,
  "info": description,
  "some": {
    "text": blog,
    "date": date
  }
}
In the above, the keys are in quotes but the values are without quotes.
Please find the NodeJS API code.
app.post('/JSONTransform', function (req, res, next) {
  const data = req.body.data;
  const map = req.body.map;
  var expression = jsonata(map);
  var result = expression.evaluate(data);
  res.send(result);
});
I can write a simple function to remove the quotes, but the above map is a simple example. It can have any number of child objects, and the values may contain special characters, including quotes.
I would prefer an npm library or a standard way to remove the quotes, or to configure JSONata to accept quotes in the values.
I would appreciate suggestions for any other library or option.
This Node.js API is called from an ASP.NET Core Web API.
The ASP.NET Core Web API gets the data and map from a database and passes them as a single JSON to the Node.js API.
Please suggest a solution to this problem or the best alternative.
Thanks
Raj
I found a solution to this problem.
Pass a single JSON that has both data and map. Since the map is not valid JSON, I made the entire map a string and escaped the double quotes inside it.
Please find the sample.
{
  "map": "{ \"name\": title, \"info\": description, \"data\": { \"text\": blog, \"date\": date }}",
  "data": {
    "title": "title1",
    "description": "description1",
    "blog": "This is a blog.",
    "date": "11/4/2013"
  }
}
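With the map arriving as a string, the existing endpoint works essentially unchanged. A minimal sketch, assuming an Express app with JSON body parsing already configured (note that with jsonata 2.x, evaluate() returns a Promise and would need await):

const jsonata = require('jsonata');

app.post('/JSONTransform', function (req, res) {
  const data = req.body.data;        // the plain JSON data object
  const map = req.body.map;          // the JSONata expression as an escaped string
  const expression = jsonata(map);   // compile the string expression
  const result = expression.evaluate(data);
  res.send(result);
  // For the sample payload above, result is:
  // { "name": "title1", "info": "description1",
  //   "data": { "text": "This is a blog.", "date": "11/4/2013" } }
});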

How to parse XML from an HTTP-triggered logic app?

How do I extract the content of my request that's been received inside of the logic app?
I've got a regular HTTP-triggered logic app, like so:
I'm sending it a POST request through Postman like so:
{
  "$content-type": "application/octet-stream",
  "$content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?><cases><file-path>yes</file-path></cases>"
}
I'm attempting to extract the $content payload:
"#{string(xml(string(triggerBody()?['content'])))}"
The issue I am getting is:
How do I extract the content of my request that's been received inside of the logic app?
Here's the entire initialize variable step:
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "contentOfRequest",
"type": "String",
"value": "#{string(xml(string(triggerBody()?['content'])))}"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
}
Because the request body is a string, it doesn't support selecting a property, so you need to parse it into JSON first; then you will be able to select $content.
As for how to get the JSON schema, just click Use sample payload to generate schema in the Parse JSON action, paste your JSON data, and then click Done.
Then extract the $content value with body('Parse_JSON')?['$content']; this way you will get the content value, as in the sketch below.
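For reference, a sketch of how the Initialize variable step might look once a Parse JSON action (named Parse_JSON here) runs before it; the expression mirrors the one from the question but reads from the parsed body:

"Initialize_variable": {
  "inputs": {
    "variables": [
      {
        "name": "contentOfRequest",
        "type": "String",
        "value": "@{string(xml(body('Parse_JSON')?['$content']))}"
      }
    ]
  },
  "runAfter": {
    "Parse_JSON": [ "Succeeded" ]
  },
  "type": "InitializeVariable"
}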

How to include fields in the API server and remove them before returning results to the client in GraphQL

I have a Node.js GraphQL server. From the client, I am trying to get all the user entries using a query like this:
{
  user {
    name
    entries {
      title
      body
    }
  }
}
In the Node.js GraphQL server, however, I want to return only the user entries that are currently valid, based on the publishDate and expiryDate in the entries object.
For example:
{
  "user": "john",
  "entries": [
    {
      "title": "entry1",
      "body": "body1",
      "publishDate": "2019-02-12",
      "expiryDate": "2019-02-13"
    },
    {
      "title": "entry2",
      "body": "body2",
      "publishDate": "2019-02-13",
      "expiryDate": "2019-03-01"
    },
    {
      "title": "entry3",
      "body": "body3",
      "publishDate": "2020-01-01",
      "expiryDate": "2020-01-31"
    }
  ]
}
should return this
{
  "user": "john",
  "entries": [
    {
      "title": "entry2",
      "body": "body2",
      "publishDate": "2019-02-13",
      "expiryDate": "2019-03-01"
    }
  ]
}
The entries are fetched via a delegateToSchema call (https://www.apollographql.com/docs/graphql-tools/schema-delegation.html#delegateToSchema) and I don't have the option to pass publishDate and expiryDate as query parameters. Essentially, I need to get the results and then filter them in memory.
The issue I face is that the original query doesn't have publishDate and expiryDate in it to support this. Is there a way to add these fields to the delegateToSchema call and then remove them before sending the results back to the client?
You are looking for transformResult.
The implementation details are:
In delegateToSchema you need to define a transforms array.
In the Transform you need to define a transformResult function for filtering the results.
If you have the ability to send arguments to the remote GraphQL server, then you should use transformRequest.
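A rough sketch of how this could look with graphql-tools v4-style delegation; the transform name, the result path, and the date comparison below are illustrative assumptions rather than code from the question:

const { delegateToSchema } = require('graphql-tools');

// A Transform is an object with optional transformRequest / transformResult hooks.
// This one drops entries that are not currently valid. The exact shape of the
// result passed to transformResult depends on the delegated operation, so the
// result.data.user path may need adjusting.
const filterExpiredEntries = {
  transformResult(result) {
    const now = new Date();
    const user = result && result.data && result.data.user;
    if (user && Array.isArray(user.entries)) {
      user.entries = user.entries.filter(
        e => new Date(e.publishDate) <= now && now <= new Date(e.expiryDate)
      );
    }
    return result;
  }
};

// Inside the resolver that delegates the user field to the remote schema
// (remoteSchema is assumed to be in scope):
function resolveUser(parent, args, context, info) {
  return delegateToSchema({
    schema: remoteSchema,
    operation: 'query',
    fieldName: 'user',
    args,
    context,
    info,
    transforms: [filterExpiredEntries]
  });
}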
