I have a problem testing my Node application with Nock. I record all requests via nock.recorder.rec, but among them there is a multipart request. I use form-data, and this module puts a boundary into the request body when I call form.append. The problem is that the boundary is different every time, so when I run tests against the recorded data, Nock can't find a match for the request (because the boundary in the request body is not what it was at recording time). What can be done? Sorry for my bad English.
I came across a similar problem. What you can do is use the second argument as a function instead and match the object you're trying to send as form data. Example:
nock('http://localhost')
  .post('/url', function (body) {
    // "params" is the object you expect to be sent as form data (defined elsewhere)
    return JSON.stringify(body) === JSON.stringify(params);
  })
  .reply(200, 'some data');
More on that in the documentation here: https://github.com/pgte/nock#specifying-request-body
Another solution would be to use a RegExp:
nock(baseUrl)
  .post('/url', /form-data; name="field"[^]*value/m)
  .reply(200, 'some data');
Note:
- the [^] character class within the regexp matches any character, including the line breaks the form data contains
- the m flag for multiline
- the example above corresponds to: form.append('field', value)
- for a real-world example see here; it also shows how to use a variable within the regex, using the RegExp class (see the sketch below)
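A minimal sketch of that idea (the host, field name, and value are just placeholders for illustration):

const nock = require('nock');

// Build the matcher at runtime with the RegExp constructor,
// so a variable can be interpolated into the pattern.
// (Escape the value first if it can contain regex metacharacters.)
const value = 'some value';
const bodyRegex = new RegExp(`form-data; name="field"[^]*${value}`, 'm');

nock('http://localhost')
  .post('/url', bodyRegex)
  .reply(200, 'some data');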
Related
Due to the deprecation of request, we're currently rewriting the request service in our Node app with superagent. So far everything looks fine; however, we're not quite sure how to request binary data/octet-stream and process the actual response body as a Buffer. According to the docs (on the client side) one should use
superAgentRequest.responseType('blob');
which seems to work fine on Node.js, but I've also found this GitHub issue where they use
superAgentRequest.buffer(true);
which works just as well. So I'm wondering: what is the preferred method to request binary data in Node.js?
According to superagent's source-code, using the responseType() method internally sets the buffer flag to true, i.e. the same as setting it manually to true.
When dealing with binary data/octet-streams, a binary data parser is used, which in fact just buffers the incoming chunks:
module.exports = (res, fn) => {
  const data = []; // Binary data needs binary storage

  res.on('data', chunk => {
    data.push(chunk);
  });

  res.on('end', () => {
    fn(null, Buffer.concat(data));
  });
};
In both cases this parser is used, which explains the behaviour. So you can go with either of the mentioned methods to deal with binary data/octet-streams.
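For what it's worth, here is a minimal sketch of both variants (the URL is just a placeholder); in either case res.body should end up as a Buffer:

const superagent = require('superagent');

(async () => {
  // Option 1: responseType('blob') -- internally enables buffering and the binary parser
  const res1 = await superagent
    .get('https://example.com/some.bin')
    .responseType('blob');
  console.log(Buffer.isBuffer(res1.body)); // true

  // Option 2: buffer(true) -- relies on the built-in binary parser for application/octet-stream
  const res2 = await superagent
    .get('https://example.com/some.bin')
    .buffer(true);
  console.log(Buffer.isBuffer(res2.body)); // true
})();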
As per documentation https://visionmedia.github.io/superagent/
SuperAgent will parse known response-body data for you, currently supporting application/x-www-form-urlencoded, application/json, and multipart/form-data. You can setup automatic parsing for other response-body data as well:
You can set a custom parser (that takes precedence over built-in parsers) with the .buffer(true).parse(fn) method. If response buffering is not enabled (.buffer(false)) then the response event will be emitted without waiting for the body parser to finish, so response.body won't be available.
So to parse other response types, you will need to set .buffer(true).parse(fn). But if you do not want to parse the response, there is no need to set buffer(true).
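As a sketch, a custom parser for some other content type might look like this (the URL and the splitting logic are purely illustrative; the parser receives the raw response stream and calls back with whatever should become res.body):

const superagent = require('superagent');

superagent
  .get('https://example.com/report.csv') // placeholder URL
  .buffer(true)
  .parse((res, done) => {
    // Collect the raw response text, then hand back the value that becomes res.body
    let text = '';
    res.setEncoding('utf8');
    res.on('data', chunk => { text += chunk; });
    res.on('end', () => done(null, text.split('\n')));
  })
  .then(res => {
    console.log(res.body); // an array of lines
  });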
I'm trying to create an endpoint that contains an actual path that I extract and use as a parameter. For instance, in the following path:
/myapi/function/this/is/the/path
I want to match "/myapi/function/" to my function, and pass the parameter "this/is/the/path" as the parameter to that function.
If I try this it obviously doesn't work because it only matches the first element of the path:
app.get("/myapi/function/:mypath")
If I try this it works, but the path doesn't show up in req.params; instead I have to parse req.path, which is messy because the logic then has to know about the whole path, not just the parameter:
app.get("/myapi/function/*")
In addition, the use of wildcard routing seems to be discouraged as bad practice. I'm not sure I understand what alternative the linked article is trying to suggest, and I'm not using the query as part of a database call nor am I uploading any information.
What's the proper way to do this?
You can use a wildcard:
app.get("/myapi/function/*")
And then get your path from
req.params[0]
// Example
//
// For the route "/myapi/function/this/is/my/path"
// You will get output "this/is/my/path"
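Put together, a minimal sketch of such a route might look like this (the handler body and port are only illustrative):

const express = require('express');
const app = express();

// The wildcard captures everything after "/myapi/function/",
// including slashes, and exposes it as req.params[0].
app.get('/myapi/function/*', (req, res) => {
  const mypath = req.params[0]; // e.g. "this/is/the/path"
  res.send(mypath);
});

app.listen(3000);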
I am a beginner and have been banging my head for days on this problem; I'm really, really stuck.
Basically I just want to make a POST request using Node and Express. The object will be created dynamically, but this is my hard-coded example. myObj contains an array because I want to do one database insert per item later on the server side.
let myObj = {
  id: 50,
  damage_type: ["missing", "broken", "light"]
};

// Parse myObj to a JSON string to be sent
let myjsonObj = JSON.stringify(myObj);
console.log(myjsonObj);
// {"id":50,"damage_type":["missing","broken","light"]}

postDamage(myjsonObj);
function postDamage(damage) {
  $.post({
    type: 'POST',
    url: '/damage',
    data: damage
  }).done(function (damage) {
    // Do things
  });
}
router.post('/damage', (req, res) => {
  let data = req.body;
  console.log(data);
  // This is what I get in the node terminal, which is nonsense I cannot work with:
  // { '{"id":50,"damage_type":["missing","broken","light"]}': '' }
});

I expect it to look like {"id":50,"damage_type":["missing","broken","light"]} so that I can loop through damage_type, creating new objects with this structure:

createSQLfunction({id: 50, damage_type: "missing"})

If I don't stringify myObj, the node terminal prints
{ id: '50', 'damage_type[]': [ 'missing', 'broken', 'light' ] }
Where does the extra [] come from?!
What am I doing wrong that I can't send an array inside an object to the server side?
From the jQuery website:
data
Type: PlainObject or String or Array
Data to be sent to the server. It is converted to a query string, if not already a string. It's appended to the url for GET-requests. See processData option to prevent this automatic processing. Object must be Key/Value pairs. If value is an Array, jQuery serializes multiple values with same key based on the value of the traditional setting (described below).
The traditional setting appears to be whether it url-encodes as key[]=val1&key[]=val2 or just key=val1&key=val2. You can give it a try, YMMV.
Or you could make your life a lot easier and just serialize the JSON yourself, instead of messing with jQuery's URL-encoding.
Edit: In answer to your question about best practices: back before JavaScript form submissions became popular, the two standard ways of submitting a form were application/x-www-form-urlencoded or multipart/form-data. The latter was mostly used if you had file(s) you were submitting with a form.
However with the advent of JavaScript XHR (ajax) form submissions, it has become much more common/popular to use JSON instead of either of these formats. So there is absolutely nothing at all wrong with doing something like data: JSON.stringify(object) when you submit your data, and then just instruct your server to read the JSON.
In fact it's probably both easier and faster. And it is a very popular method, so no worries about going against modern best practices.
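For example, a minimal sketch of that approach (assuming Express 4.16+ so express.json() is available; the route, response handling, and createSQLfunction from the question are only illustrative):

// Client side (jQuery): send the object as a JSON body
$.ajax({
  type: 'POST',
  url: '/damage',
  contentType: 'application/json', // tell the server the body is JSON
  data: JSON.stringify({ id: 50, damage_type: ['missing', 'broken', 'light'] })
}).done(function (response) {
  // Do things
});

// Server side (Express): parse JSON bodies, then req.body is a real object
const express = require('express');
const router = express.Router();
router.use(express.json());

router.post('/damage', (req, res) => {
  // req.body is now { id: 50, damage_type: [...] }
  req.body.damage_type.forEach(type => {
    createSQLfunction({ id: req.body.id, damage_type: type });
  });
  res.sendStatus(200);
});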
I'm trying to make a request with Content-Type x-www-form-urlencoded that works perfectly in Postman but does not work in an Azure Logic App; I receive a Bad Request response for missing parameters, as if I hadn't sent anything.
I'm using the Http action.
The body value is param1=value1&param2=value2, but I have tried other formats as well.
HTTP Method: POST
URI : https://xxx/oauth2/token
In Headers section, add the below content-type:
Content-Type: application/x-www-form-urlencoded
And in the Body, add:
grant_type=xxx&client_id=xxx&resource=xxx&client_secret=xxx
Try out the solution below. It's working for me.
concat(
  'grant_type=', encodeUriComponent('authorization_code'),
  '&client_id=', encodeUriComponent('xxx'),
  '&client_secret=', encodeUriComponent('xxx'),
  '&redirect_uri=', encodeUriComponent('xxx'),
  '&scope=', encodeUriComponent('xxx'),
  '&code=', encodeUriComponent(triggerOutputs()['relativePathParameters']['code']))
Here, code is a dynamic parameter coming from the previous flow's query parameters.
NOTE: Do not forget to set the Content-Type header to application/x-www-form-urlencoded.
Answering this one, as I needed to make a call like this myself, today.
As Assaf mentions above, the request indeed has to be URL-encoded, and a lot of the time you want to compose the actual message payload.
Also, make sure to add the Content-Type header in the HTTP action with the value application/x-www-form-urlencoded.
Therefore, you can use the following code to combine variables that get URL-encoded:
concat('token=', encodeUriComponent(body('ApplicationToken')?['value']), '&user=', encodeUriComponent(body('UserToken')?['value']), '&title=Stock+Order+Status+Changed&message=to+do')
When using the concat function (in composing), the curly braces are not needed.
First of all, the body needs to be:
{ param1=value1&param2=value2 }
(i.e. surrounded with {})
That said, value1 and value2 should be URL-encoded. If they are a simple string (e.g. a_b) then this would be fine as is, but if it is for example https://a.b it should be converted to https%3A%2F%2Fa.b
The easiest way I found to do this is to use https://www.urlencoder.org/ to convert it. Convert each param separately and put the converted value in place of the original one.
Here is the screenshot of the solution that works for me; I hope it will be helpful. This is an example with the Microsoft Graph API, but it will work in any other scenario:
How can I mock an entire HTML body response for my tests?
I'm using nodejs/mocha/nock.
With nock I can mock JSON responses just fine, for example:
nock('http://myapp.iriscouch.com')
.get('/users/1')
.reply(200, {_id: "123ABC", _rev: "946B7D1C", username: 'pgte'});
I used curl -o to fetch the HTML I want for the mock, so I already have it in a file - but I don't see how I can pass an HTML file to nock (or something else).
Thanks.
First, fetch the HTML content of your test file and put it in a string (using fs.readFile, for example).
After that you can do:
nock('http://myapp.iriscouch.com')
  .get('/users/1')
  .reply(200, yourFileContent);
This is what worked out for me in the past :)
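Put together, a minimal sketch (the fixture path is just a placeholder):

const fs = require('fs');
const nock = require('nock');

// HTML previously saved with curl -o
const yourFileContent = fs.readFileSync('./fixtures/user1.html', 'utf8');

nock('http://myapp.iriscouch.com')
  .get('/users/1')
  .reply(200, yourFileContent);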
If you'd like, you can specify the content type explicitly; since you provide the body as a string, this effectively lets you mock any non-binary response easily:
nock('http://myapp.iriscouch.com')
  .get('/users/1')
  .reply(200, yourFileContent, {'content-type': 'text/html'});
If you want a more general approach, I've asked a more general question about a similar issue and got some interesting responses.