nodejs/mocha/nock - mocking an entire html response - node.js

How can I mock an entire HTML body response for my tests?
I'm using nodejs/mocha/nock.
With nock I can mock JSON responses just fine, for example:
nock('http://myapp.iriscouch.com')
  .get('/users/1')
  .reply(200, {_id: "123ABC", _rev: "946B7D1C", username: 'pgte'});
I used curl -o to fetch the HTML I want for the mock, so I already have it in a file - but I don't see how I can pass an HTML file to nock (or something else).
Thanks.

First, fetch the HTML content of your test file and put it in a string (using fs.readFile, for example). After that you can do:
nock('http://myapp.iriscouch.com')
  .get('/users/1')
  .reply(200, yourFileContent);
This is what worked out for me in the past :)
If you'd like, you can specify the content type explicitly. Since you pass the body as a string, this effectively lets you mock any non-binary response easily:
nock('http://myapp.iriscouch.com')
  .get('/users/1')
  .reply(200, yourFileContent, {'content-type': 'text/html'});
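Putting that together, here is a minimal sketch of how it might look inside a mocha test. The fixture path, test names, and the http.get check are illustrative assumptions, not from the original answer:
// Serve a captured HTML file as the mocked response body.
const fs = require('fs');
const path = require('path');
const http = require('http');
const nock = require('nock');

describe('user page', function () {
  it('returns the captured HTML', function (done) {
    // HTML previously saved with curl -o (path is hypothetical).
    const html = fs.readFileSync(path.join(__dirname, 'fixtures', 'user.html'), 'utf8');

    nock('http://myapp.iriscouch.com')
      .get('/users/1')
      .reply(200, html, {'content-type': 'text/html'});

    // Any HTTP client works; nock intercepts the request in-process.
    http.get('http://myapp.iriscouch.com/users/1', function (res) {
      let body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () {
        done(body === html ? undefined : new Error('unexpected body'));
      });
    });
  });
});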
If you want a more general approach, I've asked a more general question about a similar issue and got some interesting responses.

Related

My Request Body is so long. How do you handle it while doing API testing via Cypress

My request body for an endpoint is very long.
1st question: I read that we can use a request.body.js file for storing our request body and then call it wherever we need it. But unfortunately, I could not find any sample framework/tutorial to learn it from.
2nd question: in my project, the property names of the request body do not exactly match the response body given in the Swagger document. What can be the reason? What would be your approach?
I would appreciate it if you could help me resolve these questions in the best possible way. Thank you!
It's quite straightforward; take a look at this login example:
cy.fixture('users.json').then((userdata) => {
  cy.request({
    method: 'POST',
    url: <auth_url>,
    form: true,
    body: userdata
  });
});
You can export this as a Cypress custom command and then have it available in all your test spec files (see the sketch after the fixture below).
The users.json file in the fixtures folder looks like this:
{
  "username": "...",
  "password": "..."
}
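As a rough sketch, such a custom command could live in cypress/support/commands.js; the loginByApi name and the AUTH_URL environment variable are illustrative assumptions, not part of the original answer:
// cypress/support/commands.js
// Hypothetical custom command wrapping the fixture-based login request above.
Cypress.Commands.add('loginByApi', () => {
  cy.fixture('users.json').then((userdata) => {
    cy.request({
      method: 'POST',
      url: Cypress.env('AUTH_URL'),   // assumed env var; replace with your auth endpoint
      form: true,
      body: userdata
    });
  });
});

// In any spec file:
// cy.loginByApi();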
Hope that answers the first question at least.

Handling UTF8 characters in express route parameters

I'm having an issue with a Node.js REST API created using Express.
I have two calls, a GET and a POST, set up like this:
router.get('/:id', (request, response) => {
  console.log(request.params.id);
});

router.post('/:id', (request, response) => {
  console.log(request.params.id);
});
Now, I want the ID to be able to contain special (UTF-8) characters.
The problem is, when I use Postman to test the requests, it looks like they are encoded very differently:
GET http://localhost:3000/api/â outputs â
POST http://localhost:3000/api/â outputs Ã¢
Does anyone have any idea what I am missing here?
I should mention that the POST call also contains a file upload, so its content type will be multipart/form-data.
You should encode your URL on the client and decode it on the server. See the following articles:
What is the proper way to URL encode Unicode characters?
Can urls have UTF-8 characters?
Which characters make a URL invalid?
For JavaScript, encodeURI may come in handy.
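For illustration, a small sketch of the client-side encoding step (the id value is a placeholder):
// Percent-encode the dynamic path segment before building the URL.
const id = 'â';
const url = 'http://localhost:3000/api/' + encodeURIComponent(id);
console.log(url); // http://localhost:3000/api/%C3%A2

// Express decodes percent-encoded route parameters, so on the server
// request.params.id arrives as the original 'â'.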
It looks like Postman does UTF-8 encoding but NOT proper URL encoding. Consequently, what you type in the request URL box translates to something different from what would happen if you typed that URL into a browser.
I'm requesting GET localhost/ä, but it goes out on the wire as localhost/ä, unencoded.
(This is now an invalid URL because it contains non-ASCII characters.)
But when I type localhost/ä into Google Chrome, it correctly encodes the request as localhost/%C3%A4.
So you could try manually URL-encoding your request to http://localhost:3000/api/%C3%A2.
In my opinion this is a bug (perhaps a regression). I am using the latest version of Postman, v7.11.0, on macOS.
Does anyone have any idea what I am missing here?
Yeah, it doesn't output Ã¢, it outputs â, but whatever you're checking the result with thinks it's reading something else (ISO-8859-1, maybe?) rather than UTF-8, and renders it as Ã¢.
Most likely you're viewing the result in a web browser, and the web server is sending the wrong Content-Type header. Try sending Content-Type: text/plain; charset=utf-8 or Content-Type: text/html; charset=utf-8, and your browser should render your â correctly.
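In Express, setting that header might look like this (a sketch; the route mirrors the one from the question):
router.get('/:id', (request, response) => {
  // Tell the browser explicitly that the response is UTF-8 encoded.
  response.set('Content-Type', 'text/plain; charset=utf-8');
  response.send(request.params.id);
});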

Azure Logic App: how to make a x-www-form-encoded?

I'm trying to make a request with Content-Type x-www-form-urlencoded. It works perfectly in Postman, but it does not work in an Azure Logic App: I receive a Bad Request response for missing parameters, as if I had not sent anything.
I'm using the HTTP action.
The body value is param1=value1&param2=value2, but I have tried other formats as well.
HTTP Method: POST
URI : https://xxx/oauth2/token
In the Headers section, add the following Content-Type:
Content-Type: application/x-www-form-urlencoded
And in the Body, add:
grant_type=xxx&client_id=xxx&resource=xxx&client_secret=xxx
Try out the solution below. It's working for me.
concat(
  'grant_type=', encodeUriComponent('authorization_code'),
  '&client_id=', encodeUriComponent('xxx'),
  '&client_secret=', encodeUriComponent('xxx'),
  '&redirect_uri=', encodeUriComponent('xxx'),
  '&scope=', encodeUriComponent('xxx'),
  '&code=', encodeUriComponent(triggerOutputs()['relativePathParameters']['code'])
)
Here, code is a dynamic parameter coming from the previous flow's query parameters.
NOTE: Do not forget to set the Content-Type header to application/x-www-form-urlencoded.
Answering this one, as I needed to make a call like this myself today.
As Assaf mentions above, the request body indeed has to be URL-encoded, and a lot of the time you want to compose the actual message payload.
Also, make sure to add the Content-Type header to the HTTP action with the value application/x-www-form-urlencoded.
You can therefore use the following code to combine variables that get URL-encoded:
concat('token=', encodeUriComponent(body('ApplicationToken')?['value']), '&user=', encodeUriComponent(body('UserToken')?['value']), '&title=Stock+Order+Status+Changed&message=to+do')
When using the concat function (in composing), the curly braces are not needed.
First of all, the body needs to be:
{ param1=value1&param2=value2 }
(i.e. surrounded with {})
That said, value1 and value2 should be URL-encoded. If a value is a simple string (e.g. a_b) then it is fine as is, but if it is, for example, https://a.b it should be converted to https%3A%2F%2Fa.b
The easiest way I found to do this is to use https://www.urlencoder.org/ to convert it: convert each param separately and put the converted value in place of the original one.
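If you prefer to do this programmatically, the same encoding can be produced in any Node.js or browser console; a small sketch using the placeholder values above:
// encodeURIComponent yields the same percent-encoding as urlencoder.org.
console.log(encodeURIComponent('https://a.b'));   // https%3A%2F%2Fa.b

// Building a whole x-www-form-urlencoded body from key/value pairs:
const body = new URLSearchParams({ param1: 'https://a.b', param2: 'a_b' }).toString();
console.log(body);                                // param1=https%3A%2F%2Fa.b&param2=a_b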
Here is a screenshot of the solution that works for me; I hope it will be helpful. This is an example with the Microsoft Graph API but will work in any other scenario.

Nock + multipart form data = No match for request

I have a problem testing my Node application with Nock. I record all requests via nock.recorder.rec, but among them there are multipart requests. I use the form-data module, which puts a boundary into the request body when I call form.append. The problem is that the boundary is different every time, so when I run the tests against the recorded data, Nock can't find a match for the request (because the boundary in the request body is not what it was at recording time). What can be done?
I came across a similar problem. What you can do is pass a function as the second argument instead and match it against the object you're trying to send as form data. Example:
nock('localhost')
  .post('/url', function(body) {
    return JSON.stringify(body) === JSON.stringify(params);
  })
  .reply(200, 'some data');
More on that in the documentation here: https://github.com/pgte/nock#specifying-request-body
Another solution would be to use a RegExp:
nock(baseUrl)
  .post('/url', /form-data; name="field"[^]*value/m)
  .reply(200, 'some data');
Note:
- the [^] in the regexp matches any character, including line breaks (which the form data might contain)
- the m flag enables multiline matching
- the example above corresponds to: form.append('field', value);
- for a real-world example see here; it also shows how to use a variable within the regex, using the RegExp class.
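To illustrate that last point, here is a minimal sketch of building the matcher with the RegExp class so a variable can be interpolated into the pattern (the host, path, and field value are placeholders, not from the linked example):
const nock = require('nock');

// Hypothetical value appended on the client via form.append('field', fieldValue).
const fieldValue = 'expected value';

// [^]* spans line breaks inside the multipart body, regardless of the boundary.
const bodyMatcher = new RegExp('form-data; name="field"[^]*' + fieldValue, 'm');

nock('http://example.test')
  .post('/url', bodyMatcher)
  .reply(200, 'some data');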

Jade template, how to pass concrete object to pages?

I have a Jade template in my Node.js project. I would like to send an object to the Jade template and pass it to a function inside the page (to render something).
I am sure I send the right thing from the server, like this:
res.render(__dirname + '/pages/viz.jade', {
  vizJson: newJson,
});
On the client I do something like this:
script
  sunburst(#{vizJson})
So, inside a script block, I want to call a function that creates my visualization with some JSON I created on the server side.
The problem is that when rendered, I get something like sunburst([Object object]). I also tried to send the stringified version of the JSON, but when I do JSON.parse(#{vizJson}) it complains with Unexpected token &.
The JSON I send is always different and has different levels of depth.
Does anyone know what to do?
Thanks
I hope this is going to help someone. I solved it like this:
script
  sunburst(!{JSON.stringify(vizJson)})
Notice the ! and the {...} wrapping the stringify method.
For this to work, you need to stringify on the server.
res.render(__dirname + '/pages/viz.jade', {
  vizJson: JSON.stringify(newJson),
});
Then, as you mentioned, parse the JSON on the client.
script
  sunburst(JSON.parse(#{vizJson}))
Hope that helps!
Oddly enough, for me the solution involved no calls to JSON.parse. I stringified my object on the server, just used the !{vizJson} method, and got my object client-side.
Per the docs, unescaped string interpolation: http://jade-lang.com/reference/interpolation/
On the JS side, you send back
res.render(__dirname + '/pages/viz.jade', {
  vizJson: JSON.stringify(newJson),
});
On the HTML side, I have found that something like:
JSON.parse( '!{vizJson}' )
works.
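For completeness, a minimal end-to-end sketch of the unescaped-interpolation approach. The route, file name, and placeholder data are illustrative assumptions; the template snippet mirrors the !{JSON.stringify(vizJson)} call from the answer above:
// server.js (sketch; assumes express and jade are installed)
const express = require('express');
const app = express();

app.get('/viz', (req, res) => {
  const newJson = { name: 'root', children: [] };   // placeholder data
  // Pass the object itself; the template stringifies it unescaped.
  res.render(__dirname + '/pages/viz.jade', { vizJson: newJson });
});

app.listen(3000);

The corresponding pages/viz.jade would contain:
script.
  sunburst(!{JSON.stringify(vizJson)})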
