Sinon fake server not intercepting requests - node.js

I'm trying to use Sinon for the first time because of its fake server functionality, which lets me stub an API response. The test itself is written for Mocha.
However, the fake server doesn't seem to be intercepting the requests.
Code:
describe('when integrated', function() {
  var server;

  beforeEach(function() {
    server = sinon.createFakeServer();
  });

  afterEach(function() {
    server.restore();
  });

  it('can send a message to the notification service', function() {
    server.respondWith("POST", new RegExp('.*/api/notificationmanager/messages.*'),
      [200,
       { "Content-Type": "application/json" },
       '{ "messageId": 23561 }'
      ]);

    var messageOnly = new PushMessage(initMessageObj);
    var originalUrl = PushMessage.serverUrl;
    messageOnly.setServerAPI("http://a.fake.server/api/notificationmanager/messages");
    console.log("fake server is: ", server);

    messageOnly.notify()
      .then(function(response) {
        messageOnly.setServerAPI(originalUrl);
        return response;
      })
      .then(function(response) {
        response.should.be.above(0);
      });

    console.log(server.requests);
    server.respond();
  });
});
For reference, PushMessage is an object that has a static property serverUrl. I'm just setting the value to a fake URL & then resetting it.
The notify() function makes a POST request with request-promise-native to the serverUrl set in PushMessage's static property.
What seems to be happening is that the POST request actually gets attempted against the URL http://a.fake.server/api/notificationmanager/messages, resulting in an error that the address doesn't exist...
Any idea what I'm doing wrong...? Thanks!

There have been several issues on the Sinon GitHub repository about this. Sinon's fake server:
Provides a fake implementation of XMLHttpRequest and provides several interfaces for manipulating objects created by it.
Also fakes native XMLHttpRequest and ActiveXObject (when available, and only for XMLHTTP progids). Helps with testing requests made with XHR.
Node doesn't make requests through XMLHttpRequest, so Sinon's fake server never sees the request that request-promise-native sends. I wish it did too.
Here's an issue that breaks it down: https://github.com/sinonjs/sinon/issues/1049
Nock is a good alternative that works with Node: https://www.npmjs.com/package/nock
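For example, here is a rough sketch of how the test above could stub the same endpoint with nock instead (this reuses PushMessage and initMessageObj from the question; the hostname and response body mirror the original respondWith call):

var nock = require('nock');

it('can send a message to the notification service', function() {
  // nock patches Node's http module, so request-promise-native is intercepted
  var scope = nock('http://a.fake.server')
    .post('/api/notificationmanager/messages')
    .reply(200, { messageId: 23561 });

  var messageOnly = new PushMessage(initMessageObj);
  messageOnly.setServerAPI('http://a.fake.server/api/notificationmanager/messages');

  return messageOnly.notify()
    .then(function(response) {
      scope.done(); // throws if the mocked endpoint was never hit
      response.should.be.above(0);
    });
});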

Related

Node.js GET API is getting called twice intermittently

I have a node.js GET API endpoint that calls some backend services to get data.
app.get('/request_backend_data', function(req, res) {
  ---------------------
});
When there is a delay getting a response back from the backend services, this endpoint (request_backend_data) gets triggered again exactly after 2 minutes. I have checked my application code, and there is no retry logic written anywhere.
Does a Node.js API endpoint get called twice in any case (like a delay or timeout)?
There might be a few reasons:
Some Chrome extensions can cause bugs like this; they have been causing a lot of issues recently. Run your app in a different browser - if the problem disappears, it is a Chrome-specific issue.
The browser might be making an extra request for favicon.ico. To prevent this, use this module: https://www.npmjs.com/package/serve-favicon
Your server might be receiving CORS preflight requests. Add a CORS policy with this npm package: https://www.npmjs.com/package/cors
A minimal setup for the last two suggestions is sketched below.
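Here is a hedged sketch of wiring up both middlewares in an Express app (the package names are real; the favicon path and port are assumptions about your project):

const express = require('express');
const path = require('path');
const favicon = require('serve-favicon'); // answers the favicon.ico request itself
const cors = require('cors');             // handles CORS preflight (OPTIONS) requests

const app = express();

// Assumed favicon location; adjust to your project layout.
app.use(favicon(path.join(__dirname, 'public', 'favicon.ico')));
app.use(cors());

app.get('/request_backend_data', function(req, res) {
  // ... call your backend services here ...
  res.json({ ok: true });
});

app.listen(3000);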
No, there are no default timeouts in Node.js or anything like that.
Look for the issue in your frontend:
it can be the JavaScript fetch API with a 'retry' option set
it can be a messed-up RxJS operator chain that emits events implicitly and triggers another REST request
it can be an entire page reload on timeout, which re-fetches all the necessary data from the backend
it can be request interceptors (in axios, Angular, etc.) which modify something and re-send
... many potential reasons, but not in the backend (Node.js) for sure.
Just make a simple example and invoke your Node.js 'request_backend_data' endpoint with axios or XMLHttpRequest - you will see that the problem is not in the backend part.
Try checking the API call with the code below, which includes following redirects. Add headers as needed (e.g. 'Authorization': 'Bearer dhqsdkhqd...etc').
var https = require('follow-redirects').https;
var fs = require('fs');

var options = {
  'method': 'GET',
  'hostname': 'foo.com',
  'path': '/request_backend_data',
  'headers': {
  },
  'maxRedirects': 20
};

var req = https.request(options, function (res) {
  var chunks = [];

  res.on("data", function (chunk) {
    chunks.push(chunk);
  });

  res.on("end", function (chunk) {
    var body = Buffer.concat(chunks);
    console.log(body.toString());
  });

  res.on("error", function (error) {
    console.error(error);
  });
});

req.end();
Paste it into a file called test.js, then run it with node test.js.

Using cookies with axios and Vue

I have created a Node.js Express server that connects to Salesforce.com using the SOAP interface provided by 'jsforce'. It uses session cookies for authorization via the 'express-session' package. So far, it has a POST method for login and a GET to perform a simple query. Testing with Postman has proven that this server is working as expected.
As the browser interface to this server, I have written a Vue application that uses axios to perform the GET and POST. I need to save the session cookie created during the login POST and then attach the cookie to subsequent CRUD operations.
I have tried various methods to handle the cookies. One method I have tried is using axios response interceptors on the POST
axios.interceptors.response.use(response => {
  update.update_from_cookies();
  return response;
});
The function 'update_from_cookies' attempts to get the cookie named 'js-force', but it does not find it, although I know it is being sent:
import Cookie from 'js-cookie';
import store from './store';

export function update_from_cookies() {
  let logged_in = Cookie.get('js-force');
  console.log('cookie ' + logged_in);

  if (logged_in && JSON.parse(logged_in)) {
    store.commit('logged_in', true);
  } else {
    store.commit('logged_in', false);
  }
}
I have also seen various recommendations to add parameters to the axios calls, but these also do not work.
I would appreciate some advice about how to handle cookies using axios, or some similar package that works with Vue.
Thanks
The problem has been resolved. I was using the wrong syntax for the axios call.
The correct syntax has {withCredentials: true} as the last parameter:
this.axios.post(uri, this.sfdata, {withCredentials: true})
  .then( () => {
    this.$router.push( {name : 'home' });
  })
  .catch( () => {
  });
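A related sketch in case others hit the same wall: withCredentials can also be set globally, and the server must allow credentialed CORS requests for the session cookie to round-trip (the origin below is an assumption; substitute your Vue dev-server URL):

// Client (Vue entry point): send cookies with every axios request
import axios from 'axios';
axios.defaults.withCredentials = true;

// Server (Express): allow the browser to send and receive cookies cross-origin
const cors = require('cors');
app.use(cors({
  origin: 'http://localhost:8080', // assumed Vue dev-server origin
  credentials: true                // required for cookies; a wildcard origin will not work here
}));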

Why do we need nock for HTTP request unit tests?

Below is the sample code from redux document
describe('async actions', () => {
  afterEach(() => {
    nock.cleanAll()
  })

  it('creates FETCH_TODOS_SUCCESS when fetching todos has been done', () => {
    nock('http://example.com/')
      .get('/todos')
      .reply(200, { body: { todos: ['do something'] }})

    const expectedActions = [
      { type: types.FETCH_TODOS_REQUEST },
      { type: types.FETCH_TODOS_SUCCESS, body: { todos: ['do something'] } }
    ]
    const store = mockStore({ todos: [] })

    return store.dispatch(actions.fetchTodos())
      .then(() => { // return of async actions
        expect(store.getActions()).toEqual(expectedActions)
      })
  })
})
Why do we need to use nock for this unit test?
I don't see the data from nock being used anywhere in this sample code.
Nock is used to mock HTTP requests - if you mock an HTTP request, it means your code doesn't perform real HTTP requests to the server.
Nock (and any other HTTP mocking library) overrides the native HTTP request methods, so real HTTP requests are never sent. This has many benefits - for example, you don't have to wait for an actual server response, because a mocked request returns its response in no time, and of course your tests are independent of the server. You can focus on testing the application code and not worry about the server - your tests can run even if the server is down.
You don't have to explicitly use the data returned by the mocked request if you don't need to test it - the main reason for using nock in your code sample is to prevent the actual HTTP request that the FETCH_TODOS_REQUEST action would normally send. Besides, even if the mocked response data is not explicitly used in the test, it is probably used in the application code (FETCH_TODOS_SUCCESS most likely expects a todos array to be returned), so you have to mock the response data so that your application gets the data it expects.
If nock weren't used, the test would take much more time because a real HTTP request would be sent to the server.
Mainly because in this test we're interested in the actions that get produced by actions.fetchTodos(). This action will make a call to the /todos endpoint, thus returning actions with some data. Since we're just interested in the data contained in the actions, we just mock it.
Nock internally intercepts the real fetch call to /todos and returns a successful 200 code, making it possible for the redux store to continue.
The data you're looking for is
{ todos: ['do something'] }
It is mocked in the nock reply and also expected later on in expectedActions.
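For context, here is a hedged sketch of what an action creator like fetchTodos might look like in this example (the real implementation isn't shown in the question; this assumes redux-thunk and a fetch polyfill, and matches the response shape mocked above):

import fetch from 'isomorphic-fetch'     // assumed fetch polyfill
import * as types from './actionTypes'   // assumed constants module

export function fetchTodos() {
  return dispatch => {
    dispatch({ type: types.FETCH_TODOS_REQUEST })

    // This is the request that nock intercepts and answers with the mocked body.
    return fetch('http://example.com/todos')
      .then(response => response.json())  // json === { body: { todos: ['do something'] } }
      .then(json => dispatch({ type: types.FETCH_TODOS_SUCCESS, body: json.body }))
  }
}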

POST not working for Mocha/Chai tests on Express REST API

I'm having a problem with Mocha tests on my Express app. I have a REST API set up that I know works in the browser, but it's getting a bit large and therefore manual testing has become tedious.
Anyway, all my GET tests work just fine, but when I add tests for my POST requests, they fail:
Uncaught AssertionError: expected { Object (domain, _events, ...) } to have status code 200 but got 400
at testPostSingle (test-app.js:297:21)
at test-app.js:195:21
at Test.Request.callback (/home/jacobd/healthboard/node_modules/chai-http/node_modules/superagent/lib/node/index.js:603:3)
at Stream.<anonymous> (/home/jacobd/healthboard/node_modules/chai-http/node_modules/superagent/lib/node/index.js:767:18)
at Unzip.<anonymous> (/home/jacobd/healthboard/node_modules/chai-http/node_modules/superagent/lib/node/utils.js:108:12)
at _stream_readable.js:944:16
I'm using Mocha, with the Chai should library and chai-http for requests.
My relevant code:
models.forEach(function(model, i) {
  var url = '/api/v1/' + model;
  var list = lists[i];

  /****************************** API POST TESTS ******************************/
  describe(util.format('API: /%s POST', model), function() {
    var minArgsObj = {
      title: 'Test ' + model + ' Title'
    };

    // Initialize list at start
    before(function(done) {
      instances._init(function(err) {
        if (!err) done();
      });
    });

    // Nuke list before each test
    beforeEach(function(done) {
      list.clear();
      done();
    });

    // Make sure POST single object works
    it('should create and add model on ' + url + ' with minimal arguments', function(done) {
      var req = {};
      req.options = _.clone(minArgsObj);

      chai.request('server')
        .post(url)
        .send(req)
        .end(function(err, res) {
          testPostSingle(res, minArgsObj);
          list.list.length.should.equal(1);
          res.body.should.eql(list.list[0]);
          done();
        });
    });
  });
});

function testPostSingle(res, ref) {
  res.should.have.status(200);
  ...
}
When I put a log message in the POST route declaration, it doesn't show up, so that tells me that my request is getting stopped before even hitting my server. Maybe the route isn't getting mounted properly in Mocha? It works when I make the request outside of Mocha, but I can't figure out why it won't work in a test environment.
Thanks in advance for your help, and let me know if you need any more info!
The relevant code is in testPostSingle. Your call is returning 400 bad request. Are you sure the route is correct or is set up to handle POST as well as GET? Are you sure parameters are correct? Make testPostSingle print out the body of the HTTP response so you know the details. You can also add some debug code in the route for that request on your server.
I'm terrible and the issue is very simple:
Instead of chai.request('server')
It needs to be chai.request(server)
Thanks to Jason Livesay for pointing me at the right line!
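For anyone hitting the same thing, here is a rough sketch of the corrected setup (assuming the Express app is exported from a server module; the require path and URL are guesses):

var chai = require('chai');
var chaiHttp = require('chai-http');
var server = require('../server'); // assumed path to the module that exports the Express app

chai.use(chaiHttp);
chai.should();

it('should create and add model with minimal arguments', function(done) {
  chai.request(server)             // pass the app object itself, not the string 'server'
    .post('/api/v1/example')       // hypothetical model URL
    .send({ options: { title: 'Test example Title' } })
    .end(function(err, res) {
      res.should.have.status(200);
      done();
    });
});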

Modify HTTP responses from a Chrome extension

Is it possible to create a Chrome extension that modifies HTTP response bodies?
I have looked in the Chrome Extension APIs, but I haven't found anything to do this.
In general, you cannot change the response body of an HTTP request using the standard Chrome extension APIs.
This feature is being requested at 104058: WebRequest API: allow extension to edit response body. Star the issue to get notified of updates.
If you want to edit the response body for a known XMLHttpRequest, inject code via a content script to override the default XMLHttpRequest constructor with a custom (full-featured) one that rewrites the response before triggering the real event. Make sure that your XMLHttpRequest object is fully compliant with Chrome's built-in XMLHttpRequest object, or AJAX-heavy sites will break.
In other cases, you can use the chrome.webRequest or chrome.declarativeWebRequest APIs to redirect the request to a data:-URI. Unlike the XHR-approach, you won't get the original contents of the request. Actually, the request will never hit the server because redirection can only be done before the actual request is sent. And if you redirect a main_frame request, the user will see the data:-URI instead of the requested URL.
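Here is a hedged sketch of the data:-URI redirect approach with the (Manifest V2) chrome.webRequest API - the URL pattern and replacement body are placeholders, and note the original response is never fetched:

// background.js (requires the "webRequest", "webRequestBlocking" and host permissions)
chrome.webRequest.onBeforeRequest.addListener(
  function(details) {
    // Serve a canned JSON body instead of letting the request reach the server.
    var body = JSON.stringify({ message: 'stubbed by extension' });
    return {
      redirectUrl: 'data:application/json;charset=utf-8,' + encodeURIComponent(body)
    };
  },
  { urls: ['*://example.com/api/*'] }, // placeholder URL pattern
  ['blocking']
);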
I just released a Devtools extension that does just that :)
It's called tamper, it's based on mitmproxy and it allows you to see all requests made by the current tab, modify them and serve the modified version next time you refresh.
It's a pretty early version but it should be compatible with OS X and Windows. Let me know if it doesn't work for you.
You can get it here http://dutzi.github.io/tamper/
How this works
As @Xan commented below, the extension communicates through Native Messaging with a Python script that extends mitmproxy.
The extension lists all requests using chrome.devtools.network.onRequestFinished.
When you click one of the requests, it downloads its response using the request object's getContent() method and sends that response to the Python script, which saves it locally.
It then opens the file in an editor (using call on OS X or subprocess.Popen on Windows).
The Python script uses mitmproxy to listen to all communication made through that proxy; if it detects a request for a file that was saved, it serves the saved file instead.
I used Chrome's proxy API (specifically chrome.proxy.settings.set()) to set a PAC file as the proxy setting. That PAC file redirects all communication to the Python script's proxy.
One of the greatest things about mitmproxy is that it can also modify HTTPS communication. So you have that as well :)
Like @Rob W said, I've overridden XMLHttpRequest, and this is the result: it modifies any XHR request on any site (working like a transparent modification proxy):
var _open = XMLHttpRequest.prototype.open;
window.XMLHttpRequest.prototype.open = function (method, URL) {
  var _onreadystatechange = this.onreadystatechange,
      _this = this;

  _this.onreadystatechange = function () {
    // catch only completed 'api/search/universal' requests
    if (_this.readyState === 4 && _this.status === 200 && ~URL.indexOf('api/search/universal')) {
      try {
        //////////////////////////////////////
        // THIS IS ACTIONS FOR YOUR REQUEST //
        // EXAMPLE:                         //
        //////////////////////////////////////
        var data = JSON.parse(_this.responseText); // {"fields": ["a","b"]}
        if (data.fields) {
          data.fields.push('c', 'd');
        }
        // rewrite responseText
        Object.defineProperty(_this, 'responseText', {value: JSON.stringify(data)});
        /////////////// END //////////////////
      } catch (e) {}
      console.log('Caught! :)', method, URL/*, _this.responseText*/);
    }
    // call original callback
    if (_onreadystatechange) _onreadystatechange.apply(this, arguments);
  };

  // detect any onreadystatechange changing
  Object.defineProperty(this, "onreadystatechange", {
    get: function () {
      return _onreadystatechange;
    },
    set: function (value) {
      _onreadystatechange = value;
    }
  });

  return _open.apply(_this, arguments);
};
For example, this code can be used successfully with Tampermonkey to make modifications on any site :)
Yes. It is possible with the chrome.debugger API, which grants an extension access to the Chrome DevTools Protocol; the protocol supports HTTP interception and modification through its Network API.
This solution was suggested by a comment on Chrome Issue 487422:
For anyone wanting an alternative which is doable at the moment, you can use chrome.debugger in a background/event page to attach to the specific tab you want to listen to (or attach to all tabs if that's possible, haven't tested all tabs personally), then use the network API of the debugging protocol.
The only problem with this is that there will be the usual yellow bar at the top of the tab's viewport, unless the user turns it off in chrome://flags.
First, attach a debugger to the target:
chrome.debugger.getTargets((targets) => {
  let target = /* Find the target. */;
  let debuggee = { targetId: target.id };
  chrome.debugger.attach(debuggee, "1.2", () => {
    // TODO
  });
});
Next, send the Network.setRequestInterceptionEnabled command, which will enable interception of network requests:
chrome.debugger.getTargets((targets) => {
  let target = /* Find the target. */;
  let debuggee = { targetId: target.id };
  chrome.debugger.attach(debuggee, "1.2", () => {
    chrome.debugger.sendCommand(debuggee, "Network.setRequestInterceptionEnabled", { enabled: true });
  });
});
Chrome will now begin sending Network.requestIntercepted events. Add a listener for them:
chrome.debugger.getTargets((targets) => {
  let target = /* Find the target. */;
  let debuggee = { targetId: target.id };
  chrome.debugger.attach(debuggee, "1.2", () => {
    chrome.debugger.sendCommand(debuggee, "Network.setRequestInterceptionEnabled", { enabled: true });
  });
  chrome.debugger.onEvent.addListener((source, method, params) => {
    if(source.targetId === target.id && method === "Network.requestIntercepted") {
      // TODO
    }
  });
});
In the listener, params.request will be the corresponding Request object.
Send the response with Network.continueInterceptedRequest (a sketch follows these steps):
Pass a base64 encoding of your desired HTTP raw response (including HTTP status line, headers, etc!) as rawResponse.
Pass params.interceptionId as interceptionId.
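Putting those steps together, here is a hedged sketch of the listener body (untested, like the rest of this answer; the replacement body is a placeholder):

chrome.debugger.onEvent.addListener((source, method, params) => {
  if (source.targetId === target.id && method === "Network.requestIntercepted") {
    // Build a raw HTTP response and base64-encode it.
    let body = JSON.stringify({ message: "stubbed response" }); // placeholder body
    let rawResponse =
      "HTTP/1.1 200 OK\r\n" +
      "Content-Type: application/json\r\n" +
      "\r\n" +
      body;
    chrome.debugger.sendCommand(debuggee, "Network.continueInterceptedRequest", {
      interceptionId: params.interceptionId,
      rawResponse: btoa(rawResponse) // must be base64-encoded
    });
  }
});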
Note that I have not tested any of this, at all.
While Safari has this feature built-in, the best workaround I've found for Chrome so far is to use Cypress's intercept functionality. It cleanly allows me to stub HTTP responses in Chrome. I call cy.intercept then cy.visit(<URL>) and it intercepts and provides a stubbed response for a specific request the visited page makes. Here's an example:
cy.intercept('GET', '/myapiendpoint', {
  statusCode: 200,
  body: {
    myexamplefield: 'Example value',
  },
})
cy.visit('http://localhost:8080/mytestpage')
Note: You may also need to configure Cypress to disable some Chrome-specific security settings.
The original question was about Chrome extensions, but I notice that it has branched out into different methods, going by the upvotes on answers that have non-Chrome-extension methods.
Here's a way to kind of achieve this with Puppeteer. Note the caveat mentioned on the originalContent line - the fetched response may be different to the original response in some circumstances.
With Node.js:
npm install puppeteer node-fetch@2.6.7
Create this main.js:
const puppeteer = require("puppeteer");
const fetch = require("node-fetch");

(async function() {
  const browser = await puppeteer.launch({headless:false});
  const page = await browser.newPage();
  await page.setRequestInterception(true);

  page.on('request', async (request) => {
    let url = request.url().replace(/\/$/g, ""); // remove trailing slash from urls
    console.log("REQUEST:", url);
    let originalContent = await fetch(url).then(r => r.text()); // TODO: Pass request headers here for more accurate response (still not perfect, but more likely to be the same as the "actual" response)
    if(url === "https://example.com") {
      request.respond({
        status: 200,
        contentType: 'text/html; charset=utf-8', // For JS files: 'application/javascript; charset=utf-8'
        body: originalContent.replace(/example/gi, "TESTING123"),
      });
    } else {
      request.continue();
    }
  });

  await page.goto("https://example.com");
})();
Run it:
node main.js
With Deno:
Install Deno:
curl -fsSL https://deno.land/install.sh | sh # linux, mac
irm https://deno.land/install.ps1 | iex # windows powershell
Download Chrome for Puppeteer:
PUPPETEER_PRODUCT=chrome deno run -A --unstable https://deno.land/x/puppeteer@16.2.0/install.ts
Create this main.js:
import puppeteer from "https://deno.land/x/puppeteer@16.2.0/mod.ts";

const browser = await puppeteer.launch({headless:false});
const page = await browser.newPage();
await page.setRequestInterception(true);

page.on('request', async (request) => {
  let url = request.url().replace(/\/$/g, ""); // remove trailing slash from urls
  console.log("REQUEST:", url);
  let originalContent = await fetch(url).then(r => r.text()); // TODO: Pass request headers here for more accurate response (still not perfect, but more likely to be the same as the "actual" response)
  if(url === "https://example.com") {
    request.respond({
      status: 200,
      contentType: 'text/html; charset=utf-8', // For JS files: 'application/javascript; charset=utf-8'
      body: originalContent.replace(/example/gi, "TESTING123"),
    });
  } else {
    request.continue();
  }
});

await page.goto("https://example.com");
Run it:
deno run -A --unstable main.js
(I'm currently running into a TimeoutError with this that will hopefully be resolved soon: https://github.com/lucacasonato/deno-puppeteer/issues/65)
Yes, you can modify HTTP responses in a Chrome extension. I built ModResponse (https://modheader.com/modresponse), which does exactly that. It can record and replay your HTTP responses, modify them, add delay, and even use the HTTP response from a different server (like from your localhost).
The way it works is to use the chrome.debugger API (https://developer.chrome.com/docs/extensions/reference/debugger/), which gives you access to the Chrome DevTools Protocol (https://chromedevtools.github.io/devtools-protocol/). You can then intercept the request and response using the Fetch Domain API (https://chromedevtools.github.io/devtools-protocol/tot/Fetch/), then override the response you want. (You can also use the Network Domain, though it is deprecated in favor of the Fetch Domain.)
The nice thing about this approach is that it just works out of the box. No desktop app installation required. No extra proxy setup. However, it will show a debugging banner in Chrome (which you can hide by adding an argument to Chrome), and it is significantly more complicated to set up than the other APIs.
For examples on how to use the debugger API, take a look at the chrome-extensions-samples: https://github.com/GoogleChrome/chrome-extensions-samples/tree/main/mv2-archive/api/debugger/live-headers
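Here is a hedged sketch of the Fetch-domain approach via chrome.debugger (parameter names follow the DevTools Protocol docs; the tab id, URL pattern and replacement body are placeholders):

const debuggee = { tabId: someTabId }; // the tab to intercept; someTabId is a placeholder

chrome.debugger.attach(debuggee, "1.3", () => {
  // Pause matching responses so we can replace their bodies.
  chrome.debugger.sendCommand(debuggee, "Fetch.enable", {
    patterns: [{ urlPattern: "*://example.com/api/*", requestStage: "Response" }]
  });
});

chrome.debugger.onEvent.addListener((source, method, params) => {
  if (method === "Fetch.requestPaused") {
    chrome.debugger.sendCommand(source, "Fetch.fulfillRequest", {
      requestId: params.requestId,
      responseCode: 200,
      responseHeaders: [{ name: "Content-Type", value: "application/json" }],
      body: btoa(JSON.stringify({ message: "overridden response" })) // must be base64
    });
  }
});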
I've just found this extension, and while it does a lot of other things, modifying API responses in the browser works really well: https://requestly.io/
Follow these steps to get it working:
Install the extension
Go to HttpRules
Add a new rule and add a url and a response
Enable the rule with the radio button
Go to Chrome and you should see the response is modified
You can have multiple rules with different responses and enable/disable them as required. I haven't found a way to return a different response per request when the URL is the same, unfortunately.
