When using httpmock in Rust, can one mock server handle multiple paths?

I'm testing the engine's register and poll (a health check at regular intervals) behavior by creating a mock server that stands in for the admin server.
let admin_mock_server = admin_server.mock(|when, then| {
    when.path("/register")
        .header("content-type", "application/json")
        .header_exists("content-type")
        .json_body_partial(
            r#"
            {
                "engineName": "engine_for_mock"
            }
            "#,
        );
    then.status(200)
        .header("content-type", "application/json")
        .json_body(get_sample_register_response_body());
});
After the register operation, the engine sends poll messages to the same admin server. Therefore, to test this behavior, I have to send the poll message to the same mock server.
Is there a way to set up two request-response (when-then) pairs on one mock server?
when.path("/poll");
then.status(200)
    .header("content-type", "application/json")
    .json_body(get_sample_poll_response_body());

If you look at the docs here:
https://docs.rs/httpmock/latest/httpmock/struct.MockServer.html
you can see that the mock function returns a Mock object on the mock server.
Based on this, you can add more mocks to the same server by simply repeating the same operations you did for the register endpoint.
let mock_register = admin_server.mock(|when, then| {
    ...
});
let mock_poll = admin_server.mock(|when, then| {
    ...
});
This way you have two different mocks on the same admin_server, and in your test you can call
mock_register.assert()
...
mock_poll.assert()
to verify that each of the two endpoints was actually called.

Related

I don't know why my HTTP request doesn't work in my Angular project

I created a RESTful API with Node.js, and when I tested it with Postman it worked properly and showed the correct result.
But my problem is with the request from my Angular application: when I send a request, there is no reaction in the API and it seems no request is sent to the server at all!
My API url is:
http://localhost:3000/api/menus/menujsonexport
And when I send a request via Postman it returns the JSON correctly.
And here is my request script in Angular:
private requestMenu(type: 'listsidemenu'): Observable<any> {
  let base;
  if (type === 'listsidemenu') {
    base = this.http.get<any>('http://localhost:3000/api/menus/menujsonexport',
      { headers: { Authorization: `Bearer ${this.getToken()}` } });
  }
  const requestMenu = base.pipe(
    map((data: any) => {
      return data;
    })
  );
  return requestMenu;
}
I call the request with this method:
public fetchjsonmenu() {
  this.authserv.listSideMenu()
    .pipe(
      finalize(() => { console.log('finished'); }),
      tap(x => {
        console.log(x);
      })
    );
}
But there is no reaction in my Node.js API.
Do you have any idea?
Please tell me if more information is needed to answer this question.
An Observable instance begins publishing values only when someone subscribes to it. You subscribe by calling the subscribe() method of the instance, passing an observer object to receive the response.
.subscribe is not an Angular thing.
It's a method that comes from the rxjs library, which Angular uses internally.
Imagine subscribing to a newsletter: after subscribing, every time there is a new issue, they send it to your home (the method inside subscribe gets called).
That's what happens when you subscribe to a source of magazines (which the rxjs library calls an Observable).
All the AJAX calls in Angular use this library behind the scenes, and in order to use any of them you have to call the method, e.g. get, and then call subscribe on the result, because get returns an Observable.
Also, when you write <button (click)="doSomething()">, Angular uses Observables behind the scenes and subscribes you to that source of events, which in this case is a click event.
Back to our analogy of Observables and newsletter stores: after you've subscribed, as long as there are new magazines they'll keep sending them to you, unless you unsubscribe, for which you have to remember the subscription number or id. In rxjs it would look like:
let subscription = magazineStore.getMagazines().subscribe(
  (newMagazine) => {
    console.log('newMagazine', newMagazine);
  });
And when you don't want to get the magazines anymore:
subscription.unsubscribe();
Also, the same goes for
this.route.paramMap
which returns an Observable that you then subscribe to.
My personal view is that rxjs is one of the greatest things brought to the JavaScript world, and it's even better in Angular.
There are roughly 150 rxjs operators (very similar to lodash methods); switchMap, for example, is commonly used with route.paramMap.
You need to add .subscribe() in your code after the get call; otherwise the request is never sent. For more information, check the Angular documentation on observables.
So now your script should look something like this:
let base;
if (type === 'listsidemenu') {
  base = this.http.get<any>('http://localhost:3000/api/menus/menujsonexport',
    { headers: { Authorization: `Bearer ${this.getToken()}` } });
}
const requestMenu = base.pipe(
  map((data: any) => {
    return data;
  })
);
// nothing happens until this subscribe call runs
requestMenu.subscribe();
return requestMenu;
}
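Alternatively, you can leave requestMenu as it was and subscribe where the data is actually consumed. A minimal sketch, reusing the fetchjsonmenu and listSideMenu names from the question:
public fetchjsonmenu() {
  this.authserv.listSideMenu()
    .pipe(
      finalize(() => { console.log('finished'); }),
      tap(x => {
        console.log(x);
      })
    )
    .subscribe(); // without this call, the HTTP request is never sent
}
Either way, the important part is that some piece of code subscribes to the Observable returned by http.get.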

Nodejs proxy request coalescing

I'm running into an issue with my http-proxy-middleware setup. I'm using it to proxy requests to another service which, for example, might resize images and the like.
The problem is that multiple clients might call the method multiple times and thus create a stampede on the original service. I'm now looking into a solution (what some services, e.g. Varnish, call request coalescing) that would call the service once, wait for the response, 'queue' the incoming requests with the same signature until the first is done, and then answer them all in a single go. This is different from caching results: I want to prevent calling the backend multiple times simultaneously, not necessarily cache the results.
I'm trying to find out whether something like this goes by a different name, or whether I'm missing something that others have already solved some way, but I can't find anything.
As the use case seems pretty basic for a reverse-proxy setup, I would have expected a lot of hits in my searches, but since the problem space is pretty generic I'm not getting anything.
Thanks!
A colleague of mine helped me hack together my own answer. It's currently used as an (Express) middleware for specific GET endpoints and basically hashes the request into a map and starts a new, separate request. Concurrent incoming requests are hashed, matched against that map, and answered from the callback of that separate request, so the backend call is reused. This also means that if the first response is particularly slow, all coalesced requests are too.
This seemed easier than hacking it into http-proxy-middleware, but oh well, this got the job done :)
const axios = require('axios');

// Map of in-flight requests: query hash -> list of responses waiting for the same result
const responses = {};

module.exports = (req, res) => {
  const queryHash = `${req.path}/${JSON.stringify(req.query)}`;
  if (responses[queryHash]) {
    // Same signature already in flight: queue this response and wait for the first call
    console.log('re-using request', queryHash);
    responses[queryHash].push(res);
    return;
  }
  console.log('new request', queryHash);
  const axiosConfig = {
    method: req.method,
    url: `[the original backend url]${req.path}`,
    params: req.query,
    headers: {}
  };
  if (req.headers.cookie) {
    axiosConfig.headers.Cookie = req.headers.cookie;
  }
  responses[queryHash] = [res];
  axios.request(axiosConfig).then((axiosRes) => {
    // Answer every response that queued up while the backend call was running
    responses[queryHash].forEach((waitingRes) => {
      waitingRes.json(axiosRes.data);
    });
    responses[queryHash] = undefined;
  }).catch((err) => {
    responses[queryHash].forEach((waitingRes) => {
      waitingRes.status(500).json(false);
    });
    responses[queryHash] = undefined;
  });
};
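For context, a minimal sketch of how such a handler might be wired up in Express; the coalesce.js file name and the /images route are assumptions, not from the original answer:
const express = require('express');
const coalesce = require('./coalesce'); // the handler above, assumed file name

const app = express();
// only coalesce the expensive, idempotent GET endpoints
app.get('/images/*', coalesce);
app.listen(3000);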

Adding object to supertest

I'm trying to test one of my routes, which usually expects an object on the request object (e.g. req.exampleData = { }).
I've tried looking for examples, but I've only found .set, which attaches to the request headers.
Ideally, I would want something like:
await request(app)
  .get('/api/testRoute')
  .attach('exampleData', { })
Is such a thing possible for supertest?
Perhaps you can use field, as in
await request(app)
  .get('/api/testRoute')
  .field('exampleData', JSON.stringify({}))
and on your server
JSON.parse(req.body.exampleData) // {}
(multipart form fields arrive as strings, so the object needs to be serialized on the client side and parsed on the server side).

How to mock external service when testing a NodeJS API

I have a JSON API built with Koa which I am trying to cover with integration tests.
A simple test would look like this:
describe("GET: /users", function() {
  it("should respond", function (done) {
    request(server)
      .get('/api/users')
      .expect(200, done);
  });
});
Now the issue comes when the actions behind a controller, let's say saveUser at POST /users, use external resources. For instance, I need to validate the user's phone number.
My controller looks like this:
save: async function(ctx, next) {
  const userFromRequest = await parse(ctx);
  try {
    // validate data
    await ctx.repo.validate(userFromRequest);
    // validate mobile code
    await ctx.repo.validateSMSCode(
      userFromRequest.mobile_number_verification_token,
      userFromRequest.mobile_number.prefix + userFromRequest.mobile_number.number
    );
    const user = await ctx.repo.create(userFromRequest);
    return ctx.data(201, { user });
  } catch (e) {
    return ctx.error(422, e.message, e.meta);
  }
}
I was hoping to be able to mock ctx.repo on the request object, but I can't seem to get hold of it from the test, which means that my tests are actually hitting the phone number verification service.
Is there any way I could avoid hitting that verification service?
Have you considered using a mocking library like https://github.com/mfncooper/mockery?
Typically, when writing tests requiring external services, I mock the service client library module. For example, using mocha:
const mockery = require('mockery');
const repo = require('your-repo-module');

before(function() {
  mockery.enable();
  repo.validateSMSCode = function() {...};
  mockery.registerMock('your-repo-module', repo);
});
This way, every time you require your-repo-module, the mocked module will be loaded rather than the original one. Until you disable the mock, obviously...
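For symmetry, a teardown sketch using mockery's own cleanup calls (not shown in the original answer):
after(function() {
  mockery.deregisterAll();
  mockery.disable();
});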
app.context is the prototype from which ctx is created. You may add additional properties to ctx by editing app.context. This is useful for adding properties or methods to ctx to be used across your entire app, which may be more performant (no middleware) and/or easier (fewer require()s), at the expense of relying more on ctx, which could be considered an anti-pattern.
app.context.someProp = "Some Value";
app.use(async (ctx) => {
  console.log(ctx.someProp);
});
For your sample, you can re-define app.context.repo.validateSMSCode like this, assuming that you have the following setup lines in your test:
import app from '../app'
import supertest from 'supertest'

app.context.repo.validateSMSCode = async function(ctx, next) {
  // Your logic here.
};

const request = supertest.agent(app.listen())
After this, the re-defined app.context.repo.validateSMSCode that you set up in your test will be used instead of the original method.
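Putting it together, a minimal sketch of what the test could then look like; the request body and the 201 expectation are assumptions based on the controller above, not part of the original answer:
describe("POST: /users", function() {
  it("creates a user without hitting the SMS service", function (done) {
    // the stubbed validateSMSCode defined above is used instead of the real service
    request
      .post('/api/users')
      .send({
        mobile_number_verification_token: 'dummy-token',
        mobile_number: { prefix: '+1', number: '5550100' }
      })
      // assumes repo.validate and repo.create succeed for this payload
      .expect(201, done);
  });
});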
https://github.com/koajs/koa/blob/v2.x/docs/api/index.md#appcontext
https://github.com/koajs/koa/issues/652

Fetch data from multiple tables by sending only one request

I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge all the results with each other because of Node.js's asynchronous execution. What I have now sends a lot of GET requests to Node.js to get the data from all the tables, and because of this the performance of my site has become slower. Please help if anyone has any idea.
I would create a method which aggregates the results from each of the requests and sends the response back. Basically each of your three async db calls would pass their data to the same method. That method would check to see if it had all of the data it needed to complete the request, and if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
  var results = {};
  db.getUsers(function(data) {
    aggregate('users', data);
  });
  db.getPosts(function(data) {
    aggregate('posts', data);
  });
  db.getComments(function(data) {
    aggregate('comments', data);
  });
  function aggregate(name, data) {
    results[name] = data;
    if (results.users && results.posts && results.comments) {
      res.send(results);
    }
  }
}
This is greatly simplified; you should of course also check for errors and timeouts on the db calls, but this will let you wait for all the async calls to complete before sending the data.
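If your db layer can return promises, the same aggregation pattern can be written more compactly with Promise.all. A sketch under that assumption (promise-returning getUsers/getPosts/getComments are not part of the original answer):
async function handleRequest(req, res) {
  try {
    // run all three queries in parallel and wait until every one has resolved
    const [users, posts, comments] = await Promise.all([
      db.getUsers(),
      db.getPosts(),
      db.getComments()
    ]);
    res.send({ users, posts, comments });
  } catch (err) {
    res.status(500).send({ error: err.message });
  }
}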
