I've got a simple FeathersJS server. I want to change certain things in its config without restarting it, such as disallowing the creation of new users.
So, basically, I want to change a couple of variables at runtime.
It definitely sounds like something that's got complete solutions written somewhere.
My thought is to expose a callback that changes the variables of interest according to its arguments, then either call the callback from something like inquirer or expose a WebSocket that invokes it accordingly.
Please tell me how to best solve this problem, and - if possible - point me to some existing solutions as well.
To change something inside your nodejs process from outside the process, you need to expose some external connectivity that can make the change for you. There is an infinite set of choices, but a webSocket/socket.io server or an http server are probably the most common.
If this is typically just a single transaction, not a whole sequence of changes, then a simple http server on a port that is not accessible from the outside world is probably the most appropriate way to do this. You then just create routes on the http server that accept the desired changes, and the code in the routes themselves changes the internal variables accordingly. If you choose some other interface, such as a webSocket or socket.io server, the concept would be the same.
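For illustration, here is a minimal sketch of that internal-http approach (Express is used since Feathers sits on top of it; the settings object, route path and port are illustrative assumptions, not Feathers APIs):

const express = require('express');

// hypothetical runtime settings that your services/hooks would consult
const settings = { allowUserCreation: true };

const admin = express();
admin.use(express.json());

// POST {"allowUserCreation": false} to flip the flag without a restart
admin.post('/settings', (req, res) => {
  if (typeof req.body.allowUserCreation === 'boolean') {
    settings.allowUserCreation = req.body.allowUserCreation;
  }
  res.json(settings);
});

// bind to localhost only, so the admin interface is unreachable from outside
admin.listen(3030, '127.0.0.1');

A Feathers hook on the users service could then reject create calls while settings.allowUserCreation is false.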
Related
I'm very new to Node.js, so I might just not be getting it, but after searching quite a bit, and trying a few different solutions, I am still not able to find a decent way to mock API responses using Node for acceptance testing.
I've got a JavaScript app (written in Elm, actually) that interacts with an API (pretty common, I imagine), and I want to write some acceptance tests... so I set up WebdriverIO with Selenium and Mocha, write some tests, and of course now I need to mock some API responses so that I can set up some theoretical scenarios to test under.
mock-api-server: Looked pretty nice, but there's no way to adjust the headers getting sent back from the server!
mock-http-server: Also looked pretty nice, lets me adjust headers, but there's no way to reset the mock responses without shutting down the whole server... !? And that has issues because the server won't shut down while the browser window is still open, so that means I have to close and relaunch the browser just to clear the mocks!
json-server: Simple and decent way to mock some responses, but it relies entirely on files on disk for the responses. I want something I can configure from within a test run without reading and writing files to disk.
Am I missing something? Is this not how people do acceptance testing in the Node universe? Does everyone just use a fixed set of mock data for their entire test suite? That just sounds insane to me... Particularly since it seems like it wouldn't be that hard to write a good one based on an Express server that has all the necessary features... does it exist?
Necessary Features:
Server can be configured and launched from JavaScript
Responses (including headers) can be configured on the fly
Responses can also be reset easily on the fly, without shutting down the server.
I hit this problem too, so I built one: https://github.com/pimterry/mockttp
In terms of the things you're looking for, Mockttp:
Lets you start & reconfigure the server dynamically from JS during the test run, with no static files.
Lets you adjust headers
Lets you reset running servers (though I'd recommend shutting down & restarting anyway - with Mockttp that takes milliseconds, is clear & easily automatable, and gives you some nice guarantees)
On top of that, it:
Is configurable from both Node & browsers with identical code, so you can test universally
Can handle running tests in parallel for quicker testing
Can fake HTTPS servers, self-signing certificates automatically
Can mock as an intercepting proxy
Has a bunch of nice debugging support for when things go wrong (e.g. unmatched requests come back with a readable explanation of the current configuration, and example code that would make the request succeed)
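For a flavor of what that looks like in a test (a sketch based on Mockttp's documented usage; exact method names can vary between versions):

const mockttp = require('mockttp');
const server = mockttp.getLocal();

describe('my app', () => {
  beforeEach(() => server.start(8080));
  afterEach(() => server.stop());

  it('handles the stubbed response', async () => {
    // configure a response, headers included, on the fly mid-test
    await server.forGet('/api/users').thenReply(200, '[]', {
      'content-type': 'application/json'
    });
    // ...point the browser under test at http://localhost:8080 here...
  });
});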
Just to quickly comment on the other posts suggesting testing in-process: I really wouldn't. Partly because of a whole bunch of limitations (you're tied to a specific environment, potentially even a specific Node version; you have to mock for the whole process, so no parallel tests; and you can't mock subprocesses), but mainly because it's not truly representative. For a very, very small speed cost, you can test with real HTTP, and know that your requests & responses will definitely work in reality too.
Is this not how people do acceptance testing in the Node universe? Does everyone just use a fixed set of mock data for their entire test suite?
No. You don't have to make actual HTTP requests to test your apps.
All good test frameworks let you fake HTTP by running the routes and handlers without standing up a real server. You can also mock the functions that make the actual HTTP requests to external APIs (which should be abstracted away in the first place), so no actual HTTP requests need to take place there either.
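For example, with supertest (one common choice; the route here is made up), you can drive an Express app's handlers straight from test code, with no separately managed server:

const request = require('supertest');
const express = require('express');

const app = express();
app.get('/users', (req, res) => res.json([]));

// exercises the real route handler; supertest wires the request in for you
request(app)
  .get('/users')
  .expect('Content-Type', /json/)
  .expect(200)
  .then(() => console.log('route behaves as expected'));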
And if that's not enough, you can always write a trivially simple server using Express, Hapi, Restify, LoopBack or some other framework, or the plain http module, or even the net module, to provide the mock data yourself. How low you go depends on how much control you need: for example, you should always test invalid responses that don't use the HTTP protocol correctly, broken connections, incomplete connections, slow connections etc., and for that you may need to use lower-level APIs in Node.
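As a sketch of how little code that takes (everything here, including the control endpoints, is made up for illustration):

const express = require('express');

const app = express();
app.use(express.json());

let stubs = {}; // path -> { status, headers, body }

// tests reconfigure responses on the fly through this endpoint...
app.post('/__stub', (req, res) => {
  stubs[req.body.path] = req.body.response;
  res.sendStatus(204);
});

// ...and reset them without shutting the server down
app.post('/__reset', (req, res) => {
  stubs = {};
  res.sendStatus(204);
});

// everything else is answered from the current stub table
app.all('*', (req, res) => {
  const stub = stubs[req.path];
  if (!stub) return res.sendStatus(404);
  res.status(stub.status).set(stub.headers || {}).send(stub.body);
});

app.listen(9000);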
By the way, you also always need to test responses with invalid JSON, because people often wrongly assume that the JSON they get is always valid, which it is not. See this answer for why it is particularly important:
Calling a JSON API with Node.js
Particularly since it seems like it wouldn't be that hard to write a good one based on an Express server that has all the necessary features... does it exist?
Not everything that "wouldn't be that hard to write" necessarily has to exist. You may need to write it. Hey, you even have a road map ready:
Necessary Features:
Server can be configured and launched from JavaScript
Responses (including headers) can be configured on the fly
Responses can also be reset easily on the fly, without shutting down the server.
Now all you need to do is choose a name, create a repo on GitHub, create a project on npm and start coding.
You know, even if "it wouldn't be that hard to write", that doesn't mean it will write itself. Welcome to the open-source world, where instead of complaining that something doesn't exist, people just write it.
You could try nock. https://github.com/node-nock
It supports all of your feature requests.
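A short sketch of nock's documented API (note that nock intercepts HTTP inside the Node process itself, so it only catches requests made from that same process):

const nock = require('nock');

// configure a response, headers included, from within a test
nock('http://api.example.com')
  .get('/users')
  .reply(200, [{ id: 1 }], { 'X-Custom': 'yes' });

// reset all mocks between tests without restarting anything
nock.cleanAll();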
I have a question regarding the examples out there for using Node.js, Express and Jade for templates.
All the examples show how to build some sort of user administration interface where you can add user profiles, delete them and manage them.
Those are considered beginner's guides to Node.js. My question is around the fact that if I have 10 users concurrently accessing the same interface and doing the same operations, surely Node.js will block the requests of the other users, as they are all coming in on the same port.
So let's say I am pulling out a list of users, which may be something like 10,000 records. Yes, I can do paging, but that is not the point. While I am getting the list from the server, another 4 users want to access the application. They have to wait for my process to end. That is my question - how can one avoid that using Node.js & Express?
I have been on this issue for a couple of months! I currently have something in place that does the following:
Run the main processing of stuff on a port
Run a Socket.io process on a different port
Use a sticky session
The idea is that I do a request (like getting a list of items), and immediately respond with some request reference but without the requested items, thus releasing the port.
In the background, "asynchronously", I then do the process of getting the items. When that has completed, I do an http request from the main node process to the socket.io node process, SENDING the items through.
When that is done I then perform a socket.io emit WITH the data and the initial request reference so that the correct user gets the message.
On the client side I have an event listener on the socket, which then completes the ajax request by populating the list.
I have had SOME success in doing this! It actually works to a degree! I have an issue online that complicates matters, due to IP addresses and socket.io playing funny.
I also have multiple workers using clustering. I use it in the following manner:
I create a master worker
I spawn workers
I take any connection request and pass it to the relevant worker.
I do that for the main node requests as well as for the socket requests. Like I said, I use 2 ports!
As you can see, I have put a lot of work into this and I am not getting to a proper solution!
My question is this - have I gone all around the world 10 times only to have missed something simple? This sounds way too complicated for achieving a non-blocking, Node.js-only website.
I asked myself - surely all these tutorials would not have missed something as important as this! But they did!
I have researched, read, and tested a lot of code - this is the very first time I have asked anything on Stack Overflow!
Thank you for any assistance.
P.S. One example of the same approach is this: I request a report using Jasper, I pass parameters, and with the "delayed ajax response" approach described above I simply release the port, while in the background a very intensive report is generated (and this can be a very intensive process, as a lot of calculations are performed)..! I really don't see a better approach - any help will be super appreciated!
Thank you for taking the time to read!
I'm sorry to say it, but yes, you have been going around the world 10 times only to have been missing something simple.
It's obvious that your previous knowledge/experience with webservers comes from a blocking point of view; if that were the case here, your concerns would be valid.
Node.js is a runtime built around a single thread executing your code, which means that if it performs any blocking operation, no one else can get anything done.
There are some operations that can block in Node, like the synchronous variants of reading/writing to disk. However, most Node operations are asynchronous.
I believe you are familiar with the term, so I won't go into details. What asynchronous operations allow Node to do is keep this single thread idle as much as possible; by idle I mean open for other work. If your code is fully asynchronous, then handling 4 concurrent users (or even 400) shouldn't be a problem, even for a single thread.
Now, in regard to your initial problem of ports: once a request is received on a given port, Node.js executes whatever code you have written for it until it encounters an asynchronous operation; as soon as that happens, it is available to pick up more requests on the same port.
The second problem you inquire about is the database operation. In this case, Node.js sends the query to the database (which takes no time at all), and the database does the actual execution of the query. In the meantime, Node is free to do whatever it wants until the database is finished and lets Node know there is a result to fetch.
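Here is a runnable sketch of that idea, with setTimeout standing in for the database (a real driver behaves the same way):

const express = require('express');
const app = express();

app.get('/users', (req, res) => {
  // the "query" takes 2 seconds, but the thread is not held hostage
  setTimeout(() => res.json([{ id: 1 }]), 2000);
});

// while one request waits on its timer, the same single thread keeps
// accepting and serving other requests on the same port
app.listen(3000);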
You can recognize async operations by their structure: my_function(..., ..., callback). A function that takes a callback is, in most cases, async.
So, bottom line: don't worry about the problems around blocking IO, as you will hardly encounter any in Node. Use a single port if you want (by creating multiple child processes with the cluster module, you can even have multiple Node instances sharing the same port).
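A minimal sketch of that cluster setup (the port is arbitrary):

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // the master forks one worker per CPU; the workers share the listening socket
  for (let i = 0; i < os.cpus().length; i++) cluster.fork();
} else {
  http.createServer((req, res) => {
    res.end('hello from worker ' + process.pid);
  }).listen(3000);
}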
Hope this explains it well enough. If you have any further questions, let me know :)
I'm struggling with a technical issue, and because I'm pretty new to the Node.js world, I think I don't have the proper good practices and tools to help me solve this.
Using the well-known request module, I'm making a streaming proxy from a remote server to the client. Almost everything is fine and works properly until a certain point: if there are too many requests at the same time, the server no longer responds. It actually does receive the client request but is unable to go through the stream process and serve the content.
What I'm currently doing:
Creating a server with the http module's http.createServer
Getting the remote URL from a PHP script using exec
Instantiating the stream
How I did it:
http://pastebin.com/a2ZX5nRr
I tried to investigate the pooling stuff but did not understand everything; likewise, the pool maxSockets option was recently added, but it did not help me. I had also previously been setting http.globalAgent.maxSockets to Infinity, but I read that this has not been limited in Node.js for a while now, so it does not help.
See here: https://nodejs.org/api/http.html#http_http_globalagent
I also read this: Nodejs Max Socket Pooling Settings, but I'm wondering what the difference is between a custom agent and the global one.
I believed that it could come from the server, but I tested it on a very small one and a bigger one and it was not coming from there. I think it is definitely coming from my app, which needs to be better designed. Indeed, each time I restart the app instance it works again. Also, if I start a fork of the server on another port while the other one is not serving anything, it will work. So it might not be about resources.
Do you have any clues, tools or anything else that may help me understand and debug what is going on?
An npm module that can help handle streams properly:
https://www.npmjs.com/package/pump
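A sketch of what that looks like in a proxy like yours (the remote host is a placeholder): pump tears both streams down when either side errors or closes early, so no pipe is left dangling:

const http = require('http');
const request = require('request');
const pump = require('pump');

http.createServer((req, res) => {
  pump(request('http://remote.example.com' + req.url), res, (err) => {
    if (err) console.error('proxy stream failed:', err.message);
  });
}).listen(8000);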
I made a few tests, and I think I've found what I was looking for: the unpipe thing. More info here:
https://nodejs.org/api/stream.html#stream_readable_unpipe_destination
You can also read this; it led me to understand a few things about pipes remaining open when the target fails:
http://www.bennadel.com/blog/2679-how-error-events-affect-piped-streams-in-node-js.htm
So here is what I've done: I'm currently unpiping when the stream's end event is fired. However, I guess you can do this in different ways; it depends on how you want to handle things, but you may also unpipe on an error from the source/target.
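A rough sketch of that approach (assuming the request module's readable stream; the remote host is a placeholder):

const http = require('http');
const request = require('request');

http.createServer((req, res) => {
  const source = request('http://remote.example.com' + req.url);

  source.pipe(res);

  // detach the pipe once the source is done...
  source.on('end', () => source.unpipe(res));

  // ...and on errors from either side, so nothing stays half-open
  source.on('error', () => {
    source.unpipe(res);
    res.end();
  });
  res.on('error', () => source.unpipe(res));
}).listen(8000);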
Edit: I still have issues; it seems that the stream now unpipes when it does not have to. I'll have to double-check this.
Use case:
My Node.js server starts with a configuration wizard that allows the user to change the port and scheme, and even update the Express routes.
Question:
Is it possible to apply such configuration changes on the fly? Restarting the server would definitely bring all the changes online, but I'm not sure how to trigger that from code.
Changing core configuration on the fly is rarely practiced, and Node.js and most http frameworks do not support it at this point either.
Modifying the configuration and then restarting the server is a completely valid solution, and I suggest you use it.
To restart the server programmatically, you have to execute logic outside of Node.js, so that it can keep running once the Node.js process is killed. Assuming you are running the Node.js server on Linux, a Bash script sounds like the best tool available to you.
Implementation will look something like this:
Client presses a switch somewhere on your site powered by node.js
Node.js then executes some JavaScript code which instructs your OS to execute some bash script, let's say script.sh
script.sh restarts node.js
Done
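A sketch of steps 1 and 2 (the endpoint and script path are made up; spawning detached lets script.sh outlive the Node.js process it is about to kill):

const express = require('express');
const { spawn } = require('child_process');

const app = express();

app.post('/restart', (req, res) => {
  res.send('restarting...');
  const child = spawn('/path/to/script.sh', [], {
    detached: true,
    stdio: 'ignore'
  });
  child.unref(); // let the script keep running after this process dies
});

app.listen(3000);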
If any of the steps is difficult, ask about it. Though step 1 is something you are likely handling yourself already.
I know this question was asked a long time ago but since I ran into this problem I will share what I ended up doing.
For my problem, I needed to restart the server since the user is allowed to change the port on their website. What I ended up doing is wrapping the whole server creation (https.createServer/server.listen) into a function called startServer(port). I call this function at the end of the file with a default port. The user changes the port by accessing the endpoint /changePort?port=3000. That endpoint calls another function, restartServer(server, res, port), which calls startServer(port) with the new port and then redirects the user to the site at the new port.
Much better than restarting the whole nodejs process.
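A condensed sketch of that approach (plain http here to stay self-contained; https.createServer works the same way):

const http = require('http');
const express = require('express');

const app = express();
let server;

function startServer(port) {
  server = http.createServer(app).listen(port);
}

app.get('/changePort', (req, res) => {
  const port = parseInt(req.query.port, 10);
  res.redirect('http://' + req.hostname + ':' + port + '/');
  // once the redirect is flushed, swap listeners; close() waits for
  // open sockets to finish before its callback fires
  res.on('finish', () => server.close(() => startServer(port)));
});

startServer(3000); // default port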
I'm trying to run some server-only code from an event on the client in derby.js.
I'm using x-bind to bind the event on the view like so:
<a x-bind="click: func">click me</a>
and on the app:
exports.func = function (e, el, next) {
  // I want to run some server code here, but it runs on the client only
};
So:
Can this be done in any way?
If not, is there any way to use sockets in a 'native' way on derby.js?
I simply don't want to fall back to ajax with server routes when all the rest is real time.
You can route the request onto the server via the model (model.fetch() & model.subscribe()). If it's just retrieving some data from the server, you are basically all set. Keep a reference to the model for when you need it (in the app.ready callback, as pointed out by switz).
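A rough sketch of that model-based approach (assuming Derby's Racer model API; the 'items' collection is illustrative):

app.ready(function (model) {
  // fetch() routes the read through the server-side model - no ajax needed
  model.fetch('items', function (err) {
    if (err) return console.error(err);
    var items = model.get('items');
    // use the data here
  });
});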
To use sockets directly, or to extend the model (which uses sockets in the background), see
https://groups.google.com/forum/?pli=1#!topic/derbyjs/60gouek7334