Let's say we have a Node.js Express web app:
Is there any performance hit when compiling Sass with node-sass manually (via a watcher in WebStorm, for example) versus using Express middleware so it gets compiled on each HTTP request?
The first approach incurs the performance hit once, while you're developing. The second incurs the same hit every time an HTTP request comes in (or at least once, if you use a CDN to cache assets). So the second is a bad idea, IMO. Sass compilation should happen once, before deploying; you could even say the production environment need not have the Sass source files at all.
I think you could actually use node-sass-middleware; it spares you the overhead of setting up a watcher or a build step during development. Just run the middleware in the development environment only, and omit it otherwise.
if (process.env.NODE_ENV === "development") {
  app.use(/* your Sass middleware here */);
}
Easy setup + No performance worries
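For example, a dev-only wiring might look roughly like this (a sketch using the options node-sass-middleware documents, in the style the Express generator sets up; paths are illustrative):

const path = require('path');
const express = require('express');
const app = express();

if (process.env.NODE_ENV === "development") {
  const sassMiddleware = require('node-sass-middleware');
  app.use(sassMiddleware({
    src: path.join(__dirname, 'public'),   // .scss sources under public/
    dest: path.join(__dirname, 'public'),  // compiled .css written alongside them
    indentedSyntax: false,                 // false = .scss syntax, true = .sass
    sourceMap: true
  }));
}

// The static middleware then serves the compiled CSS
// (pre-built before deploy in production, generated on the fly in development).
app.use(express.static(path.join(__dirname, 'public')));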
I have an express server inside a Cloud Run Docker container.
I'm getting those logs:
Are those generated by the express package? Or are those logs somehow generated by Cloud Run?
Here is what the Express docs say: https://expressjs.com/en/guide/debugging.html
Express uses the debug module internally to log information about route matches, middleware functions that are in use, application mode, and the flow of the request-response cycle.
But it does not give much detail on what those logs are and how you enable or disable them.
Should I leave them on? Won't it hurt my server's performance if it logs every request like that? This is with NODE_ENV === "production".
These log entries are generated by the Cloud Run platform, not by your Express server. Performance isn't impacted by these logs, and in any case, you can't deactivate them.
You could exclude them to save logging capacity (and save money), but I don't recommend it. They provide three of the four golden signals of your application (error rate, latency, traffic), which is very important for production monitoring.
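As for Express's own internal logging mentioned in its debugging guide: it uses the debug module and stays silent unless you opt in with the DEBUG environment variable when starting the process (e.g. DEBUG=express:* node app.js, where app.js stands in for whatever your entry point is). Leaving DEBUG unset, which is the default, keeps that logging off in production, so it adds no per-request overhead.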
I'm very new to Node.js, so I might just not be getting it, but after searching quite a bit, and trying a few different solutions, I am still not able to find a decent way to mock API responses using Node for acceptance testing.
I've got a JavaScript app (written in Elm, actually) that interacts with an API (pretty common, I imagine), and I want to write some acceptance tests... so I set up WebdriverIO with Selenium and Mocha, write some tests, and of course now I need to mock some API responses so that I can set up some theoretical scenarios to test under.
mock-api-server: Looked pretty nice, but there's no way to adjust the headers getting sent back from the server!
mock-http-server: Also looked pretty nice, and lets me adjust headers, but there's no way to reset the mock responses without shutting down the whole server!? And that has issues because the server won't shut down while the browser window is still open, which means I have to close and relaunch the browser just to clear the mocks!
json-server: Simple and decent way to mock some responses, but it relies entirely on files on disk for the responses. I want something I can configure from within a test run without reading and writing files to disk.
Am I missing something? Is this not how people do acceptance testing in the Node universe? Does everyone just use a fixed set of mock data for their entire test suite? That just sounds insane to me... Particularly since it seems like it wouldn't be that hard to write a good one based on an Express server that has all the necessary features... does it exist?
Necessary Features:
Server can be configured and launched from javascript
Responses (including headers) can be configured on the fly
Responses can also be reset easily on the fly, without shutting down the server.
I hit this problem too, so I built one: https://github.com/pimterry/mockttp
In terms of the things you're looking for, Mockttp:
Lets you start & reconfigure the server dynamically from JS during the test run, with no static files.
Lets you adjust headers
Lets you reset running servers (though I'd recommend shutting down & restarting anyway - with Mockttp that takes milliseconds, is clear & easily automatable, and gives you some nice guarantees)
On top of that, it:
Is configurable from both Node & browsers with identical code, so you can test universally
Can handle running tests in parallel for quicker testing
Can fake HTTPS servers, self-signing certificates automatically
Can mock as an intercepting proxy
Has a bunch of nice debugging support for when things go wrong (e.g. unmatched requests come back with a readable explanation of the current configuration, and example code that would make the request succeed)
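For instance, a test setup might look roughly like this (a sketch assuming Mockttp's getLocal()/start()/stop() API and its request-matching helpers; check the current docs, since method names have changed across versions):

const mockttp = require('mockttp');
const server = mockttp.getLocal();

beforeEach(() => server.start(8080));  // a fresh mock server per test takes milliseconds
afterEach(() => server.stop());

it("shows an error banner when the API is down", async () => {
  // Configure a response, headers included, on the fly from JS:
  await server.forGet("/api/products").thenReply(
    503,
    JSON.stringify({ error: "maintenance" }),
    { "content-type": "application/json" }
  );

  // ...drive the browser with WebdriverIO here and assert on the UI...
});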
Just to quickly comment on the other posts suggesting testing in-process: I really wouldn't. Partly because of a whole bunch of limitations (you're tied to a specific environment, potentially even a specific Node version; you have to mock for the whole process, so no parallel tests; and you can't mock subprocesses), but mainly because it's not truly representative. For a very small speed cost, you can test with real HTTP and know that your requests & responses will definitely work in reality too.
Is this not how people do acceptance testing in the Node universe? Does everyone just use a fixed set of mock data for their entire test suite?
No. You don't have to make actual HTTP requests to test your apps.
All good test frameworks let you fake HTTP by running the routes and handlers without making network requests. You can also mock the functions that make the actual HTTP requests to external APIs (which should be abstracted away in the first place), so no real HTTP requests need to take place there either.
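For example, with the supertest library you can exercise an Express app's routes directly from a test, without standing up a separate server yourself (a sketch; the route and data are made up):

const request = require('supertest');
const express = require('express');

const app = express();
app.get('/users', (req, res) => res.json([{ id: 1, name: 'Ada' }]));

it('returns the user list', async () => {
  const res = await request(app).get('/users').expect(200);  // no explicit listen() needed
  // assert on res.body here
});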
And if that's not enough, you can always write a trivially simple server using Express, Hapi, Restify, Loopback, or some other framework, or the plain http module, or even the net module (depending on how much control you need; for example, you should always test invalid responses that don't use the HTTP protocol correctly, as well as broken, incomplete, and slow connections, and for that you may need the lower-level APIs in Node) to provide the mock data yourself.
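A hand-rolled version covering the feature list from the question can be quite small. Here is a sketch (the control endpoints /__stub and /__reset are invented for illustration):

const express = require('express');

function createMockApi() {
  const app = express();
  let stubs = {};  // "METHOD /path" -> { status, headers, body }

  // Configure responses (including headers) on the fly, from within a test run
  app.post('/__stub', express.json(), (req, res) => {
    const { method, path, status, headers, body } = req.body;
    stubs[method + ' ' + path] = { status, headers, body };
    res.sendStatus(204);
  });

  // Reset everything without shutting the server down
  app.post('/__reset', (req, res) => {
    stubs = {};
    res.sendStatus(204);
  });

  // Everything else is answered from the configured stubs
  app.use((req, res) => {
    const stub = stubs[req.method + ' ' + req.path];
    if (!stub) return res.sendStatus(404);
    res.status(stub.status).set(stub.headers || {}).send(stub.body);
  });

  return app;
}

// Launched from JS, e.g. in a mocha before() hook:
// const server = createMockApi().listen(3001);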
By the way, you also always need to test responses with invalid JSON, because people often wrongly assume that the JSON they get is always valid, which it is not. See this answer for why that is particularly important:
Calling a JSON API with Node.js
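On the client side, the usual guard is simply to treat JSON.parse as fallible, something like:

let data;
try {
  data = JSON.parse(rawBody);  // rawBody: whatever string the API returned
} catch (err) {
  // handle the malformed payload explicitly instead of letting the app crash
}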
Particularly since it seems like it wouldn't be that hard to write a good one based on an Express server that has all the necessary features... does it exist?
Not everything that "wouldn't be that hard to write" necessarily has to exist. You may need to write it. Hey, you even have a road map ready:
Necessary Features:
Server can be configured and launched from javascript
Responses (including headers) can be configured on the fly
Responses can also be reset easily on the fly, without shutting down the server.
Now all you need to do is choose a name, create a repo on GitHub, create a project on npm, and start coding.
You know, even if "it wouldn't be that hard to write", that doesn't mean it will write itself. Welcome to the open-source world, where instead of complaining that something doesn't exist, people just write it.
You could try nock. https://github.com/node-nock
It supports all of your feature requests.
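For example (a sketch; note that nock intercepts HTTP made from the Node process itself, and the host and route here are made up):

const nock = require('nock');

// Configure a response, including headers, on the fly
nock('https://api.example.com')
  .get('/products')
  .reply(200, [{ id: 1, name: 'Widget' }], { 'X-Total-Count': '1' });

// ...run the code under test that calls https://api.example.com/products...

// Reset all mocks between tests without tearing anything down
nock.cleanAll();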
I'm wondering whether Node.js uses a cache for the following scenario, or whether a module for that exists:
Say you have, for example, a web portal whose start page shows 20 products with images; every time, the server has to fetch the images from the HDD, or at best from an SSD.
Just finding each image takes the server about 5-7 ms. When 50 users visit the start page at the same time, that's 20 images * 5 ms * 50 users = 5000 ms just to locate the images on the HDD.
So it would be nice if there were a way to keep frequently used files (images, CSS, HTML, and so on) in memory: you just define the cache size, for example 50 MB, and the module/Node.js keeps the most-used files in the cache.
Node.js itself (by default) doesn't do any caching, although OS and other lower layer elements (e.g., HDD) may do it, speeding up consecutive reads significantly.
If you want to cache HTTP responses in Node.js, there is the http-cache library (https://www.npmjs.com/package/http-cache) and the node-request-caching library (https://www.npmjs.com/package/node-request-caching). For caching files you can use filecache (https://www.npmjs.com/package/filecache), and for serving static files, serve-static (https://www.npmjs.com/package/serve-static).
If you're using a framework such as Express, it's not that simple anymore; for example, running Express in production mode causes it to cache some things (such as view templates and CSS). Also note that res.sendFile streams the file directly to the client (or possibly to a proxy server such as nginx).
However, even Express's own page on production best practices (http://expressjs.com/en/advanced/best-practice-performance.html) advises using a separate caching proxy:
Cache request results
Another strategy to improve the performance in production is to cache the result of requests, so that your app does not repeat the operation to serve the same request repeatedly.
Use a caching server like Varnish or Nginx (see also Nginx Caching) to greatly improve the speed and performance of your app.
For other recommendations on speeding up Node.js, take a look at https://engineering.gosquared.com/making-dashboard-faster or http://www.sitepoint.com/10-tips-make-node-js-web-app-faster/
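If you do want the simple in-memory file cache the question describes, a minimal hand-rolled sketch (purely illustrative, with nothing more than a size cap, no eviction or invalidation) could look like this:

const fs = require('fs').promises;

const MAX_CACHE_BYTES = 50 * 1024 * 1024;   // e.g. a 50 MB cap, as in the question
const cache = new Map();                    // file path -> Buffer
let cacheSize = 0;

async function readFileCached(filePath) {
  if (cache.has(filePath)) {
    return cache.get(filePath);             // served from memory, no disk access
  }
  const data = await fs.readFile(filePath); // first hit still reads from disk
  if (cacheSize + data.length <= MAX_CACHE_BYTES) {
    cache.set(filePath, data);
    cacheSize += data.length;
  }
  return data;
}

// e.g. in a route handler:
// app.get('/img/:name', async (req, res) => res.type('png').send(await readFileCached(...)));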
The built-in web server in Node.js doesn't implement any content cache. As #prc322 says, the specific answer depends on your technology stack on top of Node.js (framework, middleware, etc.). On the other hand, Nginx is widely used as the web server for static assets; normally the two are combined so that static assets are served by Nginx and the application logic (REST services, etc.) is handled by the Node.js application. Content caching is also part of the HTTP standard, and you can benefit from it by putting an HTTP cache such as Varnish in front of your web application.
If there's any. I'm not really into web technologies, but I have to understand some awful code written by someone else in Node.
All Node.js apps are npm modules once you execute npm init. After that point, you can publish the module by executing npm publish, assuming you gave it a unique name in package.json.
If the app isn't meant to return anything, then there is no need to export anything. However, it's almost always worth exporting something to allow for unit testing deeper than just starting the app as an http server and sending it requests.
It is also sometimes useful to modify the way your app runs based on whether it is being required as a module or executed as an app. For example, say I have an Express REST API server. I can run it as a standalone server on api.example.com, and I can also require it into another application and mount it there directly to avoid CORS issues, without duplicating code or dealing with git submodules; I simply npm install the API into the application that needs it and attach it just like you would a router, e.g. at www.example.com/api:
app.use('/api', require('@myusername/api.example.com'))
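For that to work, the API package just needs to export its Express app, and it can still detect whether it was launched directly. A minimal sketch (file name and port are made up):

// api/index.js
const express = require('express');
const app = express();

app.get('/status', (req, res) => res.json({ ok: true }));

if (require.main === module) {
  // Executed directly (node index.js): run as a standalone server
  app.listen(3000);
}

// Required from another app: export the app so it can be mounted with app.use()
module.exports = app;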
I'm going to run Karma + Jasmine to test an AngularJS client against a real backend. Since Karma uses its own Express server, but I need to reach the real Node.js backend with its DB and other stuff, I'm thinking of adding an interceptor to $httpProvider.interceptors that rewrites my calls to /api and redirects them to the real backend location. Is there a better way?
You don't want to do that in unit tests (and personally I wouldn't even do it in E2E tests).
When doing unit tests, $httpBackend gets swapped with a dummy version that is not capable of making real requests. That is intentional. You shouldn't run any unit test against the real backend.
On the other hand, there are E2E tests (where you test your whole system together) in which you could use the real backend (some people like that, some don't).
Keep in mind that unit tests are all about units in isolation; that means you don't care about dependencies, and much less about a backend.
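For reference, the usual unit-test pattern is to stub responses through the mock $httpBackend that angular-mocks provides, rather than redirecting anything to a real server (a sketch; the module name, endpoint, and payload are made up):

describe('product list', function () {
  var $httpBackend;

  beforeEach(module('myApp'));  // hypothetical application module
  beforeEach(inject(function (_$httpBackend_) {
    $httpBackend = _$httpBackend_;
    $httpBackend.whenGET('/api/products').respond(200, [{ id: 1, name: 'Widget' }]);
  }));

  afterEach(function () {
    $httpBackend.verifyNoOutstandingExpectation();
    $httpBackend.verifyNoOutstandingRequest();
  });

  it('loads products without touching a real backend', inject(function ($http) {
    var products;
    $http.get('/api/products').then(function (res) { products = res.data; });
    $httpBackend.flush();  // resolve the stubbed response synchronously
    expect(products.length).toBe(1);
  }));
});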