Node.js file upload, download and streaming data

I have been reading about Node and how it is single-threaded. If I have a large file (500 MB) to upload to a server, or to download from a server, I am guessing this cannot happen asynchronously on the server side. Is this a bad use case for Node.js, or is there a way to do this without blocking the event loop?

There's one user thread, but there are other threads in Node.
Most I/O operations are done for you behind the scenes and you only act on events. Typically, you'll receive events with chunks of data and, if other requests happen at the same time, their events may be interlaced with yours. As long as you don't do a lot of work in the main thread (which is usually the case), there's no reason your program should block during an upload.

Related

NodeJS - Reliability

One thing I know for a fact is that Node.js shouldn't be used for CPU-intensive tasks. Now, imagine that I have a Node.js server receiving an audio stream from several clients (from a mic). The audio is buffered in a C/C++ addon using memcpy (which is very fast). But when the end event is triggered, the addon converts the audio to a command and sends it to the client. This conversion takes up to 75 ms. Can Node.js be considered a reliable solution for this problem? Does 75 ms count as an intensive task in Node.js? What is the maximum recommended time for a blocking operation?
Blocking is not the Node.js way.
You can perform this operation asynchronously (in a separate thread), without any blocking, and invoke a callback from your addon when the operation finishes, so the main Node.js thread is never blocked and can keep handling other requests.
There are good helpers like AsyncWorker and AsyncQueueWorker in NAN:
https://github.com/nodejs/nan
Also, there are C++ libraries for working with WebSockets, so I would consider a direct connection between the clients and the addon.

What is the best way doing intervals without blocking the event loop in Node Js server?

I'm really new to Node.js and I have a beginner's question.
I plan to create a Node server that will execute an HTTP request for a JSON file every 1-2 seconds.
The reason for requesting so often is that the JSON file I'm requesting changes constantly.
What is the correct way to do that without blocking the event loop?
Is it safe to put the request code in a function and call it from setTimeout()?
Should I run the requests in a child process?
The whole point of asynchronicity is that it is asynchronous. When you issue the HTTP request, it is sent off more or less instantly and Node returns to other business, waking up only when the response arrives. The only thing that could "block" the event loop in your case is very intensive processing, either while preparing the request or while handling the response.
That is why on the front page of the node website it says "node uses an event-driven, non-blocking I/O model".
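A minimal sketch of that polling pattern, with the actual HTTP call abstracted behind a hypothetical `fetchOnce` function (e.g. `https.get` wrapped in a Promise) so the scheduling logic stands alone:

```javascript
// Re-arm setTimeout only after the previous request completes, so
// slow responses delay the next poll instead of stacking requests.
function startPolling(fetchOnce, intervalMs, onResult) {
  let timer = null;
  let stopped = false;
  const tick = () => {
    fetchOnce()
      .then(onResult, () => {})              // ignore transient errors
      .then(() => {
        if (!stopped) timer = setTimeout(tick, intervalMs);
      });
  };
  tick();
  return () => { stopped = true; clearTimeout(timer); }; // stop handle
}
```

This is why setTimeout re-armed from the completion callback is usually preferred over setInterval for polling: setInterval keeps firing on schedule even while a slow request is still in flight.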

Node.js on Heroku - request timeout issue

I am using Sails.js (a Node.js framework) and running it both on Heroku and locally.
The API function reads from an external file and performs long computations on the queries it reads; these might take hours.
My problem is that after a few minutes the request returns with a timeout.
I have 2 questions:
How do I control the HTTP request/response timeout (what do I actually need to control here)?
Is a plain HTTP request considered good practice for this, or should I use Socket.IO? (I have no experience with Socket.IO, so I may be talking nonsense.)
You should use the worker pattern to accomplish any work that would take more than a second or so:
"Web servers should focus on serving users as quickly as possible. Any non-trivial work that could slow down your user’s experience should be done asynchronously outside of the web process."
"The Flow
Web and worker processes connect to the same message queue.
A process adds a job to the queue and gets a url.
A worker process receives and starts the job from the queue.
The client can poll the provided url for updates.
On completion, the worker stores results in a database."
https://devcenter.heroku.com/articles/asynchronous-web-worker-model-using-rabbitmq-in-node

How does Node.js behave with a slow, high-latency mobile network?

I want to develop a mobile app that reads, and occasionally writes, tiny chunks of text and images of no more than 1 KB. I was thinking of using Node.js for this (I think it fits perfectly), but I heard that Node.js uses a single thread for all requests in an async model. That's OK, but what if a mobile client on a very slow, high-latency network is reading one of those chunks byte by byte? Does this mean that if the mobile client needs 10 seconds to complete the read, the rest of the connections have to wait 10 seconds before Node.js replies to them? I really hope not.
No — incoming streams are evented. The events will be handled by the main thread as they come in. Your JavaScript code is executed only in this main thread, but I/O is handled outside that thread and raises events that trigger your callbacks in the main thread.

What is most efficient approach processing data read from socket?

I would like to use libev for a streaming server I am writing.
This is how everything is supposed to work:
client opens a TCP socket connection to server
server receives connection
client sends a list of images they would like
server reads request
server loops through all of the images
server reads image from NAS
server processes image file meta data
server sends image data to client
I found sample code that lets me read from and write to the socket using libev I/O events (epoll under the hood). But I am not sure how to handle the read from the NAS and the processing.
This could take some time, and I don't want to block the server while it happens.
Should this be done in another thread, which then sends the image data back to the client?
I was planning on using a thread pool. Or can libev support a processing step without blocking?
Any ideas or help would be greatly appreciated!
You'll need a file I/O library that supports asynchronous reads, such as Boost.Asio. The underlying POSIX AIO APIs are aio_read, aio_suspend, and lio_listio.
