So, I have a route function like the following:
var http = require('http').createServer(start);

function start(req, res) {
  // routing stuff
}
and below that, I have a socket.io event listener:
io.sockets.on('connection', function(socket) {
  socket.on('event', function(data) {
    // perform an http response
  });
});
When the socket event 'event' is called, I would like to perform an http response like the following:
var fs = require('fs'); // required at the top of the file

// note: writeHead may only be called once, so both headers go in one call
res.writeHead(200, {
  'Content-Type': 'application/zip',
  'Content-Disposition': 'attachment; filename=file.zip'
});
var filestream = fs.createReadStream('file.zip');
filestream.on('data', function(chunk) {
  res.write(chunk);
});
filestream.on('end', function() {
  res.end();
});
This last part works just fine when performed within the routing function, but when it is called from the socket event it of course does not work, because it has no reference to the req or res objects. How would I go about doing this? Thanks.
Hmmm... interesting problem:
It's not impossible to do something like what you're trying to do; the flow would be something like this (a rough sketch follows the caveats below):
Receive http request, don't respond, keep res object saved somewhere.
Receive websocket request, do your auth/"link" it to the res object saved earlier.
Respond with file via res.
BUT it's not very pretty for a few reasons:
You need to keep res objects saved; if your server restarts, a whole bunch of pending response objects are lost.
You need to figure out how to link websocket clients to http request clients. You could probably do something with cookies/localStorage to achieve this.
Scaling to multiple servers becomes a lot harder: will you proxy clients so they are always served by the same server somehow? Otherwise the linking gets harder still.
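Here is a minimal sketch of that flow, assuming the client passes a shared token (a made-up downloadId, sent as a query parameter on the HTTP request and as a field on the socket event) to link the two:

var url = require('url'),
    fs = require('fs');

var pending = {}; // downloadId -> res, parked until the socket event arrives

function start(req, res) {
  // e.g. GET /download?id=abc123 -- don't respond yet, just park the res
  var id = url.parse(req.url, true).query.id;
  pending[id] = res;
}

io.sockets.on('connection', function(socket) {
  socket.on('event', function(data) {
    var res = pending[data.downloadId];
    if (!res) return; // no matching request (restart, bad id, ...)
    delete pending[data.downloadId];
    res.writeHead(200, {
      'Content-Type': 'application/zip',
      'Content-Disposition': 'attachment; filename=file.zip'
    });
    fs.createReadStream('file.zip').pipe(res);
  });
});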
I would propose a different solution for you: You want to do some client/server steps using websockets before you let someone download a file?
This question has a solution to do downloads via websocket: receive file via websocket and initiate download dialog
Sounds like it won't work on older browsers / IE, but a nice option.
Also mentions downloading via hidden iframe
Check here whether this solution is cross-browser enough for you: http://caniuse.com/#feat=datauri
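A rough client-side sketch of that websocket-download approach, using a Blob URL rather than a raw data URI (the file-data event name is made up):

// Browser side: turn bytes received over the socket into a file download.
socket.on('file-data', function(buffer) {
  var blob = new Blob([buffer], { type: 'application/zip' });
  var a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = 'file.zip'; // triggers the save dialog instead of navigation
  a.click();
  URL.revokeObjectURL(a.href);
});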
Another option would be to generate a unique URL for the download, and only append it to the browser's window (either as a hidden iframe download or as a simple download button) once you've done your logic via websocket. This option is more broadly cross-browser compatible and easier to code.
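A sketch of that unique-URL option, assuming Express for the plain-HTTP side; the route path and event names are made up:

var crypto = require('crypto');
var allowedTokens = {}; // token -> filename, filled in once the socket steps succeed

io.sockets.on('connection', function(socket) {
  socket.on('auth-done', function() {
    var token = crypto.randomBytes(16).toString('hex');
    allowedTokens[token] = 'file.zip';
    socket.emit('download-ready', { url: '/download/' + token });
  });
});

// one-shot download route: serve the file, then invalidate the token
app.get('/download/:token', function(req, res) {
  var file = allowedTokens[req.params.token];
  if (!file) return res.sendStatus(404);
  delete allowedTokens[req.params.token];
  res.download(file); // sets Content-Disposition: attachment for you
});

On the client, once download-ready arrives you can point a hidden iframe (or a plain download link) at the URL.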
I'm using ws version 7.4.0, and I want to display a console log, or perform other operations, between the moment the client sends a message to the server and the moment the server fires the message event.
To represent it:
webserver.on('example', function callback(msg){console.log(msg);}); //act before the call of callback
client------server---[here]---callback
The only way I see right now would be to use a "root" function before the callback of all my events like this:
function callback(msg){console.log(msg);}
webserver.on('example', function root(msg) {console.log('example msg'); callback(msg);});
I don't know if this is a real and/or good solution; I really want to write a clean and organized application.
Could someone give me some advice or a real solution? Thank you.
You could make a wrapper for all of your callbacks like so:
function makeCallback(fn) {
  return function(msg) {
    // central hook: runs before every wrapped handler
    // (environment.prod stands in for whatever config flag you use)
    if (!environment.prod) console.log(msg);
    fn(msg);
  };
}

var myCallback = makeCallback(function(msg) {
  // something
});
webserver.on('example', myCallback);
Or, I think the better solution may be to stream the requests into your stdout, although I don't know the implications of using this method.
I also want to address the naming of your websocket server. Even though a websocket server is technically a web server, it only speaks the websocket protocol, so naming it webserver could be misleading; I would recommend the name used in the ws documentation, wss.
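If the goal is one central place that sees every incoming message before it is dispatched, a sketch along these lines avoids repeating the root wrapper on every handler (the JSON envelope with a type field, and the handler table, are my assumptions):

const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

// dispatch table: message type -> handler
const handlers = {
  example: (msg) => console.log('handling example:', msg),
};

wss.on('connection', (socket) => {
  socket.on('message', (raw) => {
    // central hook: every message passes through here first
    console.log('received:', raw.toString());
    const msg = JSON.parse(raw);
    if (handlers[msg.type]) handlers[msg.type](msg);
  });
});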
I have a long-running process which needs to send data back in multiple responses. Is there some way to send back multiple responses with Express.js?
I want to receive "test", then after 3 seconds a new response "bar":
app.get('/api', (req, res) => {
  res.write("test");
  setTimeout(function() {
    res.write("bar");
    res.end();
  }, 3000);
});
With res.write, I instead wait the full ~3050 ms and then get everything in a single response.
Yes, you can do it. What you need to know is actually chunked transfer encoding.
This is one of the old techniques, common a decade ago; since WebSockets arrived I haven't seen anyone using it.
Apparently you just need to send parts of the response at different times, in chunks, perhaps as some events fire later. This is actually the default response type of Express.js.
But there is a catch. When you test this with a modern browser or curl, both of which buffer chunks, you won't see the expected result. The trick is to fill up the chunk buffer before sending consecutive response chunks. See the example below:
const express = require('express'),
      app = express(),
      port = 3011;

app.get('/', (req, res) => {
  // res.write("Hello\n");
  res.write("Hello" + " ".repeat(1024) + "\n");
  setTimeout(() => {
    res.write("World");
    res.end();
  }, 2000);
});

app.listen(port, () => console.log(`Listening on port ${port}!`));
The first res.write, padded with 1024 extra spaces, forces the browser to render the chunk; your second res.write then behaves as you expected. You can see the difference by switching to the commented-out res.write.
At the network level there is no difference. You can even achieve this in the browser with the XHR object (the first AJAX implementation); see the related answer.
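For completeness, a minimal client-side sketch that consumes the chunks as they arrive, using the Fetch API's streaming body (no padding trick needed, since you read the stream yourself):

// Browser side: log each chunk as the server flushes it.
fetch('http://localhost:3011/')
  .then((res) => {
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    return (function read() {
      return reader.read().then(({ done, value }) => {
        if (done) return;
        console.log('chunk:', decoder.decode(value, { stream: true }));
        return read();
      });
    })();
  });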
An HTTP request and response have a one-to-one mapping. You can return an HTTP response only once, however long it may be. There are two common techniques:
Comet responses - Since Response is a writable stream, you can write data to it from time to time before ending it. Ensure the connection read timeout is properly configured on the client that would be receiving this data.
Simply use web sockets - These are persistent connections that allow two-way messaging between server and client.
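A minimal sketch of the second option with socket.io, replaying the "test"/"bar" scenario over one persistent connection (the data event name is made up):

var io = require('socket.io')(3011); // standalone socket.io server

io.on('connection', function(socket) {
  socket.emit('data', 'test'); // first message immediately
  setTimeout(function() {
    socket.emit('data', 'bar'); // second message 3 seconds later
  }, 3000);
});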
Here is a short snippet of Node.js code (Express.js and socket.io). Could sending POST requests and emitting socket responses be considered a bad practice, and why? E.g.:
var io = require('socket.io')(http);

app.post('/tickets', jsonParser, function(req, res) {
  io.emit('ticket', req.body);
  return res.sendStatus(200);
});
I see no problem with that. I actually created a notification system that receives the message and destination as a POST and sends notifications to multiple sockets like that.
From your code it looks like that's what you are doing: someone creates a ticket and you send a notification to all listeners.
That seems to be the most practical way, with the added bonus of being a proper API for use with an external server like PHP or .NET. If you're only using it from your own Node app, then perhaps you could just make it a socket event instead, unless you are planning on getting requests from outside your app.
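A sketch of that socket-event variant, assuming the ticket is created from a page that already holds a socket.io connection (the create-ticket event name is made up):

io.on('connection', function(socket) {
  // the client emits 'create-ticket' instead of POSTing to /tickets
  socket.on('create-ticket', function(ticket) {
    // ... persist the ticket here ...
    io.emit('ticket', ticket); // broadcast to every connected listener
  });
});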
I'd like to add a live functionality to a PHP based forum - new posts would be automatically shown to users as soon as they are created.
What I find a bit confusing is the interaction between the PHP code and NodeJS+socket.io.
How would I go about informing the NodeJS server about new posts and have the server inform the clients that are watching the thread in which the post was posted?
Edit
I tried the following code and it seems to work; my only question is whether this is considered a good solution, as it looks kind of messy to me.
I use socket.io to listen on port 81 for clients; the server running on port 82 is only intended to be used by the forum - when a new post is created, a PHP script sends a POST request to localhost on port 82, along with the data.
Is this ok?
var io = require('socket.io').listen(81);

io.sockets.on('connection', function(socket) {
  socket.on('init', function(threadid) {
    socket.join(threadid);
  });
});

var forumserver = require('http').createServer(function(req, res) {
  if (res.socket.remoteAddress == '127.0.0.1' && req.method == 'POST') {
    req.on('data', function(chunk) {
      var data = JSON.parse(chunk.toString());
      io.sockets.in(data.threadid).emit('new-post', data.content);
    });
  }
  res.end();
}).listen(82);
Your solution of an HTTP server running on a special port is exactly the solution I ended up with when faced with a similar problem. The PHP app simply uses curl to POST to the Node server, which then pushes a message out to socket.io.
However, your HTTP server implementation is broken. The data event is a Stream event; Streams do not emit messages, they emit chunks of data. In other words, the request entity data may be split up and emitted in two or more chunks.
If the data event emitted a partial chunk of data, JSON.parse would almost assuredly throw an exception, and your Node server would crash.
You either need to manually buffer data, or (my recommendation) use a more robust framework for your HTTP server like Express:
var express = require('express'),
    forumserver = express();

forumserver.use(express.bodyParser()); // handles buffering and parsing of
                                       // the request entity for you

forumserver.post('/post/:threadid', function(req, res) {
  io.sockets.in(req.params.threadid).emit('new-post', req.body.content);
  res.send(204); // HTTP 204 No Content (empty response)
});

forumserver.listen(82);
PHP simply needs to post to http://localhost:82/post/1234 with an entity body containing content. (JSON, URL-encoded, or multipart-encoded entities are acceptable.) Make sure your firewall blocks port 82 on your public interface.
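For reference, the manual-buffering alternative mentioned above would look roughly like this: concatenate the data chunks and only parse on end (a sketch, keeping the original localhost check):

var forumserver = require('http').createServer(function(req, res) {
  if (res.socket.remoteAddress === '127.0.0.1' && req.method === 'POST') {
    var body = '';
    req.on('data', function(chunk) { body += chunk; });
    req.on('end', function() {
      var data = JSON.parse(body); // still deserves a try/catch in real code
      io.sockets.in(data.threadid).emit('new-post', data.content);
      res.end();
    });
  } else {
    res.end();
  }
}).listen(82);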
Regarding the PHP code / forum's interaction with Node.JS, you probably need to create an API endpoint of sorts that can listen for changes made to the forum. Depending on your forum software, you would want to hook into the process of creating a new post and perform the API callback to Node.js at this time.
Socket.io out of the box is geared towards visitors of the site being connected on the frontend via Javascript. Upon the Node server receiving notification of a new post update, it would then notify connected clients of this new post and its details, at which point it would probably add new HTML to the DOM of the page the visitor is viewing.
You may want to arrange the Socket.io side so that users subscribe only to specific events, by placing them in a specific room such as "subforum123", so that they only receive notifications for applicable posts.
I am using node-curl as an HTTPS client to make requests to resources on the web, and the code runs on a machine behind a proxy facing the internet.
The code I am using:
var curl = require('node-curl');

// Make a curl call to the URL in the first argument; the callback in the
// last argument is invoked when the call is complete.
curl('https://encrypted.google.com/', {}, function(err) {
  // (I have no idea about the difference between console.info and console.log.)
  console.info(this.body);
});

// This will get printed immediately.
console.log('Got here');
node-curl detects the proxy settings from the environment and gives back the expected results.
The challenge is: the callback gets fired after the entire https-response gets downloaded, and as far as I can tell there are no parallels for the 'data' and 'end' events from the http(s) modules.
Further, after going through the source code, I found that indeed the node-curl library receives the data in chunks: reference line 58 in https://github.com/jiangmiao/node-curl/blob/master/lib/CurlBuilder.js . It seems that no events are emitted presently in this case.
I need to forward the possibly sizable response back to another computer on my LAN for processing, so this is a clear concern for me.
Is using node-curl recommended for this purpose in node?
If yes, how can I handle this?
If no, then what would be a suitable alternative?
I would go for the wonderful request module, at least if the proxy settings are no more advanced than what it supports. Just read the proxy settings from the environment yourself:
var request = require('request'),
    proxy = request.defaults({ proxy: process.env.HTTP_PROXY });

proxy.get('https://encrypted.google.com/').pipe(somewhere);
Or if you don't want to pipe it:
var req = proxy.get({uri: 'https://encrypted.google.com/', encoding: 'utf8'});
req.on('data', console.log);
req.on('end', function() { console.log('end') });
Above, I also pass the encoding I expect in the response. You could also specify that in the defaults (the call to request.defaults() above), or you could leave it out, in which case you will get Buffers in the data event handler.
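For instance, a minor variation putting the encoding into the defaults alongside the proxy:

var proxy = request.defaults({
  proxy: process.env.HTTP_PROXY,
  encoding: 'utf8' // data events now emit strings instead of Buffers
});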
If all you want to do is to send it to another URL, request is perfect for that:
proxy.get('https://encrypted.google.com/').pipe(request.put(SOME_URL));
Or if you'd rather POST it:
proxy.get('https://encrypted.google.com/').pipe(request.post(SOME_URL));
Or, if you want to proxy the request to the destination server as well:
proxy.get('https://encrypted.google.com/').pipe(proxy.post(SOME_URL));