Here is a short snippet of Node.js code (Express and socket.io). Could sending POST requests and emitting socket responses be considered a bad practice, and why? E.g.:
var io = require('socket.io')(http);

app.post('/tickets', jsonParser, function(req, res) {
  // push the new ticket to every connected socket.io client
  io.emit('ticket', req.body);
  return res.sendStatus(200);
});
I see no problem with that. I actually created a notification system that receives the message and destination as a POST and sends notifications to multiple sockets like that.
From your code it looks like that's what you are doing: someone creates a ticket and you send a notification to all listeners.
That seems to be the most practical way, with the added bonus of being a proper API for use with an external server like PHP or .NET. If you're just using it from your own Node app, then perhaps you could make it a socket event instead, unless you are planning on getting requests from outside your app.
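For example, here is a minimal sketch of how another Node.js process (or any external service) could hit that endpoint; the host, port, and payload fields are assumptions for illustration, not part of the original code:
var http = require('http');

var ticket = JSON.stringify({ subject: 'Printer on fire', priority: 'high' });

var req = http.request({
  hostname: 'localhost',
  port: 3000, // assumed port of the Express app
  path: '/tickets',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(ticket)
  }
}, function(res) {
  console.log('Ticket endpoint replied with', res.statusCode);
});

req.on('error', function(err) {
  console.error('Request failed:', err.message);
});

req.write(ticket);
req.end();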
Related
I'm building my app entirely over WebSocket. While I see the benefit of being able to push data to the client without the client having to request it, there are still benefits to request/response style interaction, since you can keep all the related logic together.
Example:
Take registration. You send data, and the server replies with OK, or with ERROR if something is wrong.
Right there it's three events! DataFromClient, RegistrationFailed, RegistrationSuccess. But with REST I could have made one POST request and handled the if/else in one function.
It shouldn't be too hard to create a library that lets you do push notifications plus request/response style interaction. It'd be even better if routes could be defined like Express routes.
There is no "standard" way to implement request/response with webSocket. It is not part of the webSocket standard. It can certainly be done (more below on this), but there is no "standard" way of doing it.
The socket.io library, which is built on top of webSocket, has a response option built into any message it sends.
The sender does something like this:
socket.emit("msgName", someData, function(data) {
// data is the "response" to sending this message
console.log(data);
});
The recipient of a message that is expecting a response does something like this to cause the response to be sent:
socket.on("msgName", (someData, fn) => {
// send response
fn(someOtherData);
});
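Applied to the registration example from the related post above, the acknowledgement can carry the success/error result, so one callback plays the role of the if/else in a single POST handler. This is only a sketch; the "register" event name and the payload shape are invented for illustration:
// Client: send registration data, handle success/error in one callback
socket.emit('register', { user: 'alice', password: 'secret' }, function(result) {
  if (result.ok) {
    console.log('registration succeeded');
  } else {
    console.log('registration failed:', result.error);
  }
});

// Server: validate and reply through the acknowledgement callback
io.on('connection', function(socket) {
  socket.on('register', function(data, ack) {
    if (!data.user || !data.password) {
      return ack({ ok: false, error: 'missing fields' });
    }
    // ... persist the user here ...
    ack({ ok: true });
  });
});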
You could implement your own response scheme in plain webSocket. You'd have to add a uniqueID for each request to your data format and send that same uniqueID back in the response, so the receiver knows which response belongs with which request.
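A rough sketch of what that scheme could look like, assuming a browser WebSocket named ws on the client and a server built with the ws package (the wss variable and all field names here are invented):
// Client-side sketch: correlate responses to requests with a uniqueID
var pending = {};   // requestId -> callback
var nextId = 1;

function request(ws, name, payload, callback) {
  var id = nextId++;
  pending[id] = callback;
  ws.send(JSON.stringify({ id: id, name: name, payload: payload }));
}

ws.onmessage = function(event) {
  var msg = JSON.parse(event.data);
  if (msg.id && pending[msg.id]) {
    pending[msg.id](msg.payload);   // deliver the "response"
    delete pending[msg.id];
  }
};

// Server-side sketch (ws package): echo the same id back in the reply
wss.on('connection', function(socket) {
  socket.on('message', function(raw) {
    var msg = JSON.parse(raw.toString());
    // handle msg.name / msg.payload, then reply with the same id
    socket.send(JSON.stringify({ id: msg.id, payload: { ok: true } }));
  });
});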
I'm writing an express app.js with socket.io, and came across a problem.
I can't figure out how to use the routes.
I want the client to request, for example, localhost:3000/?id=3 and get something back according to the id.
But in the socket.io connection event I don't know the URL or the params (or is there a way?)
io.on('connection', function (socket) {/*should be something according to the id in the url*/});
Until now I just checked the id with
app.get('/', function (req, res) {
//req.query.id
});
Anyone knows a way around this?
Thank you!
It appears you may be a bit confused about how you use webSockets. If you want to make an http request such as localhost:3000/?id=3, then you don't use webSockets. You use the normal routing mechanisms in Express.
A webSocket connection is created and then persists. From then on, you define messages (with optional data as arguments) and you can send these messages in either direction over the webSocket. webSocket messages are sent on an existing webSocket connection, not to a URL. You could create a message for sending URLs from client to server if you wanted. If that was the case, you could do this in the client:
socket.emit("sendURL", url);
And, then you would listen for the "sendURL" message on the server.
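On the server side that could be as simple as the following sketch, reusing the hypothetical "sendURL" message name:
io.on('connection', function(socket) {
  socket.on('sendURL', function(url) {
    // do whatever a route handler would have done with the URL
    console.log('client sent URL:', url);
  });
});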
Is it possible to send a response from express, and wait for a return response before continuing?
A typical scenario is something like this
Server A sends a request to server B.
Server B processes the request and sends it back to server A
Server B waits for a response from server A before continuing
Server A does further processing of the response from Server B and sends it back to Server B
Server B then handles the rest of the processing required.
My understanding is that normally this is handled with callbacks. In express I would expect to do something like
res.write('response', callback);

function callback() {
  // do stuff
}
I don't see that this is possible with the res.write method though. Is there another method I can use with express to get this functionality? I've never used socket.io before, but this seems like a scenario where websockets would be useful. Am I wrong in this assumption?
res.on('finish', callback);
is emitted when the last of the response data has been handed off to the operating system for transmission.
http://nodejs.org/api/http.html#http_event_finish
If you need to know when the client receives/processes the data, however, the client must send something back to the server, in which case socket.io could help.
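As a small sketch of that pattern with socket.io acknowledgements (the event name and payloads are invented for illustration), the server's callback only runs after the client has processed the message and called back:
// Server: emit with an acknowledgement callback
io.on('connection', function(socket) {
  socket.emit('processThis', { some: 'data' }, function(clientResult) {
    // runs only after the client calls its ack callback below
    console.log('client finished with:', clientResult);
    // ...continue server-side processing here...
  });
});

// Client: do the work, then acknowledge
socket.on('processThis', function(data, ack) {
  // ...process data...
  ack({ status: 'done' });
});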
I appreciate all the responses and help from everyone. I ended up using sessions to get what I needed.
var session = require('express-session');
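For anyone curious, a minimal sketch of how express-session might be wired up; the secret and options here are placeholders, not the actual app's configuration:
var express = require('express');
var session = require('express-session');

var app = express();

app.use(session({
  secret: 'replace-with-a-real-secret', // placeholder value
  resave: false,
  saveUninitialized: false
}));

app.get('/', function(req, res) {
  // values stored on req.session persist across requests from the same client
  req.session.views = (req.session.views || 0) + 1;
  res.send('You have visited ' + req.session.views + ' times');
});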
Thanks again
I'd like to add a live functionality to a PHP based forum - new posts would be automatically shown to users as soon as they are created.
What I find a bit confusing is the interaction between the PHP code and NodeJS+socket.io.
How would I go about informing the NodeJS server about new posts and have the server inform the clients that are watching the thread in which the post was posted?
Edit
I tried the following code, and it seems to work; my only question is whether this is considered a good solution, as it looks kind of messy to me.
I use socket.io to listen on port 81 for clients, and the server running on port 82 is only intended to be used by the forum: when a new post is created, a PHP script sends a POST request to localhost on port 82, along with the data.
Is this ok?
var io = require('socket.io').listen(81);

io.sockets.on('connection', function(socket) {
  socket.on('init', function(threadid) {
    socket.join(threadid);
  });
});

var forumserver = require('http').createServer(function(req, res) {
  if (res.socket.remoteAddress == '127.0.0.1' && req.method == 'POST') {
    req.on('data', function(chunk) {
      data = JSON.parse(chunk.toString());
      io.sockets.in(data.threadid).emit('new-post', data.content);
    });
  }
  res.end();
}).listen(82);
Your solution of a HTTP server running on a special port is exactly the solution I ended up with when faced with a similar problem. The PHP app simply uses curl to POST to the Node server, which then pushes a message out to socket.io.
However, your HTTP server implementation is broken. The data event is a Stream event; Streams do not emit messages, they emit chunks of data. In other words, the request entity data may be split up and emitted in two or more chunks.
If the data event emitted a partial chunk of data, JSON.parse would almost assuredly throw an exception, and your Node server would crash.
You either need to manually buffer data, or (my recommendation) use a more robust framework for your HTTP server like Express:
var express = require('express'),
    forumserver = express();

// handles buffering and parsing of the request entity for you
forumserver.use(express.bodyParser());

forumserver.post('/post/:threadid', function(req, res) {
  io.sockets.in(req.params.threadid).emit('new-post', req.body.content);
  res.send(204); // HTTP 204 No Content (empty response)
});

forumserver.listen(82);
PHP simply needs to POST to http://localhost:82/post/1234 with an entity body containing content. (JSON, URL-encoded, or multipart-encoded entities are acceptable.) Make sure your firewall blocks port 82 on your public interface.
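For reference, if you did want to keep the bare http server, the manual-buffering approach mentioned above would look roughly like this; a sketch only, with minimal error handling:
var forumserver = require('http').createServer(function(req, res) {
  if (req.socket.remoteAddress === '127.0.0.1' && req.method === 'POST') {
    var body = '';
    req.on('data', function(chunk) {
      body += chunk; // accumulate all chunks before parsing
    });
    req.on('end', function() {
      try {
        var data = JSON.parse(body);
        io.sockets.in(data.threadid).emit('new-post', data.content);
      } catch (err) {
        // malformed JSON; ignore rather than crash
      }
      res.end();
    });
  } else {
    res.end();
  }
}).listen(82);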
Regarding the PHP code / forum's interaction with Node.js, you probably need to create an API endpoint of sorts that can listen for changes made to the forum. Depending on your forum software, you would want to hook into the process of creating a new post and perform the API callback to Node.js at that time.
Socket.io out of the box is geared towards visitors of the site being connected on the frontend via JavaScript. Upon receiving notification of a new post, the Node server would notify the connected clients of the post and its details, at which point the client-side JavaScript would probably add new HTML to the DOM of the page the visitor is viewing.
You may want to arrange the socket.io part of things so that users only subscribe to specific events by joining a specific room, such as "subforum123", so that they only receive notifications for applicable posts.
I have a project where I'm using socket.io with Express, and what I need (and have tried to do) is to broadcast a message from an Express action.
Is this possible? I don't know how to get a reference on which to call send or broadcast.
app.get('/', function(req, res) {
  // I need to send messages from here
});
Everything else, like using Express and socket.io together, is working for me :)
As far as I understand:
Why not use the socket message type as an event instead of an HTTP GET or POST? On the client side you would send a message over the webSocket with, let's say, an event property.
So in your case:
<script>
// Initialize socket.io ...
// and then
socket.send({event: 'homepage loaded', foo: 'bar'});
</script>
And on the server side:
// attach socket.io to an existing HTTP server
var io = require('socket.io').listen(server);

io.on('connection', function (client) {
  client.on('message', function (message) {
    if (message.event == 'homepage loaded') {
      client.broadcast(...);
    }
  });
});
You might want to have a look at my socket.io + Express primer. What you want is covered in detail there.
// Send to all connected sockets
io.sockets.send('Hello, World!');
// Send to all sockets in a specified room
io.sockets.in('room').send('Hello, Room!');
Where io is the value returned by the call to socketio.listen(). You can place that code anywhere in your application, e.g. in your app.get callbacks.
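Tying it back to the question, the route handler could then look something like this sketch (message text is just an example):
app.get('/', function(req, res) {
  // broadcast to every connected socket from inside an Express action
  io.sockets.send('Someone hit the homepage!');
  res.send('Message broadcast to all connected sockets');
});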
Check out my example repo where I use ExpressJS + Juggernaut (pub/sub over socket.io):
http://github.com/shripadk/express-juggernaut-demo
This might be overkill for what you need, as it uses publish/subscribe. But it does, to a certain extent, solve your issue of using regular ExpressJS routes. Check out the master branch after cloning the repository:
git checkout master
I found a nice example of how to do what I need, but with Faye; it's here: http://nodecasts.org/.
I don't know the difference between Juggernaut, Faye, and direct socket.io, but Faye works well for my case. And I think both of them use socket.io internally.