I have a REST API using an Express server, and I want to receive notifications from certain routes using Socket.io as a service.
Can I emit from within an Express route on an already running Express server, and have a client listening to my API on its root to log notifications?
What I'm trying to do:
in my route controller
exports.authenticate = (req, res) => {
  // ... on success:
  socket = io()
  socket.emit('logged in', "new user logged in")
}
my socket io client:
let socket = io('localhost:3000', { cors: { origin: '*' } })
socket.on('logged in', () => {
  console.log("login notification")
})
I'm running my API on its own, and then running my client script with node client, but I'm not receiving anything.
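For reference, the pattern the answers further down converge on is to attach Socket.io to the already running Express server, share the instance with the routes, and emit a named event. A minimal sketch (the file split and port are illustrative, not necessarily the asker's exact setup):

// server.js - attach Socket.io to the existing Express server and share it
const express = require('express');
const app = express();
const server = app.listen(3000);
const io = require('socket.io')(server, { cors: { origin: '*' } });
app.set('socketio', io);

// route controller - emit a named event through the shared instance
exports.authenticate = (req, res) => {
  // ... on success:
  req.app.get('socketio').emit('logged in', 'new user logged in');
  res.sendStatus(200);
};

// client.js - standalone Node client using socket.io-client
const { io: ioClient } = require('socket.io-client');
const socket = ioClient('http://localhost:3000');
socket.on('logged in', (msg) => console.log('login notification:', msg));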
Related
Currently working on a project which is Vue on top of Rails 4. I am consuming webhooks from the Square API, and I want to get that data into Vue so that my data can be updated in real time as Square data changes. I've asked this question before, but I'm at a slightly different point in the problem; this is getting down to the nuts and bolts.
Currently, on the server side, I have webhooks set up to fire off to a Rails controller, and that works well; I can see the data coming in.
On the client side, I have a socket open and listening to that same Rails controller endpoint.
What I'm having trouble with is that even though I can see the webhook hit the controller, and the socket is active, I can't seem to get the socket to pick up on the controller endpoint emitting data. It doesn't seem like the controller endpoint is passing the data along as I expect, and I suspect there is a gap both in my knowledge of Rails about how to emit data properly, and in how to properly consume it with a socket.
What am I missing to be able to connect the dots here?
Caveats:
I realize this might be possible with ActionCable, and as a last resort I may go with that, but my client is very against upgrading from Rails 4 (a heavy lift). If this happens to be the only way to do it, so be it, but I want to explore all other options first.
I'm no Rails expert, and have very little experience with sockets, so this whole approach might be (and probably IS) foolish. I also may be misunderstanding some parts of how some of these technologies work.
I am, unfortunately, bound to using Rails and Vue.
I am working on localhost, and using ngrok to create a proper URL for the webhooks to hit. I doubt it's a problem, but maybe?
I have explored using a Node server behind the scenes, sending webhooks directly to it and listening to that server with the socket. I couldn't get that to work either, but tips on how to achieve that, if it's a good idea, are also welcome.
For reference:
Rails Controller:
class WebhooksController < ApplicationController
  skip_forgery_protection

  def order_update
    p request
    p params
    if request.headers['Content-Type'] == 'application/json'
      data = JSON.parse(request.body.read)
    else
      # application/x-www-form-urlencoded
      data = params.as_json
    end
    render json: data
  end
end
Client Code (Vue && Socket):
import { createApp } from 'vue';
import SquareOrders from '../views/SquareOrders.vue';
import VueSocketIO from 'vue-socket.io';
import { io } from "socket.io-client";

const socket = io("http://localhost:3000", {
  transports: ["polling", "flashsocket"],
  withCredentials: true,
  path: '/webhooks/order_update'
});

export default function loadOrdersApp(el, pinia) {
  const app = createApp(SquareOrders);
  app.use(pinia)
    .use(new VueSocketIO({
      debug: true,
      connection: socket
    }))
    .mount(el);
}
Suggestions on better approaches are appreciated, as are corrections to my basic knowledge if I am misunderstanding something.
So, after a lot of trial and error, I managed to answer this with a different approach.
As far as I can tell, actually listening to the Rails endpoint without something like ActionCable seems impossible, so I booted up an Express server to run in the background and ingest the incoming webhooks.
const express = require('express')
const bodyParser = require('body-parser')
const cors = require('cors');

// Create a new instance of express
const app = express()

// Tell express to use the body-parser middleware for JSON
app.use(bodyParser.json())

// Allow our client side to access the backend server via CORS
app.use(cors({
  origin: 'http://localhost:3000'
}));

// Tell our app to listen on port 8080
const server = app.listen(8080, function (err) {
  if (err) {
    throw err
  }
  console.log('Server started on port 8080')
})

const io = require('socket.io')(server);
app.set('io', io)

// Route that receives a POST request to /webhook
app.post('/webhook', function (req, res, next) {
  const io = req.app.get('io');
  console.log(req.body)
  io.sockets.emit('orderUpdate', req.body)
  //res.send(req.body)
  res.sendStatus(200);
  next();
})

io.on('connection', function(socket){
  console.log('A connection is made');
});
With that in place, I pointed my webhook to localhost through ngrok, and then added logic in Vue through https://www.npmjs.com/package/vue-socket.io-extended
import { createApp } from 'vue';
import { io } from "socket.io-client";
import SquareOrders from '../views/SquareOrders.vue';
import VueSocketIOExt from 'vue-socket.io-extended';
import socketStore from '../stores/sockets.js';

const socket = io("http://localhost:8080", {
  transports: ["websocket", "polling"],
});

export default function loadOrdersApp(el, pinia) {
  const app = createApp(SquareOrders);
  app.use(pinia)
    //.use(socketStore)
    .use(VueSocketIOExt, socket)
    .mount(el);
}
This listens for the incoming socket emits on the Socket.io channel.
With both of those in place, I was able to ingest those socket emissions in my frontend app:
export default {
  name: "SquareOrders",
  components: {
    Filters,
    GlobalLoader,
    OrdersList,
    FilterGroup,
    Filter,
    OrderActions
  },
  // LISTENING FOR THE SOCKETS IN VUE COMPONENT
  sockets: {
    connect() {
      console.log('socket connected in vue')
    },
    orderUpdate(val) {
      console.log(val)
      debugger;
      console.log('this method was fired by the socket server. eg: io.emit("customEmit", data)')
    }
  },
  mounted() {
    const os = orderStore();
    os.getOrders();
    os.getSourcesList();
    os.getSandboxOrders();
    console.log(this.$socket)
    this.$socket.client.io.on('orderUpdate', (payload) => {
      console.log(payload)
    })
    setInterval(function () {
      os.getOrders();
    }, 60000);
  },
  computed: {
    ...mapState(orderStore, {
      orders: store => store.orders,
      initalLoad: store => store.flags.loaded,
      filterDefs: store => store.filters.definitions,
      sandbox: store => store.sandbox
    })
  },
};
I'm using socket.io to communicate between a server and many clients. Locally everything works fine; however, in production it doesn't work as desired. We use microservices, and this specific microservice is part of a suite of services that run together using webpack module federation.
When accessing the app directly through the k8s ingress, I can see that the clients are connecting to the server. However, clients running through module federation (in the bigger app that consists of my micro-app and others) do not connect to the server, and every few seconds a 404 error is printed to the console for a GET request to:
http://<BASE_URL>/socket.io/?EIO=4&transport=polling&t=XXXXXX
where XXXXXX is some random string. I believe that I need to redirect the clients' sockets somehow to reach the server, but I don't know how to do so.
Relevant Client Code
const socket = io.connect("");
socket.on("someEvent", (param) => {
  doSomething()
});
Relevant Server Code
let server = app.listen(port, (err) => {
  if (err) throw err;
  // Express server ready on http://localhost:port
});
...
let io;
const initSocket = (server) => {
  io = require("socket.io")(server, {
    cors: { origin: "*" }
  });
  io.on('connection', (socket) => {
    logging.mainLogger.info(`successfully connected to socket ${JSON.stringify(socket.id)}`);
  });
  io.listen(server);
}
const emitSomeEvent = (param) => {
  io.emit("someEvent", param);
}
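One thing that may be worth checking (a sketch of one possible approach, not a confirmed fix; the /my-app prefix below is an assumption standing in for whatever prefix the ingress or host app routes to this microservice): Socket.io lets both sides agree on a custom path, so that the polling and websocket requests hit the microservice instead of the host app's root.

// server: serve the Socket.io endpoint under the prefix the ingress routes to us
io = require("socket.io")(server, {
  path: "/my-app/socket.io", // assumption: replace with your real prefix
  cors: { origin: "*" }
});

// client: use the matching path so the GET goes to /my-app/socket.io/?EIO=4...
const socket = io.connect("", { path: "/my-app/socket.io" });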
I have a route in my Express API from which I want to emit messages to a client over a WebSocket. In this case, the client is another Node.js app, in which I try to connect to the socket and print the messages received. The API and the Node app run on different ports. Can someone help me make this work?
Here's how I pass my socket to my express routes:
const server = app.listen(PORT, () => {
  console.log(`Server on port ${PORT}`);
});
const io = require("socket.io")(server);
app.set("socketio", io);
Here's my REST API route:
exports.getAll = function(req, res){
  var io = req.app.get('socketio');
  io.emit('hi!');
}
Here's my Socket.io client; it uses socket.io-client:
const socket = io('http://localhost:3000');
socket.on("message", data => {
  console.log(data);
});
Unfortunately, I don't receive the 'hi' message from my API: when I call /api/getAll, nothing arrives in my client app.
When emitting an event via Socket.io, you have to define the event name before the data.
Example:
exports.getAll = function(req, res){
  var io = req.app.get("socketio");
  io.emit("message", "hi!");
}
Now you'll be able to receive the message event on the client.
Reference:
https://socket.io/docs/v4/emitting-events/
I have a REST API backend. I'm trying to send WebSocket messages back to the client app when a site administrator invokes a route (e.g. updates a user's credentials):
let express = require("express");
let router = express.Router();

// update user credentials
router.put('/admin/user/:id', (req, res) => {
  // 1. Admin updates target user's profile/credentials
  // 2. WebSocket message sent to target user client
  // 3. Route sends res.status(200) to admin
});
I've seen a few WebSocket examples using the 'ws', 'net', and 'websocket' libraries, but they all show a simple event-handling socket server that responds to socket messages outside of any Express routes, let alone responds to a separate client.
Also, the event notification should be visible only to the target user, not to all the other users connected to the socket server.
Figured it out. The WebSocket server is independent of the route.
const WebSocket = require("ws");
const wss = new WebSocket.Server({ port: 5555 });

// handle socket communications
wss.on('connection', (session) => {
  session.send("HELLO SESSION: " + session.userid);
  session.on('message', (message) => {
    console.log(`MSG: "${message}" received.`);
    session.send("Got it.");
  });
});

// close
wss.on('close', function close() {
  clearInterval(interval);
});

module.exports = wss;
Then, in the route, just include the service and use it to iterate through web socket connections like so:
let wss = require('./mysocketservice')
...
function sendAMessageTo(user, msg) {
  // notify only the target user's connection(s)
  wss.clients.forEach(function each(session) {
    if (session.user === user) {
      session.send(msg);
    }
  });
}
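One gap in the snippet above is that session.user is never set anywhere. A minimal sketch of one way to attach it, assuming the client passes an id in the connection URL (purely illustrative, not part of the original code):

// in mysocketservice.js: the 'ws' connection handler also receives the HTTP
// upgrade request, so a user id can be read from a query string such as
// ws://localhost:5555/?user=42 (an assumed convention)
wss.on('connection', (session, req) => {
  session.user = new URL(req.url, 'http://localhost').searchParams.get('user');
});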
I am trying to make a game server with Node.js and Socket.io.
The basic idea is as follows:
Initialize the Socket.io instance when the server starts
Store the instance in global scope, so controllers can access it
When an API is called, trigger some Socket.io event in the controller (or at some other point)
Here is the implementation I made ...
First, in server.js - entry point
let GlobalVars = require('./state/GlobalVars');
const apiRouters = require('./router');
...
app.use('/api', apiRouters);
app.get('/', (req, res) => {
  res.sendFile(`${__dirname}/test/simpleClient.html`)
});

const httpServer = http.createServer(app);
let socketIOInstance = socketIO(httpServer);
socketIOInstance.on('connection', (socket) => {
  console.log('SOCKET.IO A USER CONNECTED');
  socket.on('create', (data) => {
    console.log('SOCKET.IO create called', socket);
    socket.join(data.room);
    socketIOInstance.emit('message', 'New people joined');
  });
  socket.on('join', (data) => {
    console.log('SOCKET.IO join called', data);
  })
  socket.emit('message', 'Hi');
});

GlobalVars.socketIO = socketIOInstance;
// Add to global, so the controllers can manage own actions like create, join ...

httpServer.listen(port, () => {
  console.log(`Server Listening on the port ${port}`);
})
...
When I access from a client, I am able to see SOCKET.IO A USER CONNECTED and Hi in the browser console.
Second, in the API controller:
let GlobalVars = require('../state/GlobalVars');
...
router.post('/create', (req, res) => {
  console.log('GenerateGameSokect');
  let game = new Game();
  let gameId = game.gameId;
  // console.log('Global vars ', GlobalVars.socketIO);
  GlobalVars.socketIO.emit('create', {
    room: gameId
  });
  res.json({
    result: 'SUCCESS',
    game: game
  })
});
I imported GlobalVars, which contains the socketIO instance. What I expected was the socket 'create' event being triggered by the statement GlobalVars.socketIO.emit('create', Object), but I could not find the message in the server logs.
I have no clue what I'm missing.
The final form I'm pursuing is something like this:
When the user calls the create API, it creates a socket connection and a room
The API is called over HTTP, but inside it the server publishes some events - pub/sub like
Thanks for reading my question. Here is the full source code so far (Bitbucket, public).
================== EDIT ====================
I think I understand now (maybe...).
The user flow I wanted was:
The client calls the API
(On the server) the API checks validation and, if valid, emits to Socket.io
If the event is accepted, send the new status to all clients
However, creating a Socket.io connection on the server looks strange to me; the solution is up to the client.
The new user flow I will change to (see the sketch after this list):
The client calls a validation API
If the return is valid, the client emits the Socket.io event. This time the server only does validation; it does not emit to Socket.io
In the socket event, send the new status to all other users
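A rough client-side sketch of that new flow, reusing the /api/create route and the 'create' event from the snippets above (the response shape comes from the controller's res.json; the rest is illustrative):

// browser client: validate over HTTP first, then emit over the open socket
const socket = io('http://localhost:3000');

async function createGame() {
  const res = await fetch('/api/create', { method: 'POST' });
  const body = await res.json();
  if (body.result === 'SUCCESS') {
    // only after validation does the client emit; the server's
    // socket.on('create', ...) handler then joins the room
    socket.emit('create', { room: body.game.gameId });
  }
}

socket.on('message', (msg) => console.log(msg));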
================== EDIT #2 ====================
This is a kind of conclusion. It looks like I was just misunderstanding the concept of socket communication. As the answer and replies say, sockets and HTTP are totally different channels; there is no way to connect the two (at least, not without opening a new connection from the HTTP server to the socket).
If this is wrong, feel free to add a reply. Thanks.
Now I understand you. Or at least I think I do!
Let's put it this way: there are two (asymmetric) sides to a socket, server and client - what I called, respectively, the "global manager" and the "socket" in my comment on your post.
const server = require('socket.io')(yourHttpServer);
// the client is installed as well when you `npm i socket.io`
const client = require('socket.io-client')('http://localhost:' + yourServerPort);

// `socket` is the server side of the socket
server.on('connection', (socket) => {
  // this will be triggered by client sides emitting 'create'
  socket.on('create', (data) => {
    console.log('a client socket just fired a "create" event!');
  });
});

// this will be triggered by the server side emitting 'create'
client.on('create', (data) => {
  server.emit('create', {content: 'this will result in an infinite loop of "create" events!'});
});
In your /create route, when you call GlobalVars.socketIO.emit('create', ...), the server-side socket handler isn't triggered; however, if you have clients connected through a browser (or, as shown above, if you connect a client socket directly from the server), those clients will trigger their 'create' listener, if any.
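Concretely, with the snippets from the question, it is a listener in the browser page (e.g. the one served from test/simpleClient.html) that would fire when the /create route emits - a sketch:

// browser client: this fires when the API route calls
// GlobalVars.socketIO.emit('create', { room: gameId })
const socket = io();
socket.on('create', (data) => {
  console.log('server announced a new room:', data.room);
});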
Hope this helps you get on the right track!