I'm using socket.io to communicate between a server and many clients. Locally everything works fine; in production, however, it does not work as desired. We use microservices, and this specific microservice is part of a suite of services that run together using webpack module federation.
When accessing the app directly through the k8s ingress, I can see that the clients are connecting to the server. However, clients running through module federation (in the bigger app that consists of my micro-app and others) do not connect to the server, and every few seconds a 404 error is printed to the console for a GET request to:
http://<BASE_URL>/socket.io/?EIO=4&transport=polling&t=XXXXXX
where XXXXXX is some random string. I believe that I need to redirect the clients' sockets somehow to reach the server, but I don't know how to do so.
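For context on where that URL comes from: by default the socket.io client always polls `<origin>/socket.io/`, so when the micro-app is embedded in the host app, the handshake goes to the host's origin, where nothing is listening. The helper below is hypothetical (its name and the example origin are mine, not part of socket.io, and the random `t` cache-busting parameter is omitted); it only illustrates how the failing URL is composed:

```javascript
// Hypothetical helper, not part of socket.io: shows how the Engine.IO
// polling handshake URL from the 404 is built from an origin and the
// default "/socket.io/" path (the random "t" parameter is omitted).
function handshakeUrl(origin, path = "/socket.io/") {
  // strip trailing slashes so the origin and path don't double up
  const base = origin.replace(/\/+$/, "");
  return base + path + "?EIO=4&transport=polling";
}

// Inside the host app, the client inherits the host's origin, so the
// request misses the micro-service entirely:
console.log(handshakeUrl("http://host-app.example"));
// → http://host-app.example/socket.io/?EIO=4&transport=polling
```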
Relevant Client Code
const socket = io.connect("");
socket.on("someEvent", (param) => {
  doSomething();
});
Relevant Server Code
let server = app.listen(port, (err) => {
  if (err) throw err;
  // Express server ready on http://localhost:port
});
...
let io;
const initSocket = (server) => {
  // Attaching to the HTTP server happens in the constructor call below;
  // a separate io.listen(server) would attach a second time.
  io = require("socket.io")(server, {
    cors: { origin: "*" }
  });
  io.on('connection', (socket) => {
    logging.mainLogger.info(`successfully connected to socket ${JSON.stringify(socket.id)}`);
  });
};
const emitSomeEvent = (param) => {
  io.emit("someEvent", param);
};
Related
I'm currently working on a project that is Vue on top of Rails 4. I am consuming webhooks from the Square API, and I want to get that data into Vue so my data can be updated in real time as Square data changes. I've asked this question before, but I'm at a slightly different point in the problem; this is getting down to the nuts and bolts.
Currently, on the server side, I have webhooks set up to fire off to a Rails controller, and that works well; I can see the data coming in.
On the client side, I have a socket open and listening to that same Rails controller endpoint.
What I'm having trouble with is that even though I can see the webhook hit the controller, and the socket is active, I can't seem to get the socket to pick up on the controller endpoint emitting data. It doesn't seem like the controller endpoint is passing the data along as I expect, and I suspect there is a gap in my knowledge of Rails, both about how to emit data properly and about how to properly consume it with a socket.
What am I missing to be able to connect the dots here?
Caveats:
I realize this might be possible with ActionCable, and as a last resort I may go with that, but my client is very against upgrading from Rails 4 (a heavy lift). If that happens to be the only way to do it, so be it, but I want to explore all other options first.
I'm no Rails expert and have very little experience with sockets, so this whole approach might be (and probably is) foolish. I also may be misunderstanding how some of these technologies work.
I am, unfortunately, bound to using Rails and Vue.
I am working on localhost and using ngrok to create a proper URL for the webhooks to hit. I doubt it's a problem, but maybe?
I have explored using a Node server behind the scenes, sending webhooks directly to it and listening to that server with the socket. I couldn't get that to work either, but tips on how to achieve that, if it's a good idea, are also welcome.
For reference:
Rails Controller:
class WebhooksController < ApplicationController
  skip_forgery_protection

  def order_update
    p request
    p params
    if request.headers['Content-Type'] == 'application/json'
      data = JSON.parse(request.body.read)
    else
      # application/x-www-form-urlencoded
      data = params.as_json
    end
    render json: data
  end
end
Client Code (Vue && Socket):
import { createApp } from 'vue';
import SquareOrders from '../views/SquareOrders.vue';
import VueSocketIO from 'vue-socket.io';
import { io } from "socket.io-client";

const socket = io("http://localhost:3000", {
  transports: ["polling", "flashsocket"],
  withCredentials: true,
  path: '/webhooks/order_update'
});

export default function loadOrdersApp(el, pinia) {
  const app = createApp(SquareOrders);
  app.use(pinia)
    .use(new VueSocketIO({
      debug: true,
      connection: socket
    }))
    .mount(el);
}
Suggestions on better approaches are appreciated, as are corrections to my basic knowledge if I am misunderstanding something.
So, after a lot of trial and error, I managed to answer this with a different approach.
As far as I can tell, actually listening to the Rails endpoint without something like ActionCable seems impossible, so I booted up an Express server to run in the background and ingest the incoming webhooks.
const express = require('express')
const bodyParser = require('body-parser')
const cors = require('cors');

// Create a new instance of express
const app = express()

// Tell express to use the body-parser middleware for JSON
app.use(bodyParser.json())

// Allow our client side to access this backend server via CORS
app.use(cors({
  origin: 'http://localhost:3000'
}));

// Tell our app to listen on port 8080
const server = app.listen(8080, function (err) {
  if (err) {
    throw err
  }
  console.log('Server started on port 8080')
})

const io = require('socket.io')(server);
app.set('io', io)

// Route that receives a POST request to /webhook
app.post('/webhook', function (req, res) {
  const io = req.app.get('io');
  console.log(req.body)
  // Broadcast the webhook payload to every connected socket
  io.sockets.emit('orderUpdate', req.body)
  res.sendStatus(200);
})

io.on('connection', function (socket) {
  console.log('A connection is made');
});
With that in place, I pointed my webhook to localhost through ngrok, and then added logic in Vue through https://www.npmjs.com/package/vue-socket.io-extended
import { createApp } from 'vue';
import { io } from "socket.io-client";
import SquareOrders from '../views/SquareOrders.vue';
import VueSocketIOExt from 'vue-socket.io-extended';
import socketStore from '../stores/sockets.js';

const socket = io("http://localhost:8080", {
  transports: ["websocket", "polling"],
});

export default function loadOrdersApp(el, pinia) {
  const app = createApp(SquareOrders);
  app.use(pinia)
    //.use(socketStore)
    .use(VueSocketIOExt, socket)
    .mount(el);
}
This then listens for incoming socket emits on the socket.io channel.
With both of those in place, I was able to consume those socket emissions in my frontend app:
export default {
  name: "SquareOrders",
  components: {
    Filters,
    GlobalLoader,
    OrdersList,
    FilterGroup,
    Filter,
    OrderActions
  },
  // Listening for the sockets in the Vue component
  sockets: {
    connect() {
      console.log('socket connected in vue')
    },
    orderUpdate(val) {
      console.log(val)
      debugger;
      console.log('this method was fired by the socket server. eg: io.emit("customEmit", data)')
    }
  },
  mounted() {
    const os = orderStore();
    os.getOrders();
    os.getSourcesList();
    os.getSandboxOrders();
    console.log(this.$socket)
    this.$socket.client.io.on('orderUpdate', (payload) => {
      console.log(payload)
    })
    setInterval(function () {
      os.getOrders();
    }, 60000);
  },
  computed: {
    ...mapState(orderStore, {
      orders: store => store.orders,
      initalLoad: store => store.flags.loaded,
      filterDefs: store => store.filters.definitions,
      sandbox: store => store.sandbox
    })
  },
};
I hope to get some answers here, as I have been trying to debug this for a few days now.
The expected behaviour:
the socket on the server connects once when a user is on my website.
The actual behaviour:
the socket on the server connects twice when a user is on my website.
I am using Next.js as the front end and a Node server as the back end.
_app.js
const MyApp = () => {
  useEffect(() => {
    socket.once('connect', () => {
      console.log('Connected');
    });
    return () => {
      socket.disconnect();
    };
  }, []);
};
server.js
let connections = [];

module.exports = function (io) {
  io.on('connection', async (socket) => {
    console.log(socket.id + ' connected', '\n');
  });
};
After hours of research, I finally found the answer!
How To Implement React hook Socketio in Next.js
I followed the top answer and it worked, because Next.js renders once on its server and once on its client.
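The gist of that answer, as I understand it, is to make socket creation idempotent, so a second render (server plus client, or a re-mount) reuses the first connection instead of opening another. A minimal sketch of that guard follows; the real `io(...)` call is stubbed out here since it needs a running server, so `createFn` and the counter are mine, for illustration only:

```javascript
// Sketch of the guard pattern: create the socket only on the first call and
// return the same instance afterwards. `createFn` stands in for a real
// io("http://localhost:PORT") call.
let socket = null;
function getSocket(createFn) {
  if (!socket) {
    socket = createFn(); // runs only once, however many times we render
  }
  return socket;
}

// Simulate two renders each asking for the socket:
let connections = 0;
const fakeCreate = () => { connections += 1; return { id: "fake" }; };
getSocket(fakeCreate);
getSocket(fakeCreate);
console.log(connections); // → 1 (only one connection is ever opened)
```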
I'm running a React app with node/express on an AWS EC2 instance using an Elastic Load Balancer.
socket.io runs fine when I access the EC2 instance directly, but as soon as I access the app using the ELB URL (HTTP or HTTPS), it fails with GET <current url> net::ERR_CONNECTION_REFUSED, retried every second or so.
I have used AWS Certificate Manager to create a public certificate for the load balancer in order to be able to access the app via https. Here are the listeners I have enabled for ELB:
Listeners on ELB
On the client side I have the socket set up like so (used with react context):
let socket

const getMsg = (dispatch) => {
  if (!socket) {
    socket = io(':3000', { secure: true })
    socket.on('chat msg', function (msg) {
      dispatch({ type: 'RECEIVE_MSG', payload: msg })
    })
  }
}

const sendChat = (dispatch) => {
  return async (value) => {
    socket.emit('chat msg', value)
    getMsg()
  }
}
Here's part of the back-end code:
const app = express()
const http = require('http').createServer(app)
const io = require('socket.io')(http);

app.use(cors())
app.use(bodyParser.json())
app.use(methodOverride('_method'))

io.on('connection', socket => {
  console.log("new connection...")
  socket.on('chat msg', async (msg) => {
    io.emit('chat msg', msg)
  });
})

const mongoUri = //hidden
mongoose.connect(mongoUri, {
  useNewUrlParser: true,
  useCreateIndex: true,
  useUnifiedTopology: true
})

mongoose.connection.on('connected', () => {
  console.log('you are connected')
})

mongoose.connection.on('error', (err) => {
  console.log('connection error, please check your network settings')
})

http.listen(3000, () => {
  console.log('Listening on 3000')
})
Do I need to set up my back end to accept HTTPS requests? Everything else within the app works fine using HTTP on the node/express side, even though I'm accessing the app over HTTPS. The only issue I'm having with requests to the Express server is with socket.io.
I'll add a note here, because I think this is valuable info: I ran into a similar issue a while back. You cannot use HTTP(S) listeners when trying to support WebSockets on Classic ELBs.
Instead, you can set up TCP SSL termination and send the TCP traffic to your backend, which allows regular HTTP and WS traffic to work.
(Described here: https://www.built.io/blog/websockets-on-aws-s-elb#:~:text=AWS%20ELB%20doesn't%20support,IP%20information%20is%20not%20obtained)
Otherwise, you could opt for an ALB, which does support the listeners as configured. I think the ALB is likely the best option here, as it's sort of the "supported" solution from an infrastructure POV.
I'm making a REST API that works with routes and actions like /api/route/action, but I want to add WebSocket functionality, so I want WebSockets to also be addressable by URL.
I have this code:
const socketio = require('socket.io');

// server is a http.createServer()
module.exports = server => {
  const io = socketio(server, { route: '/socketapi/test' });
  io.on('connection', s => {
    s.on('a', () => s.emit('b'));
    s.emit('message', 'You connected to /test.');
  });

  const io2 = socketio(server, { route: '/socketapi/something_else' });
  io2.on('connection', s => {
    s.on('z', () => s.emit('y'));
    s.emit('message', 'Hi');
  });
};
The reason why I want to split them is so I don't have to keep track of event names I've already used, and so I can separate the logic in the connection event.
But it seems this is not possible: if I have two socket.io instances running, I can't connect to either.
Is this possible, or will I have to use some tricks, perhaps an event the client can send to let me know what it wants to subscribe to?
You can use a built-in feature of socket.io called namespaces to achieve this behaviour.
Here is a basic example:
Server side:
const nsp = io.of('/my-namespace');
nsp.on('connection', function (socket) {
  console.log('someone connected');
});
nsp.emit('hi', 'everyone!');
Client side:
const socket = io('/my-namespace');
Now the client can emit and receive messages which are specific to a namespace. With namespaces, your problem of event-name conflicts is solved as well.
The problem I am trying to solve is:
the client makes a RESTful POST to the Node server.
the Node server communicates with another external server via a socket.
when the socket response comes back from the other server, the Node server responds to the client with the data received.
I can communicate with the client via REST, and separately I can communicate with the external server via the socket (response time is ~100 ms). But combining these yields nothing.
const sjsc = require('sockjs-client');

app.post('/form', function (req, res) {
  const srvc = sjsc('http://external.server:port/path');
  srvc.onopen = function () {
    srvc.send(testData);
  }
  srvc.onmessage = function (data) {
    console.log('received ', data);
    res.send(data);
  };
});
const srvc = sjsc('http://external.server:port/path');
This needed to be a let. This is the only thing I changed, and it now works perfectly:
let srvc = sjsc('http://external.server:port/path');
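For context on this fix: the one observable difference between the two keywords is that a const binding cannot be reassigned, while a let binding can, so if anything ever assigns to srvc again, const throws where let silently rebinds. The snippet below is a generic demonstration of that difference (the variable names are mine), not a claim about what happened inside sockjs-client:

```javascript
// Demonstration: reassigning a `const` binding throws a TypeError at
// runtime, while a `let` binding can be rebound freely.
const fixed = "first";
let threw = false;
try {
  fixed = "second"; // TypeError: Assignment to constant variable.
} catch (e) {
  threw = e instanceof TypeError;
}

let flexible = "first";
flexible = "second"; // fine

console.log(threw, flexible); // → true second
```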