I'm trying to set up a realtime application using socket.io with Angular and Node.js, but it is not working as intended.
Whenever a client makes a new post, the other clients won't update until you interact with them (e.g. by clicking somewhere on the page or on the browser tab).
However, with the console open in the browser, I can see the new post when I log the posts/objects, without needing to interact with the clients.
Angular:
import io from 'socket.io-client';
const socket = io('http://localhost:3000');

posts: Post[] = [];
...
// Inside ngOnInit:
socket.on('data123', (res) => {
  console.log('Updating list..', res);
  this.postService.getPosts();
  this.postsSub = this.postService.getPostUpdateListener()
    .subscribe((posts: Post[]) => {
      this.posts = posts;
    });
});
Displaying in the template:
<... *ngFor="let item of posts">
Inside PostsService:
getPosts() {
  this.http.get<{ message: string, posts: Post[] }>('http://localhost:3000/api/posts')
    .subscribe((postData) => {
      this.posts = postData.posts;
      this.postsUpdate.next([...this.posts]);
    });
}
Node.js - this socket.io solution is not yet sending the actual list:
const io = socket(server); // socket here is require('socket.io')

io.sockets.on('connection', (socket) => {
  console.log(`new connection id: ${socket.id}`);
  sendData(socket);
});

function sendData(socket) {
  socket.emit('data123', 'TODO: send the actual updated list');
  setTimeout(() => {
    console.log('sending to client');
    sendData(socket);
  }, 3000);
}
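For reference, here is a hedged sketch of how the server could emit the actual list instead of the placeholder. The POST route, savePost() and getAllPosts() helpers are assumptions for illustration, not code from the question:
// in the route that creates a post (hypothetical), broadcast the fresh list to every client
app.post('/api/posts', async (req, res) => {
  const post = await savePost(req.body);   // assumed data-access helper
  const posts = await getAllPosts();       // assumed data-access helper
  io.emit('data123', posts);               // clients listening on 'data123' get the updated list
  res.status(201).json({ message: 'Post added', post });
});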
What worked as intended:
Using setInterval instead of socket.on(..) on the front end gave the intended result: the clients update automatically without any interaction. I'm fully aware this solution is horrible, but I assume it pinpoints that something is wrong with the socket handling in the Angular code above.
Wait, every time socket.on('data123', (res) => {... fires you are creating a new subscription? That's the wrong approach... you should create the subscription once, where you set up your socket connection.
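A minimal sketch of what that suggestion could look like, reusing the names from the question (postService, postsSub, posts); everything else is an assumption:
ngOnInit() {
  // single subscription, created once
  this.postsSub = this.postService.getPostUpdateListener()
    .subscribe((posts: Post[]) => {
      this.posts = posts;
    });

  // the socket handler only asks the service to reload;
  // the subscription above receives the updated list
  socket.on('data123', (res) => {
    console.log('Updating list..', res);
    this.postService.getPosts();
  });
}

ngOnDestroy() {
  this.postsSub.unsubscribe();
}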
Related
I am facing a weird issue. I am not a veteran of Socket.io; I have been exploring this library because the app I am building needs a remote playing feature in which players create invitations that other players can use to join the game remotely. I am using React on the front end (client side), and on the server side I am using the Node.js Express framework with Socket.io. I have also installed the client-side Socket.io package for React. The basic implementation is all working fine: when a new user accesses the client-side app, the server-side Socket.io instance picks up the connection, any events triggered by the client are received on the server side, and I am able to broadcast events back to all connected clients using the socket.broadcast.emit() method.
I am trying to store the past events (basically, the invitations created by the currently connected players) in an array and then emit the stored array to new connections so that new users will see the past events (invitations). Below is my implementation on the server side:
//Array to store previously emitted events
const activeInvites = [];
//SocketIO connections
io.on("connection", (socket) => {
console.log(`⚡: ${socket.id} user just connected!`);
//Listen to the new invites
socket.on("newInvite", (invite) => {
activeInvites.push(invite);
socket.broadcast.emit("newPrivateInvites", invite);
});
//Publish all previously created invites to the new connections
io.emit("activeInvites", activeInvites); //new connections emit this event however the client won't listen to "activeInvites" event
socket.on("disconnect", () => {
console.log(`🔥: ${socket.id} user disconnected`);
destroy();
});
function destroy() {
try {
socket.disconnect();
socket.removeAllListeners();
socket = null; //this will kill all event listeners working with socket
//set some other stuffs to NULL
} catch (ex) {
console.error("Error destroying socket listeners:", ex.message);
}
}
});
And, below is my client-side implementation:
useEffect(() => {
socket.on("activeInvites", (invite) => {
console.log(invite);
}); //a new connection client skips listening to this event. Can't understand why.
socket.on("newPrivateInvites", (invite) => {
setPrivateInvites((existingInvites) => [...existingInvites, invite]);
});
//I have commented out the code below; even if I uncomment it, it makes no difference
// return () => {
// socket.off("newPrivateInvites");
// socket.off("activeInvites");
// socket.removeAllListeners();
// };
}, [socket, privateInvites]);
//Below is the handler function I use to open up a Sweetalert2 dialog to create an invite
const createMyGameInviteHandler = () => {
swalert
.fire({
title: "New Invite",
text: "This will create a new game invite and unique joining code that you can share with your friends",
iconHtml: '<img src="/images/invite.png" />',
showCancelButton: true,
confirmButtonColor: "#3085d6",
cancelButtonColor: "#d33",
confirmButtonText: "Yeh! Let's Go!",
customClass: {
icon: "no-border",
},
})
.then((result) => {
if (result.isConfirmed) {
player.gameId = "1234";
setMyGameInvite(player);
socket.emit("newInvite", player); //This is where I create a new invitation event
}
});
};
In the above code, the "activeInvites" event is skipped by the new client even though Socket.io on the server side emits it right after the new connection is created. Note that I am using io.emit() to emit the event to all connected clients, so new clients should receive it as well. I am not able to see where the problem is. Could you please help me with this?
I tried to store the events generated by clients in the past so that I could serve them to new clients when they establish a connection. I was expecting the io.emit() method to emit an event that would be consumed by all clients, including new ones. However, new clients skip listening to this event. I am using the useEffect hook in a React component.
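One hedged guess, not from the original question: if the module-level socket connects before the component mounts, the server's emit on connection can arrive before useEffect has attached the "activeInvites" listener, and the event is silently dropped. A minimal sketch that sidesteps the race by having the client explicitly request the backlog once its listener is registered; the "getActiveInvites" event name is hypothetical:
// client (React): register the listener first, then ask for the backlog
useEffect(() => {
  const onActive = (invites) => setPrivateInvites(invites);
  socket.on("activeInvites", onActive);
  socket.emit("getActiveInvites"); // hypothetical event, handled in the server sketch below
  return () => {
    socket.off("activeInvites", onActive);
  };
}, [socket]);

// server, inside io.on("connection", ...): reply only to the requesting socket
socket.on("getActiveInvites", () => {
  socket.emit("activeInvites", activeInvites);
});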
I am trying to implement a WebSocket connection from a React TypeScript app using RTK Query. At the moment I am just trying to connect to a local socket.io server, but ultimately it will be an AWS API Gateway with Cognito auth. In any case, I am having some problems getting this to work as a simple starting point. I have a few elements at play that may be causing the issue(s):
MSW is being used to intercept HTTP requests to mock a RESTful API locally. I wonder if this is one of the issues.
I am adding the WebSocket as a query to an RTK Query createApi object alongside other queries and mutations. In reality the WebSocket query will need to hit a different API Gateway than the one currently set as the baseQuery baseUrl. Do I need to create a new and separate RTK Query api using createApi() for the WebSocket query?
Anyhow, here is the server code:
// example CRA socket.io from https://github.com/socketio/socket.io/blob/main/examples/create-react-app-example/server.js
const getWebsocketServerMock = () => {
const io = require('socket.io')({
cors: {
origin: ['http://localhost:3000']
}
});
io.on('connection', (socket: any) => {
console.log(`connect: ${socket.id}`);
socket.on('hello!', () => {
console.log(`hello from ${socket.id}`);
});
socket.on('disconnect', () => {
console.log(`disconnect: ${socket.id}`);
});
});
io.listen(3001);
setInterval(() => {
io.emit('message', new Date().toISOString());
}, 1000);
console.log('Websocket server file initialised');
};
getWebsocketServerMock();
export {};
My RTK Query api file looks like this:-
reducerPath: 'someApi',
baseQuery: baseQueryWithReauth,
endpoints: (builder) => ({
getWebsocketResponse: builder.query<WebsocketResult, void>({
query: () => ``,
async onCacheEntryAdded(arg, { updateCachedData, cacheDataLoaded, cacheEntryRemoved }) {
try {
// wait for the initial query to resolve before proceeding
await cacheDataLoaded;
const socket = io('http://localhost:3001', {});
console.log(`socket.connected: ${socket.connected}`);
socket.on('connect', () => {
console.log('socket connected on rtk query');
});
socket.on('message', (message) => {
console.log(`received message: ${message}`);
// updateCachedData((draft) => {
// draft.push(message);
// });
});
await cacheEntryRemoved;
} catch {
// no-op in case `cacheEntryRemoved` resolves before `cacheDataLoaded`,
// in which case `cacheDataLoaded` will throw
}
}
}),
getSomeOtherQuery(.....),
getSomeOtherMutation(....),
Any advice or thoughts would be greatly appreciated! I guess my main question is: should I be able to combine the WebSocket query in the same createApi function with other queries and mutations that need to use a different baseQuery url because they hit different API Gateways on AWS?
Much thanks,
Sam
You can prevent the baseQuery from being used by specifying a queryFn instead of query on your endpoint.
In the simplest version, it just returns null as data so you can modify it later, but if you have an initial websocket request you can also perform it in the queryFn.
queryFn: async () => { return { data: null } },
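Applied to the endpoint from the question, a rough sketch (the cached value here is just the latest message; adjust the types to whatever WebsocketResult actually is in the original app):
getWebsocketResponse: builder.query<WebsocketResult | null, void>({
  // bypasses baseQuery entirely; the cache entry starts out as null
  queryFn: async () => ({ data: null }),
  async onCacheEntryAdded(arg, { updateCachedData, cacheDataLoaded, cacheEntryRemoved }) {
    try {
      await cacheDataLoaded;
      const socket = io('http://localhost:3001', {});
      socket.on('message', (message) => {
        updateCachedData(() => message); // replace the cached data with the latest message
      });
      await cacheEntryRemoved;
      socket.close();
    } catch {
      // cacheEntryRemoved may resolve before cacheDataLoaded; nothing to clean up
    }
  }
}),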
I'm making a tweet deleter, and I want to update the user on the progress.
I'm new to socket.io, but I managed to connect the React frontend to the Node.js/Express backend.
io.on("connection", (socket) => {
console.log("new connection");
socket.on("disconnect", () => console.log("disconnected"));
});
When a user clicks the delete button, a delete request goes to the backend; the file containing the tweets is then processed and, thanks to Bull, each tweet is queued as a job.
Because I added io to my routes, I can use it inside them, but io.emit() emits to all connected clients, and I only want to emit to the sender, using socket.emit() inside my routes as well as inside my jobs in the queue.
The approach I tried was to write a function inside io.on("connection") like this and to make it global:
io.on("connection", (socket) => {
console.log("new connection");
socket.on("disconnect", () => console.log("disconnected"));
global.emitCustom = function (event, payload) {
socket.emit(event, payload);
};
});
which allowed me to use it in the queue's process function:
const deletionProcess = async ({
data: { tweetId, tokens, twitterId, numberOfTweets, index },
}) => {
emitCustom("deleting", {
type: "deleting",
progress: Math.round(((index + 1) / numberOfTweets) * 100),
});
};
Is there a better way to do this? Is my approach wrong, or does it have any flaws?
It worked in the few tests I did.
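Not from the original thread, but one common alternative worth sketching: send the caller's socket id along with the delete request (or with each job's data) and target that socket with io.to(...), instead of a global that only ever points at the most recently connected socket. The socketId field and the queue wiring below are assumptions:
// route (sketch): put the socket id into each job's data
app.delete('/api/tweets', async (req, res) => {
  const { socketId } = req.body; // sent by the client alongside the request
  // ...existing code that reads the tweet file and queues jobs, now including socketId
  await tweetQueue.add({ socketId /* , tweetId, tokens, twitterId, ... */ });
  res.sendStatus(202);
});

// queue processor: emit progress only to the socket that started the deletion
const deletionProcess = async ({ data: { socketId, index, numberOfTweets } }) => {
  io.to(socketId).emit("deleting", {
    type: "deleting",
    progress: Math.round(((index + 1) / numberOfTweets) * 100),
  });
};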
I have 3 components: device, server, and frontend (admin).
Server
Starts a socket.io server with 2 namespaces, /admin and /device.
If a socket from the /admin namespace sends data, the server passes it along to the /device namespace; if a socket from the /device namespace sends data, the server passes it along to the /admin namespace.
const io = require('socket.io')();
const device = io.of('/device');
const admin = io.of('/admin');
device.on('connection', (socket) => {
socket.on('data', (data) => {
console.log("PASSING DATA FROM [DEVICE] TO [ADMIN]")
admin.emit('data', data);
})
});
admin.on('connection', (socket) => {
socket.on('data', (data) => {
console.log("PASSING DATA FROM [ADMIN] TO [DEVICE]")
device.emit('data', data);
});
});
io.listen(80);
Device
Uses socket.io-client to connect to socket.io server.
Starts interactive shell session using node-pty.
const io = require('socket.io-client');
const socket = io('http://localhost:80/device'); // must match the /device namespace on the server
const os = require('os');
const pty = require('node-pty');
const shell = os.platform() === 'win32' ? 'powershell.exe' : 'bash';
const ptyProcess = pty.spawn(shell, [], {
name: 'xterm-color',
cols: 80,
rows: 30
});
socket.on('connect', () => {
});
// INPUT DATA
socket.on('data', (data) => {
ptyProcess.write(data);
});
// OUTPUTING DATA
ptyProcess.onData = (data) => {
socket.emit('data', data)
}
Frontend
Finally, I have the frontend, which uses xterm.js to create a terminal inside the browser. I am using Vue. The browser client connects to the socket.io server on the /admin namespace. Basically I have this:
<template>
<div id="app">
<div id="terminal" ref="terminal"></div>
</div>
</template>
<script>
import { Terminal } from 'xterm';
import { FitAddon } from 'xterm-addon-fit';
import { io } from 'socket.io-client';
export default {
mounted() {
const term = new Terminal({ cursorBlink : true });
term.open(this.$refs.terminal);
const socket = io('http://localhost:80/admin');
socket.on('connect', () => {
term.write('\r\n*** Connected to backend***\r\n');
term.onData((data) => {
socket.emit('data', data);
})
socket.on('data', (data) => {
term.write(data);
});
socket.on('disconnect', () => {
term.write('\r\n*** Disconnected from backend***\r\n');
});
});
}
}
</script>
Problem
- Starting the pty session seems to work; at least there are no errors reported. However, it seems the onData listener callback is never fired, even when I ptyProcess.write() something.
- Getting input from xterm all the way to the device's ptyProcess.write does not seem to work. I can see the data passed along through the socket.io sockets all the way to the device, but from there nothing happens. What am I missing? Also, I don't see my input in the xterm window either.
After switching from child_process to node-pty to create an interactive shell session, I almost had it right. The node-pty documentation marks the on('data') event handler as deprecated; instead I should use the .onData property of the process to register a callback, like this:
ptyProcess.onData = function(data) {
socket.emit('data', data);
};
But that didn't do anything, so I switched back to the deprecated way of adding an event listener:
ptyProcess.on('data', function(data) {
socket.emit('data', data);
});
Now I have a working interactive shell session forwarded from a remote device through a websocket inside my browser.
UPDATE
Did some more digging into onData and realized it's not a property but a method, so I was using it wrong. This would be the preferred way:
ptyProcess.onData(function(data) {
socket.emit('data', data);
});
Which also works as expected.
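Putting it together, a sketch of the device side with the corrected onData registration, reusing the names from the question:
const io = require('socket.io-client');
const os = require('os');
const pty = require('node-pty');

const socket = io('http://localhost:80/device');
const shell = os.platform() === 'win32' ? 'powershell.exe' : 'bash';
const ptyProcess = pty.spawn(shell, [], { name: 'xterm-color', cols: 80, rows: 30 });

// browser -> device: write incoming keystrokes into the shell
socket.on('data', (data) => ptyProcess.write(data));

// device -> browser: forward shell output (onData is a method, not a property)
ptyProcess.onData((data) => socket.emit('data', data));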
In my case, Socket.io behaves very unstably, and I need to make it more debuggable and stable.
Introduction
There is a Node.js based backend (Strapi CMS), where in bootstrap.js I am allowed to run my own custom code, in my case the socket server:
module.exports = async () => {
process.nextTick(() =>{
var io = require('socket.io')(strapi.server);
io.on('connection', async function(socket) {
console.log(`a user connected`)
// send a message on user connection
socket.emit('question', {message: await strapi.services.question.findOne({ id: 1 })});
//Send a message after a timeout of 4seconds
setTimeout(function() {
socket.emit('endEvent', { description: 'A custom event named EndEvent!'});
}, 4000);
// listen for user disconnect
socket.on('disconnect', () =>{
console.log('a user disconnected')
});
});
strapi.io = io; // register socket io inside strapi main object to use it globally anywhere
})
};
This server is expected to send a question from the database to the socket via a custom event, question. The frontend runs on ReactJS. Whenever the page is loaded, a socket is created; it establishes a connection with the server, receives the question event with the quiz question's data, and prints it as an object.
import React, { useState, useEffect } from "react";
import io from "socket.io-client";
const ENDPOINT = "http://localhost:1337";
const socket = io(ENDPOINT);
function Question() {
const [questionID, setQuestionID] = useState(0)
const [response, setResponse] = useState(null) // used by the 'message' handler below
useEffect(() => {
socket.on('message', (data)=> {setResponse(data)});
socket.on('endEvent', ()=>{console.log('End event done.')})
socket.on('question', (msg, cb) => {
console.log( msg.message)
});
// CLEAN UP THE EFFECT
return () => socket.disconnect();
}, [questionID]);
return(<p>The question debugable from console</p>);
}
export default Question;
Problem
When I refresh the page, the console.log(msg.message) does print on time, but not every time. Sometimes it only prints when some other React component is triggered. For detailed info please see the screenshots. This is what I expect:
It prints the question object and also "End event done.", which the server triggers 4 seconds after the connection. The instability shows itself when both of these commands run twice, or when the question object is delayed before it displays.
I am not sure whether the question is not being sent from the server or whether it is just not printed in the console on the client side... The useEffect() runs on the first load and is then triggered again every time questionID changes.
Unexplained behavior
When the countdown timer component signals that the time has expired, you may notice that the two important commands (the question printout and the 4-second delayed event) are somehow triggered as well, which I cannot explain: how can two independent ReactJS components interfere like this?
On the server side, I receive "connected" and "disconnected" messages when I refresh. Please comment if an update is needed, or suggest a better title.
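Not an answer from the thread, just a hedged observation: because the effect re-runs whenever questionID changes and its cleanup disconnects the shared, module-level socket instead of removing the handlers it added, listeners can pile up or fire at unexpected moments. A sketch of an effect that removes only its own listeners and keeps the socket connected; the handler names are mine:
useEffect(() => {
  const onQuestion = (msg) => console.log(msg.message);
  const onEnd = () => console.log('End event done.');

  socket.on('question', onQuestion);
  socket.on('endEvent', onEnd);

  // remove only these listeners; keep the shared socket connected
  return () => {
    socket.off('question', onQuestion);
    socket.off('endEvent', onEnd);
  };
}, [questionID]);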