Using socketio with react redux - node.js

I'm building a small chat application using React, Redux, Socket.IO and Node with Mongoose. Normally the Redux flow goes through actions (which make API calls and receive data) and then dispatches the data. In my case, though, the socket emits to a certain event but no data comes back until the back-end manually emits it. So, to keep the proper Redux flow, should I register a socket event listener inside my actions to receive the data coming from the back-end and then dispatch it, or is there another, more proper way to achieve this?
Here is a sample of what I'm planning to do in my actions file:
function sendMessage(data) {
  return {
    type: SEND_MESSAGE,
    payload: data
  };
}

export const sendNewMessage = (socket, data) => {
  return dispatch => {
    socket.emit("send message", data);
    socket.on("new message", function (data) {
      dispatch(sendMessage(data));
    });
  };
};

That seems perfectly reasonable to me. I would suggest using thunk's "extra argument" for this such that your components do not need to know about the actual socket object:
const store = createStore(
  reducer,
  applyMiddleware(thunk.withExtraArgument({ socket }))
)

export const sendNewMessage = (data) =>
  (dispatch, getState, { socket }) => {
    socket.emit("send message", data)
    socket.on("new message", (data) => {
      dispatch(sendMessage(data))
    })
  }
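Since the socket is supplied through the extra argument, a component only ever dispatches the action creator. A minimal sketch of what the calling side might look like (the component name, the hook usage, and the handler are made up for illustration):

// Hypothetical component-side usage: the socket never appears here,
// it is injected by the thunk middleware configured above.
import { useDispatch } from "react-redux"
import { sendNewMessage } from "./actions"

export function ChatInput() {
  const dispatch = useDispatch()

  const handleSend = (text) => {
    // The thunk receives { socket } as its third argument.
    dispatch(sendNewMessage({ text }))
  }

  // render an input that calls handleSend on submit
  return null
}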

Related

Websockets with RTK Query configuration issues

I am trying to implement a WebSocket connection from a React TypeScript app using RTK Query. At the moment I am just trying to connect to a local socket.io server, but ultimately it will be an AWS API Gateway with Cognito auth. In any case, I am having some problems getting even this simple starting point to work. There are a few elements at play that may be causing the issue(s):-
MSW is being used to intercept HTTP requests to mock a RESTful API locally. I wonder if this is one of the issues.
I am adding the WebSocket as a query to an RTK Query createApi object alongside other queries and mutations. In reality the WebSocket query will need to hit a different API Gateway than the one currently set as the baseQuery baseUrl. Do I need to create a new, separate RTK Query api using createApi() for the WebSocket query?
Anyhow, here is the server code:-
// example CRA socket.io from https://github.com/socketio/socket.io/blob/main/examples/create-react-app-example/server.js
const getWebsocketServerMock = () => {
  const io = require('socket.io')({
    cors: {
      origin: ['http://localhost:3000']
    }
  });

  io.on('connection', (socket: any) => {
    console.log(`connect: ${socket.id}`);

    socket.on('hello!', () => {
      console.log(`hello from ${socket.id}`);
    });

    socket.on('disconnect', () => {
      console.log(`disconnect: ${socket.id}`);
    });
  });

  io.listen(3001);

  setInterval(() => {
    io.emit('message', new Date().toISOString());
  }, 1000);

  console.log('Websocket server file initialised');
};

getWebsocketServerMock();

export {};
My RTK Query api file looks like this:-
reducerPath: 'someApi',
baseQuery: baseQueryWithReauth,
endpoints: (builder) => ({
  getWebsocketResponse: builder.query<WebsocketResult, void>({
    query: () => ``,
    async onCacheEntryAdded(arg, { updateCachedData, cacheDataLoaded, cacheEntryRemoved }) {
      try {
        // wait for the initial query to resolve before proceeding
        await cacheDataLoaded;

        const socket = io('http://localhost:3001', {});
        console.log(`socket.connected: ${socket.connected}`);

        socket.on('connect', () => {
          console.log('socket connected on rtk query');
        });

        socket.on('message', (message) => {
          console.log(`received message: ${message}`);
          // updateCachedData((draft) => {
          //   draft.push(message);
          // });
        });

        await cacheEntryRemoved;
      } catch {
        // no-op in case `cacheEntryRemoved` resolves before `cacheDataLoaded`,
        // in which case `cacheDataLoaded` will throw
      }
    }
  }),
  getSomeOtherQuery(.....),
  getSomeOtherMutation(....),
Any advice or thoughts would be greatly appreciated! I guess my main question is: should I be able to combine the WebSocket query in the same createApi call with other queries and mutations that need a different baseQuery URL, since they hit different API Gateways on AWS?
Much thanks,
Sam
You can prevent the baseQuery from being used by specifying a queryFn instead of query on your endpoint.
In the simplest version, that just returns null as data so you can modify it later; but if you have an initial websocket request, you can also do that in the queryFn.
queryFn: async () => { return { data: null } },
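As a rough sketch (not the asker's actual code), combining that queryFn with the onCacheEntryAdded logic from the question could look like this. fetchBaseQuery stands in for baseQueryWithReauth here, and the cache is assumed to be a plain array of message strings:

import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';
import { io } from 'socket.io-client';

export const someApi = createApi({
  reducerPath: 'someApi',
  // Stand-in for baseQueryWithReauth; only the REST endpoints will use it.
  baseQuery: fetchBaseQuery({ baseUrl: 'https://example.com/api/' }),
  endpoints: (builder) => ({
    getWebsocketResponse: builder.query<string[], void>({
      // queryFn bypasses baseQuery entirely, so no HTTP request is made;
      // it just seeds the cache that the socket handler will update.
      queryFn: async () => ({ data: [] }),
      async onCacheEntryAdded(
        arg,
        { updateCachedData, cacheDataLoaded, cacheEntryRemoved }
      ) {
        const socket = io('http://localhost:3001');
        try {
          await cacheDataLoaded;
          socket.on('message', (message: string) => {
            updateCachedData((draft) => {
              draft.push(message);
            });
          });
          await cacheEntryRemoved;
        } catch {
          // cacheEntryRemoved resolved before cacheDataLoaded; nothing to clean up
        }
        socket.close();
      },
    }),
    // ...other REST queries and mutations can stay in the same createApi
  }),
});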

I want to use socket.emit() outside io.on("connection"), is this approach correct?

I'm making a tweet deleter, and I want to update the user on the progress.
I'm new to socket.io; I managed to connect the React frontend to the Node.js/Express backend.
io.on("connection", (socket) => {
console.log("new connection");
socket.on("disconnect", () => console.log("disconnected"));
});
When a user clicks the delete button, a delete request goes to the backend; the file containing the tweets is then processed and, thanks to Bull, each tweet is queued as a job.
Because I added io to my routes, I can use it inside them, but io.emit() emits to all connected clients, and I only want to emit to the sender, using socket.emit() inside my routes as well as inside my jobs in the queue.
The approach I tried was to write a function inside io.on("connection") like this and make it global:
io.on("connection", (socket) => {
console.log("new connection");
socket.on("disconnect", () => console.log("disconnected"));
global.emitCustom = function (event, payload) {
socket.emit(event, payload);
};
});
which allowed me to use it in the queue's process function:
const deletionProcess = async ({
  data: { tweetId, tokens, twitterId, numberOfTweets, index },
}) => {
  emitCustom("deleting", {
    type: "deleting",
    progress: Math.round(((index + 1) / numberOfTweets) * 100),
  });
};
Is there a better way to do this? Is my approach wrong, or does it have any flaws?
It worked in the few tests I did.
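For comparison, a common alternative to a global emit function is to have each client join a room keyed by something the job already knows (the user id, for example) and target that room from the queue processor. A rough, untested sketch; all names here are illustrative:

// Somewhere with access to the shared io instance.
io.on("connection", (socket) => {
  // Hypothetical: the client identifies itself right after connecting.
  socket.on("register", (userId) => {
    socket.join(`user:${userId}`);
  });
});

// Inside the Bull processor, emit only to that user's room.
const deletionProcess = async ({ data: { twitterId, numberOfTweets, index } }) => {
  io.to(`user:${twitterId}`).emit("deleting", {
    type: "deleting",
    progress: Math.round(((index + 1) / numberOfTweets) * 100),
  });
};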

How to assert that app sends correct data to API server with POST request

I am writing a React.js application that talks to an API server. I have read tons of articles on how to mock these calls and send some fake response from the API. I can test using @testing-library/react, I can easily mock axios with axios-mock-adapter, and I can test fetch requests that use the HTTP GET method. But I cannot find anywhere how to make sure that my app, when it sends a POST request, sends the correct data to the API, i.e. that it sends a JSON payload with e.g. an "id" field, or a "name" field set to "abc", or something like this.
I am new to React.js. Please advise how to write tests asserting what the app sends to the API. Is it possible?
Let's say that I have a function named doSomething, like below, called with onClick of some button.
const doSomething = async (userId, something) => {
  try {
    await REST_API.post('doSomething', {
      user_id: userId,
      something: something
    });
    return true;
  } catch (error) {
    window.alert(error);
    return false;
  }
};
REST_API above is an axios instance.
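For reference, such an instance is typically created in its own module, roughly like this (the base URL is a placeholder):

// axios_instance.js (illustrative module; the baseURL is a placeholder)
import axios from 'axios';

export const REST_API = axios.create({
  baseURL: 'https://api.example.com/',
  headers: { 'Content-Type': 'application/json' },
});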
How can I ensure that I (or some other developer) didn't make a typo and put "userId" instead of "user_id" in the payload of the request?
If you have to be sure you call the API correctly, I'd use Jest as follows:
import axios from 'axios';
import { doSomething } from './doSomething'; // adjust the path to wherever doSomething lives

jest.mock('axios', () => ({
  post: jest.fn(),
}));

describe('test', () => {
  it('doSomething', () => {
    const userId = 123;
    const something = 'abc';

    doSomething(userId, something);

    expect(axios.post).toBeCalledWith(
      'doSomething', {
        user_id: userId,
        something,
      },
    );
  });
});
Or, if you use an instance, define it in another file (axios_instance.js) and use the following test:
jest.mock('./axios_instance', () => ({
  instance: {
    post: jest.fn(),
  },
}));

describe('test', () => {
  it('doSomething', () => {
    const userId = 123;
    const something = 'abc';

    doSomethingInstance(userId, something);

    expect(instance.post).toBeCalledWith(
      'doSomething', {
        user_id: userId,
        something,
      },
    );
  });
});
For your need I would use Swagger and its tooling. You would kill three birds with one stone:
Have proper API documentation: https://swagger.io/tools/swagger-ui/
Protect the backend: ensure inputs/outputs are valid, and throw detailed exceptions if a client sends bad data: https://github.com/cdimascio/express-openapi-validator-example
Protect the frontend: use client API generation to generate JS classes used by your clients. That way they won't arbitrarily create objects manually and send them to the server (fingers crossed) but use a dedicated API with setters: https://github.com/swagger-api/swagger-codegen
That way you have a rock-solid frontend + backend + documentation combo.

Socket.io: Other client only updates when being interacted

I'm trying to set up a realtime application using socket.io in Angular and Node.js, which is not working as intended.
Whenever a client makes a new post, the other clients won't update until you interact with them (e.g. clicking somewhere on the page, or clicking on the browser tab).
However, with the browser console open, I can see the new post when I log the posts/objects, without needing to interact with the clients.
Angular:
import io from 'socket.io-client';

const socket = io('http://localhost:3000');

posts: Post[] = [];
...
// Inside ngOnInit:
socket.on('data123', (res) => {
  console.log('Updating list..', res);
  this.postService.getPosts();
  this.postsSub = this.postService.getPostUpdateListener()
    .subscribe((posts: Post[]) => {
      this.posts = posts;
    });
});
Displaying in the template:
<... *ngFor="let item of posts">
Inside PostsService:
getPosts() {
  this.http.get<{ message: string, posts: Post[] }>('http://localhost:3000/api/posts')
    .subscribe((postData) => {
      this.posts = postData.posts;
      this.postsUpdate.next([...this.posts]);
    });
}
Node.js - this socket.io solution is not yet sending the actual list:
const io = socket(server);

io.sockets.on('connection', (socket) => {
  console.log(`new connection id: ${socket.id}`);
  sendData(socket);
});

function sendData(socket) {
  socket.emit('data123', 'TODO: send the actual updated list');
  setTimeout(() => {
    console.log('sending to client');
    sendData(socket);
  }, 3000);
}
What worked as intended:
Using setInterval on the front-end instead of socket.on(...) gave the intended result, meaning the clients update automatically without any interaction. I'm fully aware this solution is horrible, but I assume it pinpoints that something is wrong with the socket solution in the Angular part above.
Wait, every time socket.on('data123', (res) => {... fires, you are creating a new subscription? That's the wrong way; you should create the subscription once, where you set up your socket connection.
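In other words, subscribe once when the component initialises, and let the socket handler only trigger the refresh. A rough sketch of that arrangement (only the pieces from the question are reused; the rest is assumed):

// Inside ngOnInit: one subscription, created once.
this.postsSub = this.postService.getPostUpdateListener()
  .subscribe((posts: Post[]) => {
    this.posts = posts;
  });

// The socket handler only asks the service to refresh the list;
// the subscription above receives the updated posts.
socket.on('data123', () => {
  this.postService.getPosts();
});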

How to subscribe to stream in Angular2?

I have my streaming web service running on localhost:8080/stream, which responds whenever a new message is added to the subscribed MQTT stream. I want to consume this web service in my Angular2 app. I'm using RxJS to consume Node.js APIs in Angular2. I tried the following code, which calls localhost:8080/stream once and ends the response. I want my observable to listen to the web service continuously.
var headers = new Headers();
headers.append('Access-Control-Allow-Origin', '*');
headers.append('Content-Type', 'application/json');

let options = new RequestOptions({ headers: headers }); // create a request option

return this.http.get("http://localhost:8080/stream", options) // ...using a GET request
  .map((res: Response) => res.json()) // ...and calling .json() on the response to return data
  .catch((error: any) => Observable.throw(error.json().error));
If I understand your question right, you want to consume data from a stream where new messages arrive periodically.
To achieve this you need to subscribe to the service.
return this.http.get("http://localhost:8080/stream", options) // ...using post request
.map((res: Response) => res.json()) // ...and calling .json() on the response to return data
.catch((error: any) => Observable.throw(error.json().error)
.subscribe(result => this.result =result));
The result will be updated as new data arrives, and you can use it the way you want.
Note: it is best practice to put HTTP calls in separate services and subscribe to the service in your component.
For your reference, I am adding an example I have worked on for demo purposes.
Create a service for http calls
@Injectable()
export class JsonPlaceHolderService {
  constructor(private http: Http) {}

  getAllPosts(): Observable<Post[]> {
    return this.http.get("https://jsonplaceholder.typicode.com/posts")
      .map(res => res.json());
  }
}
From your component, call the service and listen to changes continuously.
export class PostsComponent implements OnInit {
  constructor(private _service: JsonPlaceHolderService) {}

  jphPosts: Post[];
  title: string = "JsonPlaceHolder's Post data";

  ngOnInit(): void {
    this._service.getAllPosts().subscribe(
      (data) => this.jphPosts = data,
      (err) => console.log(err),
      () => console.log("service call completed")
    );
  }
}
You should use a WebSocket in Angular and make it listen to your service URL; after that, listen to its events (open, close, message) and create your own Subject stream using RxJS to push the new data to subscribers.
Please check the URL below:
https://medium.com/#lwojciechowski/websockets-with-angular2-and-rxjs-8b6c5be02fac
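A rough sketch of that Subject-based approach, written against current Angular/RxJS imports (the ws:// URL and the message handling are placeholders):

import { Injectable } from '@angular/core';
import { Observable, Subject } from 'rxjs';

@Injectable()
export class StreamService {
  private messages$ = new Subject<string>();

  connect(url: string = 'ws://localhost:8080/stream'): Observable<string> {
    const socket = new WebSocket(url);
    socket.onopen = () => console.log('stream opened');
    socket.onclose = () => console.log('stream closed');
    socket.onmessage = (event: MessageEvent) => {
      // Push each incoming message to every subscriber.
      this.messages$.next(event.data);
    };
    return this.messages$.asObservable();
  }
}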
Streaming data from Node.js to Angular with socket.io
This is something that would have been of great use when I was trying to do this. The following contains code from a socket.io package for Angular; credit goes to the original author. It is taken from a working solution and may need some tweaking.
Server side
var app = express(),
    http = require('http'),
    ioServer = require('socket.io');

var httpServer = http.createServer(app);
var io = new ioServer();

httpServer.listen(1337, function () {
  console.log('httpServer listening on port 1337');
});

io.attach(httpServer);

io.on('connection', function (socket) {
  console.log('Connected socket ' + socket.id);
});

// MQTT subscription
client.on('connect', function () {
  client.subscribe(topic, function () {
    console.log("subscribed to " + topic);
    client.on('message', function (topic, msg, pkt) {
      io.sockets.emit("message", { topic: topic, msg: msg, pkt: pkt });
    });
  });
});
Client Side
Create a custom service in Angular with the following.
import * as io from 'socket.io-client';
Declare the following fields:
private ioSocket: any;
private subscribersCounter = 0;
Inside the service class:
constructor() {
  this.ioSocket = io('socketUrl', {});
}

on(eventName: string, callback: Function) {
  this.ioSocket.on(eventName, callback);
}

removeListener(eventName: string, callback?: Function) {
  return this.ioSocket.removeListener.apply(this.ioSocket, arguments);
}

fromEvent<T>(eventName: string): Observable<T> {
  this.subscribersCounter++;
  return Observable.create((observer: any) => {
    this.ioSocket.on(eventName, (data: T) => {
      observer.next(data);
    });
    return () => {
      if (this.subscribersCounter === 1) {
        this.ioSocket.removeListener(eventName);
      }
    };
  }).share();
}
In your component, import the custom service as service and use it like this:
service.on("message", dat => {
console.log(dat);
});
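Or, if you prefer to keep the stream as an Observable, use the fromEvent helper from the same service (a short assumed usage):

this.service.fromEvent<string>('message')
  .subscribe(dat => console.log(dat));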
