How to use Promise with redux and socket.io - node.js

I am a newbie to react-redux. I am stuck in a situation where I emit an action to the server. The server in turn makes a third-party API request which returns a promise. My server.js file for handling the socket connection on the server looks like the following:
import Server from 'socket.io';

export default function startServer(store) {
  const io = new Server().attach(8090);
  store.subscribe(
    () => io.emit('curr_news', store.getState().toJS())
  );
  io.on('connection', (socket) => {
    socket.emit('curr_news', store.getState().toJS());
    socket.on('action', store.dispatch.bind(store));
  });
}
As you can see, the client emits the action; when the server receives it, it makes the appropriate request and then emits the current state. The following is a sample reducer file:
export default function reducer(curr_feeds = CURRENT_FEEDS, action) {
  switch (action.type) {
    case 'GET_TEST1DATA':
      return getTest1data(curr_feeds);
    case 'GET_TEST2DATA':
      return getTest2data(curr_feeds);
  }
}
Here getTest1data and getTest2data essentially return a promise, since they make a request to some third-party API. My problem is that the socket emits curr_news immediately, at which point the value of store.getState() is undefined.
My question is: how do I make the store observe and emit over the socket once the promise resolves, and not before? Thanks in advance for the help.

When you subscribe to the store, your listener runs whenever the state updates. If that new state is a promise, you could wait for it to resolve before emitting:
store.subscribe(() =>
  store.getState().then(promisedData => io.emit('curr_news', promisedData))
);
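Alternatively, here is a minimal sketch (not the poster's code) of a connection handler that keeps the store synchronous: resolve the promise where the action arrives on the server and dispatch a plain follow-up action carrying the resolved data, so the subscriber only ever emits real state. The TEST1DATA_RECEIVED action type and the shape of the dispatched payload are illustrative assumptions:
io.on('connection', (socket) => {
  socket.emit('curr_news', store.getState().toJS());
  socket.on('action', (action) => {
    if (action.type === 'GET_TEST1DATA') {
      // getTest1data is assumed to return a promise of the new feeds
      getTest1data(store.getState()).then((feeds) =>
        store.dispatch({ type: 'TEST1DATA_RECEIVED', feeds })
      );
    } else {
      store.dispatch(action);
    }
  });
});
The reducer then handles TEST1DATA_RECEIVED synchronously, and the existing store.subscribe listener emits curr_news only after the resolved data is actually in the store.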

Related

Trigger function in app.js from module.js

First time poster — please forgive (and call me out) on any formatting mistakes!
So let's say I have app.js, which requires module.js. In module.js I have an express server running that can receive simple web requests (GET / POST). Is there any way for me to trigger functions in app.js when I receive a request in module.js?
Something along the lines of:
var webModule = require('./module.js');
webModule.on('GET', async function (req) {
  // do stuff with the request
});
The reason I'm not just putting the express server in app.js is that I want to run a certain amount of code to verify that the request is legitimate, and then re-use module.js in separate scripts to minimise the amount of code and avoid having to update 5-6 scripts every time I want to update the authentication process.
You can use the event system built into Node.js. If you set it up correctly you can emit an event on each call and have a listener respond to it.
Example from the Node.js documentation:
const EventEmitter = require('events');

class MyEmitter extends EventEmitter {}

const myEmitter = new MyEmitter();
myEmitter.on('event', () => {
  console.log('an event occurred!');
});
myEmitter.emit('event');
In your case you could create a class that extends EventEmitter and use an instance of it for each request. That instance can emit an event when a request comes in, and app.js handles it by setting up a listener.
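A minimal sketch of that idea, assuming module.js exports its emitter instance and reusing the verifyRequest check from the update below; the port, route, and event names are illustrative:
// module.js: export an EventEmitter and emit on each verified request
const EventEmitter = require('events');
const express = require('express');

const emitter = new EventEmitter();
const app = express();

app.get('*', (req, res) => {
  if (verifyRequest(req) == 'okay') { // authentication, assumed to exist elsewhere
    emitter.emit('GET', req, res);
  } else {
    res.status(403).send('Forbidden');
  }
});

app.listen(8080);
module.exports = emitter;

// app.js: listen for the events emitted by module.js
const webModule = require('./module.js');
webModule.on('GET', (req, res) => {
  // do stuff with the request, then respond
  res.send('ok');
});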
I might have a solution; it will take some time to code (since my question is obviously very simplified).
In module.js:
var functionActions = {};
module.exports = {
  on: async function (requestType, returnFunction) {
    functionActions[requestType] = { do: returnFunction };
  }
};
// general express code
app.get('*', function (req, res) {
  if (verifyRequest(req) == 'okay') { // authentication
    return functionActions['GET'].do();
  } else { // if the request is not authorised
    res.status(403).send('Stop trying to hack me you stupid hacker ಠ╭╮ಠ');
  }
});
I'll try it out and update this answer once I've figured out the potential kinks.

Socket io emit function making new connection every time

I want to make one connection and reuse it for each emit in Socket.IO with React and Node.js, but a new connection is made every time emit is called from the frontend. Is there any solution for this?
Reactjs code:
socket.emit('getMessages', 1000,this.state.userData._id,id);
Nodejs Code:
client.on('getMessages', (interval, sID, rID) => {
  try {
    // get messages from mongodb
  } catch (err) {
    console.log(err);
  }
  client.on('disconnect', () => {
    console.log(`Socket ${client.id} disconnected.`);
  });
});
I want to get messages by ID, but every time I send new IDs it makes a new connection and sends the previous messages as well.
If you want something to only happen once, you could use client.once(...).
If you want the server to stop listening for the event after it has been handled once, you could add client.removeListener('getMessages') at the end of your listener.
Hope this is helpful. Not sure what behaviour you're looking for exactly, though.
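A minimal sketch of that idea, assuming handlers are registered once per connection; the names mirror the question's code and the message-fetching part is left as a comment:
io.on('connection', (client) => {
  // register the disconnect handler once per connection,
  // not inside another event handler
  client.once('disconnect', () => {
    console.log(`Socket ${client.id} disconnected.`);
  });

  client.on('getMessages', (interval, sID, rID) => {
    try {
      // get messages from mongodb for sID/rID and emit them back
    } catch (err) {
      console.log(err);
    }
  });
});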

React Hooks: state not updating when called inside Socket.io handler

const [questionIndex, setQuestionIndex] = useState(0);
...
socket.on('next', () => {
  console.log('hey');
  setQuestionIndex(questionIndex + 1);
});
...
useEffect(() => {
  console.log(questionIndex);
}, [questionIndex]);
I have a page that connects to a websocket using Socket.io. I am attempting to update the value of my questionIndex state variable when I receive a 'next' message from the socket. I seem to be receiving the message because 'hey' is printed, but questionIndex only updates after the first 'hey'. Afterwards, hey is printed, but questionIndex is not (I use the useEffect to print questionIndex when it is updated). Do you guys see anything wrong?
I was also facing this issue. Actually, we should not rely on the captured state value when computing the update.
The right way to do this is with a functional update:
setQuestionIndex(questionIndex => questionIndex + 1);
For those wondering, it looks like the socket.on('next') handler was using the original value of questionIndex (which was 0) every time. The handler closed over the variable when it was registered rather than reading the current value at call time. Not sure where in the documentation this behaviour is specified.
My solution was to change the function handler as such:
socket.on('next', (newIndex) => {
  console.log('hey');
  setQuestionIndex(newIndex);
});
This solves the problem by providing the value to set questionIndex to (rather than reading it upon receiving a 'next' event).
It looks like you open a new socket on every render and never disconnect it.
Try placing socket.on inside a useEffect whose return value is a cleanup function that disconnects the socket:
useEffect(() => {
  socket.on('next', () => {
    console.log('hey');
    setQuestionIndex(questionIndex + 1);
  });
  return () => socket.disconnect();
}, [questionIndex]);
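A sketch combining both answers, assuming the socket instance itself is created outside the component: the listener is registered once, removed in the cleanup, and the functional updater avoids the stale closure, so the effect needs no dependency on questionIndex. socket.off is assumed to be available here (socket.removeListener behaves the same way):
useEffect(() => {
  const onNext = () => {
    console.log('hey');
    setQuestionIndex((prev) => prev + 1);
  };
  socket.on('next', onNext);
  return () => socket.off('next', onNext);
}, []);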

IORedis (or node_redis) callback not firing after calling custom Redis commands/modules

When using a Redis client (ioredis or node_redis) inside a websocket's message event in a Node.js app, the callback for any command does not fire immediately (the operation does take place on the Redis server, though).
What is strange is that the callback for the first command fires after I send a second message, and the callback for the second fires after I send a third.
wss.on('connection', (socket, request) => {
  socket.on('message', (data) => {
    console.log("will send test command");
    this.pubClient.hset("test10", "f1", "v1", (err, value) => {
      // callback not firing first time
      console.log("test command reply received");
    });
  });
});
The Redis command works as expected in other parts of the app, though, and even when placed directly inside the connection handler, like below:
wss.on('connection', (socket, request) => {
  console.log("will send test command");
  this.pubClient.hset("test10", "f1", "v1", (err, value) => {
    // callback fires
    console.log("test command reply received");
  });
  socket.on('message', (data) => {});
});
UPDATE:
I had this all wrong. The reason for the weird callback behaviour is that one of my custom Redis modules was not returning a reply.
This seems to have caused every subsequent callback to be off by one, each firing one reply late.

Node Postgres Module not responding

I have an Amazon Elastic Beanstalk Node app that uses Postgres on Amazon RDS. To interface Node with Postgres I use node-postgres. The code looks like this:
var pg = require('pg'),
    done, client;

function DataObject(config, success, error) {
  var PG_CONNECT = "postgres://" + config.username + ":" + config.password + "@" +
    config.server + ":" + config.port + "/" + config.database;
  self = this;
  pg.connect(PG_CONNECT, function (_error, client, done) {
    if (_error) { error(); }
    else {
      self.client = client;
      self.done = done;
      success();
    }
  });
}
DataObject.prototype.add_data = function (data, success, error) {
  self = this;
  this.client.query('INSERT INTO sample (data) VALUES ($1,$2)',
    [data], function (_error, result) {
      self.done();
      success();
    });
};
To use it I create my DataObject and then call add_data every time new data comes along. Within add_data I call this/self.done() to release the connection back to the pool. Now when I repeatedly make those requests, the client.query callback never comes back. Under what circumstances could this lead to a blocking/non-responding database interface?
The way you are using the pool is incorrect.
You are asking for a connection from the pool in the DataObject function. This function acts as a constructor and is executed once per data object, so only one connection is ever requested from the pool.
When you call add_data the first time, the query is executed and the connection is returned to the pool. The subsequent calls therefore fail, since the connection has already been returned.
You can verify this by logging _error:
DataObject.prototype.add_data = function (data, success, error) {
  self = this;
  this.client.query('INSERT INTO sample (data) VALUES ($1,$2)',
    [data], function (_error, result) {
      if (_error) console.log(_error); // log the error to the console
      self.done();
      success();
    });
};
There are a couple of ways you can do it differently:
Ask for a connection for every query made. You'll need to move the code that asks the pool for a client into add_data (see the sketch below).
Release the client only after performing all queries. This is the trickier way: since calls are made asynchronously, you need to be careful that the client is not shared, i.e. that no new request is made until the client.query callback has finished.
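A minimal sketch of the first option, keeping the question's pg.connect style. It assumes the connection string is stored on the instance (e.g. as this.PG_CONNECT), and it uses a single placeholder to match the single bound value; adapt the names to your code:
DataObject.prototype.add_data = function (data, success, error) {
  // ask the pool for a client per query and release it when the query finishes
  pg.connect(this.PG_CONNECT, function (_error, client, done) {
    if (_error) { return error(_error); }
    client.query('INSERT INTO sample (data) VALUES ($1)', [data],
      function (queryError, result) {
        done(); // return the connection to the pool
        if (queryError) { return error(queryError); }
        success(result);
      });
  });
};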
