How to use Socket.io to update an API live - node.js

Is it possible to update the JSON served by a live API according to changes in the database?
server.js
const connection = mongoose.connection;

connection.once("open", () => {
  // Live stream - posts
  const observePostChanges = connection.collection("posts").watch();

  // Observe changes in the database
  observePostChanges.on("change", (change) => {
    // console.log('changes right now ->', change);
    switch (change.operationType) {
      // create request
      case "insert":
        // Create posts -> operationType handler
        break;
      // patch/put request
      case "update":
        // Update posts -> operationType handler
        break;
      // delete request
      case "delete":
        // Delete posts -> operationType handler
        break;
    }
  });
});
Using the MongoDB documentation, I found a method that lets me detect changes in the database live whenever a POST/PATCH/DELETE happens.
controller/postController.js
// Create a new post - take all values from the request body and save them to the DB
exports.createPost = catchAsync(async (req, res) => {
  const create = await Post.create(req.body);
  res.status(201).json({
    status: "success",
    data: create
  });
});

// Get information from the DB
exports.getAllPosts = catchAsync(async (req, res, next) => {
  const getAll = await Post.find();
  res.status(200).json({
    status: "success",
    data: {
      post: getAll
    }
  });
});
Is there a way to use sockets in this situation to make the application live?
Right now, the mobile application and the website have to refresh to see newly added content.

You want to configure the server first:
io = socket(server); // server: your Express/HTTP server instance
io.on("connection", function (socket) {
  // console.log("Made socket connection");
});
Then you can connect to the socket from your client app and subscribe to a unique event name:
this.socket = io.connect(YOUR_URL);
this.socket.on(HERE_YOUR_EVENT_NAME, (data: any) => {
  // you get the data here
});
Whenever you want to send data to the client app, emit it from the server side using that event name:
io.emit(event_name, data);
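Putting this together with the change stream from the question, a minimal end-to-end sketch could look like the following (assumptions: an Express app, a posts collection, and a posts_updated event name the clients subscribe to; note that MongoDB change streams require a replica set):
// server.js - sketch: broadcast MongoDB change-stream events to every connected client
const express = require("express");
const mongoose = require("mongoose");

const app = express();
const server = require("http").createServer(app);
const io = require("socket.io")(server);

mongoose.connection.once("open", () => {
  const postChanges = mongoose.connection.collection("posts").watch();
  postChanges.on("change", (change) => {
    // Push the change to all clients; they can refetch getAllPosts or patch local state.
    io.emit("posts_updated", {
      operationType: change.operationType, // "insert" | "update" | "delete"
      documentKey: change.documentKey
    });
  });
});

io.on("connection", (socket) => {
  // console.log("Made socket connection", socket.id);
});

server.listen(3000);
On the client, a socket.on("posts_updated", ...) listener can then re-fetch the posts endpoint (or apply the change locally) instead of requiring a manual refresh.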

Related

Advanced - Socket.io Join Room in external file on HTTP request

I am getting confused with this Node.js, Angular 13, and Socket.IO scenario.
First of all, let's assume we are already saving all the required info in a database, like roomId, roomOwner, username, etc.
So, let's say we want to create an online quiz game using sockets to sync all players, 6 max for this scenario. HOWEVER, here is the problem...
In the Angular code there is this service which connects the client with the back end.
SocketService.ts
export class SocketService {
  socket: any;
  readonly url: string = "ws://localhost:3000";

  constructor() {
    this.socket = io(this.url);
  }
}
On the server side, index.js initializes the WebSocket:
index.js
const app = express();
const io = require('./sockets/websocket')(app);
Inside webSocket.js we create the Socket.IO instance, which is exported and used across the back-end controllers as needed.
webSocket.js
module.exports = function (app) {
  this.server = require('http').createServer(app);
  this.socket = require('socket.io');
  this.io = socket(server, {
    cors: {
      origin: "https://localhost:4200",
      credentials: true
    }
  });

  this.server.listen(3000, () => {
    console.log("Socket IO is listening on port 3000");
  });

  io.on("connection", function (socket) {
    console.log("A user connected");
  });

  this.registerSocketToRoom = function (roomId) {
    try {
      console.log('[socket]', 'join room :', roomId);
      io.join(roomId);
      io.sockets.to(roomId).emit('user joined', socket.id);
    } catch (e) {
      console.log('[error]', 'join room :', e);
      io.emit('error', "couldn't perform requested action");
    }
  };
};
This is an example controller. We import the Socket.IO instance exported from the webSocket.js file. Let's say we want to join a room when the client makes an HTTP request to do so. HOWEVER, we did NOT join the room "on socket connection", so we have to do it now, using the exported registerSocketToRoom method.
GameRoomManagerController.js
require('../../sockets/websocket');
... // Some code here
exports.joinGameRoom = function (req, res) {
  const roomId = req.params.roomId;
  console.log(roomId);
  registerSocketToRoom(roomId);
  return res.send({ status: "success", msg: `joined Room: ${roomId}` });
};
When executing the flow of creating a room -> saving the info to the DB -> joining the room, the following error occurs:
TypeError: io.sockets.join is not a function
In theory this sounds right to me, but I think I am misunderstanding the difference between io and socket.
Can someone explain to me what's going on here? Is it even possible
to export the same instance of io to be used in any place of the
back-end?
Is it even possible to join a room AFTER the connection was
created?
What's the difference between io and socket?
Before starting, it is better to get acquainted with some terms from the socket.io library.
io
It refers to all the sockets connected to the server. You can send messages individually, in groups, or to all sockets at once.
The socket you have in mind is the one written this way:
io.on('connection', socket => {
  socket.on('message', data => {
  });
});
Inside that callback, socket refers to the single connected client: you can read the data it sends for each event, or relay that data on to other sockets.
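As a quick illustration of the difference between the two (all standard Socket.IO calls; the event name hello is just an example):
io.on('connection', socket => {
  socket.emit('hello', 'only this client');             // the single connected socket
  socket.broadcast.emit('hello', 'everyone else');      // all sockets except this one
  io.emit('hello', 'every connected client');           // all sockets, including this one
  io.to('room-1').emit('hello', 'everyone in room-1');  // all sockets in a given room
});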
Now, on to the problem: the error comes from not following this hierarchy in your code. I suggest you consult the socket.io documentation next time to strengthen your foundation.
Finally, here is a simple example of a correct implementation:
let app = require('express')(),
    http = require('http').Server(app),
    io = require('socket.io')(http);

let listOfRoom = [];

io.on('connection', socket => {
  let joinUserInRoom = (roomId) => {
        if (socket.adapter.rooms.has(roomId) === false) {
          listOfRoom.push(roomId);
          socket.join(roomId);
        }
      },
      leaveUserInRoom = (roomId) => {
        if (listOfRoom.includes(roomId)) {
          listOfRoom.splice(listOfRoom.indexOf(roomId), 1);
          socket.leave(roomId);
        }
      };

  socket.on('joinRoom', data => {
    joinUserInRoom(data.roomId);
  });

  socket.on('disconnect', data => {
    leaveUserInRoom(data.roomId);
  });

  socket.on('messageRoom', data => {
    io.to(data.roomId).emit('eventMessageRoom', data); // send data in special room
  });
});
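To answer the "export the same io instance" part: one common pattern (just a sketch, with an assumed sockets/websocket.js layout) is to initialize io once and expose a getter, so every controller gets the same instance:
// sockets/websocket.js - sketch: create the Socket.IO server once and share it
let io = null;

module.exports = {
  init(httpServer) {
    io = require('socket.io')(httpServer, {
      cors: { origin: "https://localhost:4200", credentials: true }
    });
    return io;
  },
  getIO() {
    if (!io) throw new Error("Socket.IO has not been initialized yet");
    return io;
  }
};
Joining a room after an HTTP request is possible, but the server then needs to know which socket belongs to the requester, for example by having the client send its socket.id along with the request and, in Socket.IO v4, calling getIO().sockets.sockets.get(socketId).join(roomId).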

Socket IO not sending new sql data after client connects

I am creating a real-time app using Socket.IO, Node.js, Express.js, React for the frontend, and Microsoft SQL for the database. I only want to send data when the database is updated or when a new client connects. When a client first connects, the connection handler fires and sends my data to the new client, but when I make a change to my database, the data never gets sent. My code is below. I feel as though I am close, but I am missing something that makes it work. I appreciate any kind of help.
const app = express();
const httpServer = require('http').createServer(app);
const io = require('socket.io')(httpServer);
const path = __dirname + '/views/';
let sqlQuery = require('./controllers/sqlController').queryDatabase;

let currentQueryData = {};
let connectedSocketID;
let objectMatched = true;

app.use(express.static(path));

app.get('/', function (req, res) {
  res.sendFile(path + "index.html");
});

// Function to emit data only when a change is found.
const sendData = (data, socket) => {
  socket.emit('markerCreation', data);
};

// Compare both objects and return a boolean value.
const compareObjects = (object1, object2) => {
  return JSON.stringify(object1) === JSON.stringify(object2);
};

httpServer.listen(3001, () => {
  console.log(`Server listening at ${3001}`);
});

io.on('connection', async socket => {
  // Get new query data, then compare it with the currently saved query data
  let newQueryData = await sqlQuery();
  objectMatched = compareObjects(currentQueryData, newQueryData);
  if (!objectMatched) { // If the objects did not match, save the new data in currentQueryData and send it to the client.
    currentQueryData = newQueryData;
    sendData(currentQueryData, socket);
  } else if (connectedSocketID !== socket.id) { // If this socket is not already connected, save it and send the data to the client.
    connectedSocketID = socket.id;
    sendData(currentQueryData, socket);
  }
  // Issue: Socket.IO stops sending to the connected client. If a new update happens in the SQL database,
  // the change isn't passed along to the client.
});
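One detail worth noting here: the query only runs inside the connection handler, so nothing ever re-executes it when the database changes. A minimal sketch of one possible approach, periodic polling plus a broadcast (reusing the sqlQuery, compareObjects, and currentQueryData names from above), might look like:
// Poll the database periodically and broadcast only when the result actually changes.
setInterval(async () => {
  const newQueryData = await sqlQuery();
  if (!compareObjects(currentQueryData, newQueryData)) {
    currentQueryData = newQueryData;
    io.emit('markerCreation', currentQueryData); // push to every connected client
  }
}, 5000); // check every 5 seconds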

Node.js mongoose unable to 'save()' a document on MongoDB database

My problem is quite peculiar, as I believe I have done everything "by the book".
I'm able to successfully connect to a MongoDB cluster I've created through MongoDB Atlas. When I make a 'POST' request to save a choice that was picked from an array of choices, I successfully create a document through the Model specified below. I then try to save that document to MongoDB by calling the 'save()' method, but it hangs and nothing comes out of it (even if I use a 'catch' to see if any errors occurred).
I'm completely lost as to what is wrong, and what I need to do to solve this. I was hoping you could give me some pointers.
MongoDB connection, schema, and model:
const mongoose = require('mongoose');
const URL = process.env.MONGODB_URL;

mongoose.connect(URL, { useNewUrlParser: true, useUnifiedTopology: true })
  .then(() => {
    console.log('Successfully connected to our MongoDB database.');
  }).catch((error) => {
    console.log('Could not connect to our MongoDB database.', error.message);
  });

const choicesMadeSchema = new mongoose.Schema({
  allChoices: Array,
  pickedChoice: String
});

const ChoiceMade = mongoose.model('ChoiceMade', choicesMadeSchema);
module.exports = ChoiceMade; // Exports our 'ChoiceMade' constructor, to be used by other modules.
index.js:
/* 1 - Setting things up */
require('dotenv').config();
const express = require('express');
const server = express();
const PORT = process.env.PORT;
const parserOfRequestBody = require('body-parser');
server.use(parserOfRequestBody.json());

/* 2 - Retrieving all the data we need from our 'MongoDB' database */
// Imports the 'mongoose' library, which will allow us to easily interact with our 'MongoDB' database.
const mongoose = require('mongoose');
// Imports our 'ChoiceMade' constructor.
const ChoiceMade = require('./database/database.js');
// Will hold the five latest choices that have been made (and thus saved on our 'MongoDB' database).
let fiveLatestChoicesMade;

// Retrieves the five latest choices that have been made (and thus saved on our 'MongoDB' database).
ChoiceMade.find({}).then((allChoicesEverMade) => {
  const allChoicesEverMadeArray = allChoicesEverMade.map((choiceMade) => {
    return choiceMade.toJSON();
  });
  fiveLatestChoicesMade = allChoicesEverMadeArray.slice(allChoicesEverMadeArray.length - 5).reverse();
  console.log("These are the five latest choices that have been made:", fiveLatestChoicesMade);
  mongoose.connection.close();
});

/* 3 - How the server should handle requests */
// 'GET' (i.e., 'retrieve') requests
server.get('/allChoicesMade', (request, response) => {
  console.log("This is the data that will be sent as a response to the 'GET' request:", fiveLatestChoicesMade);
  response.json(fiveLatestChoicesMade);
});

// 'POST' (i.e., 'send') requests
server.post('/allChoicesMade', (request, response) => {
  const newChoiceMadeData = request.body;
  if (Object.keys(newChoiceMadeData).length === 0) {
    return response.status(400).json({ error: "No data was provided." });
  }
  const newChoiceMade = new ChoiceMade({
    allChoices: newChoiceMadeData.allChoices,
    pickedChoice: newChoiceMadeData.pickedChoice
  });
  console.log("This is the new 'choice made' entry that we are going to save on our 'MongoDB' database:", newChoiceMade); // All good until here
  newChoiceMade.save().then((savedChoiceMade) => {
    console.log('The choice that was made has been saved!');
    response.json(savedChoiceMade);
    mongoose.connection.close();
  }).catch((error) => {
    console.log('An error occurred:', error);
  });
});

/* 4 - Telling the server to 'listen' for requests */
server.listen(PORT, () => {
  console.log("Our 'Express' server is running, and listening for requests made to port '" + PORT + "'.");
});
SOLUTION TO THE PROBLEM
In my code's section 2, I was mistakenly closing the connection upon retrieving all the data I need to make my app work. I was doing this (...)
// Retrieves the five latest choices that have been made (and thus saved on our 'MongoDB' database).
ChoiceMade.find({}).then((allChoicesEverMade) => {
  const allChoicesEverMadeArray = allChoicesEverMade.map((choiceMade) => {
    return choiceMade.toJSON();
  });
  fiveLatestChoicesMade = allChoicesEverMadeArray.slice(allChoicesEverMadeArray.length - 5).reverse();
  console.log("These are the five latest choices that have been made:", fiveLatestChoicesMade);
  mongoose.connection.close(); // This should not be here!!!
});
(...) when I should be doing
// Retrieves the five latest choices that have been made (and thus saved on our 'MongoDB' database).
ChoiceMade.find({}).then((allChoicesEverMade) => {
  const allChoicesEverMadeArray = allChoicesEverMade.map((choiceMade) => {
    return choiceMade.toJSON();
  });
  fiveLatestChoicesMade = allChoicesEverMadeArray.slice(allChoicesEverMadeArray.length - 5).reverse();
  console.log("These are the five latest choices that have been made:", fiveLatestChoicesMade);
  // Now that I don't have mongoose.connection.close(), everything's OK!
});
Basically, and in my particular case, I was closing my connection to the MongoDB database after retrieving data from it, and then trying to add a new record to it when I didn't have a connection to it anymore.
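In other words, the mongoose connection should stay open for the lifetime of the server. If it needs to be closed cleanly, a common place to do that (a sketch, not part of the original code) is a shutdown handler:
// Close the mongoose connection only when the whole process shuts down.
process.on('SIGINT', async () => {
  await mongoose.connection.close();
  console.log('MongoDB connection closed. Exiting.');
  process.exit(0);
});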

How to listen to socketIO private message in React client?

I have a Socket.IO instance in an Express app that listens to React client requests. A user can send private messages to a specific person. The server receives the private message and should dispatch it back to both the sender and the recipient via the io.to(socketId).emit(content) method.
How to listen to this event in React and update the message array? In order to ease the process, I have created a connectedUsers object, whose keys are mongoDB's user._id, and whose values are the unique socketID generated by socketIO. This way, I can easily address message to specific persons in the client. Once sent, the messages are stored in a MongoDB database.
Here is the back-end. The point of interest is io.on("privateMessage")
const connectedUsers = {};

const socketManager = (io) => {
  io.on("identifyUser", (user) => {
    if (!([user.id] in connectedUsers)) {
      connectedUsers[user.id] = io.id;
    }
  });

  io.on("privateMessage", (data) => {
    io.to(connectedUsers[data.recipientId]).emit(data.message);
    io.to(connectedUsers[data.senderId]).emit(data.message);
  });

  io.on("disconnect", () => console.log("user disconnected!"));
};
Here is the listening function in React. Everything works but the "privateMessage" part.
async function getUser(socketId) {
  try {
    const res = await ax.get(`${serverUrl}/login`);
    const socket = io(serverUrl);
    socketId.current = socket;
    socket.on("connect", () => {
      socket.emit("identifyUser", { id: res.data._id });
      socket.on("privateMessage", (data) =>
        console.log("private message received!", data)
      );
    });
  } catch (err) {
    throw new Error(err);
  }
}
Thanks for your help!
I think you need to put the socket.on("privateMessage") listener outside the socket.on("connect") callback. React should register all of its event listeners up front.
The back end must be responsible for the authorization.
On the client side the event is connect (connection is the server-side event).
The subscription to the privateMessage event should sit outside the connect callback.
This code should work. Hope it helps.
import io from 'socket.io-client'

async function getUser(socketId) {
  try {
    const res = await ax.get(`${serverUrl}/login`);
    const socket = io(serverUrl);
    socketId.current = socket;
    socket.on("connect", () => {
      socket.emit("identifyUser", { id: res.data._id });
    });
    socket.on("privateMessage", (data) =>
      console.log("private message received!", data)
    );
  } catch (err) {
    throw new Error(err);
  }
}
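For that listener to ever fire, the server also has to register its handlers per connected socket and emit under the privateMessage event name. A server-side sketch (assuming the same connectedUsers map and event names as in the question):
const connectedUsers = {};

io.on("connection", (socket) => {
  socket.on("identifyUser", (user) => {
    connectedUsers[user.id] = socket.id; // map the MongoDB user id to this socket's id
  });

  socket.on("privateMessage", (data) => {
    // Send the message to both recipient and sender, under an explicit event name.
    io.to(connectedUsers[data.recipientId]).emit("privateMessage", data.message);
    io.to(connectedUsers[data.senderId]).emit("privateMessage", data.message);
  });

  socket.on("disconnect", () => console.log("user disconnected!"));
});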

One app with nodejs, different database per client

I want to create an application with nodejs in which different companies/clients connect but use different databases.
For example:
Application nodejs running on localhost: 3001
Mongo server running at localhost: 27017
A client (CLIENT1) accesses the nodejs application and modifies data
in its database -> localhost:27017/client1
Another client (CLIENT2) does the same and accesses the nodejs application
but modifies its data in localhost:27017/client2
And so on for every customer who signs up for the application.
--------EDIT----------
I've been testing things to get what I wanted and I think I've come up with a possible solution: create a connection for each database access and disconnect once that access has finished. I do not know if it is a good solution, but I think it may be worth sharing:
index.js
var express = require('express');
var app = express();
var repository = require('./demoqueryrepository');

app.get('/createdb', function (req, res) {
  // TODO: With JWT, decode the client id and pass it as a param to the repository
  repository.crearDemo(req.query.id, function (err, resp) {
    if (err) console.log(err);
    else res.send("resp");
  });
});

app.listen(3000, function () {
  console.log('Example app listening on port 3000!');
});
demomodel.js
var mongo = require('mongoose');
var Schema = mongo.Schema;

module.exports = mongo.model('demodto', new Schema({
  Name: { type: String },
  Code: { type: Number },
}));
demoqueryrepository.js
var _demo = require('./demoquerydto');
var mongo = require('mongoose');
var mongoconnect = require('./mongoconnect');

module.exports = {
  crearDemo: function (idclient, callback) {
    let newdemo = new _demo({
      Name: " Demo " + idclient,
      Code: idclient
    });
    mongoconnect.connect(idclient);
    newdemo.save(function (error) {
      if (error) callback(error, null);
      else {
        callback(null, "success");
        mongo.disconnect();
      }
    });
  }
};
mongoconnect.js
var mongo = require('mongoose');

module.exports = {
  connect: function (idclient) {
    mongo.connect('mongodb://localhost:27017/' + idclient, { useMongoClient: true }, function (err, res) {
      if (err) console.log(err);
      else console.log("Connected to db");
    });
  }
};
When I launch requests like:
localhost:3000/createdb?id=12
localhost:3000/createdb?id=13
localhost:3000/createdb?id=14
the databases are created with those ids on the database server.
What you are trying to do is build a multi-tenant Node.js application.
The approach you are taking has a few disadvantages:
There is one common database keyed by user id that tells you which db to connect to, and then one database per client. This means you have n+1 connections.
Your application will not scale: you will either over-provision or under-provision your databases or, worse, have to deploy changes for every new client you onboard.
Have you considered having just one database, since the schema is the same? The common fear of one client accessing another client's data can be taken care of by applying a default per-client scope to every query, as sketched below.
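A rough sketch of that per-client scope (the clientId field and helper below are illustrative additions, not existing code):
// Shared schema with a tenant discriminator field.
var demoSchema = new Schema({
  Name: { type: String },
  Code: { type: Number },
  clientId: { type: String, required: true, index: true }
});
var Demo = mongo.model('demodto', demoSchema);

// Every query is scoped to the authenticated client's id (e.g. taken from a JWT).
function findDemosForClient(clientId, callback) {
  Demo.find({ clientId: clientId }, callback);
}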
I had the same issue and wrote a blog post about it.
