Mongoose document.save() method is blocking websocket execution - node.js

I have a NodeJS app running a websocket server using the ws npm package. In a callback from the message event ws.on("message", async (rawData, isBinary) => {}), I'm trying to update a document and then save it. The .find() method works fine, but the .save() method blocks execution completely.
const users = await Promise.all(
  game.players.map((p) => User.findById(p.id)) // This works fine
);
// mess with users here
await Promise.all(users.map((u) => u.save())); // <-- This doesn't work
await users[0].save(); // <-- This doesn't work either
users[0].save((err, doc) => {
  // This doesn't work either
});
users[0].save().then((doc) => {
  // console.log(doc) // <-- This doesn't work either
});
For starters, is this even possible? Or am I thinking about it the wrong way, and should I trigger some kind of POST request from the client to hit my HTTP server instead to do the MongoDB operations? In any case, I'm struggling to understand why this is not possible.
Also, something interesting: when I tried logging everything everywhere, I did the following
console.log(users[0].save())
users[0].save((err, doc) => {
  if (err) console.log(err) // <-- no log here
  console.log(doc) // <-- no log here
})
which threw mongoose's ParallelSaveError: Can't save() the same doc multiple times in parallel. So I'm guessing the action does take place, but not entirely?
Any insight on this is welcome.
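Worth noting: the ParallelSaveError suggests each save() does start, but the experiments above call save() on the same users[0] document several times while earlier calls are still pending. A minimal sketch that persists each document exactly once, reusing the handler and the game variable from the question:

ws.on("message", async (rawData, isBinary) => {
  const users = await Promise.all(
    game.players.map((p) => User.findById(p.id))
  );
  // mutate users here, then save each document a single time
  await Promise.all(users.map((u) => u.save()));
});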

Related

Why does MongoDB timeout on client.connect?

I am trying to connect to my mongodb from node/express and I am receiving a connection timed out error. This is the code I am working with at the moment to find the solution.
await client.connect().then((res: any) => console.log(res))
And this is the error code given.
MongoServerSelectionError: connect ETIMEDOUT 52.64.110.205:27017
So far I have tried adding additional timeout params including
keepAlive=true&socketTimeoutMS=360000&connectTimeoutMS=360000
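For reference, the same timeouts can also be passed as driver options rather than query-string parameters; a minimal sketch (the env variable name is a placeholder):

import { MongoClient } from 'mongodb'

// values mirror the query-string parameters above
const client = new MongoClient(process.env.MONGO_URI as string, {
  socketTimeoutMS: 360000,
  connectTimeoutMS: 360000,
})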
I have also tried connecting to another cluster with a different username/password and received the same error. I don't think it's an error with env variables as all the other .env variables are working. And I think it might be worth mentioning that this function was working for the first day or two after I put it in.
Below is the entire function. I have commented some parts out for debugging purposes. It returns the same error either way, so I assume it can only be something to do with the connection.
export const handleCreateRequestDB = async (input: any) => {
  console.log(`creating new user in DB # ${input}`)
  const createUserAccount = async (client: any, newUser: object) => {
    await client.connect().then((res: any) => console.log(res)
      // await client.db('onlinestore').collection('user_data').insertOne(newUser).then((result: any) => {
      //   console.log(result)
      //   return result
      // })
    )
  }
  try {
    createUserAccount(client, input)
      .then((result) => { return result })
  } catch (e) {
    console.error(e);
    return false
  }
  // finally {
  //   await client.close()
  // }
}
With the help of everyone here, I was able to solve the issue. It appears that my IP address had changed since I created the Atlas cluster, and to fix this I just needed to add my updated IP address to the access list. It also appears that a connection timed out error from Mongo can be caused by access restrictions, such as a wrong username/password or, in my case, an IP block by my own service.

How to handle "UnhandledPromiseRejectionWarning" in Fastify without "async / await" or ".catch"

I'm running a simple Fastify server and when I make a request and it fails, I wanna handle that exception in the setErrorHandler.
How can I achieve that?
This doesn't seem to work:
import fastify from 'fastify';
import fetch from 'node-fetch';

const app = fastify();

app.setErrorHandler(async (error, request, response) => {
  // I want this to be called whenever there is an error
  console.log('setErrorHandler');
  return 'error';
});

app.get('/', (request, response) => {
  // I do not want to use async / await or .catch here
  fetch('https://...').then(() => { response.send(); });
});
I get the error UnhandledPromiseRejectionWarning and the server never sends a response.
I do NOT want to use async, await or .catch on the promise. The reason is I'm simulating a developer error. So if someone forgets to add a .catch and is not using async / await, I still wanna "catch" that error and return status 500 to the client.
I'm willing to change / wrap the request library adding a .catch in it, I just don't wanna change that endpoint handler code in general, since it's sort of out of my control (another developer might code it any way they want).
The reason is I'm simulating a developer error.
You can't manage these errors at runtime to reply to the request, because you don't have a reference to the reply object to act on.
You can catch those errors with the following, but the reply cannot be fulfilled:
process.on('unhandledRejection', (err) => {
  console.log(err)
})
As an alternative, I would set connectionTimeout, so the client will get a timeout error at least.
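A minimal sketch of that option (the 5000 ms value is an arbitrary choice, not from the original post):

const app = fastify({
  // fail sockets that stay open too long; 0 (the default) disables the timeout
  connectionTimeout: 5000
});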
As you wrote, you already know that you should change the handler code to let Fastify be aware of the promise:
// add return
app.get('/', (request, response) => {
  return fetch('https://...').then(() => { response.send(); });
})
For this reason, I think the solution to your problem should be handled offline: add a linter rule (like so) and integrate some static code analysis in the CI pipeline to reject bad code.
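As a hedged illustration of such a rule, assuming the eslint-plugin-promise package is what you reach for:

// .eslintrc.js (assumes eslint-plugin-promise is installed)
module.exports = {
  plugins: ['promise'],
  rules: {
    // flag promises that are neither returned nor given a .catch
    'promise/catch-or-return': 'error'
  }
};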

JSON array from Express route is undefined in React console

I am currently working on a web app to manage an external database. I am not very familiar with Express or NodeJS at this point, so I wanted to ask: how do I send a JSON object to the client-side console without getting undefined?
I have this function to connect, then select what I need, and afterwards I convert my JSON object to an array of JSON objects. It displays the data fine in the console as well.
async function connect() {
  try {
    await sequelize.authenticate();
    console.log('Connection has been established successfully.');
  } catch (err) {
    console.error('Unable to connect to the database:', err);
  }
  info = await sequelize.query('select * from LeadsInformation', { type: QueryTypes.SELECT });
  const details = JSON.stringify(info);
  console.log(details);
  detailsArray = JSON.parse(details);
  console.log(detailsArray);
}
Everything works fine here; I can get the data and display it in the terminal.
This is my GET route:
app.get("/list", (req, res) => {
connect();
res.json(detailsArray)
});
I have tried a couple of suggested approaches based on other explanations and code snippets, but none of them has worked so far, so I left it like that. I thought looping through the data in the request handler would be a solution, but it did not work. I also tried using the JSON itself and displaying it, and tried the body-parser library, though that library has not been updated for two years. I am using axios to fetch the data; it works fine when I send a simple string like "hello world".
Is there anything I'm missing, or do you have any other solutions? I would also appreciate an explanation if possible.
Edit: It might also have to do something with how I am getting the response in the frontend. I'll look into that as well and will update this thread if I sort it out!
This is the way I get the response. I am currently trying to show in the console. I am using axios API.
Axios({
  method: "GET",
  url: "http://localhost:5000/list",
  headers: {
    "Content-Type": "application/json"
  }
}).then(res => {
  console.log(res.data.json);
});
You probably get undefined in the route because the connect function doesn't return anything.
Also, connect is an async function, which means it returns a Promise, and you have to call its .then method or use await to get the value out of it.
Here is the code snippet with fixes that I described above.
async function connect() {
  try {
    await sequelize.authenticate();
    console.log('Connection has been established successfully.');
  } catch (err) {
    console.error('Unable to connect to the database:', err);
  }
  const info = await sequelize.query('select * from LeadsInformation', { type: QueryTypes.SELECT });
  const details = JSON.stringify(info);
  const detailsArray = JSON.parse(details);
  return detailsArray;
}
app.get("/list", async (req, res) => {
const result = await connect();
res.json(result)
});
Notice that in the route handler function I also use async and await, because I call connect, which is an asynchronous function.
The solution above worked. Another problem I had was that I wasn't reading the response correctly on the frontend.
I ended up getting the response after changing my code from:
console.log(res.data.json);
To:
console.log(res.data[1]);
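For reference, with axios the parsed response body lives on res.data, so logging the whole value shows the entire array without hard-coding an index (a minimal sketch):

Axios({
  method: "GET",
  url: "http://localhost:5000/list"
}).then(res => {
  console.log(res.data); // the full array the route sent with res.json(...)
});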

Does nodejs support updating many collections in one router? - node.js

I tried updating my two collections in one router, but it keeps giving me this error:
Error: Can't set headers after they are sent.
    at validateHeader (_http_outgoing.js:494:11)
    at ServerResponse.setHeader (_http_outgoing.js:501:3)
Here is what I did:
api.js
router.put('/editbloodrequest', function(req, res) {
  // this works
  Bloodrequest.findOne({ _id: "5c1d2c8c68503b0adceffa92" }, function(err, bloodrequest) {
    if (err) throw err;
    bloodrequest.request_status = "claimed";
  });
  // but after inserting this, it gives me the error
  Blooddonation.findOne({ _id: "5c00fa03dadb0c3b00739dd9" }, function(err, blooddonation) {
    if (err) throw err;
    blooddonation.blood_group = "test";
  });
});
First, I strongly suggest that you abandon callbacks and use Mongo's Promise behavior instead. Unless you intended to execute both requests concurrently, this is how you would express it using promises:
Bloodrequest.findOne({ _id: "..." }).then(bloodrequest => {
  bloodrequest.request_status = "claimed";
  return bloodrequest.save(); // persist the change
})
.then(() => Blooddonation.findOne({ _id: "..." }))
.then(blooddonation => {
  blooddonation.blood_group = "test";
  return blooddonation.save(); // persist the change
});
Second, I'm guessing you cut some code from your example; nowhere in there do you actually respond (using res). Based on the error, I'm guessing you attempted to respond in both of your callbacks. You can only respond once; that error typically means you have already redirected / responded / etc. and are now attempting to do it again.
Usually you'll do this:
Bloodrequest.findOne({ _id: "..." }).then(bloodrequest => {
  bloodrequest.request_status = "claimed";
  return bloodrequest.save();
})
.then(() => Blooddonation.findOne({ _id: "..." }))
.then(blooddonation => {
  blooddonation.blood_group = "test";
  return blooddonation.save();
})
.then(() => {
  // set your positive res response
})
.catch(error => {
  // set your error res response
  // since you're using promises, if any of the above steps fail, you'll
  // end up here.
});
You need to require the models of the collections you want to update in api.js.
In the route handler function, you need a transaction across the related collections. You can use the fawn npm package to make this easier:
https://www.npmjs.com/package/fawn
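For illustration, here is a hedged sketch of the same two updates done atomically with Mongoose's own session-based transactions instead of fawn (assumes a Mongoose version with withTransaction support and a MongoDB replica-set deployment):

router.put('/editbloodrequest', async function(req, res) {
  const session = await mongoose.startSession();
  try {
    await session.withTransaction(async () => {
      await Bloodrequest.updateOne(
        { _id: "5c1d2c8c68503b0adceffa92" },
        { request_status: "claimed" },
        { session }
      );
      await Blooddonation.updateOne(
        { _id: "5c00fa03dadb0c3b00739dd9" },
        { blood_group: "test" },
        { session }
      );
    });
    res.sendStatus(200); // respond exactly once, after both updates
  } catch (err) {
    res.sendStatus(500);
  } finally {
    session.endSession();
  }
});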

when to disconnect and when to end a pg client or pool

My stack is Node, Express and the pg module. I have really tried to understand the documentation and some outdated tutorials, but I don't know when and how to disconnect and end a client.
For some routes I decided to use a pool. This is my code:
const pool = new pg.Pool({
  user: 'pooluser',
  host: 'localhost',
  database: 'mydb',
  password: 'pooluser',
  port: 5432
});

pool.on('error', (err, client) => {
  console.log('error ', err);
  process.exit(-1);
});

app.get('/', (req, res) => {
  pool.connect()
    .then(client => {
      return client.query('select ....')
        .then(resolved => {
          client.release();
          console.log(resolved.rows);
        })
        .catch(e => {
          client.release();
          console.log('error', e);
        })
      pool.end();
    })
});
In the routes of the CMS, I use a client instead of a pool, with different db privileges than the pool user:
const client = new pg.Client({
  user: 'clientuser',
  host: 'localhost',
  database: 'mydb',
  password: 'clientuser',
  port: 5432
});
client.connect();

const signup = (user) => {
  return new Promise((resolved, rejeted) => {
    getUser(user.email)
      .then(getUserRes => {
        if (!getUserRes) {
          return resolved(false);
        }
        client.query('insert into user(username, password) values ($1,$2)', [user.username, user.password])
          .then(queryRes => {
            client.end();
            resolved(true);
          })
          .catch(queryError => {
            client.end();
            rejeted('username already used');
          });
      })
      .catch(getUserError => {
        return rejeted('error');
      });
  })
};

const getUser = (username) => {
  return new Promise((resolved, rejeted) => {
    client.query('select username from user WHERE username= $1', [username])
      .then(res => {
        client.end();
        if (res.rows.length == 0) {
          return resolved(true);
        }
        resolved(false);
      })
      .catch(e => {
        client.end();
        console.error('error ', e);
      });
  })
}
In this case, if I get "username already used" and try to re-post with another username, the getUser query never starts and the page hangs. If I remove client.end(); from both functions, it works.
I am confused, so please advise on how and when to disconnect and completely end a pool or a client. Any hint, explanation or tutorial will be appreciated.
Thank you
First, from the pg documentation*:
const { Pool } = require('pg')

const pool = new Pool()

// the pool will emit an error on behalf of any idle clients
// it contains if a backend error or network partition happens
pool.on('error', (err, client) => {
  console.error('Unexpected error on idle client', err) // your callback here
  process.exit(-1)
})

// promise - checkout a client
pool.connect()
  .then(client => {
    return client.query('SELECT * FROM users WHERE id = $1', [1]) // your query string here
      .then(res => {
        client.release()
        console.log(res.rows[0]) // your callback here
      })
      .catch(e => {
        client.release()
        console.log(e.stack) // your callback here
      })
  })
This code/construct is sufficient to get your pool working, once you provide the "your ... here" pieces. If you shut down your application, the connections will close normally, since the pool is designed precisely not to hang, even though it does provide a manual way of closing; see the last section of the article.
Also look at the previous red section, which says "You must always return the client...". With this syntax:
you accept the mandatory client.release() instruction before accessing the result.
you scope/closure the client within your callbacks.
Then, from the pg.client documentation*:
Plain text query with a promise
const { Client } = require('pg')
const client = new Client()

client.connect()

client.query('SELECT NOW()') // your query string here
  .then(result => console.log(result)) // your callback here
  .catch(e => console.error(e.stack)) // your callback here
  .then(() => client.end())
This seems to me the clearest syntax:
you end the client whatever the result.
you access the result before ending the client.
you don't scope/closure the client within your callbacks.
It is this sort of opposition between the two syntaxes that may be confusing at first sight, but there is no magic in there; it is implementation construction syntax.
Focus on your callbacks and queries, not on these constructs; just pick the one that looks most elegant to you and feed it with your code.
*I added the comments // your xxx here for clarity
You shouldn't disconnect the pool on every query; a connection pool is supposed to keep "hot" connections.
I usually create a global pool on startup and close it on application stop (if ever); you just have to release the connection back to the pool every time a query ends, as you already do, and use the same pool in the signup function as well (see the sketch below).
Sometimes I need to preserve connections, so I use a wrapper around the query function that checks whether the connection is active before performing the query, but that's just an optimization.
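As a sketch of that advice, the signup path could reuse the one shared pool instead of the dedicated client, with no end() calls at all (names taken from the question; error handling simplified):

// reuse the pool created at startup; pool.query releases the client for us
const getUser = (username) =>
  pool.query('select username from user WHERE username= $1', [username])
    .then(res => res.rows.length === 0); // true when the username is free

const signup = (user) =>
  getUser(user.email).then(isFree => {
    if (!isFree) return false;
    return pool.query('insert into user(username, password) values ($1,$2)',
      [user.username, user.password]).then(() => true);
  });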
In case you don't want to manage opening/closing connections or pools, or releasing clients, you could try https://github.com/vitaly-t/pg-promise; it manages all that stuff silently and it works well.
The node-postgres documentation on GitHub says:
pro tip: unless you need to run a transaction (which requires a single client for multiple queries) or you have some other edge case like streaming rows or using a cursor you should almost always just use pool.query. Its easy, it does the right thing ™️, and wont ever forget to return clients back to the pool after the query is done.
So for a non-transactional query, the code below is enough.
const { Pool } = require('pg')
const pool = new Pool()

pool.query('select username from user WHERE username= $1', [username], function (err, res) {
  console.log(res.rows[0].username)
})
By using pool.query, the library will take care of releasing the client after the query is done.
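The same idea with async/await, for newer Node versions (a minimal sketch; findUser is a hypothetical helper name, not from the original answer):

// pool.query checks out a client, runs the query and releases the client for us
async function findUser(username) {
  const res = await pool.query('select username from user WHERE username= $1', [username])
  return res.rows[0]
}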
It's quite simple: a client connection (a single connection) opens, you query with it, and once you are done you end it.
The pool concept is different: in the case of mysql you have to .release() the connection back to the pool once you are done with it, but it seems that with pg it's a different story:
From an issue on the github repo : Cannot use a pool after calling end on the pool #1635
"Cannot use a pool after calling end on the pool"
You can't reuse a pool after it has been closed (i.e. after calling
the .end() function). You would need to recreate the pool and discard
the old one.
The simplest way to deal with pooling in a Lambda is to not do it at
all. Have your database interactions create their own connections and
close them when they're done. You can't maintain a pool across
freeze/thaw cycles anyway as the underlying TCP sockets would be
closed.
If opening/closing the connections becomes a performance issue then
look into setting up an external pool like pgbouncer.
So I would say that your best option is not to end the pool, unless you are shutting down the server.
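A minimal sketch of that advice (wiring it to SIGTERM is my assumption, not part of the original answer):

// end the pool only when the process itself is going away
process.on('SIGTERM', async () => {
  await pool.end() // waits for checked-out clients to be returned, then closes every connection
  process.exit(0)
})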
