POST the same UUID into two tables with Node / PostgreSQL

I am trying to make a POST request that will insert the same UUID value into two tables: 'employee' and 'skill'. I have tried this a few different ways, but have not been able to do so. Here is my query for posting the UUID (and a 'summary') into one table:
app.post("/employees/:id/skills", async(req, res) => {
try {
const { summary } = req.body;
const addEmployeeSkill = await pool.query(
"INSERT INTO skill(skill_uuid, summary)VALUES(uuid_generate_v4(), $1) RETURNING *",
[summary],
);
res.json(addEmployeeSkill.rows[0]);
} catch (err) {
console.error(err.message);
}
});
My question is: how do I get the same UUID that is being generated into the 'skill' table to also insert into the skill_uuid column of the 'employee' table?
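One way is to run both statements in a single transaction on one client: insert into skill first, capture the UUID from RETURNING, then write that value to employee. Here is a minimal sketch, assuming employee has a skill_uuid column and that the employee row is identified by the :id route parameter (both are guesses from the question):

app.post("/employees/:id/skills", async (req, res) => {
  const client = await pool.connect();
  try {
    const { summary } = req.body;
    await client.query("BEGIN");
    // insert the skill and capture the generated UUID
    const skillResult = await client.query(
      "INSERT INTO skill (skill_uuid, summary) VALUES (uuid_generate_v4(), $1) RETURNING *",
      [summary]
    );
    const { skill_uuid } = skillResult.rows[0];
    // write the same UUID to the employee row (assumed column and criteria)
    await client.query(
      "UPDATE employee SET skill_uuid = $1 WHERE id = $2",
      [skill_uuid, req.params.id]
    );
    await client.query("COMMIT");
    res.json(skillResult.rows[0]);
  } catch (err) {
    await client.query("ROLLBACK");
    console.error(err.message);
    res.status(500).json({ error: err.message });
  } finally {
    client.release();
  }
});

Running both statements on the same client inside BEGIN/COMMIT keeps the two writes atomic, so a failure in the second statement rolls back the first.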

Related

Two Knex queries (node.js), one of them in a forEach loop

One table is UserChats, with a list of chat-to-user mappings.
The other table is ChatList, with all chats.
I need to build an array of a specific user's chats into a variable and return it.
_DB("UserChats").select('*').where({
uID: user_id
}).then(chats => {
const userChats = []
chats.forEach(chat => {
_DB('ChatList').where({
id: chat.chatID
}).select('*').then(chat_data => {
userChats.push(chat_data);
});
});
return userChats; // but this variable is empty
});
How can I rewrite this cleanly so that all of the inner queries finish before the result is returned?
The join query will be as follows:
_DB('UserChats')
  .select('*')
  .where({ "UserChats.uID": user_id })
  .join('ChatList', 'UserChats.chatID', 'ChatList.id')
  .then(function (rows) {
    return rows;
  });
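If two separate queries are really needed rather than the join, the per-chat lookups can be collected with Promise.all so the outer promise resolves only after every inner query has finished. A sketch against the same _DB instance:

_DB("UserChats")
  .select("*")
  .where({ uID: user_id })
  .then((chats) => {
    // run one ChatList lookup per chat and wait for all of them
    const lookups = chats.map((chat) =>
      _DB("ChatList").where({ id: chat.chatID }).select("*")
    );
    return Promise.all(lookups);
  })
  .then((userChats) => {
    // userChats is an array of result arrays, one per chat
    return userChats;
  });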

show data after insert using Sequelize raw queries in Express

I'm trying to send the inserted data with a raw query using Sequelize and then show it. Below is my code:
const c_product_post = async (req, res) => {
  try {
    const sql = `INSERT INTO products (p_name, p_price, p_stock, p_review, "createdAt", "updatedAt")
      VALUES ('${req.body.product_name}', ${req.body.product_price}, ${req.body.product_stock}, ${req.body.product_review}, now(), now());`
    const postData = await Product.sequelize.query(sql)
    // await postData.save()
    res.send({
      message: "success add new product",
      data: postData
    })
  }
  catch (err) {
    res.send({
      message: err
    })
  }
}
What I'm trying to achieve is that, after the data is inserted, the new row is shown in the response:
Add a RETURNING clause to your query. Try this:
INSERT INTO products (p_name, p_price, p_stock, p_review, "createdAt", "updatedAt")
VALUES ('${req.body.product_name}', ${req.body.product_price}, ${req.body.product_stock}, ${req.body.product_review}, now(), now())
RETURNING *;
Please note that your approach is highly prone to SQL injection. Consider using bind parameters (prepared statements) instead of string interpolation.
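For example, the same insert written with Postgres-style bind parameters (a sketch; only the parameter handling changes, and the exact shape of the value returned by Sequelize may vary between versions):

const sql = `INSERT INTO products (p_name, p_price, p_stock, p_review, "createdAt", "updatedAt")
  VALUES ($1, $2, $3, $4, now(), now())
  RETURNING *;`
const postData = await Product.sequelize.query(sql, {
  // values are passed separately, never concatenated into the SQL string
  bind: [
    req.body.product_name,
    req.body.product_price,
    req.body.product_stock,
    req.body.product_review,
  ],
})
res.send({
  message: "success add new product",
  data: postData
})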

Delay when removing row in postgres with knex

I have a local Postgres database running on my machine. I use Node.js to access it. I have a table called 'friends' where every row is a user and a friend. I also have a table called 'users' where every row has the basic info about a user (e.g. name).
When I want to remove a friendship between two users I have to remove two rows from the 'friends' table. I do this with this function:
const removeFriend = async (clientId, friendId) => {
  // initiate transaction
  return db
    .transaction((trx) => {
      // remove friendship from client
      trx('friends')
        .where({ user_id: clientId, friend_id: friendId })
        .del()
        .then(() => {
          // remove friendship from friend
          return trx('friends').where({ user_id: friendId, friend_id: clientId }).del();
        })
        // if all good then commit
        .then(trx.commit)
        // if bad then rollback
        .catch(trx.rollback);
    })
    .catch(() => 'error');
};
I call the removeFriend function like this: removeFriend(clientId, friendId)
Then, when I want to get a list of all friends with their names from the database, I use this function:
const getUserFriends = async (clientId) => {
  // get friends
  return db('friends')
    .where({ user_id: clientId })
    .join('users', 'users.id', 'friends.friend_id')
    .select('friends.friend_id', 'users.name')
    .then((friends) => friends)
    .catch(() => 'error');
};
I call the getUserFriends function like this: await getUserFriends(clientId)
The problem is that when I use the removeFriend function and then directly use the getUserFriends function, I get a list where the users are still friends. However, if I look in the database, the rows have been deleted, so naturally I should get a list where the users are not friends. Am I using await wrong or something?
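A likely cause, given the calls shown above, is that removeFriend is fired without await, so getUserFriends can run before the delete transaction has committed. A minimal sketch of sequencing the two calls (assuming they run in the same async function):

// wait for the deletes to commit before reading the friend list
await removeFriend(clientId, friendId);
const friends = await getUserFriends(clientId);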

postgres update query is not working in node.js

I have an update query which seems to be not working. The underlying database used is postgres. Could you please check why it is not working? I have included my api and schema. Thanks in advance
exports.patch_meetup = async (req, res) => {
  const findOneQuery = 'SELECT * FROM meetups WHERE id=$1';
  const updateOneQuery = `UPDATE meetups
    SET topic=$1, location=$2, body=$3, happeningOn=$4, Tags=$5, meetupImage=$6, createdOn=$7
    WHERE id=$8 returning *`;
  try {
    const { rows } = await db.query(findOneQuery, [req.params.id]);
    if (!rows[0]) {
      return res.status(404).json({
        'message': 'meetup not found'
      });
    }
    const values = [
      req.body.topic,
      req.body.location,
      req.body.body,
      req.body.happeningOn,
      req.body.Tags,
      req.file.path,
      moment(new Date()),
      req.params.id
    ];
    const response = await db.query(updateOneQuery, values);
    return res.status(200).json(response.rows[0]);
  } catch (err) {
    return res.status(400).send(err);
  }
};
Here is my model
const meetupTable = `CREATE TABLE IF NOT EXISTS
  meetups(
    id UUID PRIMARY KEY,
    topic VARCHAR(128) NOT NULL,
    location VARCHAR(128) NOT NULL,
    body TEXT NOT NULL,
    happeningOn TIMESTAMPTZ NOT NULL,
    Tags TEXT[] NOT NULL,
    meetupImage bytea,
    createdOn TIMESTAMPTZ DEFAULT Now()
  )`;
I am not sure what the issue might be, but one possibility is that you forgot to add the multer middleware to the API endpoint. The other, as mentioned in the comments, is that you shouldn't pass a moment object; pass a plain Date object instead.
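For illustration, wiring multer into the route and passing a plain Date could look like the sketch below; the router variable, path, upload destination, and controller module name are assumptions, not taken from the question:

const multer = require('multer');
const upload = multer({ dest: 'uploads/' }); // storage destination is an assumption

// multer populates req.file for the uploaded meetupImage field
router.patch('/meetups/:id', upload.single('meetupImage'), meetups.patch_meetup);

// and inside patch_meetup, use a plain Date in the values array:
// new Date(),   // instead of moment(new Date())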

problem when recreating tables and inserting data in BigQuery using the Node API

I have unexpected behaviour when loading data into BigQuery just after creating the schema.
I'm using the Node API to insert data with the BigQuery streaming API.
In order to reset the data, I delete and create the tables before loading any data.
My problem: the first time it works fine, but if I execute it again it fails.
The process always deletes and recreates the table schema, but does not insert the data until I wait a moment and execute it again.
This is the code which reproduces the case:
async function loadDataIntoBigquery() {
  const {BigQuery} = require('@google-cloud/bigquery')
  const tableName = "users"
  const dataset = "data_analisis"
  const schemaUsers = "name:string,date:string,type:string"
  const userData = [{name: "John", date: "20/08/93", type: "reader"}, {
    name: "Marie",
    date: "20/08/90",
    type: "owner"
  }]
  try {
    const bigquery = new BigQuery()
    await bigquery.createDataset(dataset)
      .then(() => console.log("dataset created successfully"))
      .catch(err => {
        console.log("warn: maybe the dataset already exists")
      })
    await bigquery.dataset(dataset).table(tableName).delete()
      .then(() => console.log("table deleted successfully"))
      .catch(err => {
        console.log("Error: maybe the table does not exist")
      })
    await bigquery.dataset(dataset).createTable(tableName, {schema: schemaUsers})
      .then(() => console.log("table created successfully"))
      .catch(err => console.log("Error: maybe the table already exists"))
    await bigquery.dataset(dataset).table(tableName).insert(userData)
      .then((data) => console.log("Ok inserted ", data))
      .catch(err => console.log("Error: can't insert "))
  } catch (err) {
    console.log("err", err)
  }
}
to verify that the data was inserted I'm using this query
select * from `data_analisis.users`
I have the same issue. As a workaround, I insert the data with a query instead:
const query = "INSERT INTO `"+dataset+"."+tableName"` (name, date, type ) VALUES ("+name+",'"+date+"','"+type+"')";
await bigQuery.query({
query: query,
useLegacySql: false,
location: 'EU'
}, (err) => {
console.log("Insertion error : ",err);
})
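The same workaround can also use named query parameters instead of string concatenation, which avoids the quoting problems; a sketch reusing the dataset, tableName and value variables from above:

const query = 'INSERT INTO `' + dataset + '.' + tableName + '` (name, date, type) VALUES (@name, @date, @type)';
await bigQuery.query({
  query: query,
  location: 'EU',
  // values are bound as named parameters rather than spliced into the SQL
  params: { name: name, date: date, type: type }
});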
