I'm using the Sequelize ORM in a Node.js/MariaDB project.
What I'm trying to do is generate a new table of product data using raw queries.
The sequence of my logic is written below.
Step 1. Destroy the table to reset the data.
Step 2. Insert the product data.
Step 3. Update the price data in the product data.
Step 4. Update the stock data in the product data.
The problem is steps 3 and 4: they are not working!
What I found is that the INSERT takes some time to finish, so the UPDATEs cannot be fulfilled because there is no product data yet.
Is there a way to invoke steps 3 and 4 as soon as step 2 has finished?
Thanks in advance.
const { QueryTypes } = require('sequelize')

const generateProductList = () => {
  return new Promise(async (resolve, reject) => {
    try {
      await ProductPresentation.destroy({ truncate: true })
      const productSql = `INSERT INTO m_product_presentation (productId, sku, name, status, urlKey, category, shortDescription, imageSmall, imageThumbnail)
        SELECT id, sku, name, status, urlKey, category, shortDescription, imageSmall, imageThumbnail FROM m_product;`
      const priceSql = `UPDATE m_product_presentation INNER JOIN m_price
        ON m_product_presentation.productId = m_price.productId
        SET m_product_presentation.priceRrp = m_price.priceRrp, m_product_presentation.priceRegular = m_price.priceRegular, m_product_presentation.priceSpecial = m_price.priceSpecial;`
      const stockSql = `UPDATE m_product_presentation INNER JOIN m_inventory
        ON m_product_presentation.productId = m_inventory.productId
        SET m_product_presentation.stockAvailability = m_inventory.stockAvailability, m_product_presentation.stockQty = m_inventory.stockQty;`
      // What I want is to create the initial data first, and then update the price and stock info. But it fails...
      await ProductPresentation.sequelize.query(productSql, { type: QueryTypes.INSERT })
      await ProductPresentation.sequelize.query(priceSql, { type: QueryTypes.UPDATE })
      await ProductPresentation.sequelize.query(stockSql, { type: QueryTypes.UPDATE })
      resolve()
    } catch (err) {
      logger.error(err)
      reject(err)
    }
  })
}
Before updating values that depend on the same product, you can check whether the product has already been inserted.
const generateProductList = () => {
  return new Promise(async (resolve, reject) => {
    try {
      await ProductPresentation.destroy({ truncate: true })
      const productSql = `INSERT INTO m_product_presentation (productId, sku, name, status, urlKey, category, shortDescription, imageSmall, imageThumbnail)
        SELECT id, sku, name, status, urlKey, category, shortDescription, imageSmall, imageThumbnail FROM m_product;`
      const priceSql = `UPDATE m_product_presentation INNER JOIN m_price
        ON m_product_presentation.productId = m_price.productId
        SET m_product_presentation.priceRrp = m_price.priceRrp, m_product_presentation.priceRegular = m_price.priceRegular, m_product_presentation.priceSpecial = m_price.priceSpecial;`
      const stockSql = `UPDATE m_product_presentation INNER JOIN m_inventory
        ON m_product_presentation.productId = m_inventory.productId
        SET m_product_presentation.stockAvailability = m_inventory.stockAvailability, m_product_presentation.stockQty = m_inventory.stockQty;`
      // Insert the products into the table
      const product = await ProductPresentation.sequelize.query(productSql, { type: QueryTypes.INSERT })
      // Check whether the product exists in the table or not
      // (note: the exact shape of `product` depends on the dialect; adjust as needed)
      const isProductEntered = await ProductPresentation.sequelize.query(
        'SELECT EXISTS(SELECT 1 FROM m_product_presentation WHERE productId = ?) AS isEntered',
        {
          replacements: [product[0][0].productId],
          type: QueryTypes.SELECT
        }
      );
      if (isProductEntered[0].isEntered) {
        await ProductPresentation.sequelize.query(priceSql, { type: QueryTypes.UPDATE })
        await ProductPresentation.sequelize.query(stockSql, { type: QueryTypes.UPDATE })
      }
      resolve()
    } catch (err) {
      logger.error(err)
      reject(err)
    }
  })
}
The update queries will be executed only if the product has been inserted.
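Another option, if you want to be certain that all three statements run in order on the same connection, is a managed Sequelize transaction. This is a minimal sketch, assuming the same models and SQL strings as above, not the poster's confirmed solution:

await ProductPresentation.sequelize.transaction(async (t) => {
  // Each query joins the same transaction, so the statements run
  // sequentially on one connection and roll back together on failure.
  await ProductPresentation.sequelize.query(productSql, { type: QueryTypes.INSERT, transaction: t })
  await ProductPresentation.sequelize.query(priceSql, { type: QueryTypes.UPDATE, transaction: t })
  await ProductPresentation.sequelize.query(stockSql, { type: QueryTypes.UPDATE, transaction: t })
})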
I found that other source code (some React code in the frontend) was causing this issue. I'll close this question.
Related
I have products with the option to choose a size,
so if I add product 'X' to the cart with sizes 4 and 5, it will add two items to the cart.
My goal is that when you successfully buy the items in your cart, the sizes you just bought are removed from the main product page.
It works correctly only if I try to buy two different items.
If I try to buy the same item twice with different sizes, only the first size is filtered out and I get this error:
(node:21336) UnhandledPromiseRejectionWarning: VersionError: No matching document found for id "62bee4ce92e7c57a195686ae" version 0 modifiedPaths "size"
This is the successful-purchase code:
createOrder: async (_, {}, context) => {
  const userAuth = await auth(context);
  const cart = await Cart.findOne({ userId: userAuth._id });
  cart.cartProducts.forEach(async (p) => { // Here's the product-filtering functionality
    const products = await Product.findById(p.productId);
    if (products) {
      products.size = products.size.filter((s) => s !== +p.size);
    }
    await products.save();
  });
  if (!cart) {
    throw new UserInputError('No available order!');
  }
  const newOrder = new Order({
    orderProducts: cart.cartProducts,
    purchasedBy: userAuth._id,
    datePurchased: new Date().toISOString(),
  });
  await newOrder.save();
  return newOrder;
},
As I said, it only works if you add two different items to your cart.
Edited:
Now I am getting "product.save is not a function":
const userAuth = await auth(context);
const cart = await Cart.findOne({ userId: userAuth._id });
const products = await Product.find({
  _id: cart.cartProducts.map((c) => c.productId),
});
cart.cartProducts.forEach(async (p) => {
  if (products) {
    products.map((product) => {
      return (product.size = product.size.filter(
        (size) => size !== +p.size
      ));
    });
  }
});
await products.save();
The issue is that you are using findById, which returns the first matching document. That is why, when you add the same product with different sizes to the cart, it always picks the first matching product id. You can try using the find operator instead, as it returns the list of documents matching the condition.
Product.find({_id:p.productId});
It will return all of the matching products instead of only the first one.
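Building on that, one way to avoid both the VersionError and the "save is not a function" error is to load each distinct product once, strip every purchased size, and save each document exactly once. This is a minimal sketch, assuming the Mongoose models from the question and that productId is an ObjectId or string; removePurchasedSizes is a hypothetical helper name:

const removePurchasedSizes = async (cart) => {
  // Collect the distinct product ids in the cart
  const ids = [...new Set(cart.cartProducts.map((p) => p.productId.toString()))];
  const products = await Product.find({ _id: { $in: ids } });
  for (const product of products) {
    // All sizes of this product that were bought in this order
    const sizesBought = cart.cartProducts
      .filter((p) => p.productId.toString() === product._id.toString())
      .map((p) => +p.size);
    product.size = product.size.filter((s) => !sizesBought.includes(s));
    await product.save(); // one save per document, so versions never conflict
  }
};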
I'm getting details from the cart table; it returns an array of objects, and each object contains a product id and a quantity. I want to include the price and name for my local display, so I want to update each object for my own reference, but the name and price live in another table. Here is my code:
exports.cartList = async (req, res) => {
  const CartId = req.profile.CartId;
  await Cart.findById(CartId).then(data => {
    const products = data.products;
    const exprd = products;
    const length = products.length;
    exprd.map((item, index) => {
      const prodID = item.productId;
      const quantity = item.quantity;
      Product.findById(prodID).then(prdDet => {
        const price = prdDet.price;
        const inven = prdDet.inventory;
        const name = prdDet.name;
        console.log(name);
        console.log('CHECKERSP');
        if (quantity < inven) {
          ({ ...exprd, stock: 'stock', })
        } else {
          ({ ...exprd, stock: 'Out of Stock', })
        }
        ({ ...exprd, name: name, price: price })
      });
      console.log('ex2', exprd);
    })
    console.log('ex1', exprd);
    res.json(exprd)
  });
}
But the object is not updated, and I don't know what I did wrong because I'm very new to these things. Help me overcome this problem, thanks in advance.
First, I think you have missed an await before Product.findById(prodID).
Further, in the if and else blocks you have to assign the spread result back to a variable after the declaration.
Example:
let obj = { name: 'XYZ' };
Either it should be:
let obj1 = { ...obj, name: 'ABC' };
Or:
obj = { ...obj, name: 'ABC' };
Furthermore, if you are using await inside a loop, traditional loops are better; map, filter and the like can be tricky, and await might not behave as desired inside them.
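Putting those suggestions together, the handler might look something like this. This is a minimal sketch, assuming the same Cart and Product models and the req.profile.CartId lookup from the question; the toObject fallback is a guess about the shape of the cart entries:

exports.cartList = async (req, res) => {
  try {
    const cart = await Cart.findById(req.profile.CartId);
    const enriched = [];
    // A traditional loop, so each await finishes before the next iteration
    for (const item of cart.products) {
      const prdDet = await Product.findById(item.productId);
      // Assign the spread result, so the new fields are actually kept
      enriched.push({
        ...(item.toObject ? item.toObject() : item),
        name: prdDet.name,
        price: prdDet.price,
        stock: item.quantity < prdDet.inventory ? 'stock' : 'Out of Stock',
      });
    }
    res.json(enriched);
  } catch (err) {
    res.status(400).json({ error: err.message });
  }
};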
Kindly let me know if anything is missing or wrong; I'm just going by the experience I've picked up.
Hope this helps.
I have to read a really large CSV file, so I searched Google and learned about createReadStream. I am using a program that reads the CSV file data and inserts it into MongoDB.
The process I am following:
1. Process the data using createReadStream (I think it reads the file line by line).
2. Store the data in an array.
3. Insert the data into MongoDB using insertMany.
Now the problem is that the whole file is first stored in an array, and only then inserted into the database.
I think the better approach would be to store only the first 500 lines/rows in an array, insert them into the DB, and then repeat the same steps for the next 500 records.
Is it possible to achieve this?
And is it the right way to do it?
My program:
const test = async () => {
  // Assumes these accumulators are declared elsewhere in the worker:
  // authorName, companyName, authorData, companyData, userData, bookData, relationalData
  const stream = fs.createReadStream(workerData)
    .pipe(parse())
    .on('data', async function(csvrow) {
      try {
        stream.pause()
        if (!authorName.includes(csvrow.author)) {
          const author = new Author({ author: csvrow.author })
          authorId = author._id
          authorName.push(author.author)
          authorData.push(author)
        }
        if (!companyName.includes(csvrow.company_name)) {
          const company = new Company({ companyName: csvrow.company_name })
          companyID = company._id
          companyName.push(company.companyName)
          companyData.push(company)
        }
        users = new User({
          name: csvrow.firstname,
          dob: csvrow.dob,
          address: csvrow.address,
          phone: csvrow.phone,
          state: csvrow.state,
          zip: csvrow.zip,
          email: csvrow.email,
          gender: csvrow.gender,
          userType: csvrow.userType
        })
        userData.push(users)
        book = new Book({
          book_number: csvrow.book_number,
          book_name: csvrow.book_name,
          book_desc: csvrow.book_desc,
          user_id: users._id,
          author_id: authorId
        })
        bookData.push(book)
        relationalData.push({
          username: users.name,
          author_id: authorId,
          book_id: book._id,
          company_id: companyID
        })
      } finally {
        stream.resume()
      }
    })
    .on('end', async function() {
      try {
        await Author.insertMany(authorData)
        await User.insertMany(userData)
        await Book.insertMany(bookData)
        await Company.insertMany(companyData)
        await Relational.insertMany(relationalData)
        parentPort.postMessage("true")
      } catch (e) {
        console.log(e)
        parentPort.postMessage("false")
      }
    })
}
test()
This program works fine and inserts the data into the DB, but I am looking for something like this:
const stream = fs.createReadStream(workerData)
  .pipe(parse())
  .on('data', async function(csvrow, maxLineToRead: 500) {
    // whole code/logic of insert data into DB
  })
So maxLineToRead is my imaginary term.
Basically, my point is that I want to process 500 records at a time, insert them into the DB, and repeat this process until the end.
You can create a higher scoped array variable where you accumulate rows of data as they arrive on the data event. When you get to 500 rows, fire off your database operation to insert them. If not yet at 500 rows, then just add the next one to the array and wait for more data events to come.
Then, in the end event insert any remaining rows still in the higher scoped array.
In this way, you will insert 500 at a time, and then however many are left at the end. This has an advantage over inserting them all at the end: you spread the database load out over the time you spend parsing.
Here's an attempt to implement that type of processing. There are some unknowns (documented with comments) based on an incomplete description of exactly what you're trying to accomplish in some circumstances:
const test = () => {
  return new Promise((resolve, reject) => {
    const accumulatedRows = [];

    async function processRows(rows) {
      // initialize data arrays that we will insert
      const authorData = [],
        companyData = [],
        userData = [],
        bookData = [],
        relationalData = [];

      // this code still has a problem that I don't have enough context
      // to know how to solve
      // If authorName contains csvrow.author, then the variable
      // authorId is not initialized, but is used later in the code
      // This is a problem that needs to be fixed.
      // The same issue occurs for companyID
      for (let csvrow of rows) {
        let authorId, companyID;
        if (!authorName.includes(csvrow.author)) {
          const author = new Author({ author: csvrow.author })
          authorId = author._id
          authorName.push(author.author)
          authorData.push(author)
        }
        if (!companyName.includes(csvrow.company_name)) {
          const company = new Company({ companyName: csvrow.company_name })
          companyID = company._id
          companyName.push(company.companyName)
          companyData.push(company)
        }
        let users = new User({
          name: csvrow.firstname,
          dob: csvrow.dob,
          address: csvrow.address,
          phone: csvrow.phone,
          state: csvrow.state,
          zip: csvrow.zip,
          email: csvrow.email,
          gender: csvrow.gender,
          userType: csvrow.userType
        });
        userData.push(users)
        let book = new Book({
          book_number: csvrow.book_number,
          book_name: csvrow.book_name,
          book_desc: csvrow.book_desc,
          user_id: users._id,
          author_id: authorId
        });
        bookData.push(book)
        relationalData.push({
          username: users.name,
          author_id: authorId,
          book_id: book._id,
          company_id: companyID
        });
      }
      // all local arrays of data are populated now for this batch
      // so add this data to the database
      await Author.insertMany(authorData);
      await User.insertMany(userData);
      await Book.insertMany(bookData);
      await Company.insertMany(companyData);
      await Relational.insertMany(relationalData);
    }

    const batchSize = 50;
    const stream = fs.createReadStream(workerData)
      .pipe(parse())
      .on('data', async function(csvrow) {
        try {
          accumulatedRows.push(csvrow);
          if (accumulatedRows.length >= batchSize) {
            stream.pause();
            await processRows(accumulatedRows);
            // clear out the rows we just processed
            accumulatedRows.length = 0;
            stream.resume();
          }
        } catch (e) {
          // calling destroy(e) will prevent leaking a stream
          // and will trigger the error event to be called with that error
          stream.destroy(e);
        }
      }).on('end', async function() {
        try {
          await processRows(accumulatedRows);
          resolve();
        } catch (e) {
          reject(e);
        }
      }).on('error', (e) => {
        reject(e);
      });
  });
}
test().then(() => {
  parentPort.postMessage("true");
}).catch(err => {
  console.log(err);
  parentPort.postMessage("false");
});
When I want to add two records in sequence, only one record is added; on the second it throws an error because it cannot create a field with this data:
"NOTES_ID is required", "key: NOTES_ID, value: undefined, is not a number"
How do I create entries for two related tables sequentially: first for the main table, and then for the one that has the foreign key?
module.exports.create = async function (req, res) {
  const stateMatrix = await StateMatrix.select().exec()
  const noteObj = {
    DATE: req.body.DATE,
    TITLE: req.body.TITLE,
    CONTENT: req.body.CONTENT
  };
  const noteStateObj = {
    STATE_DATE: new Date().toLocaleDateString("en-US"),
    STATES_ID: stateMatrix[0]._props.STATES_ID_CURR,
    NOTES_ID: req.body.NOTE_ID,
    USERS_ID: req.decoded.user_id
  };
  try {
    await Notes.create(noteObj);
    await NoteStates.create(noteStateObj);
    res.status(201).json({ noteObj, noteStateObj });
  } catch (e) {
    errorHandler(res, e);
  }
};
NoteStates is probably related to Notes through the NOTES_ID field, which cannot be empty (I guess it's a foreign key). That means you should set it from the newly created note before saving noteStateObj:
// Something like this
const newNote = await Notes.create(noteObj);
noteStateObj.NOTES_ID = newNote.ID;
await NoteStates.create(noteStateObj);
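Applied to the handler above, the whole create function might look like this. This is a sketch only, assuming Notes.create resolves to the created row and that its generated primary key is exposed as ID; the exact property name depends on your model definition:

module.exports.create = async function (req, res) {
  const noteObj = {
    DATE: req.body.DATE,
    TITLE: req.body.TITLE,
    CONTENT: req.body.CONTENT
  };
  try {
    const stateMatrix = await StateMatrix.select().exec();
    // Create the parent row first, so its id exists
    const newNote = await Notes.create(noteObj);
    const noteStateObj = {
      STATE_DATE: new Date().toLocaleDateString('en-US'),
      STATES_ID: stateMatrix[0]._props.STATES_ID_CURR,
      NOTES_ID: newNote.ID, // foreign key taken from the row just created
      USERS_ID: req.decoded.user_id
    };
    await NoteStates.create(noteStateObj);
    res.status(201).json({ noteObj, noteStateObj });
  } catch (e) {
    errorHandler(res, e);
  }
};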
I have unexpected behaviour when loading data into BigQuery just after creating the schema.
I'm using the Node API to insert data with the BigQuery streaming API.
In order to reset the data, I delete and recreate the tables before loading any data.
My problem: the first time it works fine, but if I execute it again it fails.
The process always deletes and creates the table schema, but does not insert the data unless I wait a moment before executing it again.
This is the code which reproduces the case:
async function loadDataIntoBigquery() {
  const { BigQuery } = require('@google-cloud/bigquery')
  const tableName = "users"
  const dataset = "data_analisis"
  const schemaUsers = "name:string,date:string,type:string"
  const userData = [{ name: "John", date: "20/08/93", type: "reader" }, {
    name: "Marie",
    date: "20/08/90",
    type: "owner"
  }]
  try {
    const bigquery = new BigQuery()
    await bigquery.createDataset(dataset).then(() => console.log("dataset created successfully")).catch(err => {
      console.log("warn: maybe the dataset already exists")
    })
    await bigquery.dataset(dataset).table(tableName).delete().then(() => console.log("table deleted successfully")).catch((err) => {
      console.log("Error: maybe the table does not exist")
    })
    await bigquery.dataset(dataset).createTable(tableName, { schema: schemaUsers }).then(() => console.log("table created successfully")).catch(err => console.log("Error: maybe the table already exists"))
    await bigquery.dataset(dataset).table(tableName).insert(userData).then((data) => console.log("Ok inserted ", data)).catch(err => console.log("Error: can't insert "))
  } catch (err) {
    console.log("err", err)
  }
}
To verify that the data was inserted, I'm using this query:
select * from `data_analisis.users`
I have the same issue; it seems the streaming API can take a while to recognize a freshly recreated table. As a workaround, I insert the data with a query instead:
const query = "INSERT INTO `" + dataset + "." + tableName + "` (name, date, type) VALUES ('" + name + "','" + date + "','" + type + "')";
await bigQuery.query({
  query: query,
  useLegacySql: false,
  location: 'EU'
}, (err) => {
  console.log("Insertion error : ", err);
})
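A slightly safer variant of the same workaround uses query parameters instead of string concatenation, which avoids the quoting and escaping issues above. This is a sketch, assuming the same bigQuery client and the same name, date, and type variables:

const query = 'INSERT INTO `' + dataset + '.' + tableName + '` (name, date, type) VALUES (@name, @date, @type)';
await bigQuery.query({
  query: query,
  params: { name: name, date: date, type: type }, // named parameters are escaped by the client
  location: 'EU'
});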