how to dynamically insert multiple rows into google cloud spanner - node.js

In the code snippet below, I am inserting multiple rows with static values. How do I insert multiple rows dynamically into the Spanner database in one transaction?
function writeUsingDml(instanceId, databaseId, projectId) {
  const {Spanner} = require('@google-cloud/spanner');
  const spanner = new Spanner({
    projectId: projectId,
  });
  const instance = spanner.instance(instanceId);
  const database = instance.database(databaseId);
  database.runTransaction(async (err, transaction) => {
    if (err) {
      console.error(err);
      return;
    }
    try {
      const rowCount = await transaction.runUpdate({
        sql: `INSERT Singers (SingerId, FirstName, LastName) VALUES
          (12, 'Melissa', 'Garcia'),
          (13, 'Russell', 'Morales'),
          (14, 'Jacqueline', 'Long'),
          (15, 'Dylan', 'Shaw')`,
      });
      console.log(`${rowCount} records inserted.`);
      await transaction.commit();
    } catch (err) {
      console.error('ERROR:', err);
    } finally {
      database.close();
    }
  });
}
For now, I am inserting a single row dynamically as follows, and I want to extend this to multiple rows:
var sqlString = "Insert " + tName;
var cNames = "( ";
var cValues = "( ";
for (var col in cols) {
  cNames = cNames + col + ", ";
  const cValue = cols[col];
  if (typeof cValue == 'string' || cValue instanceof String) {
    cValues = cValues + "'" + cValue + "', ";
  } else {
    cValues = cValues + cValue + ", ";
  }
}
return sqlString + cNames + ") values " + cValues + ")";

If you want to insert many rows, it may be easier to use BatchTransaction.insert:
https://cloud.google.com/nodejs/docs/reference/spanner/2.1.x/BatchTransaction#insert
database.runTransaction(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  try {
    var itemsToInsert = [
      { SingerId: 12, FirstName: 'Melissa', LastName: 'Garcia' },
      { SingerId: 13, FirstName: 'Russell', LastName: 'Morales' },
      { SingerId: 14, FirstName: 'Jacqueline', LastName: 'Long' },
      { SingerId: 15, FirstName: 'Dylan', LastName: 'Shaw' },
    ];
    transaction.insert('Singers', itemsToInsert);
    await transaction.commit();
    console.log(`${itemsToInsert.length} records inserted.`);
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    database.close();
  }
});
If you want to use DML though, you can build the SQL statement programmatically. The following will build the insert statement. I didn't run your version, but I think it may have syntax errors related to trailing commas.
function buildSqlInsert(tableName, columnNames, rows) {
  var sqlString = "INSERT " + tableName + " (" + columnNames.join(', ') + ") VALUES \n";
  // add each row, being careful to match the column order with columnNames
  rows.forEach(row => {
    sqlString += "(";
    columnNames.forEach(columnName => {
      var columnValue = row[columnName];
      // strings should be quoted
      if (typeof columnValue == 'string' || columnValue instanceof String) {
        columnValue = "'" + columnValue + "'";
      }
      sqlString += columnValue + ", ";
    });
    // trim added chars.
    sqlString = sqlString.substring(0, sqlString.length - 2);
    sqlString += "),\n";
  });
  // trim added comma/newline
  sqlString = sqlString.substring(0, sqlString.length - 2);
  return sqlString;
}

const insertStatement = buildSqlInsert('Singers', Object.keys(itemsToInsert[0]), itemsToInsert);
const rowCount = await transaction.runUpdate({
  sql: insertStatement,
});

You can also bind the rows you want to insert to an array-of-struct parameter and use that in an insert statement, as follows:
INSERT INTO Singers(SingerId, FirstName, LastName)
SELECT * FROM UNNEST(@struct_array_param)
Here @struct_array_param is a bound parameter containing an array of STRUCTs (tuples). In addition to being more succinct and safer SQL syntax, using bound parameters offers the additional advantage of reusing the same cached query plan across multiple executions of the same insert statement (with different bound values).
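A sketch of how this might look with the Node.js client, inside the same runTransaction callback as above. The helper name buildStructInsertRequest and the types declaration are illustrative; the fields must mirror the Singers table schema:

```javascript
// Sketch: build the runUpdate request that binds rows to an
// array-of-struct parameter. Helper name and type declaration are
// illustrative; fields must match the Singers table schema.
function buildStructInsertRequest(rows) {
  return {
    sql: `INSERT INTO Singers (SingerId, FirstName, LastName)
SELECT * FROM UNNEST(@struct_array_param)`,
    params: { struct_array_param: rows },
    // Declaring the parameter type lets the client encode each tuple.
    types: {
      struct_array_param: {
        type: 'array',
        child: {
          type: 'struct',
          fields: [
            { name: 'SingerId', type: 'int64' },
            { name: 'FirstName', type: 'string' },
            { name: 'LastName', type: 'string' },
          ],
        },
      },
    },
  };
}

// Usage inside the transaction (assumed):
// const rowCount = await transaction.runUpdate(buildStructInsertRequest(itemsToInsert));
```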

Related

wait for promise before exit process

I'm trying to run some Node.js code that reads data fields from an array and uses them in a database query, to check whether the data is a duplicate before inserting it into the corresponding table.
My Node.js code will be called from a PHP script, so I need to know when it ends; this is why I need to add process.exit(0) somewhere. The problem I have is that if I add it, the script is terminated and my promise never gets the time to send back the result.
Here is my code:
var bar = new Promise((resolve, reject) => {
  result.forEach((row, index, array) => {
    var escaped = _.map(row, mysql.escape);
    var checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " + escaped[1] + " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " + escaped[4] + " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
    conn.query(checkQuery, function (err, res) {
      if (err) {
        console.log("Error checking row for duplicate");
        console.log(checkQuery);
        process.exit(1);
      } else {
        if (res[0].found == 0) {
          var query = " (";
          var escaped = _.map(row, mysql.escape);
          var csv = escaped.join(',');
          query += csv;
          query += ")";
          query += row !== _.last(result) ? ',' : ';';
          console.log(query); // This will change to inserting the data to the table
        } else {
          console.log("Duplicate found!");
        }
      }
    });
    if (index === array.length - 1) resolve();
  });
});
bar.then(() => {
  console.log('All done!');
  process.exit(0);
});
If I remove process.exit(0);, I see "All done!" first, then the console.log(query) output.
If I add it, the script is terminated and I see "All done!" only.
Is there a better approach to do this task please?
Thanks.
Here is a way to wait for a promise before the application exits.
class Waiter {
  private timeout: any
  constructor() {
    this.waitLoop()
  }
  private waitLoop(): void {
    this.timeout = setTimeout(() => { this.waitLoop() }, 100 * 1000)
  }
  okToQuit(): void {
    clearTimeout(this.timeout)
  }
}

// Your app.
const appPromise: Promise<any> = ...
const w = new Waiter()
appPromise.finally(() => {
  console.log("Quitting")
  w.okToQuit()
})
Running multiple asynchronous operations in a loop and tracking when everything is done is just way, way, way easier if you use promises for all the individual asynchronous operation rather than trying to track asynchronous operations that use plain callbacks.
You don't say exactly what your database is, but if it's mysql, then there is a mysql2/promise driver that natively supports promises, and my recommendation would be to switch to that. Then you can directly use the promise returned from .query(). But, without info about your specific database driver, I've shown how to manually promisify .query().
Then, the looping code can use a for loop and await to sequence the database calls so it's easy to know when they are all complete.
const { promisify } = require('util');

async function someFunc() {
  // other code here

  // promisify conn.query (or use the promise interface directly from the database)
  conn.queryP = promisify(conn.query);
  let checkQuery;
  try {
    for (const row of result) {
      const escaped = _.map(row, mysql.escape);
      checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " +
        escaped[1] + " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " +
        escaped[4] + " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
      let res = await conn.queryP(checkQuery);
      if (res[0].found == 0) {
        const csv = _.map(row, mysql.escape).join(',');
        const terminator = row !== _.last(result) ? ',' : ';';
        const query = " (" + csv + ")" + terminator;
        console.log(query); // This will change to inserting the data to the table
      } else {
        console.log("Duplicate found!");
      }
    }
  } catch (e) {
    console.log("Error checking row for duplicate: ", checkQuery);
    console.log(e);
    process.exit(1);
  }
  console.log('All done!');
  process.exit(0);
}
The code appears to be trying to build a query inside the loop where each iteration adds on to the previous one (that's what _.last(result) ? ',' : ';' looks like, anyway). If that's the case, then the query variable has to be moved outside the loop so it can build from one iteration to the next. But you don't show what you're really trying to do with that query, so you're on your own for that part.
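If the goal is one multi-row INSERT, a minimal sketch (assuming each row's values are already escaped) is to collect one tuple per row and join them at the end, which avoids the per-iteration ','/';' bookkeeping:

```javascript
// Sketch: build a single multi-row INSERT statement.
// Assumes each row is an array of already-escaped values.
function buildMultiRowInsert(tableName, escapedRows) {
  const tuples = escapedRows.map(row => "(" + row.join(", ") + ")");
  return "INSERT INTO " + tableName + " VALUES " + tuples.join(", ") + ";";
}
```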
You decide beforehand how many promises will go out, count them as they resolve, and then exit.
In this example the same principle is applied, but with callback functions instead of promises. For promises you would call a count function from .then() or .finally(), and the count function decides whether it is time to exit.
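A minimal sketch of that counting pattern with promises. The names runAndExit and the injected exit callback are illustrative; in a real server you would pass () => process.exit(0):

```javascript
// Sketch: count completions and call exit() once every promise has
// settled (resolved or rejected). The exit callback is injected so
// the function stays testable; pass () => process.exit(0) in a server.
function runAndExit(promises, exit) {
  let remaining = promises.length;
  promises.forEach(p =>
    p.catch(() => {})        // a rejection still counts as "done"
     .finally(() => {
       remaining -= 1;
       if (remaining === 0) exit();
     })
  );
}
```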
mongoose example from a javascript server:
let g_DB = null;

// init mongoose
const mongoose = require("mongoose");
const connectionParams = {
  useNewUrlParser: true,
  useUnifiedTopology: true,
};
const connStr1 = "mongodb+srv://XX:XX@clusterXX.XX.mongodb.net/XX?retryWrites=true&w=majority";
mongoose.set("strictQuery", false);
mongoose.connect(connStr1, connectionParams)
  .then(handleConnection)
  .catch((err) => console.log("Error:", err));
// end script

// handleConnection - start on successful response from mongoose connection
function handleConnection(msg) {
  console.log("mongoose has connected to Mongo Atlas successfully");
  g_DB = mongoose.connection;
  g_DB.once("open", function () {
    console.log(
      "mongoose has connected to Mongo Atlas Cluster using database XX"
    );
    doTest();
  });
}
//---------------------------------------------------
function doTest() {
  console.log("test-05: create 500 books");
  //---- MODEL ----
  const _schema = new mongoose.Schema({
    name: String,
    price: Number,
    quantity: Number,
  });
  // g_DB is a mongoose connection set earlier in the script
  const _model = g_DB.model("book_schema", _schema, "bookstore");
  let loopcount = 500;
  let waitcount = loopcount;
  for (let i = 0; i < loopcount; i++) {
    _m = new _model({
      name: `WHY MAKE 500 BOOKS ${new Date().toISOString()}`,
      price: 200,
      quantity: 2000,
    });
    _m.save((e, x) => {
      if (e) return console.error(e);
      console.log(x, `waitcount: ${--waitcount}`);
      if (!waitcount) doExit();
    });
  }
}
//--
function doExit() {
  console.log("exit from server");
  process.exit();
}
Use Reject/Resolve to manage promises in Node.js
When your task fulfils your request, send the result with resolve(); if it fails, use reject().
In your case you are not managing the promise properly, which is why it runs asynchronously. It's better to use the following approach, with proper returns:
var bar = new Promise((resolve, reject) => {
  return result.forEach((row, index, array) => {
    var escaped = _.map(row, mysql.escape);
    var checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " + escaped[1] + " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " + escaped[4] + " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
    return conn.query(checkQuery, function (err, res) {
      if (err) {
        console.log("Error checking row for duplicate");
        console.log(checkQuery);
        return reject(err);
      } else {
        if (res[0].found == 0) {
          var query = " (";
          var escaped = _.map(row, mysql.escape);
          var csv = escaped.join(',');
          query += csv;
          query += ")";
          query += row !== _.last(result) ? ',' : ';';
          console.log(query); // This will change to inserting the data to the table
          return resolve(query);
        } else {
          console.log("Duplicate found!");
          return reject('Duplicate Found');
        }
      }
    });
  });
});
bar.then((data) => {
  console.log('All done!');
});
In the code above I am returning query plus resolve/reject, which makes it run in a more synchronized way:
return conn.query(checkQuery, function (err, res) {
Also, while processing this promise I am handling it with .then((data) => ...), so I can access the resolved value there.
bar.then((data) => {
console.log('All done!');
});
Note: if you reject a promise, it won't be available in the above .then block; you'll find the rejection in the catch block, so the code changes as follows:
bar.then((data) => {
  console.log('All done!');
}).catch(err => {
  console.log(err);
});
You can try the following:
(async () => {
  await new Promise((resolve, reject) => {
    result.forEach((row, index, array) => {
      var escaped = _.map(row, mysql.escape);
      var checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " + escaped[1] + " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " + escaped[4] + " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
      conn.query(checkQuery, function (err, res) {
        if (err) {
          console.log("Error checking row for duplicate");
          console.log(checkQuery);
          process.exit(1);
        } else {
          if (res[0].found == 0) {
            var query = " (";
            var escaped = _.map(row, mysql.escape);
            var csv = escaped.join(',');
            query += csv;
            query += ")";
            query += row !== _.last(result) ? ',' : ';';
            console.log(query); // This will change to inserting the data to the table
          } else {
            console.log("Duplicate found!");
          }
        }
      });
      if (index === array.length - 1) resolve();
    });
  });
  console.log('All done!');
})();
You don't even need to call process.exit(0), because the code will always terminate when the job is done :)

Nested Promises in node.js and pg

I am new to Node and writing a small application. I haven't used a language as asynchronous as this on the server before and have gotten myself in a bit of a pickle. I need to take a string, query a table for an id, then insert into a second table using the result, then return a string from the function two levels up. I have a custom DAO I use for the db stuff. Here is the function where it all happens:
function generateToken(data, userId, client) {
  var random = Math.floor(Math.random() * 100001);
  var sha256 = crypto.createHmac("sha256", random);
  var token = sha256.update(data).digest("base64");
  var query = dao.select(
    'auth.apps',
    {
      name: client.name,
      version: client.version,
      subversion: client.subversion,
      patch: client.patch
    }
  ).done(
    function (result) {
      dao.insert(
        'auth.tokens',
        {
          user_id: userId,
          app_id: result.rows[0].id,
          token: token
        }
      );
      return "mmmm yellllo";
    }
  );
  var ret_val = await(query);
  console.log("Token return: " + ret_val);
  return ret_val;
}
and here is the relevant part of my dao for select:
dbo.prototype.select = function (table, where, order_by) {
  var where_clause = this.construct_where(where);
  var sql = 'SELECT * FROM ' + table + ' WHERE ' + where_clause;
  if (order_by !== undefined) {
    sql = sql + ' ORDER BY ' + order_by;
  }
  var result = this.pool.query(sql);
  return result;
};
and insert:
dbo.prototype.insert = function (table, values) {
  var key_list = '', value_list = '';
  for (var k in values) {
    key_list = key_list + ', ' + k;
    value_list = value_list + ", '" + values[k] + "'";
  }
  // chop off comma space
  key_list = key_list.substring(2);
  value_list = value_list.substring(2);
  var sql = 'INSERT INTO ' + table + '(' + key_list + ') VALUES(' + value_list + ') RETURNING id';
  var result = this.pool.query(sql).catch(function (error) {
    console.log("SQL:" + sql + " error:" + error);
  });
  return result;
};
How do I unwind the double promise? I want the generateToken function to return the token variable, but only after the insert query has finished.
There is a library named deasync, and the motivation to create it was to solve situations where the "API cannot be changed to return merely a promise or demand a callback parameter".
So this is the primary, and probably the only, use case, because in general Node.js should stay async.
To do the trick you basically should write a function that accepts a callback and then wrap it with deasync as follows:
var deasync = require('deasync');

// It can still take the params before the callback
var asyncGenerateToken = function (data, userId, client, callback) {
  var token = 'abc';
  // Async operation starts here
  setTimeout(function () {
    // Async operation is finished, now we can return the token
    // Don't forget that the error is the 1st arg, data is the 2nd
    callback(null, token);
  }, 1000);
};

var generateToken = deasync(asyncGenerateToken);

// We'll retrieve a token only after a second of waiting
var token = generateToken('my data', 'my user id', 'my client');
console.log(token);
Hope this helps.

I am trying to develop an adapter for sqlite3 with node.js, similar to how sails-mysql is devised. I do not know how to start.

OracleDialect.prototype.describe = function (connection, collection, callback) {
  var tableName = this.normalizeTableName(collection.tableName);
  var queries = [];
  queries[0] = "SELECT COLUMN_NAME, DATA_TYPE, NULLABLE FROM USER_TAB_COLUMNS WHERE TABLE_NAME = '" + tableName + "'";
  queries[1] = "SELECT index_name,COLUMN_NAME FROM user_ind_columns WHERE table_name = '" + tableName + "'";
  queries[2] = "SELECT cols.table_name, cols.column_name, cols.position, cons.status, cons.owner "
    + "FROM all_constraints cons, all_cons_columns cols WHERE cols.table_name = '" + tableName
    + "' AND cons.constraint_type = 'P' AND cons.constraint_name = cols.constraint_name AND cons.owner = cols.owner "
    + "ORDER BY cols.table_name, cols.position";
  asynk.each(queries, function (query, nextQuery) {
    connection.client.raw(query).asCallback(nextQuery);
  }).serie().done(function (results) {
    var schema = results[0];
    var indexes = results[1];
    var tablePrimaryKeys = results[2];
    if (schema.length === 0) {
      return callback({code: 'ER_NO_SUCH_TABLE', message: 'Table ' + tableName + ' doesn\'t exist.'}, null);
    }
    // Loop through Schema and attach extra attributes
    schema.forEach(function (attribute) {
      tablePrimaryKeys.forEach(function (pk) {
        // Set Primary Key Attribute
        if (attribute.COLUMN_NAME === pk.COLUMN_NAME) {
          attribute.primaryKey = true;
          // If also a number set auto increment attribute
          if (attribute.DATA_TYPE === 'NUMBER') {
            attribute.autoIncrement = true;
          }
        }
      });
      // Set Unique Attribute
      if (attribute.NULLABLE === 'N') {
        attribute.required = true;
      }
    });
    // Loop Through Indexes and Add Properties
    indexes.forEach(function (index) {
      schema.forEach(function (attribute) {
        if (attribute.COLUMN_NAME === index.COLUMN_NAME) {
          attribute.indexed = true;
        }
      });
    });
    //console.log("describe schema: ", schema);
    callback(null, schema);
  }).fail(function (err) {
    return callback(err, null);
  });
};
I hope all my questions can be answered with a link to a good documentation!
Thank you!
The code above is from an Oracle adapter.

Sequelize, Raw Query by daterange

I use "sequelize": "^2.0.0-rc3" with pg (PostgreSQL). At the moment I am trying to do a raw query with a date range, but Sequelize doesn't return data.
When I run the same query directly in the PostgreSQL database, I get correct results. Please help me.
In Sequelize:
// Init main query
var query = "SELECT * FROM " + '"Calls"' +
  " WHERE " + '"EquipmentId"' + " = 1" +
  " AND " + '"initDate"' + " >= " + "'2015-02-05 14:40' " +
  " AND " + '"endDate"' + " <= " + " '2015-02-05 15:00' ";
global.db.sequelize.query(query)
  .then(function (calls) {
    console.log(calls);
  })
  .error(function (err) {
    console.log(err);
  });
In the node server console I get:
Executing (default): SELECT * FROM "Calls" WHERE "EquipmentId" = 1 AND "initDate" >= '2015-02-05 14:40' AND "endDate" <= '2015-02-05 15:00'
But calls is an empty array...
Try this:
var query = ' \
  SELECT * FROM "Calls" \
  WHERE "EquipmentId" = :EquipmentId \
  AND "initDate" >= :initDate \
  AND "endDate" <= :endDate; \
';
global.db.sequelize.query(query, null, {raw: true}, {
  EquipmentId: 1,
  initDate: new Date('2015-02-05 14:40'),
  endDate: new Date('2015-02-05 15:00')
})
  .then(function (calls) {
    console.log(calls);
  })
  .error(function (err) {
    console.log(err);
  });
You can also use a Sequelize function like findAll:
Model.findAll({
  where: {
    // 'date' is a placeholder; use the column you are filtering on
    date: {
      $gte: sequelize.fn('date_format', initDate, '%Y-%m-%dT%H:%i:%s'),
      $lte: sequelize.fn('date_format', endDate, '%Y-%m-%dT%H:%i:%s')
    }
  }
});
To get results between two dates you just need to do this:
const { Op } = require('sequelize');

let whereClause = {
  where: {
    [Op.and]: [
      { time: { [Op.gte]: req.query.start_time } },
      { time: { [Op.lte]: req.query.end_time } }
    ]
  }
};
await model_name.findAll(whereClause);

executing sequential commands in redis

This code doesn't work and I can't find out why.
It always pushes obj as a correctly serialized JSON string, but it always returns the wrong key. In obj, id increases regularly, but key doesn't.
var c = redis.createClient(),
  obj = {id: 0, name: "dudu"},
  key = "person:";
c.select(0);
c.multi()
  .incr("idx:person", function (err, _idx) {
    console.log("incr -> #idx: " + _idx);
    key += obj.id = _idx;
    console.log("After Inc obj: " + JSON.stringify(obj));
  })
  .set(key, JSON.stringify(obj), function (err, _setResp) {
    console.log("set -> #_setResp: " + _setResp);
    console.log(JSON.stringify(obj));
  })
  .get(key, function (er, _obj) {
    console.log("get -> " + key);
    if (er) {
      res.json(er);
    } else {
      console.log("Found: " + JSON.stringify(_obj));
      res.json(_obj);
    }
  })
  .exec(function (err, replies) {
    console.log("MULTI got " + replies.length + " replies");
    replies.forEach(function (reply, index) {
      console.log("Reply " + index + ": " + reply.toString());
    });
  });
c.quit();
This worked:
c.INCR("idx:person", function (a, b) {
  obj.id = b;
  console.dir(obj);
  key = "pa:" + b;
  c.set(key, JSON.stringify(obj), function (err, _setResp) {
    console.log("set -> #_setResp: " + _setResp);
    console.log(JSON.stringify(obj));
    c.get(key, function (er, _obj) {
      console.log("get -> " + key);
      if (er) {
        res.json(er);
      } else {
        console.log("Found: " + JSON.stringify(_obj));
        res.json(_obj);
      }
    });
  });
});
The way to do this is simple :)
In event-driven Node, each part executes inside the callback of the previous one:
c.INCR("idx:person", function (a, b) {
  obj.id = b;
  key = "pa:" + b;
  c.set(key, JSON.stringify(obj), function (err, _setResp) {
    c.get(key, function (er, _obj) {
      if (er) {
        res.json(er);
      } else {
        res.json(_obj);
      }
    });
  });
});
In transaction mode, commands are grouped and passed to Redis together; the EXEC command executes the queued commands. At the moment you pass the key value to the SET command, the incremented key value is not yet available.
For this kind of use, if you still want to merge the commands into one, script it in Lua:
local keyid = redis.call('INCR', 'idx:person')
local result = redis.call('SET', 'person:'..keyid,ARGV[1])
return 'person:'..keyid
To use it in a Redis EVAL command:
eval "local keyid = redis.call('INCR', 'idx:person'); local result = redis.call('SET', 'person:'..keyid,ARGV[1]);return 'person:'..keyid" 0 "yourJSONObject"
this should work:
client.eval(["local keyid = redis.call('INCR', 'idx:person'); local result = redis.call('SET', 'person:'..keyid, ARGV[1]); return 'person:'..keyid", 0, JSON.stringify(obj)], function (err, res) {
  console.log(res); // gives the person key, e.g. "person:42"
});
You can also use a hash instead of a simple key in your example, with separate fields for id, name, and the JSON object. Returning the hash from the Lua script will be like returning it from an HSET.
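A sketch of what the hash variant could look like. The field layout is an assumption; personToHashFields is an illustrative helper that derives the hash fields from the object, which you would then pass to node_redis:

```javascript
// Sketch: derive hash fields (id, name, and the serialized JSON)
// for storing a person as a Redis hash. Field names are illustrative.
function personToHashFields(person) {
  return {
    id: String(person.id),
    name: person.name,
    json: JSON.stringify(person),
  };
}

// Usage with node_redis (assuming client `c` and incremented id `_idx`):
// c.hmset("person:" + _idx, personToHashFields(obj), function (err, resp) {
//   c.hgetall("person:" + _idx, function (er, hash) { console.log(hash); });
// });
```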
