Sequelize, raw query by date range - node.js

I use "sequelize": "^2.0.0-rc3" with pg (postgresql), in this moment i am trying to do a raw query with date range but sequelize don't return data.
When I run the same query in postgresql db get correct results. Please Help Me.
In sequilize:
// Init main query
var query = "SELECT * FROM " + '"Calls"' +
    " WHERE " + '"EquipmentId"' + " = 1" +
    " AND " + '"initDate"' + " >= " + "'2015-02-05 14:40'" +
    " AND " + '"endDate"' + " <= " + "'2015-02-05 15:00'";

global.db.sequelize.query(query)
    .then(function (calls) {
        console.log(calls);
    })
    .error(function (err) {
        console.log(err);
    });
In the console of the Node server I get:
Executing (default): SELECT * FROM "Calls" WHERE "EquipmentId" = 1 AND "initDate" >= '2015-02-05 14:40' AND "endDate" <= '2015-02-05 15:00'
But calls is an empty array...

Try this:
var query = ' \
    SELECT * FROM "Calls" \
    WHERE "EquipmentId" = :EquipmentId \
    AND "initDate" >= :initDate \
    AND "endDate" <= :endDate; \
';

global.db.sequelize.query(query, null, { raw: true }, {
    EquipmentId: 1,
    initDate: new Date('2015-02-05 14:40'),
    endDate: new Date('2015-02-05 15:00')
})
.then(function (calls) {
    console.log(calls);
})
.error(function (err) {
    console.log(err);
});
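For reference, in Sequelize v3 and later the replacements moved into the options object, so the four-argument form above no longer applies. A minimal sketch of the same query on a newer version (assuming the same global.db.sequelize instance):

var query = 'SELECT * FROM "Calls"' +
    ' WHERE "EquipmentId" = :EquipmentId' +
    ' AND "initDate" >= :initDate' +
    ' AND "endDate" <= :endDate';

// replacements and the query type now travel in the options object
global.db.sequelize.query(query, {
    replacements: {
        EquipmentId: 1,
        initDate: new Date('2015-02-05 14:40'),
        endDate: new Date('2015-02-05 15:00')
    },
    type: global.db.sequelize.QueryTypes.SELECT
})
.then(function (calls) {
    console.log(calls);
})
.catch(function (err) {
    console.log(err);
});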

You can also use a Sequelize finder like findAll with range operators on the date columns:
Model.findAll({
    where: {
        initDate: {
            gte: sequelize.fn('date_format', initDate, '%Y-%m-%dT%H:%i:%s')
        },
        endDate: {
            lte: sequelize.fn('date_format', endDate, '%Y-%m-%dT%H:%i:%s')
        }
    }
});

To get results between two dates you just need to do this:
const { Op } = require('sequelize');

let whereClause = {
    where: {
        [Op.and]: [
            { time: { [Op.gte]: req.query.start_time } },
            { time: { [Op.lte]: req.query.end_time } }
        ]
    }
};
await model_name.findAll(whereClause);
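Equivalently, Op.between expresses the same range in a single condition (a sketch reusing the Op import above):

await model_name.findAll({
    where: {
        time: { [Op.between]: [req.query.start_time, req.query.end_time] }
    }
});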


wait for a promise before exiting the process

I'm trying to run some NodeJS code that reads data fields from an array and uses them in a database query to check whether the data is a duplicate before inserting it into the corresponding table.
My NodeJS code will be called from a PHP script, so I need to know when it ends; this is why I need to add process.exit(0) somewhere. The problem is that if I add it, the script is terminated and my promise never gets the time to send back the result.
Here is my code:
var bar = new Promise((resolve, reject) => {
    result.forEach((row, index, array) => {
        var escaped = _.map(row, mysql.escape);
        var checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " + escaped[1] +
            " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " + escaped[4] +
            " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
        conn.query(checkQuery, function (err, res) {
            if (err) {
                console.log("Error checking row for duplicate");
                console.log(checkQuery);
                process.exit(1);
            } else {
                if (res[0].found == 0) {
                    var query = " (";
                    var escaped = _.map(row, mysql.escape);
                    var csv = escaped.join(',');
                    query += csv;
                    query += ")";
                    query += row !== _.last(result) ? ',' : ';';
                    console.log(query); // This will change to inserting the data to the table
                } else {
                    console.log("Duplicate found!");
                }
            }
        });
        if (index === array.length - 1) resolve();
    });
});

bar.then(() => {
    console.log('All done!');
    process.exit(0);
});
If I remove process.exit(0), I see "All done!" first and then the console.log(query) output.
If I add it, the script is terminated and I see "All done!" only.
Is there a better approach to this task, please?
Thanks.
Here is a way to wait for a promise before the application exits: a pending timer keeps the Node event loop alive until the promise settles and okToQuit clears it.
class Waiter {
    private timeout: any
    constructor() {
        this.waitLoop()
    }
    // Re-arm a long timer so there is always a pending task
    // keeping the event loop (and the process) alive.
    private waitLoop(): void {
        this.timeout = setTimeout(() => { this.waitLoop() }, 100 * 1000)
    }
    okToQuit(): void {
        clearTimeout(this.timeout)
    }
}

// Your app.
const appPromise: Promise<any> = ...
const w = new Waiter()
appPromise.finally(() => {
    console.log("Quitting")
    w.okToQuit()
})
Running multiple asynchronous operations in a loop and tracking when everything is done is far, far easier if you use promises for each individual asynchronous operation rather than trying to track asynchronous operations that use plain callbacks.
You don't say exactly what your database is, but if it's MySQL there is a mysql2/promise driver that natively supports promises, and my recommendation would be to switch to it; then you can directly use the promise returned from .query(). But without information about your specific database driver, I've shown how to manually promisify .query().
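For reference, a minimal sketch of the mysql2/promise interface (the connection options are placeholders):

const mysql = require('mysql2/promise');

async function example() {
    // placeholder credentials - substitute your own
    const conn = await mysql.createConnection({
        host: 'localhost',
        user: 'user',
        password: 'secret',
        database: 'mydb'
    });
    // query() resolves to [rows, fields]; no callback needed
    const [rows] = await conn.query(
        'SELECT COUNT(*) AS found FROM data WHERE field1 = ?', ['value']);
    console.log(rows[0].found);
    await conn.end();
}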
Then, the looping code can use a for loop and await to sequence the database calls so it's easy to know when they are all complete.
const { promisify } = require('util');

async function someFunc() {
    // other code here

    // promisify conn.query (or use a promise interface directly from the database driver)
    conn.queryP = promisify(conn.query);
    let checkQuery; // declared outside try so the catch block can log it
    try {
        for (const row of result) {
            const escaped = _.map(row, mysql.escape);
            checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " +
                escaped[1] + " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " +
                escaped[4] + " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
            let res = await conn.queryP(checkQuery);
            if (res[0].found == 0) {
                const csv = _.map(row, mysql.escape).join(',');
                const terminator = row !== _.last(result) ? ',' : ';';
                const query = " (" + csv + ")" + terminator;
                console.log(query); // This will change to inserting the data to the table
            } else {
                console.log("Duplicate found!");
            }
        }
    } catch (e) {
        console.log("Error checking row for duplicate: ", checkQuery);
        console.log(e);
        process.exit(1);
    }
    console.log('All done!');
    process.exit(0);
}
The code appears to be trying to build a query inside the loop where each iteration adds on to the previous one (that's what row !== _.last(result) ? ',' : ';' suggests, anyway). If that's the case, the query variable has to be moved outside the loop so it can accumulate from one iteration to the next. But you don't show what you're really trying to do with that query, so you're on your own for that part.
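If that batching is indeed the goal, here is a sketch of the same loop accumulating the VALUES tuples and running a single INSERT at the end (assuming the same data table and field1..field7 columns as in the question):

// collect one "(v1,v2,...)" tuple per non-duplicate row
const tuples = [];
for (const row of result) {
    const escaped = _.map(row, mysql.escape);
    // ...same duplicate check with conn.queryP as above...
    tuples.push("(" + escaped.join(',') + ")");
}
if (tuples.length > 0) {
    await conn.queryP("INSERT INTO data (field1, field2, field3, field4, field5, field6, field7) VALUES " +
        tuples.join(',') + ";");
}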
You decide beforehand how many promises will go out, count them as they resolve, and then exit.
This example applies the same principle, but with callback functions instead of promises. With promises you would call a count function from the .then() or .finally(), and the count function decides whether it is time to exit.
A mongoose example from a JavaScript server:
let g_DB = null;

// init mongoose
const mongoose = require("mongoose");
const connectionParams = {
    useNewUrlParser: true,
    useUnifiedTopology: true,
};
const connStr1 = "mongodb+srv://XX:XX@clusterXX.XX.mongodb.net/XX?retryWrites=true&w=majority";
mongoose.set("strictQuery", false);
mongoose.connect(connStr1, connectionParams)
    .then(handleConnection)
    .catch((err) => console.log("Error:", err));
// end script

// handleConnection - start on successful response from mongoose connection
function handleConnection(msg) {
    console.log("mongoose has connected to Mongo Atlas successfully");
    g_DB = mongoose.connection;
    g_DB.once("open", function () {
        console.log("mongoose has connected to Mongo Atlas Cluster using database XX");
        doTest();
    });
}

//---------------------------------------------------
function doTest() {
    console.log("test-05: create 500 books");
    //---- MODEL ----
    const _schema = new mongoose.Schema({
        name: String,
        price: Number,
        quantity: Number,
    });
    // g_DB is a mongoose connection set earlier in the script
    const _model = g_DB.model("book_schema", _schema, "bookstore");
    let loopcount = 500;
    let waitcount = loopcount;
    for (let i = 0; i < loopcount; i++) {
        const _m = new _model({
            name: `WHY MAKE 500 BOOKS ${new Date().toISOString()}`,
            price: 200,
            quantity: 2000,
        });
        _m.save((e, x) => {
            if (e) return console.error(e);
            // count down as each save completes; exit when all are done
            console.log(x, `waitcount: ${--waitcount}`);
            if (!waitcount) doExit();
        });
    }
}

//--
function doExit() {
    console.log("exit from server");
    process.exit();
}
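For the promise-based variant described above, the manual countdown can be replaced with Promise.allSettled (a sketch, assuming the same _model and doExit; mongoose's save() returns a promise when called without a callback, and Promise.allSettled needs Node 12.9+):

const saves = [];
for (let i = 0; i < loopcount; i++) {
    const _m = new _model({
        name: `WHY MAKE 500 BOOKS ${new Date().toISOString()}`,
        price: 200,
        quantity: 2000,
    });
    saves.push(_m.save()); // no callback: save() returns a promise
}
// resolves once every save has settled, then it is safe to exit
Promise.allSettled(saves).then(doExit);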
Use reject/resolve to manage promises in Node.js.
When your task fulfils your request, send the result back with resolve(); if it is failing, use reject().
In your case you are not managing the promise properly, which is why it runs asynchronously; it is better done the following way, with proper returns.
var bar = new Promise((resolve, reject) => {
    return result.forEach((row, index, array) => {
        var escaped = _.map(row, mysql.escape);
        var checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " + escaped[1] +
            " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " + escaped[4] +
            " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
        return conn.query(checkQuery, function (err, res) {
            if (err) {
                console.log("Error checking row for duplicate");
                console.log(checkQuery);
                return reject(err);
            } else {
                if (res[0].found == 0) {
                    var query = " (";
                    var escaped = _.map(row, mysql.escape);
                    var csv = escaped.join(',');
                    query += csv;
                    query += ")";
                    query += row !== _.last(result) ? ',' : ';';
                    console.log(query); // This will change to inserting the data to the table
                    return resolve(query);
                } else {
                    console.log("Duplicate found!");
                    return reject('Duplicate Found');
                }
            }
        });
    });
});

bar.then((data) => {
    console.log('All done!');
});
In the code above I return the query together with resolve/reject, which makes it run in a more synchronized way.
return conn.query(checkQuery, function (err, res) {
Also, when this promise settles I handle it with .then((data) => ...), so the resolved value can be used there.
bar.then((data) => {
    console.log('All done!');
});
Note: if a promise is rejected, its value won't be available in the .then block above; you'll find the rejection in the catch block, so the code changes as follows.
bar.then((data) => {
    console.log('All done!');
}).catch(err => {
    console.log(err);
});
You can try the following:
(async () => {
    await new Promise((resolve, reject) => {
        result.forEach((row, index, array) => {
            var escaped = _.map(row, mysql.escape);
            var checkQuery = "SELECT COUNT(*) as found FROM data WHERE field1 = " + escaped[0] + " AND field2 = " + escaped[1] +
                " AND field3 = " + escaped[2] + " AND field4 = " + escaped[3] + " AND field5 = " + escaped[4] +
                " AND field6 = " + escaped[5] + " AND field7 = " + escaped[6] + ";";
            conn.query(checkQuery, function (err, res) {
                if (err) {
                    console.log("Error checking row for duplicate");
                    console.log(checkQuery);
                    process.exit(1);
                } else {
                    if (res[0].found == 0) {
                        var query = " (";
                        var escaped = _.map(row, mysql.escape);
                        var csv = escaped.join(',');
                        query += csv;
                        query += ")";
                        query += row !== _.last(result) ? ',' : ';';
                        console.log(query); // This will change to inserting the data to the table
                    } else {
                        console.log("Duplicate found!");
                    }
                }
            });
            if (index === array.length - 1) resolve();
        });
    });
    console.log('All done!');
})();
You don't even need to call process.exit(0), because the process will terminate on its own when the job is done :)

how to dynamically insert multiple rows into Google Cloud Spanner

In the code snippet below I am inserting multiple rows with static values. How do I insert multiple rows dynamically into the Spanner database in one transaction?
function writeUsingDml(instanceId, databaseId, projectId) {
    const {Spanner} = require('@google-cloud/spanner');
    const spanner = new Spanner({
        projectId: projectId,
    });
    const instance = spanner.instance(instanceId);
    const database = instance.database(databaseId);
    database.runTransaction(async (err, transaction) => {
        if (err) {
            console.error(err);
            return;
        }
        try {
            const rowCount = await transaction.runUpdate({
                sql: `INSERT Singers (SingerId, FirstName, LastName) VALUES
                    (12, 'Melissa', 'Garcia'),
                    (13, 'Russell', 'Morales'),
                    (14, 'Jacqueline', 'Long'),
                    (15, 'Dylan', 'Shaw')`,
            });
            console.log(`${rowCount} records inserted.`);
            await transaction.commit();
        } catch (err) {
            console.error('ERROR:', err);
        } finally {
            database.close();
        }
    });
}
For now I am inserting a single row dynamically as follows, and I want to extend this to multiple rows:
var sqlString = "Insert " + tName;
var cNames = "( ";
var cValues = "( ";
for (var col in cols) {
    cNames = cNames + col + ", ";
    const cValue = cols[col];
    if (typeof cValue == 'string' || cValue instanceof String) {
        cValues = cValues + "'" + cValue + "', ";
    } else {
        cValues = cValues + cValue + ", ";
    }
}
return sqlString + cNames + ") values " + cValues + ")";
If you want to insert many rows, it may be easier to use BatchTransaction.insert
https://cloud.google.com/nodejs/docs/reference/spanner/2.1.x/BatchTransaction#insert
database.runTransaction(async (err, transaction) => {
    if (err) {
        console.error(err);
        return;
    }
    try {
        var itemsToInsert = [
            { SingerId: 12, FirstName: 'Mellissa', LastName: 'Garcia' },
            { SingerId: 13, FirstName: 'Russell', LastName: 'Morales' },
            { SingerId: 14, FirstName: 'Jacqueline', LastName: 'Long' },
            { SingerId: 15, FirstName: 'Dylan', LastName: 'Shaw' },
        ];
        // insert() buffers the mutations locally; they are applied at commit()
        transaction.insert('Singers', itemsToInsert);
        await transaction.commit();
        console.log(`${itemsToInsert.length} records inserted.`);
    } catch (err) {
        console.error('ERROR:', err);
    } finally {
        database.close();
    }
});
If you want to use DML though, you can build the SQL statement programmatically. The following will build the insert statement. I didn't run your version, but I think it may have syntax errors related to trailing commas.
function buildSqlInsert(tableName, columnNames, rows) {
    var sqlString = "INSERT " + tableName + " (" + columnNames.join(', ') + ") VALUES \n";
    // add each row, being careful to match the column order with columnNames
    rows.forEach(row => {
        sqlString += "(";
        columnNames.forEach(columnName => {
            var columnValue = row[columnName];
            // strings should be quoted
            if (typeof columnValue == 'string' || columnValue instanceof String) {
                columnValue = "'" + columnValue + "'";
            }
            sqlString += columnValue + ", ";
        });
        // trim added chars
        sqlString = sqlString.substring(0, sqlString.length - 2);
        sqlString += "),\n";
    });
    // trim added comma/newline
    sqlString = sqlString.substring(0, sqlString.length - 2);
    return sqlString;
}

insertStatement = buildSqlInsert('Singers', Object.keys(itemsToInsert[0]), itemsToInsert);

const rowCount = await transaction.runUpdate({
    sql: insertStatement,
});
You can also bind the rows you want to insert to an array-of-struct parameter and use it in an insert statement like the following:
INSERT INTO Singers (SingerId, FirstName, LastName)
SELECT * FROM UNNEST(@struct_array_param)
where @struct_array_param is a bound parameter containing an array of STRUCTs (tuples). In addition to being more succinct and safer SQL, bound parameters offer the additional advantage of reusing the same cached query plan across multiple executions of the same insert statement (with different bound values).
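A rough sketch of what that can look like with the Node.js client (the exact params shape for STRUCT arrays is an assumption here; check the client's documentation on parameterized queries):

// sketch only: bind the rows as an array-of-STRUCT parameter
const rowsToInsert = itemsToInsert.map(item => Spanner.struct(item));
const rowCount = await transaction.runUpdate({
    sql: `INSERT INTO Singers (SingerId, FirstName, LastName)
          SELECT * FROM UNNEST(@struct_array_param)`,
    params: { struct_array_param: rowsToInsert },
});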

NodeJS, partial binding of Oracle DB parameters gives ORA-01036

I'm working on a task that does some inserts/updates on some tables.
When I use the full set of the table's columns in the insert/update statements it works for me, but whenever I update only the required columns and leave the remaining ones untouched I get ORA-01036.
I'm calling a generic Lambda function, passing the query and parameters.
Success scenario:
{
    "stage": "dev",
    "params": {
        "id": 5956049,
        "groupName": "testtoberemoved123",
        "externalName": "Axiom_420K_Wheattest547",
        "description": "desc 123",
        "createdOn": "2018-08-27T22:00:00.000Z",
        "createdBy": "EOM",
        "updatedOn": "2018-08-28T16:16:41.207Z",
        "updatedBy": "EOM",
        "status": 1,
        "vendorID": null,
        "technologyCode": null
    },
    "query": "update assay_group set NAME=:groupName , EXTERNAL_NAME=:externalName, DESCRIPTION=:description ,CREATED_DATE=to_timestamp_tz( :createdOn, 'yyyy-mm-dd\"T\"hh24:mi:ss:ff3 TZH:TZM'),CREATED_USER=:createdBy ,LAST_UPDATED_DATE=to_timestamp_tz( :updatedOn, 'yyyy-mm-dd\"T\"hh24:mi:ss:ff3 TZH:TZM'),LAST_UPDATED_USER=:updatedBy ,GROUP_STATUS=:status,VENDOR_ID=:vendorID,TECHNOLOGY_CODE=:technologyCode where ID=:id",
    "enableObjectFormat": true,
    "options": {
        "autoCommit": true
    }
}
This one runs successfully, but when some columns are removed from the statement it fails, as in the scenario below:
{
    "stage": "dev",
    "params": {
        "id": 5956049,
        "groupName": "testtoberemoved123",
        "externalName": "Axiom_420K_Wheattest547",
        "description": "desc 123",
        "createdOn": "2018-08-27T22:00:00.000Z",
        "createdBy": "EOM",
        "updatedOn": "2018-08-28T16:09:36.215Z",
        "updatedBy": "EOM",
        "status": 3,
        "vendorID": null,
        "technologyCode": null
    },
    "query": "update assay_group set NAME=:groupName where ID=:id",
    "enableObjectFormat": true,
    "options": {
        "autoCommit": true
    }
}
and this results in the following error:
{"errorMessage":"Error while executing query - ORA-01036: illegal variable name/number\n",
Generic executor below:
'use strict';
var oracledb = require("oracledb-for-lambda");
var dbConfig = require('./resources/dbConfig-dev.js');

module.exports.executeQuery = (event, context, callback) => {
    var maxSize = parseInt(process.env.maxRows, 10);

    // Extract enableJSONParse option
    var enableJSONParse = false;
    if (event.enableJSONParse != null && event.enableJSONParse != undefined) {
        enableJSONParse = event.enableJSONParse;
        console.log("enableJSONParse provided in event");
    }
    console.log("Enable JSON Parse: " + enableJSONParse);

    // Extract options
    var options = {};
    if (event.options != null && event.options != undefined) {
        options = event.options;
        console.log("options provided in event");
    }

    // Add maxSize to options
    options.maxRows = maxSize;
    console.log("Options: " + JSON.stringify(options));

    // Set oracledb output format to object
    var enableObjectFormat = event.enableObjectFormat;
    console.log("Enable Object Format: " + enableObjectFormat);
    if (enableObjectFormat) {
        console.log("Object Format Enabled");
        oracledb.outFormat = oracledb.OBJECT;
    } else {
        oracledb.outFormat = oracledb.ARRAY;
    }
    console.log("oracledb.outFormat: " + oracledb.outFormat);

    var currentStage = event.stage;
    console.log("Current Stage: " + currentStage);
    if (currentStage != null && currentStage != 'undefined') {
        var configFileName = './resources/dbConfig-' + currentStage + '.js';
        try {
            dbConfig = require(configFileName);
        } catch (error) {
            callback(new InternalServerError("No dbConfig found - " + error.message));
            return;
        }
    }
    console.log("Using dbConfig: " + JSON.stringify(dbConfig));

    var response = "";
    var parameters = event.params;
    var query = event.query;
    if (query == null || query == undefined || query == "") { // Empty query - throw error
        console.log("Missing required field - query");
        callback(new MissingRequiredFieldError("Missing Required Field - query"));
        return;
    }
    if (parameters == null || parameters == undefined) { // Parameters not provided in event - set to empty list
        console.log("No parameters defined");
        parameters = [];
    }
    console.log("Query: " + query);
    console.log("Query Parameters: " + parameters);

    oracledb.getConnection(
        {
            user: dbConfig.user,
            password: dbConfig.password,
            connectString: dbConfig.connectString
        },
        function (err, connection) {
            if (err) {
                console.error("Connection Error: " + err.message);
                callback(new InternalServerError("Error while connecting to database - " + err.message));
                return;
            }
            // return all CLOBs as Strings
            oracledb.fetchAsString = [oracledb.CLOB];
            connection.execute(
                // The statement to execute
                query,
                parameters, // Query params
                options,    // Options
                // The callback function handles the SQL execution results
                function (err, result) {
                    if (err) {
                        console.error("Execution Error Messages = " + err.message);
                        doRelease(connection);
                        callback(new InternalServerError("Error while executing query - " + err.message));
                        return;
                    }
                    console.log("Query " + query + " Executed Successfully");
                    var resultSet;
                    // In case the query is a SELECT
                    if (result.rows != null && result.rows != undefined) {
                        console.log("Returned rows: " + result.rows.length);
                        console.log("Result.metaData: " + JSON.stringify(result.metaData));
                        console.log("Result.rows: " + JSON.stringify(result.rows));
                        resultSet = result.rows;
                        try {
                            if (result.rows.length != undefined && result.rows.length == 0) {
                                resultSet = [];
                            } else if (enableJSONParse) {
                                if (result.rows[0][0].type == oracledb.CLOB) {
                                    console.log("rows.type is CLOB");
                                    resultSet = JSON.parse(result.rows[0][0]);
                                }
                                resultSet = JSON.parse(result.rows);
                            }
                        } catch (error) {
                            callback(new InternalServerError("Error while parsing result of query: " + error.message));
                            return;
                        }
                    } else { // In case the query is an INSERT/UPDATE/DELETE
                        console.log("Result.rowsAffected: " + result.rowsAffected);
                        if (result.rowsAffected > 0) {
                            resultSet = 'Executed Successfully - Rows Affected: ' + result.rowsAffected;
                        } else {
                            resultSet = 'No rows affected';
                        }
                    }
                    doRelease(connection);
                    callback(null, resultSet);
                });
        });

    // Note: connections should always be released when not needed
    function doRelease(connection) {
        connection.close(
            function (err) {
                if (err) {
                    console.error(err.message);
                    callback(new InternalServerError(err.message));
                    return;
                }
            });
    }
};
The problem is that you are asking Oracle to set values for bind parameters that don't exist.
Let's consider your statement update assay_group set NAME=:groupName where ID=:id. Oracle will parse this and then run through your bind parameters. It will set values for groupName and id fine, and then it will get to the parameter named externalName. However, there is no bind parameter :externalName in your statement.
What's Oracle supposed to do with the value you've given to this non-existent parameter? You seem to be expecting Oracle to just ignore it. However, ignoring it isn't an option: if for example someone mistypes a parameter name this should in my opinion generate an error straight away rather than waiting until all the other parameters have been set and then complaining that one of them was missing.
You will have to pass to your executeQuery function the parameters that are used by the query or statement being executed and no others.
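One generic way to do that is to filter the incoming params down to the bind names that actually appear in the statement. A sketch (this helper is not part of the original executor; the naive regex also matches :tokens inside string literals, which is harmless because names absent from params are simply skipped):

// keep only the params whose :name placeholders occur in the query
function filterBindParams(query, params) {
    var bindNames = (query.match(/:([a-zA-Z_][a-zA-Z0-9_]*)/g) || [])
        .map(function (m) { return m.substring(1); });
    var filtered = {};
    bindNames.forEach(function (name) {
        if (params.hasOwnProperty(name)) {
            filtered[name] = params[name];
        }
    });
    return filtered;
}

// usage inside the executor, before connection.execute:
// parameters = filterBindParams(query, parameters);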

Node.js & Redis & For Loop with bluebird Promises?

I want a function that makes a new JSON object that looks like this:
{ T-ID_12 : [{ text: "aaaaa", kat:"a" }], T-ID_15 : [{ text: "b", kat:"ab" }], T-ID_16 : [{ text: "b", kat:"ab" }] }
Here { text: "aaaaa", kat:"a" } is thesenjsondata, and T-ID_12 is an entry of the array Thesen_IDS. My solution so far is:
function makeThesenJSON(number_these, Thesen_IDS) {
    var thesenjsondata;
    var thesenids_with_jsondata = "";
    for (i = 0; i < number_these; i++) {
        db.getAsync(Thesen_IDS[i]).then(function (res) {
            if (res) {
                thesenjsondata = JSON.parse(res);
                thesenids_with_jsondata += (Thesen_IDS[i] + ' : [ ' + thesenjsondata + " ], ");
            }
        });
    }
    var Response = "{ " + thesenids_with_jsondata + " }";
    return Response;
}
I know that the for loop is faster than db.getAsync(). How can I use Bluebird promises with Redis correctly, so that the return value has all the data I want?
You just create an array of promises out of Redis calls, then use Bluebird's Promise.all to wait for all to come back as an array.
function makeThesenJSON(number_these, Thesen_IDS) {
    return Promise.all(Thesen_IDS.slice(0, number_these).map(function (id) {
        return db.getAsync(id);
    }))
    .then(function (arrayOfResults) {
        var thesenids_with_jsondata = "";
        for (var i = 0; i < arrayOfResults.length; i++) {
            var res = arrayOfResults[i];
            var thesenjsondata = JSON.parse(res);
            thesenids_with_jsondata += (Thesen_IDS[i] + ' : [ ' + thesenjsondata + " ], ");
        }
        return "{ " + thesenids_with_jsondata + " }";
    });
}
Note that this function is async because it returns a promise that will eventually resolve to a string. So you call it like this:
makeThesenJSON(number_these, Thesen_IDS).then(function (json) {
    // do something with json
});
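Note also that concatenating parsed objects back into a string will not produce valid JSON ("[object Object]" ends up in the output). A sketch (named makeThesenJSONObject here to avoid clashing with the function above) that builds a real object and stringifies it instead:

function makeThesenJSONObject(Thesen_IDS) {
    return Promise.all(Thesen_IDS.map(function (id) {
        return db.getAsync(id);
    }))
    .then(function (results) {
        var out = {};
        results.forEach(function (res, i) {
            if (res) {
                // wrap each entry in an array, as in the desired output
                out[Thesen_IDS[i]] = [JSON.parse(res)];
            }
        });
        return JSON.stringify(out);
    });
}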

executing sequential commands in Redis

This code doesn't work and I couldn't find out why.
It always pushes the correctly serialized JSON string for obj, but it always returns with the wrong key. In obj, id increases regularly, but the key doesn't.
var c = redis.createClient(),
    obj = { id: 0, name: "dudu" },
    key = "person:";

c.select(0);
c.multi()
    .incr("idx:person", function (err, _idx) {
        console.log("incr -> #idx: " + _idx);
        key += obj.id = _idx;
        console.log("After Inc obj: " + JSON.stringify(obj));
    })
    .set(key, JSON.stringify(obj), function (err, _setResp) {
        console.log("set -> #_setResp: " + _setResp);
        console.log(JSON.stringify(ihale));
    })
    .get(key, function (er, _obj) {
        console.log("get -> " + key);
        if (er) {
            res.json(er);
        } else {
            console.log("Found: " + JSON.stringify(_obj));
            res.json(_obj);
        }
    })
    .exec(function (err, replies) {
        console.log("MULTI got " + replies.length + " replies");
        replies.forEach(function (reply, index) {
            console.log("Reply " + index + ": " + reply.toString());
        });
    });
c.quit();
This worked:
c.INCR("idx:person", function(a,b) {
obj.id = b;
console.dir(obj);
key = "pa:" + b;
c.set(key, JSON.stringify(obj), function(err, _setResp) {
console.log("set -> #_setResp: " + _setResp);
console.log(JSON.stringify(obj));
c.get(key, function(er, _obj) {
console.log("get -> " + key);
if (er) {
res.json(er);
} else {
console.log("Found: " + JSON.stringify(_obj));
res.json(_obj);
}
});
});
});
The way to do this is simple :)
In event-driven Node, each step executes inside the callback of the previous one:
c.INCR("idx:person", function(a,b) {
obj.id = b;
key = "pa:" + b;
c.set(key, JSON.stringify(obj), function(err, _setResp) {
c.get(key, function(er, _obj) {
if (er) {
res.json(er);
} else {
res.json(_obj);
}
});
});
});
In transaction mode, commands are grouped and sent to Redis together; the EXEC command executes the batch you queued. So when the SET command is queued, key has not yet been updated with the incremented value, which is why the wrong key is used.
For this kind of use, if you still want the commands merged into one unit, script it in Lua:
local keyid = redis.call('INCR', 'idx:person')
local result = redis.call('SET', 'person:'..keyid,ARGV[1])
return 'person:'..keyid
To use it in a Redis EVAL command:
eval "local keyid = redis.call('INCR', 'idx:person'); local result = redis.call('SET', 'person:'..keyid,ARGV[1]);return 'person:'..keyid" 0 "yourJSONObject"
This should work:
client.eval([
    "local keyid = redis.call('INCR', 'idx:person'); local result = redis.call('SET', 'person:'..keyid, ARGV[1]); return 'person:'..keyid",
    0,
    JSON.stringify(obj)
], function (err, res) {
    console.log(res); // gives the person key
});
You can also use a hash instead of a simple key in your example, with separate fields for id, name, and the JSON object. Returning the hash from the Lua script works the same way as returning it from HSET.
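For example, a sketch of that hash variant using the same node_redis client c and obj from above (the field names are illustrative):

c.eval([
    "local id = redis.call('INCR', 'idx:person'); " +
    "redis.call('HMSET', 'person:'..id, 'id', id, 'name', ARGV[1], 'json', ARGV[2]); " +
    "return 'person:'..id",
    0,
    obj.name,
    JSON.stringify(obj)
], function (err, key) {
    console.log(key); // the new hash key
});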
