Node.js - Apache Cassandra (using DataStax Driver)

I'm trying to insert a map type in Cassandra:
const query = 'INSERT INTO stats.tickets (office, line, generation, inserted_at, meta, number, prefix) VALUES (:office, :line, :generation, :inserted_at, :meta, :number, :prefix)';
const parametersExample = {
  office: "office",
  line: "line",
  generation: 10,
  inserted_at: Date.now(),
  meta: {"tag1": "ujkukkik", "tag2": "asdascee"},
  number: 1,
  prefix: "prefix_"
};
const result = async () => {
  return await client.execute(query, parametersExample, { prepare: true });
};
result().then(res => {
  res.rows.map(row => console.log(row.content));
  process.exit();
}).catch(err => console.error(err));
The code inserts the row, but then this error is thrown:
TypeError: Cannot read property 'map' of undefined
at lib/handleData.js:23:14
at <anonymous>
at process._tickDomainCallback (internal/process/next_tick.js:228:7)
What is the reason?

That's normal behavior. Per the CQL documentation:
INSERT returns no results unless IF NOT EXISTS is used.
So the driver receives no rows and result.rows is undefined; .rows.map() is only meaningful for SELECT queries (or conditional INSERTs).
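If you just want the script not to crash, a minimal sketch (reusing the question's result()) is to guard against the missing rows:
result().then(res => {
  if (res.rows) { // rows is undefined for a plain INSERT
    res.rows.map(row => console.log(row.content));
  }
  process.exit();
}).catch(err => console.error(err));
Alternatively, INSERT ... IF NOT EXISTS does return a single row, with an [applied] column indicating whether the write actually happened.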


Why can't I close mongoose after this promise?

I have a MongoDB instance to which I want to add entries, and I am using the mongoose library for this.
import mongoose from 'mongoose'
const personSchema = new mongoose.Schema({
  name: String,
  number: String
})

const Person = mongoose.model('Person', personSchema)

// addToPhonebook adds an entry to the phonebook
const addToPhonebook = (name, number) => {
  const person = new Person({
    name: name,
    number: number
  })
  return person.save()
    .then(_result => {
      console.log(`added ${name} number ${number} to phonebook`)
      mongoose.connection.close()
    },
    error => console.log(`nothing happened: ${error}`))
}
// Do some initialization
const url = 'mongodb+srv://...'
await mongoose.connect(url, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
  useFindAndModify: false,
  useCreateIndex: true,
}).then(
  _result => {
    console.log('logged in to phonebook')
  },
  error => {
    console.log(`couldn't log in to phonebook: ${error}`)
    process.exit(2)
  }
)

await addToPhonebook('Some name', '1-800-xxx-xxxx')
The code above works as expected. But, if I were to rearrange addToPhonebook such that mongoose.connection.close() is called after the first .then:
const addToPhonebook = (name, number) => {
  const person = new Person({
    name: name,
    number: number
  })
  return person.save()
    .then(_result => console.log(`added ${name} number ${number} to phonebook`),
      error => console.log(`nothing happened: ${error}`))
    .then(mongoose.connection.close())
}
It always logs nothing happened: MongoError: server is closed.
Am I doing something wrong or is my knowledge about promises faulty?
You're using your second .then() wrong. You have to change it to this:
.then(() => mongoose.connection.close())
Explanation:
.then() needs a function as its parameter. If you write it like this:
.then(mongoose.connection.close())
...you're calling .close() immediately, while the chain is still being set up, and passing its return value to .then(). The connection is therefore closed before save() has settled, which is why you get server is closed.
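The difference in a nutshell (a generic sketch, not mongoose-specific):
somePromise.then(fn());       // fn runs immediately; its return value is handed to .then()
somePromise.then(fn);         // fn runs later, when somePromise resolves
somePromise.then(() => fn()); // same as above, wrapped in an arrow function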
mongoose.connection.close() is deprecated; use mongoose.disconnect() instead.
I would also suggest not closing the connection after every query. Maybe you need that at this moment, but try to make it more robust.
This Stack Overflow thread is a good reference on when to close the connection, i.e. when your Node process is about to end, or anything else is causing your application to shut down or restart.
const addToPhonebook = (name, number) => {
  const person = new Person({
    name: name,
    number: number
  })
  return person.save()
    .then(_result => console.log(`added ${name} number ${number} to phonebook`),
      error => console.log(`nothing happened: ${error}`))
    .then(mongoose.disconnect) // a function reference: executed only when the promise settles
}
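As a sketch of that "close only on shutdown" idea (the signal handler below is my assumption, not part of the original answer):
// Disconnect once, when the process is terminating,
// instead of after every query.
process.on('SIGINT', () => {
  mongoose.disconnect().then(() => {
    console.log('mongoose connection closed on app termination')
    process.exit(0)
  })
})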

Mongoose saves invalid data without throwing validation errors if model.validate() is called first

MongoDB 4.2.2 and Mongoose 5.8.3 (latest) and NodeJS 13.3.0 (Windows x64)
If I create a schema and model, then create an instance of the model and add some data, then run validate(), then save(): even if validate() fails, the data is saved into the collection, without throwing an additional validation error.
Is this a bug, or am I doing something wrong?
Here's the test code:
var mongoose = require('mongoose')
mongoose.connect("mongodb://user:pass@localhost/mydb")
db = mongoose.connection

var Schema = mongoose.Schema

var PartSchema = new Schema({
  name: {
    type: String,
    required: true,
    validate: {
      validator: (v) => v !== 'asdf' // Don't allow name to be 'asdf'
    }
  },
  number: {
    type: String,
    required: true,
    validate: {
      validator: (v) => !v.includes(' ') // Don't allow spaces in part number.
    }
  }
})

var ProductSchema = new Schema({
  name: String,
  parts: [PartSchema]
})

var Part = mongoose.model('Part', PartSchema)
var Product = mongoose.model('Product', ProductSchema)

var p1 = new Product({name: "Baseball Bat", parts: [new Part({name: "First part", number: "003344"}), new Part({name: "Second part", number: "554422"})]})
p1.parts.push(new Part({name: "No number, so invalid"})) // this one is invalid because no part number is specified (required)
p1.parts.push(new Part({name: 'asdf', number: 'zzzzzaaaa'}))
p1.parts.push(new Part({name: 'bbbb', number: 'with a space'})) // This one is invalid because number has spaces.

p1.validate()
  .then(() => { console.log('Validation successful') })
  .catch((err) => { console.log("Validation failed.") })

p1.save()
  .then(() => { console.log("Saved successfully") })
  .catch((err) => { console.log("Save ERROR", err) })
Running this code yields the following:
Validation failed.
Saved successfully
And the new document appears in the database.
However, if I remove the p1.validate() before calling save(), the save function's catch() block triggers and the item is not saved:
Save ERROR Error [ValidationError]: Product validation failed: parts.2.number: Path `number` is required., parts.3.name: Validator failed for path `name` with value `asdf`, parts.4.number: Validator failed for path `number` with value `with a space`
at ValidationError.inspect
... snipped
Maybe you need to call p1.save() inside the promise chain:
p1.validate()
  .then(res => {
    console.log("Validation successful");
  })
  .then(() => {
    return p1.save();
  })
  .then(res => {
    console.log("saved success ", res);
  })
  .catch(err => {
    console.log("Some error.", err);
  });
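The same ordering written with async/await (an equivalent sketch, not from the original answer) makes it explicit that save() only runs once validation has passed:
async function validateAndSave(doc) {
  try {
    await doc.validate();         // rejects on validation failure
    const res = await doc.save(); // only reached when validation passed
    console.log("saved success ", res);
  } catch (err) {
    console.log("Some error.", err);
  }
}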

Problem when re-creating tables and inserting data in BigQuery using the Node API

I get unexpected behaviour when loading data into BigQuery just after creating the schema.
I'm using the Node API to insert data via the BigQuery streaming API.
In order to reset the data, I delete and re-create the tables before loading anything.
My problem: the first time it works fine, but if I execute it again it fails.
The process always deletes and re-creates the table schema, but does not insert the data, unless I wait a moment before executing it again.
This is the code which reproduces the case:
async function loadDataIntoBigquery() {
  const {BigQuery} = require('@google-cloud/bigquery')
  const tableName = "users"
  const dataset = "data_analisis"
  const schemaUsers = "name:string,date:string,type:string"
  const userData = [{name: "John", date: "20/08/93", type: "reader"}, {
    name: "Marie",
    date: "20/08/90",
    type: "owner"
  }]
  try {
    const bigquery = new BigQuery()
    await bigquery.createDataset(dataset).then(err => console.log("dataset created successfully")).catch(err => {
      console.log("warn: maybe the dataset already exists")
    })
    await bigquery.dataset(dataset).table(tableName).delete().then(err => console.log("table deleted successfully")).catch((err) => {
      console.log("Error: maybe the table does not exist")
    })
    await bigquery.dataset(dataset).createTable(tableName, {schema: schemaUsers}).then(() => console.log("table created successfully")).catch(err => console.log("Error: maybe the table already exists"))
    await bigquery.dataset(dataset).table(tableName).insert(userData).then((data) => console.log("Ok inserted ", data)).catch(err => console.log("Error: can't insert "))
  } catch (err) {
    console.log("err", err)
  }
}
To verify that the data was inserted, I'm using this query:
select * from `data_analisis.users`
I had the same issue. As a workaround, I insert the data with a query instead:
const query = "INSERT INTO `"+dataset+"."+tableName"` (name, date, type ) VALUES ("+name+",'"+date+"','"+type+"')";
await bigQuery.query({
query: query,
useLegacySql: false,
location: 'EU'
}, (err) => {
console.log("Insertion error : ",err);
})
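Another option, assuming the cause is simply that the freshly re-created table is not yet visible to the streaming backend, is to retry the streaming insert with a delay (a sketch reusing bigquery, dataset, tableName and userData from the question):
// Retry the streaming insert a few times, waiting between attempts,
// since a table that was just re-created may not be reachable
// by the streaming API immediately.
async function insertWithRetry(retries = 5, delayMs = 3000) {
  for (let i = 0; i < retries; i++) {
    try {
      await bigquery.dataset(dataset).table(tableName).insert(userData)
      console.log("Ok inserted")
      return
    } catch (err) {
      console.log(`insert attempt ${i + 1} failed, retrying...`)
      await new Promise(resolve => setTimeout(resolve, delayMs))
    }
  }
  throw new Error("can't insert after retries")
}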

Convert a ColumnSet column into geometry with pg-promise

I'm creating a ColumnSet object with pg-promise, according to this:
const cs = new pgp.helpers.ColumnSet([
  {name: 'Id', prop: 'Id'},
  {name: 'Lat', prop: 'Lat'},
  {name: 'Lng', prop: 'Lng'},
  {name: 'CreationDateTime', prop: 'CreationDateTime'},
  {name: 'Topic', prop: 'Topic'},
  {name: 'UserId', prop: 'UserId'},
  {name: 'shape', mod: ':raw', prop: 'shape', def: 'point'},
  {name: 'UserName', prop: 'UserName'},
  {name: 'appName', prop: 'appName'},
  {name: 'appVersion', prop: 'appVersion'}
], {
  table: 'Location'
});
In def: 'point', point is meant to be a method that converts into geometry. But def takes a value; how can I run the point method and bind its result to this column (shape)?
And I wrote this method for bulk inserting:
async function insertMany(values) {
  try {
    let results = await db.none(pgp.helpers.insert(values, cs));
  } catch (error) {
    console.log(error);
  }
}
For converting lat and lng I wrote this method:
const point = (lat, lng) => ({
  toPostgres: () => pgp.as.format('ST_SetSRID(ST_MakePoint($1, $2), 4326)', [Lag, Lng]),
  rawType: true
});
But I got this error:
TypeError: Values null/undefined cannot be used as raw text
According to this page:
Raw-text variables end with :raw or symbol ^, and prevent escaping the text. Such variables are not allowed to be null or undefined, or the method will throw TypeError = Values null/undefined cannot be used as raw text.
When the point method is not executed, the shape field is of course null.
First, you are misusing the option prop, which, as documented, is to be used when the destination property name differs from the column name; that is not your case.
And def, as also documented, supplies the value when the property is missing. When the property is there but set to null or undefined, the value of def isn't used.
You are trying to override the resulting value, which means you need to use the property init.
Another issue: the variables inside your point implementation switch letter case (the parameters are lat and lng, but the body references Lag and Lng).
All in all, your code should look something like this:
const getPoint = col => {
  const p = col.value;
  // we assume that when not null, the property is an object of {lat, lng},
  // otherwise we will insert NULL.
  return p ? pgp.as.format('ST_SetSRID(ST_MakePoint(${lat}, ${lng}), 4326)', p) : 'NULL';
};
const cs = new pgp.helpers.ColumnSet([
  'Id',
  'Lat',
  'Lng',
  'CreationDateTime',
  'Topic',
  'UserId',
  {name: 'shape', mod: ':raw', init: getPoint},
  'UserName',
  'appName',
  'appVersion',
], {
  table: 'Location'
});
And a version that uses Custom Type Formatting would look like this:
const getPoint = col => {
  const p = col.value;
  if (p) {
    return {
      toPostgres: () => pgp.as.format('ST_SetSRID(ST_MakePoint(${lat}, ${lng}), 4326)', p),
      rawType: true
    };
  }
  // otherwise, we return nothing, which will result in NULL automatically
};
const cs = new pgp.helpers.ColumnSet([
  'Id',
  'Lat',
  'Lng',
  'CreationDateTime',
  'Topic',
  'UserId',
  {name: 'shape', init: getPoint},
  'UserName',
  'appName',
  'appVersion',
], {
  table: 'Location'
});
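A hypothetical call to the insertMany() from the question, assuming each row carries its coordinates in a shape property of the form {lat, lng} (or null/absent for a NULL geometry):
insertMany([{
  Id: 1,
  Lat: 35.7,
  Lng: 51.4,
  CreationDateTime: new Date(),
  Topic: 'demo',
  UserId: 42,
  shape: {lat: 35.7, lng: 51.4}, // consumed by getPoint
  UserName: 'test',
  appName: 'demoApp',
  appVersion: '1.0.0'
}]);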

Node.js error when inserting data into PostgreSQL

I have a weird error using Node.js with PostgreSQL, and I hope you can help me out.
I have a huge amount of data, about 2 million entries, that I want to insert into my DB.
One entry consists of 4 columns:
id: string
points: float[][]
mid: float[]
occurences: json[]
I am inserting data like so:
let pgp = require('pg-promise')(options);
let connectionString = 'postgres://archiv:archiv@localhost:5432/fotoarchivDB';
let db = pgp(connectionString);

cityNet.forEach((arr) => {
  db
    .none(
      "INSERT INTO currentcitynet(id,points,mid,occurences) VALUES $1",
      Inserts("${id},${points}::double precision[],${mid}::double precision[],${occurences}::json[]", arr))
    .then(data => {
      //success
    })
    .catch(error => {
      console.log(error);
      //error
    });
})

function Inserts(template, data) {
  if (!(this instanceof Inserts)) {
    return new Inserts(template, data);
  }
  this._rawDBType = true;
  this.formatDBType = function() {
    return data.map(d => "(" + pgp.as.format(template, d) + ")").join(",");
  };
}
This works for exactly the first 309248 data pieces, then it suddenly errors out with the following for (what seems like) every further piece of data it tries to insert:
{ error: syntax error at end of input
at Connection.parseE (/home/christian/Masterarbeit_reworked/projekt/server/node_modules/pg-promise/node_modules/pg/lib/connection.js:539:11)
at Connection.parseMessage (/home/christian/Masterarbeit_reworked/projekt/server/node_modules/pg-promise/node_modules/pg/lib/connection.js:366:17)
at Socket.<anonymous> (/home/christian/Masterarbeit_reworked/projekt/server/node_modules/pg-promise/node_modules/pg/lib/connection.js:105:22)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:176:18)
at Socket.Readable.push (_stream_readable.js:134:10)
at TCP.onread (net.js:548:20)
name: 'error',
length: 88,
severity: 'ERROR',
code: '42601',
detail: undefined,
hint: undefined,
position: '326824',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'scan.l',
line: '1074',
routine: 'scanner_yyerror' }
The 'position' entry changes with every repeated error message.
I can redo that and it will always error after 309248 entries.
When I try to insert less, like 1000 entries, the error does not occur.
That really confuses me. I thought PostgreSQL does not have any maximum row count. Also, the error message does not help me at all.
SOLVED
The error was found: "null" entries had slipped into my data. Filtering out the null data fixed it.
I will try out the other recommendations for inserting data, since the current way works but the performance is very poor.
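If the stray entries are null rows in cityNet (my assumption from the description), filtering them is a one-liner:
// Drop null/undefined rows before inserting.
const cleanedCityNet = cityNet.filter(row => row != null);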
I'm the author of pg-promise. Your whole approach should be changed to the one below.
Proper way to do massive inserts via pg-promise:
const pgp = require('pg-promise')({
  capSQL: true
});

const db = pgp(/*connection details*/);

var cs = new pgp.helpers.ColumnSet([
  'id',
  {name: 'points', cast: 'double precision[]'},
  {name: 'mid', cast: 'double precision[]'},
  {name: 'occurences', cast: 'json[]'}
], {table: 'currentcitynet'});

function getNextInsertBatch(index) {
  // retrieves the next data batch, according to the index, and returns it
  // as an array of objects. A normal batch size: 1000 - 10,000 objects,
  // depending on the size of the objects.
  //
  // returns null when there is no more data left.
}

db.tx('massive-insert', t => {
  return t.sequence(index => {
    const data = getNextInsertBatch(index);
    if (data) {
      const inserts = pgp.helpers.insert(data, cs);
      return t.none(inserts);
    }
  });
})
  .then(data => {
    console.log('Total batches:', data.total, ', Duration:', data.duration);
  })
  .catch(error => {
    console.log(error);
  });
UPDATE
And if getNextInsertBatch can only get the data asynchronously, then return a promise from it, and update the sequence->source callback accordingly:
return t.sequence(index => {
  return getNextInsertBatch(index)
    .then(data => {
      if (data) {
        const inserts = pgp.helpers.insert(data, cs);
        return t.none(inserts);
      }
    });
});
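For the in-memory cityNet array from the question, the synchronous getNextInsertBatch could be as simple as slicing the array (a sketch, assuming the data is already fully loaded):
const batchSize = 1000;

function getNextInsertBatch(index) {
  // t.sequence passes 0, 1, 2, ... as the index;
  // returning null ends the sequence, as in the code above.
  const batch = cityNet.slice(index * batchSize, (index + 1) * batchSize);
  return batch.length ? batch : null;
}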
Related Links:
tx
sequence / spex.sequence
ColumnSet
Multi-row insert with pg-promise
I'm not sure, but it looks like you have a wrong data structure at the last element (309249), and PostgreSQL cannot parse some property.
