I want to fetch data from PostgreSQL using Node.js, so I am using the 'pg-promise' package, but I can't get anything back.
var pgp = require("pg-promise")( /*options*/ );
var db = pgp("postgres://username:password#localhost:5432/database");

db.any('SELECT * FROM mytable')
    .then(function (data) {
        console.log("ABC");
    });
I can't see the output ("ABC") in the console (cmd).
I am using PostgreSQL 12.0.
What did I do wrong?
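Two things stand out in the snippet above: the connection string uses '#' where '@' normally separates the credentials from the host, and the promise chain has no .catch, so a failed connection rejects silently and "ABC" never prints. A minimal sketch of the silent-rejection problem, using a hypothetical fakeQuery function in place of pg-promise:

```javascript
// fakeQuery is a stand-in for db.any(): it rejects when the connection
// string is malformed (no '@' between credentials and host).
function fakeQuery(connString) {
    if (connString.indexOf('@') === -1) {
        return Promise.reject(new Error('invalid connection string'));
    }
    return Promise.resolve([{ id: 1 }]);
}

fakeQuery('postgres://user:pass#localhost:5432/db')
    .then(function (data) {
        console.log('ABC'); // never reached: the promise rejected
    })
    .catch(function (err) {
        // without this handler the rejection is swallowed and nothing prints
        console.log('query failed: ' + err.message);
    });
```

With a .catch in place, the real error message (whatever pg-promise reports) appears in the console instead of silence.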
I'm using Elastic APM to profile my NestJS application; my APM agent is elastic-apm-node.
My ORM is TypeORM and my database is Oracle.
The problem is that the APM agent does not record database query spans, so I can't see them in the Kibana UI.
Can anyone help me?
Unfortunately, Oracle is not supported by the Elastic APM agent, so you have to wrap OracleQueryRunner yourself to start and end spans manually. Put this code in your main.ts file:
import * as apm from 'elastic-apm-node'; // the agent must already be started
import { OracleQueryRunner } from 'typeorm/driver/oracle/OracleQueryRunner';

// Keep a reference to the original method, then replace it with a wrapper
const originalQuery = OracleQueryRunner.prototype.query;

OracleQueryRunner.prototype.query = async function (...args) {
    const span = apm.startSpan('query');
    if (span) {
        span.type = 'db';
        span.action = args[0]; // the SQL text
    }
    try {
        return await originalQuery.apply(this, args);
    } finally {
        // end the span even when the query throws
        if (span) {
            span.end();
        }
    }
};
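The same wrap-and-record pattern, stripped of the APM and TypeORM specifics (Runner below is a hypothetical stand-in class, not a real TypeORM type):

```javascript
// Hypothetical query-runner class, for illustration only
function Runner() {}
Runner.prototype.query = function (sql) {
    return 'rows for ' + sql;
};

// Keep the original method, then wrap it to record each call
var originalQuery = Runner.prototype.query;
var recordedSpans = [];
Runner.prototype.query = function () {
    recordedSpans.push({ action: arguments[0] }); // span "start"
    var result = originalQuery.apply(this, arguments);
    // span "end" would go here
    return result;
};

console.log(new Runner().query('SELECT 1 FROM DUAL')); // rows for SELECT 1 FROM DUAL
console.log(recordedSpans[0].action);                  // SELECT 1 FROM DUAL
```

Because every instance shares the prototype, patching it once is enough to intercept all queries made anywhere in the process.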
It seems I can't find a proper way to use the read/write functions for admin in the Cloud Functions. I am working on a messaging function that reads new messages created in the Realtime Database with Cloud Functions Node.js and uses the snapshot to reference a path. Here is my initial exports function:
var messageRef = functions.database.ref('Messages/{chatPushKey}/Messages/{pushKey}');
var messageText;

exports.newMessageCreated = messageRef.onCreate((dataSnapshot, context) => {
    console.log("Exports function executed");
    messageText = dataSnapshot.val().messageContent;
    var chatRef = dataSnapshot.key;
    var messengerUID = dataSnapshot.val().messengerUID;
    return readChatRef(messengerUID, chatRef);
});
And here is the function that reads from the value returned:
function readChatRef(someUID, chatKey) {
    console.log("Step 2");
    admin.database.enableLogging(true);
    var db = admin.database();
    var userInfoRef = db.ref('Users/' + someUID + '/User Info');
    return userInfoRef.on('value', function (snap) {
        return console.log(snap.val().firstName);
    });
}
In the Firebase Cloud Functions log I can read every console.log except the one inside return userInfoRef.on(...). Is my syntax incorrect? I have attempted several other variations for reading the snap. Perhaps I am not using callbacks correctly? I know for a fact that my service account key and admin features are up to date.
If there is another direction I need to be focusing on please let me know.
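A likely culprit: .on() registers a listener and returns the callback itself, not a promise, so the function returns nothing the platform can wait on (the Firebase Admin SDK's ref.once('value') returns a promise instead). A stand-alone sketch of the difference, with on and once as hypothetical stand-ins for the Firebase API:

```javascript
// on()-style: registers a listener and returns the callback, not a promise
function on(event, cb) {
    setImmediate(function () { cb('Alice'); });
    return cb;
}

// once()-style: resolves a promise with the value
function once(event) {
    return Promise.resolve('Alice');
}

// Returning on()'s result gives the platform nothing to await:
console.log(typeof on('value', function () {})); // function

// Returning once()'s promise lets Cloud Functions wait for the read:
once('value').then(function (firstName) {
    console.log(firstName); // Alice
});
```

In a Cloud Function the platform may tear the instance down as soon as the returned value settles, which is why a listener attached with .on() can be killed before its callback ever logs anything.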
Is it possible to use Node.js-style function callbacks with ArangoJS 3.x?
I have seen that ArangoJS 3.x uses the .then method (promises), but I am using Node.js 4.4, so I can't use .then there. Can I use Node.js-style function callbacks with arangojs 3.x?
Quoting the ArangoJS GitHub page:
// ES2015-style
import arangojs, {Database, aql} from 'arangojs';
let db1 = arangojs(); // convenience short-hand
let db2 = new Database();
let {query, bindVars} = aql`RETURN ${Date.now()}`;
// or plain old Node-style
var arangojs = require('arangojs');
var db1 = arangojs();
var db2 = new arangojs.Database();
var aql = arangojs.aql(['RETURN ', ''], Date.now());
var query = aql.query;
var bindVars = aql.bindVars;
// Using a complex connection string with authentication
let host = process.env.ARANGODB_HOST;
let port = process.env.ARANGODB_PORT;
let database = process.env.ARANGODB_DB;
let username = process.env.ARANGODB_USERNAME;
let password = process.env.ARANGODB_PASSWORD;
let db = arangojs({
    url: `http://${username}:${password}@${host}:${port}`,
    databaseName: database
});

// Using ArangoDB 2.8 compatibility mode
let db = arangojs({
    arangoVersion: 20800
});
Isn't that exactly what you were looking for?
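For what it's worth, Node.js 4.x already ships native Promise support, so .then should work there. If you still prefer callback style, a small adapter is enough; callbackify below is a hand-rolled sketch, not part of arangojs:

```javascript
// Adapt any promise-returning call to a Node-style (err, result) callback
function callbackify(promise, cb) {
    promise.then(
        function (result) { cb(null, result); },
        function (err) { cb(err); }
    );
}

// Usage: wrap a promise (here a resolved stand-in for db.query(...))
callbackify(Promise.resolve(['doc1', 'doc2']), function (err, docs) {
    if (err) { return console.error(err); }
    console.log(docs.join(',')); // doc1,doc2
});
```

Node 8+ ships util.promisify for the opposite direction, and util.callbackify for exactly this one; on Node 4 a helper like the above does the job.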
I decided to try WebStorm, mainly for the autocomplete feature, but I've run into an issue with it.
I require a .js file of my project (which in this case is a driver to communicate with my database), but the autocomplete is not working properly:
var db = require('../../config/database');

var Validator = {};

Validator.isAKnownUserId = function (user_id) {
    var query = 'SELECT * FROM users WHERE id = ?';
    db.
};
The database.js file:
var cassandra = require('cassandra-driver');

// Client connecting to the keyspace used by the application
var client = new cassandra.Client({
    keyspace: keyspace,
    contactPoints: ['127.0.0.1']
});
module.exports = client;
As you can see, nothing special. But, for example, the execute function available on cassandra.Client is not autocompleted in my validator.js file when db comes from database.js.
Furthermore, if I replace
var db = require('../../config/database');
with
var db;
db = require('../../config/database');
or
var db = new require('../../config/database');
then the autocomplete is working correctly in my file.
Can someone help me figure out this behavior and how to get proper autocomplete?
Thanks in advance.
I have some code that works OK in PHP.
From the postgres CLI I issue:
NOTIFY job;
The notification is correctly raised by Postgres (I can see it in the PHP client), but I can't read it in Node.
JS:
var pg = require('pg');
var conString = "your postgres information";
var client = new pg.Client(conString);
client.connect();
client.query('LISTEN job');
client.on('notification', function (msg) {
    console.log('data');
});
I would prefer to keep it simple. Is writing a procedure in Postgres the only way to make this work?
OK, the problem was in the conString parameter:
var conString = "tcp://user:pass@localhost/db";
It is also important to check that you are connected to the correct database to receive the notification messages.