Is it possible to use parameterized queries in InfluxDB via the Node.js driver?

Consider the scenario: a client app sends me the name of the table I need to get data from, plus additional information such as a date range. How can one construct such queries safely? I use Node.js as my backend and InfluxDB as the data store.
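
One hedged approach (as far as I know, the 1.x node-influx driver has no server-side bound parameters): whitelist the client-supplied measurement name and escape the values with the driver's escaping helpers. A minimal sketch, assuming the influx npm package; the whitelist, database name, and time values are placeholders:

    const Influx = require('influx');

    const influx = new Influx.InfluxDB({ host: 'localhost', database: 'mydb' });

    // Never splice a client-supplied name into the query directly:
    // keep a whitelist of measurements the client may ask for.
    const ALLOWED = new Set(['cpu', 'memory', 'disk']);

    function getRange(measurement, from, to) {
      if (!ALLOWED.has(measurement)) {
        return Promise.reject(new Error('unknown measurement'));
      }
      // escape.measurement quotes the identifier; escape.stringLit
      // produces a safely quoted InfluxQL string literal.
      const query =
        `SELECT * FROM ${Influx.escape.measurement(measurement)} ` +
        `WHERE time >= ${Influx.escape.stringLit(from)} ` +
        `AND time <= ${Influx.escape.stringLit(to)}`;
      return influx.query(query);
    }

    getRange('cpu', '2019-01-01T00:00:00Z', '2019-01-02T00:00:00Z')
      .then(console.log, console.error);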

Related

Using Node.js and Mongoose: Is switching between multiple connections on a MongoDB server an inefficient alternative?

I have several clients. Each client has multiple data-acquisition stations. Each station has multiple sensors. I need to store this data on a MongoDB server (using Mongoose and Node.js). So I came up with the following organization.
Each client has its own database inside the MongoDB server.
Within this database, each station has its own collection.
Data is sent to the MongoDB server via an MQTT broker (Node.js). So, depending on which client the broker receives the message from, I need to create a new connection to the MongoDB server (using mongoose.createConnection).
I'm not sure if this is the best alternative. I don't know if creating multiple different connections will slow down the system.
What do you think?
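
One option worth measuring before multiplying connections: Mongoose's Connection.prototype.useDb switches the logical database while reusing the same underlying socket pool, so the broker can keep a single physical connection. A minimal sketch; the database, collection, and schema names are placeholders:

    const mongoose = require('mongoose');

    // One physical connection (and socket pool) for the whole broker.
    const baseConn = mongoose.createConnection('mongodb://localhost:27017/admin');

    const readingSchema = new mongoose.Schema({
      sensor: String,
      value: Number,
      at: { type: Date, default: Date.now },
    });

    function saveReading(clientId, stationId, reading) {
      // useDb switches the logical database but reuses the same pool;
      // useCache (recent Mongoose) avoids rebuilding the handle per message.
      const db = baseConn.useDb(clientId, { useCache: true });
      // One model/collection per station, as in the proposed layout.
      const Station = db.model(stationId, readingSchema, stationId);
      return Station.create(reading);
    }

    // e.g. from the MQTT message handler:
    saveReading('client-a', 'station-1', { sensor: 'temp', value: 21.5 })
      .catch(console.error);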

Identifying connections to Postgres using node-postgres (pg)

I have an application using a PostgreSQL database with multiple backend APIs. Because we sometimes run out of connections, I would like to be able to identify which backend API has active connections in the pg_stat_activity view. Is it possible to identify the connection from node-postgres?
What you see (in pg_stat_activity) is what you get. If none of those fields are helpful, then I guess the answer is no.
Aren't all the APIs pooled together through the same shared connections?
application_name could help if you can convince Node.js to set it per API. The query text could help if you recognize which query text comes from which API.
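
For the application_name route: node-postgres accepts an application_name option in the client/pool config (it is sent at connection time), so each API can label its own connections. A minimal sketch; the pool settings and the label are placeholders:

    const { Pool } = require('pg');

    // Each backend API builds its pool with a distinctive label;
    // application_name shows up in pg_stat_activity.application_name.
    const pool = new Pool({
      host: 'localhost',
      database: 'appdb',
      user: 'app',
      password: 'secret',
      application_name: 'orders-api', // hypothetical per-API label
    });

    pool.query('SELECT now()')
      .then(r => console.log(r.rows))
      .catch(console.error);

    // Server side, to see who holds connections:
    //   SELECT application_name, count(*)
    //   FROM pg_stat_activity
    //   GROUP BY application_name;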

Need to post JSON from an endpoint to Prometheus using Node.js

I need to post JSON to Prometheus by collecting the data from an endpoint using Node.js. Kindly give me some samples to work with.
You can "push" data to prometheus using the push gateway:
https://github.com/prometheus/pushgateway
This is not recommended because it will not clean up data automatically for you, so you need to have a cron or something periodically delete old data, or your filesystem will eventually fill up.
The way it works is, the pushgateway is a module that you push data to, and then Prometheus pulls data from the pushgateway normally. It's kind of a pain to get up and running, but it's nice to have in situations where you can't pull data for whatever reason (e.g., devices on a LAN can only establish outgoing connections, but aren't directly reachable by a Prometheus server).
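
If you do go the pushgateway route, prom-client ships a Pushgateway helper. A minimal sketch, assuming a recent prom-client where pushAdd returns a promise (older versions take a callback) and a gateway at a placeholder URL:

    const client = require('prom-client');

    // Point at a running pushgateway instance (placeholder URL).
    const gateway = new client.Pushgateway('http://localhost:9091');

    const jobsDone = new client.Counter({
      name: 'jobs_done_total',
      help: 'Jobs completed by this batch run',
    });
    jobsDone.inc(5);

    // Push the default registry under a job name; Prometheus then
    // scrapes the pushgateway on its normal pull schedule.
    gateway.pushAdd({ jobName: 'batch_import' })
      .catch(err => console.error('push failed', err));
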
You cannot 'post to Prometheus'. Prometheus works in pull mode: it scrapes metrics exposed by services and exporters. The easiest way to do this in your Node.js application is to use an existing client library. Look at the examples here: https://github.com/siimon/prom-client/tree/master/example.
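
A minimal pull-mode sketch with prom-client and Express, exposing a /metrics endpoint for Prometheus to scrape; the port and metric names are placeholders:

    const express = require('express');
    const client = require('prom-client');

    // Default Node.js process metrics (event loop lag, GC, memory, ...).
    client.collectDefaultMetrics();

    // A custom counter your handlers can increment.
    const httpRequests = new client.Counter({
      name: 'http_requests_total',
      help: 'Total HTTP requests served',
    });

    const app = express();

    app.get('/', (req, res) => {
      httpRequests.inc();
      res.send('hello');
    });

    // Prometheus scrapes this endpoint on its own schedule (pull model).
    app.get('/metrics', async (req, res) => {
      res.set('Content-Type', client.register.contentType);
      res.end(await client.register.metrics());
    });

    app.listen(3000);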

Can we sync data from MongoDB to Elasticsearch in real time in Node.js?

I am dividing the load on my database and want to retrieve data from ES and write data to MongoDB. Can I sync them in real time? I have checked the Transporter library, but I want real-time sync.
There are several ways to achieve that:
Use your own application server: whenever you insert a new document into Mongo, put it into ES at the same time (see the sketch after this list). That way you maintain consistency with minimal latency.
Use Logstash. It has near-real-time pipelining capabilities.
Use the Elasticsearch MongoDB river, a plugin for data synchronization between Mongo and Elasticsearch.
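
A minimal sketch of the first option (dual writes from your own application server), assuming the official mongodb driver and a v7-style @elastic/elasticsearch client; connection strings, database, and index names are placeholders:

    const { MongoClient } = require('mongodb');
    const { Client } = require('@elastic/elasticsearch');

    const mongo = new MongoClient('mongodb://localhost:27017');
    const es = new Client({ node: 'http://localhost:9200' });

    async function insertDocument(doc) {
      await mongo.connect(); // no-op if already connected
      const items = mongo.db('appdb').collection('items');

      // Write to MongoDB first, then index the same document in ES,
      // reusing Mongo's _id so the two stores stay correlated.
      // Spread `doc` so the driver's _id mutation doesn't leak into ES.
      const { insertedId } = await items.insertOne({ ...doc });
      await es.index({
        index: 'items',
        id: insertedId.toString(),
        body: doc,
      });
      return insertedId;
    }

    insertDocument({ name: 'widget', price: 9.99 }).catch(console.error);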

EMFILE error on bulk data insert

I'm developing a LoopBack application to get data from Oracle using the oracledb npm module and convert it to JSON format to be stored in MongoDB.
MongoDB is accessed using "loopback-connector-mongodb".
The data to be stored spans around 100 collections, one for each of 100 Oracle tables.
I'm pushing data row by row for the entire collection list, via HTTP requests from the Node server in my local application to another application on a remote machine, through a remote method call.
During the write operation, the remote application stops, throwing an "EMFILE" error.
I googled, and some references showed that it is because of the maximum number of open files/sockets. Hence I tried disconnecting the DataSource for each request. I'm still getting the same error.
Please help me on this!!
Thanks.
If you are making an HTTP request for each row of data, and aren't throttling or otherwise controlling the order of those requests, it is possible you are simply making too many requests at once because of Node's async I/O model.
For example, making those calls in a simple for loop would actually result in all of them being made in parallel.
If this is the case, you might want to consider using something like the async module, which includes some utilities for throttling the parallelism.
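
A minimal sketch of that throttling, using async.eachLimit to cap the number of in-flight requests; the host, path, and postRow helper are hypothetical:

    const async = require('async');
    const http = require('http');

    // Hypothetical helper: POST one row to the remote application.
    function postRow(row, cb) {
      const body = JSON.stringify(row);
      const req = http.request({
        host: 'remote.example.com', // placeholder host
        path: '/api/rows',          // placeholder endpoint
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
      }, res => {
        res.resume(); // drain the response so the socket is released
        res.on('end', () => cb());
      });
      req.on('error', cb);
      req.end(body);
    }

    // eachLimit keeps at most 10 requests in flight, instead of firing
    // one request per row all at once (which exhausts file descriptors
    // and triggers EMFILE).
    function pushRows(rows, done) {
      async.eachLimit(rows, 10, postRow, done);
    }
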
Don't forget Oracle Database 12.1.0.2 has JSON support. Maybe you don't need to move the data in the first place?
See JSON in Oracle Database. To quote the manual:
'Oracle Database supports JavaScript Object Notation (JSON) data natively with relational database features, including transactions, indexing, declarative querying, and views.'
