Heroku - Node.js - Send JSON with data from Postgres database

I'm starting out with Heroku, Node.js, and a Postgres database.
I have several tables named table1, table2, table3, ... table27.
All tables have the same structure of 10 columns.
I'm trying to build a web service that sends all the data from these tables as JSON.
I tried several approaches, with and without the async module, but without success. I only fetched some unwanted data (basically, the database structure). For example:
JSON-result:
{"command":"SELECT","rowCount":0,"oid":null,"rows":[],"fields":[{"name":"id","tableID":945497,"columnID":1,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"code_event","tableID":945497,"columnID":2,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"version","tableID":945497,"columnID":3,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"used","tableID":945497,"columnID":4,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"reference1","tableID":945497,"columnID":5,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference2","tableID":945497,"columnID":6,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference3","tableID":945497,"columnID":7,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference4","tableID":945497,"columnID":8,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference5","tableID":945497,"columnID":9,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label1","tableID":945497,"columnID":10,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label2","tableID":945497,"columnID":11,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label3","tableID":945497,"columnID":12,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label4","tableID":945497,"columnID":13,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label5","tableID":945497,"columnID":14,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"export_action","tableID":945497,"columnID":15,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":54,"format":"text"}],"_parsers":[null,null,null,null,null,null,null,null,null,null,null,null,null,null,null],"rowAsArray":false}
I tried the code from this question:
Node js call function, that access mysql database and returns json result, multiple times
How can I get a proper JSON with the following structure?
{table1: {row1, row2...}, table2:{row1,row2...}....}
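For reference, a minimal sketch of one way to do this with the pg module, assuming an Express app and Heroku's DATABASE_URL config var (the /data route name and the table list are illustrative). The important part is to keep only result.rows for each table rather than returning the whole result object:

const express = require('express');
const { Pool } = require('pg');

const app = express();
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Illustrative table list; in practice this would be table1 .. table27.
const tables = ['table1', 'table2', 'table3'];

app.get('/data', async (req, res) => {
  try {
    const payload = {};
    for (const table of tables) {
      // Table names cannot be bound as parameters, so only query a fixed whitelist.
      const result = await pool.query(`SELECT * FROM ${table}`);
      payload[table] = result.rows; // only the rows, not the full result object
    }
    res.json(payload);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(process.env.PORT || 3000);

This returns an object keyed by table name, each key holding an array of row objects, which matches the shape above.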

Related

How to check if two files have the same content

I am working with a Node.js application. I am querying an API endpoint and storing the retrieved data in a database. Everything is working well. However, there are instances where some data is not pushed to the database. In that case, what I normally do is manually query the endpoint, giving the application the date when that data was lost, and retrieve it, since it is stored on a server which automatically deletes the data after 2 days. The API and database fields are identical.
The following is not the problem, but to give you context: I would like to automate this process by making the application retrieve all the data for the past 48 hours and save it in a .txt file inside the app. I will do the same with my MSSQL database, querying it for the data from the past 48 hours.
My question is, how can I check whether the contents of my api.txt file are the same as those of db.txt?
You could make use of buf.equals(), as detailed in the docs
const fs = require('fs');
const api = fs.readFileSync('api.txt');
const db = fs.readFileSync('db.txt');
// Returns a boolean
api.equals(db)
So that:
if (api.equals(db))
console.log("equal")
else
console.log("not equal")

How to fetch raw sql insert/update from sqlalchemy ORM

I was trying to dump my PostgreSQL database, created via SQLAlchemy, using a Python script. I have successfully created the database, and all the data is inserted via web parsing into the ORM models I have mapped. But when I try to take a dump of all my insert queries using this:
tab = Table(table.__tablename__, MetaData())
x = tab.insert().compile(
    dialect=postgresql.dialect(),
    compile_kwargs={"literal_binds": True},
)
logging.info(f"{x}")
I am adding values using the ORM like this:
for value in vertex_type_values:
    data = table(
        Type=value["type"],
        Name=value["name"],
        SizeX=value["size_x"],
        SizeY=value["size_y"],
        SizeZ=value["size_z"],
    )
    session.add(data)
session.commit()
Here table is the model I have designed and imported from my local library, and vertex_type_values is the data I have extracted and yielded in my script.
The output I am getting is:
INSERT INTO <tablename> DEFAULT VALUES
So my question is: how do I get rid of DEFAULT VALUES and get the actual values, so that I can use the insert commands directly if my DB ever crashes? I need the raw SQL for the insert commands.

Knex + SQL Server whereIn query 8-12s -- raw version returns NO results but if I input the .toQuery() result directly I get results

The database is in the Azure cloud and is not currently used in production. There are 80,000 rows, and a uprn is a VARCHAR(100).
I'm already using JOI to validate each UPRN as well;
I'm using KNEX with a SQL Server database with the following whereIn query:
knex(LOCATIONS.table).whereIn(LOCATIONS.uprn, req.body.uprns)
but this takes 8-12 s to complete and sometimes times out. If I run the output of .toQuery() for the same query, SSMS returns the result within 1-2 s.
If I do a raw query, the resulting .toQuery() or toString() works in SSMS and returns results, but if I try to use the raw query directly, it returns 0 results.
I'm looking to either fix what's making whereIn so slow or get the raw query working.
EDIT 1:
After much debugging and trying, it seems the bug is due to how Knex deals with arrays, so I made a for-of loop to add a ? placeholder for each array element and then passed the array as the parameters.
This led me to realize that the performance issue is due to the way SQL Server parameterises queries.
I ended up building a raw query string with all of the parameters and validating the input with Joi string/regex config:
Joi.string()
.min(1)
.max(35)
.regex(/^[a-z\d\-_\s]+$/i)
allowing only alphanumeric characters, dashes, underscores and spaces, which should prevent SQL injection.
I'm going to look deeper into the security implications of this, and I might create a separate login that can only SELECT data from that table and nothing more, to run these queries with.
Needed to just handle it raw and validate separately.
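For reference, a minimal sketch of that approach, assuming knex.raw and a recent Joi version where schema.validate() is available; the locations table and uprn column names are stand-ins for the LOCATIONS constants in the question:

const Joi = require('joi');

// Hypothetical schema: alphanumeric characters, dashes, underscores and spaces only.
const uprnsSchema = Joi.array().items(
  Joi.string().min(1).max(35).regex(/^[a-z\d\-_\s]+$/i)
);

async function findByUprns(knex, uprns) {
  const { error } = uprnsSchema.validate(uprns);
  if (error) throw error;

  // One "?" placeholder per array element, e.g. "?, ?, ?" for three UPRNs.
  const placeholders = uprns.map(() => '?').join(', ');
  return knex.raw(
    `SELECT * FROM locations WHERE uprn IN (${placeholders})`,
    uprns
  );
}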

Resource Conflict after syncing with PouchDB

I am new to CouchDB / PouchDB, and up to now I have somehow managed the start of it all. I am using the couchdb-python library to send initial values to my CouchDB before I start the development of the actual application. I have one database with templates of the data I want to include and the actual database with all the data I will use in the application.
couch = couchdb.Server()
templates = couch['templates']
couch.delete('data')
data = couch.create('data')
In Python I have a loop in which I send one value after another to CouchDB:
value = templates['Template01']
value.update({ '_id' : 'Some ID' })
value.update({'Other Attribute': 'Some Value'})
...
data.save(value)
This was working fine the whole time; I needed to run it several times as my data had to be adjusted. After I was satisfied with the results I started to create my application in JavaScript. I synced PouchDB with the data database and it was also working. However, I found out that I needed to change something in the Python code, so I ran the first Python script again, but now I get this error:
couchdb.http.ResourceConflict: (u'conflict', u'Document update conflict.')
I tried to destroy() the PouchDB database data and delete the CouchDB database as well, but I still get this error at this part of the code:
data.save(value)
What I also don't understand is that a few values are actually passed to the database before this error appears, so some values are save()d into the db.
I read that it has something to do with the _rev values of the documents, but I cannot work out an answer. I hope someone can help here.

Node & Redis: Cannot pull more than 46 records in a list

We have a problem that we simply have not been able to solve for weeks.
We are using Node.js on Heroku, openredis for our Redis server, 'node-redis' for the Node Redis client.
In our mobile app project we have a Redis list called 'user_list' which contains the keys in our Redis db #1 (the list is in db #1 as well). We show the locations of the users on a map by first pulling the IDs from user_list and then making a 'get' request for each ID in the list.
However, the issue is that when we run 'smembers user_list' we cannot retrieve all the elements in user_list. After 46 records it returns something like ',,,,,,,,'.
We have literally tried everything. When we gradually increase the number of elements in the list it works, but when we put in 100 elements we simply cannot retrieve them.
What could be the problem?
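For what it's worth, a minimal sketch of the read path described above using the node-redis callback API (the db index, the REDIS_URL variable, and the error handling are assumptions); mget fetches all values in one round trip instead of one get per ID:

const redis = require('redis');

// Assumes the openredis URL is exposed as REDIS_URL and the data lives in db #1.
const client = redis.createClient(process.env.REDIS_URL);

client.select(1, (selectErr) => {
  if (selectErr) throw selectErr;

  client.smembers('user_list', (err, ids) => {
    if (err) throw err;
    console.log('members returned:', ids.length);

    // Fetch every user's value in a single round trip.
    client.mget(ids, (mgetErr, users) => {
      if (mgetErr) throw mgetErr;
      console.log(users);
    });
  });
});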
