How to export SQL tables, stored procedures, and views with Node.js? - node.js

I want to export tables, views, and stored procedures from my database to a file.
One way to do that is to back up the database - I can't use this option because the database is in a remote location and I don't have access to the DB server's filesystem.
The other way is to use the "Generate and Publish Scripts" wizard and choose data and schema - which fails during generation (I don't know why, and for my purposes I don't care why).
So my question is: is there a SQL query I can run that iterates over all tables, views, and stored procedures, gets the schema and the data, and writes them to a file? (If some table fails to open for whatever reason, ignore it.)
Can I do it with Node.js, perhaps using Sequelize? I'm not sure how to get table/view/SP schemas with Sequelize.
I would very much appreciate some guidance here.

If I understood correctly, you'd like to select a table's contents and then be able to export it into a file on your computer, right?
If that's the case, I would do as follows:
1) First import the required modules (in my case I use MSSQL):
const sql = require('mssql');
const fs = require('fs');
I use MSSQL for database management, so in this case I need the mssql package to be able to connect to and query my DB.
fs (the 'File System' module) contains a function to copy a file from one location to another.
2) Then I will configure my database connection:
var config =
{
    user: 'your username to log in',
    password: 'password to log in',
    server: "server path",
    database: 'name of the database',
    connectionTimeout: 0,
    requestTimeout: 0,
    pool: {
        idleTimeoutMillis: 500,
        max: 1
    }
};
3) Then I would write the function that does the querying and the saving of the file:
function commenceQuery()
{
    var connection = new sql.ConnectionPool(config);
    // Connect first, then run the query on the pool
    connection.connect()
        .then(function () {
            var request = new sql.Request(connection);
            return request.query(
                "DECLARE @output TABLE (id INT IDENTITY, command NVARCHAR(512)) " +
                "DECLARE @query NVARCHAR(MAX) = 'SELECT * FROM yourTable', " +
                "        @outputFile VARCHAR(2048) = 'Where you want the file to be saved, most probably on the database computer', " +
                "        @connectionString VARCHAR(512) = '-U databaseUserName -P databasePassword -S ' + @@SERVERNAME, " +
                "        @bcpQuery NVARCHAR(4000) = 'bcp \"@query\" QUERYOUT \"@outputFile\" -c -t, @connectionString' " +
                "SET @bcpQuery = REPLACE(@bcpQuery, '@query', @query) " +
                "SET @bcpQuery = REPLACE(@bcpQuery, '@outputFile', @outputFile + 'Test_Name.csv') " +
                "SET @bcpQuery = REPLACE(@bcpQuery, '@connectionString', @connectionString) " +
                "SET @bcpQuery = REPLACE(@bcpQuery, CHAR(10), ' ') " +
                "INSERT INTO @output EXEC master..xp_cmdshell @bcpQuery");
        })
        .then(function () {
            // bcp wrote the file on the DB server; copy it to where you want it
            fs.copyFile('/filePath/to/where/output/isSpecified',
                        '/filePath/of/where/you/want/toSave',
                        function () { connection.close(); });
        })
        .catch(function () {
            connection.close();
        });
}
To explain in detail the query I've stated inside request.query(''):
DECLARE @output TABLE (id INT IDENTITY, command
NVARCHAR(512))
Declares a table variable (like a temporary table) to collect the gathered output.
DECLARE @query NVARCHAR(MAX) = 'SELECT * FROM yourTable'
Declares a variable to hold the actual query string.
@outputFile VARCHAR(2048) = 'Where you want the file to be saved,
most probably on the database computer'
Declares a variable to hold the destination path where the database should write the file, e.g. C:\Program Files\anyFolder. Note that it is written on the database server, not on the client.
@connectionString VARCHAR(512) = '-U databaseUserName -P databasePassword -S ' + @@SERVERNAME
Same as the config we used above; it is the key to logging in to the database itself.
@bcpQuery NVARCHAR(4000) = 'bcp "@query" QUERYOUT "@outputFile" -c -t,
@connectionString'
We're going to use this as the command template, replacing each placeholder that starts with '@' with the actual value (done below).
SET @bcpQuery = REPLACE (@bcpQuery, '@query', @query)
Replaces the @query placeholder with the string it contains.
SET @bcpQuery = REPLACE (@bcpQuery, '@outputFile',
@outputFile + 'Test_Name.csv')
Same as above, with the file name appended.
SET @bcpQuery = REPLACE (@bcpQuery, '@connectionString',
@connectionString)
Same as above.
SET @bcpQuery = REPLACE (@bcpQuery, CHAR(10), ' ')
Removes any line breaks.
INSERT INTO @output
Inserts everything you receive into @output.
EXEC master..xp_cmdshell @bcpQuery
Executes the finished bcp command via xp_cmdshell.
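One caveat with the bcp approach: the CSV is written on the database server's filesystem, which the original asker said they cannot reach. A minimal client-side sketch instead serializes the fetched rows locally; rowsToCsv is a hypothetical helper name of mine, not part of the mssql package:

```javascript
// Minimal sketch: serialize rows fetched with mssql into CSV on the client,
// so nothing needs to be written on the database server's filesystem.
function rowsToCsv(rows) {
  if (rows.length === 0) return '';
  const headers = Object.keys(rows[0]);
  // Quote every field and double any embedded quotes, per CSV convention
  const escape = (v) =>
    v === null || v === undefined ? '' : '"' + String(v).replace(/"/g, '""') + '"';
  const lines = rows.map((row) => headers.map((h) => escape(row[h])).join(','));
  return [headers.join(','), ...lines].join('\n');
}
```

With a helper like this, the .then() after request.query('SELECT * FROM yourTable') could simply call fs.writeFile('out.csv', rowsToCsv(result.recordset), ...), assuming mssql's usual result shape with a recordset array.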
Hope this helps !! Let me know if you get stuck with something

Related

Bulk Insert records in Postgres using nodejs?

I am trying to do a multi-row insert using node-postgres; I want the id to be created dynamically, and I want those ids returned.
This is the approach I am using:
const insertArray = [['kunal@test.com',1000,'abcdef'],['kunal1@test.com',1000,'fedkcn']]
let query1 = format(`INSERT INTO users VALUES %L returning id`, insertArray);
const newClient = new Client();
await newClient.connect();
let {rows} = await newClient.query(query1);
I am getting the error
invalid input syntax for type integer: "kunal@test.com"
How can we skip the id?
I tried the CSV option via COPY FROM as well and get the same issue:
const stream = client.query(copyFrom(`COPY users FROM STDIN (format csv, DELIMITER ',', HEADER)`))
const readStream = fs.createReadStream('./tmp/copy.csv');
copy.csv:
"email","amount","address"
'kunal@test.com',1000,'abcdef'
USERS table schema:
id (Integer), email (String), amount (Integer), address (String)
I'm not sure how to auto-generate the id and return the rows containing the new ids.
I cannot provide the column names either, as this functionality will also be used by other schemas.
If I provide the id it works fine.
Thanks for the help.
Inserting (either by INSERT or COPY) without specifying column names is positional, in the order of the column definitions. In other words, here insert into users values ... is the same as insert into users(id, email, amount, address) values .... However, your data does not have the same structure. The solution is to specify your columns:
... INSERT INTO users(email, amount, address) VALUES ...
... COPY users(email, amount, address) FROM STDIN ...
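To make the column-list fix concrete without depending on pg-format, here is a sketch that builds the same column-qualified INSERT; buildInsert and its simplistic quoteLiteral are illustrative stand-ins (in real code, keep pg-format's %L or parameter binding, which handle escaping properly):

```javascript
// Sketch: build a multi-row INSERT that names its columns, so the
// auto-generated id is skipped. quoteLiteral is a simplified stand-in for
// pg-format's %L escaping -- do not use it as-is against untrusted input.
function buildInsert(table, columns, rows) {
  const quoteLiteral = (v) =>
    typeof v === 'number' ? String(v) : "'" + String(v).replace(/'/g, "''") + "'";
  const values = rows
    .map((row) => '(' + row.map(quoteLiteral).join(', ') + ')')
    .join(', ');
  return `INSERT INTO ${table}(${columns.join(', ')}) VALUES ${values} RETURNING id`;
}
```

The resulting string can be passed to newClient.query() exactly like query1 above; Postgres then fills id from its sequence, and RETURNING id hands the new ids back.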

Azure SQL Error Retrieving data from shard - Login Failed for User

I'm trying to do a cross-database query, but my error suggests that I can't even connect to my external data source.
My exact error message is the following:
Error retrieving data from shard [DataSource=xxxxxxxxxxxxxxxxxx Database=CRDMPointOfSale_Configuration]. The underlying error message received was: 'Login failed for user 'CRDMAdmin'.'.
Below is my 'Create Database Scoped Credential'.
CREATE DATABASE SCOPED CREDENTIAL CRDMCred
WITH IDENTITY = 'CRDMAdmin',
SECRET = 'xxxxxxxxxx';
GO
Below is my 'Create External Data Source'.
CREATE EXTERNAL DATA SOURCE CRDM_Configuration
WITH (
TYPE=RDBMS,
LOCATION='xxxxxxxxxxxxxxxxxxxxx',
DATABASE_NAME='CRDMPointOfSale_Configuration',
CREDENTIAL = CRDMCred
);
Below you can see that my sp_execute_remote statement is within a stored procedure, an approach I've seen elsewhere online.
CREATE PROCEDURE [admin].[InsertThreadProcessingDataIntoLoadTable]
(
    @ThreadID VARCHAR(100)
    , @DataLoadSchemaID INT OUTPUT
    , @DateFrom CHAR(8) OUTPUT
    , @DateTo CHAR(8) OUTPUT
    , @DatabaseName VARCHAR(100)
)
AS BEGIN
    SET NOCOUNT ON
    DECLARE @IsBatchLoad BIT
    SET @IsBatchLoad = CASE 'NO' WHEN 'YES' THEN 1 ELSE 0 END
    EXEC sp_execute_remote @data_source_name = N'CRDM_Configuration',
        @stmt = N'SELECT @DateFrom = CONVERT(CHAR(8),FromDate,112), @DateTo = CONVERT(CHAR(8),DATEADD(DAY,1,ToDate),112)
            FROM [admin].[GetFromAndToDatesForDatabase] (@DatabaseName, @IsBatchLoad, NULL)',
        @params = N'@DatabaseName VARCHAR(100), @IsBatchLoad BIT',
        @DatabaseName = 'CRDMPointOfSale', @IsBatchLoad = 1;
END
As you can see above, the remote execution contains a SELECT statement whose FROM is a function call ([admin].[GetFromAndToDatesForDatabase]) that lives in a different database, which is why I have Exec sp_execute_remote wrapped around it.
Should I be specifying parameters when not directly calling an SP? Also, what am I doing wrong?

How to make case sensitive query with nodejs + pg

I want to select rows where the column content equals 'a@gmail.com',
but it seems it becomes select column == "a@gmail.com"?
[error: column "a@gmail.com" does not exist]
code
var userEmail = 'a@gmail.com';
var query = 'SELECT EXISTS(SELECT * FROM "User" WHERE "Email" = "'+userEmail+'")';
dbClient.query(query, function(error, result) {
...
To use parameter binding you must number the placeholders beginning with $1 (then $2, and so on), and put the parameters in an array:
var query = 'SELECT EXISTS(SELECT * FROM "User" WHERE "Email" = $1)';
dbClient.query(query, [userEmail], function(error, result) {
Always pass the parameters in an array; it is more secure.
Remember not to pass a callback to query if you have a very big table, unless you want to read the whole table before control returns to your function. Instead you can use the "on" event or a promise-based approach (like https://www.npmjs.com/package/pg-promise-strict).
This doesn't have anything to do with case. The problem is that you're putting the email address in double-quotes, and in (most varieties of) SQL double-quotes indicate a column name or table name. That's why the error message says column "a#gmail.com" does not exist.
Use single-quotes around values:
var userEmail = 'a@gmail.com';
var query = 'SELECT EXISTS(SELECT * FROM "User" WHERE "Email" = \'' + userEmail + '\')';
Ideally, though, you should just use parameter binding so you don't have to worry about quoting values at all. When you use string concatenation to build SQL queries you very often open yourself up to SQL injection attacks.
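The identifier-vs-literal distinction both answers rely on can be made explicit in code. These helpers are simplified illustrations of what pg-format's %I and %L do (the names are mine, not pg's API); parameter binding remains the preferred route:

```javascript
// In Postgres, double quotes delimit identifiers (table/column names) and
// single quotes delimit string literals -- mixing them up is what produced
// the 'column "a@gmail.com" does not exist' error.
function quoteIdent(name) {
  return '"' + String(name).replace(/"/g, '""') + '"';
}
function quoteLiteral(value) {
  return "'" + String(value).replace(/'/g, "''") + "'";
}

const userEmail = 'a@gmail.com';
const query =
  'SELECT EXISTS(SELECT * FROM ' + quoteIdent('User') +
  ' WHERE ' + quoteIdent('Email') + ' = ' + quoteLiteral(userEmail) + ')';
// query is now: SELECT EXISTS(SELECT * FROM "User" WHERE "Email" = 'a@gmail.com')
```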

How to index plain text files for search in Sphinx

I scanned dozens of articles and forum threads and looked through the official documentation, but couldn't find an answer. This article sounds promising, since it says that "The data to be indexed can generally come from very different sources: SQL databases, plain text files, HTML files", but unfortunately, like all the other articles and forum threads, it is devoted to MySQL.
It is rather strange to hear that Sphinx is so cool, that it can do this and that, that it can do practically anything you want with any data source you like - but where are all the examples with data sources other than MySQL? Just one tiny, trivial, step-by-step example of a Sphinx configuration for scanning the easiest data source in the world: plain text files. Let's say I've installed Sphinx and want to scan my home directory (recursively) to find all plain text files containing "Hello world". What should I do to implement this?
Prerequisites:
Ubuntu
sudo apt-get install sphinxsearch
... what is next????
Have a look at this before proceeding: Sphinx without SQL!
Ideally, I would do this:
We are going to use Sphinx's sql_file_field to index a table of file paths. Here is a PHP script that creates such a table for a particular directory (via scandir):
<?php
$con = mysqli_connect("localhost","root","password","database");
// Check connection before querying
if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
mysqli_query($con,"CREATE TABLE fileindex ( id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY, text VARCHAR(100) NOT NULL);");
$dir = scandir('/absolute/path/to/your/dir/');
foreach ($dir as $entry) {
    // Build the absolute path so is_dir() checks the right location
    $path = "/absolute/path/to/your/dir/$entry";
    if (!is_dir($path)) {
        mysqli_query($con,"INSERT INTO fileindex ( text ) VALUES ( '$path' )");
    }
}
mysqli_close($con);
?>
The code below is the sphinx.conf file that indexes the table of file paths. Notice sql_file_field, which makes Sphinx index the contents of the files whose paths are in the text column:
source src1
{
    type           = mysql
    sql_host       = localhost
    sql_user       = root
    sql_pass       = password
    sql_db         = filetest
    sql_port       = 3306 # optional, default is 3306
    sql_query_pre  = SET CHARACTER_SET_RESULTS=utf8
    sql_query_pre  = SET NAMES utf8
    sql_query      = SELECT id, text FROM fileindex
    sql_file_field = text
}

index filename
{
    source  = src1
    path    = /var/lib/sphinxsearch/data/files
    docinfo = extern
}

indexer
{
    mem_limit = 128M
}

searchd
{
    log      = /var/log/sphinxsearch/searchd.log
    pid_file = /var/log/sphinxsearch/searchd.pid
}
After creating the table and saving sphinx.conf as /etc/sphinxsearch/sphinx.conf, just run sudo indexer filename --rotate and your indexes are ready! Then type search followed by a keyword to get results.

XPages @DbLookup returning undefined when looking to another server/database

I am trying to perform an @DbLookup to another server/database and continually receive an "undefined" return message. The database exists, the view name is correct, the key is correct, as is the column I am trying to return. I have reader access to the database.
I have tried all these combinations for the server/file path, but none seem to work:
var dbName = new Array(session.getServerName(), "my/folder/thisdb.nsf");
var dbName = session.getServerName() + "!!" + my\\folder\\thisdb.nsf;
var dbName = "CN=Server/OU=Name/O=This" + "!!" + my\\folder\\thisdb.nsf;
var dbName = [@DbName([0]), "my/folder/thisdb.nsf"];
I have found this post and tried most of the combinations:
http://www.c-lutions.com/c-lutions/mcblog.nsf/dx/08242012095124AMJMMJ69.htm
Are there any other combinations I can try?
Thanks!
Is your folder's name my folder, with a space in it? That could be the problem. To ease your pain, I would suggest this:
Create one XPage in your target database, put one computed field on it with @DbName() as the formula, and see what comes back. Besides that, your formulas have some issues (comments below the entries):
var dbName = new Array(session.getServerName(), "my/folder/thisdb.nsf");
looks OK unless your folder isn't a subfolder of my. Folders need to be relative to the data directory.
var dbName = session.getServerName() + "!!" + my\\folder\\thisdb.nsf;
dbName must be an array, this one isn't. Also there are no quotes around the file name
var dbName = "CN=Server/OU=Name/O=This" + "!!" + my\\folder\\thisdb.nsf;
same here: must be an array, quote is missing. It is confusing since the data source syntax uses the (CN) format of this: @Name("[CN]";@Subset(@DbName();1)+"!!....
var dbName = [@DbName([0]), "my/folder/thisdb.nsf"];
almost there. @DbName() doesn't take a parameter, so you would write: var dbName = [@DbName()[0], "my/folder/thisdb.nsf"]; or use var dbName = [@Subset(@DbName(),1), "my/folder/thisdb.nsf"];
You also can check great samples to play with.
Make sure the second server is in a server group that is trusted by the first server. For security reasons, XPages (and LotusScript) running on a server cannot access the contents of databases on other servers that are not in the same trusted server group.
