I export SQL tables to txt files (with the code below, run in SSMS).
All the columns are exported correctly except the decimal columns,
which are exported as 41 characters (I want 19 characters),
even though the column is defined as decimal(14,4).
How can I change the settings so that the column is exported with the width I want?
Notes:
bcp exports a 0.0 decimal value as .0000, which is what I need.
My tables are very big, so I can't wrap each column in SUBSTRING: there are a lot of columns and a lot of rows.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
-------------
DECLARE @stmtQuery VARCHAR(8000), @stmt VARCHAR(8000);
-- retrieve the data from the table
SET @stmtQuery = 'SELECT * FROM myDB.dbo.HugeTable' --a lot of decimal columns, a lot of rows
--copy data to txt file
SET @stmt = 'BCP "' + @stmtQuery +
'" QUERYOUT "path\to\file\TableData.txt" -t -C RAW -S ' + @@SERVERNAME + ' -d ' + DB_NAME() + ' -e path\to\file\log.txt -c -r"0x0A" -T';
EXEC master.sys.xp_cmdshell @stmt;
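For context (not from the original post): in a character-mode bcp format file the default field length for a decimal column is 41 characters, which matches the width seen here. One commonly suggested direction, sketched below with illustrative paths and server name, is to generate a format file once, shrink the length entries for the decimal fields, and export with -f instead of -c:
-- generate a non-XML format file describing HugeTable in character format
EXEC master.sys.xp_cmdshell 'BCP myDB.dbo.HugeTable format nul -f "path\to\file\HugeTable.fmt" -c -T -S servername';
-- after editing the per-field lengths in HugeTable.fmt, export with the format file, e.g.:
-- BCP "SELECT * FROM myDB.dbo.HugeTable" QUERYOUT "path\to\file\TableData.txt" -f "path\to\file\HugeTable.fmt" -T -S servername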
I need to extract the data returned by the query below, along with a header row, into a CSV file using a shell script.
Below is the query.
SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode,NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UT)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj=0
UNION
SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode, NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UTG)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj<>0
Could you please let me know whether the below approach of fetching the data with a shell script is correct, and whether the script itself is correct?
#!/bin/bash
file="output.csv"
sqlplus -s username/password@Oracle_SID << EOF
SPOOL $file
select 'SourceIdentifier','SourceFileName','ProfitCentre2','PlantCode',
'tax_retur ReturnPeriod','document_number DocumentNumber','TO_CHAR(invoice_generation_date,'YYYY-MM-DD') Docume','Original','customer_name CustomerName','NVL(sns_pos,new_state_code)POS','PortCode','NEW_HSN_CODE HSNorSAC','(SGSATE+UTGSATE) Stat','(SGS+UT)StateUT','Userde' from dual
Union all
select 'TO_CHAR(SourceIdentifier)','TO_CHAR(SourceFileName)','TO_CHAR(ProfitCentre2)','TO_CHAR(PlantCode)',
'TO_CHAR(tax_retur ReturnPeriod)','TO_CHAR(document_number DocumentNumber)','TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume','TO_CHAR(Original)','TO_CHAR(customer_name CustomerName)','TO_CHAR(NVL(sns_pos,new_state_code)POS)','TO_CHAR(PortCode)','TO_CHAR(NEW_HSN_CODE HSNorSAC)','TO_CHAR((SGSATE+UTGSATE) Stat)','TO_CHAR((SGS+UT)StateUT)','TO_CHAR(Userde)' from
(SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode,NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UT)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj=0
UNION
SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode, NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UTG)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj<>0)
SPOOL OFF
EXIT
EOF
In short: the ; is missing from the end of the select statement.
Some unsolicited advice:
I think SPOOL will put extra stuff into your file (at least some extra newlines); a redirect is better. Also, the first line (the header) is not DB-related, so write it from the shell:
echo "SourceIdentifier;SourceFileName;ProfitCentre2..." > $file
I recommend generating the CSV format right in the SELECT query; doing it later will be more of a headache (and you can escape whatever you want there):
query="select SourceIdentifier || ';' || SourceFileName || ';' || ProfitCentre2 ... ;"
So, querying the DB (I think the capital -S is the right one), plus the formatting of the records (and maybe you want to format your columns too):
sqlplus -S username/password@Oracle_SID >> $file << EOF
set linesize 32767 pagesize 0 heading off
$query
EOF
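Putting those pieces together, a minimal end-to-end sketch (only three of the columns are shown, the credentials are placeholders, and feedback off is added so sqlplus does not append a "rows selected" line):
#!/bin/bash
file="output.csv"
# header written by the shell, not by the database
echo "SourceIdentifier;SourceFileName;ProfitCentre2" > $file
# build the CSV rows inside the query itself, as suggested above
query="select SourceIdentifier || ';' || SourceFileName || ';' || ProfitCentre2 from arbor.INV_REPO_FINA;"
sqlplus -S username/password@Oracle_SID >> $file << EOF
set linesize 32767 pagesize 0 heading off feedback off
$query
EOF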
This one works for me, but an empty line comes before the output of the first and the second query. I remove the empty lines with the awk command below.
#!/bin/bash
FILE="A.csv"
$ORACLE_HOME/bin/sqlplus -s username/password@Oracle_SID<<EOF
SET PAGESIZE 50000 COLSEP "," LINESIZE 20000 FEEDBACK OFF HEADING off
SPOOL $FILE
select 'TYPE_OF_CALL_V','SWITCH_CALL_TYPE_V','RECORD_TYPE_V','TARF_TYPE_V' from dual;
SELECT TYPE_OF_CALL_V,SWITCH_CALL_TYPE_V,RECORD_TYPE_V,TARF_TYPE_V FROM TABLE;
SPOOL OFF
EXIT
EOF
awk 'NF > 0' $FILE > out.txt
mv out.txt $FILE
I need to load data from a flat file into MariaDB on a Linux environment.
My plan is to put the MariaDB script in a shell file, then call the shell script from cron.
The MariaDB script is shown below:
set @path = (select path_file from param);
set @tbl = (select table_name from param);
set @x = concat(
'LOAD DATA LOCAL INFILE ',
@path,
' INTO TABLE ', @tbl,
' (@row) set id = trim(substr(@row,1,2)), name = trim(substr(@row,3,19)), address= trim(substr(@row,22,20))'
);
prepare y from @x;
execute y;
deallocate prepare y;
When I execute the script directly in HeidiSQL, this error is shown:
this command is not supported in the prepared statement protocol yet
Does anyone have a better way to load data from a flat file into MariaDB on a Linux environment regularly (scheduled), without using any ETL tools?
Thanks.
One option you can try is (adjust as needed):
File: load_data.sh
path=$(mysql -u ${mysql_user} -p${mysql_password} -s -N <<GET_PATH
SELECT '/path/to/file/data.csv';
GET_PATH
)
tbl=$(mysql -u ${mysql_user} -p${mysql_password} -s -N <<GET_TABLE
SELECT 'table';
GET_TABLE
)
# mysql -u ${mysql_user} -p${mysql_password} -s -N <<LOAD_DATA
# LOAD DATA LOCAL INFILE '${path}'
# INTO TABLE \`${tbl}\` ...
# LOAD_DATA
# TEST
cat <<LOAD_DATA
LOAD DATA LOCAL INFILE '${path}'
INTO TABLE \`${tbl}\` ...
LOAD_DATA
Command line:
$ ls -l
-r-x------ load_data.sh
$ ./load_data.sh
LOAD DATA LOCAL INFILE '/path/to/file/data.csv'
INTO TABLE `table` ...
For clarity, write as much of the SQL as possible into a STORED PROCEDURE, then use bash to call that SP.
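A minimal sketch of that split (the procedure name after_load and the database name mydb are illustrative; LOAD DATA is one of the statements that cannot appear inside a stored routine, so the shell keeps it and the procedure holds everything else):
# assumes a procedure named after_load() already exists in mydb and holds the post-load SQL
mysql -u ${mysql_user} -p${mysql_password} --local-infile=1 mydb <<LOAD_AND_CALL
LOAD DATA LOCAL INFILE '/path/to/file/data.csv' INTO TABLE \`table\`;
CALL after_load();
LOAD_AND_CALL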
I'm trying to generate a CSV file with BCP. My problem is that I have some NVARCHAR columns, so I must use the -w parameter for the bcp utility. The CSV file generated this way opens in a single column in Excel. If I create a new text file, copy the content of the generated CSV into it, and then change its extension to .csv, it works and opens with the content spread across different columns. Has anyone seen this before?
SET @File = 'MyQuery.csv'
SET @SQL = 'bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "' + @File + '" -w -t "," -T -S ' + convert(varchar, @@ServerName)
exec master..xp_cmdshell @SQL
I've found a solution to my problem:
I've used -CACP to generate the CSV file ANSI-encoded, and it works!!!
Now my command looks like:
SET @File = 'MyQuery.csv'
SET @SQL = 'bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "' + @File + '" -c -CACP -t "," -T -S ' + convert(varchar, @@ServerName)
exec master..xp_cmdshell @SQL
I am able to connect to a Microsoft SQL Server 2008 instance from a Mint Linux VM using freeTDS and the command line to execute SQL statements on it. Now I want to automate this in a bash script. I am able to successfully log in within my bash script:
TDSVER=8.0 tsql -H servername -p 1433 -D dbadmin -U domain\\Administrator -P password
I then have my SQL query:
USE dbname GO delete from schema.tableA where ID > 5 GO delete from schema.tableB where ID > 5 GO delete from schema.tableC where ID > 5 GO exit
This works when done manually via the freeTDS command line, but not when I put it in a bash file. I followed this post: freeTDS & bash.
Here is my bash script sample:
echo "USE dbname GO delete from schema.tableA where userid > 5 go delete from schema.tableB where userid > 5 go delete from schema.tableC where ID > 5 GO exit" > tempfile | TDSVER=8.0 tsql -H servername -p 1433 -D dbname -U domain\\Administrator -P password < tempfile
the output of the bash script is:
locale is "en_US.UTF-8"
locale charset is "UTF-8"
Default database being set to sbdb
1> 2> 3> 4> 5> 6> 7> 8>
and then the rest of my script is executed.
Can someone give me a step-by-step answer to my problem?
I'm not sure how your sample can work at all.
Here is my bash script sample:
echo "USE dbname .... exit" > tempfile | TDSVER=8.0 tsql -H servername -p 1433 -D dbname -U domain\\Administrator -P password < tempfile
# ------------------------------------^^^^ ---- pipe char?
Try using a ';' char.
echo "USE dbname .... exit" > tempfile ; TDSVER=8.0 tsql -H servername -p 1433 -D dbname -U domain\\Administrator -P password < tempfile
# ------------------------------------^^^^ ---- semi-colon
Better yet, use shell's "here documents".
TDSVER=8.0 tsql -H servername -p 1433 -D dbname -U domain\\Administrator -P password <<EOS
USE dbname
GO
delete from schema.tableA where userid > 5
go
delete from schema.tableB where userid > 5
go
delete from schema.tableC where ID > 5
GO
exit
EOS
IHTH.
Current command line input:
echo "delete from table where userid > 5
go
delete from table where userid > 5
go
delete from table where ID > 5
GO
exit" < /tmp/tempfile; TDSDUMP=/tmp/freetds.log TDSVER=8.0 tsql -H servername -p 1433 -D dbname -U Administrator -P password <<EOS
Old thread, but this seemed to work:
printf "use mydbname\ngo\nselect * from mytable\ngo\nexit\n"|tsql -I freetds.conf -S profileName -U user -P 'password'
1> 2> 1> 2> ID stringtest integertest
1 test 50
2 teststring2 60
3 test3 70
(3 rows affected)
Try
echo -e "USE dbname\n GO\n delete from schema.tableA where ID > 5\n GO\n delete from schema.tableB where userid > 5\n go\n delete from schema.tableC where ID > 5\n GO\n exit\n"
(the rest of the command line stays the same as in your script)
and try
echo "USE dbname;\n delete from schema.tableA where ID > 5;\n delete from schema.tableB userid > 5;\n delete from schema.tableC where ID > 5;\n exit\n"
and try
echo "USE dbname; delete from schema.tableA where ID > 5; delete from schema.tableB userid > 5; delete from schema.tableC where ID > 5; exit"
If you are using ODBC, I recommend the second variant.
If you are sending commands to SQL with the "go" keyword as a statement separator, maybe the first one is better.
Or maybe the third one... who knows... only trial and error can tell...
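For completeness, a sketch of how the first variant could be wired straight into tsql (server name and credentials are the same placeholders used earlier in the thread); bash's builtin echo only expands the \n escapes when called with -e, or you can use printf as in the earlier answer:
echo -e "USE dbname\nGO\ndelete from schema.tableA where userid > 5\nGO\nexit" |
  TDSVER=8.0 tsql -H servername -p 1433 -D dbname -U domain\\Administrator -P password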