BCP queryout -w generating wrong CSV file - Excel

I'm trying to generate a CSV file with BCP. Because some of my columns are NVARCHAR, I have to use the -w parameter of the bcp utility. The resulting CSV file, however, opens in a single column in Excel. If I create a new text file, paste the content of the generated CSV into it, and then rename it to .csv, it works: the content is spread across separate columns. Has anyone seen this before?
DECLARE @File varchar(100), @SQL varchar(1000)
SET @File = 'MyQuery.csv'
SET @SQL = 'bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "' + @File + '" -w -t "," -T -S ' + convert(varchar, @@SERVERNAME)
EXEC master..xp_cmdshell @SQL

I've found a solution to my problem:
I used -C ACP so the CSV file is generated with ANSI encoding, and it works!
Now my command looks like this:
DECLARE @File varchar(100), @SQL varchar(1000)
SET @File = 'MyQuery.csv'
SET @SQL = 'bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "' + @File + '" -c -C ACP -t "," -T -S ' + convert(varchar, @@SERVERNAME)
EXEC master..xp_cmdshell @SQL
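If you want to check exactly what gets executed, printing the variable before the xp_cmdshell call shows the fully concatenated command; with the values above it comes out roughly like this (the server name is whatever @@SERVERNAME returns on your machine):
PRINT @SQL
-- bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "MyQuery.csv" -c -C ACP -t "," -T -S MYSERVER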

Related

BCP QUERYOUT: how to change the length of all decimal fields

I export SQL tables to txt files (using the code below in SSMS).
All the columns are exported correctly except the decimal columns, which are exported as 41 characters (I want 19 characters), even though the column is defined as decimal(14,4).
How can I change the settings so the columns are exported at the size I want?
Notes:
bcp exports a 0.0 decimal value as .0000, which is what I need.
My tables are very big, so I can't wrap every column in SUBSTRING - there are a lot of columns and a lot of rows.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
-------------
DECLARE @stmtQuery VARCHAR(8000), @stmt VARCHAR(8000);
-- retrieve the data from the table
SET @stmtQuery = 'SELECT * FROM myDB.dbo.HugeTable' --a lot of decimal columns, a lot of rows
--copy data to txt file
SET @stmt = 'BCP "' + @stmtQuery +
    '" QUERYOUT "path\to\file\TableData.txt" -t -C RAW -S ' + @@SERVERNAME + ' -d ' + DB_NAME() + ' -e path\to\file\log.txt -c -r"0x0A" -T';
EXEC master.sys.xp_cmdshell @stmt;
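One way to control the exported width without hand-writing SUBSTRING or CONVERT for every column is to generate the SELECT list dynamically, converting only the decimal columns. A minimal sketch, using the table from the question and assuming SQL Server 2017+ for STRING_AGG (older versions can build the list with FOR XML PATH instead):
DECLARE @cols NVARCHAR(MAX);
-- build a column list that converts decimal/numeric columns to 19-character strings
SELECT @cols = STRING_AGG(
    CASE WHEN DATA_TYPE IN ('decimal', 'numeric')
         THEN 'CONVERT(VARCHAR(19), ' + QUOTENAME(COLUMN_NAME) + ') AS ' + QUOTENAME(COLUMN_NAME)
         ELSE QUOTENAME(COLUMN_NAME)
    END, ', ') WITHIN GROUP (ORDER BY ORDINAL_POSITION)
FROM myDB.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'HugeTable';
SET @stmtQuery = 'SELECT ' + @cols + ' FROM myDB.dbo.HugeTable';
-- @stmtQuery can then be fed to the same BCP ... QUERYOUT command as above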

Excel file generation using a SQL query and the SPOOL command in a shell script

I have to generate an Excel file from tables in an Oracle database. The code works fine, however the column names do not come out in full; they are only as wide as the column's data.
I want the complete header/column names.
The column names currently come out like this:
ST STAT_TYPE_DESC ST S NXT_STATME DELAY_DAYS ANN_PREM_LIM_LOW ANN_PREM_LIM_HI CONTRIB_HIST_LEN EVENT_DO C P
But what I want is the complete column names; for example, ST should be STATEMENT_TYPE_ID.
#!/bin/ksh
FILE="A.csv"
sqlplus -s lifelite/lifelite@dv10 <<EOF
SPOOL $FILE
SET HEADING ON
SET HEADSEP OFF
SET FEEDBACK OFF
SET LINESIZE 250
SET PAGESIZE 5000 EMBEDDED ON
SET COLSEP ","
SELECT * FROM TLLI_01150STATTYPE;
SPOOL OFF
EXIT
EOF
exit 0
Before your SELECT, add a new one with the column names:
SELECT 'STATEMENT_TYPE_ID' || ',' || 'STAT_TYPE_DESC' || ',' || ... FROM dual;
And SET HEADING OFF.
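Put together, the spooled block could look roughly like this; only the first two column names are known from the question, so the rest still have to be appended by hand in the same way:
SET HEADING OFF
SET FEEDBACK OFF
SET COLSEP ","
SPOOL $FILE
-- hand-written header row, so SQL*Plus cannot truncate it
SELECT 'STATEMENT_TYPE_ID' || ',' || 'STAT_TYPE_DESC' FROM dual;
SELECT * FROM TLLI_01150STATTYPE;
SPOOL OFF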

LOAD DATA INFILE parameterized

I need to load data from a flat file into MariaDB on a Linux environment.
My plan is to put the MariaDB script in a shell file and then call that shell script from cron.
The MariaDB script looks like this:
set @path = (select path_file from param);
set @tbl = (select table_name from param);
set @x = concat(
    'LOAD DATA LOCAL INFILE ''', @path, '''',
    ' INTO TABLE ', @tbl,
    ' (@row) set id = trim(substr(@row,1,2)), name = trim(substr(@row,3,19)), address = trim(substr(@row,22,20))'
);
prepare y from @x;
execute y;
deallocate prepare y;
When I execute the script directly in HeidiSQL, this error is shown:
this command is not supported in the prepared statement protocol yet
Does anyone have a better way to load data from a flat file into MariaDB on a Linux environment on a regular (scheduled) basis, without using any ETL tools?
Thanks.
One option you can try is (adjust as needed):
File: load_data.sh
path=$(mysql -u ${mysql_user} -p${mysql_password} -s -N <<GET_PATH
SELECT '/path/to/file/data.csv';
GET_PATH
)
tbl=$(mysql -u ${mysql_user} -p${mysql_password} -s -N <<GET_TABLE
SELECT 'table';
GET_TABLE
)
# mysql -u ${mysql_user} -p${mysql_password} -s -N <<LOAD_DATA
# LOAD DATA LOCAL INFILE '${path}'
# INTO TABLE \`${tbl}\` ...
# LOAD_DATA
# TEST
cat <<LOAD_DATA
LOAD DATA LOCAL INFILE '${path}'
INTO TABLE \`${tbl}\` ...
LOAD_DATA
Command line:
$ ls -l
-r-x------ load_data.sh
$ ./load_data.sh
LOAD DATA LOCAL INFILE '/path/to/file/data.csv'
INTO TABLE `table` ...
For clarity, put as much of the SQL as possible into a STORED PROCEDURE, then use bash to call that SP.
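A rough sketch of that split: the LOAD DATA itself stays in the shell heredoc (it cannot run inside a procedure or a prepared statement), and the post-load cleanup moves into a procedure; the procedure name, table name, and cleanup logic below are assumptions:
-- post-load cleanup kept in the database
DELIMITER //
CREATE PROCEDURE after_load()
BEGIN
    UPDATE staging_table
       SET name = TRIM(name),
           address = TRIM(address);
END //
DELIMITER ;
The shell script would then run the LOAD DATA heredoc as above and finish with something like: mysql -u ${mysql_user} -p${mysql_password} -e 'CALL after_load();'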

BCP error "Numeric value out of range" when importing a multi-line file to Azure SQL Data Warehouse

I'm trying to load my Azure SQL Data Warehouse using the bcp utility, but I have been running into issue after issue. I finally got a .txt file with one record to import successfully, but now when I put two or more records into the file, it bombs out with the following error (written to an error output file):
Row 1, Column 5: Numeric value out of range
The data looks like this:
2014-06-01,11111,test,used,1
2014-06-01,22222,test,used,1
and the table I'm importing to looks like this:
[Date] (date, not null)
[Code] (varchar(50), not null)
[Model] (varchar(100), not null)
[Type] (varchar(20), not null)
[Quantity] (int, not null)
I think it has something to do with the newline character, but I haven't been able to work around it. I have tried changing the encoding in Notepad++ to ANSI, ISO-8859-1, and UTF-8 without BOM, as well as UTF-16 LE and BE in Visual Studio Code. When ANSI was specified, the one-line file imported successfully. The end-of-line sequence is set to LF, and my bcp command is as follows:
bcp Schema.Table in C:\BcpFiles\sourceData.txt -S serverName -d databaseName -U userName -P password -q -c -t "," -r/n -e C:\BcpFiles\Errors.txt
The -r parameter requires a backslash rather than a forward slash: try -r \n instead. This article explains the various combinations: https://msdn.microsoft.com/en-gb/library/ms191485.aspx
UPDATE:
create table tst (
[Date] date not null,
[Code] varchar(50) not null,
[Model] varchar(100) not null,
[Type] varchar(20) not null,
[Quantity] int not null
)
And then using this:
bcp dbo.tst in so.txt -S TONYMSI -d AdventureWorks2012 -T -q -c -t "," -r \n
Worked fine.

In Kentico, how can I export files from the database?

In Kentico, how can I export files with their correct filenames from the table 'BlobTable'?
Using this example:
DECLARE @sql varchar(500)
SET @sql = 'BCP "SELECT * FROM [esc_mcms].dbo.[BlobTable] where [BlobId]=4" QUERYOUT C:\Temp\blob\04.pdf -T -f C:\Temp\blob\testblob_all.fmt -S ' + @@SERVERNAME
EXEC master.dbo.xp_CmdShell @sql
I wrote a script to generate commands that could be executed to extract all of the files stored as blobs:
SELECT
'set @sql = ''BCP "SELECT * FROM [esc_mcms].dbo.[BlobTable] where [BlobId]=' + convert(varchar, [BlobId]) + '" QUERYOUT C:\Temp\blob\out\' + NodeResource.name + '.' + BlobFileExt + ' -T -f C:\Temp\blob\testblob_all.fmt -S '' + @@SERVERNAME EXEC master.dbo.xp_CmdShell @sql' AS SQL
FROM [BlobTable], NodeResource, Node
WHERE BlobId = ResourceBlobId
  AND Node.Id = NodeResource.Id
  AND NodeResource.name != '_Content'
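These commands rely on the format file C:\Temp\blob\testblob_all.fmt to make bcp write the blob column out raw, with no length prefix and no terminator, so the file on disk is byte-for-byte identical to the stored blob. A minimal sketch of such a non-XML format file, assuming a single binary column is exported (the column name BlobData is an assumption):
9.0
1
1       SQLBINARY       0       0       ""      1       BlobData        ""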
