Receiving "Unterminated CSV Quoted Field" error while inserting into Postgres using COPY

I am trying to insert CSV files into Postgres using the COPY command, and I receive an "Unterminated CSV Quoted Field" error for some of them (2-5 files out of every 100).
My usual workaround is to identify the failing file, open it in Microsoft Excel, and simply save it without changing anything. When I then COPY the same file into Postgres again, it works and the data gets inserted. Can anyone please explain how simply opening and saving the file in Excel resolves this error?
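A likely explanation (an assumption, since the failing files aren't shown): those files contain a stray double quote, or a bare carriage return inside a quoted field, which Postgres's strict CSV parser reads as a quoted field that never closes. Excel's parser is more forgiving, and saving rewrites the file with consistent quoting and line endings. A minimal Python sketch of the same normalization, with hypothetical file names:

import csv

def normalize_csv(src_path, dst_path):
    # Re-read with Python's lenient csv parser and re-write with
    # consistent minimal quoting - roughly what Excel's re-save does.
    with open(src_path, newline='', encoding='utf-8', errors='replace') as src, \
         open(dst_path, 'w', newline='', encoding='utf-8') as dst:
        writer = csv.writer(dst, quoting=csv.QUOTE_MINIMAL)
        for row in csv.reader(src):
            writer.writerow(row)

normalize_csv('failing.csv', 'fixed.csv')  # hypothetical names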

Related

Talend 7.1 tFileOutputExcel corrupt file

I'm trying to output an Excel file from Talend 7.1. I've tried a few different setups, in both xls and xlsx formats, but they all result in a corrupt output file that cannot be opened.
What am I doing wrong? Loading an xlsx file into a database works fine, but outputting to Excel I just can't figure out! I was writing from the tMap directly to the tFileOutputExcel and the result was corrupt, so I changed the job to write to a CSV file first and then write that CSV to the tFileOutputExcel, but the output is still corrupt.
This is my job detail, and these are the settings in the tFileOutputExcel (screenshots not included).
I got this working by changing the transfer mode in the FTP component from 'ascii' to 'binary'. Such a simple thing, but hopefully this helps anyone else with this issue who is a newbie like me :)
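That fix makes sense: an xlsx file is a zip container, and ASCII-mode FTP rewrites bytes that look like line endings, corrupting it. Outside of Talend, the same distinction shows up in Python's ftplib; the host, credentials, and file name below are hypothetical:

from ftplib import FTP

ftp = FTP('ftp.example.com')   # hypothetical host
ftp.login('user', 'password')  # hypothetical credentials

with open('report.xlsx', 'rb') as f:
    # Binary mode passes bytes through untouched - safe for .xls/.xlsx.
    ftp.storbinary('STOR report.xlsx', f)

# By contrast, ASCII mode (ftp.storlines) rewrites line endings,
# which corrupts a zip-based .xlsx container.
ftp.quit()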

SSIS Package strange data flow issue, spitting out empty Excel file with large dataset

I am having an issue with an SSIS package: running it from BIDS I can export 400K records successfully, but when I run it from the job, the package completes successfully yet the Excel file is empty.
The user I am running the package as has full access to the C:\Users folders, and I can see the package saving the data into the temporary folder but never writing that data into the file, finishing with an empty file.
For example, with 230000 records (works), Process Monitor shows:
Create the excel file
Load the temporary data
Write data into the file
Close the file
With 330000 records (not working):
Create the excel file
Load the temporary data
Write data into the file   <- this line is missing from the Process Monitor trace
Close the file
The suggested solution of giving the user executing the package permission to the folder C:\Users\Default doesn't work for me.
Please help!
Sorry for bugging you guys; I found the problem. There was just 1.6GB of disk space left on the server. Though the final file takes just 200MB of space, the export generates lots of temporary files, causing a disk-full error. Strangely, the SSIS package ran successfully without giving any warning or error. Thanks for looking into it.
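Given that root cause, a cheap pre-flight disk-space check before the export can catch it early. A minimal sketch (the drive path and the 2 GB margin are assumptions, sized to the temporary-file overhead described above):

import shutil

# Hypothetical pre-flight check before kicking off the export.
free_bytes = shutil.disk_usage(r'C:\Users').free
required_bytes = 2 * 1024**3  # assumed headroom for SSIS temp files
if free_bytes < required_bytes:
    raise RuntimeError(f'Only {free_bytes / 1024**2:.0f} MB free; aborting export')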

An error occurred saving import/export specification ''

I am new to MS-Access.
The title is the error I get when I try to import an Excel sheet into a new table in Access 2016. Note that the empty quotes ('') are part of the error message.
I've tried reinstalling, playing around with import options, importing from a CSV, trying CSVs with different encodings, and checking the table in Excel for errors or inconsistencies.
I have searched and searched without luck. Help would be appreciated.
ADDENDUM:
The CSV I've tried to import is:
CashAccountID,AccountDescription,BankName,BankAccountNumber
301,Primary Checking Account,MegaBank,9017765453
302,Money Market Account,Wells Gargle,3831157490
303,Payroll Account,MegaBank,9835320050
I've encountered the same error and, from trial and error, it appears the issue is related to the size of the Excel file you're importing from. I've had success by splitting the 70MB Excel file into two 35MB files before doing the same import into Access.
The error message from MS Access is nonsensical - the problem occurs when we're not using an import/export specification at all (nor are there any saved in the Access installation I'm running). I think we can put this failure and its erroneous error message down as an MS Access bug.
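For reference, here is a rough sketch of that splitting step in Python with openpyxl; the file names, the 50,000-row chunk size, and the split_workbook helper itself are all hypothetical - the workaround is simply to make each file smaller:

from openpyxl import Workbook, load_workbook

def split_workbook(src_path, rows_per_file=50000):
    # Stream the source sheet and spread its rows across numbered
    # output files so each one stays comfortably small.
    src = load_workbook(src_path, read_only=True)
    part, out_wb, out_ws = 0, None, None
    for i, row in enumerate(src.active.iter_rows(values_only=True)):
        if i % rows_per_file == 0:
            if out_wb is not None:
                out_wb.save(f'part_{part}.xlsx')
            part += 1
            out_wb = Workbook()
            out_ws = out_wb.active
        out_ws.append(row)
    if out_wb is not None:
        out_wb.save(f'part_{part}.xlsx')
    src.close()

split_workbook('big_import.xlsx')  # hypothetical file name

Note that the header row only lands in the first part; copy it into every part if the Access import expects column names.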

openpyxl: closing the archive after breaking off a read operation (the spreadsheet has 1048498 rows)

I have two problems using openpyxl.
The spreadsheet has 1,048,498 rows. Iterating over it hogs memory, so I added logic to check for the first five empty columns and break out of the loop.
Logic 1 works for me and the code no longer iterates indefinitely over the spreadsheet's blank cells. I am using P4Python to delete this read-only file after I am done reading it. However, openpyxl is still using the file, and there is no method except save to close the archive it uses internally. Since my file is in read-only mode, I cannot save it. When P4 then tries to delete the file, I get this error: "The process cannot access the file because it is being used by another process."
Help is appreciated :)
If you open the file in read-only mode then it will not hog memory. Cells are created only when read. Memory use has been tested with huge files but if you think this is a bug then please submit a bug report with a sample file.
This looks like an existing issue or intended behavior in openpyxl. If you have a read-only file (from a P4Python sync operation, p4.run_sync(file_path_to_sync)) and you read it with openpyxl, you will not be able to delete the file (P4Python p4.run_sync(file_path_to_sync + '#0') to remove it from the workspace) until you save the file, which is not possible (or intended, in my case) since it is a read-only file.
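For what it's worth, a sketch of releasing the handle without saving; the path is hypothetical. Newer openpyxl releases expose close() on read-only workbooks (older ones only had the private wb._archive.close() workaround):

from openpyxl import load_workbook

wb = load_workbook('huge_readonly.xlsx', read_only=True)  # hypothetical path
try:
    for row in wb.active.iter_rows():
        pass  # read what you need; break early on blank rows
finally:
    # Releases the underlying zip handle so P4 can delete the file.
    wb.close()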

How to import data into Teradata tables from an Excel file using BTEQ import?

I'm trying to import data into tables from a file using BTEQ import, and I'm facing weird errors while doing this.
For example, if I use a text file as the input data file with ',' as the field separator, I get the error below:
*** Failure 2673 The source parcel length does not match data that was defined.
or
If I use an EXCEL file as the input data file, I get the error below:
* Growing Buffer to 53200
* Error: Import data size does not agree with byte length.
The cause may be:
1) IMPORT DATA vs. IMPORT REPORT
2) incorrect incoming data
3) import file has reached end-of-file.
*** Warning: Out of data.
Please help me out by giving the syntax for BTEQ import using a txt file as the input data file, and also the syntax if we use an EXCEL file as the input data file.
Also, is there any specific format the input data file must have for its data to be read correctly? If so, please give me the info about that.
Thanks in advance :)
EDIT
Sorry for not posting the script at first.
I'm new to Teradata and have yet to explore other tools.
I was asked to write the script for BTEQ import:
.LOGON TDPD/XXXXXXX,XXXXXX
.import VARTEXT ',' FILE = D:\cc\PDATA.TXT
.QUIET ON
.REPEAT *
USING COL1 (VARCHAR(2)) ,COL2 (VARCHAR(1)) ,COL3 (VARCHAR(56))
INSERT INTO mytable /* table name was missing here; "mytable" is a placeholder */ (COL1, COL2, COL3) VALUES (:COL1, :COL2, :COL3);
.QUIT
I executed the above script and it succeeded using a txt file (separating the fields with commas) and declaring the data types as VARCHAR.
sample input txt file:
1,b,helloworld1
2,b,helloworld2
3,D,helloworld1
12,b,helloworld1
I also tried to do the same using a tab (\t) as the field separator, but it gives the same old error.
Q) Does this work only for comma-separated txt files?
Also, could you please tell me where I can find the BTEQ manual?
Thanks a lot.
Can you post your BTEQ script? May I also ask why you are using BTEQ instead of FastLoad or MultiLoad?
The text file error is possibly due to the data types declared in the USING clause. I believe they need to be declared as VARCHAR when reading delimited input (e.g. declare as VARCHAR(10) for INTEGER fields).
As for Excel, I can't find anything in the BTEQ manual that says that BTEQ can handle .xls files.
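Since BTEQ only reads flat files, one workaround (my assumption, not something the BTEQ manual prescribes) is to export the worksheet to a comma-delimited text file first and point .IMPORT VARTEXT at that. A sketch with openpyxl, which reads .xlsx but not the older .xls format; the .xlsx path is hypothetical, and the .TXT path matches the script above:

import csv
from openpyxl import load_workbook

wb = load_workbook(r'D:\cc\PDATA.xlsx', read_only=True)  # hypothetical source
with open(r'D:\cc\PDATA.TXT', 'w', newline='') as f:
    writer = csv.writer(f)
    for row in wb.active.iter_rows(values_only=True):
        # Blank cells become empty strings so the column count stays fixed.
        writer.writerow('' if v is None else v for v in row)
wb.close()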
For your tab-delimited files, are you doing this (that's a literal tab character below)?
.import vartext ' '
Or this?
.import vartext '\t'
The former works, the latter doesn't.
The BTEQ manual that I have is on my work machine. One of the first Google results for "BTEQ manual" should yield one online.
