Package executed in Visual Studio cancels without helpful error - Excel

I am following the Udemy course Learn ETL using SSIS. The first simple task is to transfer data from an Excel file to a database.
The only change I have made is that I am trying to transfer to a PostgreSQL server instead of a Microsoft SQL Server. I therefore had to install SSDT for Visual Studio first, and get the ODBC driver necessary to create an ODBC Destination for the package.
All well so far, but when I try to run the package I just get:
SSIS package "Visual Studio 2017\Projects\Excel_SQL\Excel_SQL\Package.dtsx" starting.
Information: 0x4004300A at Excel to SQL, SSIS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Excel to SQL, SSIS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Excel to SQL, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Excel to SQL, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Excel to SQL, SSIS.Pipeline: Execute phase is beginning.
SSIS package "Visual Studio 2017\Projects\Excel_SQL\Excel_SQL\Package.dtsx" finished: Canceled.
The program '[14368] DtsDebugHost.exe: DTS' has exited with code 0 (0x0).
No data transfers over. The Excel file is very simple: Excel 97-2003 format, as the connection expects, with only two columns, rollnumber and marks. rollnumber has 11 rows of data running 1 to 11, and the marks column holds some random values.
My table in PostgreSQL is set up with the same two columns, both numeric types.
I really cannot figure out what is going wrong.
I have seen some similar questions on Stack Overflow, but those were about the file type being incorrect:
SSIS Package Cancels instantly on Debug
I don't think that's my issue.
Can anyone please advise?
Thank you.

The situation is not entirely clear, but there are several suggestions you can follow:
(1) Try to run the package in 32-bit mode
The issue may occur if the package is running in 64-bit mode and the relevant providers are not installed for 64-bit. Try executing the package in 32-bit mode:
Package Properties >> Debugging >> Run64BitRuntime = false
How to execute a 32 bit SSIS package in a 64bit package?
(2) AccessDatabaseEngine is not installed
The issue may occur if the Office connectivity components for Microsoft Excel are missing; check that you have installed them:
Microsoft Access Database Engine 2010 Redistributable
Microsoft Access Database Engine 2016 Redistributable
(3) Make sure you have followed the appropriate steps to create the ODBC Destination
You can follow this article to create a package that imports data into PostgreSQL, and check that all steps were done correctly:
SSIS WITH POSTGRESQL : CONNECT TO POSTGRESQL WITH SSIS COMPONENTS
(4) Try some workarounds
To narrow down the source of the error, try replacing the PostgreSQL destination with a Flat File Destination; if the package then executes successfully, the problem is with the ODBC Destination. Similarly, try replacing the Excel source with a Flat File Source; if the package then executes successfully, the problem is with the Excel Source.
If you are new to SSIS, some articles can help:
Simple SSIS: Importing Data from Flat Text Files
Export Data from SQL Server to Flat File in SSIS Example
Using SSIS to Export Data to Flat Files
(5) Try the SQL Server Import and Export Wizard
If you have SQL Server installed, try using the Import and Export Wizard to create and execute the package:
Import and Export Data with the SQL Server Import and Export Wizard
Connect to a PostgreSQL Data Source (SQL Server Import and Export Wizard)
How to export data from SQL Server to a Flat file

So, in this case, it turned out I just had to move the source Excel file out to my E:\ drive. Perhaps the path it was in was too long? It was only three folders deep from E:\, but... it worked.
Can anyone explain why that was the issue? Nothing in the error messages pointed to it.

Related

Issues Displaying SSIS Data - Importing from Excel File to SQL DB

I am importing data from an Excel file into a SQL DB, but it keeps giving me errors. I receive the error listed below for all five of my control flows. How can I resolve this error?
Data viewer at path 'Excel Source Output' of task 'Data Flow Task - Course Facts' [DataflowID: {DF84ABE7-A1C5-4CA8-B5B9-C4FED8744F8E}; IDString: Paths[Excel Source.Excel Source Output]; PackageRPath: Integration Services Project1\Package.dtsx];;;break always (currently 0)

SSIS Dynamically Import Excel Files

So I created a ForEach loop and a data flow task to write from Excel to a SQL DB. All works fine with the Excel source hard-coded. As soon as I change the connection string to use the file path variable as a data source, I get this error:
[Excel Source 1] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager 1" failed with error code 0xC0202009.
I print the file path in a message box before executing the data flow, so I know the variable is working.
Naturally I browsed tons of answers and tutorials, but nothing worked. Here's what I tried:
Changing the data source in the connection string
Using the ExcelFilePath expression instead of the connection string
Changing the Excel file name in the connection manager properties
Running the package in 32-bit mode
Setting DelayValidation to True on all data flow tasks and the connection manager
Deleting and creating a new connection manager
Combinations of the above, with lots of trial and error
I'm using Visual Studio 2013.
I'd appreciate your help, as I've been pulling my hair out all afternoon with this :)
I never got this to work, so I imported the Excel files into the DB using SQL instead (note that OPENROWSET requires the 'Ad Hoc Distributed Queries' server option to be enabled and the ACE OLE DB provider to be installed):
SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0;Database=C:\Test\Excel_Data.xlsx;', 'SELECT * FROM [Sheet1$]')
https://www.sqlshack.com/query-excel-data-using-sql-server-linked-servers/

Azure Storage Explorer: Properties of type '' are not supported error

I inherited a project that uses an Azure table storage database. I'm using Microsoft Azure Storage Explorer as a tool to query and manage data. I'm attempting to migrate data from my Dev database to my QA database. To do this, I'm exporting a CSV from a Dev database table and then trying to import into the QA database table. For a small number of tables, I get the following error when I try to import the CSV:
Failed: Properties of type '' are not supported.
When I ran into this before, since I exported a "typed" CSV from Dev, I checked to make sure all "#type" columns had values. They did. Then I split the CSV (with thousands of records) up into smaller files to try to determine which record was the issue. When I did this and started importing them, I was ultimately able to import all of the records successfully by individual files which is peculiar. Almost like a constraint violation issue.
I'm also seeing errors with different types, e.g.:
Properties of type 'Double' are not supported.
In this case, there is already a column in the particular table of type "Double".
Anyway, now that I'm seeing it again, I'm having trouble resolving it. Any thoughts?
UPDATE
I was able to track a few of these errors to "bad" data in the CSV. It was a JSON string in an Edm.String field that, for some reason, it didn't like. I minified the JSON using an online tool and it imported fine. There is one data set, though, with over 7,000 records that I'm trying to import (the one I referenced breaking up earlier in this post). I ended up splitting it into different files and was able to import them individually. When I try to import the entire file after loading all the data through the individual files, though, I again get an error.
I split the CSV (with thousands of records) up into smaller files to try to determine which record was the issue. When I did this and started importing them, I was ultimately able to import all of the records successfully by individual files which is peculiar.
Based on your test, the format and data of the source CSV file seem fine. It will be difficult to find out why Azure Storage Explorer returns those unexpected errors while importing the large CSV file. You can try upgrading Azure Storage Explorer and check whether you can export and import the data successfully with the latest version.
Besides, you can try AzCopy (designed for copying data to and from Microsoft Azure Blob, File, and Table storage using simple commands with optimal performance) to export/import the table.
Export table:
AzCopy /Source:https://myaccount.table.core.windows.net/myTable/ /Dest:C:\myfolder\ /SourceKey:key /Manifest:abc.manifest
Import table:
AzCopy /Source:C:\myfolder\ /Dest:https://myaccount.table.core.windows.net/mytable1/ /DestKey:key /Manifest:"abc.manifest" /EntityOperation:InsertOrReplace

Reading excel file (.csv) using ODBC driver in Qt

I have a CSV file that I'd like to parse in Qt. I'd like to use the SQL plugin, but I'm not sure how to get things set up. I currently cannot open the .csv file from my Qt app; I have to manually open it first and then start my app in the hope of querying from it.
If the file I'm trying to read isn't manually opened before I start my app, I get the following driver error:
[Microsoft][ODBC Excel Driver] External table is not in the expected format.
[Microsoft][ODBC Excel Driver] General Warning Unable to open registry key 'Temporary (volatile) Jet DSN for process 0x1ae0 Thread 0x1f2c DBC 0x4c52a4 Excel.' QODBC3: Unable to connect.
Here's my setup...
Qt Creator Build/Version:
I am working in Qt 5.3, which I did not build from source; I downloaded the installer.
I have configured a few debugging kits, but the one I'm currently using is the MSVC 2012 OpenGL 32-bit compiler (which I've set as default).
I have both Visual Studio 2012 and 2010 on my machine, which is 64-bit.
I didn't have to build the SQL drivers; they came pre-installed (available drivers: "QSQLITE", "QMYSQL", "QMYSQL3", "QODBC", "QODBC3", "QPSQL", "QPSQL7").
My .pro file hooks to the sql plugin:
QT += core gui network sql
And I've got the following includes:
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QSqlError>
This is my code that establishes an Excel database connection and attempts a query:
QSqlDatabase db = QSqlDatabase::addDatabase("QODBC");
QString pathString = QString("C:\\MY_FILES\\test_work\\Xena\\Xena2544-2G\\Reports\\asdf_SN1_RFC2544_7_port_verify.csv");
db.setDatabaseName("DRIVER={Microsoft Excel Driver (*.xls)};DBQ=" + pathString);
if (!db.open())
{
    QSqlError er = db.lastError();
    QMessageBox::information(0, "Error", er.text());
}
else
{
    // Never gets here unless I already manually opened the file before running the code.
    QSqlQuery query("select * from [" + QString("asdf_SN1_RFC2544_7_port_verify") + "$B13]");
    if (query.next()) // must advance to the first record before calling value()
    {
        QString process1Result = query.value(0).toString();
        ui->verifyUnit1Status_lb->setText(process1Result);
    }
}
File Permissions
- Full control
- modify
- read& execute
- read
- write
I'd really like to get this feature working. I would assume the error is in the connection string, but at this point I've tried for a couple of hours with no success.
Thank you in advance.

How to get columns of DBF file in my VC++ application

I have a problem getting the columns of a DBF file in my VC++ application.
I used SQLColumns() to get the column list,
but it gives SQL_NO_DATA as the result.
What can I do? It happens only for one particular DBF file; if I create a sample DBF file, it works perfectly.
Please give me suggestions.
Thanks in advance.
I see you're using ODBC. The VFP ODBC driver works only with VFP 6 and earlier data files. Maybe the file you're having trouble with was created using features added in a later version of VFP?
