SSIS package execution fails when a large Excel file is loaded via the SSIS catalog (SSISDB)

The package executes successfully from Visual Studio, but when deployed to the SSIS catalog it gives the following error:
Error: The Execute method on the task returned error code 0x80070008 (Could not load file or assembly 'System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' or one of its dependencies. Not enough storage is available to process this command. (Exception from HRESULT: 0x80070008)). The Execute method must succeed, and indicate the result using an "out" parameter.
The Excel file size is 147 MB.

Yes, this problem occurs when you try to load an Excel file through SQL Server up to version 2012. The limit is 120 MB; split your file into pieces smaller than that and try the load again.
A second solution is to use SQL Server 2016, where you will not face this issue.
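If the data can be exported to CSV first, the splitting step can be scripted. A minimal sketch (the 120 MB threshold comes from the answer above; file names and the per-row size estimate are assumptions):

```python
import csv

def split_csv(src_path, out_prefix, max_bytes=120 * 1024 * 1024):
    """Split src_path into numbered CSV parts, each roughly under max_bytes.

    The header row is repeated in every part so each file loads on its own.
    Sizes are estimated from cell lengths, so parts may exceed the limit by
    at most one row.
    """
    parts = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        out, writer, written = None, None, 0
        for row in reader:
            if out is None or written >= max_bytes:
                if out:
                    out.close()
                path = f"{out_prefix}_{len(parts) + 1}.csv"
                out = open(path, "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
                written = 0
                parts.append(path)
            writer.writerow(row)
            # Rough byte estimate: cell contents plus separators/newline.
            written += sum(len(c) for c in row) + len(row)
        if out:
            out.close()
    return parts
```

Each resulting part can then be loaded by the package in turn, e.g. with a Foreach Loop container over the generated files.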

Microsoft has acknowledged this shortcoming and released the following package to rectify it using the "newer" Access/Excel connection managers. However, after installing it and following the instructions, the same issue occurs.
The cause of the problem: it comes down to a limitation carried over from Office versions 1997-2003. The connection manager was designed for Office 1997 and has not been amended or improved since. If you research the limitations of Office 2000 or 2003, you'll see that they do not allow more than 255 columns AND that a column name cannot exceed 64 characters. These were software limitations in Office 1997 and 2000, and since the connection manager was designed in that era, they simply remained, as Microsoft did not invest in updating it. Hope this helps!
Another possibility is that the issue is caused by an older version of the OLE DB provider.
The link below may help:
https://blogs.msdn.microsoft.com/dataaccesstechnologies/2017/10/18/unexpected-error-from-external-database-driver-1-microsoft-jet-database-engine-after-applying-october-security-updates/
Last solution:
As you said, your package works fine locally but not once deployed, right?
So why not try running the package with the DTEXEC command line, and schedule it using Windows Task Scheduler?
Follow the link below.
https://www.mssqltips.com/sqlservertutorial/218/command-line-tool-to-execute-ssis-packages/
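For reference, a package deployed to the SSIS catalog can be run from the command line roughly like this (the folder, project and server names below are placeholders, not taken from the question):

```
dtexec /ISServer "\SSISDB\MyFolder\MyProject\Package.dtsx" /Server "localhost"
```

A file-deployed copy of the package can instead be run with dtexec /FILE "C:\path\Package.dtsx", which is the form most easily scheduled from Windows Task Scheduler.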

Related

Creating Word OLE Client from a Windows Scheduled Task causes ntdll application error

Our normal workflow is to use a Windows Scheduled Task to:
convert files from rtf to pdf in bulk
manipulate the pdfs to add barcodes
aggregate into a single pdf
then send for printing
This was working fine on Windows Server 2008, but since our upgrade to Windows Server 2019 (2008 being end of life and whatnot) we've run into a whole host of issues. We're now falling at the first hurdle: even instantiating the Word OLE client. The error isn't consistent; one day the process completes successfully, the next it fails immediately.
Our tasks trigger a Dyalog APL workspace, where the code to be executed sits. For debugging purposes, I have set up a simplified version which is doing the following:
tries←0 ⍝ Initialise the try counter
Log'Attempting to create Word instance' ⍝
:Repeat ⍝ Keep trying to...
Word←⎕NEW'OleClient'(⊂'ClassName' 'Word.Application') ⍝ Create the Word Client instance
Word.Visible←1 ⍝ making the application visible
tries+←1 ⍝ and incrementing the try counter each time
Log'Try: ',⍕tries ⍝
:Until #.Word.PropList∊⍨⊂'Documents' ⍝ Until it has seemingly created successfully
:OrIf tries≥maxTries_create ⍝ ... or the tries have exceeded the maximum (currently 5)
⍝
'doc open'delayAndLog dl_open ⍝ With an optional external delay...
myWordDoc←Word.Documents.Open⊂docPath ⍝ open the specified test doc
⍝
'doc close'delayAndLog dl_close ⍝ With an optional external delay...
myWordDoc.Close 0 ⍝ close the doc (not saving)
⍝
'application quit'delayAndLog dl_quit ⍝ With an optional external delay...
Word.Application.Quit 0 ⍝ quit the word client
The various delays are held externally in a config file. I didn't include the reading of this config file, as what is shown here is the substance. Note: I retry creating the instance because sometimes I've found it only instantiates with the barebones methods; putting in a delay, or retries, seemed to fix this.
This piece of code will run completely fine on my non-prod servers (and even sometimes in production). Today it runs fine through the IDE, but when run as a Windows Scheduled Task it reports a DOMAIN ERROR when opening the document.
My observations are that it briefly creates the WINWORD.exe, i.e. you can see it in the task manager, the status turns to "suspended", then it disappears. In the event viewer, we see the following:
Event Viewer - ntdll application fault (id: 1000)
Things I've tried so far
Rebooted the server (yes, go ahead and joke, but turning it off and on again is the first thing we should always try)
Repaired Office
Reinstalled Office
Tried configuring the task to run as a different user (myself, with local admin rights)
Tried configuring the task to run on a different server.
Tried configuring the task to run with a higher priority.
Making the Word instance visible, to see if there are any error pop-ups.
Built a simplified version of the task (see above) to ensure it's not just the overall complexity of the main task causing confusion.
Disabled "automatic inking" on Microsoft's recommendation
Captured procmon logs of the error (awaiting Microsoft's review)
Versions currently installed:
Word: Microsoft Word 2019 MSO (16.0.10374.20040) 64-bit
Dyalog APL: 16.0.35960.0 32-bit Unicode
Windows Server 2019 Standard Version 1809. OS Build 17763.1697.
Any help would be really greatly appreciated on this one as I feel like I'm tackling Schrödinger's OLE Client. Thanks in advance.
That is a crash in Word.
There might be crash dump files in C:\Users\<yourname>\AppData\Local\Temp\ or C:\Users\<yourname>\AppData\Local\CrashDumps.
At your company, if you have any C developers who can use Visual Studio or Windbg, they could open the dump file and see if it gives any clues.
If not, I could take a quick look if you send it to Dyalog Support.
This is just a guess...Have you tried increasing the size of the Desktop heap? Please see this blog post:
https://learn.microsoft.com/en-us/archive/blogs/ntdebugging/desktop-heap-overview
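For context, the desktop heap discussed in that article is controlled by the SharedSection values in this registry entry; the third number is the heap (in KB) for non-interactive desktops, which is what services and scheduled tasks get. The numbers shown are illustrative defaults and vary by Windows version, so back up the key and consult the article before changing anything:

```
Key:   HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
Value: Windows (REG_EXPAND_SZ), containing among other settings:
       SharedSection=1024,20480,768
```

Raising the third value (e.g. from 768 to 1024) enlarges the non-interactive desktop heap, which has historically resolved resource errors in Office automation run from services; a reboot is required for the change to take effect.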
Regards,
Vince

Power Query M - An operation that uses the database driver could not be completed

From time to time, when I run some queries, I get this error message:
An operation that uses the database driver could not be completed. If the driver is a Microsoft driver, make sure the driver file isn't damaged.
I haven't yet found a fix from searching online.
The last website I found gives a "solution", but I can't do the same:
"Well in my case it worked by giving the user access to the DB it is accessing."
I just have a worksheet that uses other files on my company network; I don't use any database.
I can't reinstall the driver or anything else; my company has a very strict IT security policy.
Could I have some help?
I had the same issue when trying to connect to Analysis Services in Azure. The error message is misleading; in reality, you are missing a few client drivers needed to connect to Azure Analysis Services. Install all the missing drivers (OLE DB, AMO and ADOMD) from https://learn.microsoft.com/en-us/analysis-services/client-libraries?view=asallproducts-allversions#known-issues and you should be good to go.
I think the OLE DB installation alone should suffice, but I installed all three to avoid future encounters with related admin issues. Check whether OLE DB is up to date at C:\Program Files (x86)\Microsoft Analysis Services\AS OLEDB\140 by comparing the msolap.dll version. The latest version as I write this is 15.1.108.23.
Tips: 1) Try connecting to Analysis Services from Excel by following these steps: Data menu --> Get Data --> From Database --> From Analysis Services. Follow the prompts, filling in the appropriate info. Bingo! You are in!
2) If you still can't connect after installing all the drivers, wait a while. I don't know the actual reason why, but I was able to connect the next day.
Please mark this as an answer if it helps to solve your problem.
Good luck!
Update - 1/5/2021
Today I faced the same ugly error again, which made me rethink what else could have gone wrong on the system. I later realized that I had restarted the machine, which removed the temporary connection file referenced by the Excel file. Following the reconnection steps under the Tips section above let me in successfully.

Excel Data Load using SSIS - Memory used up error

I am trying to load data to an excel file using SSIS Package. Please find below the details
Source: SQL Server table
Destination: Excel file
No. of rows: 646K
No. of columns: 132
I have deployed the package to the SQL Server Integration Services catalog and am trying to execute it from there.
But the following errors are being thrown:
Not enough storage is available to complete this operation.
The attempt to add a row to the Data Flow task buffer failed with
error code 0xC0047020.
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on
SRC_MDM_ENTITYDUPLICATE returned error code 0xC02020C4. The
component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
My data flow task looks like the following:
I am using a Data Conversion transformation because of a datatype mismatch between Unicode and non-Unicode characters.
The package works fine on my local machine, with 95-99% resource utilization.
Since the package is deployed in the production environment, I can't make any changes to the server settings. I also suspect the high resource utilization is causing the issue when the package executes on the production server.
I tried reducing DefaultBufferMaxRows and increasing DefaultBufferSize, which didn't help.
Can somebody help me optimize my package and fix this issue?
Thanks much in advance.
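The interplay between the two buffer properties can be reasoned about with simple arithmetic: SSIS fills a buffer up to DefaultBufferMaxRows rows, or as many rows as fit in DefaultBufferSize, whichever is smaller. A sketch (the 10 MB figure is the documented SSIS default buffer size; the per-column byte estimate is an assumption, so check the actual row width of your data flow):

```python
def rows_per_buffer(buffer_bytes, row_bytes, max_rows):
    """Rows SSIS packs into one buffer: capped by DefaultBufferMaxRows,
    or by how many rows of row_bytes fit in DefaultBufferSize."""
    return min(max_rows, buffer_bytes // row_bytes)

# Illustrative numbers for this question: 132 columns at a guessed
# ~40 bytes per column gives an estimated row width.
row_bytes = 132 * 40                # ~5,280 bytes per row (assumption)
default_buffer = 10 * 1024 * 1024   # 10 MB, the SSIS default
print(rows_per_buffer(default_buffer, row_bytes, 10_000))
```

With wide rows like these, the buffer size is the binding limit long before DefaultBufferMaxRows is, which is why tweaking the row cap alone often changes nothing; the row width itself (e.g. trimming unused columns before the Data Conversion step) is usually the more effective lever.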
I realized that the cause of the error was a column in my package that does not exist in the Excel file; as a solution, either delete that column from the package or add the corresponding empty column to the file.

Running an Excel macro using SSIS through SQL Jobs fails

I have a task to create an Excel VBA macro and run it from SSIS. I can successfully run the Excel macro from the SSIS package, but I have a problem when I run that SSIS package from SQL Jobs. I have created a credential and an SSIS proxy, and set the package's protection level to unprotected, but it always fails.
Error Message
Executed as user: HARNANDA7-PC\HARNANDA7.
Microsoft (R) SQL Server Execute Package Utility Version 11.0.2100.60 for 64-bit. Copyright (C) Microsoft Corporation. All rights reserved.
Started: 11:56:14
Error: 2014-05-28 11:56:17.61 Code: 0x00000001 Source: Script Task Description: Exception has been thrown by the target of an invocation. End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 11:56:14 Finished: 11:56:17 Elapsed: 2.652 seconds.
The package execution failed. The step failed.
However, if I create a folder named "Desktop" under C:\Windows\System32\config\systemprofile\ or C:\Windows\SysWOW64\config\systemprofile\, then the SSIS package runs successfully through SQL Jobs.
I'm confused as to what the problem is here. Can anyone help?
Big thanks
For whatever reason (no one seems to know why, from what I can gather), Excel requires that the Desktop folder be present on the machine. It must use it as a global setting, or possibly as a default location for creating files or temporary files. If that folder doesn't exist then (and this is just a guess) Excel either has trouble creating those temporary files or simply requires that the Desktop directory exist, and in both cases throws an error when it does not.
I doubt they would make something like this an explicit feature, so it is most likely a bug in Excel.
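The fix the question describes can be scripted so the folders are guaranteed to exist before the job runs. A minimal sketch (the two paths are those reported in the question; the script must be run with rights to write under C:\Windows):

```python
import os

# Paths reported in the question as required by Excel when it runs
# non-interactively under the system profile.
SYSTEM_PROFILE_DESKTOPS = [
    r"C:\Windows\System32\config\systemprofile\Desktop",
    r"C:\Windows\SysWOW64\config\systemprofile\Desktop",
]

def ensure_desktop_folders(paths):
    """Create each folder if it is missing; return the list of ensured paths."""
    ensured = []
    for path in paths:
        os.makedirs(path, exist_ok=True)  # no-op when the folder already exists
        ensured.append(path)
    return ensured

# On the server, run: ensure_desktop_folders(SYSTEM_PROFILE_DESKTOPS)
```

Running this (or the equivalent mkdir commands) as a setup step means the SQL Agent job no longer depends on someone having created the folders by hand.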

listItem.File.OpenBinary() not working - Remote Blob Storage / FileStreaming not enabled on SQL Server the culprit?

I'm moving a cmd line migration utility from DEV to QA and I'm running into a strange error. Of course, things work perfectly on dev.
This is the offending line of code:
byte[] fileBytes = sourceItem.File.OpenBinary();
This throws a "Cannot Open File" error; the inner exception shows COM HRESULT code 0x81070211.
I have made sure that my account has owner permissions on the library that I'm attempting to pull this file from. I have even made myself a farm admin - but I still get the same error.
I'm seeing that several other people have encountered this same error, but no solutions. One post mentions downloading the file through code as a workaround - what would that look like?
I'm also seeing that some link this problem to files > 100 KB, and that it can be overcome by putting the assembly into the GAC. However, that would be problematic for this application.
Yes, I have also tried all of the different options parameters. I have also tried the workaround of opening a stream with OpenBinaryStream, getting the length, and reading the data into the byte[] array. The result is always the same: something is preventing me from getting access to the file to read its bytes, and the error message is just useless.
Thanks in Advance for any help you can provide.
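For anyone wondering what the stream-based workaround mentioned above looks like in practice, a minimal sketch using the SharePoint server object model of that era (sourceItem is the SPListItem from the question; this is illustrative, not a guaranteed fix, since the asker reports the stream route failing the same way):

```csharp
SPFile file = sourceItem.File;
using (Stream stream = file.OpenBinaryStream())
using (MemoryStream ms = new MemoryStream())
{
    byte[] buffer = new byte[8192];
    int read;
    // Copy the SharePoint stream into memory in chunks rather than
    // asking for the whole file at once via OpenBinary().
    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ms.Write(buffer, 0, read);
    }
    byte[] fileBytes = ms.ToArray();
}
```

As the accepted resolution below notes, when an external BLOB store such as StoragePoint is involved, no client-side variation helps until the executing account has access to the BLOB store itself.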
I now believe that the problem may be related to Remote BLOB Storage (RBS). Several posts have mentioned that the problem only occurs when the file size is > 100 KB, which happens to be the limit at which files move from the content DB to the file system. I believe the DBA must enable FILESTREAM on the SQL Server database to fix this. I'm awaiting our offshore DBA to act on this suggestion, and I will follow up with a report if it works. In the meantime, does anybody else have experience using OpenBinary on files with RBS?
Trey Carroll
This is just an educated guess - I don't have time to test my theory. Does the library require check out? If so, are you checking out the file before trying to open it?
The problem was due to StoragePoint. The account running the OpenBinary() call must have explicit access to the StoragePoint databases.
I also faced this problem and found that it was due to CAS permissions.
You may be able to resolve it by adding the entry below to your CAS policy:
<IPermission class="System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true" />
