Here's what I'm doing:
I'm using a Foreach Loop container to grab any .xlsx files in a specified folder and assign the fully qualified name to a variable called FileName.
Then I have a data flow with an Excel source importing to an OLE DB Destination.
How do I make the Excel source use the FileName variable?
When I create the same process for flat files, I have no problem creating an expression and setting DelayValidation to true, but with Excel files it doesn't work the same way. I've been able to work around the problem by using a File System Task to move the .xlsx files to a new folder, giving each one a static name and importing from that file, but I'm tired of doing that. Any help will be greatly appreciated!
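In case it is useful, the usual way to wire this up (a sketch, assuming the variable is User::FileName and the Excel Connection Manager initially points at one sample workbook) is to put the expression on the connection manager itself rather than only on the data flow task:

On the Excel Connection Manager, set DelayValidation = True, then under Expressions map:
ExcelFilePath = @[User::FileName]

With DelayValidation set on the connection manager, SSIS stops validating the path at design time, so the variable can point to a different workbook on each Foreach iteration.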
Related
I developed an SSIS package which loads an Excel file from a source folder named Input and, after loading, moves it to a Success or Failure folder. The package runs fine. The one issue I'm facing is that when I open the Excel connection manager and close it without making any changes, it creates a sample empty Excel file in the Input folder. Then, when I place my actual Excel file in the Input folder and execute the package, it moves my actual file to Success and the empty sample file to Failure. Can anyone help me work out why this is happening? Thanks.
I built a macro that writes a custom CSV file to disk. I need to add the option to also save this file in a zipped format. I followed Ron de Bruin's excellent article on the matter, but I'm facing the problem that my zip file is empty.
NewZip (FilePathZip) 'creates an empty zip file
Set objShell = CreateObject("Shell.Application") 'late-bound Windows Shell object
objShell.Namespace(FilePathZip).CopyHere FilePathCSV 'copy the CSV into the zip
FilePathZip is the full path to the new zip file I'm creating here. FilePathCSV is the full path to the CSV that was just saved to disk. Interestingly, when I switch out FilePathCSV for any other file that already exists before I run the macro, it works. Apparently, I only face this problem when trying to zip a file that was created during the runtime of the macro.
I already checked that the CSV path is recognized by Excel via Dir(FilePathCSV) and made sure that the file is closed after the writing process. I also tried adding waits (Wait()). I have no idea what the problem is.
Wrapping FilePathCSV in a Dir() function solved my problem. I'm not sure why though.
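One other detail from Ron de Bruin's article worth flagging for anyone hitting the same empty-zip symptom: CopyHere is asynchronous, so the macro can end before the shell has finished compressing. A minimal sketch of the wait loop (variable names as above; his sample also keeps the path variables as Variants):

Do Until objShell.Namespace(FilePathZip).Items.Count = 1
    'CopyHere returns immediately; keep the macro alive until the CSV shows up inside the zip
    Application.Wait Now + TimeValue("0:00:01")
    DoEvents
Loop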
Using an existing SSIS package, I was trying to import .xlsx files we received from a client. I received the error message:
External table is not in the expected format
These files will open in XL
When I use XL (currently XL2010) to Save As... the file without making any changes:
The new file imports just fine
The new file is 330% the size of the original file
When changing .xlsx to .zip and investigating the contents with WinZip:
The original file only has 4 .xml files and a _rels folder (with 2 .rels files).
The new file has the expected .xlsx contents.
Does anyone know what kind of file this could be?
It would be nice to develop my SSIS package to work with these original files, without having to open and re-save each file. There are only 12 files, so if there are no other options, opening/saving each file is not that big of deal...and I could automate it with VBA going forward.
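For reference, a rough sketch of that VBA automation, assuming the originals sit in C:\Files and a resaved subfolder already exists (FileFormat:=xlOpenXMLWorkbook writes a standard .xlsx):

Sub ResaveAll()
    Dim f As String, wb As Workbook
    f = Dir("C:\Files\*.xlsx")
    Do While f <> ""
        Set wb = Workbooks.Open("C:\Files\" & f)
        'xlOpenXMLWorkbook saves as a regular Open XML workbook
        wb.SaveAs "C:\Files\resaved\" & f, FileFormat:=xlOpenXMLWorkbook
        wb.Close SaveChanges:=False
        f = Dir
    Loop
End Sub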
Thanks for any help anyone can provide,
CTB
There are many Excel file formats.
The file you are trying to import may actually be in another Excel format with its extension renamed to .xlsx (it could have been edited by someone else), or it could have been created with a different Excel version.
There is a third-party application called TrIDNet File Identifier, a utility designed to identify file types from their binary signatures. You can use it to work out the real format of the file.
Also, after a quick search on "External table is not in the expected format", it appears this error is thrown when the Excel version declared in the connection string differs from the actual file. Check the connection string used in the Excel connection manager; it may help you identify the version of the file.
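For example, a genuine .xlsx needs the ACE provider with Excel 12.0 Xml in the Extended Properties (the path below is only a placeholder), whereas the older Jet provider string with Excel 8.0 only covers .xls files:

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Files\workbook.xlsx;
Extended Properties="Excel 12.0 Xml;HDR=YES";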
So I have a folder into which Excel files get loaded daily. I need to create a package that extracts those files to a SQL table. However, the file changes name daily because the name includes the current date, so the filename looks like filename + current date (e.g. name20140801.xlsx).
I tried using a variable containing a wildcard pattern (filename?.xlsx / filename*.xlsx) as the ExcelFilePath property of the Excel connection manager, but the task gives me the error DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.
Does anyone know how I can do this using the Data Flow Task (I am trying to avoid a Script Task)?
Thank you!
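One approach that avoids both wildcards (which the Excel connection manager will not accept) and a Script Task, sketched here assuming the files land in C:\Drop and the prefix is literally name: set DelayValidation = True on the Excel connection manager and give its ExcelFilePath property an expression that builds today's file name, for example:

"C:\\Drop\\name"
+ (DT_WSTR, 4) YEAR(GETDATE())
+ RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
+ RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
+ ".xlsx"

which evaluates to C:\Drop\name20140801.xlsx on 2014-08-01. Alternatively, a Foreach Loop with a Foreach File enumerator and a file spec of name*.xlsx can hand the fully qualified name to a variable and drive the same ExcelFilePath expression.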
I am creating an application using the Bottle framework. I need a feature to upload an Excel file.
I am using the following for file upload.
http://bottlepy.org/docs/dev/tutorial.html#post-form-data-and-file-uploads
On the server side I am getting the file data as binary content. I want to save it in a temporary folder as an Excel file.
I am new to Python and Bottle. Any help will be much appreciated.
Thanks
Chirdeep
Your request.files.data object contains the uploaded Excel file. So you only need to create a temporary file and write the contents into it. This can be done using the tempfile module:
import tempfile

# delete=False keeps the file on disk after it is closed
f = tempfile.NamedTemporaryFile(delete=False, suffix=".xlsx")
f.write(request.files.data.file.read())
f.close()
I was not able to get simple file-writing code like yours to work, so I used the tempfile module. Looking at your code, I would have assumed it writes to the directory where the Python file is, if the code is working. Try using the code below; if you don't pass an argument to dir, it will create the file in the system's default temporary directory (tempfile.gettempdir()).
import tempfile

def save_as_temp_file(data):
    # settings.TEMP_PATH is this app's own config value; omit dir to use the system default
    with tempfile.NamedTemporaryFile(dir=settings.TEMP_PATH,
                                     delete=False,
                                     suffix=".xlsx") as f:
        f.write(data.file.read())
        return f.name
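For completeness, a minimal route showing how the upload reaches that helper (assuming the form field is named data, as in the Bottle tutorial linked above; the route path and port are placeholders):

from bottle import Bottle, request, run

app = Bottle()

@app.post('/upload')
def upload():
    # request.files.data is the FileUpload object for the form field named "data"
    excel_upload = request.files.data
    return "Saved to %s" % save_as_temp_file(excel_upload)

run(app, host='localhost', port=8080)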