How to handle umlauts in a Logic App when exporting to CSV for Excel

I created a Logic App to export some data to a *.csv file.
The data to be exported contains German umlauts.
I read all the needed values into variables, which are then concatenated and added to an array.
Finally I get an array of semicolon-separated strings with the values in it.
This result is then added to an email as a file attachment:
All the values are handled correctly in the Logic App and are correct in the *.csv file, but as soon as I open the CSV with Excel, the umlauts are no longer shown correctly.
Is there a way to explicitly create a file with the correct encoding within the Logic App and attach that file to the email instead of the ExportString?
Or can I somehow encode the content of the ExportString variable?
Any hints?

I have reproduced this in my environment and followed the steps below to get correct output in the CSV file:
My input is:
I sent the data into a CSV table as below and then created a file in a file share as below:
When I open my file share and download the content from there, I get the same incorrect output you got:
Then I opened Azure Storage Explorer and downloaded the file as below:
When I open the downloaded file in Notepad:
I get the correct output, so try it this way.
And when I save it as hello.csv and keep UTF-8 with BOM as below:
Then I get the correct output in the CSV as well:
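The reason the "UTF-8 with BOM" step matters is the byte order mark: without it, Excel assumes the local ANSI code page and mangles ä/ö/ü. As a minimal sketch outside the Logic App (the file name and sample rows are made-up examples), the same output can be produced like this:

```python
# Sketch: prepend a UTF-8 BOM so Excel detects the encoding.
# "umlaut_export.csv" and the rows below are invented example data.
rows = ["Name;Stadt", "Müller;Köln", "Jürgen;Düsseldorf"]
csv_text = "\r\n".join(rows)

with open("umlaut_export.csv", "wb") as f:
    f.write(b"\xef\xbb\xbf")            # the UTF-8 BOM (bytes EF BB BF)
    f.write(csv_text.encode("utf-8"))
```

Excel sees the three BOM bytes at the start of the file and reads the rest as UTF-8 instead of ANSI.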

Related

Steganography with Microsoft Word - DOCX

I wrote an application that hides a string within a .docx file.
A .docx file is a collection of XML files contained inside a ZIP archive, so my program treats the file like a ZIP archive and hides the secret string in it.
After some research, I found a way to insert data into a ZIP archive:
the secret string is injected after the file data section, right before the first central directory header. After that, the pointer in the end of central directory record is updated to compensate for the shift of the central directory.
The resulting .docx works fine with typical file archivers (7-Zip, WinRAR, File Roller, etc.) and file managers (Windows Explorer).
But when I open the output .docx with Microsoft Word, it reports an error:
Here is a link for the input and output files.
Which step did I get wrong, or what am I missing?
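Archivers tolerate stray bytes between the file data and the central directory, but Word validates the package more strictly, so the injected gap is a likely culprit (this is an educated guess, not something Word documents). A carrier that Word demonstrably ignores is the comment field of the end of central directory record. A sketch of that alternative, with hypothetical file names:

```python
import shutil
import zipfile

# Sketch: hide a string in the ZIP end-of-central-directory comment field
# instead of injecting bytes before the central directory.
# "src_docx" / "dst_docx" are placeholder paths.
def hide_in_comment(src_docx: str, dst_docx: str, secret: bytes) -> None:
    shutil.copyfile(src_docx, dst_docx)
    with zipfile.ZipFile(dst_docx, "a") as zf:
        zf.comment = secret   # rewritten into the EOCD record on close

def read_comment(docx: str) -> bytes:
    with zipfile.ZipFile(docx) as zf:
        return zf.comment
```

The archive members are untouched, so the document still opens normally; the trade-off is that the comment is visible to any ZIP tool, so the secret should be encrypted if confidentiality matters.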

Snowpipe doesn't load the files after error has been rectified

I am using Snowpipe to load files from an S3 bucket. It worked well for two files.
Then, to check how Snowpipe behaves when an error occurs during file loading, I intentionally broke the file format (changed the delimiter to '|' although the file is CSV) so that the COPY command would fail, and uploaded a 3rd CSV file to S3. It was not loaded, due to the file format error. Everything was as expected up to this point.
Later I recreated the file format with the correct delimiter, ',', but since the notification for the 3rd file had already been sent, that file was not loaded into the table. I then uploaded a 4th CSV file and it loaded successfully. So my question is: how do I take care of loading the 3rd file, whose event notification was generated while the file format was wrong?
Let me know if any more details are required.
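One documented way to pick up such a file is ALTER PIPE ... REFRESH, which queues staged files from the last 7 days that the pipe has not yet loaded. A small sketch that only builds the statement (the pipe name and prefix are placeholders; in practice it would be executed over a normal Snowflake connection, e.g. with snowflake-connector-python):

```python
# Sketch: build a Snowflake "ALTER PIPE ... REFRESH" statement.
# "my_pipe" and the stage prefix are hypothetical names.
def refresh_pipe_sql(pipe_name: str, prefix: str = "") -> str:
    stmt = f"ALTER PIPE {pipe_name} REFRESH"
    if prefix:
        # Optionally restrict the refresh to one stage path.
        stmt += f" PREFIX = '{prefix}'"
    return stmt

# e.g.  conn.cursor().execute(refresh_pipe_sql("my_pipe"))
```

The refresh re-scans the stage and submits any not-yet-loaded files to the pipe, which covers the 3rd file whose notification arrived while the file format was broken.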

DoCmd.TransferText where delimiter is semicolon and decimal is comma

I'm trying to import a csv file with:
Dim appAccess As Access.Application
Set appAccess = CreateObject("Access.Application")
appAccess.OpenCurrentDatabase (databasePath)
appAccess.DoCmd.TransferText transferType:=acImportDelim, tableName:=dbTableName, Filename:=strPath, hasFieldNames:=True
I'm using a German machine, where the standard delimiter is ; and the standard decimal separator is ,.
If I use those separators, I get an error (the data is not separated correctly).
If I change the separator in the CSV file to , and the decimal separator to ., the data is loaded into the database, but the . is ignored and numeric values therefore aren't imported correctly.
I don't have the option to create an import specification in Access manually. Is there a way to do this with VBA?
I created a Schema.ini file, which looks like this:
[tempfile.csv]
Format=Delimited(;)
ColNameHeader=True
DecimalSymbol=","
I saved it in the same folder where the CSV file is located.
I still get a runtime error saying field1;field2;... is not a header in the target table, so I'm guessing the method didn't use ; as the delimiter.
If you have a look at the documentation of the DoCmd.TransferText method, there is a parameter SpecificationName which says:
A string expression that's the name of an import or export specification you've created and saved in the current database. For a fixed-width text file, you must either specify an argument or use a schema.ini file, which must be stored in the same folder as the imported, linked, or exported text file.
To create a schema file, you can use the text import/export wizard to create the file. For delimited text files and Microsoft Word mail merge data files, you can leave this argument blank to select the default import/export specifications.
So if you are not able to generate that schema.ini file using the wizard, you can create it yourself in the same folder as the files you want to import. For documentation on how to build that file, see Schema.ini File (Text File Driver).
It should look something like the following, I think:
[YourImportFileName.csv]
Format=Delimited(;)
DecimalSymbol=","
Note that Schema.ini is keyed by file name: the first line of each section is the name of the import file, so every CSV you import needs its own [FileName.csv] section. You can keep several sections in one Schema.ini, or regenerate the file before each import.
If you want to generate that ini file on the fly with VBA, have a look at How to create and write to a txt file using VBA.
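The linked answer covers the VBA side; as a language-neutral sketch of what "generate the ini on the fly" amounts to (paths are hypothetical), the file just has to land next to the CSV with a matching section name:

```python
from pathlib import Path

# Sketch: write a Schema.ini next to the CSV before calling TransferText.
# The section header must match the CSV file name exactly.
def write_schema_ini(csv_path: str) -> Path:
    csv_file = Path(csv_path)
    ini = csv_file.parent / "Schema.ini"
    ini.write_text(
        f"[{csv_file.name}]\n"
        "Format=Delimited(;)\n"
        "ColNameHeader=True\n"
        "DecimalSymbol=,\n"
    )
    return ini
```

With the ini in place, TransferText (via the text driver) picks up the ; delimiter and the , decimal symbol for that file.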

What .xlsx file format is this?

Using an existing SSIS package, I was trying to import .xlsx files we received from a client. I received the error message:
External table is not in the expected format
These files open fine in Excel.
When I use Excel (currently Excel 2010) to "Save As..." the file without making any changes:
The new file imports just fine
The new file is 330% the size of the original file
When changing .xlsx to .zip and investigating the contents with WinZip:
The original file only has 4 .xml files and a _rels folder (with 2 .rels files):
The new file has the expected .xlsx contents:
Does anyone know what kind of file this could be?
It would be nice to develop my SSIS package to work with these original files, without having to open and re-save each file. There are only 12 files, so if there are no other options, opening/saving each file is not that big of deal...and I could automate it with VBA going forward.
Thanks for any help anyone can provide,
CTB
There are many Excel file formats.
The file you are trying to import may have another Excel format while the extension was changed to .xlsx (it could have been edited by someone else), or it could have been created with a different Excel version.
There is a third-party application called TrIDNet, a utility designed to identify file types from their binary signatures. You can use it to determine the real format of the file.
Also, after a simple search on External table is not in the expected format: this error is thrown when the definition (or version) of the Excel files supported by the connection string differs from the file selected. Check the connection string used in the Excel connection manager; it might help to identify the version of the file.
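In this case the file clearly is some kind of ZIP (WinZip opens it), so the mismatch is probably in the package parts rather than the container, but a quick signature check in the spirit of the TrIDNet suggestion is a cheap first step. A sketch (function name and return labels are invented for illustration):

```python
# Sketch: identify what a ".xlsx" really is from its first bytes.
# "PK\x03\x04" marks a ZIP (genuine OOXML), the OLE2 magic marks a
# legacy binary .xls, and "<?xml" marks SpreadsheetML-style XML.
OLE2_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"

def sniff_workbook(path: str) -> str:
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(b"PK\x03\x04"):
        return "zip"      # real OOXML container (.xlsx)
    if head == OLE2_MAGIC:
        return "ole2"     # legacy binary .xls renamed to .xlsx
    if head.startswith(b"<?xml"):
        return "xml"      # e.g. Excel 2003 SpreadsheetML with an .xlsx name
    return "unknown"
```

If the client's files report "zip" here, the next thing to compare is the member list inside the archive against a file Excel re-saved, which matches what you already observed with WinZip.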

CSV in UTF-8 and Microsoft Excel

In my application I have a list of items which can be exported to CSV.
For this, I create a Blob as follows:
var BOM = "\ufeff";
var blob = new Blob([csv], {
type: 'csv;charset=utf-8'
});
When the data in this list contains special characters, the exported file was not opened correctly in MS Excel. So I added a line to my code (the second line in the following snippet), as suggested in many Q&A forums:
var BOM = "\ufeff";
var csv = BOM + csv;
var blob = new Blob([csv], {
type: 'csv;charset=utf-8'
});
That works: the CSV is opened correctly in Excel. But then, when saving the file, it is saved as text and not as CSV, which means I need to "Save As" the file and change the default type if I want it to be saved correctly.
Is it really like this? Do I really have to choose between the two options, viewing the file correctly or saving it correctly?
Yes, it is a shame, but it really is like this. Excel encodes a CSV as ANSI by default, and there is no direct way to save a CSV in a Unicode encoding. Microsoft itself suggests using Notepad to change the encoding. See How to save an address book to a CSV file by using the UTF-8 encoding format so that the CSV file can be imported to Windows Mail. See also How can I save a csv with utf-8 encoding using Excel 2013?
The only other possibility is using VBA to create the CSV file with ADODB.Stream or Scripting.FileSystemObject.
How to use ADODB.Stream to create a Unicode-encoded CSV file has been answered multiple times already, for example: how to Export excel to csv file with "|" delimted and utf-8 code. Simply change the delimiter "|" to ",". This is the basic approach; you may have to extend it to provide a text qualifier as well, if the delimiter can be part of the data.
Using the CreateTextFile method of Scripting.FileSystemObject is simpler, but it only supports Unicode in the sense of UTF-16LE rather than UTF-8.
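Outside the browser/VBA context, the same BOM trick the question applies in JavaScript is a one-liner in most languages. A Python sketch (file name and rows are invented) using the utf-8-sig codec, which writes exactly the BOM that lets Excel read the CSV as UTF-8:

```python
import csv

# Sketch: "utf-8-sig" prepends the UTF-8 BOM on write. Data is made up.
rows = [["name", "city"], ["Müller", "Köln"]]
with open("items.csv", "w", newline="", encoding="utf-8-sig") as f:
    csv.writer(f).writerows(rows)
```

The resulting file opens correctly in Excel and remains a plain comma-separated file for every other consumer.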
