How do I write the £ (GBP) sign in a CSV file from Ruby and read it back correctly in Excel?

When I write a CSV file from Ruby containing the £ sign and open it in Excel, I see these symbols instead: ¬£.
My understanding is that Ruby uses UTF-8, but Excel interprets the file using a different encoding (ASCII).
I tried to write a US-ASCII encoded CSV file and guessed at the byte for £ like this:
csv = CSV.open(filename, 'w:US-ASCII')
csv << ["\xA3"]
csv.close
but it fails with "invalid byte sequence in UTF-8" somewhere deep inside the CSV library.
What am I doing wrong?
Thank you

For sure, Excel is not bound to use ASCII. For instance, I can easily input Japanese characters into an Excel cell, and those are certainly not representable in ASCII.
While Ruby by default uses UTF-8 for string literals, every String object carries its own encoding, so you could in theory mix strings with different encodings if you wanted to. In your case, you want to force a certain encoding when writing a file. This can be done either by appending the encoding to the mode string ('w:US-ASCII'), as you did, or by passing the option external_encoding: Encoding::US_ASCII. See the documentation of the Encoding class for the names of the available constants.
I don't think US-ASCII is a good choice for the encoding, simply because there is no pound sign anywhere in ASCII. Transcoding a pound sign to US-ASCII would raise an Encoding::UndefinedConversionError; the error you actually got arises even earlier, because the byte 0xA3 on its own is not a valid UTF-8 sequence, so the CSV library fails as soon as it handles that string as UTF-8 text. If you need an 8-bit encoding, ISO-8859-1 will do the job (it encodes £ as the single byte 0xA3), but my recommendation is to write UTF-8 and tell Excel to use this encoding when reading the CSV file. Excel has been able to import UTF-8 at least since Excel 2007.
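For the record, here is a minimal sketch of both routes (filenames and row data are invented for illustration):
require 'csv'
# Route 1: an 8-bit encoding that actually contains the pound sign.
# The second half of the mode string sets the file's external encoding,
# and Ruby transcodes the UTF-8 literal "£" (0xC2 0xA3) to 0xA3 on write.
CSV.open('prices-latin1.csv', 'w:ISO-8859-1') do |csv|
  csv << ['price', '£9.99']
end
# Route 2 (recommended): write plain UTF-8, then choose UTF-8 in Excel's
# text import wizard when opening the file.
CSV.open('prices-utf8.csv', 'w:UTF-8') do |csv|
  csv << ['price', '£9.99']
end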

Related

Exporting from Excel to CSV replaces Japanese characters with ??? even though Windows, Office locale is Japan/Japanese

I am exporting an Excel file (Excel 2016) containing Japanese characters to CSV. (Note: I am not using the provided 'CSV UTF-8' export option.) In the process, all Japanese characters are replaced with '?'.
My Windows/Office locale is Japan/Japanese, and the Windows/Office language and formats are all Japanese.
I understand that Excel uses a codepage to save the CSV file in a particular encoding. My understanding was that this should be Shift-JIS (the default encoding for a Japanese locale). If that is so, why the loss of information and the replacement with '?'?
What encoding does Excel try to save the CSV in?
(FYI: if I try to open a CSV, Excel by default attempts to read it as Shift-JIS (codepage 932), as expected.)
Note: I am aware of the workaround of using UTF-8. I am interested in understanding the behavior above, more than in a workaround.
Thanks
The character 縺 appears very often when you take a byte stream containing UTF-8 encoded hiragana characters and try to decode it as Shift-JIS (MS932) characters. FYI, CyberChef is handy for this kind of work. You will get the string まとづ…… as output from your string.
So in this situation, Excel 2016 seems to have written the CSV in UTF-8, and your text editor (or Excel 2016; how did you open the CSV?) seems to have read the CSV in Shift-JIS (MS932).
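You can reproduce the effect in a couple of lines of Ruby (a sketch; the hiragana sample is invented):
original = "まとめ"                                    # hiragana, UTF-8 encoded
relabeled = original.dup.force_encoding('Windows-31J') # reinterpret the raw bytes as Shift-JIS
puts relabeled.encode('UTF-8', invalid: :replace, undef: :replace)
# => mojibake full of 縺/繧/繝; CyberChef reverses the trip by encoding the
#    garbled text back into Shift-JIS bytes and decoding those bytes as UTF-8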

Cannot write British pound or euro symbols to CSV file - Node.js

I'm writing a CSV file which contains text with British pound and euro symbols; however, when I open the file in Excel, I see some rather odd behavior: a weird A-looking symbol before the pound sign, and quotes instead of the euro symbol. I figured it's probably because Excel doesn't like a file that's UTF-8 encoded.
fs.writeFileAsync("the-file.csv", textContainingForeignCurrency, "utf8");
Does anyone know a way to get around this while creating the file? I don't want the users to have to do anything with Excel after downloading the file; I just want them to be able to open it and see the right symbols.
There shouldn't be any problem with Node writing the symbols to the file; if you open it with a text editor, you should see the correct characters.
The problem is with Excel opening UTF-8 CSV files. By default it assumes ANSI encoding, so if the file is in UTF-8, it scrambles the characters (a sketch of exactly how follows the links below). You can open the file correctly with the text import wizard.
In general this is a limitation of Excel. The best workaround for you will depend on your OS and Excel version. This is a heavily discussed topic; here are some good reads:
Is it possible to force Excel recognize UTF-8 CSV files automatically?
Which encoding opens CSV files correctly with Excel on both Mac and Windows?
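As for what is happening to the bytes: the pound sign is 0xC2 0xA3 in UTF-8 and the euro sign is 0xE2 0x82 0xAC; read through the Windows-1252 ANSI code page, those bytes render as other characters. A one-line Ruby sketch of the effect (the mechanism itself is language-independent):
puts "£€".b.force_encoding('Windows-1252').encode('UTF-8')
# => "Â£â‚¬" -- the "weird A-looking symbol" is Â, and the low quote ‚
#    is one of the three characters Windows-1252 makes of the euro sign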

How to manually specify a Byte Order Mark in CSV

I have a CSV that is encoded in Unicode but lacks a byte order mark at the start. As such, Excel (2013) opens it without decoding correctly (I think it assumes ASCII if no BOM is specified...), meaning that certain characters are displayed incorrectly.
From reading around, I gather that a BOM of "\uFEFF" should be placed at the start of the CSV file. I have tried opening the file in a text editor and adding the characters, e.g.
\uFEFF
r1test 1, r1text2, r1text3
r2test 1, r2text2, r2text3
However, this does not solve the problem: the characters "\uFEFF" show up in the first row when I open the file in Excel, rather than being interpreted as a BOM. I am not sure what I am doing wrong, or how the text should be specified so that it is interpreted as a BOM rather than as data in the first row.
I have only very limited experience with CSV, and have only just heard of a BOM... so I could be doing this completely wrong!
(For reference, I know that I could specify the encoding via the import data option within Excel; however, I really want to work out how to get it correctly specified in advance, so that I can just open the CSV. I have several thousand of these files to create and export; once I know how to do this 'manually' [i.e. by adding some text at the start of the file], I can configure Python to do it automatically.)
Thanks in advance
For someone else wanting to tell Excel to add a BOM: see if you can "Save as Unicode Text".
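If you want the file-side fix the question asks for: the catch is that a BOM is raw bytes (0xEF 0xBB 0xBF for UTF-8), not the six literal characters \uFEFF typed into the file. A sketch of writing it programmatically, shown here in Ruby (the same two calls port directly to Python; the filename is invented):
File.open('bom-example.csv', 'w:UTF-8') do |f|
  f.write("\uFEFF")                    # the real BOM character U+FEFF
  f.puts('r1test 1, r1text2, r1text3') # rows from the question
  f.puts('r2test 1, r2text2, r2text3')
end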

Reading a txt file & opening it with Excel

Using Delphi 2007 and trying to read a txt file with Greek characters and open it with Excel, I am not getting Greek characters but symbols... Any help?
The text file is created with this code
CSVSTRList.SaveToFile('c:\test\xxx2.txt');
where CSVSTRList is a TStringList.
Looking at your code in your previous question, it seems you are taking a stock TStringList and calling SaveToFile. That encodes the text as ANSI, and your Greek characters cannot be encoded as ANSI.
You want to export text using a Unicode encoding. For instance, in a modern Delphi you would use:
StringList.SaveToFile(FileName, TEncoding.Unicode);
for UTF-16, or
StringList.SaveToFile(FileName, TEncoding.UTF8);
for UTF-8.
I would expect that Excel will understand either of these encodings.
Since you are using a non-Unicode Delphi, things are somewhat more difficult. You'll need to change all your code, every single piece of string handling, to be Unicode aware. So you cannot use the stock string list any more, for example, because it contains 8-bit ANSI strings. The simplest way to do this in a legacy Delphi is with the TNT Unicode library.
Or you could take the step of moving to a Unicode Delphi. If you care about international text it is the most sensible option.

How to write an Excel file with special characters through a Perl script?

I am writing an Excel file through Perl code. When I insert the data into an XML file and view it in any browser, I see the correct data with special characters, but when I write the same data into an Excel file, it shows garbage characters.
For example:
(word from the XML file in a browser) Gràcia - (word from the Excel file) GrÃ cia
I am using 'Spreadsheet::XLSX' for reading Excel files and 'Excel::Writer::XLSX' for writing them.
I also need help finding out the encoding of the Excel fields.
Do you have any idea? Thanks in advance.
This seems very much like a UTF-8 to ISO-8859-1 conversion going wrong: a string that contains UTF-8 bytes, but is not marked as UTF-8, is being passed to $worksheet->write(). Since http://metacpan.org/pod/Excel::Writer::XLSX#UNICODE-IN-EXCEL claims to handle Unicode correctly, it seems to be a problem with your input string, not with the write method itself.
As you don't post any code, and don't tell us where your strings come from, I can't tell why the strings aren't marked correctly.
You can probably get away with
Encode::_utf8_on($str)
before passing your strings to $worksheet->write(), but this might just as well break other things if not all of your strings really are UTF-8. Basically the answer is: get the UTF-8 flag on your strings right at the point where you read them, typically by decoding the incoming bytes with Encode::decode('UTF-8', $bytes).
