Microsoft Excel 2016 Unicode (UTF-8) does not work - excel

I searched all around the internet for how to save a CSV file as Unicode (UTF-8), but it still does not work; whenever I save and reopen the file, there are ????? characters instead of the letters that should be UTF-8.
Has anyone ever had this issue? How can I solve it?

This has been an annoying shortcoming of Excel for a long time.
A way to work around this issue is to do the following:
Save as... Unicode text (*.txt). Make sure to keep the extension as txt (or at least not csv). It will be saved with tabs instead of commas separating the columns.
Open the document. You will be prompted with an import wizard.
For File origin, choose 65001: Unicode (UTF-8)
For the rest of the options, choose whatever makes sense for your data.
You will have your document back, ready to edit, with the proper Unicode text intact.
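For anyone generating such files programmatically rather than re-saving them from Excel, the "Unicode text" format this workaround relies on is, as far as I know, UTF-16LE with a byte order mark and tab-separated columns. A minimal Node.js sketch of producing such a file directly (the file name and sample data are made up for illustration):
// Sketch: write a tab-separated file as UTF-16LE with a BOM, which is what
// Excel's "Unicode Text (*.txt)" format produces; Excel opens it with the
// characters intact. File name and sample data are hypothetical.
const fs = require("fs");

const rows = [
  ["Name", "City"],
  ["Đorđe", "Čačak"],
];
const text = "\uFEFF" + rows.map(r => r.join("\t")).join("\r\n");
fs.writeFileSync("report.txt", text, "utf16le");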

Related

Corrupt Text File read/write/open

I have a large text file that I take notes in. Recently, after saving it, it won't open and gives the following error. I tried a few things from the web that didn't work, such as opening it with a different encoding. Nothing worked. Any idea how I can open it again? Is there a language I can use from bash? I'm very familiar with PHP. Any ideas? A different text editor?
Error:
"The document “ToDo.txt” could not be opened. Text encoding Unicode (UTF-8) isn’t applicable."
"The file may have been saved using a different text encoding, or it may not be a text file."
cat the file from the CLI and make sure your data is still there. Then you could simply copy and paste the output into a new file and hopefully get rid of whatever weird encoding is preventing that text editor from reading the file.
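If copying and pasting is awkward for a large file, the same idea can also be scripted. For example, a minimal Node.js sketch (ToDo.txt is the file name from the error message; the output name is made up) that reads the raw bytes and rewrites them as UTF-8, so any sequences that are not valid UTF-8 become U+FFFD instead of making the editor refuse the file:
// Sketch: re-save the file as clean UTF-8; invalid byte sequences become the
// U+FFFD replacement character rather than blocking the editor.
const fs = require("fs");

const raw = fs.readFileSync("ToDo.txt");              // raw bytes, no decoding yet
const text = raw.toString("utf8");                    // invalid sequences -> U+FFFD
fs.writeFileSync("ToDo-recovered.txt", text, "utf8"); // hypothetical output name
console.log("Recovered " + text.length + " characters");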

Cannot write british pound or euro symbols to CSV file - Nodejs

I'm writing a CSV file which contains text with British pound and euro symbols; however, when I opened the file in Excel, I saw some rather odd behavior. I see a weird A-looking symbol before the British pound sign, and quotes instead of the euro symbol. I figured it's probably because Excel doesn't like a file that's UTF-8 encoded.
fs.writeFileAsync("the-file.csv", textContainingForeignCurrency, "utf8");
Does anyone know a way to get around this while creating the file? I don't want the users to have to do anything in Excel after downloading the file; I just want them to be able to open the file and see the right symbols.
There shouldn't be any problem with Node writing the symbols to the file; if you open it with a text editor you should see the correct characters.
The problem is with Excel opening UTF-8 CSV files. By default it assumes ANSI encoding, so if the file is in UTF-8, it scrambles the characters. You can open the file correctly with the text import wizard.
In general this is a limitation of Excel. The best workaround for you will depend on your OS and Excel version. This is a heavily discussed topic; here are some good reads:
Is it possible to force Excel recognize UTF-8 CSV files automatically?
Which encoding opens CSV files correctly with Excel on both Mac and Windows?
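A workaround that often comes up in those discussions, if you want the file to open correctly with a plain double-click, is to prepend a UTF-8 byte order mark when writing the file, since most recent Excel versions use it to detect UTF-8. A minimal sketch with the built-in fs module (the file name echoes the question, the sample data is made up; the question's writeFileAsync is assumed to be a promisified variant and works the same way):
// Sketch: prepend "\uFEFF" so the file starts with the UTF-8 BOM bytes
// (EF BB BF); Excel then detects UTF-8 instead of assuming ANSI.
const fs = require("fs");

const rows = "Price,Label\n£10,€-denominated\n";        // sample data with £ and €
fs.writeFileSync("the-file.csv", "\uFEFF" + rows, "utf8");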

recognising encodings in Excel

It is my understanding that txt files do not have encoding information stored, so text editors simply make educated guesses about the encoding of a given text file and then display the file on screen using that guessed encoding. If the editor guessed right, you get your text on the screen; if the editor guessed wrong, then you (sometimes) get gibberish. Am I getting this right so far?
Now on to my problem. I have my bank statements in a CSV file. When I open it in MS Excel 14 (MS Office 2010), it recognises the encoding and displays the problematic word as "obračun". Great. When I open the file in Emacs 24.3.1, it fails to recognise the correct encoding and displays the problematic word as "obra鑾n". Not so great.
My question is: how do I ask Excel which encoding the file is in? So I can tell that to Emacs since Excel obviously guessed correctly.
Thanks.
This could be a possible answer: http://metty-mathews.blogspot.si/2013/08/excel2013-character-encoding.html
After I opened ‘Advanced’ – ‘Web Options’ – ‘Encoding’, it said "Central European (Windows)" in the "Save this document as:" field. It turns out that's Microsoft's name for the Windows-1250 encoding, and my file was indeed encoded that way.
Whether this is just pure luck, or the field really shows which encoding Excel is using to display the text, I do not know.

How to manually specify a Byte Order Mark in CSV

I have a CSV that is encoded in Unicode but lacks a byte order mark at the start. As such, Excel (2013) opens it without detecting the encoding correctly (I think it assumes ASCII if no BOM is specified...), meaning that certain characters are displayed incorrectly.
From reading around, I gather that a BOM of "\uFEFF" should be placed at the start of the CSV file. I have tried opening the file in a text editor and adding the characters, e.g.
\uFEFF
r1test 1, r1text2, r1text3
r2test 1, r2text2, r2text3
However, this does not solve the problem: the characters "\uFEFF" show up in the first row when I open the file in Excel, rather than being interpreted as a BOM. I am not sure what I am doing wrong, or how the text should be specified so that it is interpreted as a BOM rather than as text in the first row of the data.
I have only very limited experience using CSV, and only just heard of a BOM... and thus I could be implementing this completely wrong!
(For reference, I know that I could specify the encoding if I use the import data option within Excel... however, I really want to work out how to get it specified correctly in advance so that I can just open the CSV... I have several thousand of these files that I am creating and exporting; once I know how to do this 'manually' [i.e. by adding some text at the start of the file], I can configure it to be done automatically in Python.)
Thanks in advance
For someone else wanting to tell Excel to add a BOM: See if you can "Save as Unicode Text".
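The detail the question is missing is that the BOM has to be the actual character U+FEFF encoded into the file's byte stream (EF BB BF for UTF-8), not the six literal characters \uFEFF typed into the text. If you are generating the files programmatically anyway, here is a minimal Node.js sketch (the question mentions Python, where the same idea of writing the U+FEFF character first applies; the rows are the sample data from the question and the file name is made up):
// Sketch: write the actual BOM bytes in front of the CSV content. "\uFEFF"
// below is the single U+FEFF character; saved as UTF-8 it becomes EF BB BF,
// which Excel reads as a BOM rather than as data.
const fs = require("fs");

const rows = [
  "r1test 1, r1text2, r1text3",
  "r2test 1, r2text2, r2text3",
].join("\n");

fs.writeFileSync("with-bom.csv", "\uFEFF" + rows, "utf8");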

How to force Excel not to drop rows when opening a CSV file?

Background:
We have a web application where the user can export orders in csv-format. For users with Microsoft Excel installed it's the default program. They simply click 'Open' after the file is downloaded. Users mainly use Internet Explorer, Firefox and Chrome. No difference in behaviour.
The problem:
If the user just opens the CSV file (from the browser or Explorer), the file is opened by Excel and the data is loaded automagically. But sometimes rows are just missing. Gone. No exception, no message, nothing.
The data is there; if you open the file with Notepad you'll see it.
(I suspect it has something to do with special chars, quotes, commas etc but I can't find a root cause for this)
How to make it work:
If you save the file to disk, open Excel, and select File -> Open -> Format: Text Files (*.prn, *.txt, *.csv, *.skv) -> Open, Excel will launch an import wizard and everything will work perfectly.
Is there anything I can do with the export file to either force the import wizard or just tell Excel not to exclude our critical order information without warning?
I think I've found the cause of this. It seems to be due to regional settings, as described in this post and in suggestions from superuser.com.
I solved it by wrapping all fields except numbers in quotes, and now it works just fine.
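A minimal sketch of that quoting rule in Node.js (the field values and helper name are made up for illustration): wrap every non-numeric field in double quotes and escape embedded quotes by doubling them, which is standard CSV escaping.
// Sketch: quote all non-numeric fields and double any embedded quotes.
function toCsvField(value) {
  const s = String(value);
  if (/^-?\d+(\.\d+)?$/.test(s)) return s;            // leave plain numbers unquoted
  return '"' + s.replace(/"/g, '""') + '"';
}

const order = ["ORD-1", 'Widget, 3" model', 19.9];    // hypothetical order row
console.log(order.map(toCsvField).join(","));         // "ORD-1","Widget, 3"" model",19.9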
Make sure that you use a consistent delimiter scheme across all lines in the file. For example, check if you have both \n and \r\n sequences as line delimiters.
