DataGrip wrong character export to a file - jetbrains-ide

I'd like to ask how to solve a problem with DataGrip converting special characters incorrectly: Chinese and Japanese characters come out broken.
I see the correct characters in the DataGrip console after running the query, but when I export the data to a CSV file, all of these characters are broken there.

Solved.
I saved the file as CSV and then opened it with Excel, and it was Excel that displayed the wrong characters; the exported file itself was fine.
The solution is to open the file with another text editor.
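If you want to confirm that the export itself is intact, you can decode the file explicitly instead of trusting the viewer. A minimal sketch in Python, assuming DataGrip wrote UTF-8 (the file name is hypothetical):

    # Decode the exported file explicitly as UTF-8; if this prints the
    # Chinese/Japanese text intact, the export is fine and the viewer is at fault.
    with open("export.csv", encoding="utf-8") as f:  # hypothetical path
        print(f.read())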

Related

Generating a file which Excel can easily open and save

I am exporting data from a database to a file that Excel can read and save.
(CSV) I generate the CSV in the default format (per RFC 4180, comma delimiter). As expected, stupid Excel reads all the data and puts it into one cell.
(CSV with semicolon delimiter) Excel reads this one fine, but after I change some value and press save (Ctrl+S), Excel saves it as an unreadable file (well done!): no delimiters, no string separators. OK, so I tried saving it explicitly as CSV with a SEMICOLON delimiter; the saved file looks fine, but on opening it with Excel an error message appears - INCORRECT FORMAT - no cells found. Really?!
Generating a .xls file in PHP. It takes too much RAM (about 2 GB), so it can't be used.
Do you know any good format which Excel can easily open and easily save?
Thanks a lot!
This question is off-topic, but IMHO Excel 2002/2003 XML Format would be the best choice in your circumstances.
The reason for this is that the data in this format is typed - so you will not see numbers misinterpreted as dates, or phone numbers with leading zeros stripped. I am not aware of the kind of problems you describe, so I cannot say for sure how those will be affected.
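For illustration, here is a rough sketch in Python of how such a typed file could be generated; the element names are from memory of the Excel 2002/2003 SpreadsheetML format and the data is made up, so treat it as a starting point rather than a reference:

    # Sketch: emit a minimal Excel 2002/2003 XML (SpreadsheetML) workbook.
    # Each cell carries an explicit ss:Type, so Excel keeps "0123" as text
    # instead of stripping the leading zero. No XML escaping is done here.
    rows = [("Name", "Phone", "Amount"),
            ("Alice", "0123", 42)]            # made-up data

    def cell(value):
        kind = "Number" if isinstance(value, (int, float)) else "String"
        return '<Cell><Data ss:Type="%s">%s</Data></Cell>' % (kind, value)

    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"',
             '          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">',
             ' <Worksheet ss:Name="Sheet1"><Table>']
    for row in rows:
        parts.append("  <Row>" + "".join(cell(v) for v in row) + "</Row>")
    parts += [" </Table></Worksheet>", "</Workbook>"]

    with open("report.xml", "w", encoding="utf-8") as f:   # hypothetical file name
        f.write("\n".join(parts))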

Excel xlsx file saving as CSV file - Korean and Japanese characters break badly

I am trying to make a CSV file from an Excel file. It has English, Korean and Japanese inputs. Right now it's saved as file.xlsx.
But when I try to save-as CSV through Excel as file.csv, all the Korean and Japanese inputs turn into question marks (???????)
I tried importing into Google Spreadsheets and exporting out as csv from there (from reading some other solutions) but it still turns into question marks.
I tried building a CSV file from scratch and just copying/pasting values from the Excel file into the CSV, but after I save it as CSV, the characters always break.
Does anybody know how to work around this? Thank you
I don't know that there IS an answer for this. CSV has no way to declare an encoding, so that information gets lost when you save in that format.
I tried, as a test, saving Chinese characters as a Unicode Text file, and believe it or not, that worked. So you may be able to do that, and simply change the filename to CSV. Assuming for some reason you NEED the filename to be CSV.
EDIT: I just ran additional testing on this. I was able to reimport the TXT file with either a TXT or CSV extension, and the characters stayed just fine. So I think Unicode text is your answer.
Simply opening a CSV file in Excel only works when default assumptions hold. You may be writing the CSV correctly but not validating it properly.
It is more reliable to open a blank worksheet and then use Data Import. The encoding of the CSV file is one of the parameters you can specify.
To fully retain the characters when saving in CSV format, and to be able to import and re-use the data in the future, you can follow these steps (a scripted version of the same conversion is sketched after the list).
In Microsoft Excel, open the *.xlsx file.
Select Menu | Save As.
Enter any name for your file.
Under "Save as type," select Unicode Text.
Click Save.
Open your saved file in Microsoft Notepad.
Replace all tab characters with commas (","): select and copy a tab character (the space between two column headers), open the "Find and Replace" window (press Ctrl+H), paste the tab into "Find what", type a comma in "Replace with", and click Replace All.
Click Save As.
Name the file, and change the Encoding: to UTF-8.
Change the file extension from .txt to .csv.
Click Save.
Open the .csv file in Excel to view your data.
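If you do this often, the same text-file-to-CSV conversion can be scripted instead of done in Notepad. A minimal sketch in Python, assuming the "Unicode Text" export is UTF-16 and tab-delimited (file names are hypothetical):

    import csv

    # Convert Excel's "Unicode Text" export (UTF-16, tab-delimited) into a
    # UTF-8 CSV. The BOM written by "utf-8-sig" helps Excel re-open it correctly.
    with open("data.txt", encoding="utf-16", newline="") as src, \
         open("data.csv", "w", encoding="utf-8-sig", newline="") as dst:
        csv.writer(dst).writerows(csv.reader(src, delimiter="\t"))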
I had the same issue; the article below shows the workaround in detail:
https://help.salesforce.com/articleView?id=000003837&type=1
However, I decided to go with LibreOffice Calc, as it requires fewer steps to achieve the desired outcome. While exporting, you get to select the character set, field delimiter and text delimiter.
For all other tasks, I prefer Excel.
Download and install the Unicode CSV Add-in for Excel, then save the CSV from the new "Unicode CSV" menu.

Excel to Tab delimited (.txt) file with special character

I want to export Excel to Tab delimited (.txt) file with special character.
I've changed the format to UTF-8, but it still doesn't work.
The original data looks like Mädchen, but what I get is M?dchen.
Can anyone help me? Thank you
This was similarly asked and answered before: Excel to CSV with UTF8 encoding. Take a look at Eric's answer.
The best way to do it is to save the file as a Unicode text file, then re-save that text file with the encoding changed to UTF-8.
Save it as a Unicode text file. It will still be tab-delimited and will preserve all the non-English characters.
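If you only need the same tab-delimited layout re-encoded as UTF-8, that step is easy to script as well. A small sketch, assuming the export is UTF-16 (file names are hypothetical):

    # Re-encode Excel's UTF-16 "Unicode Text" export as UTF-8, keeping the tabs.
    with open("export.txt", encoding="utf-16") as src:
        text = src.read()
    with open("export_utf8.txt", "w", encoding="utf-8") as dst:
        dst.write(text)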

Opening tab-delimited text file in Excel mangles special symbols

I have a tab-delimited text file which contains dagger characters (†). When I open this in Excel 2010, they are mangled and replaced with â€ (I'm not sure if Excel is adding the space, too). Why does this occur and how can I fix it?
Right now I do search and replace in Excel to fix the â€ sequences, but it's inefficient for many files and hacky.
The original file is not using the character encoding that Excel expects.
See
Character Encoding and the â€™ Issue
Excel's Import Wizard is better at handling encoding issues and may be able to open your source file properly. See
Microsoft Excel mangles Diacritics in .csv files?
Never mind; after taking another look at Open .csv file containing special characters in Excel, I realized that instead of Right Click -> Open with Excel, if I go to File -> Open in Excel it lets me choose the encoding.

How do I export an Excel file with Chinese characters to a CSV?

I have an Excel document with a data table containing Chinese characters. I am trying to export this Excel spreadsheet to a CSV file for importing into a MySQL database.
However, when I save the Excel document as a CSV file, Notepad displays the resulting CSV file's Chinese characters as question marks. Importing into MySQL preserves the question marks, completely ignoring what the original Chinese characters are.
I suspect this may have to do with Excel's handling of UTF-8 encoding. Thanks for your help!
The following method has been tested and used to import CSV files in MongoDB, so it should work:
In your Excel worksheet, go to File > Save As.
Name the file and choose Unicode Text (*.txt) from the drop-down list next to "Save as type", and then click Save.
Open the unicode .txt file using your preferred text editor, for example Notepad.
Since the Unicode text file is tab-delimited and we want a comma-separated CSV, we need to replace all tabs with commas.
Select a tab character, right-click it and choose Copy from the context menu, or simply press Ctrl+C.
Press CTRL+H to open the Replace dialog and paste the copied tab (CTRL+V) in the Find what field. When you do this, the cursor will move rightwards indicating that the tab was pasted. Type a comma in the Replace with field and click Replace All.
Click File > Save As, enter a file name and change the encoding to UTF-8. Then click the Save button.
Change the .txt extension to .csv directly in Notepad's Save As dialog, choosing All Files (*.*) next to "Save as type".
Open the .csv file from Excel by clicking File > Open > Text Files (*.prn; *.txt; *.csv) and verify that the data is OK.
Source here
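Alternatively, the manual Notepad steps can be skipped by reading the .xlsx directly and writing a UTF-8 CSV. A sketch using the third-party openpyxl package (not part of the quoted steps; file names are hypothetical):

    import csv
    from openpyxl import load_workbook   # third-party: pip install openpyxl

    # Dump the active sheet of the workbook as a UTF-8 CSV (with BOM for Excel).
    wb = load_workbook("data.xlsx", read_only=True)
    with open("data.csv", "w", encoding="utf-8-sig", newline="") as f:
        writer = csv.writer(f)
        for row in wb.active.iter_rows(values_only=True):
            writer.writerow(row)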
As far as I know Excel doesn't save CSV files in any Unicode encoding. I have had similar issues recently trying to export a file as CSV with the £ symbol. I had the benefit of being able to use another tool altogether.
My version of Excel 2010 can export in a Unicode format (File > Save As > Unicode Text (*.txt)), but the output is a tab-delimited, UCS-2 encoded file. I don't know MySQL at all, but a brief look at the specification suggests it can handle tab-delimited imports and UCS-2, so it may be worth trying this output.
Edit: Additionally, you could always open this Unicode output in Notepad++ and convert it to UTF-8 (Encoding > Convert to UTF-8 without BOM), and possibly replace all tab characters with commas too (use the Replace dialogue in Extended Search mode, with \t in the Find box and , in the Replace box).
You might want to try Notepad++; I doubt Notepad will support Unicode characters.
http://notepad-plus-plus.org/
For some people this solution may work: https://support.geekseller.com/knowledgebase/utf-8/
When saving the CSV, go to the Tools dropdown in the lower right of the Save As dialog > Web Options > Encoding > Unicode (UTF-8).
Or this SO answer: just use Google Sheets to save the CSV as Unicode:
Excel to CSV with UTF8 encoding
I have tried all of the above methods for my data, but none of them quite work (Simplified Chinese, over 700 MB). I have tried Windows in Chinese and in English, and both English and Chinese Excel. Windows Excel does not seem to be able to save to UTF-8 even though it claims to do so: I specify the UTF-8 CSV in Save As, but when I use 'open sheet' to detect the encoding, it is not UTF-8, and not GB* either.
Here is my final solution.
(1) Download 'open sheet'.
(2) Open it properly. You can scroll through the encoding methods until you see the Chinese characters displayed in the preview window.
(3) Save it as UTF-8 (if you want UTF-8).
PS: You need to figure out the default encoding on your system. As far as I know, Ubuntu deals with UTF-8 fine, but the Windows default for Simplified Chinese starts with GB*. Even if you encode the file as UTF-8, you might still not be able to open it correctly: in my case, R could not open my UTF-8 CSV, but could open the GB* encoding.
This method works well even if your file is very large.
Another workaround is Google Sheets (but the file size may be limited). Notepad++ also works for smaller files.
You can detect the encoding by opening your file and scrolling through the encoding options until the Chinese characters display correctly.
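If you would rather not eyeball the preview, a library can make the guess for you. A sketch using the third-party chardet package (the file name is hypothetical, and the result is a heuristic guess, not a guarantee):

    import chardet   # third-party: pip install chardet

    # Sample the start of the file and let chardet guess the encoding.
    with open("data.csv", "rb") as f:
        guess = chardet.detect(f.read(200_000))
    print(guess)   # e.g. {'encoding': 'GB2312', 'confidence': 0.99, ...}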
If you are using pandas, you should save the CSV file with:
df.to_csv(file_name, encoding = 'utf_8_sig')
instead of:
df.to_csv(file_name, encoding = 'utf-8')
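The difference is that 'utf_8_sig' writes a UTF-8 byte order mark at the start of the file, which Excel uses to pick the right encoding when opening the CSV. A small self-contained sketch (the data is made up):

    import pandas as pd

    # Made-up sample containing CJK text.
    df = pd.DataFrame({"name": ["田中", "김철수"], "city": ["東京", "서울"]})

    # 'utf_8_sig' prepends a UTF-8 BOM so Excel detects the encoding correctly.
    df.to_csv("people.csv", index=False, encoding="utf_8_sig")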
