Found these answers, but from them it is not clear to me how to simply create Excel sheets.
These two, marked as answers, do create a sheet out of a given txt file, BUT the data from the two txt columns ends up in a single column of the resulting table:
How to convert a text file into excel in bash or perl
As if the tab delimiter didn't work.
This answer does the same for me:
How to write text file data into same cell of excel using bash
This one is too complicated for an amateur:
Paste output into a CSV file in bash with paste command
I just am not able to decipher and simplify the stuff.
This does the same - the columns end up merged into the first one:
#!/bin/bash
while read value; do
echo "$value"
done <tabulka.txt > test.csv
May I ask for a simple way to put the data into an xls/csv? I'm not really a bash expert, just an engineer forced to work with it. Thanks!
EDIT:
sample textfile as requested (tab as delimiter):
header1 header2
aaaa 1.0
bbbb 1.1
cccc 1.3
result:
Ok, so the first step is to convert the .txt to .csv. You didn't provide your input data, so it is hard to see how you want to split it. If we assume that your input file has a fairly regular structure and you simply want to replace the delimiter with ",", you can do it like this (but it would really be better if you clarified your question with sample data):
#!/bin/bash
# read each line as-is and turn every tab into a comma
while IFS= read -r value; do
echo "$value" | tr "\t" ","
done <tabulka.txt > test.csv
The second part is explained in the link you pasted - Paste output into a CSV file in bash with paste command
Excel can read the csv file (as can LibreOffice). This works for me:
libreoffice --calc test.csv
If your file is in any way irregular, it's not possible to guess the format; you have to show it.
Edit: I was writing my answer when you posted your input, so I'm editing it to reflect that you use tabs in the input.
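By the way, since the loop only pipes each line through tr, the whole conversion can also be done in one line. A minimal sketch, assuming the same tabulka.txt / test.csv names and a plain tab-separated input with no embedded commas or quotes:
# replace every tab with a comma in a single pass (no per-line loop needed)
tr '\t' ',' <tabulka.txt >test.csv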
Related
I have a pipe-delimited text file that also has a new-line indicator (START_OF_RECORD). The values are enclosed in single quotes, and line breaks are expected in the 5th field. Notice that the values with line breaks are still enclosed in single quotes, though.
Does Excel have a native way to handle this? As far as I know, Excel can only take a custom delimiter. It's the START_OF_NEW_LINE that is causing the issue.
Sample screenshot of the desired output, followed by the input, followed by the input as text.
|'START_OF_LINE'|'Key 1'|'Key 2'|'Key 3'|'text1
text2
text3
text4
text5'|'Date'|'END_OF_LINE'|'ID 1'|'ID 2'|'ID 3'|'ID 4'|'ID 5'|
|'START_OF_LINE'|'Key 1'|'Key 2'|'Key 3'|'text5
text6
text7
text8
text9'|'Date'|'END_OF_LINE'|'ID 1'|'ID 2'|'ID 3'|'ID 4'|'ID 5'|
I'm sure this can be hacked together with some tedious VBA, but I'm really hoping there is a better way before starting to write code. I just have no idea how to handle the new-line field using native functionality in Excel.
The case seems to be consistent. I've used Notepad++'s find-and-replace function on the text, and it seems to deliver what you need:
copy the text above and paste it into Notepad++ > replace "|\r\n|" with "|·|" **
then > replace "\n" with "\n||||"
then > replace "|·|" with "|\n"
remove the 1st "|"
copy and paste into Excel, with "|" as the delimiter.
Done.
** [Note: \r may not appear in the original file; it is there because of the copy-paste activity. Omit it if it is not applicable.]
If all the above can be executed using regex, then it is just a line of code away :)
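For example, here is a hedged sketch (not tested against your real file) of the same four replacements done with perl from the command line; \x01 stands in for the · marker, and records.txt / records_split.txt are placeholder file names:
perl -0777 -pe '
  s/\|\r?\n\|/|\x01|/g;   # step 1: mark the boundary between records
  s/\n/\n||||/g;          # step 2: pad continuation lines with 4 empty cells
  s/\|\x01\|/|\n/g;       # step 3: turn record boundaries back into newlines
  s/^\|//;                # step 4: drop the very first "|"
' records.txt > records_split.txt
Then open records_split.txt in Excel with "|" as the delimiter, just like in the manual version.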
Text to column in excel
Problem: I have data like the above (see the linked image) in Excel, with headers. I need PowerShell code to split it into columns delimited by commas. I can do it manually in Excel, but I don't want to do that every time, so any help is much appreciated.
Check code here
## PowerShell code ##
$worksheet.QueryTables.Add($TxtConnector, $worksheet.Range("A1"))
You can use the Text to Columns option in Excel itself for this.
Open the file in Excel and go to the DATA tab:
1. Select the Column
2. Data -> Text to columns
3. Choose ',' as the delimiter and click OK
4. The data is split into all the columns
Could you clarify what the input is (an xls/csv file?) and what you're trying to achieve?
From what I can understand, you can use Import-Csv -Delimiter "," to force it to divide the columns by comma.
If you need it back in a csv, you can pipe the result to Export-Csv -Path $path -UseCulture, which will use the delimiters set by your current culture. You can also use any other kind of delimiter via the -Delimiter switch.
If this isn't what you were looking for, could you paste your code in the original post instead of using an image? It would make it easier to read and test :)
I have a question about how to automate the process of copying the contents of a .srt file into a .xls file.
I want to make sure that the content of the .srt file can be pasted into the corresponding columns of the .xls (e.g. the time-in into column B; the time-out into column C; the subtitles into column E).
In order to avoid manual copying and pasting, is there a way to script this process? Any ideas?
Thank you very much in advance! :)
UPDATE: I just found that Subtitle Edit can save a .srt as csv, which can then be turned into an Excel file. That's handy! But there's another problem: I need to copy the content from this csv into another Excel template, which has a different structure, so I can't directly copy and paste the values from the csv. I'm working on how to make this easier...
I can't post images for now, but the situation is that while each time-in text in the srt-converted csv file takes up one row, the time-in text in the Excel template takes up two rows, so I can't directly copy and paste all the text from one Excel file to the other. Is there any easier way to do this? Thank you!
In a script, you can use perl to do the substitution:
perl -0777 -pe 's/\n([^\n])/\t$1/g; s/ --> /\t/g' input.srt | \
perl -ne 's/^\t//; print unless /^$/' > output.csv
For this sample input
1
00:00:01,478 --> 00:00:04,020
Srt sample
2
00:00:05,045 --> 00:00:09,545
<i>italic</i> font
3
00:00:09,378 --> 00:00:13,745
<b>bold</b> font
4
00:00:14,812 --> 00:00:16,144
Multi
Line
you get the following output:
1 00:00:01,478 00:00:04,020 Srt sample
2 00:00:05,045 00:00:09,545 <i>italic</i> font
3 00:00:09,378 00:00:13,745 <b>bold</b> font
4 00:00:14,812 00:00:16,144 Multi Line
Regarding the command: there are two chained perl commands.
The first one does the hard work: it replaces newlines and the arrows with tabs (keeping a double newline as a single newline).
The second one only does some cleanup: it removes tabs from the beginning of lines and removes redundant empty lines.
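If you would rather end up with a true comma-separated file (index, time-in, time-out and subtitle in separate columns) instead of the tab-separated output above, here is a hedged awk sketch; it assumes well-formed srt records separated by blank lines and subtitle text without double quotes, and input.srt / output.csv are placeholder names:
awk -F' --> ' '
  BEGIN { OFS = ","; q = "\"" }
  /^[0-9]+$/ { idx = $0; text = ""; next }                  # record index line
  / --> /    { tin = $1; tout = $2; next }                  # timing line
  /^$/       { if (idx != "") print q idx q, q tin q, q tout q, q text q; idx = ""; next }
             { text = (text == "" ? $0 : text " " $0) }     # subtitle lines, joined with spaces
  END        { if (idx != "") print q idx q, q tin q, q tout q, q text q }
' input.srt > output.csv
Every field is wrapped in double quotes because the srt timestamps themselves contain commas.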
In Vim I would use:
:%s/\(.\)$\n\|-->/\1\t/g | :g/^$/d | :%s=\s\+$==
I know it's still not a script, but now it should be easy to import into Excel :-)
It means: find a line ending with a character and substitute it with that character plus a tab, or find the characters --> and substitute them with a tab; then delete empty lines, and finally remove whitespace at the end of lines.
I have a simple BCP command to query my MSSQL database and copy the result to a .csv file like this:
bcp "select fullname from [database1].[dbo].employee" queryout "c:\test\table.csv" -c -t"," -r"\n" -S servername -T
The issue comes when the fullname column is a varchar containing a comma, like "Lee, Bruce". When the result is copied to the .csv file, the portion before the comma (Lee) is placed in the first column of the Excel spreadsheet and the portion after the comma (Bruce) is placed in the second column. I would like to keep everything in the first column and keep the comma (Lee, Bruce). Does anyone have any idea how to achieve this?
Obviously you should set the column separator to something other than a comma. I'm not familiar with the syntax above, but I guess -t"," and -r"\n" are the column and row separators respectively.
Further, you should either change the default CSV separator in the regional settings OR use the import wizard to place the data properly in Excel. By the way, there are plenty of similar questions on SO.
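A hedged alternative (untested here) that keeps "Lee, Bruce" in one Excel column without changing the separator: wrap the value in double quotes on the SQL side with CHAR(34), so the normal CSV quoting rules protect the embedded comma. Everything except the SELECT is unchanged from the question:
rem untested sketch: same bcp call, only the SELECT wraps fullname in double quotes
bcp "select char(34) + fullname + char(34) from [database1].[dbo].employee" queryout "c:\test\table.csv" -c -t"," -r"\n" -S servername -T
When Excel opens the resulting file, the quoted field is kept as a single cell and the quotes themselves are stripped.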
I have a sql script "example.sql":
SPOOL &1
Select '<TR>'||'<TD align="left">'||column_name||'</TD>'||'</TR>' from table1;
spool off
which dumps its output to a file. A cshell script "getdata.csh" runs it; this is how I get the data from the sql script into the csh script:
sqlplus $ORA_UID/$ORA_PSWD @${SQL}example.sql ${DATA}${ext}
Once I have extracted the data, I create an excel file by combining 3 files. header.html:
<html>
<head>
<title>
Title
</title>
</head>
<body>
<table>
<tr>
<th>Column Name</th>
</tr>
then the ext file that has the query results, and trailer.html:
</tr>
</table>
</body>
</html>
I save the combined file as .xls and send it through email as an attachment. Now my problem: column_name has data that starts with 0, but when I open the excel file the leading 0s are gone, and I want to keep them. What can I add to make sure that the emailed excel attachment still shows the leading 0 when it is opened on the other side? Any help would be appreciated.
Using Oracle:
Say your attribute is called 'number':
select '0' || to_char(number) as number
from mytable
Use the Excel object model, or a macro, to go into the Excel file, grab the column and change the formatting.
In your case:
Range("A1").NumberFormat = "@"
If you're generating the Excel file on the fly, you could prepend those numbers with an apostrophe, i.e. '.
This causes Excel to treat the number as a string. The only downside is that it might cause some side effects if the sheet has any formulas that use those numbers.
I have dealt with this issue in the past, and the problem is strictly a "feature" of Excel formatting. Unfortunately, I don't have the resources to completely test an answer, but here are two things you can try.
1. Add a step inside your cshell script to wrap your $1 value as ="value" (a quick check of the output is shown after these two suggestions):
awk '{$1= "=\"" $1 "\""; print $0}' inFile > outFile
The downside is that you're now telling Excel to treat these values as strings. If you're doing any fancy calculations on these values you may have different problems.
2. As this is really an Excel formatting problem and, as I recall, you can't retrieve the leading zero once the file has been opened and processed, I seem to remember having a trick of pre-formatting a blank worksheet, saving it as a template, and then loading the file into the template. I recall that was tricky too, so don't expect it to work. You might have to consult Excel users on the best tactics if #1 above doesn't work.
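For what it's worth, here is a quick check of what the awk step in #1 produces, using a made-up value rather than your real data:
# hypothetical sample line; the first field gets wrapped as ="0123"
echo "0123 some-other-field" | awk '{$1 = "=\"" $1 "\""; print $0}'
# prints: ="0123" some-other-field
Excel evaluates ="0123" as a formula whose result is the text 0123, so the leading zero survives.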
You might also want to tell people what version of Excel you are using, if you go to them for help.
I hope this helps.
P.S. as you appear to be a new user, if you get an answer that helps you please remember to mark it as accepted, and/or give it a + (or -) as a useful answer