This question already has answers here:
Sed command find and replace in file and overwrite file doesnt work, it empties the file!
Closed 10 years ago.
Ok so I've got this:
sed "s/^/getHtmlBody\(\"\/NmConsole\/Reports\/Workspace\/Virtualization\/WrVMwareHostList\/WrVMwareHostList.asp\?sGroupList=1'/g" out.bat | sed "s/$/\';--\");/g" >out.bat
And as you can see I'm trying to write the output into out.bat, but it isn't working for some reason: nothing gets displayed on the screen, but nothing is written to the file either.
What am I doing wrong?
When Bash sees "> out.bat" it truncates the file. The file is now empty, so sed doesn't find the beginning or end of any line, and nothing gets put into out.bat.
kojiro's link and an SO answer by codaddict have a nice description plus ways to get around it.
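If you want to keep it close to a one-liner, a couple of common workarounds are sketched below; prefix and suffix stand in for the long expressions from the question, and the sed -i and sponge variants assume GNU sed and moreutils respectively:
# Write to a temporary file first, then replace the original.
sed 's/^/prefix/; s/$/suffix/' out.bat > out.bat.tmp && mv out.bat.tmp out.bat
# Or let GNU sed edit the file in place, with no redirection involved.
sed -i 's/^/prefix/; s/$/suffix/' out.bat
# Or, with moreutils installed, sponge soaks up all input before writing.
sed 's/^/prefix/; s/$/suffix/' out.bat | sponge out.bat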
This question already has answers here:
Rename multiple files based on pattern in Unix
(24 answers)
Closed 2 years ago.
I have a lot of files with the same name format (???_ideal.sdf) and I need to rename them all (to ???.sdf), removing "ideal" from the name. For example:
Files:
002_ideal.sdf
ERT_ideal.sdf
234_ideal.sdf
sCX_ideal.sdf
New Files:
002.sdf
ERT.sdf
234.sdf
sCX.sdf
I thought of using a loop, but I don't know how to indicate that "_ideal" should be removed in the new file name.
For example:
for file in ???_ideal.sdf; mv $file what?
With your shown samples, could you please try the following. This will only print the rename commands on the terminal (to be on the safe side, check and make sure the commands look fine to you before actually renaming the files); to perform the actual rename, remove echo from the following.
for file in *_ideal.sdf
do
echo mv "$file" "${file/_ideal/}"
done
Based on this answer you can also write it as one line with:
rename 's/^(.*)_ideal\.sdf$/$1.sdf/' **/**
The first parameter replaces the _ideal.sdf suffix, the second one means all files.
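If the rename on your system is the Perl one (sometimes packaged as prename or file-rename), a dry run with -n lets you preview the changes before committing; a sketch, assuming the .sdf files from the question sit in the current directory:
# -n only shows what would be renamed; drop it to actually rename.
rename -n 's/_ideal(\.sdf)$/$1/' *_ideal.sdf
With the util-linux rename the syntax differs; there the equivalent would be something like rename _ideal '' *_ideal.sdf.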
This question already has answers here:
Unix command to prepend text to a file
(21 answers)
Closed 2 years ago.
Real nit picky Linux question.
I have a text file, call it usrec. I also have a string variable 'var_a'.
I want to concatenate the string value, let's just say it's 'howdy', to the top of the text file.
So something like
echo $var_a | cat usrec > file_out
where it pipes in the output of echo $var_a, puts it at the top of file_out, and then adds the rest of the usrec file.
So if the usrec file contains just the line 'This is the second line', then the contents of file_out should be:
howdy
This is the second line.
The problem is that's not what the command is doing, and I do not want to create a temporary file to store var_a in. This is running from a script and I don't want any extra files to have to clean up afterwards.
I've tried other variations and I'm coming up empty.
Can anyone help me?
If you give cat any file names then it does not automatically read its standard input. In that case, you must use the special argument - in the file list to tell it to read the standard input, and where to include it in the concatenated output. Since apparently you want it to go at the beginning, that would be:
echo $var_a | cat - usrec > file_out
I would simply do :
echo $var_a > file_out
cat usrec >> file_out
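If you'd rather not redirect twice, a group command produces the same file_out in one go; a small sketch using the same usrec and var_a names:
# Braces group the two commands so a single redirection captures both outputs.
{ echo "$var_a"; cat usrec; } > file_out
# printf is a bit safer than echo if var_a could start with a dash or contain backslashes.
{ printf '%s\n' "$var_a"; cat usrec; } > file_out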
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 5 years ago.
I have a text file of 75000 items, 2 lines for each item. Line 1 has an identifier, line 2 a text string.
I need to remove 130 items, random identifiers that I have in a list or can put in a file.
I can carry out the removal for one item, but not for more than one.
I tried piping the identifiers and get an empty output file.
I tried repeated commands of sed -e 'expression' inputfile > outfile. This works, but requires a new output file that then becomes the input file for the next iteration, and so on. This might be the last resort.
I tried sed -i in an iteration; this crashes, and the error says there is no file by the name of the input file, which is clearly not the case, as I can see it, ls it, and grep the number of identifiers in it. Only sed can't seem to read it.
I even found a python/biopython script online for this exact problem, it is very simple and does not give error messages, but it also removes only the first item.
I think it has something to do with file properties/temporary files that don't really exist (?).
I am using Ubuntu 12.04 'Precise'.
How can I get around this issue?
Quick and dirty (no check that the modified file is actually created, etc.).
sed
Assuming there is no special metacharacter in your pattern list:
sed 's#.*#/&/{N;d;}#' YourListToExclude > /tmp/exclude.sed
sed -f /tmp/exclude.sed YourDataFile > /tmp/YourDataFile.tmp
mv /tmp/YourDataFile.tmp YourDataFile
rm /tmp/exclude.sed
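To make the first step concrete: if YourListToExclude contained the (hypothetical) identifiers ID001 and ID042, the generated /tmp/exclude.sed would look like this, where each rule deletes the matching identifier line together with the line after it:
/ID001/{N;d;}
/ID042/{N;d;}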
awk
awk 'FNR==NR{ex=(ex==""?"":ex"|")$0;next}$0!~ex{print;getline;print;next}{getline}' YourListToExclude YourDataFile > /tmp/YourDataFile.tmp
mv /tmp/YourDataFile.tmp YourDataFile
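For readability, the same awk logic can be spelled out over several lines with comments; a sketch equivalent to the one-liner above:
awk '
  # First file: build one alternation regex out of all identifiers to exclude.
  FNR == NR { ex = (ex == "" ? "" : ex "|") $0; next }
  # Data file: the identifier line does not match the exclusion regex,
  # so print it and the text line that follows it.
  $0 !~ ex { print; getline; print; next }
  # The identifier matched: skip its text line as well.
  { getline }
' YourListToExclude YourDataFile > /tmp/YourDataFile.tmp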
This question already has answers here:
How to concatenate string variables in Bash
(30 answers)
Closed 9 years ago.
I am trying to download files from a database using wget and url. E.g.
wget "http://www.rcsb.org/pdb/files/1BXS.pdb"
So the format of the URL is as follows: http://www.rcsb.org/pdb/files/($idnumber).pdb
But I have many files to download, so I wrote a bash script that reads id numbers from a text file, forms the URL string, and downloads it with wget.
#!/bin/bash
while read line
do
url="http://www.rcsb.org/pdb/files/$line.pdb"
echo -e $url
wget $url
done < id_numbers.txt
However, the URL string is formed as
.pdb://www.rcsb.org/pdb/files/4H80
So the http at the beginning gets replaced with .pdb. I cannot figure out why. Does anyone have an idea?
How can I format it so the URL is
"http://www.rcsb.org/pdb/files/($idnumber).pdb"
?
Thanks a lot.
Note: This question was marked as a duplicate of 'How to concatenate strings in bash?' but I was actually asking for something else. I read that question before asking this one, and it turns out my problem was with preparing the txt file in Windows, not really string concatenation. I edited the question title. I hope it is clearer now.
It sounds like your id_numbers.txt file has DOS/Windows-style line endings (carriage return followed by linefeed characters) instead of plain unix line endings (just linefeed). The result is that read thinks the line ends with a carriage return, $line actually has a carriage return at the end, and that gets embedded in the url, causing various confusion.
There are several ways to solve this. You could have bash trim the carriage return from the variable when you use it:
url="http://www.rcsb.org/pdb/files/${line%$'\r'}.pdb"
Or you could have read trim it by telling it that carriage return counts as whitespace (read will trim leading and trailing whitespace from what it reads):
while IFS=$'\r' read line
Or you could use a command like dos2unix (or whatever the equivalent is on your OS) to convert the id_numbers.txt file.
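Putting the carriage-return trimming together with quoting, the whole loop might look like this; a sketch based on the script in the question:
#!/bin/bash
# Treat a carriage return as whitespace so read strips it from each id.
while IFS=$'\r' read -r line
do
    url="http://www.rcsb.org/pdb/files/${line}.pdb"
    echo "$url"
    wget "$url"
done < id_numbers.txt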
The echo -e option enables interpretation of backslash escapes in the output; you do not need it here.
Also, I suspect your file containing the ids is malformed; on which OS did you create it?
Anyway, you can simplify your script this way:
#!/bin/bash
while read line
do
wget "http://www.rcsb.org/pdb/files/$line.pdb"
done < id_numbers.txt
I was able to successfully test it with an id_numbers.txt file generated like so:
for i in $(seq 0 9) ; do echo "$i" >> id_numbers.txt ; done
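Incidentally, a quick way to check whether the real id_numbers.txt has Windows line endings (this assumes the file command and GNU coreutils cat are available):
file id_numbers.txt             # reports "with CRLF line terminators" for a DOS/Windows file
cat -A id_numbers.txt | head    # lines ending in ^M$ carry a trailing carriage return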
Try this:
url="http://www.rcsb.org/pdb/files/"$line
url=$url".pdb"
For more info, check How to concatenate string variables in Bash?
This question already has answers here:
recursively “normalize” filenames
Closed 12 years ago.
Q on pastebin: http://pastebin.com/raw.php?i=19iYZpwY
I mean getting rid of special chars in filenames, etc.
I have made a script that can recursively rename files [http://pastebin.com/raw.php?i=kXeHbDQw]:
e.g.: before:
THIS i.s my file (1).txt
after running the script:
This-i-s-my-file-1.txt
But when I wanted to test it "fully", with filenames like these [http://pastebin.com/raw.php?i=LQ07ntcS]:
¤¥¦§¨©ª«¬®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÂÃÄÅÆÇÈÊËÌÎÏÐÑÒÔÕ×ØÙUÛUÝÞßàâãäåæçèêëìîïðñòôõ÷øùûýþÿ.txt
áíüűúöőóéÁÍÜŰÚÖŐÓÉ!"#$%&'()+,:;<=>?#[]^_{|}~€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ¡¢£.txt
It fails [http://pastebin.com/raw.php?i=iu8Pwrnr]:
$ sh renamer.sh directorythathasthefiles
mv: cannot stat `./áíüűúöőóéÁÍÜŰÚÖŐÓÉ!"#$%&\'()+,:;<=>?#[]^{|}~€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ¡¢£': No such file or directory
mv: cannot stat `./áíüűúöőóéÁÍÜŰÚÖŐÓÉ!"#$%&\'()*+,:;<=>?#[]^{|}~€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ¡¢£': No such file or directory
...and so on
$
so "mv" can't handle special chars.. :\
i worked on it for many hours..
does anyone has a working one? [that can handle chars [filenames] in that 2 lines too?]
Reading that script was almost painful...
For one, you should read this.
Then you should read about bash functions. After that you should read about sed and tr.
Then you should consider this: do you really want to move the file each time that you perform a transformation on its name?
Then after all this thinking, you should come up with something a bit saner.
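For instance, a saner structure computes the cleaned-up name with a single pipeline and then calls mv once per file. This is a rough sketch only, not a drop-in replacement for the pastebin script: the single tr call stands in for whatever substitutions that script performs, there is no collision check, and it assumes bash plus a find that supports -print0:
#!/bin/bash
find "$1" -depth -type f -print0 | while IFS= read -r -d '' f; do
    dir=$(dirname "$f")
    base=$(basename "$f")
    # Keep letters, digits and dots; squeeze everything else into single dashes.
    new=$(printf '%s' "$base" | tr -cs '[:alnum:].' '-')
    [ "$base" != "$new" ] && mv -- "$f" "$dir/$new"
done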
WTF is going on with your system? You should consider setting it up again and paying attention to sane applications and security.
However, it's very likely that you are just running into the maximum length of command arguments, from the look of it.
If not, install UTF-8 locales and set them as the system default.
On Debian-based systems this is usually just a matter of dpkg-reconfigure locales.
Also, work on your accept rate.