How to specify a set of strings in bash command arguments - linux

I am trying to use a type of set notation in a bash command and can't find the answer anywhere. I want to do something like:
cp combined.[txt|jpg|pdf] ~
where this would copy all files named combined that have txt, jpg, or pdf endings.
How can I do this?

From your description, it sounds like you want:
cp combined.{txt,jpg,pdf} ~
But I may be misunderstanding you.
Either way, see the Bash Reference Manual, §3.5.1 "Brace Expansion", for more information.
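As a quick sanity check, you can preview the expansion with echo, since brace expansion happens before cp ever runs (the second line is the output):
echo combined.{txt,jpg,pdf}
combined.txt combined.jpg combined.pdf
Unlike a glob, brace expansion does not check whether the files exist, so cp will report an error for any that are missing.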

Related

What is the .bashrc equivalent of $MYVIMRC?

I can get the path of the .vimrc file with echo $MYVIMRC; is there a $MYBASHRC-like approach for .bashrc?
I have tried $MYBASHRC, $BASHRC and $BASH, but all failed.
If there is one, what is it? If not, how can I define one myself?
Look in the Bash manual under startup files.
BASH_ENV seems to be closest to what you're looking for, but read the description very carefully — and compare and contrast with the other subsections under 'startup files'.
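If you just want a convenience variable analogous to $MYVIMRC, one option is to define it yourself in ~/.bashrc (MYBASHRC is a made-up name here; Bash does not set it for you):
export MYBASHRC="$HOME/.bashrc"
Then any new interactive shell can open the file with vim "$MYBASHRC".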

how to use do loop to read several files with similar names in shell script

I have several files named scale1.dat, scale2.dat, scale3.dat, ... up to scale9.dat.
I want to read these files in a do loop one by one and do some manipulation with each (write the 1st column of each scale*.dat file to scale*.txt).
So my question is: is there a way to read files with similar names? Thanks.
The regular syntax for this is
for file in scale*.dat; do
    awk '{print $1}' "$file" > "${file%.dat}.txt"
done
The asterisk * matches any text, including none; if you want to constrain the match to single non-zero digits, you could say for file in scale[1-9].dat instead.
In Bash, there is also the non-standard brace expansion scale{1..9}.dat, but this is Bash-only, and so will not work in #!/bin/sh scripts. (Your question is tagged both sh and bash, so it's not clear which you require; your comment that the Bash syntax is not working for you suggests that you may need a POSIX-portable solution.) Furthermore, Bash has something called extended globbing, which allows for quite elaborate pattern matching, as sketched below. See also http://mywiki.wooledge.org/glob
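For instance, with extended globbing enabled you can match one or more digits, something an ordinary glob cannot express (a small sketch; Bash-only, like the brace form):
shopt -s extglob
for file in scale+([0-9]).dat; do
    awk '{print $1}' "$file" > "${file%.dat}.txt"
done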
For a simple task like this, you don't really need the shell at all, though.
awk 'FNR==1 { if (f) close(f); f = FILENAME; sub(/\.dat$/, ".txt", f) }
     { print $1 > f }' scale[1-9].dat
(Okay, maybe that's slightly intimidating for a first-timer. But the basic point is that you will often find that the commands you want to use will happily work on multiple files, and so you don't need shell loops at all in those cases.)
I don't think so. Similar names or not, you will have to iterate through all your files (perhaps with a for loop) and use a nested loop to iterate through lines or words or whatever you plan to read from those files.
Alternatively, you can concatenate your files into one (say, scale-all.dat) and read that single file.
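Combining them is a one-liner (scale-all.dat is just the name suggested above):
cat scale[1-9].dat > scale-all.dat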

How can I pass all the filenames in a directory as an argument to a command in Bash?

I'm using Subliminal (a tool to find subtitles for any given media file) to get the subtitles for a bunch of TV series episodes, and right now, I'm doing it manually, for every single episode. It's a tedious process. Instead, I'd like to automate it using Bash.
Not being a Bash ninja, I tried this first:
for i in /dir/*.avi; do subliminal -l en -- "$i"; done;
But obviously, that didn't work.
subliminal also accepts multiple filenames as parameters, so the following works as well:
subliminal -l en -- file1.avi file2.avi ... filen.avi
But it's quite a lot of work to manually type and tab-complete every file name. I figured there'd be some easier way to accomplish this, maybe using xargs, but I'm not sure.
What are your ideas?
Wildcards are expanded by the shell before the command is run, so you can pass the pattern directly:
subliminal -l en -- file*.avi
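Since the shell does the expansion, you can preview exactly what subliminal will receive:
printf '%s\n' /dir/*.avi
And if the expansion could ever exceed the maximum command-line length, find can batch the arguments across as many invocations as needed (-maxdepth, which stops it descending into subdirectories, is a common extension rather than POSIX):
find /dir -maxdepth 1 -name '*.avi' -exec subliminal -l en -- {} +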

Replace pwd with USER in a file

I know that this is quite an easy thing for any advanced Vim programmer, but I have been trying to find a solution for a couple of hours now.
In my results file, there are certain lines like:
/Users/name/Project/Task1/folder1 : INFO : Random Info message
Here, /Users/name/Project/Task1/folder1 is my pwd, i.e. my present working directory.
I want to replace all the occurrences of my pwd above in the file with 'USER'. How can I do that?
:%s#/Users/name/Project/Task1/folder1#USER#g
(Using # as the delimiter avoids escaping the slashes in the path.) Or, to insert the current working directory dynamically, use the expression register; <C-r>= prompts for an expression and inserts its result on the command line:
:%s#<C-r>=getcwd()<CR>#USER#g
If I understand you correctly, you can simply use the search-and-replace functionality and escape the / character like this:
:%s/\/Users\/name\/Project\/Task1\/folder1/USER/g
If you need to replace multiple current working directories (and thus want the pwd to be dynamic), it is probably easier to use something like sed:
sed "s~$(pwd)~USER~g" < file
Note that the ~ is used as a delimiter for the command instead of the /, this way we do not need to escape the / in the path.
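If you want to change the file in place rather than writing to standard output, sed's -i option does that (results.log is a placeholder name; the attached .bak suffix keeps a backup and works with both GNU and BSD sed):
sed -i.bak "s~$(pwd)~USER~g" results.log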

Find required files by pattern and then change the pattern on Linux

I need to find all *.xml files on Linux that match a pattern, print each file name to the screen, and then change the pattern in every file that was found.
For instance, I can start the script with arguments for the keyword and for the value:
script.sh keyword "another word"
The script should find all files containing the keyword and make the following changes in them:
<keyword></keyword> should stay the same: <keyword></keyword>
<keyword>some word</keyword> should become <keyword>some word, another word</keyword>
In other words, if the value in the keyword node was initially empty I don't need to change it, and if it contains some value I need to extend it with the value I specify.
What is best way to do this on Linux? Using find, grep, sed?
Performance is also important, since there are thousands of files.
Thank you.
It seems a combination of find, grep, and sed would do this, and they are pretty fast. Since you'll be doing plain text processing, there might not be a need for XML processing; but if you could give an example or rephrase your question, I might be able to provide more help.
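In the meantime, here is a minimal sketch of that combination. It assumes the keyword is a literal tag name, that neither argument contains regex metacharacters or the | delimiter, and that each line holds at most one such element; script.sh and its two arguments come from the question, everything else is illustrative:
#!/bin/sh
# Usage: script.sh keyword "another word"
keyword=$1
value=$2

# grep -l filters down to files that contain a non-empty
# <keyword>...</keyword>; empty elements never reach sed.
find . -name '*.xml' -exec grep -l "<$keyword>..*</$keyword>" {} + |
while IFS= read -r file; do
    printf '%s\n' "$file"    # report the file name
    # Append ", $value" to the existing value (sed -i is a GNU/BSD
    # extension; the .bak suffix keeps a backup of each file).
    sed -i.bak "s|<$keyword>\(..*\)</$keyword>|<$keyword>\1, $value</$keyword>|g" "$file"
done
This stays fast on thousands of files because grep -l prunes the list before sed touches anything; for anything more structural than a literal tag, an XML-aware tool such as xmlstarlet would be safer.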
