How to rename files listed in one txt file to the corresponding names in another txt file in bash? [closed]

I have some filenames in one .txt file and the corresponding new filenames in another .txt file. I want to rename the files stored in a folder, taking the old and new names from the two text files line by line.
FROM:
$ cat oldname.txt
file1.mp4
file2.mp4
TO:
$ cat newname.txt
video1.mp4
video2.mp4
I want a bash script that runs an mv command for each pair of lines, like:
$ mv file1.mp4 video1.mp4

Use a loop over both files in bash, opening each file on its own file descriptor.
#!/bin/bash
# Read one line from each file per iteration via file descriptors 3 and 4
while read -r oldname <&3 && read -r newname <&4
do
    mv -- "$oldname" "$newname"
done 3<oldname.txt 4<newname.txt

Try:
while read -r oldname; do
    read -r -u 3 newname
    echo mv "$oldname" "$newname"
done < oldname.txt 3< newname.txt
This will merely echo the commands. If you like the result, omit the echo.
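For comparison, here is a minimal sketch that pairs the two files with paste instead of separate file descriptors; it assumes none of the filenames contain a tab or a newline:
paste oldname.txt newname.txt | while IFS=$'\t' read -r oldname newname; do
    mv -- "$oldname" "$newname"
done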

Related

How do I copy grep output to another text file in a different directory? [closed]

I am trying to copy specific words from a text file in one directory to a file in another directory using grep. I can already retrieve the words I want from the text file; now I am wondering how to write them to another text file, say in my home directory.
Here is the grep command.
grep -E '^.[*ing]{5}$' words
and here is what I have tried
grep -E '^.[*ing]{5}$' words > words.txt $home
grep -E '^.[*ing]{5}$' words > words.txt /home/
Any help is appreciated!
grep -E '^.[*ing]{5}$' words > /home/words.txt
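The target of a > redirection is simply a path, so point it at any directory you can write to. A small variant using $HOME (assuming you want words.txt in your own home directory rather than directly under /home):
grep -E '^.[*ing]{5}$' words > "$HOME/words.txt"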

Inserting text into a filename using a Linux shell script [closed]

I have a couple of files that need to be renamed using a Linux shell script. I need to insert the text "ID" after the second character of each filename:
HP0001.txt
HP0002.txt
HP0003.txt
The script should rename them to
HPID0001.txt
HPID0002.txt
HPID0003.txt
If it's just a "couple of files", it's probably easiest to just rename them manually.
Otherwise, here's a trivial script you can adapt:
for f in HP*.txt; do
    f2=$(echo "$f" | sed -e 's/HP/HPID/')
    echo mv "$f" "$f2"
done
SAMPLE OUTPUT:
$ ls HP*
HP001.txt HP002.txt HP003.txt
$ bash ./tmp.sh
mv HP001.txt HPID001.txt
mv HP002.txt HPID002.txt
mv HP003.txt HPID003.txt
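If you would rather avoid sed, a pure-bash sketch using substring expansion does the same job by splicing "ID" in after the first two characters (assuming every filename starts with a two-character prefix you want to keep):
for f in HP*.txt; do
    mv -- "$f" "${f:0:2}ID${f:2}"
done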
Files in the directory before script execution:
HP0001.txt
HP0002.txt
HP0003.txt
main
Main script content:
$ cat main.sh
for file in *
do
    new_name=$(echo "$file" | sed 's/HP/&ID/')
    mv "$file" "$new_name" 2>/dev/null
done
Directory contents after script execution:
$ ls -1
HPID0001.txt
HPID0002.txt
HPID0003.txt
main
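If the Perl rename utility is available (File::Rename, the rename shipped by default on Debian and Ubuntu; other distributions ship a rename with a different syntax, so treat this as an assumption about your environment), a one-liner is enough:
rename 's/^HP/HPID/' HP*.txt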

Why does `ls` list multiple files per line, but `ls` piped or redirected lists just one file per line? [closed]

Just curious: this is the normal, expected behavior of ls:
user@host:~$ ls
Codes Documents Music Pictures Templates
Desktop Downloads Papers Public Videos
But when I use ls with a pipe or redirection, it behaves like ls -1:
user@host:~$ ls | cat
Codes
Desktop
Documents
Downloads
Music
Papers
Pictures
Public
Templates
Videos
Why? (And how would I write a program that produces different output depending on whether stdout is a terminal or a pipe, like ls does?)
P.S. I also set an alias l='ls -F', and with the alias the piped/redirected output is no longer in ls -1 style:
user@host:~$ l | cat
Codes/ Documents/ Music/ Pictures/ Templates/
Desktop/ Downloads/ Papers/ Public/ Videos/
Without the alias, however, the command does produce ls -1 style output:
$ ls -F | cat
Codes/
Desktop/
Documents/
Downloads/
Music/
Papers/
Pictures/
Public/
Templates/
Videos/
You can check this line from the source:
if (format == long_format)
  format = (isatty (STDOUT_FILENO) ? many_per_line : one_per_line);
It uses the isatty function to check whether stdout refers to a terminal and selects many_per_line if it does, or one_per_line if it does not.
Here is how GNU ls does it (ls.c):
if (isatty (STDOUT_FILENO))
  {
    format = many_per_line;
  }
else
  {
    format = one_per_line;
  }
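To answer the second part of the question, a script can make the same decision itself. A small bash sketch using the test operator -t, which checks whether a file descriptor is attached to a terminal:
#!/bin/bash
# -t 1 is true when file descriptor 1 (stdout) is attached to a terminal
if [ -t 1 ]; then
    echo "stdout is a terminal: print several items per line"
else
    echo "stdout is a pipe or a file: print one item per line"
fi
Running the script directly prints the first message; running it as script | cat prints the second.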

Trimming linux log files [closed]

It seems like a trivial issue, but I did not find a solution.
I have a number of log files in a PHP installation on Debian/Linux that tend to grow quite a bit, and I would like to trim them nightly to the last 500 lines or so.
How do I do that, ideally in shell, applying a command to *log?
For this, I would suggest using logrotate with a configuration to your liking instead of writing your own script.
There might be a more elegant way to do this programmatically, but it is possible to use tail and a for-loop for this:
for file in *.log; do
    tail -n 500 "$file" > "$file.tmp"
    mv -- "$file.tmp" "$file"
done
If you want to save history of older files, you should check out logrotate.
Otherwise, this can be done trivially with the command line:
LOGS="/var/log"
MAX_LINES=500
find "$LOGS" -type f -name '*.log' -print0 | while read -d '' file; do
tmp=$(mktemp)
tail -n $MAX_LINES $file > $tmp
mv $tmp $file
done
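Since the trimming should happen nightly, either loop can be saved as a script and scheduled with cron. A sketch of a crontab entry; the path /usr/local/bin/trim_logs.sh is only a placeholder for wherever you store the script:
# run every night at 00:30 (add via crontab -e)
30 0 * * * /usr/local/bin/trim_logs.sh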

Appending the contents of a file at the beginning of another file in UNIX [closed]

I know that
cat file1 >> file2
would append the contents of file1 at the end of file2. How can I instead place the contents of file1 at the beginning of file2, rather than at its end?
Actually, I have a single master file M and several other files in a directory D. I want to prepend the contents of file M to all of the files in directory D.
Just do:
cat file1 file2 > tmp && mv tmp file2
For each file you could do:
cat MASTER file >> file.tmp
And then move file.tmp over file.
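Applied to the actual use case of a master file M and a directory D, that per-file approach becomes a short loop. A sketch, with M and D standing in for your real paths:
for f in D/*; do
    [ -f "$f" ] || continue    # skip anything that is not a regular file
    cat M "$f" > "$f.tmp" && mv -- "$f.tmp" "$f"
done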
You will have to use a temporary file and rename it after the merge.
Example:
echo -e "a\nb\nc" > LETTERS
echo -e "1\n2\n3" > NUMBERS
cat NUMBERS LETTERS > TMP
mv TMP LETTERS
cat LETTERS
Your command might look something like:
for file in $(find . -name "*.java"); do cat PREPEND "$file" > "$file.tmp"; mv "$file.tmp" "$file"; done
