I need to rename files in my folders.
Each folder contains the following files:
1.png
2.png
3.png
I need to rename them as follows:
1.png > a.png
2.png > b.png
3.png > c.png
Thanks a lot!
Open the desired folder in TotalCmd and select the files to rename (Ctrl-A for all files).
Press Ctrl-M to open the Multi-Rename dialog.
Enter the string "[Ca]" (without the quotation marks) into the "Rename mask file name" field and click the "Start!" button.
You can replace the letter "a" in "[Ca]" with any letter or number, and the renaming sequence will start from that character (try "[Caaa]" to get output files "aaa.png", "aab.png", "aac.png", ...).
Here's what I do from the shell prompt to rename a bunch of files in a folder (here, PictsFolder).
Create a file called renamePicts.sh:
#!/bin/bash
cd /Users/jscarry/Desktop/PictsFolder
mv 1.png a.png
mv 2.png b.png
mv 3.png c.png
exit
Make it executable with chmod +x, then run it from the command line:
jscarry$ /Users/jscarry/Files/renamePicts.sh
Be sure to escape special characters like spaces in your file names.
e.g. File\ Name\ With\ Spaces.png
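If you have several folders that all contain 1.png, 2.png and 3.png, a loop saves you from editing the script for each folder. This is a minimal sketch, assuming the folders live under a hypothetical ~/Desktop/PictsFolders parent directory; adjust the path to your layout.
#!/bin/bash
# rename 1.png/2.png/3.png to a.png/b.png/c.png in every subfolder
# the parent path below is an assumption; change it to your own
for dir in ~/Desktop/PictsFolders/*/; do
    [ -f "${dir}1.png" ] && mv "${dir}1.png" "${dir}a.png"
    [ -f "${dir}2.png" ] && mv "${dir}2.png" "${dir}b.png"
    [ -f "${dir}3.png" ] && mv "${dir}3.png" "${dir}c.png"
done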
I have a list of file names contained within a text file (a.txt). I want to extract from a directory (b) the files listed in a.txt to a new directory (c). The syntax of the filenames in a.txt and b match. The files in a.txt are empty and the files in b contain the json message of interest.
For example, the contents of a.txt look like:
ML3DBHCN___005.json
OCO2_L2_Standard___10r.json
GPM_3IMERGM___06.json
and b:
b/ML3DBHCN___005.json
b/OCO2_L2_Standard___10r.json
b/GPM_3IMERGM___06.json
Do I need to write a small .sh file that iterates through a.txt and extracts from b, or can this be done in one go on the command line?
If you know the filenames don't contain whitespace or wildcard characters, you can do it as a simple one-liner:
cp $(<a.txt) b/
If they can contain special characters, you can read them into an array:
readarray -t files <a.txt
cp "${files[@]}" b/
If you want to move from b to c the files named in a.txt (and they don't have spaces or wildcards):
(cd /path/to/b && mv $(< /path/to/a.txt) /path/to/c/)
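If the names in a.txt may contain spaces or wildcard characters and you want to move them from b to c, you can combine the array form with the subshell trick above. A sketch, using the same /path/to placeholders:
readarray -t files < /path/to/a.txt
(cd /path/to/b && mv -- "${files[@]}" /path/to/c/)
# use cp -- instead of mv -- if you want to copy rather than move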
I want to change multiple different strings across all files in a folder to one new string.
The text files (all within the same directory) contain strings like this:
file1.json: "url/1.png"
file2.json: "url/2.png"
file3.json: "url/3.png"
etc.
I would need to point them all to a single PNG, i.e., "url/static.png", so all three files have the same URL inside pointing to the same PNG.
How can I do that?
You can use find and sed for this. Make sure you are in the folder containing the files you want to modify.
find . -name '*.json' -print | xargs sed -i 's|"url/[0-9][0-9]*\.png"|"url/static.png"|g'
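Note that sed -i with no argument is GNU sed syntax; on macOS/BSD sed the -i flag requires a backup suffix (which may be empty). The same command for BSD sed would look like this:
find . -name '*.json' -print | xargs sed -i '' 's|"url/[0-9][0-9]*\.png"|"url/static.png"|g'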
Suggesting a bash script:
#!/bin/bash
# for each file with extension .json in the current directory
for currFile in *.json; do
    # extract the file's ordinal (number) from the current filename
    filesOrdinal=$(echo "$currFile" | grep -o "[[:digit:]]\+")
    # use the ordinal to build the string to replace in the current file
    sed -i 's|url/'"$filesOrdinal"'\.png|url/static.png|' "$currFile"
done
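A possible way to run it (the script name replace_urls.sh is just an example):
chmod +x /path/to/replace_urls.sh
cd /path/to/json/folder
/path/to/replace_urls.sh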
I want to undo a move command that I did, by moving all the files in a folder ("MySourceDir") back to the corresponding paths specified in a .txt file ("ListOfFilePaths.txt").
For example:
MySourceDir
File1.txt
File2.txt
File3.txt
.
.
.
I have a text file containing the file paths for each of these files, indicating the directories they were originally in and thus, need to be moved back to.
ListOfFilePaths.txt
/path/to/dirA/File1.txt
/path/to/dirB/File2.txt
/path/to/dirC/File3.txt
I could probably do this in one of two ways: 1) loop over the list, grep the directory path for each file, and then move the file to that path, OR 2) strip the "File#.txt" portion from each path and run a mv command for each file in the list, so that the nth file is moved to the nth directory.
In either case I'm not familiar with writing loops in bash, so any help would be much appreciated!
(Similarly, would also appreciate a command to copy these files back to the original folder instead of moving them, keeping the timestamp unchanged :-) )
From what I understand, you need to:
Loop through the text file
Get the line, extract the text after the final slash (to give you the file name)
Get the destination dir from the line
Copy the file from the source dir to the dest dir
The code below should help:
while IFS= read -r line; do
    fileName=$(basename "$line")
    dirName=$(dirname "$line")
    cp MySourceDir/"$fileName" "$dirName"
done < ListOfFilePaths.txt
basename extracts the filename from a file path.
dirname extracts the dir from a file path.
References:
https://ss64.com/bash/dirname.html
https://linux.die.net/man/1/basename
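Since the question also asks how to copy the files back while keeping the timestamps unchanged, here is a small variant of the same loop (a sketch; the commented mv line moves instead of copying):
while IFS= read -r line; do
    fileName=$(basename "$line")
    dirName=$(dirname "$line")
    cp -p MySourceDir/"$fileName" "$dirName"   # -p preserves timestamps, mode and ownership
    # mv MySourceDir/"$fileName" "$dirName"    # use mv instead to move the files back
done < ListOfFilePaths.txt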
I have 2 loops which create some files, each containing words that begin with 0, 1, ..., 26.
(e.g.
file 0 contains:
0yes
0no
file 1 contains:
1yes
1no
....
file 26 contains:
26yes
26no)
The loop below prints the files, but the words inside them are not sorted.
file=$1
directory=$2
I made this final loop, where file is the file that contains some text, and directory is the directory that contains the file.
for i in {0..26}; do
    echo "The file $i from directory $directory contains:"
    cat "$i"
done
Where should I place the sort -d? Since these are alphanumeric words, I have to use sort -d.
Assuming the files themselves are not to be changed, just replace cat with sort -g.
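For reference, the same loop with cat swapped for sort, a sketch that leaves the files on disk untouched:
for i in {0..26}; do
    echo "The file $i from directory $directory contains:"
    sort -g "$i"    # prints the sorted contents; the file itself is not modified
done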
I found a similar post on SO, but it does not filter files by extension, so I am asking again.
I am writing a shell script to keep the last (most recent) 3 .txt files in a directory and remove all other .txt files.
For example, in the directory "Home" I have the following files:
test.txt
my.txt
image.jpg
test.avi
sample.txt
country.txt
study.txt
When I run the script, the result should be as below.
Files to keep (only the last 3 .txt files are kept; non-.txt files are left alone):
test.txt
my.txt
image.jpg
test.avi
sample.txt
Files to delete:
country.txt
study.txt
Thanks
List entries by ctime (newest first), skip the first three items, delete the rest:
ls -c *.txt | tail -n +4 | xargs rm
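If any of the .txt names might contain spaces, a slightly more defensive variant (a sketch, assuming GNU xargs for the -d and -r options; names containing newlines would still break it):
ls -c -- *.txt | tail -n +4 | xargs -d '\n' -r rm --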