Is there a grep command that allows me to grep multiple folders and copy them using a text file containing the file names - linux

So I have a text file containing the names of ~1000 folder names, and a directory with around ~30,000 folders. What I need to do is to find a bash command that will read the text file for the folder names, and grep those folders from the directory and copy them to a new destination. Is this at all possible?
I am new to coding, my apologies if this isn't worded well.

You can use a bash script like this one:
fileList=$(cat nameFile)
srcDir="/home/ex/src"
destDir="/home/ex/dest"
# iterate over every name read from the list file
for name in $fileList
do
    cp -r "${srcDir}/${name}" "${destDir}/"
done

Definitely possible, and you don't even need grep. Assuming your text file has one folder name per line:
cp -r `cat filenames.txt` path_to_copy_location/
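Note that the backtick form word-splits on whitespace, so it breaks on names that contain spaces. A minimal, more robust sketch under the same one-name-per-line assumption:
# read one name per line; quoting preserves spaces in names
while IFS= read -r name; do
    cp -r "$name" path_to_copy_location/
done < filenames.txt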

I would write:
xargs cp -r -t /destination/directory < file.of.dirnames
(-r is needed because the entries are directories; -t names the target up front so xargs can append the source names at the end)
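If any of the directory names contain spaces, a sketch using GNU xargs's -d option to split on newlines only (same file.of.dirnames assumed):
xargs -d '\n' cp -r -t /destination/directory < file.of.dirnames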

Related

How to replace an unknown string in multiple files under Linux?

I want to change multiple different strings across all files in a folder to one new string.
The strings in the text files (all within the same directory) look like this:
file1.json: "url/1.png"
file2.json: "url/2.png"
file3.json: "url/3.png"
etc.
I would need to point them all to a single PNG, i.e., "url/static.png", so all three files have the same URL inside pointing to the same PNG.
How can I do that?
You can use the find and sed commands for this. Make sure you are in the folder whose files you want to change:
find . -name '*.*' -print | xargs sed -i "s/\"url\/1.png\"/\"url\/static.png\"/g"
This replaces only the "url/1.png" string; see the generalized one-liner below for all numbered PNGs.
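To cover every numbered PNG in one pass, a sketch using an extended regular expression (assuming GNU sed/xargs and that the JSON files are the only ones to touch):
find . -name '*.json' -print0 | xargs -0 sed -i -E 's|"url/[0-9]+\.png"|"url/static.png"|g'
The -print0/-0 pair keeps file names with spaces intact.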
Suggesting a bash script:
#!/bin/bash
# for each file with extension .json in the current directory
for currFile in *.json; do
    # extract the file's ordinal from the current filename
    filesOrdinal=$(echo "$currFile" | grep -o "[[:digit:]]\+")
    # use the ordinal to identify the string and replace it in the current file
    sed -i -E "s|url/${filesOrdinal}\.png|url/static.png|" "$currFile"
done

How to copy over all files from one directory to another excluding ones that start with a given string in a bash script

I have a directory containing a large number of files (~1 GB). I need to copy over all of them except ones that start with "name" to a different directory. I tried using this: "ls src_folder | grep -v '^name' | xargs cp -t dest_folder" from this previous question In Linux, how to copy all the files not starting with a given string?
I get the following error when trying to copy over test1.txt from src_folder which contains test1.txt and name.txt to dest_folder
cp: cannot stat `test1.txt': No such file or directory
My current work around is to copy over all of the files, then use find to delete the ones starting with "name" in the dest_folder. This works, but I imagine I could save some time by only copying over the files I really want. Any suggestions?
Your original attempt failed because ls prints bare filenames without the src_folder/ prefix, so cp looked for test1.txt in the current directory. Instead, you can use the shell option extglob. This option extends bash's pattern matching, so you can use more advanced expressions.
shopt -s extglob
cp src_folder/!(name*) dest_folder
For more info run man bash and look for extglob.
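Alternatively, a find-based sketch that avoids parsing ls output altogether (assuming the same src_folder and dest_folder, and that only regular files at the top level need copying):
find src_folder -maxdepth 1 -type f ! -name 'name*' -exec cp -t dest_folder {} +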

How to read file containing a string and finding it in a directory

I have a file with a list of file names. I want to find each of those files and copy it to some directory, is this possible in linux?
ListOfFileNames.txt
xyz.txt
ags.txt
shd.txt
...
Directory_to_be_searched
dsf.txt
xyz.txt
shd.txt
...
Empty_new_directory
So copy xyz.txt, ags.txt, shd.txt and place them in Empty_new_directory
Any help will be appreciated
xargs cp -t /app/dest/ < ListOfFileNames.txt
(run it from inside Directory_to_be_searched so the bare filenames resolve). If that does not work, maybe use the find command, as sketched below.
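A minimal sketch of that find approach, assuming the directory names from the question and that each listed name matches exactly one file:
# look up each listed name under the search directory and copy any match
while IFS= read -r fname; do
    find Directory_to_be_searched -type f -name "$fname" -exec cp -t Empty_new_directory {} +
done < ListOfFileNames.txt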
If there's no nesting, then you can use a simple loop with cp:
SOURCE='Directory_to_be_searched'
TARGET='Empty_new_directory'
while IFS= read -r filename; do
    cp "${SOURCE}/${filename}" "${TARGET}/${filename}"
done < ListOfFileNames.txt

How do I copy multiple files at once in linux? With the source and the destination locations of these files being the same directory

I have some files located in one directory /home/john
I want to copy all the files with *.text extension from this directory and save them as *.text.bkup, again in the same directory, i.e. /home/john
Is there a single command with which I can do that?
Also, as an extension of the same idea, is it possible to copy all the files with multiple extensions (e.g. *.text & *.doc) as *.text.bkup & *.doc.bkup respectively (again in the same directory)?
This is best accomplished with a shell loop:
~/tmp$ touch one.text two.text three.doc four.doc
~/tmp$ for FILE in *.text *.doc; do cp "${FILE}" "${FILE}.bkup"; done
~/tmp$ ls -1
four.doc
four.doc.bkup
one.text
one.text.bkup
three.doc
three.doc.bkup
two.text
two.text.bkup
What happens in the code above is the shell gets all .text and .doc files and then loops through each value one by one, assigning the variable FILE to each value. The code block between the "do" and the "done" is executed for every value of FILE, effectively copying each file to filename.bkup.
You can achieve this easily with find (GNU find substitutes {} even when embedded in an argument):
find /home/john -iname '*.text' -type f -exec cp {} {}.bkup \;
No, there is no single/simple command to achieve this with standard tools, but you can write a script like this to do it for you:
for file in *.text
do
    cp -i "${file}" "${file}.bkup"
done
With the -i option, cp asks for confirmation before overwriting an existing file.
I sort of use a roundabout way to achieve this. It involves a Perl script and needs additional steps.
Step 1:
Copy the names of all the text files into a text file.
find -maxdepth 1 -type f -name '*.text' > file_name1.txt
Step 2:
Make a duplicate of the copied file.
cp file_name1.txt file_name2.txt
Now open file_name2.txt in the vi editor and do a simple string substitution:
%s/\.text$/.text.bkup/
Step 3: Merge the source and destination file names into a single file separated by a comma.
paste -d, file_name1.txt file_name2.txt > file_name.txt
Step 4: Run the Perl script below to achieve the desired result.
open(FILE1, "<file_name.txt") or die 'file does not exist'; # opens the file that holds the comma-separated source and destination names
chomp(@F1_CONTENTS = <FILE1>); # copies the content of the file into an array
close FILE1;
foreach $f1 (@F1_CONTENTS)
{
    @file_name = split(/,/, $f1); # splits each line on the comma
    print "cp $file_name[0] $file_name[1]\n";
    system("cp $file_name[0] $file_name[1]"); # performs the actual copy here
}

copy multiple files from directory tree to new different tree; bash script

I want to write a script that does a specific thing:
I have a txt file, e.g.:
from1/from2/from3/apple.file;/to1/to2/to3;some not important stuff
from1/from2/banana.file;/to1/to5;some not important stuff
from1/from10/plum.file;/to1//to5/to100;some not important stuff
Now I want to copy the file from each line (e.g. apple.file) from its original directory tree to the new, non-existing directory given after the first semicolon (;).
I tried a few code examples from similar questions, but nothing worked, and I'm too weak in bash scripting to find the errors.
Please help :)
I need to add some conditions:
The file not only needs to be copied, but also renamed. Example lines in file.txt:
from1/from2/from3/apple.file;to1/to2/to3/juice.file;some1
from1/from2/banana.file;to1/to5/fresh.file;something different from above
So apple.file needs to be copied, renamed to juice.file, and put at to1/to2/to3/juice.file.
I think that cp will also rename the file, but
mkdir -p "$to"
from the answer below will create the full folder path with juice.file as a folder.
In addition, after the second semicolon of each line there will be something different, so how do I cut it off?
Thanks for all the help.
EDIT: There will be no spaces in input txt file.
Try this code (assuming the updated format, where the part after the first semicolon is the full target path including the new file name):
while IFS=';' read -r from to rest
do
    mkdir -p "$(dirname "$to")" # create the parent directories of the target if they do not exist yet
    cp -i "$from" "$to" # cp renames while copying; option -i warns before overwriting
done < file
The third variable rest absorbs everything after the second semicolon, so the unimportant trailing text is cut off automatically.
Using awk
(run the awk command without the |sh first and confirm the output looks right, then add |sh to perform the copy)
awk -F";" '{printf "cp %s %s\n",$1,$2}' file |sh
Using shell (updated to create the target folder manually, based on alfe's answer):
while IFS=';' read -r from to X
do
    mkdir -p "$(dirname "$to")"
    cp "$from" "$to"
done < file
I had this same problem and used tar to solve it! Posted here:
tmpfile=/tmp/myfile.tar
files="/some/folder/file1.txt /some/other/folder/file2.txt"
targetfolder=/home/you/somefolder
tar --create --file="$tmpfile" $files # $files deliberately unquoted so each path becomes a separate argument
tar --extract --file="$tmpfile" --directory="$targetfolder"
In this case, tar will automatically create all (sub)folders for you!
Best,
Nabi
