Batch copy files from a text list of file names in Linux

I have a list of images in a text file in the following format:
abc.jpg
xyz.jpg
The list contains about a hundred images in various directories within a specific directory. I would like to create a shell script that finds and copies these files into a specified directory.
Can this script be adapted to what I need?
Copy list of file names from multiple directories

You do not need a script for this; a simple one-liner will do
(assuming that the full file path, or the path relative to where you are executing the command, is written in your_file.txt for every image):
cat your_file.txt | xargs -I{} find path_to_root_dir -name {} | xargs -I{} cp {} specific_directory/
xargs takes multiple lines of input and runs the command you give it once per line; with -I you can specify a placeholder for where the content of the line is placed in the command (by default, arguments are appended at the end).
So this will take every line from your file, search for that file in all subdirectories of path_to_root_dir, and then copy it.
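If you prefer a small script instead of the one-liner (for example, to handle names with spaces more predictably), a minimal sketch along the same lines, using the same placeholder paths, could be:
#!/bin/bash
# Read each image name from the list and copy every match found
# under path_to_root_dir into specific_directory/.
while IFS= read -r name; do
    find path_to_root_dir -type f -name "$name" -exec cp {} specific_directory/ \;
done < your_file.txt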

Related

Adding a line to multiple files in directory

I have a line to be added as the 3rd line of all files in a directory. What's the command line to do this operation?
Let's say I want to add "color #c2451" to the 3rd line of the files in the Class directory.
Try using the find command piped into xargs with the sed command.
You'd have to cd into the directory with the files first.
find . -type f | xargs sed -i "3i color #c2451"
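If the file names may contain spaces, a safer sketch of the same idea is to let find run sed directly instead of going through xargs (same GNU sed insert command, just passed via -exec):
find . -type f -exec sed -i "3i color #c2451" {} +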
Add text to file at certain line in Linux
Change multiple files

Linux - Match Filename passed to a script against a wildcard list in a text file and move based off that

I am trying to write a script that will perform the following. Here are the details.
1) Files will land into a staging folder
/tmp/staging
2) We need to move files to two folders
/tmp/folder1 (files we don't want but can archive)
/tmp/folder2 (files we want to process)
In order to do this, we need to match the landed file against a list of exclusions that are wildcards such as TST_, PMP_; if it matches one of them, we move the file to folder1, and if it does not match, we move it to folder2.
I have tried to set up a find command that will look through the list and move files based on what it finds (below), but what I really want is to match only the file that I pass as a parameter against the list.
FILTER=/tmp/filter.txt
STG=$1 #this is passed as a parameter which is the filename
FOL1=/tmp/folder1
FOL2=/tmp/folder2
FILE=$2
while read line
do
find "$STG" -type f -name "$line" -exec mv -t "${STG%/*}$FOL2" {} \;
done < "$FILTER"
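One possible sketch of a solution (not from the question above; it assumes the script receives just the file name as its single parameter and that /tmp/filter.txt holds one glob pattern per line, e.g. TST_*) is to let the shell's own pattern matching do the comparison instead of find:
#!/bin/bash
FILTER=/tmp/filter.txt
FOL1=/tmp/folder1     # files matching the exclusion list
FOL2=/tmp/folder2     # files we want to process
FILE=$1               # file name passed as a parameter

dest="$FOL2"
while IFS= read -r pattern; do
    # [[ name == $pattern ]] treats the unquoted right-hand side as a glob
    if [[ "$(basename "$FILE")" == $pattern ]]; then
        dest="$FOL1"
        break
    fi
done < "$FILTER"
mv "$FILE" "$dest"/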

Find a zip file, print path and zip contents

I have a series of numbered sub-directories that may or may not contain zip files, and within those zip files are some single-line .txt files I need. Is it possible to use a combination of find and unzip -p to list the file path and the single-line contents on the same output line? I'd like to save the results to a .txt and import it into Excel to work with.
From the main directory I can successfully find and output the single line:
find . -name 'file.zip' -exec unzip -p {} file.txt \;
How can I prefix the find output (i.e. the file path) to the output of this unzip command? Ideally, I'd like each line of the text file to resemble:
./path/to/file1.zip "Single line of file1.txt file"
./path/to/file2.zip "Single line of file2.txt file"
and so on. Can anyone provide some suggestions? I'm not very experienced with the Linux command line beyond simple commands.
Thank you.
Put all the code you want to execute into a shell script, then use find's -exec feature to call the shell script, i.e.
cat finder.bash
#!/bin/bash
printf '%s : ' "$1" # prints just the /path/to/file/file.zip
unzip -p "$1" file.txt
For now, just get that to work; you can make it generic later so it can pass file names other than file.txt.
Make the script executable
chmod 755 finder.bash
Call it from find. i.e.
find . -name 'file.zip' -exec /path/to/finder.bash {} \;
(I don't have an easy way to test this, so reply in comments with error msgs).
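If you would rather not keep a separate script around, the same idea can be expressed inline by having find start a small shell for each match (also untested, same assumptions as above):
find . -name 'file.zip' -exec sh -c 'printf "%s " "$1"; unzip -p "$1" file.txt' sh {} \;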

How to extract name of all files contained in a folder into a .txt file?

I want to extract the names of all files contained in a folder into a .txt file. For example, when we type the ls command in the terminal it lists all the file and folder names.
Can we store all these names in a .txt file?
You can redirect the output of the ls command to a file with > like so:
ls > files.txt
Note this will overwrite any previous contents of files.txt. To append, use >> instead like so:
ls >> files.txt
By using the > character, a command's output can be written to a named file. For example,
ls > list.txt
This will directly post the output of the ls command into the file list.txt. However, this will also overwrite anything that's in list.txt. To append to the end of a file instead of overwriting, use >>:
ls >> list.txt
This will leave the previous contents of list.txt intact, and add the output of ls to the end of the file.
If you really care about files and want to skip directories, I'd use find.
find . -mindepth 1 -maxdepth 1 -type f >> list.txt
Note that this list will also contain list.txt, as it is created before find is spawned in the current directory.
The advantage of using find instead of ls is that it will include "hidden" files (files starting with a . in their name).
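If you are using GNU find and want the bare file names without the leading ./, a sketch using -printf is:
find . -mindepth 1 -maxdepth 1 -type f -printf '%f\n' > list.txt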

Linux: Find all folders with a particular name, remove them and have a folder copied into the parent directory of those folders

I am trying to see if I can do the following with a single line of command in Linux:
I have a folder called FolderA that sits in 3 different spots on my PC. I have to run a command across a few Linux machines that finds every copy of FolderA (they could all be hidden in separate parent folders), deletes it, and copies FolderB into its place. FolderB is at a known, fixed path, say in my current directory, which is different from where the copies of FolderA are.
I know this is a lot to do, and I can roughly figure out how to use the find command to get the locations and rm -rf to remove the folders (but I don't know how to make use of the results of find), and then use cp to copy the folder. However, how can I do all of this in a single line?
Thanks!
Here, I think this should do what you want.
find / -depth -type d -name '*FolderA' -print -exec rm -rf {} \; | xargs -L 1 dirname | xargs -L 1 cp -r FolderB
The find command will search through your whole filesystem for a directory whose name ends in FolderA, print its path, and then remove it (rm -rf is used rather than -delete because -delete cannot remove non-empty directories). xargs -L 1 takes each line from the find output and calls dirname with that line as its argument; dirname takes a path and truncates the final item on the path. The last command uses xargs to put each line of output from the previous command as the destination of cp -r FolderB. Warning: this has not been tested with spaces in the path.
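Since that one-liner is untested with spaces in the path, a bash sketch of the same idea using null-delimited output (again assuming FolderB sits in the current directory) would be:
find / -depth -type d -name 'FolderA' -print0 |
while IFS= read -r -d '' dir; do
    parent=$(dirname "$dir")   # parent directory of this copy of FolderA
    rm -rf "$dir"              # remove the old FolderA
    cp -r FolderB "$parent"/   # copy FolderB next to where FolderA was
done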
