I have around 120 folders with subdirectories, but they all contain one file: a "download.txt" file that is created in its own directory by 120 scripts executing in parallel.
I want to make all these scripts share the same file, but I need to merge or "cat" (I believe cat can be used for this purpose) them first.
What would be the best way to do this in bash/shell?
After the files are closed by writers:
find root_dir -name download.txt -exec cat {} \; > merged_download.txt
where the root_dir is the path to the parent of those 120 directories.
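If your find supports the POSIX + terminator to -exec, cat is invoked once per batch of files instead of once per file, which is noticeably faster over 120 directories:
find root_dir -name download.txt -exec cat {} + > merged_download.txt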
If these txt files all sit at the same depth in the folder structure, such as always in folder1/folder2/download.txt, then you can run a simple glob:
cat */*/download.txt > merged_download.txt
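If the nesting depth varies, the globstar option covers any depth; a small sketch assuming bash 4.0 or later:
shopt -s globstar
cat root_dir/**/download.txt > merged_download.txt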
We have an Ubuntu Server that is only accessed via terminal, and users transfer files to directories within one parent directory (e.g. /storage/DiskA/userA/doc1.doc, /storage/DiskA/userB/doc1.doc). I need to copy all the specific files within the user folders to another dir, and I'm trying to specifically target the .doc extension.
I've tried running the following:
cp -R /storage/diskA/*.doc /storage/diskB/monthly_report/
However, it keeps telling me there is no such file/dir.
I want to be able to just pull the .doc files from all the user dirs and transfer to that dir, /storage/monthly_report/.
I know this is an easy task, but apparently, I'm just daft enough to not be able to figure this out. Any assistance would be wonderful.
EDIT: I updated the original to show that I have 2 Disks. Moving from Disk A to Disk B.
I would go for find -exec for such a task, something like:
find /storage/DiskA -name "*.doc" -exec cp {} /storage/DiskB/monthly_report/ \;
That should do the trick.
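If GNU coreutils is available, cp's -t option puts the target directory first, so find can batch many files into each cp call; a variant of the same idea:
find /storage/DiskA -name "*.doc" -exec cp -t /storage/DiskB/monthly_report/ {} +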
Use rsync:
rsync -zarv --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/
Note that the include/exclude filter rules need a directory as the source (not a *.doc glob), and rsync will recreate the user subdirectories under monthly_report/ rather than copying the files flat.
I have multiple files in different folders with the same name and extension. For example: there are 460 folders, and each folder has one file named snps.vcf. I want to copy/move these files to one folder, and later on I will do some analysis on them.
I have tried:
find -type f -name "*.vcf" -exec cp {} /home/AWAN/try';'
but this command overwrites the files, and only one file remains in the end.
I have tried rename, but I don't know how to select multiple files with find and then rename them. Even with mmv I couldn't find a solution.
You need to write an external script and pass it to -exec.
Your script may use mktemp to generate a random file name. Example:
mktemp /your/directory/try-XXX
The XXX part will be replaced by mktemp with a different value for each call (the template needs at least three consecutive X's).
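A minimal sketch tying the two together, assuming the target directory from the question (/home/AWAN/try); the inline script copies each file found to a fresh unique name:
find . -type f -name '*.vcf' -exec sh -c '
    for f; do
        dest=$(mktemp /home/AWAN/try/snps-XXXXXX)   # unique name per file
        cp "$f" "$dest"
    done
' sh {} +
The copies lose the .vcf suffix this way; GNU mktemp's --suffix=.vcf option can restore it.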
I have a bunch of files in separate folders, and all of the folders are in one directory.
/var/www/folder1/file1.txt
/var/www/folder1/file2.txt
/var/www/folder1/file3.txt
/var/www/folder2/file4.jpg
/var/www/folder2/file5.jpg
/var/www/folder2/file6.jpg
/var/www/folder3/file7.pdf
/var/www/folder3/file8.doc
/var/www/folder3/file9.gif
I need everything inside of the folders that are inside of /var/www/ to be copied to another directory (say, /var/my-directory/), but not the actual folders. Based on the example above, I need /var/my-directory/ to look as follows:
/var/my-directory/file1.txt
/var/my-directory/file2.txt
/var/my-directory/file3.txt
/var/my-directory/file4.jpg
/var/my-directory/file5.jpg
/var/my-directory/file6.jpg
/var/my-directory/file7.pdf
/var/my-directory/file8.doc
/var/my-directory/file9.gif
I can't seem to figure out the command to do this. I've tried the following:
sudo cp -R /var/www/./. /var/my-directory/
But, that still copies all of the folders.
Is there any way to do what I'm trying to do?
Use find.
find /var/www/ -type f -exec cp '{}' /var/my-directory/ \;
The trick is -type f, which selects only regular files.
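One caveat: files that share a name in different folders will silently overwrite each other at the destination. With GNU cp, --backup=numbered keeps every copy instead; a sketch assuming GNU coreutils:
find /var/www/ -type f -exec cp --backup=numbered '{}' /var/my-directory/ \;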
There are some .app files in a folder, such as:
folder_A/1.app
folder_A/2.app
folder_A/subF/3.app
folder_A/subF/3.txt
I want to use the ls command to check whether there are any .app files under folder_A. I can use ls -R folder_A to list all the files under folder_A and the subfolder subF, but on macOS an .app file is also treated as a directory, so ls lists all the files contained in 1.app, 2.app, and so on.
For example, 1.app contains some .png and .txt files; ls -R folder_A will then return all those png and txt files rather than 1.app itself. I want to list all the .app files under folder_A and its subfolders without listing the files inside each .app.
The trick is to use the right tool for the job.
find folder_A -name '*.app'
The find command is better suited for traversing a directory hierarchy.
find can easily be used to search for specific files:
find folder_name -name "*.app" -print
folder_name can be an absolute path or a relative path.
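One refinement worth noting for the macOS bundle case: since an .app is itself a directory, find will also descend into it and report any .app nested inside. Adding -prune stops the descent at the first match:
find folder_A -name '*.app' -prune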
I have a program that extracted files to a series of sub-folders each with a "header" sub-folder. For example:
/share/Videos/Godfather.Part.1
/share/Videos/Godfather.Part.1/<packagename>
/share/Videos/Godfather.Part.1/<packagename>/Godfather.avi
/share/Videos/Godfather.Part.2
/share/Videos/Godfather.Part.2/<packagename>
/share/Videos/Godfather.Part.2/<packagename>/Godfather2.avi
I'd like to take the files in the specified folder <packagename> and move them up one directory so that the file structure looks like this:
/share/Videos/Godfather.Part.1
/share/Videos/Godfather.Part.1/Godfather.avi
/share/Videos/Godfather.Part.2
/share/Videos/Godfather.Part.2/Godfather2.avi
How can I accomplish this from the bash command line? Mind you, this is an example using 2 folders; I have hundreds like this.
Share and enjoy.
find . -name '*.avi' -print0 |
while IFS= read -r -d '' i; do
    dest=$(dirname "$i")    # the <packagename> directory
    mv "$i" "$dest"/..      # move the file up one level
done
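An equivalent one-liner, assuming your find supports -execdir (GNU and BSD find both do), which runs mv from each file's own directory:
find . -name '*.avi' -execdir mv {} .. \;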