How to extract all .tgz files in subdirectories? [closed] - linux

I have a list of *.tgz files under a ./main/ directory, such as:
$ tree -L 3
.
├── 191009-Grace_587_8G_R2
│   └── Grace_587_8G_R2
│       └── output.tgz
├── 191009-Grace_V0G_R2
│   └── Grace_V0G_R2
│       └── output.tgz
├── 191009-Grace_V8G_R2
│   └── Grace_V8G_R2
│       └── output.tgz
├── 191014-Grace_587_0G_R2
│   └── Grace_587_0G_R2
│       └── output.tgz
├── 191014-Grace_587_8G_R2
│   └── Grace_587_8G_R2
│       └── output.tgz
├── 191014-Grace_V0G_R2
│   └── Grace_V0G_R2
│       └── output.tgz
└── 191014-Grace_V8G_R2
    └── Grace_V8G_R2
        └── output.tgz
I am wondering how to extract them all at once, each into the directory that contains it.

Using tar's -C / --directory option:
-C, --directory=DIR
Change to DIR before performing any operations. This option is order-sensitive, i.e. it affects all options that follow.
for i in *-Grace_*/Grace_*/output.tgz; do
    tar xzf "$i" --directory="${i%/*}"
done
The parameter expansion ${i%/*} removes the filename from the path (like the dirname command). To extract the files to the main directory, remove the --directory option.
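For illustration (a quick sketch, not part of the original answer), this is how that expansion behaves on one of the paths above:
i=191009-Grace_587_8G_R2/Grace_587_8G_R2/output.tgz
echo "${i%/*}"              # 191009-Grace_587_8G_R2/Grace_587_8G_R2  (same result as: dirname "$i")
tar xzf "$i" -C "${i%/*}"   # extracts output.tgz next to the archive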
Using find with the -execdir option:
find . -type f -name 'output.tgz' -execdir tar xfz {} +
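-execdir runs tar from the directory containing each matched output.tgz, so the contents land next to the archive and no --directory option is needed. If you want to preview which directories would be touched before extracting anything, a cautious sketch (not from the original answer):
find . -type f -name 'output.tgz' -execdir pwd \;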

Related

Script to move files from sub directories to root folder [closed]

I have the following directory and sub directories structure:
eva tree
.
├── 1061
│   └── 2022
│       └── 09
│           └── 21
│               └── a0e51f58-5057-4002-b4c4-d3fb870e9b3a.json
├── 1769
│   └── 2022
│       └── 08
│           └── 30
│               └── e36d8e21-5184-489f-89b5-eb1fd5eba5f6.json
├── 1991
│   └── 2022
│       └── 09
│           └── 16
│               └── 1d0a4162-7e66-44c8-8b61-f3bc5dbdb107.json
I need to move all .json files to the root folder eva.
Expected output:
.
├── a0e51f58-5057-4002-b4c4-d3fb870e9b3a.json
├── e36d8e21-5184-489f-89b5-eb1fd5eba5f6.json
└── 1d0a4162-7e66-44c8-8b61-f3bc5dbdb107.json
How can I do it using bash?
You can use find . -type f -name '*.json' -exec mv {} . \;
find . searches the current directory recursively;
-type f matches only regular files;
-name '*.json' matches file names that end with .json;
-exec [COMMAND] \; runs [COMMAND] for each file found;
mv {} . — the curly brackets {} stand for the found file, and the command moves it to the current directory (.).
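A slightly more efficient variant batches the moves instead of spawning one mv per file; this is a sketch assuming GNU findutils/coreutils (for -mindepth and mv -t) and that no two files share a name:
# -mindepth 2 skips files already sitting in the root; -exec ... + passes many files to a single mv
find . -mindepth 2 -type f -name '*.json' -exec mv -t . {} +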

Is it possible to compress files without keeping the structure in Linux? [closed]

I have the following directory structure:
A0
├── A1
│   ├── A1_B1
│   │   ├── A1_B1_1.docx
│   │   └── A1_B2_2.pptx
│   └── A1_B2
│       └── A1_B2_C1
│           ├── A1_B2_C1_D1
│           │   ├── A1_B2_C1_D1_1.docx
│           │   └── A1_B2_C1_D1_2.docx
│           └── A1_B2_C1.xlsx
├── A2
└── A0.txt
I want to create a .7z file that contains only the files, without keeping the folder structure. I have tried this answer and this answer, but they don't work on Linux.
Is it possible to do this with 7z, or should I extract the files into a single directory first and then compress them?
If for some reason the answers you reference don't work, try this instead.
Create a directory:
mkdir flat_dir
Link all files from the desired folder recursively into flat_dir (run the loop from inside flat_dir; for me the desired folder was cpptest):
for full_path_file in $(find ../cpptest -type f)   # note: word splitting breaks on file names with spaces
do
    echo "$full_path_file"
    filename=$(echo "$full_path_file" | rev | cut -d '/' -f 1 | rev)   # same as: basename "$full_path_file"
    echo "$filename"
    ln -s -T "$full_path_file" "$filename"
done
Zip the files
7z a test.zip -l ~/flat_dir
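If the symlink approach gives you trouble (for example, the archiver stores the links themselves instead of their targets), an alternative sketch is to copy the files into the flat directory and archive from inside it; this assumes GNU cp (for -t) and that no two files share a name:
mkdir -p flat_dir
# copy every regular file under A0 into flat_dir, dropping the directory structure
find A0 -type f -exec cp -t flat_dir {} +
# archive from inside flat_dir so the entries carry no directory prefix
(cd flat_dir && 7z a ../flat.7z .)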

Recursively compressing images in a folder-structure, preserving the folder-structure

I have this folder structure, with really heavy, high-quality images in each subfolder
tree -d ./
Output:
./
├── 1_2dia-pedro
├── 3dia-pedro
│   ├── 101MSDCF
│   └── 102MSDCF
├── 4dia-pedro
└── Wagner
    ├── 410_0601
    ├── 411_0701
    ├── 412_0801
    ├── 413_2101
    ├── 414_2801
    ├── 415_0802
    ├── 416_0902
    ├── 417_1502
    ├── 418_1602
    ├── 419_1702
    ├── 420_2502
    └── 421_0103

18 directories
And I want to compress it, just as I would with ffmpeg or ImageMagick, e.g.,
ffmpeg -i %%F -q:v 10 processed/%%F
mogrify -quality 3 $F.png
I'm currently thinking of creating an array of the directories using shopt, as discussed here
shopt -s globstar dotglob nullglob
printf '%q\n' **/*/
Then, create a new folder, folder-compressed, with the same structure
mkdir folder-compressed
<<iterate the array-vector-out-of-shopt>>
Finally, compress, for each subfolder, something along the lines of
mkdir processed
for f in *.jpg; do
    ffmpeg -i "$f" -q:v 1 processed/"${f%.jpg}.jpg"
done
Also, I read this question, and this procedure seems close to what I would like,
for f in mydir/**/*
do
# operations here
done
Major problem: I'm a bash newbie. I know all the tools I need are at my disposal!
EDIT: There is a program that, for the particular purpose of losslessly compressing images, gives us a one-liner and a lot of compression options. The caveat: make a copy of the original folder structure first, because it permanently changes the files in the folder you give it.
cp -r ./<folder-structure-files> ./<folder-structure-files-copy>
image_optim -r ./<folder-structure-files-copy>
I think #m0hithreddy's solution is pretty cool, though. I will certainly be using that logic elsewhere soon.
Instead of pre-creating the directories with mkdir, you can create the required directories on the fly. Recursive solutions look more elegant to me than loops, so here is a straightforward approach. I echoed the file names and directories to keep track of what's going on. I am not an ffmpeg pro, so I used cp instead, but it should work fine for your use case.
Shell script:
source=original/
destination=compressed/

f1() {
    # Recreate the current sub-directory under the destination
    mkdir -p ${destination}${1}
    # Process every .jpg directly inside this sub-directory
    for file in `ls ${source}${1}*.jpg 2>/dev/null`
    do
        echo 'Original Path:' ${file}
        echo 'Compressed Path:' ${destination}${1}$(basename $file) '\n'
        cp ${file} ${destination}${1}$(basename $file)   # replace cp with your ffmpeg/mogrify call
    done
    # Recurse into each sub-directory
    for dir in `ls -d ${source}${1}*/ 2>/dev/null`
    do
        echo 'Enter sub-directory:' ${dir} '\n'
        f1 ${dir#*/}   # strip the leading "original/" so $1 stays relative to the source root
    done
}

f1 ''
Terminal Session:
$ ls
original script.sh
$ tree original/
original/
├── f1
│   ├── f16
│   │   └── f12.jpg
│   ├── f5
│   │   └── t4.jpg
│   └── t3.txt
├── f2
│   └── t5.txt
├── f3
├── f4
│   └── f10
│       ├── f2
│       │   └── f6.jpg
│       └── f3.jpg
├── t1.jpg
└── t2.txt

8 directories, 8 files
$ sh script.sh
Original Path: original/t1.jpg
Compressed Path: compressed/t1.jpg
Enter sub-directory: original/f1/
Enter sub-directory: original/f1/f16/
Original Path: original/f1/f16/f12.jpg
Compressed Path: compressed/f1/f16/f12.jpg
Enter sub-directory: original/f1/f5/
Original Path: original/f1/f5/t4.jpg
Compressed Path: compressed/f1/f5/t4.jpg
Enter sub-directory: original/f2/
Enter sub-directory: original/f3/
Enter sub-directory: original/f4/
Enter sub-directory: original/f4/f10/
Original Path: original/f4/f10/f3.jpg
Compressed Path: compressed/f4/f10/f3.jpg
Enter sub-directory: original/f4/f10/f2/
Original Path: original/f4/f10/f2/f6.jpg
Compressed Path: compressed/f4/f10/f2/f6.jpg
$ tree compressed/
compressed/
├── f1
│   ├── f16
│   │   └── f12.jpg
│   └── f5
│       └── t4.jpg
├── f2
├── f3
├── f4
│   └── f10
│       ├── f2
│       │   └── f6.jpg
│       └── f3.jpg
└── t1.jpg

8 directories, 5 files
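For completeness, a more compact alternative using the globstar idea from the question; this is a sketch assuming bash >= 4 and source/destination folders named original and compressed (adjust the ffmpeg call to taste):
#!/usr/bin/env bash
shopt -s globstar nullglob
src=original
dst=compressed
for f in "$src"/**/*.jpg; do
    rel=${f#"$src"/}                     # path relative to the source root
    mkdir -p "$dst/$(dirname "$rel")"    # recreate the sub-directory on the fly
    ffmpeg -i "$f" -q:v 10 "$dst/$rel"   # re-encode; swap in mogrify or cp as needed
done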

Batch rename images to folder name + sequential number [duplicate]

I want to rename the files in a directory with subdirectories to their parent directory's name plus a sequential number.
For example:
hello-images/
├── first-black
│   ├── full_b200056_m.png
│   ├── full_b200056_x_DSC01973.JPG
│   ├── full_b200056_x_DSC01978.JPG
│   ├── full_b200056_x_DSC01988.JPG
│   ├── full_b200056_x_DSC01994.JPG
│   ├── full_b200056_x_DSC02003.JPG
├── second-atlas
│   ├── full_b200035_m1.png
│   ├── full_b200035_x_3926.JPG
│   ├── full_b200035_x_3928.JPG
│   ├── full_b200035_x_3931.JPG
│   ├── full_b200035_x_3944.JPG
Desired result:
hello-images/
├── first-black
│   ├── first-black_1.png
│   ├── first-black_2.JPG
│   ├── first-black_3.JPG
│   ├── first-black_4.JPG
│   ├── first-black_5.JPG
│   ├── first-black_6.JPG
├── second-atlas
│   ├── second-atlas_1.png
│   ├── second-atlas_2.JPG
│   ├── second-atlas_3.JPG
│   ├── second-atlas_4.JPG
│   ├── second-atlas_5.JPG
From the hello-images directory, do:
for d in */; do i=1; for f in "$d"/*.*; do echo mv -- "$f" "$d${d%/}_${i}.${f##*.}"; ((i++)); done; done
This is a dry run; it will show the mv commands that would be run. If you are satisfied with the changes, remove echo for the actual renaming:
for d in */; do i=1; for f in "$d"/*.*; do mv -- "$f" "$d${d%/}_${i}.${f##*.}"; ((i++)); done; done
Expanded form:
for d in */; do
    i=1
    for f in "$d"/*.*; do
        mv -- "$f" "$d${d%/}_${i}.${f##*.}"
        ((i++))
    done
done
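For readers unfamiliar with the parameter expansions used above, this is how they behave on one of the example paths (an illustration, not part of the original answer):
d=first-black/
f=first-black/full_b200056_m.png
echo "${d%/}"     # first-black  (trailing slash stripped; used for the new base name)
echo "${f##*.}"   # png          (everything up to the last dot stripped, i.e. the extension)
# so the first target name becomes: first-black/first-black_1.png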

Linux/shell - Remove all (sub)subfolders from a directory except one

I've inherited a structure like the below, a result of years of spaghetti code...
gallery
├── 1
│   ├── deleteme1
│   ├── deleteme2
│   ├── deleteme3
│   └── full
│       ├── file1
│       ├── file2
│       └── file3
├── 2
│   ├── deleteme1
│   ├── deleteme2
│   ├── deleteme3
│   └── full
│       ├── file1
│       ├── file2
│       └── file3
└── 3
    ├── deleteme1
    ├── deleteme2
    ├── deleteme3
    └── full
        ├── file1
        ├── file2
        └── file3
In reality, this folder is thousands of subfolders large. I only need to keep ./gallery/{number}/full/* (i.e. the full folder and all files within it, from each numbered directory within gallery); everything else is no longer required and needs to be deleted.
Is it possible to construct a one-liner to handle this? I've experimented with find/maxdepth/prune but could not find an arrangement that met my needs.
(Update: To clarify, all folders contain files - none are empty)
Using PaddyD's answer, you can first remove the unwanted files and then delete the now-empty directories:
find . -type f -not -path "./gallery/*/full/*" -exec rm {} + && find . -type d -empty -delete
This can easily be done with bash extglobs, which allow matching all files that don't match a pattern:
shopt -s extglob
rm -ri ./gallery/*/!(full)
How about:
find . -type d -empty -delete
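Whichever approach you choose, it may be worth previewing what would be removed before running anything destructive; a cautious sketch (not from the original answers):
# list every file that is NOT inside a .../full/ directory, without deleting anything
find ./gallery -type f -not -path "./gallery/*/full/*" -print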