Linux: copy all files with a specific filename length

I want to copy all files in my directory that have a specific filename length.
e.g.
These files exist:
1.py
12.py
123.py
321.py
1234.py
Then I want to copy only the files 123.py and 321.py (because their names are 3 characters long, excluding the extension).
I am new to Linux and don't know how to accomplish this. Can anyone help me?

If I understood correctly, you want to copy files whose names consist of three characters followed by .py. This could be done using:
cp ???.py destination_directory/
(Note: this could fail with an "Argument list too long" error if a very large number of files match, but the limit is typically large on modern systems.)
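As a quick sanity check, here is a sketch using the example files from the question (destination_directory is a placeholder):
touch 1.py 12.py 123.py 321.py 1234.py   # recreate the example files
ls ???.py                                # prints: 123.py  321.py
cp ???.py destination_directory/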

You can also do it using the find command:
find directory1 -maxdepth 1 -type f -name '???.py' -exec cp -nv {} directory2/ \;
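A small sketch of a safer workflow with the same command, assuming GNU find: list the matches with -print first, then run the copy.
find directory1 -maxdepth 1 -type f -name '???.py' -print
find directory1 -maxdepth 1 -type f -name '???.py' -exec cp -nv {} directory2/ \;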

Related

How to search for files ending/starting/containing a certain letter in terminal?

I have been looking all over the internet to help me with this. I want to list all files that start/end/contain a certain letter but the results I found on the internet do not seem to work for me. I need to use the ls command for this (assignment).
I tried this code from another question:
ls abc* # list all files starting with abc---
ls *abc* # list all files containing --abc--
ls *abc # list all files ending with --abc
but whenever I try any of those, it comes back with "ls: cannot access '*abc': No such file or directory"
Use find for finding files:
find /path/to/folder -maxdepth 1 -type f -name 'abc*'
This will give you all regular filenames within /path/to/folder which start with abc.
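The "cannot access" error simply means no file matched the pattern, so the shell passed the literal string *abc to ls. Here is a sketch (with made-up filenames) showing the glob patterns next to a find equivalent:
touch abc1 1abc 1abc1                      # sample files, for illustration only
ls abc*                                    # names starting with abc
ls *abc*                                   # names containing abc
ls *abc                                    # names ending with abc
find . -maxdepth 1 -type f -name '*abc'    # the "ends with abc" match done with find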

How to copy recursive directories to a flat directory

I am trying to copy all the *.psd files, currently spread across a multi-level directory structure, into one single directory.
Is there an rsync parametrization to allow it?
The solution proposed at Copying files from multiple directories into a single destination directory does not handle multiple levels of nested directories, only a single level of subdirectories.
In my current case I have files in directories nested up to 7 levels deep that I would like to consolidate into a single directory.
I fear rsync can't help you here. You can use find to find all the files and copy them to the destination directory, though:
find /path/to/source/topdir -type f -name '*.psd' -exec cp {} /path/to/destination/ \;
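One thing to watch out for with a flat destination is basename collisions (two files with the same name in different source directories). A sketch assuming GNU cp, which keeps numbered backups instead of silently overwriting:
find /path/to/source/topdir -type f -name '*.psd' -exec cp --backup=numbered -t /path/to/destination/ {} +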
In my opinion choroba's answer is the right one.
For completeness (or if for any reason you need the files to be copied with rsync), you can do something far less efficient with rsync (which behaves just like cp in this case), using find, a loop, and other machinery that isn't really necessary:
for file in $(find ./path/to/source/topdir -name "*.psd"); do rsync "$file" /path/to/destination/; done
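Note that the $(find ...) loop still breaks on filenames containing spaces; a whitespace-safe sketch of the same idea (assuming GNU find and bash) would be:
find ./path/to/source/topdir -type f -name '*.psd' -print0 |
while IFS= read -r -d '' file; do
    rsync -- "$file" /path/to/destination/
done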

Zipping and deleting files with certain age

I'm trying to put together a command that will find files that haven't been modified in over 6 months and zip them in one command. Afterwards I want to delete all the files I just archived.
My current command to find the directories with the files is
find /var/www -type d -mtime -400 ! -mtime -180 | xargs ls -l > testd.txt
This gave me all the directories including the files that are older than 6 months
Now I was wondering if there is a way of zipping all the results and deleting them afterwards. Something along the lines of:
find /var/www -type f -mtime -400 ! -mtime -180 | gzip -c archive.gz
If anyone knows the proper syntax to achieve this, I'd love to know. Thanks!
Edit: after a few tests, this command results in a corrupted file:
find /var/www -mtime -900 ! -mtime -180 | xargs tar -cf test4.tar
Any ideas?
Break this into several distinct steps that you can implement and thoroughly test separately:
Build a list of files to be archived and then deleted, saved to a temp file
Use the list from step 1 to add the files to .tar.gz archives. Give the archive file a name following a specific pattern that won't appear in the files to be archived, and put it in a directory outside the hierarchy of files being archived.
Read back the files from the .tar.gz and compare them (or their hashes) to the original files to ENSURE that you got them all without corruption
Use the list from step 1 to delete the files. Do not use a wildcard for deletion. Put in some guard code to prevent deletion of any file matching the name pattern of the archive .tar.gz file(s) created in step 2.
When testing a script that can do irreversible damage, always code the dangerous command with a leading echo and leave it that way until you are sure everything works. Only then remove the echo.
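A minimal sketch of those steps, assuming GNU find/tar/xargs, a 180-day cutoff, and a hypothetical archive location outside /var/www:
list=$(mktemp)
archive=/var/archives/www-$(date +%Y%m%d).tar.gz         # outside the tree being archived
find /var/www -type f -mtime +180 -print0 > "$list"      # 1. build the file list
tar --null -czPf "$archive" --files-from="$list"         # 2. archive the listed files, keeping absolute names
tar -dzPf "$archive"                                      # 3. compare the archive contents against the originals
xargs -0 -a "$list" echo rm --                            # 4. delete the originals; drop the leading echo only after testing
rm -f "$list"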
Consider zip, it should meet your requirements.
find ... | zip -m@ archive.zip
-m (move) deletes the input directories/files after making the specified zip archive.
-@ takes the list of input files from standard input.
You may find more options that are useful to you in the zip manual, e.g.
-r (recurse) travels the directory structure recursively.
-sf (show-files) shows the files that would be operated on, then exits.
-t or --from-date operates on files not modified prior to the specified date.
-tt or --before-date operates on files not modified after or at the specified date.
This could possibly make find expendable.
zip -mr --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www
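A hedged sketch of that idea: preview the selection with -sf first, then repeat the command with -m to actually move the files into the archive.
zip -sf -r --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www   # dry run: only list the files
zip -mr --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www      # move the files into archive.zip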

Need to touch a file in several directories using one liner in linux

I have lots of directories. I need to cd to all the directories and create 2 files. I tried doing this using xargs, but I couldn't do it. Can you please tell me how to achieve this?
If you don't want or need to run find but have a list of directories, something like this:
xargs -I{} touch {}/a {}/b < directories.txt
If the directory paths are completely regular (e.g. all subdirectories two levels down), it might be as easy as
touch */*/a */*/b
find <path> -type d -exec touch {}/a {}/b \;
path may be . if you are already in the top directory you want to work on.
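If any directory names contain spaces, a null-delimited sketch (assuming GNU find and xargs) is safer:
find <path> -type d -print0 | xargs -0 -I{} touch {}/a {}/b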

Bash script to recursively step through folders and delete files

Can anyone give me a bash script or one-line command I can run on Linux to recursively go through each folder from the current folder and delete all files or directories starting with '._'?
Change directory to the root directory you want (or change . to the directory) and execute:
find . -name "._*" -print0 | xargs -0 rm -rf
xargs allows you to pass several parameters to a single command, so it can be faster than using the find -exec syntax. Also, you can run the find part once without the | xargs -0 rm -rf to view the files it will delete and make sure it is safe.
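For example, a sketch of that preview-then-delete sequence:
find . -name "._*"                               # preview what would be removed
find . -name "._*" -print0 | xargs -0 rm -rf     # then actually remove it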
find . -name '._*' -exec rm -Rf {} \;
I had a similar problem a while ago (I assume you are trying to clean up a drive that was connected to a Mac, which saves a lot of these files), so I wrote a simple Python script which deletes these and other useless files; maybe it will be useful to you:
http://github.com/houbysoft/short/blob/master/tidy
find /path -name "._*" -exec rm -fr {} +
Instead of deleting the AppleDouble files, you could merge them with the corresponding files. You can use dot_clean.
dot_clean -- Merge ._* files with corresponding native files.
For each dir, dot_clean recursively merges all ._* files with their corresponding native files according to the rules specified with the given arguments. By default, if there is an attribute on the native file that is also present in the ._ file, the most recent attribute will be used.
If no operands are given, a usage message is output. If more than one directory is given, directories are merged in the order in which they are specified.
Because dot_clean works recursively by default, use:
dot_clean <directory>
If you want to turn off the recursive merge, use -f for a flat merge.
dot_clean -f <directory>
find . -name '._*' -delete
A bit shorter, and it performs better in the case of an extremely long list of files.
