Find empty files and, if found, update them [closed] - Linux

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 7 years ago.
Improve this question
Can someone with more Linux knowledge answer this for me?
On our web server, we host and run a lot of web scripts.
We control these via datestamp files, so a script is not run more than once.
A lot of these files are 0 KB. I wanted to know if there is a quick way in Linux to locate the files and update them.
I have located the files using:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty
I now have a long list of files. Can I update each of them with a simple datestamp in this format, e.g.
20150923114046

You can use the -exec option of find:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
    -exec sh -c 'echo 20150923114046 > "$1"' _ {} \;
Passing the file name as a positional parameter ("$1") instead of embedding {} inside the command string avoids quoting problems with unusual file names.
To get the timestamp dynamically, use date:
-exec sh -c 'date +%Y%m%d%H%M%S > "$1"' _ {} \;
To use the file's last-modified time instead, use the -r option of date. Capture it before redirecting, because opening the file for writing updates its mtime:
-exec sh -c 'ts=$(date +%Y%m%d%H%M%S -r "$1"); echo "$ts" > "$1"' _ {} \;
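The whole approach can be sanity-checked in a throwaway directory before running it against the real vhosts path; the /tmp/stampdemo path and file names below are purely illustrative:

```shell
# Set up a scratch directory with one empty and one already-populated datestamp file
mkdir -p /tmp/stampdemo
: > /tmp/stampdemo/datestamp.a                      # empty (0 bytes)
echo 20150101000000 > /tmp/stampdemo/datestamp.b    # already has a stamp

# Write the current timestamp into every empty datestamp.* file;
# "$1" (not a literal {}) inside sh -c avoids quoting/injection problems
find /tmp/stampdemo -name "datestamp.*" -type f -empty \
    -exec sh -c 'date +%Y%m%d%H%M%S > "$1"' _ {} \;

cat /tmp/stampdemo/datestamp.a   # now holds a 14-digit timestamp
cat /tmp/stampdemo/datestamp.b   # untouched, still 20150101000000
```

Because -empty is evaluated before -exec, only the 0 KB files are rewritten; files that already contain a stamp are left alone.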

Related

Copy files from one folder to another folder with filters [closed]

Closed 2 years ago.
I have these files in my source folder
source_path/date=20191230/somefile.txt
source_path/date=20191231/somefile.txt
source_path/date=20200101/somefile.txt
source_path/date=20200102/somefile.txt
If I run the command below, all files are copied to my dest_path folder:
cp --recursive source_path/ dest_path/
I just want to copy the folders whose dates are in 2020, i.e. only these two files:
source_path/date=20200101/somefile.txt
source_path/date=20200102/somefile.txt
How can I add filters to the cp command?
This question is not suitable for Stack Overflow, but this is the answer:
cp --recursive source_path/date=20200* dest_path/
Or does dest_path not exist? Then you would write
mkdir -p dest_path && cp --recursive source_path/date=20200* dest_path/
You can use find with the -name, -type and -exec flags. If the source directory were /home/foo and the destination directory /tmp:
find /home/foo -type d -name "date=2020*" -exec cp -r '{}' /tmp \;
-type d means we only match directories, -name gives the name pattern we are searching for, and -exec runs a command on each result. Note that the pattern must match the actual directory names (date=2020*, not *date-2020*), and that cp needs -r to copy a directory.
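Both answers can be verified on a scratch copy of the layout from the question; the /tmp/cpdemo prefix below is illustrative:

```shell
# Recreate the source layout from the question
mkdir -p /tmp/cpdemo/source_path/date=20191230 \
         /tmp/cpdemo/source_path/date=20200101 \
         /tmp/cpdemo/source_path/date=20200102
touch /tmp/cpdemo/source_path/date=20191230/somefile.txt
touch /tmp/cpdemo/source_path/date=20200101/somefile.txt
touch /tmp/cpdemo/source_path/date=20200102/somefile.txt

# The glob acts as the filter: only the date=2020* directories are copied
mkdir -p /tmp/cpdemo/dest_path
cp --recursive /tmp/cpdemo/source_path/date=2020* /tmp/cpdemo/dest_path/

ls /tmp/cpdemo/dest_path   # date=20200101  date=20200102
```

The filtering is done by the shell's glob expansion before cp even runs, which is why cp itself needs no filter option.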

Find empty directory in Alpine Linux [closed]

Closed 2 years ago.
I am trying to find all empty directories using the find command on Alpine Linux, but unfortunately the BusyBox find there does not support the -empty test. Does anyone have a suggestion on how to achieve this?
I am trying to use the command below:
find . -depth -type d -empty -mmin +120 -print
Can anyone suggest a package/library that would solve this issue?
You can check whether a directory is empty in plain shell: execute a subshell for each directory and make the subshell exit with a zero status if the directory is empty:
find . -depth -type d -exec sh -c '[ -z "$(ls -A "$1")" ]' _ {} \; -print
Other than that, install GNU find (the findutils package).
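The subshell trick works because find treats -exec as a test: -print only fires when the sh -c command exits 0, i.e. when `ls -A` produced no output. A quick check under an illustrative /tmp/emptydemo path:

```shell
# One empty and one non-empty directory
mkdir -p /tmp/emptydemo/empty /tmp/emptydemo/full
touch /tmp/emptydemo/full/file

# -exec acts as a filter: -print runs only where the subshell exited 0,
# i.e. where "ls -A" listed nothing
find /tmp/emptydemo -depth -type d \
    -exec sh -c '[ -z "$(ls -A "$1")" ]' _ {} \; -print
# prints only /tmp/emptydemo/empty
```

This spawns one shell per directory, so on a large tree GNU find's native -empty will be noticeably faster.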

Rich globbing `ls [G-S]*` in fish shell? [closed]

Closed 5 years ago.
In Bash it is possible to
ls [G-S]*
and it lists all files beginning with the letters G through S, in either case.
How is that done in Fish shell?
Fish currently does not support a rich glob syntax. The current thinking is that a glob command should be added, in keeping with the fish goal of doing things via commands rather than magic syntax; see, for example, https://github.com/fish-shell/fish-shell/issues/3681.
The workaround is to create a function that filters the results. For example, the ** glob matches all files and directories in and below the CWD. I frequently want just the plain files and want to ignore the .git subdir, so I wrote this function:
function ff --description 'Like ** but only returns plain files.'
# This also ignores .git directories.
find . \( -name .git -type d -prune \) -o -type f | sed -n -e '/\/\.git$/n' -e 's/^\.\///p'
end
I can then use it like this: grep something (ff). You could create a similar function that uses find's -name pattern matching, or filter the results with string match --regex.
You can use find -iregex "./[G-S].*". Fish is quite limited in this regard.
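The -iregex variant is easy to try out from any shell, fish included, since the filtering happens entirely inside find. A sketch under an illustrative /tmp/globdemo path (file names invented for the demo):

```shell
# Scratch files spanning the alphabet
mkdir -p /tmp/globdemo
touch /tmp/globdemo/Apple /tmp/globdemo/grape /tmp/globdemo/Melon /tmp/globdemo/zebra

# -iregex matches the WHOLE path, case-insensitively, so the single
# range [g-s] covers both g-s and G-S
cd /tmp/globdemo
find . -maxdepth 1 -type f -iregex './[g-s].*'
# prints ./grape and ./Melon (order not guaranteed)
```

Note that unlike a glob, the regex must account for the leading ./ that find prepends, and .* is needed to match the rest of the name.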

"No such file or directory" from find command on Linux [closed]

Closed 6 years ago.
I've created a script which deletes old backup files from a directory. This command worked fine until a week ago, and nothing has changed in the script or packages, but I still get the error below:
root#:# find /var/backups/abc/* -type d -mtime +6
/var/backups/abc/2016-03-09_0321
root#:~# find /var/backups/abc/* -type d -mtime +6 -exec rm -rf {} \;
find: `/var/backups/abc/2016-03-08_0321': No such file or directory
The problem is that this script runs every day from cron, and I get a mail from root like "find: `/var/backups/abc/2016-03-08_0321': No such file or directory". The files are deleted, but these mails keep arriving.
find /var/backups/abc/* -type d -mtime +6 -prune -exec rm -rf {} \;
Here, we use -prune on the directories that we're about to delete, so find will not try to read their contents.
The error occurs because, after having returned your directory, find tries to descend into it (to continue its recursive search) and fails because you just removed it.
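The failure mode and the fix are both easy to reproduce in a scratch tree; the /tmp/prunedemo path below is illustrative:

```shell
# A backup directory with contents, like the ones the cron job deletes
mkdir -p /tmp/prunedemo/old_backup/inner

# Without -prune: find matches old_backup, rm deletes it, and find then
# fails when it tries to descend into the directory it just removed
find /tmp/prunedemo/* -type d -exec rm -rf {} \; 2>&1
# error similar to: find: '/tmp/prunedemo/old_backup': No such file or directory

mkdir -p /tmp/prunedemo/old_backup/inner

# With -prune: matched directories are deleted but never descended into
find /tmp/prunedemo/* -type d -prune -exec rm -rf {} \;
```

In both cases the directory ends up deleted; -prune only suppresses the pointless (and noisy) attempt to recurse into it afterwards, which is what stops the cron mails.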

Linux/Unix Command Needed for finding files on a particular date [closed]

Closed 9 years ago.
I need help finding files in a directory which contain a word/string and were last modified on a particular date.
Currently I am using this command:
find . -exec grep -l 'string' {} \;
This command returns all the files containing that string in that directory. I would like to restrict those files to a particular date, for example 12/24/2013.
You can use:
find . -type f -exec grep 'string' {} \; -exec ls -l {} \; | grep 'Dec 24'
This will search for any files which contain the string string, then execute ls -l on only those files, and finally grep out the ones whose listing matches Dec 24.
This works because find applies its tests and actions in order, so each -exec only runs on files that passed the preceding ones. Note that ls -l prints the year instead of the time for files older than about six months, and that "Dec 24" alone does not distinguish between years.
Maybe this could help you, combined with grep:
find /path/to/find -type d -atime -7
The last parameter is a number of days (here, within the last 7 days); adjust it to target the particular date. -atime is the file access time. -type d searches for directories; to find files, replace d with f. Give the path in which to search, and finally pipe the result to grep to search for the string.
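Since the question names an exact date, GNU find's -newermt test is a more direct tool than counting days or grepping ls output. A sketch under an illustrative /tmp/datedemo path, with mtimes set artificially via touch -d:

```shell
# Two files with known modification dates
mkdir -p /tmp/datedemo
touch -d "2013-12-24 10:00" /tmp/datedemo/christmas_eve.txt
touch -d "2013-12-25 10:00" /tmp/datedemo/christmas.txt

# Modified on or after Dec 24 00:00, but NOT on or after Dec 25 00:00,
# i.e. modified sometime during 2013-12-24
find /tmp/datedemo -type f -newermt "2013-12-24" ! -newermt "2013-12-25"
# prints /tmp/datedemo/christmas_eve.txt
```

Combining this with the content filter from the question gives files that both contain the string and were modified that day: append -exec grep -l 'string' {} + to the find command above.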
