Linux: is it possible to empty the contents of all files in a directory? [closed]

Is it possible in a single command (no loop) to clear the contents of each file in a directory?

Use truncate, which sets each file's size to zero; the redirect hides the errors it would print for any subdirectories matched by the glob:
truncate -s 0 directory/* &> /dev/null
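If files in subdirectories should be emptied as well, a sketch combining find with truncate (GNU coreutils truncate assumed):
find directory/ -type f -exec truncate -s 0 {} +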

This is ugly as hell, but it works:
find . -type f -exec sh -c 'echo -n "" > "$1"' sh {} \;
This will clear every file in every subdirectory.
To just clear the files in the current directory:
for i in *; do cat /dev/null > "$i"; done
(Yes, it's a loop, but it's one line.)

Related

Find empty files, if found update files with [closed]

Can someone with more Linux knowledge answer this for me?
On our web server we host and run a lot of web scripts.
We control these via datestamp files, so a script is not over-run or run more than once.
A lot of the files are 0 KB. I wanted to know if there is a quick way in Linux to locate those files and update them.
I have located the files using:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty
I have a long list of files. Can I update these with a simple datestamp format, e.g.:
20150923114046
You can use the -exec option of find:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
-exec bash -c 'echo 20150923114046 > "$1"' bash {} \;
To get the timestamp dynamically, use date:
bash -c 'echo "$(date +%Y%m%d%H%M%S)" > "$1"' bash {}
To use each file's last-modified timestamp, use date's -r option:
bash -c 'echo "$(date -r "$1" +%Y%m%d%H%M%S)" > "$1"' bash {}
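Put together, a sketch of the full command using each file's own modification time as the stamp (GNU find and GNU date assumed):
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
-exec bash -c 'echo "$(date -r "$1" +%Y%m%d%H%M%S)" > "$1"' bash {} \;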

Recursively doing the command ls without -R [closed]

I am trying to find a way to recreate the output of ls -R (Linux) without using the -R option, i.e. without the built-in recursion. Is this at all possible?
There are no other constraints.
shopt -s globstar nullglob
printf "%s\n" **
or
find .
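For context on the globstar variant, a short sketch of what the two shell options do (bash 4 or later assumed; the *.sh filter is only illustrative):
shopt -s globstar nullglob   # globstar: ** matches recursively; nullglob: a pattern with no matches expands to nothing
printf "%s\n" **             # every file and directory below the current one
printf "%s\n" **/*.sh        # the same idea restricted to a pattern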
The closest I can think of right now is to recurse through all given directories using find and to perform a listing on each. I used ls -1 because I noticed that ls -R defaults to a single column when redirected into a file; you may choose to omit the -1 option.
for dir in `find . -type d`; do
    echo $dir:
    ls -1 $dir
done
However, it doesn't work with filenames that contain spaces. I'm still looking for a way around that...
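One way around the whitespace problem, as a rough sketch (assumes GNU find for -print0 and bash for read -d ''):
find . -type d -print0 | while IFS= read -r -d '' dir; do
    echo "$dir:"
    ls -1 "$dir"
done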

exclude directories mv unix [closed]

The command below moves every hidden or normal file ending in string where the character just before string is not . or _.
mv {.,}*[!._]string /destination
How can I also exclude moving all directories in the above command?
Try
find /WHERE/TO/FIND -name '*STRING' ! -name '*_STRING' ! -name '*.STRING' -type f -exec mv \{\} /WHERE/TO/MOVE \;
Note: if you want to move files from only the /WHERE/TO/FIND directory itself, add -maxdepth 1 (it should go right after the path, before the tests, or GNU find will warn about the option order).
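Putting the pieces together with -maxdepth 1 ahead of the tests (the paths are the same placeholders as above):
find /WHERE/TO/FIND -maxdepth 1 -type f -name '*STRING' ! -name '*_STRING' ! -name '*.STRING' \
-exec mv {} /WHERE/TO/MOVE \;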
How about:
for file in {.,}*[!._]string; do test -f "$file" && mv "$file" /destination; done
In what shell does the [!._] glob actually work when used with {.,}? You would probably be better off avoiding the {} notation and doing:
for file in .*[!._]string *[!._]string; do ... ; done
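Spelled out with the same body as the loop above, that would be:
for file in .*[!._]string *[!._]string; do test -f "$file" && mv "$file" /destination; done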

linux command to empty all files of a directory [closed]

I'd like to empty all the files in a directory. I tried this:
find myFolderPath/* -exec cat /dev/null > {} ';'
but it does not work. How can I do it?
You can't use redirection (>) within find -exec directly, because the redirection is handled by your current shell before find even runs, so it just creates a single file literally named {}. To get around this you need to perform the redirection in a new shell by using sh -c.
Also, note that you don't need to cat /dev/null > file in order to clobber a file. You can simply use > file.
Try this:
find . -type f -exec sh -c '> "$1"' sh {} \;
This will do what you want:
for f in *; do > "$f"; done

Change filenames to lowercase in Ubuntu in all subdirectories [closed]

I know it's been asked, but what I've found has not worked out so far.
The closest I came is this: rename -n 'y[A-Z]/[a-z]/' *
which works for the current directory. I'm not too good with the Linux terminal, so what should I add to this command to apply it to all of the files in all the subdirectories below the one I am in? Thanks!
Here's one way using find and tr:
for i in $(find . -type f -name "*[A-Z]*"); do mv "$i" "$(echo "$i" | tr A-Z a-z)"; done
Edit: added -name "*[A-Z]*".
This ensures that only files containing capital letters are found. Otherwise, a file whose name is already all lowercase would be moved onto itself and mv would complain that the source and destination are the same file.
Perl has a locale-aware lc() function which might work better:
find . -type f | perl -n -e 'chomp; system("mv", $_, lc($_))'
Note that this script handles whitespace in filenames, but not newlines. And there's no protection against collisions: if you have "ASDF.txt" and "asdf.txt", one is going to get clobbered.
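For reference, a minimal whitespace-safe sketch that lowercases only the file name itself (not the directory components) and skips collisions; it assumes GNU find for -print0 and bash:
find . -type f -name '*[A-Z]*' -print0 | while IFS= read -r -d '' f; do
    lower="$(dirname "$f")/$(basename "$f" | tr A-Z a-z)"
    [ -e "$lower" ] || mv -- "$f" "$lower"   # skip if a lowercase twin already exists
done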
