I have a directory that has 108k files in it.
I am using the KSH shell on RHEL5
ls *
-dash_bin_ksh: ls: /bin/ls: cannot execute [Argument list too long]
The only command that seems to work is the find command.
find .
file 1
file 2
file n
I tried using find with the -exec option to run the file command, but I am not getting anywhere.
find . -exec file {}
find: missing argument to `-exec'
What am I missing? I just want to run the file command on every file in this directory and send the output to file_output.txt
For find's -exec, you have to end the argument list with \;
You can also try:
find . -print0 | xargs -0 file
xargs works by reading its standard input and adding each element (line or delimited string) as an argument to the given executable, running it as few times as possible. The argument list is split into groups for execution according to --max-chars (platform-dependent, up to 128 KiB).
-print0 separates names with null characters instead of newlines, which makes it safe for file names containing spaces. -0 tells xargs to split its input on those null characters.
-print0 and -0 are GNU extensions and can be dropped in non-GNU environments at the cost of versatility.
xargs also has the -I option which makes it work more like find -exec where the executable is run for each element.
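For example, a quick sketch of the difference between the default batching and -I (file is just the command from the question):
find . -print0 | xargs -0 file          # as few invocations of file as possible
find . -print0 | xargs -0 -I{} file {}  # one invocation of file per path, like find -exec ... \;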
Thanks to @glennjackman for his in-depth knowledge on this subject.
Add a \; to the end of your command, e.g. find . -exec file {} \;
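To capture the results in file_output.txt as the question asks, just redirect the output, for example:
find . -type f -exec file {} \; > file_output.txt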
Related
I recently started learning Linux because a CTF contest is coming up in the next few months. The problem I'm struggling with: I'm trying to write a bash script that starts from a directory and checks whether each entry is a directory or some other kind of file. If it is a file (image, etc.), apply strings $f | grep -i 'abcdef'; if it is a directory, cd into that directory and start over. I have C++ experience and I understand the logic, but I can't really make it work: I can't successfully implement the loop that goes through all the subdirectories. All help would be appreciated!
You do not need a loop for this; the find command can do what you are after.
For instance:
find /home -type f -exec sh -c " strings {} | grep abcd " \;
Explanation:
/home is your base directory; it can be anything
-type f: means a regular file
-exec from the man page:
"Execute command; true if 0 status is returned. All
following arguments to find are taken to be arguments to
the command until an argument consisting of ;' is encountered. The string {}' is replaced by the current
file name being processed everywhere it occurs in the
arguments to the command, not just in arguments where it
is alone, as in some versions of find. Both of these
constructions might need to be escaped (with a `') or
quoted to protect them from expansion by the shell. See
the EXAMPLES section for examples of the use of the -exec
option. The specified command is run once for each
matched file. The command is executed in the starting
directory. There are unavoidable security problems
surrounding use of the -exec action; you should use the
-execdir option instead."
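If you also want to see which file matched (handy in a CTF), a small variation of the command above passes the file name to sh as a positional parameter, which also avoids quoting problems with unusual file names ('abcd' here just stands in for whatever string you are hunting):
find /home -type f -exec sh -c 'strings "$1" | grep -qi abcd && printf "%s\n" "$1"' sh {} \;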
If you just want to find the string in the files and do not have to walk the directories and files yourself first, you can simply search for the text with grep.
Go to the parent directory and execute:
grep -iR "abcd"
Or from any place,
grep -iR "abcd" /var/log/mylogs/
Another suggestion is to run grep on the results of a find filter (note that this breaks on file names containing spaces and can hit the argument-list limit on very large trees):
grep "abcd" $(find . -type f)
I use this really useful command:
file *
to get quality/identity of listed files.
But I'd like to list files recursively, starting from a given folder.
In other words, doing something like this:
(command below does not exist)
file * -r
Any trick to do it?
You can use find for that, using the -exec switch:
find ./ -type f -exec file {} \;
Small explanation:
{} : each result of the "find" command, used as input for the "file" command
\; : terminator of the "find ... -exec ..." command
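If there are many files, the + form of -exec batches several file names per invocation of file and is usually much faster:
find ./ -type f -exec file {} +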
Another option is to use xargs(1):
find . | xargs file
Sample output:
./.config/xfe: directory
./.config/xfe/xfirc: ASCII text
./.Xauthority: X11 Xauthority data
./line/serialLG1800.py: Python script, ...
If file names contain spaces or other special characters, it is best to use the -print0 option for find; when doing so, you must also add the -0 option to xargs:
find . -print0 | xargs -0 file
Why does running this command give me the error message: No such file or directory?
for i in `find ~/desktop -name '*.py'` ; do ./$i ; done
The complete error message makes it much more clear what the problem is:
bash: .//home/youruser/desktop/foo.py: No such file or directory
You can see that there is indeed no such file:
$ .//home/youruser/desktop/foo.py
bash: .//home/youruser/desktop/foo.py: No such file or directory
$ ls -l .//home/youruser/desktop/foo.py
ls: cannot access './/home/youruser/desktop/foo.py': No such file or directory
Here's how you can instead run the file /home/youruser/desktop/foo.py:
$ /home/youruser/desktop/foo.py
Hello World
So to run it in your loop, you can do:
for i in `find ~/desktop -name '*.py'` ; do $i ; done
Here's a better way of doing the same thing:
find ~/desktop -name '*.py' -exec {} \;
or with a shell loop:
find ~/desktop -name '*.py' -print0 | while IFS= read -d '' -r file; do "$file"; done
For an explanation of what ./ is and does, and why it makes no sense here, see this question
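If some of the scripts lack a shebang line or execute permission, a variation that runs them through the interpreter explicitly may help (a sketch, assuming python3 is on your PATH):
find ~/desktop -name '*.py' -exec python3 {} \;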
Try find with the -exec option: http://man7.org/linux/man-pages/man1/find.1.html
-exec command ;
Execute command; true if 0 status is returned. All following
arguments to find are taken to be arguments to the command
until an argument consisting of `;' is encountered. The
string `{}' is replaced by the current file name being
processed everywhere it occurs in the arguments to the
command, not just in arguments where it is alone, as in some
versions of find. Both of these constructions might need to
be escaped (with a `\') or quoted to protect them from
expansion by the shell. See the EXAMPLES section for examples
of the use of the -exec option. The specified command is run
once for each matched file. The command is executed in the
starting directory. There are unavoidable security problems
surrounding use of the -exec action; you should use the
-execdir option instead.
-exec command {} +
This variant of the -exec action runs the specified command on
the selected files, but the command line is built by appending
each selected file name at the end; the total number of
invocations of the command will be much less than the number
of matched files. The command line is built in much the same
way that xargs builds its command lines. Only one instance of
`{}' is allowed within the command, and (when find is being
invoked from a shell) it should be quoted (for example, '{}')
to protect it from interpretation by shells. The command is
executed in the starting directory. If any invocation with
the `+' form returns a non-zero value as exit status, then
find returns a non-zero exit status. If find encounters an
error, this can sometimes cause an immediate exit, so some
pending commands may not be run at all. This variant of -exec
always returns true.
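A quick way to see the difference between the two forms (a sketch; echo just stands in for the real command):
find ~/desktop -name '*.py' -exec echo {} \;   # echo runs once per matched file
find ~/desktop -name '*.py' -exec echo {} +    # echo runs once with all matched files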
The paths returned by the find statement will be absolute paths like /home/youruser/desktop/program.py, because the shell expands ~ before find runs. If you put ./ in front of them, you get paths like .//home/youruser/desktop/program.py, which resolve relative to the current directory and don't exist.
Replace ./$i with "$i" (the quotes to take care of file names with spaces etc.).
You should use $i and not ./$i
I was doing the same thing at this exact moment: I wanted a script that finds whether there are any flac files in the directory and converts them to opus.
Here is my solution:
if test -n "$(find ./ -maxdepth 1 -name '*.flac' -print -quit)"
then
    echo "flac files found"   # replace with the conversion step
else
    :                         # do nothing
fi
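To actually perform the conversion, one possible sketch (this assumes ffmpeg built with Opus support is installed; the options are deliberately minimal):
find . -maxdepth 1 -name '*.flac' -exec sh -c 'ffmpeg -i "$1" "${1%.flac}.opus"' sh {} \;   # ffmpeg is an assumption, swap in your encoder of choice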
I'm trying to count the total lines in the files within a directory. To do this I am trying to use a combination of find and wc. However, when I run find . -exec wc -l {}\;, I receive the error find: missing argument to -exec. I can't see any apparent issue; any ideas?
You simply need a space between {} and \;
find . -exec wc -l {} \;
Note that if there are any sub-directories below the current location, wc will generate an error message for each of them that looks something like this:
wc: ./subdir: Is a directory
To avoid that problem, you may want to tell find to restrict the search to files:
find . -type f -exec wc -l {} \;
Another note: it is a good idea to use the -exec option. Too many times people pipe commands together thinking they will get the same result; for instance, here it would be:
find . -type f | xargs wc -l
The problem with piping commands in such a manner is that it breaks if any file name has spaces in it. For instance, if a file name were "a b", wc would receive "a" and then "b" separately, and you would get two error messages: a: no such file and b: no such file.
Unless you know for a fact that your file names never have any spaces in them (or other problematic characters), if you do need to pipe commands together, you need to tell all the tools you are piping to use the NULL character (\0) as a separator instead of a space. The previous command then becomes:
find . -type f -print0 | xargs -0 wc -l
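If all you need is a single grand total rather than per-file counts, one way (a sketch) is to concatenate everything and count once, which also sidesteps the multiple "total" lines that xargs batching can produce:
find . -type f -print0 | xargs -0 cat | wc -l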
With version 4.0 or later of bash, you don't need your find command at all:
shopt -s globstar
wc -l **/*
There's no simple way to skip directories, which, as pointed out by Gui Rava, you might want to do, unless you can differentiate files and directories by name alone. For example, maybe directories never have a . in their name, while all the files have at least one extension:
wc -l **/*.*
I want to insert a line at the start of multiple files of a specified type, where the files are located in the current directory or its subdirectories.
I know that using
find . -name "*.csv"
can help me list the files I want to insert into.
and using
sed -i '1icolumn1,column2,column3' test.csv
can be used to insert one line at the start of a file,
but I do NOT know how to pipe the file names from the "find" command to the "sed" command.
Could anybody give me any suggestion?
Or is there any better solution to do this?
BTW, is it possible to do this in a one-line command?
Try using xargs to pass the output of find as command-line arguments to the next command, here sed:
find . -type f -name '*.csv' -print0 | xargs -0 sed -i '1icolumn1,column2,column3'
Another option would be to use -exec option of find.
find . -type f -name '*.csv' -exec sed -i '1icolumn1,column2,column3' {} \;
Note: xargs is generally the more efficient way, and it can run multiple processes in parallel using its -P option.
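As a quick sanity check of what the sed expression does, here is a throwaway example (the file name and contents are purely illustrative; GNU sed is assumed for -i):
printf 'a,b,c\n' > demo.csv
sed -i '1icolumn1,column2,column3' demo.csv
head -n 2 demo.csv    # prints the new header line followed by a,b,c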
This way:
find . -type f -name "*.csv" -exec sed -i '1icolumn1,column2,column3' {} +
-exec does all the magic here. The relevant part of man find:
-exec command ;
Execute command; true if 0 status is returned. All following arguments
to find are taken to be arguments to the command until an argument consisting
of `;' is encountered. The string `{}' is replaced by the current file name
being processed everywhere it occurs in the arguments to the command, not just
in arguments where it is alone, as in some versions of find. Both of
these constructions might need to be escaped (with a `\') or quoted to protect
them from expansion by the shell. See the EXAMPLES section for examples of
the use of the -exec option. The specified command is run once for each
matched file. The command is executed in the starting directory. There
are unavoidable security problems surrounding use of the -exec action;
you should use the -execdir option instead