cp: invalid option -- 'D' - linux

My goal is to find all .pdf files from multiple subfolder structures and then move them to another folder.
For this I assembled the following command:
find /mnt/user/Data/01_Persönliche_Dokumente/01_Firmen -iname \*.pdf -type f | xargs cp -t /mnt/user/Data/01_Persönliche_Dokumente/Paperless_input/
But it fails with the following error:
root@Tower:/mnt/user/Data/01_Persönliche_Dokumente/01_Firmen# find "/mnt/user/Data/01_Persönliche_Dokumente/01_Firmen" -iname \*.pdf -type f | xargs cp -t "/mnt/user/Data/01_Persönliche_Dokumente/Paperless_input"
cp: invalid option -- 'D'
Try 'cp --help' for more information.
I tried different options and got some help in the Unraid Discord.

I got a hint from a friend of mine.
The correct command looks like this:
find "/mnt/user/Data/01_Persönliche_Dokumente/01_Firmen" -iname \*.pdf -type f -print0 | xargs -0 cp -t "/mnt/user/Data/01_Persönliche_Dokumente/Paperless_input"
For the find command, I added -print0, which means:
print the full file name on the standard output, followed by a null
character (instead of the newline character that -print uses). This
allows file names that contain newlines or other types of white space
to be correctly interpreted by programs that process the find output.
For the xargs command, I added -0, which means:
-0: input items are terminated by a null character instead of whitespace.
In short, this keeps whitespace and other special characters in file names from breaking the hand-off between find and xargs: without it, xargs splits file names at spaces, and a fragment that begins with a dash gets parsed by cp as an option, which is exactly where the mysterious 'invalid option' came from.
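Here is a hypothetical reproduction (the file name and target folder are invented for illustration):
touch './Vertrag -Druckversion.pdf'
find . -iname \*.pdf | xargs cp -t /tmp/out
cp: invalid option -- 'D'
Without -print0/-0, xargs splits the path at the space, so cp receives -Druckversion.pdf as a separate argument and tries to parse it as the option -D.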

Related

any way to use linux command file recursively?

I use this really useful command:
file *
to get quality/identity of listed files.
But I'd like to list recursively, from a given folder.
In other words, doing something like this:
(command below does not exist)
file * -r
Any trick to do it?
You can use find for that, using the -exec switch:
find ./ -type f -exec file {} \;
Small explanation:
{}: the result of the find command, used as input for the file command
\;: terminator of the "find ... -exec ..." command
Another option is to use xargs(1):
find . | xargs file
Sample output:
./.config/xfe: directory
./.config/xfe/xfirc: ASCII text
./.Xauthority: X11 Xauthority data
./line/serialLG1800.py: Python script, ...
If file names contain spaces or other special characters, it is best to use the -print0 option with find; when you do, you must also add the -0 option to xargs:
find . -print0 | xargs -0 file
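If you'd rather avoid xargs entirely, find's + terminator (POSIX, so widely available) gives the same batching effect and side-steps the whitespace problem, since no word splitting is involved:
find . -type f -exec file {} +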

I'm using xargs, but the argument list is too long

I'm using Linux. I have a directory tree with over 100,000 files that originated on an MS Windows system. Some of the files have spaces in their names. I want to convert those files to Unix line endings. I ran this command
find . -type f | xargs -0 dos2unix
And received this error message
xargs: argument line too long
How can I fix this?
If you want to use xargs with -0 to prevent issues with spaces/special characters in file names, you must also use -print0 with find so that it delimits its output with null bytes. Otherwise xargs -0 finds no null bytes at all and treats find's entire output as one single, enormous argument, which is exactly what produces the "argument line too long" error:
find . -type f -print0 | xargs -0 dos2unix
You don't need xargs here; you can use find's own + terminator, which batches file names onto a single command line much like xargs does:
find . -type f -exec dos2unix '{}' +
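The + form already packs as many file names as fit into each dos2unix invocation. If you ever want to cap the batch size explicitly, xargs can do that with -n (the 500 here is an arbitrary choice):
find . -type f -print0 | xargs -0 -n 500 dos2unix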

search a string in a file with case insensitive file name

I want to grep for a string in all the files which have a particular pattern in their name, with the name matched case-insensitively.
For example, if I have two files ABC.txt and aBc.txt, then I want something like
grep -i 'test' *ABC*
The above command should look in both the files.
You can use find and then grep on the results of that:
find . -iname "*ABC*" -exec grep -i "test" {} \;
Note that this will run grep once on each file found. If you want grep to be invoked on many files at once (find will still split the list over several invocations if it would otherwise exceed the command line length limit), you can use a plus at the end:
find . -iname "*ABC*" -exec grep -i "test" {} \+
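One side effect of the \; form: because grep is handed a single file per invocation, it omits the file name from matching lines. You can force the names with -H (supported by GNU and BSD grep), or portably by adding /dev/null as a second file:
find . -iname "*ABC*" -exec grep -iH "test" {} \;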
You can also use xargs to process a really large number of results more efficiently:
find . -iname "*ABC*" -print0 | xargs -0 grep -i test
The -print0 makes find output 0-terminated results, and the -0 makes xargs able to deal with this format, which means you don't need to worry about any special characters in the filenames. However, it is not totally portable, since it's a GNU extension.
If you don't have a find that supports -print0 (for example SVR4), you can still use -exec as above or just
find . -iname "*ABC*" | xargs grep -i test
But you should be sure your file names don't contain whitespace or quote characters, since plain xargs splits its input on blanks and newlines and treats quotes specially.
You can use find to match the file names and grep, which supports regular expressions, to search for the string. For your case the command would look like this (note -iname for the case-insensitive name match, and the quoted \< \> word-boundary anchors around the search term):
find . -iname "*ABC*" -exec grep -i '\<test\>' {} \;

find -exec doesn't recognize argument

I'm trying to count the total lines in the files within a directory. To do this I am trying to use a combination of find and wc. However, when I run find . -exec wc -l {}\;, I receive the error find: missing argument to -exec. I can't see any apparent issues; any ideas?
You simply need a space between {} and \;
find . -exec wc -l {} \;
Note that if there are any sub-directories below the current location, wc will generate an error message for each of them that looks something like this:
wc: ./subdir: Is a directory
To avoid that problem, you may want to tell find to restrict the search to files:
find . -type f -exec wc -l {} \;
Another note: it's a good idea to use the -exec option. Too often, people pipe commands together expecting the same result; for instance, here it would be:
find . -type f | xargs wc -l
The problem with piping commands in this manner is that it breaks if any file name has spaces in it. For instance, if a file were named "a b", wc would receive "a" and then "b" separately, and you would get two error messages: a: no such file and b: no such file.
Unless you know for a fact that your file names never have any spaces in them (or non-printable characters), if you do need to pipe commands together, you need to tell all the tools you are piping together to use the NULL character (\0) as a separator instead of a space. So the previous command would become:
find . -type f -print0 | xargs -0 wc -l
With version 4.0 or later of bash, you don't need your find command at all:
shopt -s globstar
wc -l **/*
There's no simple way to skip directories (which, as Gui Rava pointed out, you might want to do) unless you can differentiate files from directories by name alone. For example, maybe directories never have . in their name, while all the files have at least one extension:
wc -l **/*.*
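If you do need to skip directories and can't rely on naming, a bash loop is less elegant but works; this sketch sums the per-file counts itself:
shopt -s globstar
total=0
for f in **/*; do
    [ -f "$f" ] && total=$(( total + $(wc -l < "$f") ))
done
echo "$total"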

Linux: Redirecting output of a command to "find"

I have a list of file names as the output of a certain command.
I need to find each of these files in a given directory.
I tried the following command:
ls -R /home/ABC/testDir/ | grep "\.java" | xargs find /home/ABC/someAnotherDir -iname
But it is giving me the following error:
find: paths must precede expression: XYZ.java
What would be the right way to do it?
ls -R /home/ABC/testDir/ | grep -F .java |
while read f; do find . -iname "$(basename "$f")"; done
You can also use ${f##*/} instead of basename. Or:
find /home/ABC/testDir -iname '*.java*' |
while read f; do find . -iname "${f##*/}"; done
Note that, undoubtedly, many people will object to parsing the output of ls or find without using a null byte as the filename separator, claiming that whitespace in file names will cause problems. Those people usually ignore newlines in file names, and their objections can be safely ignored. (As long as you don't allow whitespace in your file names, that is!)
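For what it's worth, a null-safe version of the same loop is easy in bash, which can read null-delimited input directly (a sketch using read -d ''):
find /home/ABC/testDir -iname '*.java' -print0 |
while IFS= read -r -d '' f; do find . -iname "${f##*/}"; done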
A better option is to chain the two finds directly; the inner find needs just the base name, hence the small sh -c wrapper:
find /home/ABC/testDir -iname '*.java' -exec sh -c 'find . -iname "$(basename "$1")"' sh {} \;
The reason xargs doesn't work here is that you cannot pass two arguments to -iname within find.
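That said, xargs can be told to run one find per input line with -I, which works here because the grep output is one bare file name per line (still subject to the whitespace caveats above):
ls -R /home/ABC/testDir/ | grep '\.java' | xargs -I{} find . -iname '{}'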
find /home/ABC/testDir -name "*.java"
