Can't get rid of file with weird name/encoding on Linux [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 6 years ago.
I have a file that appears as ? ? in ls:
drwxr--r-- 1 johndoe gid-johndoe 93756 Aug 22 09:10 .
drwxr--r-- 1 johndoe gid-johndoe 574633 Aug 22 09:18 ..
-rw-r--r-- 1 johndoe gid-johndoe 874857 Aug 12 15:25 ? ?
-rw------- 1 johndoe gid-johndoe 96342 Aug 22 08:41 .bash_history
When I try to grab the filename with sed, I get this strange output:
ls -la | sed 's/.*[0-9] //'
.
..
ash_history
I think the filename must have a weird encoding, but I don't know how to get a handle on it so I can open/rename/delete it.
How can I get a handle on this file to open/rename/delete it?
Edit
When I type ls and hit tab to auto-complete, the file shows up as ^[ ^[

Thanks to this link from the comments I found a solution:
run ls -i to get the inode number
run find . -inum <inode> -ok rm '{}' \; with that inode number to delete the file, with an 'are you sure?' prompt
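The two steps above can be sketched end to end. This is a minimal demo in a scratch directory; the tab-containing filename stands in for the questioner's unprintable name, and -exec replaces -ok only so the sketch runs unattended:

```shell
# Work in a scratch directory so nothing real is at risk.
cd "$(mktemp -d)"

# A file whose name contains a tab -- many ls versions display it as '? ?'.
touch $'bad\tname'

# Step 1: get the inode number (first field of ls -i output).
inode=$(ls -i | awk '{print $1; exit}')

# Step 2: delete by inode. -ok prompts before removing; -exec is the
# non-interactive equivalent, used here so the sketch runs unattended.
find . -maxdepth 1 -inum "$inode" -exec rm '{}' \;

ls -A    # nothing left
```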

rm -i "$(ls|grep -E -m1 '\?|[[:cntrl:]]')"
Be sure to use rm -i to verify the matched file, and -m1 to match one file at a time. The regex matches a literal question mark as well as any ? that ls displays in place of invisible control characters.
Tested by creating three files: one with literal ?s, one with a tab between the ?s, and one with ESC control characters (use CTRL-V to type control characters in a bash terminal):
$ touch '? ?' '? ?' '^[ ^['; ls -A1
? ?
???
? ?
$ rm -i "$(ls|grep -E -m1 '[?[:cntrl:]]')"
rm: remove regular empty file '\033 \033'? y
$ rm -i "$(ls|grep -E -m1 '[?[:cntrl:]]')"
rm: remove regular empty file '?\t?'? y
$ rm -i "$(ls|grep -E -m1 '[?[:cntrl:]]')"
rm: remove regular empty file '? ?'? y
$ ls -A

You should try with find.
find . -name "*" -exec ls '{}' \;
A quick test with find:
➜ t1 touch \?
➜ t1 ll
total 8.0K
-rw-rw-r-- 1 netsamir netsamir 0 Aug 22 19:02 ?
➜ t1 find . -name "*" -exec rm {} \;
rm: cannot remove '.': Is a directory
➜ t1 ll
total 0
➜ t1
Another solution is:
➜ t1 touch "\? \?"
➜ t1 ll
total 8.0K
-rw-rw-r-- 1 netsamir netsamir 0 Aug 22 19:05 \? \?
➜ t1 rm "\? \?"
➜ t1 ll
total 0
➜ t1

Related

How to handle files with leading dashes [closed]

Closed 1 year ago.
I am searching for files and directories using find, the files have leading dashes in the filenames:
ls -al
total 0
-rw-r--r-- 1 razhal staff 0 May 22 23:58 -x
drwxr-xr-x 3 razhal staff 96 May 22 23:58 .
drwxr-xr-x 12 razhal staff 384 May 22 17:06 ..
find * -maxdepth 1 -type file
The above gives the following error message:
find: illegal option -- m
I tried to terminate the options using --, but still having the same problem:
find * -maxdepth 1 -type file --
The strange thing is that if the folder contains a file without a leading dash I am getting no error message:
ls -al
total 0
-rw-r--r-- 1 razhal staff 0 May 22 23:58 -x
drwxr-xr-x 3 razhal staff 96 May 22 23:58 .
drwxr-xr-x 12 razhal staff 384 May 22 17:06 ..
-rw-r--r-- 1 razhal staff 0 May 23 00:03 x
find * -maxdepth 1 -type file
The above returns the x and no error message.
My question is how can I find and list both files/directories with and without leading dashes using find?
Notice that I really want to use find and not some other command such as xargs or similar.
Use . instead of *:
find . -maxdepth 1 -type file
. refers to the current folder you are in. You can also use .. instead of . to search from the parent directory.
Another option would be to put ./ in front of * like this:
find ./* -maxdepth 1 -type file
This way it won't interpret the files whose names start with a dash as options.
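Both answers can be demonstrated in a scratch directory. Note this sketch uses GNU find's -type f; the question's -type file is the BSD/macOS spelling:

```shell
# Scratch directory with a single dash-named file.
cd "$(mktemp -d)"
touch -- -x     # the -- stops touch itself from parsing -x as an option

# Unquoted * expands to the bare name -x, which find parses as part of
# its expression rather than as a path, so it errors out:
find * -maxdepth 1 -type f 2>&1 | head -n 1

# Prefixing ./ keeps every expanded name looking like a path:
find ./* -maxdepth 1 -type f    # prints ./-x
```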

Remove files with name '\' [duplicate]

This question already has an answer here:
Delete files with backslash in linux
(1 answer)
Closed 2 years ago.
I made a file with the name '\' by mistake and I do not know how to remove it.
How can I remove the file with this name?
-rw-r--r-- 1 root root 1555 Sep 15 12:54 '\'
You could follow 2 steps to do this.
1- Get the inode number of that specific file by doing ls -litr Input_file_name
2- Then use the following command to delete it by its inode number (replace 1235 with the actual inode number from the previous step):
find . -inum 1235 -exec rm {} \;
Working example (a dummy/test example, for understanding purposes only):
1- Run ls -lihtr to get the inode number:
total 16K
1227 -rw-r--r-- 1 singh singh 0 Sep 15 08:05 \\\\
2- Now place that in find command as follows to delete that specific file:
find . -inum 1227 -exec rm {} \;
NOTE: As per @JRFerguson's comment, inode numbers are only unique within a single filesystem, so give find either . or the complete path to make sure it deletes the correct file, and add the -xdev option to the find command as well.
$ rm \\
Works for me (with bash). Or, if you have an interactive file manager available (which might be mc if all you have is a terminal), just use a point-and-click method. It's the shell's escaping that's causing all the problems here.
You need to double-quote the filename and escape the backslash with another backslash:
rm "'\\'"
You need to escape both ' and \ with \.
The following command
rm \'\\\'
should do the trick.
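The answers above all boil down to the same thing: getting exactly one literal backslash past the shell's escaping. A sketch, assuming the file is literally named \ (one backslash; GNU ls may display it quoted):

```shell
cd "$(mktemp -d)"

# Single quotes make the backslash literal: this creates a file named \
touch '\'

# Each of these passes one literal backslash to rm:
rm \\                  # backslash escaped with another backslash
touch '\' && rm '\'    # single-quoted
touch '\' && rm "\\"   # double-quoted, backslash escaped

ls -A    # empty again
```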

How to redirect output of xargs when using sed

Since switching over to a better management system, I want to remove all the redundant logs at the top of each of our source files. In Notepad++ I was able to achieve this by using "replace in files", replacing matches of \A(//.*\n)+ with nothing. On Linux, however, I am having no such luck and need to resort to 'xargs' and 'sed'.
The sed expression I'm using is:
sed '1,/^[^\/]/{/^[^\/]/b; d}'
Ugly, to be sure, but it does seem to work.
The problem is that when I try to run it through 'xargs' in order to feed it all the source files in our system, I am unable to redirect the output to 'stripped' files, which I then intend to copy over the originals.
I want something in the line of:
find . -name "*.com" -type f -print0 | xargs -0 -I file sed '1,/^[^\/]/{/^[^\/]/b; d}' "file" > "file.stripped"
However, I'm having grief passing the ">" through to the receiving shell, as I'm already using too many quote marks. I have tried all manner of escaping and shell "wrappers", but I just can't get it to play ball.
Anyone care to point me in the right direction?
Thanks,
Slarti.
I made a similar scenario with a simpler sed expression just as an example; see if it works for you.
I created 3 files, each containing the string "abcd":
# ls -l
total 12
-rw-r--r-- 1 root root 5 Oct 6 09:05 test.aaaaa.com
-rw-r--r-- 1 root root 5 Oct 6 09:05 test2.aaaaa.com
-rw-r--r-- 1 root root 5 Oct 6 09:05 test3.aaaaa.com
# cat test*
abcd
abcd
abcd
Running the find command as you showed, but with the -exec option instead of xargs, and replacing your sed expression with a trivial one that replaces every "a" with "b", plus the -i option, which writes directly to the input file:
# find . -name "*.com" -type f -print0 -exec sed -i 's/a/b/g' {} \;
./test2.aaaaa.com./test3.aaaaa.com./test.aaaaa.com
# cat test*
bbcd
bbcd
bbcd
In your case it should look like this:
# find . -name "*.com" -type f -print0 -exec sed -i '1,/^[^\/]/{/^[^\/]/b; d}' {} \;
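If separate .stripped files are really wanted rather than in-place editing, a common trick is to spawn a shell per file so the ">" is evaluated inside it, never by the outer shell. A sketch with the same toy sed expression (the filename here is invented for the demo):

```shell
cd "$(mktemp -d)"
printf 'abcd\n' > test.aaaaa.com

# The inner sh receives each filename as $1 and performs the redirection
# itself, so the outer shell never has to see the >.
find . -name '*.com' -type f \
    -exec sh -c 'sed "s/a/b/g" "$1" > "$1.stripped"' _ {} \;

cat test.aaaaa.com.stripped    # bbcd
```

Because the pattern '*.com' does not match the generated *.com.stripped names, the new files are not themselves re-processed.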

find -amin doesn't work if -name is excluded [closed]

Closed 7 years ago.
I am running GNU Linux 6.2 on x86_64 hardware. I'm in a directory with 1237 files. I want to list the files created in the last 36 hours, and since I cannot get -atime to work, I use "-amin -2160":
$ find . -amin -2160 -name 'Ar*' -exec ls -l {} \;
-rw-r--r-- 1 oracle dba 2318 Aug 30 04:04 ./Archivelog_backup_08302015040300.log
-rw-r--r-- 1 oracle dba 2317 Aug 30 10:03 ./Archivelog_backup_08302015100321.log
-rw-r--r-- 1 oracle dba 1920 Aug 30 16:21 ./Archivelog_backup_08302015160300.log
-rw-r--r-- 1 oracle dba 2318 Aug 30 22:04 ./Archivelog_backup_08302015220300.log
-rw-r--r-- 1 oracle dba 2318 Aug 31 04:03 ./Archivelog_backup_08312015040300.log
-rw-r--r-- 1 oracle dba 2318 Aug 31 10:04 ./Archivelog_backup_08312015100320.log
But since I don't care what the name is and I want to see ALL files touched in the last 2160 minutes, I type this command,
find . -amin -2160 -exec ls -l {} \;
but it lists all 1237 files in the directory PLUS THE 6 that meet the criterion. Why?
Humbly,
Because one of the matches is the directory entry . itself, and when ls is handed a directory it lists that directory's contents.
The real lesson here is to not use ls in scripts. find has excellent, unambiguous replacements like the -printf predicate. See also http://mywiki.wooledge.org/ParsingLs
You probably also want to add -type f to avoid listing directories.
find . -type f -amin -2160 -printf '%s %f\n'
What you put in the format string obviously depends on which information exactly you actually want to extract for each matched file.
The first name output by
find .
is
.
which, when sent to ls, will list all the files in the current directory. In your first example, . was excluded by -name "Ar*". You could get the same effect by telling find to emit only regular files (not directories) with
find . -type f …

Renaming a '.' file in Ubuntu [closed]

Closed 8 years ago.
I downloaded a file using rsync and accidentally gave it '.' as the destination (I thought that was the directory to download into). So it downloaded the multi-gig file but named it '.':
drwxr-sr-x 2 root apache 4096 May 7 00:42 .
drwxr-sr-x 7 me apache 4096 May 7 00:25 ..
-rw-r--r-- 1 1006 1006 2008805206 Apr 5 04:49 .
-rw------- 1 root apache 1675 May 7 00:25 somefile
-rw------- 1 root apache 392 May 7 00:26 anotherfile.txt
How do I rename the 2GB+ '.' file to something meaningful? Nothing I do seems to work (I've tried mv, rename, etc.); they all say
Device or resource busy
You can use this mv:
mv ./.[[:blank:]]* myfile
Or else try this find:
find . -type f -maxdepth 1 -name '. *' -exec mv '{}' myfile \;
Yes, it's a superuser question, but I found a solution elsewhere, so thanks everyone for trying. We could do this using:
find . -type f | (let i=0; while read f; do mv "$f" "$f"-$i ; let i=$i+1; done)
Not the most elegant way, and probably unsafe too (there is no undo).
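A caveat on the loop above: a plain while read mangles names with leading or trailing blanks and cannot handle embedded newlines. A NUL-delimited variant of the same idea (bash-specific, since read -d '' is not POSIX) is more robust — a sketch:

```shell
cd "$(mktemp -d)"
touch $'weird\nname' ' leading-space'

# -print0 and read -d '' keep each filename intact, newlines and all;
# IFS= preserves leading/trailing whitespace.
i=0
find . -maxdepth 1 -type f -print0 |
while IFS= read -r -d '' f; do
    mv "$f" "renamed-$i"
    i=$((i + 1))
done

ls -A    # renamed-0  renamed-1
```

Note that the loop body runs in a pipeline subshell, so the counter i does not survive past the done.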
Type this from the directory where it is located.
mv ./. newfile
