Current directory after it was moved by another user - Linux

I have a terminal which is open in a folder a:
hostname:/path/to> mkdir a
hostname:/path/to> cd a
hostname:/path/to/a> cat > b.txt
Another user has moved the folder to another location
hostname:/path/to> mv a /another/hidden/path/i/dont/know
I would like to know where he moved it.
The old terminal still works, but pwd shows the old path because of the way the Linux file system works. The old absolute path does not exist any more, of course:
hostname:/path/to/a> ls
b.txt
hostname:/path/to/a> pwd
/path/to/a
hostname:/path/to/a> ls /path/to/a
ls: cannot access /path/to/a: No such file or directory
I thought about traversing the upper hierarchy and looking for the correct folder at each level:
hostname:/path/to/a> ls ../
...
hostname:/path/to/a> ls ../../
...
hostname:/path/to/a> ls ../../../
...
However, this solution may be very hard if one of the levels contains a lot of subdirectories. In my specific case it is not even possible, as I don't have permissions on one of the upper levels.
I guess it may be impossible to find the exact path because of the way the Linux file system works (e.g. there may be many hard links to the same directory). I don't mind getting several candidates for the path; is there a way to find an absolute path that lets me reach the directory?

You can try find / -type d -name "a" to locate the moved directory, assuming that the user didn't change its name, and then cd to the path it prints (find itself can't change your shell's directory). If you would like the path printed with a label, use find / -type d -name "a" -exec echo "path is {}" \;
EDIT
If there are more directories with the same name and you know roughly when the directory was moved, use find / -type d -name "a" -mmin -$minutes -exec echo "path is {}" \;, where $minutes is the number of minutes since it was moved, to find all directories named "a" changed within that time period.
If someone renamed and moved the directory, you need to do some manual work. Use find / -type d -mmin -$minutes -ls to list all directories changed within the specified time period.
As hellerpop says, you might want to use -cmin instead of -mmin
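Putting those pieces together, a rough sketch (assuming GNU find, that the directory kept the name "a", and, purely as an example, a 60-minute window):
# List candidate directories named "a" whose status changed in the last
# 60 minutes; 2>/dev/null hides permission-denied noise.
find / -type d -name "a" -cmin -60 2>/dev/null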

Answering my own question: I can use lsof on a uniquely-named subfolder (I have to cd into it so that it counts as "open"):
hostname:/path/to/a> mkdir uniquely_named_directory
hostname:/path/to/a> cd uniquely_named_directory
hostname:/path/to/a/uniquely_named_directory> lsof | grep uniquely_named_directory
This can be done only if I already have such a folder or have write permissions on the folder to create one. Alternatively, if I know that the folder was not renamed and has a distinctive enough name, I can simply try lsof | grep a.
(Based on an idea suggested here in a deleted answer...).
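The same lsof idea can also be pointed at the shell that is still sitting inside the moved directory. A minimal sketch, assuming lsof is installed and this is run from that old terminal:
# Show this shell's current-working-directory entry; the NAME column
# holds the directory's new absolute path.
lsof -a -p $$ -d cwd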

Related

Copying files from /etc ending with a digit to the test1 directory

I'm new to Linux and, as an exercise, I need to copy the files from "etc" that end with a digit into the test1 directory in my home directory
(with one command).
I tried this but it doesn't work:
find /etc -type f -iname "*[3-9]" -exec cp {} ../test1/ \;
This should work for the files in your home directory ending with a digit:
mv `ls . |grep -Eo "^.*[0-9]$"` your-directory
Let's say the current directory contains some files like ofjweifhwef9 or kfhiofeh8 (files ending with a digit),
so ls will list them.
The grep expression "^.*[0-9]$" will match only the names ending with a digit (it runs on the plain file names listed by ls, not on full paths like "/etc/somefile123"),
and then mv will move those files to your-directory.
Note: if grep cannot find any files ending with a number, you will of course see an error, because mv needs two operands and one of them is missing:
mv: missing destination file operand after './your-directory'
It is probably because /etc is a link on the system that you're using, and find doesn't seem to consider it a path until you add an extra / at the end. Try this instead:
find /etc/ -type f -iname "*[3-9]" -exec cp {} ../test1/ \;
Notice the /etc/ instead of /etc. I get the same behavior on my Mac where /etc is a link to another directory.
Of course, also make sure that you have files whose names end in a digit under the /etc/ directory tree. I have none on my Mac. You should get some files when you run:
find /etc/ -type f -iname "*[3-9]"
If you don't, you don't have any files to copy. You may also try: find /etc/ to see all files under the directory tree.
Finally, you may want to add the option -depth 1 (with GNU find, -maxdepth 1) if you only want to copy the files directly in the /etc/ directory, as opposed to all the matching files in the directory tree under /etc/.
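For example, with GNU find the non-recursive variant could look like the sketch below (the asker's pattern "*[3-9]" only matches names ending in 3 through 9; "*[0-9]" is used here to match any trailing digit):
# Copy only the files directly inside /etc/ whose names end in a digit
# into the test1 directory one level above the current directory.
find /etc/ -maxdepth 1 -type f -iname "*[0-9]" -exec cp {} ../test1/ \;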

How do you move files from one folder to another?

I am trying to move specific files from one folder to another. Would the below work?
mkdir test
touch test1.sh
touch test2.sh
touch test3.sh
mkdir test2
find test/ | xargs -I% mv % test2
I think this can work:
find ./ -name "test*.sh" | xargs -I% mv % test2
There is something odd in your example:
If test1 does not contain any subdirectories, or if you want to move the subdirectories as they are, you could simply do a
mv test1/* test2
(Note that this would, by default, not move entries whose names start with a period. If that is a problem, you should either consider a shell that can glob dotfiles, such as bash with its dotglob option or zsh, or indeed use find, to be on the safe side, together with the -prune option.)
The problem starts with subdirectories. The output of find contains the directories as well as the files beneath them. The mv inside the xargs would then move, say, a directory test1/foo, and when it later tries to process a file test1/foo/bar/baz.txt, the file is not there anymore. The overall effect would be that you move all the subdirectories anyway (as in my first solution, which does not need find), but in addition get plenty of error messages.
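If the intent is to move only the regular files and flatten them into test2, a minimal sketch using the same directory names as above (-type f is what avoids the problem just described):
# Hand only regular files to mv; directories themselves are never moved.
find test1 -type f -exec mv {} test2/ \;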

Linux: recursively copy files to their parent folder

I want to recursively copy files with a specific file extension to their parent folder. For example:
./folderA/folder1/*.txt to ./folderA/*.txt
./folderB/folder2/*.txt to ./folderB/*.txt
etc.
I checked the cp and find commands but couldn't get it working.
I suspect that while you say copy, you actually mean to move the files up to their respective parent directories. It can be done easily using find:
$ find . -name '*.txt' -type f -execdir mv -n '{}' ../ \;
The above command recurses into the current directory . and then applies the following cascade of conditionals to each item found:
-name '*.txt' keeps only files that have the .txt extension
-type f keeps only regular files (e.g., not directories that – for whatever reason – happen to have a name ending in .txt)
-execdir mv -n '{}' ../ \; executes the command mv -n '{}' ../ in the containing directory, where the {} is a placeholder for the matched file's name and the single quotes are needed to stop the shell from interpreting the curly braces. The ; terminates the command and again has to be escaped so the shell does not interpret it.
I have passed the -n flag to the mv program to avoid accidentally overwriting an existing file.
The above command will transform the following file system tree
dir1/
    dir11/
        file3.txt
        file4.txt
    dir12/
    file2.txt
dir2/
    dir21/
        file6.dat
    dir22/
        dir221/
            file8.txt
        file7.txt
    file5.txt
dir3/
    file9.dat
file1.txt
into this one:
dir1/
    dir11/
    dir12/
    file3.txt
    file4.txt
dir2/
    dir21/
        file6.dat
    dir22/
        dir221/
        file8.txt
    file7.txt
dir3/
    file9.dat
file2.txt
file5.txt
To get rid of the empty directories, run
$ find . -type d -empty -delete
Again, this command will traverse the current directory . and then apply the following:
-type d this time filters out only directories
-empty filters out only those that are empty
-delete deletes them.
Fine print: -execdir is not specified by POSIX, though major implementations (at least the GNU and BSD one) support it. If you need strict POSIX compliance, you'll have to make do with the less safe -exec which would need additional thought to be applied correctly in this case.
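For reference, a rough sketch of how the plain -exec route could look, assuming a POSIX sh is available (dirname reconstructs the containing directory that -execdir provides for free; -n is kept for the same don't-overwrite safety, though it is not itself POSIX):
# Move each matching file into the parent of its containing directory.
find . -name '*.txt' -type f -exec sh -c '
    for f; do
        mv -n "$f" "$(dirname "$f")/.."
    done
' sh {} +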
Finally, please try your commands in a test directory with dummy files, not your actual data. Especially with the -delete option of find, you can lose all your data quicker than you might imagine. Read the man page and, if that is not enough, the reference manual of find. Never blindly copy shell commands from random strangers posted on the internet if you don't understand them.
Try this command:
$ cp ./folderA/folder1/*.txt ./folderA
Run something like this from the root(ish) directory:
#!/bin/bash
new_dir() {
    local LOC_DIR
    LOC_DIR=$(pwd)
    for i in "${LOC_DIR}"/*; do
        # Copy plain files one level up.
        [[ -f "${i}" ]] && cp "${i}" ../
        # Descend into subdirectories, recurse, then come back up.
        if [[ -d "${i}" ]]; then
            if cd "${i}"; then
                new_dir
                cd ..
            fi
        fi
    done
    return 0
}
new_dir
This will search each directory. When a file is encountered, it copies the file up a directory. When a directory is found, it will move down into the directory and start the process over again. I think it'll work for you.
Good luck.

Linux command to create the empty file called 'test1'

Enter a Linux command to create the empty file called 'test1' in the directory 'systems' (you are still in your home directory).
Assuming 'systems' is a subdirectory of the current directory:
touch systems/test1
Assuming that you only know that the directory 'systems' is some subdirectory in the directory tree of the current directory, then find . -name systems -type d -exec touch "{}/test1" \; will create such a file. Alternatively, so will find . -name systems -type d -execdir touch systems/test1 \; However, both will do so in every subdirectory named 'systems' in the current directory tree. We could limit that action to only the first, the last, or some other criterion, but the list of possible permutations is just too long.
You really have not provided enough information for us to provide a complete answer.

How can I generate a list of files with their absolute path in Linux?

I am writing a shell script that takes file paths as input.
For this reason, I need to generate recursive file listings with full paths. For example, the file bar has the path:
/home/ken/foo/bar
but, as far as I can see, both ls and find only give relative path listings:
./foo/bar (from the folder ken)
It seems like an obvious requirement, but I can't see anything in the find or ls man pages.
How can I generate a list of files in the shell including their absolute paths?
If you give find an absolute path to start with, it will print absolute paths. For instance, to find all .htaccess files in the current directory:
find "$(pwd)" -name .htaccess
or if your shell expands $PWD to the current directory:
find "$PWD" -name .htaccess
find simply prepends the path it was given to a relative path to the file from that path.
Greg Hewgill also suggested using pwd -P if you want to resolve symlinks in your current directory.
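A tiny sketch combining the two suggestions, so symlinks in the current directory are resolved before find prepends the path:
# Print absolute, symlink-free paths of all .htaccess files below here.
find "$(pwd -P)" -name .htaccess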
readlink -f filename
gives the full absolute path, but if the file is a symlink, you'll get the final resolved name.
Use this for dirs (the / after ** is needed in bash to limit it to directories):
ls -d -1 "$PWD/"**/
this for files and directories directly under the current directory, whose names contain a .:
ls -d -1 "$PWD/"*.*
this for everything:
ls -d -1 "$PWD/"**/*
Taken from here
http://www.zsh.org/mla/users/2002/msg00033.html
In bash, ** is recursive if you enable shopt -s globstar.
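A minimal sketch of that bash variant (globstar is off by default and needs bash 4 or later):
# Enable recursive globbing, then list everything with absolute paths.
shopt -s globstar
ls -d -1 "$PWD/"**/*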
You can use
find $PWD
in bash
ls -d "$PWD/"*
This looks only in the current directory. It quotes "$PWD" in case it contains spaces.
Command: ls -1 -d "$PWD/"*
This will give the absolute paths of the files, like below.
[root@kubenode1 ssl]# ls -1 -d "$PWD/"*
/etc/kubernetes/folder/file-test-config.txt
/etc/kubernetes/folder/file-test.txt
/etc/kubernetes/folder/file-client.txt
Try this:
find "$PWD"/
You get a list of absolute paths in the working directory.
You can do
ls -1 | xargs realpath
If you need to specify an absolute or relative path, you can do that as well:
ls -1 $FILEPATH | xargs realpath
The $PWD is a good option by Matthew above. If you want find to print only files, then you can also add the -type f option to search only for normal files. Other options are "d" for directories only, etc. So in your case it would be (if you want to search only for files with the .c extension):
find $PWD -type f -name "*.c"
or if you want all files:
find $PWD -type f
Note: You can't make an alias for the above command, because $PWD gets expanded to your current directory (your home directory, say) at the moment the alias is defined by bash.
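One sketch of a workaround, assuming bash: define the alias with single quotes so $PWD is expanded only when the alias is used (lsabs is just a made-up name here):
# Single quotes keep $PWD from expanding at definition time.
alias lsabs='find "$PWD" -type f'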
If you give the find command an absolute path, it will spit the results out with an absolute path. So, from the Ken directory if you were to type:
find /home/ken/foo/ -name bar -print
(instead of the relative path find . -name bar -print)
You should get:
/home/ken/foo/bar
Therefore, if you want an ls -l and have it return the absolute path, you can just tell the find command to execute an ls -l on whatever it finds.
find /home/ken/foo -name bar -exec ls -l {} \;
NOTE: There is a space between {} and \;
You'll get something like this:
-rw-r--r-- 1 ken admin 181 Jan 27 15:49 /home/ken/foo/bar
If you aren't sure where the file is, you can always change the search location. As long as the search path starts with "/", you will get an absolute path in return. If you are searching a location (like /) where you are going to get a lot of permission denied errors, then I would recommend redirecting standard error so you can actually see the find results:
find / -name bar -exec ls -l {} \; 2> /dev/null
(2> is the syntax for the Bourne and Bash shells, but will not work with the C shell. It may work in other shells too, but I only know for sure that it works in Bourne and Bash.)
Just an alternative to
ls -d "$PWD/"*
to pinpoint that * is shell expansion, so
echo "$PWD/"*
would do the same (the drawback is that you cannot use -1 to separate entries by newlines instead of spaces).
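A small sketch that keeps the pure shell expansion but still gets one entry per line, using the printf builtin instead of echo:
# %s\n prints each expanded path on its own line.
printf '%s\n' "$PWD/"*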
fd
Using fd (alternative to find), use the following syntax:
fd . foo -a
Where . is the search pattern and foo is the root directory.
E.g. to list all files in /etc recursively, run: fd . /etc -a.
-a, --absolute-path Show absolute instead of relative paths
If you need list of all files in current as well as sub-directories
find $PWD -type f
If you need list of all files only in current directory
find $PWD -maxdepth 1 -type f
You might want to try this.
for name in /home/ken/foo/bar/*
do
echo $name
done
You can get the absolute paths simply with a for loop and echo, without find.
Find jar files recursively and print their absolute paths:
ls -R | grep "\.jar$" | xargs readlink -f
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ojdbc8-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ons-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/oraclepki-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/osdt_cert-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/osdt_core-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/simplefan-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ucp-19.3.0.0.jar
This works best if you want a dynamic solution that works well in a function
lfp ()
{
    ls -1 $1 | xargs -I{} echo $(realpath $1)/{}
}
lspwd() { for i in "$@"; do ls -d -1 "$PWD/$i"; done; }
Here's an example that prints out a list without an extra period and that also demonstrates how to search for a file match. Hope this helps:
find . -type f -name "extr*" -exec echo `pwd`/{} \; | sed "s|\./||"
This worked for me. But it didn't list in alphabetical order.
find "$(pwd)" -maxdepth 1
This command lists alphabetically as well as lists hidden files too.
ls -d -1 "$PWD/".*; ls -d -1 "$PWD/"*;
stat
Absolute path of a single file:
stat -c %n "$PWD"/foo/bar
This will give the canonical path (will resolve symlinks): realpath FILENAME
If you want canonical path to the symlink itself, then: realpath -s FILENAME
Most if not all of the suggested methods result in paths that cannot be used directly in some other terminal command if the path contains spaces. Ideally the results will have the spaces escaped with backslashes.
This works for me on macOS:
find / -iname "*SEARCH TERM spaces are okay*" -print 2>&1 | grep -v denied |grep -v permitted |sed -E 's/\ /\\ /g'
for p in <either relative or absolute path of the directory>/*; do
echo $(realpath -s $p)
done
Recursive file listings can be produced in many ways in Linux. Here I am sharing a one-liner to clear all log files (files only) under the /var/log/ directory, and a second one to check which log files have been written to recently.
First:
find /var/log/ -type f #listing file recursively
Second:
for i in $(find $PWD -type f) ; do cat /dev/null > "$i" ; done #empty files recursively
Third use:
ls -ltr $(find /var/log/ -type f ) # listing file used in recent
Note: for directory location you can also pass $PWD instead of /var/log.
If you don't have symbolic links, you could try
tree -ifL 1 [DIR]
-i makes tree print filenames in each line, without the tree structure.
-f makes tree print the full path of each file.
-L 1 prevents tree from recursing.
Write one small function
lsf() {
    ls `pwd`/$1
}
Then you can use it like
lsf test.sh
and it gives the full path, like:
/home/testuser/Downloads/test.sh
I used the following to list absolute path of files in a directory in a txt file:
find "$PWD" -wholename '*.JPG' >test.txt
find / -print will do this
ls -1 | awk -vpath=$PWD/ '{print path$1}'
