Bash: Cut current path until a certain folder - linux

Let's say I have three bash files in different directories:
/a/b/c/d/e/f/script1.sh
/a/bb/c/d/script2.sh
/aa/b/c/d/e/f/g/h/script3.sh
If I call $(pwd) I get the path of the current directory. Is there a way to somehow "crop" this path up to a certain folder? The following shows an example where that folder is called "c":
In the case of script1.sh I would like to have the path: /a/b/c
In the case of script2.sh I would like to have the path: /a/bb/c
In the case of script3.sh I would like to have the path: /aa/b/c
Thank you for your help

I assume what you want is parameter expansion:
$ path="/a/b/c/d/e/f/script1.sh"
$ echo "${path#*/c}"
/d/e/f/script1.sh
Edit
Inverted:
$ path="/a/b/c/d/e/f/script1.sh"
$ echo "${path%/d*}"
/a/b/c
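If you don't know the name of the component that follows (d here), a variation of the same expansion keyed on the folder name itself works too (a sketch; folder is an illustrative variable):
$ folder=c
$ echo "${path%%/"$folder"/*}/$folder"
/a/b/c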
Regards!

Use cut command:
echo '/a/b/c/d/e/f/script1.sh' | cut -d '/' -f 1-4
echo '/a/bb/c/d/script2.sh' | cut -d '/' -f 1-4
echo '/aa/b/c/d/e/f/g/h/script3.sh' | cut -d '/' -f 1-4
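If you don't want to hard-code the field number, a small sketch (crop_to_folder is a hypothetical helper, not part of cut) can compute it from the folder name first:
# Hypothetical helper: crop a path at the first component whose name matches $1.
crop_to_folder() {
    local target=$1 path=${2:-$PWD} n
    # awk -F/ numbers the path components; print the index of the first match.
    n=$(printf '%s\n' "$path" | awk -F/ -v t="$target" \
        '{ for (i = 1; i <= NF; i++) if ($i == t) { print i; exit } }')
    [ -n "$n" ] && printf '%s\n' "$path" | cut -d '/' -f "1-$n"
}
crop_to_folder c /a/bb/c/d/script2.sh    # prints /a/bb/c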

Bash regex:
#!/bin/bash
[[ "$PWD" =~ (^/[^/]+/[^/]+/[^/]+)(.*) ]] && echo ${BASH_REMATCH[1]}
It returns the first three components of your path (if there are at least three components). You could also set the path to a variable, for example $pwd, and:
$ pwd=/a/b/c/d/e/f/script1.sh
$ [[ "$pwd" =~ (^/[^/]+/[^/]+/[^/]+)(.*) ]] && echo ${BASH_REMATCH[1]}
/a/b/c
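If the number of components is not fixed at three, the same idea works with the pattern built in a variable first (a sketch; n is an illustrative variable):
$ n=3
$ regex="^(/[^/]+){$n}"
$ [[ "$pwd" =~ $regex ]] && echo "${BASH_REMATCH[0]}"
/a/b/c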
Also, note @123's comment below; that is the correct way, my mind was off. Thank you, sir.

Given a situation like this:
$ pwd
/a/b/c/d/cc/e/c/f
$ FOLDER=c
You could use shell parameter expansion on the $PWD variable like this:
$ echo "${PWD%/${FOLDER}/*}/${FOLDER}"
/a/b/c/d/cc/e/c
$ echo "${PWD%%/${FOLDER}/*}/${FOLDER}"
/a/b/c
The difference is in the single or double %. With a single %, the expansion removes the shortest suffix of ${PWD} that matches the pattern; with %% it removes the longest matching suffix.
Just make sure you enclose ${FOLDER} in forward slashes, because otherwise it could also match directories that merely contain the name (like the directory cc in this example).
Because you want the folder you were looking for to be included, it is appended to the end of the result, prefixed with a forward slash.
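To reuse this, a minimal sketch wrapping the %% expansion in a function (crop_path is an illustrative name) could look like:
crop_path() {
    # Print $PWD cropped at the first occurrence of the given folder name;
    # assumes the folder is followed by at least one more path component.
    local folder=$1
    printf '%s\n' "${PWD%%/"$folder"/*}/$folder"
}
crop_path c    # in /a/b/c/d/cc/e/c/f this prints /a/b/c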

Try this:
[sahaquiel@sahaquiel-PC h]$ pwd
/home/sahaquiel/a/b/c/d/e/f/g/h
[sahaquiel@sahaquiel-PC h]$ pwd | cut -d"/" -f1-6
/home/sahaquiel/a/b/c

Related

Linux Bash Script - match lower case path in argument with actual filesystem path

I have a Linux script that gets an argument passed to it that originates from MSDOS (actually DOSEMU running MS DOS 6.22). The argument that gets passed is case-insensitive (as DOS didn't distinguish case), but of course Linux does.
I am trying to get from the following passed argument
/media/zigg4/vol1/database/scan/stalbans/docprint/wp23452.wpd
to
/media/zigg4/vol1/Database/SCAN/STALBANS/DOCPRINT/Wp23452.WPD
I do not know the actual case sensitive path so I need to somehow determine it from the argument that is passed to the script. I have absolutely no idea where to start with this so any help is greatly appreciated.
edited for extra information and clarity
UPDATE
Thanks to the answer by @anubhava I used the following:
#!/bin/bash
copies=1
if [ ! -z "$2" ]; then
    copies=$2
fi
find / -readable -ipath "$1" 2>&1 | grep -v "Permission denied" | while IFS= read -r FILE; do
    lpr -o Collate=True -#"$copies" -sP "$FILE"
done
Works great :-)
You can use the -ipath option of find for case-insensitive path matching:
# assuming $arg contains path argument supplied
find . -ipath "*$arg*"
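For example, to print just the first case-corrected match (a sketch: the path is the one from the question, and -print -quit assume GNU find):
arg='/media/zigg4/vol1/database/scan/stalbans/docprint/wp23452.wpd'
find /media -ipath "$arg" -print -quit 2>/dev/null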
I would employ awk for this (of course without salary)
#!/bin/bash
awk -varg="$1" -vactual="/media/zigg4/vol1/Database/SCAN/STALBANS/DOCPRINT/Wp23452.WPD" 'BEGIN{
    if (tolower(arg)==tolower(actual)){
        printf "Argument matches actual filepath\n"
    }
}'
Run the script as
./script "/media/zigg4/vol1/database/scan/stalbans/docprint/wp23452.wpd"
Something like this:
if [ "$( echo "$real" | tr A-Z a-z )" = "$lower" ]; then
    echo "matchy"
else
    echo "no is matchy"
fi
Some notes:
tr is doing a to-lower translate.
The $( ... ) bit is placing the result of the enclosed command into a string.
You could do the translate on either side if you aren't sure if your "lower case" string can be trusted...
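A runnable sketch of translating both sides (both values below are illustrative placeholders; in the real script they would come from find and from the DOS-supplied argument):
real='/media/zigg4/vol1/Database/SCAN/STALBANS/DOCPRINT/Wp23452.WPD'
arg='/Media/Zigg4/vol1/database/scan/stalbans/docprint/WP23452.wpd'
if [ "$(echo "$real" | tr A-Z a-z)" = "$(echo "$arg" | tr A-Z a-z)" ]; then
    echo "matchy"
else
    echo "no is matchy"
fi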

Bash loop through directory including hidden file

I am looking for a way to make a simple loop in bash over everything my directory contains, i.e. files, directories and links including hidden ones.
I would prefer it to be specifically in bash, but it should be as general as possible. Of course, file names (and directory names) can contain whitespace, newlines, and symbols: everything but "/" and the ASCII NUL byte (0x00), even as the first character. Also, the result should exclude the '.' and '..' directories.
Here is a generator of the files the loop has to deal with:
#!/bin/bash
mkdir -p test
cd test
touch A 1 ! "hello world" \$\"sym.dat .hidden " start with space" $'\n start with a newline'
mkdir -p ". hidden with space" $'My Personal\nDirectory'
So my loop should look like this (but it has to deal with the tricky stuff above):
for i in * ; do
    echo ">$i<"
done
My closest try, using ls and a bash array, does not work either:
IFS=$(echo -en "\n\b")
l=( $(ls -A .) )
for i in ${l[@]} ; do
    echo ">$i<"
done
unset IFS
Or using bash arrays, but the ".." directory is not excluded:
IFS=$(echo -en "\n\b")
l=( [[:print:]]* .[[:print:]]* )
for i in ${l[@]} ; do
    echo ">$i<"
done
unset IFS
* doesn't match files beginning with ., so you just need to be explicit:
for i in * .[^.]*; do
    echo ">$i<"
done
.[^.]* will match all files and directories starting with ., followed by a non-. character, followed by zero or more characters. In other words, it's like the simpler .*, but excludes . and ... If you need to match something like ..foo, then you might add ..?* to the list of patterns.
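Another option, not used in this answer but standard in bash, is the dotglob shell option, which makes * also match hidden names (globs never produce . or ..); a short sketch:
shopt -s dotglob nullglob    # dotglob: * matches hidden names; nullglob: empty matches expand to nothing
for i in *; do
    echo ">$i<"
done
shopt -u dotglob nullglob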
As chepner noted in the comments below, this solution assumes you're running GNU bash along with GNU find and GNU sort...
GNU find can be prevented from recursing into subdirectories with the -maxdepth option. Then use -print0 to end every filename with a 0x00 byte instead of the newline you'd usually get from -print.
The sort -z sorts the filenames between the 0x00 bytes.
Then, you can use sed to get rid of the dot and dot-dot directory entries (although GNU find seems to exclude the .. already).
I also used sed to get rid of the ./ in front of every filename. basename could do that too, but older systems didn't have basename, and you might not trust it to handle the funky characters right.
(These sed commands each required two cases: one for a pattern at the start of the string, and one for the pattern between 0x00 bytes. These were so ugly I split them out into separate functions.)
The read command doesn't have a -z or -0 option like some commands, but you can fake it with -d "" and blanking the IFS environment variable.
The additional -r option prevents a backslash-newline combo from being interpreted as a line continuation. (A file called backslash\\nnewline would otherwise be mangled to backslashnewline.) It might be worth seeing if other backslash-combos get interpreted as escape sequences.
remove_dot_and_dotdot_dirs()
{
    sed \
        -e 's/^[.]\{1,2\}\x00//' \
        -e 's/\x00[.]\{1,2\}\x00/\x00/g'
}
remove_leading_dotslash()
{
    sed \
        -e 's/^[.]\///' \
        -e 's/\x00[.]\//\x00/g'
}
IFS=""
find . -maxdepth 1 -print0 |
sort -z |
remove_dot_and_dotdot_dirs |
remove_leading_dotslash |
while read -r -d "" filename
do
    echo "Doing something with file '${filename}'..."
done
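A shorter variant of the same pipeline (still a sketch assuming GNU find): -mindepth 1 keeps find from printing the . entry itself, so the dot/dot-dot cleanup is unnecessary, and the leading ./ can be stripped with a parameter expansion instead of sed:
find . -mindepth 1 -maxdepth 1 -print0 | sort -z |
while IFS= read -r -d '' filename
do
    echo "Doing something with file '${filename#./}'..."
done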
It may not be the most favorable way, but I tried the following:
while read line ; do echo "$line"; done <<< "$(ls -a | grep -v -w ".")"
Check the trial below, which I did.
Try the find command, something like:
find .
That will list all the files in all recursive directories.
To output only files, excluding the leading ./ and the . and .. entries, try:
find . -type f -printf %P\\n

Get and print directories from $PATH in bash

The script that I have to write must find the directories from the $PATH variable and print only the ones that end with an i.
How I am thinking about doing it:
Get each directory from the variable with a for loop.
Find the length of each directory and get the last character from each using a substring
Use an If condition to print the directories that end with an i
Problems
The directories are not separated by a newline, so I can't read them using a for loop.
Any ideas on how to get over this problem, or can you think of something more appropriate?
You can use this BASH one-liner for that job:
(IFS=':'; for i in $PATH; do [[ -d "$i" && $i =~ i$ ]] && echo "$i"; done)
IFS=':' sets input field separator to :
$PATH is iterated in a for loop
Each path element is tested if it is a directory and if it is ending with i using BASH regex
If the test passes, then it is printed
Use bash's parameter expansion to replace all delimiters.
${parameter//pat/string}
For example,
mypaths="${PATH//:/ }"
will split the path by directory, so then you can run:
for directory in $mypaths
do
...
done
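For the question's actual goal (print only the directories that end with an i), the body could be filled in like this sketch:
for directory in $mypaths
do
    [[ -d "$directory" && "$directory" == *i ]] && echo "$directory"
done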
You can change the Internal Field Separator (IFS) to a colon, and then the path is dissected auto_magically. ;-)
IFS=:
for i in $PATH
do
    echo $i | egrep -e 'i$'
done
grep 'i$' <<<"${PATH//:/$'\n'}"
The $PATH entries are split into individual lines by replacing : instances with newlines ($'\n') in a parameter expansion; $'\n' is an ANSI C-quoted string.
The resulting string is passed to the stdin of grep as a here-string
(<<<...).
grep is then used to match only those lines that end in ($) the letter i.
To match case-insensitively, use grep -i 'i$'.
A demonstration:
$ (PATH='/ends/in_i:/usr/bin:/also/ends_in_i'; grep 'i$' <<<"${PATH//:/$'\n'}")
/ends/in_i
/also/ends_in_i

Removing an additional leading slash if one exists

I'm writing a script where I have a default directory for outputting data or the user can specify a directory. The problem is, I don't know how to do this eloquently. Here is what I have:
#!/bin/bash
OUTPUT="$1"
DEFAULT_DIR=/Default/Dir/For/Me
if [ -z "$OUTPUT" ]
then
    OUTPUT=.${DEFAULT_DIR}
else
    OUTPUT=""${OUTPUT}""${DEFAULT_DIR}""
fi
echo "$OUTPUT"
If I do this ./script / I get //Default/Dir/For/Me
If I do this ./script /home I get /home/Default/Dir/For/Me
If I do this ./script /home/ I get /home//Default/Dir/For/Me
Is there any way to make this pretty and handle the first scenario properly? Obviously, the first scenario won't work because the directory // does not exist.
(Just to make it clear from the comments)
What I suggest is to pipe tr -s "/" so that it removes duplicate slashes:
$ echo "/home//Default/Dir/For/Me" | tr -s "/"
/home/Default/Dir/For/Me
$ echo "/home//Default/Dir/For/M//////////e" | tr -s "/"
/home/Default/Dir/For/M/e
Here's another solution without having to fork another process:
DEFAULT_DIR=${DEFAULT_DIR//\/\///}
That replaces all occurrences of // with / in the string.
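Applied to the script from the question, a minimal sketch (keeping the question's variable names) would be:
#!/bin/bash
DEFAULT_DIR=/Default/Dir/For/Me
OUTPUT="${1:-.}${DEFAULT_DIR}"
OUTPUT=${OUTPUT//\/\///}    # collapse each "//" into a single "/"
echo "$OUTPUT"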

Looping through the elements of a path variable in Bash

I want to loop through a path list that I have gotten from an echo $VARIABLE command.
For example:
echo $MANPATH will return
/usr/lib:/usr/sfw/lib:/usr/info
So that is three different paths, each separated by a colon. I want to loop though each of those paths. Is there a way to do that? Thanks.
Thanks for all the replies so far, it looks like I actually don't need a loop after all. I just need a way to take out the colon so I can run one ls command on those three paths.
You can set the Internal Field Separator:
( IFS=:
for p in $MANPATH; do
    echo "$p"
done
)
I used a subshell so the change in IFS is not reflected in my current shell.
The canonical way to do this, in Bash, is to use the read builtin appropriately:
IFS=: read -r -d '' -a path_array < <(printf '%s:\0' "$MANPATH")
This is the only robust solution: it will do exactly what you want: split the string on the delimiter : while being safe with respect to spaces, newlines, and glob characters like *, [ ], etc. (unlike the other answers, which are all broken).
After this command, you'll have an array path_array, and you can loop on it:
for p in "${path_array[@]}"; do
    printf '%s\n' "$p"
done
You can use Bash's pattern substitution parameter expansion to populate your loop variable. For example:
MANPATH=/usr/lib:/usr/sfw/lib:/usr/info
# Replace colons with spaces to create list.
for path in ${MANPATH//:/ }; do
    echo "$path"
done
Note: Don't enclose the substitution expansion in quotes. You want the expanded values from MANPATH to be interpreted by the for-loop as separate words, rather than as a single string.
In this way you can safely go through the $PATH with a single loop, while $IFS will remain the same inside or outside the loop.
while IFS=: read -d: -r path; do # `$IFS` is only set for the `read` command
    echo $path
done <<< "${PATH:+"${PATH}:"}" # append an extra ':' if `$PATH` is set
You can check the value of $IFS,
IFS='xxxxxxxx'
while IFS=: read -d: -r path; do
    echo "${IFS}${path}"
done <<< "${PATH:+"${PATH}:"}"
and the output will be something like this.
xxxxxxxx/usr/local/bin
xxxxxxxx/usr/bin
xxxxxxxx/bin
Reference to another question on StackExchange.
for p in $(echo $MANPATH | tr ":" " ") ; do
    echo $p
done
IFS=:
arr=(${MANPATH})
for path in "${arr[@]}" ; do # <- quotes required
    echo $path
done
... it does take care of spaces :o) but also adds empty elements if you have something like:
:/usr/bin::/usr/lib:
... then indexes 0 and 2 will be empty (''); index 4 isn't set at all because a trailing delimiter does not produce an extra empty field during word splitting.
This can also be solved with Python, on the command line:
python -c "import os,sys;[os.system(' '.join(sys.argv[1:]).format(p)) for p in os.getenv('PATH').split(':')]" echo {}
Or as an alias:
alias foreachpath="python -c \"import os,sys;[os.system(' '.join(sys.argv[1:]).format(p)) for p in os.getenv('PATH').split(':')]\""
With example usage:
foreachpath echo {}
The advantage to this approach is that {} will be replaced by each path in succession. This can be used to construct all sorts of commands, for instance to list the size of all files and directories in the directories in $PATH, including directories with spaces in the name:
foreachpath 'for e in "{}"/*; do du -h "$e"; done'
Here is an example that shortens the length of the $PATH variable by creating symlinks to every file and directory in the $PATH in $HOME/.allbin. This is not useful for everyday usage, but may be useful if you get the too many arguments error message in a docker container, because bitbake uses the full $PATH as part of the command line...
mkdir -p "$HOME/.allbin"
python -c "import os,sys;[os.system(' '.join(sys.argv[1:]).format(p)) for p in os.getenv('PATH').split(':')]" 'for e in "{}"/*; do ln -sf "$e" "$HOME/.allbin/$(basename $e)"; done'
export PATH="$HOME/.allbin"
This should also, in theory, speed up regular shell usage and shell scripts, since there are fewer paths to search for every command that is executed. It is pretty hacky, though, so I don't recommend that anyone shorten their $PATH this way.
The foreachpath alias might come in handy, though.
Combining ideas from:
https://stackoverflow.com/a/29949759 - gniourf_gniourf
https://stackoverflow.com/a/31017384 - Yi H.
code:
PATHVAR='foo:bar baz:spam:eggs:' # demo path with space and empty
printf '%s:\0' "$PATHVAR" | while IFS=: read -d: -r p; do
    echo $p
done | cat -n
output:
1 foo
2 bar baz
3 spam
4 eggs
5
You can use Bash's for X in ${} notation to accomplish this:
for p in ${PATH//:/$'\n'} ; do
    echo $p;
done
OP's update wants to ls the resulting folders, and has pointed out that ls only requires a space-separated list.
ls $(echo $PATH | tr ':' ' ') is nice and simple and should fit the bill nicely.
