I'm trying to identify several folders, numbered 101-121, from my script. The script is written to look through one specific folder at a time.
The error I get is:
command not found
Usage: grep [OPTION]... PATTERN [FILE]...
Try 'grep --help' for more information.
Here is the piece of my code that is not working:
for i in 1; do
    case $i in
        1)
            projectfolder= `ls -l| grep "1*"` ;;  # trying to identify individual folders 101-121
    esac
done
It does not isolate the folders very well.
projectfolder= `ls -l| grep "1*"`
is a terrible thing to do. First, you probably intended to write projectfolder=$(ls -l| grep "1*") (using $() for readability, but the important detail is the lack of space after the =), but doing that is also a bad idea. Why not just do for i in 1*; do ...?
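A minimal sketch of that glob approach; the trailing / restricts the matches to directories (each result keeps its trailing slash):
for projectfolder in 1*/ ; do
    echo "found directory: $projectfolder"
done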
If your project folders all follow the naming pattern you describe, you should use brace expansion to expand to the numbers 101..121 and then easily iterate over them:
for projectfolder in {101..121} ; do
[ -d "$projectfolder" ] && echo "'${projectfolder}' exists and is a directory."
done
Brace expansion does not check for any of the directories' existence, so to see which ones are actually there, you test each one with [ -d.
Search for "Brace Expansion" in the bash(1) manual page, and type help test for more information.
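You can see what the brace expansion produces by trying it at the prompt:
echo {101..121}    # prints: 101 102 103 ... 121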
I need to accommodate spaces in file paths. Why does find not work from the script, but work from the CLI?
MyLaptop$ ./my-bash-script.sh
find: '/Sandbox/test folder/testfiles-good/ResidentFile_1.pcapng': No such file or directory
MyLaptop$ find '/Sandbox/test folder/testfiles-good/ResidentFile_1.pcapng'
/Sandbox/test folder/testfiles-good/ResidentFile_1.pcapng
Using echo find -f "'$line'" gives the output:
find -f '/Sandbox/test folder/testfiles-good/ResidentFile_1.pcapng'
But in this case:
FOUND="$(find -f "'$line'")"
it does not work.
In this case ...
FOUND="$(find -f "'$line'")"
... you are asking find for a file whose name contains leading and trailing ' characters. That is unlikely to be what you intended. Though it may look strange, this is probably what you meant:
FOUND="$(find -f "$line")"
The " characters inside the command substitution do not pair with those outside (and if they did, the original command substitution would be wrong in a different way). On the other hand, word splitting is not performed on the result of a command substitution when it is used on the right-hand side of a variable assignment, so you could also just use
FOUND=$(find -f "$line")
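For context, a minimal sketch of the surrounding loop, assuming the paths arrive one per line from a file (paths.txt is a hypothetical name, and -f is the BSD find flag used in the question):
while IFS= read -r line; do
    FOUND=$(find -f "$line")    # the double quotes keep spaces in $line intact
    printf '%s\n' "$FOUND"
done < paths.txt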
Hi… Need a little help here…
I tried to emulate the DOS' dir command in Linux using bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls within the script acts differently compared to the direct ls command given at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below for the comparison in action. I don't know why it behaves like that.
Anyone knows how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk, it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
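A minimal sketch of that fix, reusing the ls invocation from the question to handle every argument instead of just the first:
[ $# -eq 0 ] && set -- .    # default to the current folder when no arguments are given
for var in "$@"; do
    ls -lgG --color --group-directories-first -- "$var"
done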
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, the shell expands it and passes each matching filename to your script as a separate argument: $1, $2, and so on.
You're lucky you simply got the first file. And because the script later expands $var unquoted (inside $CMD), a path containing spaces will additionally be split at the spaces, so you would only get the segment before the first space.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings from user input.
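If you do need to build a command ahead of time, a bash array is the usual safe alternative (a sketch based on the script's own ls invocation):
cmd=(ls -lgG --color --group-directories-first -- "$var")
"${cmd[@]}"    # each array element stays one argument, spaces and all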
Both of the above answers have already answered your question, so I'm going to be a bit more verbose.
Your terminal is (probably) running the bash interpreter. This is the program that parses your input line(s) and does things based on that input.
When you enter a line, bash starts the following workflow:
parsing and lexical analysis
expansion
    brace expansion
    tilde expansion
    variable expansion
    arithmetic and other substitutions
    command substitution
    word splitting
    filename generation (globbing)
quote removal
Only after all of the above will bash
execute some external command, like ls or dir.sh, etc.,
or perform some "internal" action for the known keywords and builtins like echo, for, if, etc.
As you can see, the second-to-last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (aka does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first
and afterwards executes the command dir.sh with the already-expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it acts in exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for the two arguments it got.
In other words, ls doesn't even see the test* argument: whenever possible, bash expands the wildcard characters (* and ?).
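You can watch the expansion happen yourself, e.g. with printf, which prints one line per argument it actually receives:
printf 'arg: %s\n' test*    # e.g. arg: test.pas / arg: test.swift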
Now back to your script: add the following line after the shebang:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments your script was executed with.
Also, in such cases it is good practice to execute the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can also do the same for your current interpreter: just enter
set -x
into the terminal and try running dir.sh test* again, and you will see how bash executes the dir.sh command. (To stop the debug mode, just enter set +x.)
Everybody is giving you valuable advice which you should definitely follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quotes_and_asterisk_in_it*\""
echo "$cmd"
eval "$cmd"
The eval command concatenates its arguments and evaluates the result as a shell command, just as the shell itself would.
This solves my problem when I need to call a command with asterisk '*' in it from a script.
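A small illustration of the difference (hypothetical filename): plain $cmd only word-splits and globs, it does not re-parse quotes, while eval runs the string through the full parser again.
cmd='ls -l "name with spaces.txt"'
$cmd          # wrong: the inner quotes are passed literally as parts of the words
eval "$cmd"   # re-parsed: the quotes group the filename into one argument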
I need to find only the files in a directory which have an extension, using ls (I can't use find).
I tried ls *.*, but if the dir doesn't contain any file with an extension it returns "No such file or directory".
I don't want that error; I want ls to just return to the command prompt if there are no files with an extension.
I have tried using grep with ls to achieve the same:
ls | grep "*.*" - doesn't work,
but ls | grep "\." works.
I have no idea why grep "*.*" doesn't work. Any help is appreciated!
Thanks!
I think the correct solution is this:
( shopt -s nullglob ; echo *.* )
It's a bit verbose, but it will always work no matter what kind of funky filenames you have. (The problem with piping ls to grep is that typical systems allow really bizarre characters in filenames, including, for example, newlines.)
The shopt -s nullglob part enables ("sets") the nullglob shell option, which tells Bash that if no files have names matching *.*, then the *.* should be removed (i.e., should expand into nothing) rather than being left alone.
The parentheses (...) are to set up a subshell, so the nullglob option is only enabled for this small part of the script.
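In a script you would more likely capture the matches than echo them; a sketch under the same nullglob assumption:
shopt -s nullglob
files=(*.*)    # an empty array if nothing matches
(( ${#files[@]} )) && printf '%s\n' "${files[@]}"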
It's important to understand the difference between a shell pattern and a regular expression. Shell patterns are a bit simpler, but less flexible. grep matches using a regular expression. A shell pattern like
*.*
would be done with a regular expression as
.*\..*
but the regular expressions in grep are not anchored, which means it searches for a match anywhere on the line, making the two .* parts unnecessary.
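To make the correspondence explicit, grep's -x option anchors the pattern to the whole line, at which point the two .* parts matter again:
ls | grep -x '.*\..*'    # anchored: the regex must match the entire name
ls | grep '\.'           # unanchored: a literal dot anywhere is enough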
Try
ls -1 | grep "\."
It lists only files with an extension, and nothing (an empty list) if there is no such file: just what you need.
With Linux grep, you can add -v to get a list of the files with no extension.
I know how to do a search and replace amongst group of files:
perl -pi -w -e 's/search/replace/g;' *.php
So I can use that to search for a keyword or phrase and change it. But I have a more complicated task that I don't know how to do.
I want to do a search and replace among all my php files to search for a specific Keyword and replace it with the File Name minus the extension.
Example: Search the file Mountains.php for the keyword Trees and everywhere you see Trees, replace it with Mountains
Of course I want to be able to do that in batch, for a few hundred php files all with different names, however, all containing the search term Trees.
If someone is looking for an extra challenge, haha, it would be even better if I could do a more complex scenario such as....
Example: Search the file MountainTowns.php for the keyword Trees and everywhere you see Trees, replace it with "Mountain Towns" (note the extra space; the capital letters would indicate where the spaces go).
Thanks for your time and considering my question.
Well, the filename is in $ARGV, so there is not much more work needed.
perl -i -pe '($x=$ARGV)=~s{\.php$}{};s{Trees}{$x}g' BlueMountains.php RedMountains.php
Add in
$x=~s{(.)([A-Z])}{$1 $2}g;
to add the space before upcased letters, for a complete line of
perl -i -pe '($x=$ARGV)=~s{\.php$}{};$x=~s{(.)([A-Z])}{$1 $2}g;s{Trees}{$x}g' BlueRedMountains.php
This might work for you:
printf "%s\n" *.php |perl -pwe 's|(.*).php|perl -pi -we "s/Trees/$1/g;" $&|' | bash
This uses perl to write a script to do your bidding.
Other little languages could be employed, like awk or:
printf "%s\n" *.php |sed 'h;s/\.php//;s/\B[A-Z]/ &/;G;s|\(.*\)\n\(.*\)|sed -i "s/Trees/\1/g" \2|' | bash
This uses sed to provide a solution for the second request.
You want a separate replacement for each file, so run a separate search and replace for each:
for file in *.php; do sed -i "s/foo/${file%.*}/g" "$file"; done
And your second request is a bit harder; it at least requires a subshell (a command substitution):
for file in *; do sed -i "s/bar/$(echo "${file%.*}" | sed 's/\(.\)\([A-Z]\)/\1 \2/g')/g" "$file"; done
It's a bit more readable if you put it in a script:
#!/bin/bash
for file in "$@"; do
replacement=$(echo "${file%.*}" | sed 's/\(.\)\([A-Z]\)/\1 \2/g')
sed -i "s/bar/$replacement/g" "$file";
done
This will work over all the arguments passed to it, so call it with ./script.sh *.php.
I'm trying to find all files with a specific extension in a directory and its subdirectories with bash (latest Ubuntu LTS release).
This is what's written in a script file:
#!/bin/bash
directory="/home/flip/Desktop"
suffix="in"
browsefolders ()
for i in "$1"/*;
do
echo "dir :$directory"
echo "filename: $i"
# echo ${i#*.}
extension=`echo "$i" | cut -d'.' -f2`
echo "Erweiterung $extension"
if [ -f "$i" ]; then
if [ $extension == $suffix ]; then
echo "$i ends with $in"
else
echo "$i does NOT end with $in"
fi
elif [ -d "$i" ]; then
browsefolders "$i"
fi
done
}
browsefolders "$directory"
Unfortunately, when I start this script in terminal, it says:
[: 29: in: unexpected operator
(with $extension instead of 'in')
What's going on here, where's the error?
Syntax errors aside, this:
find "$directory" -type f -name "*.in"
is a bit shorter than that whole thing (and safer - deals with whitespace in filenames and directory names).
Your script is probably failing for entries that don't have a . in their name, making $extension empty.
find {directory} -type f -name '*.extension'
Example: To find all csv files in the current directory and its sub-directories, use:
find . -type f -name '*.csv'
The syntax I use is a bit different than what @Matt suggested:
find $directory -type f -name \*.in
(it's one less keystroke).
Without using find:
du -a $directory | awk '{print $2}' | grep '\.in$'
Though the find command can be useful here, the shell itself provides options to achieve this requirement without any third-party tools. The bash shell provides an extended glob support option with which you can get the names of files under recursive paths that match the extensions you want.
The extended option is extglob, which needs to be set using shopt as below. Options are enabled with the -s flag and disabled with the -u flag. Additionally, you could use a couple more options: nullglob, with which an unmatched glob is swept away entirely, replaced with a set of zero words, and globstar, which allows recursing through all the directories.
shopt -s extglob nullglob globstar
Now all you need to do is form the glob expression to include the files of a certain extension, which you can do as below. We use an array to hold the glob results because, when quoted properly and expanded, filenames with special characters remain intact and do not get broken by the shell's word splitting.
For example to list all the *.csv files in the recursive paths
fileList=(**/*.csv)
The ** is to recurse through the sub-folders and *.csv is the glob to include any file with the mentioned extension. Now, to print the actual files, just do
printf '%s\n' "${fileList[@]}"
Using an array and doing a proper quoted expansion is the right way when used in shell scripts, but for interactive use, you could simply use ls with the glob expression as
ls -1 -- **/*.csv
This can easily be expanded to match multiple files, i.e. files ending with any of several extensions (similar to adding multiple flags in the find command). For example, consider the case of needing all image files in the recursive paths, i.e. with extensions *.gif, *.png and *.jpg; all you need to do is
ls -1 -- **/+(*.jpg|*.gif|*.png)
This can also be expanded to negate results. With the same syntax, one can use the glob to exclude files of certain types. Assume you want to exclude file names with the extensions above; you could do
excludeResults=(**/!(*.jpg|*.gif|*.png))
printf '%s\n' "${excludeResults[@]}"
The !() construct is a negation operation that excludes any of the file extensions listed inside it, and | is an alternation operator, just as in the Extended Regular Expressions library, ORing the globs together.
Note that this extended glob support is not available in the POSIX Bourne shell and is purely specific to recent versions of bash. So if you are considering portability of scripts running across POSIX and bash shells, this option wouldn't be right.
find "$PWD" -type f -name "*.in"
There's a { missing after browsefolders ()
All $in should be $suffix
The line with cut gets you only the middle part of front.middle.extension. You should read up on ${varname%%pattern} and friends in your shell manual.
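For instance, a quick sketch of those expansions (using a made-up name):
i="front.middle.extension"
echo "${i##*.}"    # strips the longest *. prefix, leaving: extension
echo "${i%%.*}"    # strips the longest .* suffix, leaving: front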
I assume you do this as an exercise in shell scripting, otherwise the find solution already proposed is the way to go.
To check for proper shell syntax, without running a script, use sh -n scriptname.
To find all the pom.xml files in your current directory and print them, you can use:
find . -name 'pom.xml' -print
find "$directory" -type f -name "*.in" | grep "$substring"
for file in "${LOCATION_VAR}"/*.zip; do
    echo "$file"
done