I have a very simple script that's essentially an alias for find -iname, so I can find the path to a file whose name I sort of remember, but whose location I've definitely forgotten. I'd type myscript *cri*unis*, for example, to quickly locate crime_and_punishment.txt.
But now I am getting rather lazy about pressing the shift key to enter the wildcard character, so I'd like to make , be the wildcard character only when parsing the input parameters to my script.
It's similar to if I were using TeX and had to type a long table and wanted to temporarily make , be the column delimiter: I would type \bgroup \catcode`\,=4 and then enter \egroup when finished with my table.
And come to think of it, how do I enter a back-tick within an inline code snippet on this site's markdown?! The markdown should let me temporarily use ; to delimit an inline code snippet.
Not really sure I understand the question, but if you would like to replace all , characters in the input with * characters, something like this should work:
#!/bin/bash
# Translate every "," in the first argument into the "*" wildcard,
# then hand the result to find as a case-insensitive name pattern.
search=$(printf '%s' "$1" | tr ',' '*')
find . -iname "$search"
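For example, assuming the script is saved as myscript somewhere on your PATH (the matched path in the comment below is just an illustrative hit), a call could look like this:
# The commas are translated to * before find sees them, so this is
# equivalent to: find . -iname "*cri*unis*"
myscript ,cri,unis,
# ./books/crime_and_punishment.txt
Since commas are not special to the shell, no quoting or shift key is needed.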
I have a JSON file that is downloaded using curl. It contains some information about a Confluence page. I want to extract only 3 parts of that downloaded information: the page id, status and title.
I have written a bash script for this, but my constraint is that I am not sure how to pass multiple variables to the grep command:
id=id #hardcoded
status=status #hardcoded
echo Enter title you are looking for: #taking input from user here
read title_name
echo
echo
echo Here are details
curl -u username:password -sX GET "http://X.X.X.X:8090/rest/api/content?type=page&start=0&limit=200" | python -mjson.tool | grep -Eai "$title_name"|$id|$status"
Aside from a typo (you have an unbalanced quote - please always check the syntax for correctness before posting something), the basic idea of your approach would work, in that
grep -Eai "$title_name|$id|$status"
would select those text lines which contain the content of one of the variables title_name, id or status.
However, it is a pretty fragile solution. I don't know what the actual content of those variables can be, but for instance, if title_name were set to X.Z, it would also match lines containing the string XYZ, since the dot matches any character. Similarly, if title_name contained, say, a lone [ or (, grep would complain about an unmatched bracket or parenthesis error.
If you want the strings to be matched literally and not taken as regular expressions, it is better to write those patterns into a file (one pattern per line) and use
grep -F -f patternfile
for searching. Of course, since you are using bash, you can also use process substitution if you prefer not using an explicit temporary file.
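For example, here is a hedged sketch of that process-substitution variant; the host, credentials and prompt text are taken from the question, and python -mjson.tool is only used for pretty-printing:
#!/bin/bash
id=id           #hardcoded
status=status   #hardcoded
read -rp "Enter title you are looking for: " title_name

# printf feeds one pattern per line to grep -F via process substitution,
# so all three strings are matched literally, not as regular expressions.
curl -u username:password -sX GET \
    "http://X.X.X.X:8090/rest/api/content?type=page&start=0&limit=200" \
  | python -mjson.tool \
  | grep -Fai -f <(printf '%s\n' "$title_name" "$id" "$status")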
Writing a small script in bash (MacOS in fact) and I want to use find, with multiple sources. Not normally a problem, but the list of source directories to search is held as a string in a variable. Again, not normally a problem, but some of them contain spaces in their name.
I can construct the full command string and if entered directly at the command prompt (copy and paste in fact) it works as required and expected. But when I try and run it within the script, it flunks out on the spaces in the name and I have been unable to get around this.
I cannot quote the entire source string as that is then just seen as one single item which of course does not exist. I escape each space with a backslash within the string held in the variable and it is simply lost. If I use double backslash, they both remain in place and again it fails. Any method of quoting I have tried is basically ignored, the quotes are seen as normal characters and splitting is done at each space.
I have so far only been able to use eval on the whole command string to get it to work but I felt there ought to be a better solution than this.
Ironically, if I use AppleScript I CAN create a suitable command string and run it perfectly with doShellScript (ok, that's using JXA, but it's the same with actual AppleScript). However, I have so far been unable to find the correct escape mechanism just in a bash script, without resorting to eval.
Anyone suggest a solution to this?
If possible, don't store all paths in one string. An array is safer and more convenient:
paths=("first path" "second path" "and so on")
find "${paths[#]}"
The find command will expand to
find "first path" "second path" "and so on"
If you have to use the string and don't want to use eval, split the string into an array:
string="first\ path second\ path and\ so\ on"
read -a paths <<< "$string"
find "${paths[#]}"
Paths inside the string should use \ to escape spaces; wrapping paths in "" or '' will not work. eval might be the better option here.
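As a quick sanity check, a minimal sketch (with made-up directory names) that prints each array element on its own line to confirm the embedded spaces survived the split:
string="first\ path second\ path and\ so\ on"
read -a paths <<< "$string"     # no -r, so the backslash escapes are honoured

# One element per line; the spaces inside each path are preserved.
printf '%s\n' "${paths[@]}"
# first path
# second path
# and so on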
I have a folder that was created automatically. The user unintentionally provided smart (curly) quotes as part of the name, and the process that sanitizes the inputs did not catch these. As a result, the folder name contains the smart quotes. For example:
this-is-my-folder’s-name-“Bob”
I'm now trying to rename/remove said folder on the command line, and none of the standard tricks for dealing with files/folders with special characters (enclosing in quotes, escaping the characters, trying to rename it by inode, etc.) are working. All result in:
mv: cannot move this-is-my-folder’s-name-“Bob” to this-is-my-folders-name-BOB: No such file or directory
Can anyone provide some advice as to how I can achieve this?
To get the name in a format you can copy-and-paste into your shell:
printf '%q\n' this*
...will print out the filename in a manner the shell will accept as valid input. This might look something like:
$'this-is-my-folder\342\200\231s-name-\342\200\234Bob\342\200\235'
...which you can then use as an argument to mv:
mv $'this-is-my-folder\342\200\231s-name-\342\200\234Bob\342\200\235' this-is-my-folders-name-BOB
Incidentally, if your operating system works the same way mine does (when running the test above), this would explain why using single-character globs such as ? for those characters didn't work: They're actually more than one byte long each!
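If you want to inspect those bytes yourself before building the mv command, a minimal sketch (assuming the directory name still starts with "this"):
# Dump the matched name byte by byte; the curly quotes show up as
# three-byte UTF-8 sequences such as 342 200 231.
printf '%s' this* | od -c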
You can use the shell globbing token ? to match any single character, so matching the smart quotes with ? should do:
mv this-is-my-folder?s-name-?Bob? new_name
Here the smart quotes are replaced with ? so that the pattern matches the file name.
There are several possibilities.
If an initial substring of the file name ending before the first quote is unique within the directory, then you can use filename completion to help you type an appropriate command. Type "mv" (without the quotes) and the unique initial substring, then press the TAB key to request filename completion. Bash will complete the filename with the correct characters, correctly escaped.
Use a graphical file browser. Then you can select the file to rename by clicking on it. (Details of how to proceed from there depend on the browser.) If you don't have a graphical terminal and can't get one, then you may be able to do the same with a text-mode browser such as Midnight Commander.
A simple glob built with the ? or * wildcard should be able to match the filename.
Use a more complex glob to select the filename, and perhaps others with the same problem. Maybe something like *[^a-zA-Z0-9-]* would do. Use a pattern substitution to assign a new name. Something like this:
for f in *[^a-zA-Z0-9-]*; do
    mv "$f" "${f//[^a-zA-Z0-9-]/}"
done
The substitution replaces all appearances of characters that are not decimal digits, uppercase or lowercase Latin letters, or hyphens with nothing (i.e. it strips them). Do take care before you use this, though, to make sure you're not going to make more changes than you intend to.
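To review the renames before committing to them, a minimal dry-run sketch that only prints the mv commands:
# Prefix mv with echo so the loop prints what it would do
# instead of actually renaming anything.
for f in *[^a-zA-Z0-9-]*; do
    echo mv "$f" "${f//[^a-zA-Z0-9-]/}"
done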
Right now I'm working on creating a script in linux bash shell that adds the word "-BACKUP" to a file name between certain points. For example, if I had a file/string called file1.txt I would want to add the "-BACKUP" between "file1" and ".txt" to make "file1-BACKUP.txt". How would I go about doing that? Would I use the basename command anywhere? In this situation, the extension and stem could be anything, not just what I gave as an example. All help is appreciated!
Use the substring processing parameter expansion operator % to remove the suffix from the string, then append the new text. The variable must be enclosed in braces for substring processing parameter expansion to work.
var="file1.txt"
echo "${var%.txt}.BACKUP.txt"
I did use the manual, but I am unable to put all the options together to understand what the code below is actually doing.
awk -v v='"' 'BEGIN{FS=OFS=v}{gsub(",","",$2);print }' \
${SOURCE_LOCATION}/TEMP1_$file_name>${SOURCE_FILE_LOCATION}/TEMP2_$file_name
When do we have to use the curly brackets after the '$' in code, and when not? Please explain. Any help is really appreciated.
This command would remove all the commas in the second field, the field separator being the quote character " (as specified by FS).
For example, the following string:
something "string, with, commas" something "else, here, and more"
would be transformed to:
something "string with commas" something "else, here, and more"
The significance of {} in variable names has been well explained by @Joni.
The input is read from the file ${SOURCE_LOCATION}/TEMP1_$file_name and output is redirected to ${SOURCE_LOCATION}/TEMP2_$file_name.
You must use the curly brackets syntax when a variable name is followed by something that's not part of the variable name but could be confused with it. For example, compare
hello="Hello"
echo $hello_world
with
hello="Hello"
echo ${hello}_world
The first one outputs an empty line (or the value of the shell variable hello_world, if it exists), and the second one outputs Hello_world.
In your case they are not necessary because a slash can never be a part of the variable name. Some people prefer to use the brackets to make it clear where the variable begins and where it ends even when they are not required.
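For instance, with hypothetical values for the two variables (the paths below are made up), both spellings expand to the same file name:
SOURCE_LOCATION=/data/incoming      # hypothetical value
file_name=orders.csv                # hypothetical value

# "/" cannot be part of a variable name, and the "_" here precedes the "$",
# so the braces are optional in both positions.
echo "$SOURCE_LOCATION/TEMP1_$file_name"        # /data/incoming/TEMP1_orders.csv
echo "${SOURCE_LOCATION}/TEMP1_${file_name}"    # same output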