Perl script to add newline to end of textfile - linux

I have tried to make a Perl script add a newline to the end of a text file and then write the following line:
/bin/false
The file looks like this:
/bin/sh
/bin/dash
/bin/bash
/bin/rbash
/usr/bin/tmux
/usr/bin/screen
I have tried running the following script:
perl -pi -e 's///usr//bin//screen///usr//bin//screen"\n"//bin//false/g' /etc/shells
But without any luck. It does not accept the newline and just returns a syntax error. Can you help!?

This:
s///usr//bin//screen///usr//bin//screen"\n"//bin//false/g
Indeed has syntax errors, starting with the fact that you used doubled
slashes in the paths where you should have escaped them (\/), and also
the strange quoted newline string in the middle of it. You could also use a
delimiter other than the forward slash, to avoid having to escape the slashes at all.
s!/usr/bin/screen\n!/usr/bin/screen\n/bin/false!
Also, don't forget to close your last line. And you can also use \K
(see perldoc perlre for details) to simplify your regex, with a final
result of:
s!/usr/bin/screen\n\K!/bin/false\n!
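Putting that together, the complete one-liner would look something like this (a sketch along the same lines, assuming the file is /etc/shells as in your command):
perl -pi -e 's!/usr/bin/screen\n\K!/bin/false\n!' /etc/shells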
But anyway, this seems to be overkill. You can use your shell features
to echo a line and redirect to a file, by appending with >>:
echo /bin/false >> /etc/shells
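If you want to avoid appending a duplicate entry, one possible refinement (assuming a grep that supports -q and -x, which both GNU and BSD grep do) is to guard the append:
grep -qx /bin/false /etc/shells || echo /bin/false >> /etc/shells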

Related

String replacing pattern matching doesn't work in bash script

I'm trying to replace the host part of an IP address. I tried the following at a bash command line and it works fine:
$ baseIp="10.215.5.16"
$ ip=18
$ echo ${baseIp/%\.+([0-9])/\."$ip"}
But when I write a bash script as below, it doesn't work:
#!/bin/bash
baseIp="10.215.5.16"
ip=18
currIp=${baseIp/%\.+([0-9])/\."$ip"}
echo $currIp
It prints:
$ ./test.sh
10.215.5.16
Thanks
You appear to be using an extended glob expression. Extended globs are not enabled by default; you may have them enabled in your .bashrc and/or .bash_profile, but that doesn't affect the script. Add shopt -s extglob at the beginning of the script.
BTW, you also don't need to escape the . character in either the pattern or the replacement, but (as @CostiCiudatu pointed out) you are missing it in the replacement in the script. Also, double-quoting $ip but leaving the whole expression unquoted doesn't make much sense. I'd recommend this:
echo "${baseIp/%.+([0-9])/.$ip}"

Bash command line arguments passed to sed via ssh

I am looking to write a simple script to perform an SSH command on many hosts simultaneously, where the exact list of hosts is generated by another script. The problem is that when I run the script with something like sed, it doesn't work properly.
It should run like sshall.sh {anything here} and it will run the {anything here} part on all the nodes in the list.
sshall.sh
#!/bin/bash
NODES=`listNodes | grep "node-[0-9*]" -o`
echo "Connecting to all nodes and running: ${#:1}"
for i in $NODES
do
:
echo "$i : Begin"
echo "----------------------------------------"
ssh -q -o "StrictHostKeyChecking no" $i "${@:1}"
echo "----------------------------------------"
echo "$i : Complete";
echo ""
done
When it is run with something like whoami it works but when I run:
[root@myhost bin]# sshall.sh sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf
Connecting to all nodes and running: sed -i /^somebeginning/ s/$/,appendme/ /etc/myconfig.conf
node-1 : Begin
----------------------------------------
sed: -e expression #1, char 18: missing command
----------------------------------------
node-1 : Complete
node-2 : Begin
----------------------------------------
sed: -e expression #1, char 18: missing command
----------------------------------------
node-2 : Complete
…
Notice that the quotes disappear on the sed command when sent to the remote client.
How do I go about fixing my bash command?
Is there a better way of achieving this?
Substitute an eval-safe quoted version of your command into a heredoc:
#!/bin/bash
# ^^^^- not /bin/sh; printf %q is an extension
# Put your command into a single string, with each argument quoted to be eval-safe
printf -v cmd_q '%q ' "$@"
while IFS= read -r hostname; do
  # run bash -s remotely, with that string passed on stdin
  ssh -q -o 'StrictHostKeyChecking no' "$hostname" "bash -s" <<EOF
$cmd_q
EOF
done < <(listNodes | grep -o -e "node-[0-9*]")
Why this works reliably (and other approaches don't):
printf %q knows how to quote contents to be eval'd by that same shell (so spaces, wildcards, various local quoting methods, etc. will always be supported).
Arguments given to ssh are not passed to the remote command individually!
Instead, they're concatenated into a string passed to sh -c.
However: The output of printf %q is not portable to all POSIX-derived shells! It's guaranteed to be compatible with the same shell locally in use -- ksh will always parse output from printf '%q' in ksh, bash will parse output from printf '%q' in bash, etc; thus, you can't safely pass this string on the remote argument vector, because it's /bin/sh -- not bash -- running there. (If you know your remote /bin/sh is provided by bash, then you can run ssh "$hostname" "$cmd_q" safely, but only under this condition).
bash -s reads the script to run from stdin, meaning that passing your command there -- not on the argument vector -- ensures that it'll be parsed into arguments by the same shell that escaped it to be shell-safe.
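With that version of the script, the invocation from the question should work unchanged (assuming listNodes behaves as shown above):
./sshall.sh sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf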
You want to pass the entire command -- with all of its arguments, spaces, and quotation marks -- to ssh so it can pass it unchanged to the remote shell for parsing.
One way to do that is to put it all inside single quotation marks. But then you'll also need to make sure the single quotation marks within your command are preserved in the arguments, so the remote shell builds the correct arguments for sed.
sshall.sh 'sed -i '"'"'/^somebeginning/ s/$/,appendme/'"'"' /etc/myconfig.conf'
It looks redundant, but '"'"' is a common Bourne trick to get a single quotation mark into a single-quoted string. The first quote ends single-quoting temporarily, the double-quote-single-quote-double-quote construct appends a single quotation mark, and then the single quotation mark resumes your single-quoted section. So to speak.
Another trick that can be helpful for troubleshooting is to add the -v flag to your ssh flags, which will spit out lots of text, but most importantly it will show you exactly what string it's passing to the remote shell for parsing and execution.
--
All of this is fairly fragile around spaces in your arguments, which you'll need to avoid, since you're relying on shell parsing on the opposite end.
Thinking outside the box: instead of dealing with all the quoting issues and the word-splitting in the wrong places, you could attempt to a) construct the script locally (maybe use a here-document?), b) scp the script to the remote end, then c) invoke it there. This easily allows more complex command sequences, with all the power of shell control constructs etc. Debugging (checking proper quoting) would be a breeze by simply looking at the generated script.
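A rough sketch of that approach, reusing the node list from the question (the path /tmp/remote_task.sh is just an example name):
#!/bin/bash
NODES=`listNodes | grep "node-[0-9*]" -o`
cat > /tmp/remote_task.sh <<'EOF'
sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf
EOF
for i in $NODES
do
    scp -q /tmp/remote_task.sh "$i:/tmp/remote_task.sh"
    ssh -q -o "StrictHostKeyChecking no" "$i" 'bash /tmp/remote_task.sh && rm -f /tmp/remote_task.sh'
done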
I recommend reading the command(s) from the standard input rather than from the command line arguments:
cmd.sh
#!/bin/bash -
# Load server_list with user@host "words" here.
cmd=$(</dev/stdin)
for h in ${server_list[*]}; do
    ssh "$h" "$cmd"
done
Usage:
./cmd.sh <<'CMD'
sed -i '/^somebeginning/ s/$/,appendme/' /path/to/file1
# other commands
# here...
CMD
Alternatively, run ./cmd.sh, type the command(s), then press Ctrl-D.
I find the latter variant the most convenient, as you don't even need here documents or any extra escaping. Just invoke your script, type the commands, and press the shortcut. What could be easier?
Explanations
The problem with your approach is that the quotes are stripped from the arguments by the shell. For example, the argument '/^somebeginning/ s/$/,appendme/' will be interpreted as the string /^somebeginning/ s/$/,appendme/ (without the single quotes), which is an invalid argument for sed.
Of course, you can escape the command with the built-in printf as suggested in another answer here. But the command is not very readable after escaping. For example
printf %q 'sed -i /^somebeginning/ s/$/,appendme/ /home/ruslan/tmp/file1.txt'
produces
sed\ -i\ /\^somebeginning/\ s/\$/\,appendme/\ /home/ruslan/tmp/file1.txt
which is not very readable and will look ugly if you print it to the screen to show progress.
That's why I prefer to read from the standard input and leave the command intact. My script prints the command strings to the screen, and I see them just in the form I have written them.
Note, the for .. in loop iterates over $IFS-separated "words" and is generally not the preferred way to traverse an array. It is usually better to use read -r in a while loop with an adjusted $IFS. I have used the for loop here for simplicity, as the question is really about invoking the ssh command.
Logging into multiple systems over SSH and using the same (or variations on the same) command is the basic use case behind ansible. The system is not without significant flaws, but for simple use cases is pretty great. If you want a more solid solution without too much faffing about with escaping and looping over hosts, take a look.
Ansible has a 'raw' module which doesn't even require any dependencies on the target hosts, and you might find that a very simple way to achieve this sort of functionality in a way that frees you from the considerations of looping over hosts, handling errors, marshalling the commands, etc and lets you focus on what you're actually trying to achieve.
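For example, an ad-hoc run of the sed command from the question with the raw module might look roughly like this (hosts.ini is a hypothetical inventory file listing the nodes):
ansible all -i hosts.ini -m raw -a "sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf"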

Why does bash insert additional quotes

I need to pipe an expression including single quotes to a command, but bash inserts loads of extra quotes which breaks my command. As a really simple example take:
#!/bin/bash -x
echo 'EXPRESSION' | more
which gives:
+ echo EXPRESSION
+ more
EXPRESSION
As I want the single quotes to be displayed, I must escape them:
#!/bin/bash -x
echo \'EXPRESSION\' | more
Which now gives me:
+ echo ''\''EXPRESSION'\'''
+ more
'EXPRESSION'
So within the script, I get this bizarre ''\''EXPRESSION'\''' thing. The command I am piping the expression to is an executable that interacts with a document management system, and expects a specific format—which includes single quotes around EXPRESSION and not ''\'' and '\'''.
Is there any way to stop bash from adding the additional quotes and backslashes? I've messed around with strings and eval etc., but have failed to get rid of those additional quotes.
You can also try it with double quotes like this,
echo "'EXPRESSION'"|more
Output will be,
'EXPRESSION'
The -x option to /bin/bash is producing the top two lines (the trace); your code produces the third line, which is the actual output. If you remove the -x, you will see only the real output.
The above answer from Skynet works just fine, but with the -x option, it still shows 3 lines. It's just what the -x does.
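If you want to convince yourself that the extra backslashes exist only in the -x trace and not in the data sent down the pipe, one quick check (assuming od is available, as it is on most Linux systems) is to dump the bytes the downstream command actually receives:
#!/bin/bash -x
echo \'EXPRESSION\' | od -c
The trace line still shows ''\''EXPRESSION'\''', but the od output contains only a single quote, the word EXPRESSION, a closing single quote and a newline, with no backslashes.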

Bash printf %q invalid directive

I want to change my PS1 in my .bashrc file.
I've found a script that uses printf with the %q directive to escape characters:
#!/bin/bash
STR=$(printf "%q" "PS1=\u@\h:\w\$ ")
sed -i '/PS1/c\'"$STR" ~/.bashrc
The problem is that I get this error :
script.sh: 2: printf: %q: invalid directive
Any idea? Maybe another way to escape the characters?
The printf command is built into bash. It's also an external command, typically installed in /usr/bin/printf. On most Linux systems, /usr/bin/printf is the GNU coreutils implementation.
Older releases of the GNU coreutils printf command do not support the %q format specifier; it was introduced in version 8.25, released 2016-01-20. bash's built-in printf command does support it -- and has for as long as bash has had a built-in printf command.
The error message implies that you're running script.sh using something other than bash.
Since the #!/bin/bash line appears to be correct, you're probably doing one of the following:
sh script.sh
. script.sh
source script.sh
Instead, just execute it directly (after making sure it has execute permission, using chmod +x if needed):
./script.sh
Or you could just edit your .bashrc file manually. The script, if executed correctly, will add this line to your .bashrc:
PS1=\\u@\\h:\\w\$\ 
(The space at the end of that line is significant.) Or you can do it more simply like this:
PS1='\u@\h:\w\$ '
One problem with the script is that it will replace every line that mentions PS1. If you just set it once and otherwise don't refer to it, that's fine, but if you have something like:
if [ ... ] ; then
PS1=this
else
PS1=that
fi
then the script will thoroughly mess that up. It's just a bit too clever.
Keith Thompson has given good advice in his answer. But FWIW, you can force bash to use a builtin command by preceding the command name with builtin eg
builtin printf "%q" "PS1=\u@\h:\w\$ "
Conversely, you might expect
command printf "%s\n" some stuff
to force bash to use the external command (if it can find one). However, command does not invoke a command on disk in lieu of a Bash built-in with the same name; it only works to suppress invocation of a shell function, so the built-in printf still runs here. To run the binary on disk you have to invoke it by path, eg /usr/bin/printf. (Thanks to Rockallite for bringing this error to my attention.)
It's possible to enable or disable specific bash builtins (maybe your .bashrc is doing that to printf). See help enable for details. And I guess I should mention that you can use
type printf
to find out what kind of entity (shell function, builtin, or external command) bash will run when you give it a naked printf. You can get a list of all commands with a given name by passing type the -a option, eg
type -a printf
You can use grep to see the lines in your .bashrc file that contain PS1:
grep 'PS1' ~/.bashrc
or
grep -n0 --color=auto 'PS1=' ~/.bashrc
which gives you line numbers and fancy coloured output. And then you can use the line number to force sed to just modify the line you want changed.
Eg, if grep tells you that the line you want to change is line 7, you can do
sed -i '7c\'"$STR" ~/.bashrc
to edit it. Or even better,
sed -i~ '7c\'"$STR" ~/.bashrc
which backs up the original version of the file in case you make a mistake.
When using sed -i I generally do a test run first without the -i so that the output goes to the shell, to let me see what the modifications do before I write them to the file.
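For example, a dry run of the line-7 edit above can be compared against the original with diff before committing to -i (a sketch reusing $STR and the line number from the earlier examples):
diff ~/.bashrc <(sed '7c\'"$STR" ~/.bashrc)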

How to search and replace text in a file from a shell script?

I'm trying to write a shell script that does a search and replace inside a configuration file upon start-up.
The string we're trying to replace is:
include /etc/nginx/https.include;
and we want to replace it with a commented version:
#include /etc/nginx/https.include;
The file that contains the string that we want to replace is:
/etc/nginx/app-servers.include
I'm not a Linux guru and can't seem to find the command to do this.
perl -p -i -e 's%^(include /etc/nginx/https.include;)$%#$1%' /etc/nginx/app-servers.include
If the line might have trailing content after the ;, use instead:
perl -p -i -e 's%^(include /etc/nginx/https.include;.*)$%#$1%' /etc/nginx/app-servers.include
If you want to preserve the original file, add a backup extension after -i:
perl -p -i.bak -e 's%^(include /etc/nginx/https.include;)$%#$1%' /etc/nginx/app-servers.include
Now, explaining. The -p flag tells Perl to loop over the input and print each line after the expression has been applied, so every line of the file is fed to the expression and the (possibly modified) result is written back. The -i flag enables in-place editing and takes an optional extension for the backup file; by using it without anything, you prevent generation of backups. The -e tells Perl to take the following parameter as an expression to be executed.
Now, the expression is s%something%other%. I use % instead of the more traditional / to avoid having to escape the slashes of the path. I use parentheses in the pattern and $1 in the replacement for safety -- if you change one, the other will follow. Thus, %#$1% is actually the second % of the s, followed by the desired #, then $1 standing for the pattern captured inside the parentheses, and the final % of the s.
HTH. HAND.
sed -i 's/foo/bar/g' config.txt
This replaces all instances of foo with bar in the file config.txt. (Note that the match is case sensitive as written; with GNU sed you can append the I flag, as in s/foo/bar/gI, to make it case insensitive.)
Check out sed:
sed -i -r 's|^(include /etc/nginx/https.include;)$|#\1|' /etc/nginx/app-servers.include
-i means do the substitution in-place and -r means to use extended regular expressions.
cd pathname
for y in *;
do sed "s/ABCD/DCBA/g" "$y" > temp; mv temp "$y";
done
This script should replace the string ABCD with DCBA in all the files in pathname.
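With GNU sed the temporary file is not needed, since -i edits each file in place (a sketch; it assumes everything matched by * in pathname is a regular file):
cd pathname && sed -i 's/ABCD/DCBA/g' ./*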
