linux bash script -> update remote mysql table -> error [0m - linux

I'm using a bash command to extract the current TeamViewer ID.
For this I use:
#!/bin/bash
OUTPUT="$(teamviewer --info | grep "TeamViewer ID:" | tr -s " " | cut -d ":" -f$
TEAMVIEWERID="${OUTPUT}"
echo $TEAMVIEWERID
mysql --host=xxx --user=xxx --password=xxx xxx <<EOF
update table SET teamviewerID="$TEAMVIEWERID" WHERE client="$1";
EOF
echo "DONE"
If I run it:
pi@xxx:~/Documents/xxx/tv $ sudo ./tv.sh client_xxx
4975XXXXX
DONE
pi@xxx:~/Documents/xxx/tv $
OK, everything seems to be fine, BUT in MySQL I receive the following:
[0m 4975XXXXX
I'm confused about what is happening here...
Thanks for helping.

The characters at the beginning ([0m) are a so-called escape sequence. This specific one is used to reset all terminal formatting.
You can easily strip it by using sed.
Just replace your TEAMVIEWERID= line with the following:
TEAMVIEWERID=$(echo "$OUTPUT" | sed 's/\[0m\s//g')
Edit: If the TeamViewer ID always consists of numbers only, we can strip the unknown character by only allowing numbers:
TEAMVIEWERID=$(echo "$id" | sed 's/\[0m\s//g' | sed -re 's/[^0-9]+//g')
This will only allow numbers.
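For reference, a sketch that strips any ANSI escape sequence rather than just [0m, assuming GNU sed (which understands \x1b for the escape byte):
TEAMVIEWERID=$(echo "$OUTPUT" | sed -e 's/\x1b\[[0-9;]*[A-Za-z]//g' -e 's/[^0-9]//g')
The first expression removes complete escape sequences (ESC, [, parameters, final letter); the second keeps only the digits of the ID.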

Related

Trimming string up to certain characters in Bash

I'm trying to make a bash script that will tell me the latest stable version of the Linux kernel.
The problem is that, while I can remove everything after certain characters, I don't seem to be able to delete everything prior to certain characters.
#!/bin/bash
wget=$(wget --output-document - --quiet www.kernel.org | \grep -A 1 "latest_link")
wget=${wget##.tar.xz\">}
wget=${wget%</a>}
echo "${wget}"
Somehow the output "ignores" the wget=${wget##.tar.xz\">} line.
You're trying to remove the longest match of the pattern .tar.xz\"> from the beginning of the string, but your string doesn't start with .tar.xz, so there is no match.
You have to use
wget=${wget##*.tar.xz\">}
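To see the difference, here is a small self-contained example (the sample string s is made up to resemble the kernel.org markup):
s='<a href="linux-4.10.8.tar.xz">4.10.8</a>'
v=${s##.tar.xz\">}    # pattern does not match at the start of the string, nothing is removed
echo "$v"             # <a href="linux-4.10.8.tar.xz">4.10.8</a>
v=${s##*.tar.xz\">}   # the * lets the pattern cover the whole prefix up to .tar.xz">
echo "$v"             # 4.10.8</a>
echo "${v%</a>}"      # 4.10.8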
Then, because you're in a script and not an interactive shell, there shouldn't be any need to write \grep (presumably done to prevent the use of an alias), as aliases are disabled in non-interactive shells.
And, as pointed out, naming a variable the same as an existing command (often found: test) is bound to lead to confusion.
If you want to use command line tools designed to deal with HTML, you could have a look at the W3C HTML-XML-utils (Ubuntu: apt install html-xml-utils). Using them, you could get the info you want as follows:
$ curl -sL www.kernel.org | hxselect 'td#latest_link' | hxextract a -
4.10.8
Or, in detail:
curl -sL www.kernel.org | # Fetch page
hxselect 'td#latest_link' | # Select td element with ID "latest_link"
hxextract a - # Extract link text ("-" for standard input)
Whenever I need to extract a substring in bash I always see if I can brute force it in a couple of cut(1) commands. In your case, the following appears to work:
wget=$(wget --output-document - --quiet www.kernel.org | \grep -A 1 "latest_link")
echo $wget | cut -d'>' -f3 | cut -d'<' -f1
I'm certain there's a more elegant way, but this has simple syntax that I never forget. Note that it will break if 'wget' gets extra ">" or "<" characters in the future.
It is not recommended to use shell tools like grep, awk, or sed to parse HTML files.
However, if you want a quick one-liner, then this awk should do the job:
wget --output-document - --quiet www.kernel.org |
awk '/"latest_link"/ { getline; n=split($0, a, /[<>]/); print a[n-2] }'
4.10.8
sed method:
wget --output-document - --quiet www.kernel.org | \
sed -n '/latest_link/{n;s/^.*">//;s/<.*//p}'
Output:
4.10.8
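Another option, assuming the page keeps the ">version</a>" shape around the latest_link cell, is to let bash's built-in regex matching do the extraction:
html=$(wget --output-document - --quiet www.kernel.org | grep -A 1 "latest_link")
re='">([0-9.]+)</a>'                              # capture the digits and dots between "> and </a>
[[ $html =~ $re ]] && echo "${BASH_REMATCH[1]}"   # prints e.g. 4.10.8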

bash escape exclamation character inside variable with backtick

I have this bash script:
databases=`mysql -h$DBHOST -u$DBUSER -p$DBPASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`
and the issue arises when the password may contain any possible character. How can I escape $DBPASSWORD in this case, if the password contains '!' and the command is inside backticks? I have no experience with bash scripts, but I've tried with "$DBPASSWORD" and with '$DBPASSWORD' and it doesn't work. Thank you
LATER EDIT: link to script here, line 170 -> https://github.com/Ardakilic/backmeup/blob/master/backmeup.sh
First: The answer from @bishop is spot on: Don't pass passwords on the command line.
Second: Use double quotes for all shell expansions. All of them. Always.
databases=$(mysql -h"$DBHOST" -u"$DBUSER" -p"$DBPASSWORD" -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)
Don't pass the MySQL password on the command line. One, it can be tricky with passwords containing shell meta-characters (as you've discovered). Two, importantly, someone using ps can sniff the password.
Instead, either put the password into the system my.cnf, your user configuration file (eg .mylogin.cnf) or create an on-demand file to hold the password:
function mysql() {
    local tmpfile=$(mktemp)
    cat > "$tmpfile" <<EOCNF
[client]
password=$DBPASSWORD
EOCNF
    # "command" ensures we call the real mysql client rather than this wrapper function again
    command mysql --defaults-extra-file="$tmpfile" -u"$DBUSER" -h"$DBHOST" "$@"
    rm "$tmpfile"
}
Then you can run it as:
mysql -e "SHOW DATABASES" | tr -d "| " ....
mysql -e "SELECT * FROM table" | grep -v ...
See the MySQL docs on configuration files for further examples.
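As a minimal sketch of the per-user configuration file route mentioned above (placeholders, not real values): create a ~/.my.cnf readable only by you, and the client will pick it up automatically without any password on the command line:
cat > ~/.my.cnf <<'EOF'
[client]
host=xxx
user=xxx
password=xxx
EOF
chmod 600 ~/.my.cnf   # keep the credentials readable only by your user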
I sometimes have the same problem when automating activities:
I have a variable containing a string (usually a password) that is set in a config file or passed on the command-line, and that string includes the '!' character.
I need to pass that variable's value to another program, as a command-line argument.
If I pass the variable unquoted, or in double-quotes ("$password"), the shell tries to interpret the '!', which fails.
If I pass the variable in single quotes ('$password'), the variable isn't expanded.
One solution is to construct the full command in a variable and then use eval, for example:
#!/bin/bash
username=myuser
password='my_pass!'
cmd="/usr/bin/someprog -user '$username' -pass '$password'"
eval "$cmd"
Another solution is to write the command to a temporary file and then source the file:
#!/bin/bash
username=myuser
password='my_pass!'
cmd_tmp=$HOME/.tmp.$$
touch $cmd_tmp
chmod 600 $cmd_tmp
cat > $cmd_tmp <<END
/usr/bin/someprog -user '$username' -pass '$password'
END
source $cmd_tmp
rm -f $cmd_tmp
Using eval is simple, but writing a file allows for multiple complex commands.
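If the goal is only to pass the literal password through to a program, one more sketch that avoids eval entirely is to collect the arguments in a bash array (someprog and its flags are the same hypothetical ones as above); array elements are handed over verbatim, so characters like '!' need no special treatment:
#!/bin/bash
username=myuser
password='my_pass!'
args=(-user "$username" -pass "$password")   # each element stays one argument, untouched
/usr/bin/someprog "${args[@]}"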
P.S. Yes, I know that passing passwords on the command-line isn't secure - there is no need for more virtue-signalling comments on that topic.

How to get list of commands used in a shell script?

I have a shell script of more than 1000 lines, and I would like to check whether all the commands used in the script are installed on my Linux operating system.
Is there any tool to get the list of Linux commands used in a shell script?
Or how can I write a small script which can do this for me?
The script runs successfully on an Ubuntu machine, where it is invoked as part of a C++ application. We need to run the same script on a device running a Linux with limited capabilities. I have manually identified a few commands which the script uses that are not present on the device OS. Before we try installing these, I would like to check all the other commands and install everything at once.
Thanks in advance
I already tried this in the past and came to the conclusion that it is very difficult to provide a solution which would work for all scripts.
In case of a simple linear script, it might be as easy as using debug mode.
For example: bash -x script.sh 2>&1 | grep ^+ | awk '{print $2}' | sort -u
In case the script has some decisions, you might use the same approach and consider that for the "else" cases the commands would still be the same, just with different arguments, or would be something trivial (echo + exit).
In case of a complex script, I attempted to write a script that would just look for commands in the same places I would look myself. The challenge is to create expressions that identify all the possibilities used; I would say this is doable for about 80-90% of the script, and the output should only be used as a reference, since it will contain invalid data (~20%).
Here is an example script that would parse itself using a very simple approach (separate commands on different lines, 1st word will be the command):
# 1. Eliminate all quoted text
# 2. Eliminate all comments
# 3. Replace all delimiters between commands with new lines ( ; | && || )
# 4. extract the command from 1st column and print it once
cat $0 \
| sed -e 's/\"/./g' -e "s/'[^']*'//g" -e 's/"[^"]*"//g' \
| sed -e "s/^[[:space:]]*#.*$//" -e "s/\([^\\]\)#[^\"']*$/\1/" \
| sed -e "s/&&/;/g" -e "s/||/;/g" | tr ";|" "\n\n" \
| awk '{print $1}' | sort -u
the output is:
.
/
/g.
awk
cat
sed
sort
tr
There are many more cases to consider (command substitutions, aliases, etc.); points 1, 2 and 3 are just the beginning, but they would still cover 80% of most complex scripts.
The regular expressions used would need to be adjusted or extended to increase precision and handle special cases.
In conclusion, if you really need something like this, then you can write a script as above, but don't trust the output until you verify it yourself.
Add export PATH='' to the second line of your script.
Execute your_script.sh 2>&1 > /dev/null | grep 'No such file or directory' | awk '{print $4;}' | grep -v '/' | sort | uniq | sed 's/.$//'.
If you have a fedora/redhat based system, bash has been patched with the --rpm-requires flag
--rpm-requires: Produce the list of files that are required for the shell script to run. This implies -n and is subject to the same limitations as compile-time error checking; command substitutions, conditional expressions and the eval builtin are not parsed, so some dependencies may be missed.
So when you run the following:
$ bash --rpm-requires script.sh
executable(command1)
function(function1)
function(function2)
executable(command2)
function(function3)
There are some limitations here:
command and process substitutions and conditional expressions are not picked up. So the following are ignored:
$(command)
<(command)
>(command)
command1 && command2 || command3
commands as strings are not picked up. So the following line will be ignored
"/path/to/my/command"
commands that contain shell variables are not listed. This generally makes sense since
some might be the result of some script logic, but even the following is ignored
$HOME/bin/command
This point can however be bypassed by using envsubst and running it as
$ bash --rpm-requires <(<script envsubst)
However, if you use shellcheck, you most likely quoted this and it will still be ignored due to point 2
So if you want to check whether everything your script needs is there, you can do something like:
while IFS='' read -r app; do
    [ "${app%%(*}" == "executable" ] || continue
    app="${app#*(}"; app="${app%)}";
    if [ "$(type -t "${app}")" != "builtin" ] && \
       ! [ -x "$(command -v "${app}")" ]
    then
        echo "${app}: missing application"
    fi
done < <(bash --rpm-requires <(<"$0" envsubst) )
If your script contains files that are sourced that might contain various functions and other important definitions, you might want to do something like
bash --rpm-requires <(cat source1 source2 ... <(<script.sh envsubst))
Based on @czvtools' answer, I added some extra checks to filter out bad values:
#!/usr/bin/fish
if test "$argv[1]" = ""
    echo "Give path to command to be tested"
    exit 1
end
set commands (cat $argv \
    | sed -e 's/\"/./g' -e "s/'[^']*'//g" -e 's/"[^"]*"//g' \
    | sed -e "s/^[[:space:]]*#.*\$//" -e "s/\([^\\]\)#[^\"']*\$/\1/" \
    | sed -e "s/&&/;/g" -e "s/||/;/g" | tr ";|" "\n\n" \
    | awk '{print $1}' | sort -u)
for command in $commands
    if command -q -- $command
        set -a resolved (realpath (which $command))
    end
end
set resolved (string join0 $resolved | sort -z -u | string split0)
for command in $resolved
    echo $command
end
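Hypothetical usage, assuming the snippet above is saved as used-commands.fish (the file name is made up):
fish used-commands.fish some_script.sh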

Linux shell script - String trimming for dash

I am attempting to get the MAC address from my Raspberry Pi and take the last 6 characters of it to use as the hostname, alongside a fixed string.
Here is what I've managed to get working from other sources so far, but I am now totally stuck trying to trim the string down.
#!/bin/sh -e
MAC="$( sed "s/^.*macaddr=\([0-9A-F:]*\) .*$/\1/;s/://g" /proc/cmdline )"
MAC1="${MAC??????%}"
echo "$MAC1"
The shell being used by the Pi appears to be dash, so the usual bash constructs that would have this done in no time don't work, or seem to generate errors, when run within the script.
The full script that I am using in rc.local is below.
Any advice on a way to do this would be greatly appreciated.
MAC="pi""$( sed "s/^.*macaddr=\([0-9A-F:]*\) .*$/\1/;s/://g" /proc/cmdline )"
echo "$MAC" > "/etc/hostname"
CURRENT_HOSTNAME=$(cat /proc/sys/kernel/hostname)
sed -i "s/127.0.1.1.*$CURRENT_HOSTNAME/127.0.1.1\t$MAC/g" /etc/hosts
hostname $MAC
If you have the cut command on your Pi, you could do:
MAC1=$( echo $MAC | cut -c 7-12 )
Since you're already using sed to process the string, I'd suggest adding another command:
MAC=$(sed -e 's/^.*macaddr=\([0-9A-F:]*\) .*$/\1/' \
-e 's/://g' \
-e 's/.*\(.\{6\}\)/\1/' /proc/cmdline)
The extra sed command extracts the last 6 characters from each line (I assume that you only have one?). You can combine the commands into a single string if you prefer, though I find this approach to be more readable.
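If you would rather avoid extra tools altogether, here is a small POSIX-sh sketch (it also works in dash) that keeps only the last six characters via nested parameter expansion; the example value is made up:
MAC=b827eb123456                # normally taken from /proc/cmdline as above
MAC1="${MAC#"${MAC%??????}"}"   # remove the prefix that is left after chopping off the last 6 chars
echo "$MAC1"                    # 123456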

sed command on single line input

I am using SunOS 10. I am trying to remove the : character at the end of the line if the word contains a : in it.
I am using the below command for it.
echo -n "test:" | sed 's/:$//g'
It's not working. What did I do wrong here?
The same command is working fine in GNU/Linux.
You don't need to suppress the line feed. You need to remove that -n:
echo "test:" | sed 's/:$//g'
myshell:/home/myfolderpath # echo -n "test:"|sed 's/:$//g'
testmyshell:/home/myfolderpath#
Your code works on my machine.
Because there is no trailing newline, you see the result right before your next shell prompt; the -n is not necessary.
myshell:/home/myfolderpath # echo "test:"|sed 's/:$//g'
test
myshell:/home/myfolderpath#
It should look like this, without the -n.
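A quick way to see what is going on is to make the trailing newline explicit with printf; how a sed implementation treats an unterminated last line varies, and older seds such as the Solaris one may drop it entirely:
printf 'test:\n' | sed 's/:$//'   # terminated line: prints "test" followed by a newline
printf 'test:'   | sed 's/:$//'   # unterminated line: output may be glued to the next prompt or not appear at all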
