Storing the output of a command in a shell variable - linux

I wish to store the output of the following command, run against my Postgres database, into a shell variable:
sudo -u postgres -H -- psql -d firstdb -c "select count(col2) from table2;"
I saw something like this somewhere:
abc=sudo -u postgres -H -- psql -d firstdb -c "select count(col2) from table2;"
echo abc
but it does not seem to work.
Is there another way to store the output, so that I can apply cut to fetch the value and check the size of the column?

You can do:
abc=$(sudo -u postgres -H -- psql -d firstdb -c "select count(col2) from table2;" 2>&1)
and then do echo "$abc" to see the output. Note that the 2>&1 at the end redirects stderr to stdout before stdout is assigned to the variable, so abc will contain error messages as well as regular output.
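As a concrete sketch of the whole flow (a printf stands in for the psql call from the question so the pattern can be tried anywhere; swap it back for the real command):

```shell
# Capture the command's stdout (and, via 2>&1, stderr) in a variable.
# The printf fakes psql's default output for "select count(col2)";
# in the real script this line would be:
#   abc=$(sudo -u postgres -H -- psql -d firstdb -c "select count(col2) from table2;" 2>&1)
abc=$(printf ' count \n-------\n    42\n(1 row)\n' 2>&1)

# Pull the count out of the captured text. (psql's -t -A flags would
# return the bare number and make this extraction unnecessary.)
count=$(echo "$abc" | grep -Eo '[0-9]+' | head -1)
echo "$count"
```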

Related

SSH remote execution - How to declare a variable inside EOF block (Bash script)

I have the following code in a bash script:
remote_home=/home/folder
dump_file=$remote_home/my_database_`date +%F_%X`.sql
aws_pem=$HOME/my_key.pem
aws_host=user@host
local_folder=$HOME/db_bk
pwd_stg=xxxxxxxxxxxxxxxx
pwd_prod=xxxxxxxxxxxxxxx
ssh -i $aws_pem $aws_host << EOF
mysqldump --column-statistics=0 --result-file=$dump_file -u user -p$pwd_prod -h $db_to_bk my_database
mysql -u user -p$pwd_prod -h $db_to_bk -N -e 'SHOW TABLES from my_database' > $remote_home/test.txt
sh -c 'cat test.txt | while read i ; do mysql -u user -p$pwd_prod -h $db_to_bk -D my_database --tee=$remote_home/rows.txt -e "SELECT COUNT(*) as $i FROM $i" ; done'
EOF
My while loop is not working because the "i" variable ends up empty. Could anyone give me a hand, please? I would like to understand how to handle data in such cases.
The local shell is "expanding" all of the $variable references in the here-document, but AIUI you want $i to be passed through to the remote shell and expanded there. To do this, escape (with a backslash) the $ characters you don't want the local shell to expand. I think it'll look like this:
ssh -i $aws_pem $aws_host << EOF
mysqldump --column-statistics=0 --result-file=$dump_file -u user -p$pwd_prod -h $db_to_bk my_database
mysql -u user -p$pwd_prod -h $db_to_bk -N -e 'SHOW TABLES from my_database' > $remote_home/test.txt
sh -c 'cat test.txt | while read i ; do mysql -u user -p$pwd_prod -h $db_to_bk -D my_database --tee=$remote_home/rows.txt -e "SELECT COUNT(*) as \$i FROM \$i" ; done'
EOF
You can test this by replacing the ssh -i $aws_pem $aws_host command with just cat, so it prints the here-document as it'll be passed to the ssh command (i.e. after the local shell has done its parsing and expansions, but before the remote shell has done its). You should see most of the variables replaced by their values (because those have to happen locally, where those variables are defined) but $i passed literally so the remote shell can expand it.
BTW, you should double-quote almost all of your variable references (e.g. ssh -i "$aws_pem" "$aws_host") to prevent weird parsing problems; shellcheck.net will point this out for the local commands (along with some other potential problems), but you should fix it for the remote commands as well (except $i, since that's already double-quoted as part of the SELECT command).
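A stand-alone demo of that escaping rule, using plain cat instead of ssh (the variable names here are made up for illustration):

```shell
# $local_var is expanded by the shell that reads the here-document;
# the backslash keeps \$remote_var literal, so a remote shell would
# expand it later.
local_var=expanded
doc=$(cat <<EOF
local:  $local_var
remote: \$remote_var
EOF
)
echo "$doc"
```

The first line comes out with the value substituted; the second keeps the literal $remote_var for the remote side.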

Sql in bash script (postgres)

I have a command in a bash script to rename a database.
It works, for example:
psql -U $User -t -A -q -c 'ALTER DATABASE "Old_Name" RENAME TO "New_Name"'
But if I do this:
O_Name='Old_Name'
N_Name='New_Name'
psql -U $User -t -A -q -c 'ALTER DATABASE "$O_Name" RENAME TO "$N_Name"'
It does not work; I think SQL gets the literal $O_Name rather than Old_Name.
How to pass the value of a variable bash to sql?
Single quotes don't allow for environment variable expansion. Use double quotes instead (and escape the nested quotes). Like,
psql -U $User -t -A -q -c "ALTER DATABASE \"$O_Name\" RENAME TO \"$N_Name\""
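A self-contained illustration of the quoting difference, with echo in place of psql but the same strings:

```shell
O_Name='Old_Name'
N_Name='New_Name'

# Single quotes: the shell passes $O_Name through untouched.
sql_single='ALTER DATABASE "$O_Name" RENAME TO "$N_Name"'
# Double quotes: the shell substitutes the values first; the inner
# double quotes are escaped so they survive for SQL.
sql_double="ALTER DATABASE \"$O_Name\" RENAME TO \"$N_Name\""

echo "$sql_single"
echo "$sql_double"
```

Only the second form hands psql a statement that actually names Old_Name and New_Name.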

Execute cqlsh inside script

I am trying to execute cqlsh inside a bash script. My script is below. When I try to execute the sh file, it returns "cqlsh: command not found".
#!/bin/bash
set -x
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
cqlsh -e "SELECT * FROM msg.msg_log limit 1;" > /home/yunus/sh/cqlshcontrol.txt
error1=$( more /home/yunus/sh/cqlshcontrol.txt | wc -l )
if [ $error1 -lt 1 ]; then
curl -S -X POST --data "payload={\"text\": \" Cqlsh not responding, Connection Problem \",\"username\":\"Elevate Cassandra1\",\"icon_emoji\":\"${SLACK_ICON}\"}" https://hooks.slack.com/services/
fi
Some suggestions:
- Use [[/]] over [/].
- The return value of $() is not an error value and should be named lines or something more meaningful. The lack of another error variable in the code makes the appended number (the 1 in error1) seem even odder.
- There's no reason to use more or a pipe inside of that subshell. Just run wc -l on your file.
- Are you sure cqlsh is in the PATH? Try which cqlsh to find it.
- wc will never return a negative value, so comparing for equality with zero would be clearer and cover just as many potential cases.
Otherwise, if that doesn't get you out of your confusion, please show the output when you try to run it.
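Those suggestions applied to the line-count check might look like this (cqlsh is stubbed out with an empty redirect so the logic runs anywhere, and a message variable stands in for the curl alert):

```shell
outfile=$(mktemp)
: > "$outfile"                  # in the real script: cqlsh -e "SELECT * FROM msg.msg_log limit 1;" > "$outfile"
lines=$(wc -l < "$outfile")     # wc reads the file directly; no more/pipe needed
msg=""
if [[ $lines -eq 0 ]]; then     # [[ ]] and an equality test against zero
    msg="Cqlsh not responding"  # here the real script would POST to Slack
fi
echo "$msg"
rm -f "$outfile"
```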

How to run a SQL script in tsql

I'm using tsql (installed with FreeTDS) and I'm wondering if there's a way to run a SQL script in tsql from the command line and get the result in a text file.
For example, in psql I can do:
psql -U username -c "COPY (SELECT * FROM some_table) TO 'out.csv' WITH CSV HEADER"
Or:
psql -U username -c "\i script.sql"
And in script.sql do:
\o out.csv
SELECT * FROM some_table;
Is there a way for doing this in tsql? I have read the linux man page and search everywhere but I just don't find a way.
I think you can try "bsqldb"; see http://linux.die.net/man/1/bsqldb
I really didn't find how to do this, and I'm starting to think it just can't be done in tsql. However, I solved my specific problem by redirecting stdin and stdout. I use a bash script like this:
tsql connection parameters < tsql_input.sql > tsql_output.csv
lines=$(wc -l < tsql_output.csv)
# The output includes the header info of tsql and the "(n number of rows)" message
# so I use a combination of head and tail to select the lines I need
headvar=$((lines-2))
tailvar=$((lines-8))
head -$headvar tsql_output.csv | tail -$tailvar > new_output_file.csv
And that saves just the query result on 'new_output_file.csv'
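A toy reproduction of that trimming arithmetic, with a fabricated six-line file (two header lines, three data rows, one trailer) in place of real tsql output; the real script subtracts 2 and 8 because tsql prints more header lines:

```shell
# Fake tsql output: 2 header lines, 3 rows, 1 trailer line.
printf 'header1\nheader2\nrow1\nrow2\nrow3\n(3 rows affected)\n' > tsql_output.csv
lines=$(wc -l < tsql_output.csv)
headvar=$((lines-1))    # keep everything but the 1 trailer line
tailvar=$((lines-3))    # of that, keep everything but the 2 header lines
head -$headvar tsql_output.csv | tail -$tailvar > new_output_file.csv
result=$(cat new_output_file.csv)
echo "$result"
rm -f tsql_output.csv new_output_file.csv
```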
freebcp "select * from mytable where a=b" queryout sample.csv -U anoop -P mypassword -S hostname.com -D dbname -t , -c
This command works like a charm. Kudos to FreeTDS...
tsql -U username -P password -p 1433 > out.csv <<EOS
SELECT * FROM some_table;
go
EOS

eval and string "return"

I'm creating a bash script to run Netezza queries.
here's an example of what I have to do:
nzsql -host localhost -port 123456 -d db -u usr -pw pwd -A -t -c "insert into TABLE (name,surname) values ('m','sc')"
and it should return
INSERT 0 1
What I need is retrieve the number "1" which means that 1 row was inserted.
For this, I'd need to retrieve the whole string "INSERT 0 1" and work on it.
according to http://www.enzeecommunity.com/thread/2423 this should work:
cmnd_output=`nzsql -host $NZ_HOST -d $NZ_DATABASE -u $NZ_USER -pw $NZ_PASSWORD -A -t -c "insert into TEST values ('test 1')"`
But I can't get it to work with this: ($2 is right because when I run it from the terminal it works just fine)
cmd_out=`$2` or cmd_out=`"$2"` or cmd_out="`$2`" or cmd_out=`"'$2'"`
cmd_out=$($2) or cmd_out="$($2)" or cmd_out=$("$2")
It tells me "command not found", just as if there were a string-quoting problem with $2.
I've however managed to execute $2 with eval
eval "$2"
and it works great, the command $2 is executed just fine.
But, I can't use eval in this case as I want to store in a variable that "INSERT 0 1".
A simple
variable_int=`$function '$arg1' '$arg2'`
without the eval won't do?
To assign return values from functions to a shell variable, use command substitution
variable=$(function arg1 arg2)
Why do you need eval?
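A minimal sketch of that advice, with a shell function standing in for the nzsql call (the output string is fabricated, but it has the "INSERT 0 1" shape the question describes):

```shell
# Stand-in for the real nzsql invocation; it just prints what the
# question says nzsql returns.
run_insert() {
    printf 'INSERT 0 1\n'
}

# Command substitution captures the function's stdout; no eval needed.
cmnd_output=$(run_insert)

# Strip everything up to the last space to get the row count.
rows=${cmnd_output##* }
echo "$rows"
```

This is also why `cmd_out=$($2)` fails when $2 holds a whole quoted command line: word splitting does not re-parse the embedded quotes, so the first word is looked up as a single (nonexistent) command name.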
When you run into a problem like this I find it's always very useful to run with the -x option; just change the top shebang line like so:
#!/bin/bash -x
That'll print out each line as it's currently interpreted before executing it. You can see how your variables are being mangled and use that to fix the problem.
