BCP Command on Shell (.sh file) - Linux

I have a .sh script that does this:
bcp "EXEC SPName" queryout "test.csv" -k -w -t"," -S "$server" -U "$user" -P "$pass"
The variables $server, $user and $pass are being read from an external config file.
The problem is that the variables don't work; I always get a connection timeout. For example, the same command with the variables hard-coded works fine:
bcp "EXEC SPName" queryout "test.csv" -k -w -t"," -S "TEST" -U "admin" -P "admin"
How can I make the command dynamic?

I found the problem: I was reading the variables from an external JSON file created on Windows, and each value ended with "\r", so the command could not execute.
How I solved it:
sed -i 's/\r//g' YourFile.json
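Alternatively, the carriage returns can be stripped as the values are read, instead of rewriting the file. A minimal sketch, assuming the config is parsed with jq and using illustrative field names:
# Read each value from the Windows-created JSON and drop any trailing \r
server=$(jq -r '.server' config.json | tr -d '\r')
user=$(jq -r '.user' config.json | tr -d '\r')
pass=$(jq -r '.pass' config.json | tr -d '\r')
bcp "EXEC SPName" queryout "test.csv" -k -w -t"," -S "$server" -U "$user" -P "$pass"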

SSH remote execution - How to declare a variable inside EOF block (Bash script)

I have the following code in a bash script:
remote_home=/home/folder
dump_file=$remote_home/my_database_`date +%F_%X`.sql
aws_pem=$HOME/my_key.pem
aws_host=user#host
local_folder=$HOME/db_bk
pwd_stg=xxxxxxxxxxxxxxxx
pwd_prod=xxxxxxxxxxxxxxx
ssh -i $aws_pem $aws_host << EOF
mysqldump --column-statistics=0 --result-file=$dump_file -u user -p$pwd_prod -h $db_to_bk my_database
mysql -u user -p$pwd_prod -h $db_to_bk -N -e 'SHOW TABLES from my_database' > $remote_home/test.txt
sh -c 'cat test.txt | while read i ; do mysql -u user -p$pwd_prod -h $db_to_bk -D my_database --tee=$remote_home/rows.txt -e "SELECT COUNT(*) as $i FROM $i" ; done'
EOF
My while loop is not working because the "i" variable ends up empty. Could anyone give me a hand, please? I would like to understand how to handle data in such cases.
The local shell is "expanding" all of the $variable references in the here-document, but AIUI you want $i to be passed through to the remote shell and expanded there. To do this, escape (with a backslash) the $ characters you don't want the local shell to expand. I think it'll look like this:
ssh -i $aws_pem $aws_host << EOF
mysqldump --column-statistics=0 --result-file=$dump_file -u user -p$pwd_prod -h $db_to_bk my_database
mysql -u user -p$pwd_prod -h $db_to_bk -N -e 'SHOW TABLES from my_database' > $remote_home/test.txt
sh -c 'cat test.txt | while read i ; do mysql -u user -p$pwd_prod -h $db_to_bk -D my_database --tee=$remote_home/rows.txt -e "SELECT COUNT(*) as \$i FROM \$i" ; done'
EOF
You can test this by replacing the ssh -i $aws_pem $aws_host command with just cat, so it prints the here-document as it'll be passed to the ssh command (i.e. after the local shell has done its parsing and expansions, but before the remote shell has done its). You should see most of the variables replaced by their values (because those have to happen locally, where those variables are defined) but $i passed literally so the remote shell can expand it.
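For example, a trimmed-down sketch of that check (same idea, shortened here-document):
cat << EOF
mysql -u user -p$pwd_prod -h $db_to_bk -N -e 'SHOW TABLES from my_database' > $remote_home/test.txt
sh -c 'cat test.txt | while read i ; do mysql -u user -p$pwd_prod -h $db_to_bk -D my_database -e "SELECT COUNT(*) as \$i FROM \$i" ; done'
EOF
In the output, the locally defined variables ($pwd_prod, $db_to_bk, $remote_home) appear already expanded, while \$i is printed as a literal $i for the remote shell to expand.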
BTW, you should double-quote almost all of your variable references (e.g. ssh -i "$aws_pem" "$aws_host") to prevent weird parsing problems; shellcheck.net will point this out for the local commands (along with some other potential problems), but you should fix it for the remote commands as well (except $i, since that's already double-quoted as part of the SELECT command).

Sql in bash script (postgres)

I have a command in a bash script to rename a database.
It works, for example:
psql -U $User -t -A -q -c 'ALTER DATABASE "Old_Name" RENAME TO "New_Name"'
But if I do this:
O_Name='Old_Name'
N_Name='New_Name'
psql -U $User -t -A -q -c 'ALTER DATABASE "$O_Name" RENAME TO "$N_Name"'
It doesn't work; I think SQL gets $O_Name, not Old_Name.
How do I pass the value of a bash variable to SQL?
Single quotes don't allow for variable expansion. Use double quotes instead (and escape the nested quotes), like:
psql -U $User -t -A -q -c "ALTER DATABASE \"$O_Name\" RENAME TO \"$N_Name\""
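For instance, a minimal sketch of the whole thing (the values here are only illustrative):
#!/bin/bash
User='postgres'
O_Name='Old_Name'
N_Name='New_Name'
# Double quotes let bash expand $O_Name and $N_Name; the escaped \" keep the identifiers quoted for Postgres.
psql -U "$User" -t -A -q -c "ALTER DATABASE \"$O_Name\" RENAME TO \"$N_Name\""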

Error when stacking SSH command arguments within a bash script using other scripts as variables

I have a CSV file called addresses.csv which looks like this:
node-1,xx.xxx.xx.xx,us-central-a
....
node-9,xxx.xx.xxx.xx,us-east1-a
I have the script below, called 0run.sh:
#!/bin/bash
username='user'
persist="bash /home/${username}/Documents/scripts/disk/persistentDisk.sh"
first="bash /home/${username}/Documents/scripts/disk/firstAttach.sh"
while IFS=, read -r int ip <&3; do
if [ "$int" == "node-1" ]; then
--->ssh -i ~/.ssh/key -o StrictHostKeyChecking=no -l ${username} ${ip} "${persist}; ${first}"<---
else
ssh -i ~/.ssh/key -o StrictHostKeyChecking=no -l ${username} ${ip} "${first}"
fi
done 3<addresses.csv
The error occurs in the part of the code where I drew the arrows.
When it runs on node-1, instead of running ..persistentDisk.sh followed by ..firstAttach.sh, it only runs ..persistentDisk.sh, and gives me the following error before ..persistentDisk.sh runs:
bash: /home/user/Documents/scripts/disk/firstAttach.sh: No such file or directory
The rest of the script runs completely fine. The only error occurs at this one part where it misses the 2nd script.
When I run the command like this, it runs fine:
ssh -i ~/.ssh/key -o StrictHostKeyChecking=no -l ${username} ${ext} "${first}"
When I run it like this, it runs fine as well:
ssh -i ~/.ssh/key -o StrictHostKeyChecking=no -l user xxx.xx.xxx.xx "bash /home/${username}/Documents/scripts/disk/persistentDisk.sh; bash /home/${username}/Documents/scripts/disk/firstAttach.sh"
When I run the command with a \ before the ; to escape it, like this:
ssh -i ~/.ssh/key -o StrictHostKeyChecking=no -l ${username} ${ext} "${persist}\; ${first}"
I get the following error, and neither script runs within the node-1 part of the code, but the rest of the code's else branches run fine:
bash: /home/user/Documents/scripts/disk/persistentDisk.sh;: No such file or directory
Why can't I stack the 2 commands within the if statement in the ssh using variables?
If I understand correctly, your real problem comes down to keeping STDIN free for interaction on the target host!
About read and redirection
Try using:
#!/bin/bash
username='user'
persist="bash /home/${username}/Documents/scripts/disk/persistentDisk.sh"
first="bash /home/${username}/Documents/scripts/disk/firstAttach.sh"
while IFS=, read -r -u $list int ip foo; do
if [ "$int" == "node-1" ]; then
echo CMD... $ip, $persist
else
[ "$ip" ] && echo CMD... $ip, $first
fi
done {list}<addresses.csv
Tested, this produces:
CMD... xx.xxx.xx.xx, bash /home/user/Documents/scripts/disk/persistentDisk.sh
CMD... xxx.xx.xxx.xx, bash /home/user/Documents/scripts/disk/firstAttach.sh
The -u flag to read tells it to use file descriptor ${list} instead of STDIN.
foo is a throwaway variable used to prevent the rest of the line from being stored in $ip (which would otherwise end up as xx.xxx.xx.xx,us-central-a in this case).
{list}</path/to/filename creates a new variable by finding a free file descriptor.
About ssh (and redirection)
You could use:
#!/bin/bash
username='user'
persist="/home/${username}/Documents/scripts/disk/persistentDisk.sh"
first="/home/${username}/Documents/scripts/disk/firstAttach.sh"
while IFS=, read -r -u $list int ip foo; do
[ "$int" = "node-1" ] && cmd=persist || cmd=first
[ "$ip" ] && ssh -i ~/.ssh/key -t -o StrictHostKeyChecking=no \
-l ${username} ${ip} /bin/bash "${!cmd}"
done {list}<addresses.csv
By using this syntax, you keep STDIN free for the script running on the target host.

Shell script: execute 2 commands and keep first running

I'm trying to write a shell script for my Docker image where:
an MSSQL server is started
database setup happens
However, with my current script my SQL Server instance stops as soon as the data import is done. Could anyone point out what I'm doing wrong?
#!/bin/bash
database=myDB
wait_time=30s
password=myPw
exec /opt/mssql/bin/sqlservr &
echo importing data will start in $wait_time...
sleep $wait_time
echo importing data...
/opt/mssql-tools/bin/sqlcmd -S 0.0.0.0 -U sa -P $password -i ./init.sql
for entry in "table/*.sql"
do
echo executing $entry
/opt/mssql-tools/bin/sqlcmd -S 0.0.0.0 -U sa -P $password -i $entry
done
for entry in "data/*.csv"
do
shortname=$(echo $entry | cut -f 1 -d '.' | cut -f 2 -d '/')
tableName=$database.dbo.$shortname
echo importing $tableName from $entry
/opt/mssql-tools/bin/bcp $tableName in $entry -c -t',' -F 2 -S 0.0.0.0 -U sa -P $password
done
I did not see any clear mistakes in your shell script. I am just advising the following:
Try running the server from the current shell of the script, without exec:
/opt/mssql/bin/sqlservr &
Put some echo statements in both loops to check what is going on there.
Hope this will help.
It seems I needed to add set -m to resolve this.
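For reference, one common shape of that fix (a sketch only, reusing the variables from the script above; not necessarily the poster's exact solution):
#!/bin/bash
set -m                                   # enable job control so the server can be foregrounded later
/opt/mssql/bin/sqlservr &                # start SQL Server as a background job (no exec)
sleep $wait_time                         # give the server time to come up
/opt/mssql-tools/bin/sqlcmd -S 0.0.0.0 -U sa -P $password -i ./init.sql
# ... run the table/*.sql and data/*.csv imports here ...
fg %1                                    # bring SQL Server back to the foreground so the container stays alive as long as the server runs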

Why does a bash script break when it meets a space in this example?

I need to execute the following command on multiple servers:
mysql -h 127.0.0.1 -uroot -psecret mydatabase -e 'SELECT 1;'
So, I have a test1.sh script, which echoes a dynamic string:
#!/bin/bash
echo -n "mysql -h 127.0.0.1 -uroot -psecret mydatabase -e 'SELECT 1'"
And a test2.sh script, which executes the given string:
#!/bin/bash
CMD=`./test1.sh`
$CMD
If I execute ./test2.sh, I see the help output; the command is not executed.
If I remove the space in the MySQL query SELECT 1, or the whole -e param, and then execute the ./test2.sh script, everything works.
Why is this happening? Can you please explain this magic?
My bash version is 4.2.46.
When $CMD is expanded unquoted, bash only performs word splitting and globbing on the result; it does not re-parse quotes, so 'SELECT and 1' arrive at mysql as two literal arguments (quotes included) and mysql falls back to its help output. As long as you control and trust the command line coming from test1.sh, you can use the dreaded eval in test2.sh like this:
#!/bin/bash
cmd="$(./test1.sh)"
eval "$cmd"
See also: Why and when should eval use be avoided in shell scripts?
Can you try the test1.sh script like this:
#!/bin/bash
echo -e "mysql -h 127.0.0.1 -uroot -psecret mydatabase -e"
test2.sh
#!/bin/bash
CMD=$(./test1.sh)
${CMD} "SELECT 1"
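For completeness, a small sketch of an array-based alternative (not from either answer above): storing the command in a bash array keeps each word, including the quoted query, as a single argument:
#!/bin/bash
cmd=(mysql -h 127.0.0.1 -uroot -psecret mydatabase -e 'SELECT 1')
"${cmd[@]}"   # each array element is passed as exactly one argument, spaces included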
