Keep process running on remote machine after exiting ssh session inside bash script [closed] - linux

I have a bash script that logs in to a remote machine over ssh, starts iperf, then logs out and does other things locally. I want iperf to keep running after the script logs out. I have tried nohup, disown and setsid, but none of them seem to work when used inside the bash script. I have also tried launching iperf from another script; that didn't work either.
Here's the relevant part of the script, with the nohup attempt:
ssh root@10.101.10.35 &>/dev/null & << EOF
nohup iperf -s -B 192.168.99.1 &>/dev/null &
EOF

You need to redirect stdin, stdout and stderr somewhere other than your terminal, like so:
ssh root@10.101.10.35 'iperf -s -B 192.168.99.1 < /dev/null > /tmp/iperf_combined.log 2>&1 &'
stdin is taken from /dev/null (nothing is entered)
stdout and stderr go to /tmp/iperf_combined.log
The process will run on the remote machine until you kill it manually or until it exits on its own.
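A minimal sketch of how the calling script could verify that the detached server survived the logout (pgrep on the remote host is an assumption; the addresses and log path are the ones from the question):
ssh root@10.101.10.35 'iperf -s -B 192.168.99.1 < /dev/null > /tmp/iperf_combined.log 2>&1 &'
# Later, from the local script, check that the server is still running remotely:
ssh root@10.101.10.35 'pgrep -a iperf'   # prints the PID and command line if it survived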
Edit (as a reply to the poster's comment):
If you want to run multiple commands in the same ssh session, you may use:
ssh -T root@10.101.10.35 << EOF
iperf -s -B 192.168.99.1 < /dev/null > /tmp/iperf_combined_1.log 2>&1 &
iperf -s -B random_ip2 < /dev/null > /tmp/iperf_combined_2.log 2>&1 &
EOF
As per the ssh man page:
-T Disable pseudo-tty allocation.
A detailed explanation of pseudo-tty allocation can be found here.

Related

Closing an open ssh port in Linux with one line [closed]

I often ssh tunnel into RStudio on a server I have set up. I'm trying to devise a single command that I can use to close the ssh port. I know that I can find the PID for localhost:1234 with:
sudo lsof -i :1234
And I also know that I can kill the process with:
sudo kill $(sudo lsof -t -i:1234)
The issue is that if I have Chrome open to use RStudio Server, the second command kills the open Chrome browser as well. Is there a way to modify the second command so that it closes the ssh port but leaves Chrome alone? There are two PIDs, so I could in theory grep for 'ssh', but I'm not sure how.
EDIT FOR CLARITY:
For example, the first command lists two processes on the port. I want to modify the second command so that it kills PID 15834 but not 30117. Apologies, I hope that makes more sense.
Try this:
sudo kill $(sudo lsof -t -i:1234 -a -c ssh)
-t prints only the PID, -c restricts the listing to processes whose command name begins with the given characters (here ssh), and -a ANDs the two filters together (without it, lsof ORs its selection options and would match every ssh process).
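A small sketch of how you might dry-run the selection before killing anything (the output columns are lsof defaults):
sudo lsof -a -i :1234 -c ssh                   # show only the ssh process that holds the tunnel
sudo kill $(sudo lsof -t -a -i:1234 -c ssh)    # then kill exactly that PID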
Just firewall the port:
sudo iptables -I INPUT -p tcp --dport 1234 -j DROP

How to change desktop background in Linux using Cron [closed]

I found a terminal command to change the desktop wallpaper:
gsettings set org.gnome.desktop.background picture-uri file:///path/to/your/image.png
but this command does not work from cron, or on other desktops such as MATE. The pgrep gnome-session approach shows nothing for me.
You can use dconf to change the background. Here is an example of a simple bash script:
#!/bin/bash
# Pick a random image file from the current directory tree.
WP="$(find ~+ -type f -exec mimetype {} + 2>/dev/null | awk -F': +' '{ if ($2 ~ /^image\//) print $1 }' | sort -R | tail -30 | shuf -n 1)"
# Write it as the MATE wallpaper key.
dconf write /org/mate/desktop/background/picture-filename "'${WP}'"
You can find the distro-specific key using the GUI app dconf-editor.
But to use this script from cron you need the session environment variables. The pgrep gnome-session command doesn't work in Mint and other non-GNOME desktops. To solve this, save the environment variables of the specific user by running this command at session startup:
env > ~/cronenv && sed -i '/%s/d' ~/cronenv
Now you have a cronenv file (without the substitution vars containing %s) in the user's home directory. Just restore it in cron before running dconf:
*/1 7-21 * * * cd ~/Pictures && env $(cat ~/cronenv | xargs) /path/to/first/script
Use crontab -e to edit the cron jobs for the current user. Everything works fine!
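If the same script has to handle both MATE and GNOME sessions, a hedged variant (the gsettings key is the one from the question; the pgrep -x mate-session check is an assumption):
#!/bin/bash
# Pick a random image from the current directory tree, as above.
WP="$(find ~+ -type f -exec mimetype {} + 2>/dev/null | awk -F': +' '{ if ($2 ~ /^image\//) print $1 }' | sort -R | tail -30 | shuf -n 1)"
if pgrep -x mate-session >/dev/null; then
    # MATE keeps the wallpaper path under /org/mate/...
    dconf write /org/mate/desktop/background/picture-filename "'${WP}'"
else
    # GNOME uses a picture-uri with a file:// prefix instead.
    gsettings set org.gnome.desktop.background picture-uri "file://${WP}"
fi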

Bash script check permissions to run command on remote [closed]

I have a local development machine, and my bash script sends commands to a remote server.
How can I write bash code to check whether I am allowed to run the remote command, so that I can handle the success/failure response in my script?
Alternatively, how can I capture the output so that I can parse it and detect whether it succeeded? The difficulty with parsing is that the ssh command may trigger a password prompt, which I can't interfere with.
The bash script uses ssh -qt to send the remote commands.
Command
ssh user@host -qt "sudo -u www /usr/local/bin/php /mnt/data/script.php"
Output:
[sudo] password for xxx:
Sorry, user xxx is not allowed to execute '/usr/local/bin/php /mnt/data/script.php' as www on host.domain.com
Assuming that user != root above: you can't. There's no way to read /etc/sudoers or /etc/sudoers.d/* on a normally set-up Linux box if you're not root, so apart from trial and error there's nothing to be done.
As for capturing the result, that's fairly simple (parsing it, of course, is a different story, depending on what you're doing over there).
output=$( ssh user@host -qt "sudo -u www /usr/local/bin/php /mnt/data/script.php" 2>&1 )
After the execution (and after you type the password for sudo):
echo $?          # the return code of what happened on the far end; 0 means success (check it before running anything else)
echo "$output"   # the captured text to parse
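Putting the return code and the captured output together, a minimal sketch of how the calling script might branch on the result (the 'not allowed' string is taken from the sudo output above; adjust it to your sudoers message):
output=$( ssh user@host -qt "sudo -u www /usr/local/bin/php /mnt/data/script.php" 2>&1 )
status=$?                                  # save the exit code before it gets overwritten
if [ "$status" -eq 0 ]; then
    echo "remote command succeeded"
elif printf '%s\n' "$output" | grep -q "not allowed to execute"; then
    echo "user lacks sudo rights for that command" >&2
else
    echo "remote command failed with status $status" >&2
fi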

Process killed after closing terminal SSH [closed]

I'm trying to clear the Facebook cache on my server every 2 seconds, so I logged in over SSH and ran this command:
while true; do sleep 2; curl -F id="http://twd.ma" -F scrape=true -F access_token='token' -F appID=appID https://graph.facebook.com; done &
Everything worked fine and the cache started being cleared every 2 seconds. However, when I close the SSH terminal the cache stops being cleared and I think the process is killed. What should I do?
Your command stops executing because when you log out, the shell is lost. The '&' only means that the command runs in the background for as long as the shell is active.
You can do the following:
Write your script into a file, e.g. clearcache.sh, and omit the '&':
#!/bin/bash
while true; do
sleep 2
curl -F id="http://twd.ma" -F scrape=true -F access_token='token' -F appID=appID https://graph.facebook.com
done
Then add the path to your script to /etc/rc.local:
/path/to/clearcache.sh > /dev/null 2>&1 &
The '> /dev/null 2>&1' means that all output your script produces is discarded.
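If you would rather start it by hand from an ssh session instead of at boot, a minimal sketch using nohup (same idea, just launched manually; the path matches the rc.local line above):
nohup /path/to/clearcache.sh > /dev/null 2>&1 &
disown    # optional in bash: also removes the job from the shell's job table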
If screen is available to you then you can start a screen session by running screen, run your commands, then press ctrl-a ctrl-d to detach the session.
When you log in later you can issue screen -r to reconnect to the detached session.
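For example, with a named session (the name clearcache is only illustrative):
screen -S clearcache    # start a named session and run the loop inside it
# press ctrl-a then d to detach; the loop keeps running on the server
screen -r clearcache    # reattach from a later ssh login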

In Linux how do you make a command run in the background without it outputting to the screen? [closed]

I know this sounds like a silly question at first glance, but I've tried everything.
I want to execute the command arpspoof in the Kali Linux terminal but I do not want to see the endless output.
First I try this:
arpspoof -t 10.1.1.1 10.1.1.2 >/dev/null
And it still outputs to the screen.
Then I try this:
arpspoof -t 10.1.1.1 10.1.1.2 & >/dev/null
And it still outputs to the screen.
Then I add another one at the end:
arpspoof -t 10.1.1.1 10.1.1.2 & >/dev/null &
And it still outputs to the freakin screen.
Try
arpspoof -t 10.1.1.1 10.1.1.2 2>/dev/null 1>/dev/null &
where:
arpspoof -t 10.1.1.1 10.1.1.2 is your command
2>/dev/null redirects standard error (STDERR) to the "bit bucket"
1>/dev/null redirects standard out (STDOUT) to the "bit bucket"
& sets the entire command line to run in the background
This version is more verbose but perhaps clearer to understand.
A somewhat redundant answer, but I prefer this format:
arpspoof -t 10.1.1.1 10.1.1.2 >/dev/null 2>&1
Be sure that you don't have background processes still running (therefore still writing to the console/screen) from previous attempts to redirect the output.
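Combining both answers, a short sketch that detaches the command, silences it, and checks for stray copies from earlier attempts (pgrep and pkill availability is an assumption; arpspoof itself needs root):
arpspoof -t 10.1.1.1 10.1.1.2 >/dev/null 2>&1 &
pgrep -a arpspoof    # list any instances still running from earlier attempts
# pkill arpspoof     # uncomment to stop them all and start clean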
