This question already has answers here:
How to kill a child process after a given timeout in Bash?
(9 answers)
simple timeout on I/O for command for linux
(3 answers)
Closed 5 years ago.
Here's my situation: I've written a script that runs in a while loop, but sometimes (say after 20-30 iterations) it stops unexpectedly.
I tried to debug it but I couldn't.
I noticed that it stops while executing a command and then just does nothing. So I was wondering: is there a way to tell another script when the first one has stalled, i.e. hasn't executed any command for, say, 120 seconds? Maybe by constantly observing the output of the first script, so that when it gives no output the second script kills the first one and starts it again? Sorry for my bad English; I hope I was clear.
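One approach (just a sketch; worker.sh and /tmp/worker.log are placeholder names for your script and its output file) is a small watchdog that restarts the script whenever its output file has not changed for 120 seconds:
#!/bin/bash
# Watchdog sketch: restart worker.sh when its log has been silent for 120 seconds.
LOG=/tmp/worker.log
LIMIT=120

start_worker() {
    ./worker.sh > "$LOG" 2>&1 &
    WORKER_PID=$!
}

start_worker
while true; do
    sleep 10
    last=$(stat -c %Y "$LOG" 2>/dev/null || echo 0)   # last modification time (GNU stat)
    now=$(date +%s)
    if (( now - last > LIMIT )); then
        echo "No output for ${LIMIT}s, restarting" >&2
        kill "$WORKER_PID" 2>/dev/null
        wait "$WORKER_PID" 2>/dev/null
        start_worker
    fi
done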
This question already has answers here:
How to abort execution in GHCI?
(3 answers)
Closed 9 months ago.
I found that when I execute
a = 1:a
a
in command line, I can press Ctrl+c to stop the infinite process.
But Ctrl+c fails to stop the process when I execute
length a
However, Ctrl+c works well when I execute
length [0..]
Why does Ctrl+c sometimes fail to stop the process? Is there any ways to stop the process when Ctrl+c is invalid?
Under Windows, there is a bug where, after pressing Ctrl+C, GHCi appears to exit but still seems to run in the background. The only documentation I could find is https://gitlab.haskell.org/ghc/ghc/-/issues/14150, but I've experienced it myself multiple times with version 9.2.1 when working with infinite data structures such as the infinite list [1..].
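As a last resort when Ctrl+C does nothing, you can kill the GHCi process from another shell (a Linux/macOS sketch; on Windows, Task Manager plays the same role):
# From another terminal: find the hung GHCi and terminate it.
pgrep -af ghc        # list matching processes with their PIDs
kill -TERM 12345     # replace 12345 with the PID shown above
# kill -KILL 12345   # last resort if it ignores TERM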
This question already has answers here:
Command line command to auto-kill a command after a certain amount of time
(15 answers)
Closed 3 years ago.
I am trying to run a script which reads input from a text file and executes a command once for each entry in it.
Below is an overview:
cat /tmp/file.txt | while read name
do
<<execute a command using value of $name>>
done
What is happening is that sometimes the command hangs for a particular $name due to known issues. In such cases I need the command for each value of $name to run for at most X seconds; if it cannot complete within that time, the process should be terminated and the loop should move on to the next entry.
I was able to make use of sleep and kill, but it terminated the entire loop. I want the remaining values to be processed even if the command hangs on one row/value.
Please advise.
Sounds like you might want something like timeout.
timeout 4 <command>
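A sketch of how that fits into your loop (30 is an arbitrary example limit, and your_command stands in for the real command):
while read -r name
do
    if ! timeout 30 your_command "$name"; then
        # GNU timeout exits with status 124 when it had to kill the command
        echo "Command for '$name' timed out or failed, moving on" >&2
    fi
done < /tmp/file.txt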
This question already has answers here:
How can I launch a new process that is NOT a child of the original process?
(5 answers)
Closed 5 years ago.
I am working with a complex bash script that performs various operations and then restarts the Linux (CentOS 6) server on which it was run. This script is invoked from a couple of different places. I am looking for a way to start this complex bash script in a new process tree.
I put together the following text diagram to illustrate the scenario:
a_process_that_calls_script
\_ subshells/processes/commands_of_calling_process
...
bash_script
\_ subshells/commands/other_scripts_called
Potential duplicate: How can I launch a new process that is NOT a child of the original process?
If you have a process invoke the script as a grandchild process, and then the child exits, the grandchild will become a child of the init process.
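A minimal sketch of that trick in bash (/path/to/complex_script.sh and the log path are placeholders): the ( ... & ) subshell is the intermediate child; it starts the script in the background and exits immediately, so the script is re-parented to init and is no longer part of the caller's process tree.
# Start the script as an orphaned grandchild of the current shell.
( /path/to/complex_script.sh >/var/log/complex_script.log 2>&1 & )
You may also want to prefix the script with nohup or setsid so a hang-up of the calling terminal cannot reach it.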
This question already has answers here:
Cron jobs -- to run every 5 seconds
(6 answers)
Closed 8 years ago.
First of all, I'm a Linux (administrator|developer) newbie.
I need to run a bash script every 5 seconds; it's very simple, it just exports service information to text files.
I tried to do this with the cron daemon, but it can only run a job once per minute at most.
I discovered the skeleton script and have a few questions about this:
Do I need to write any special code in my bash file?
How do I run it every 5 seconds?
Is there a best-practices manual?
You're right, it's not possible through cron, as the daemon only wakes up once every minute (or when the job list is modified).
Put whatever you want to run inside an infinite while loop in a script, with a sleep of 5 seconds between iterations,
something like
while true
do
    run_your_cmds_here   # your command(s) go here
    sleep 5
done
That said, I'm not sure anything really needs that kind of monitoring.
As for best practice: please don't try to do it with cron.
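If you would still rather have cron supervise it, a common compromise (a sketch; export_info.sh is a placeholder for your script) is to let cron start a one-minute wrapper that runs the job twelve times, five seconds apart:
#!/bin/bash
# every_5s_wrapper.sh - invoked by cron once a minute:
#   * * * * * /usr/local/bin/every_5s_wrapper.sh
for i in $(seq 1 12); do
    /usr/local/bin/export_info.sh &   # placeholder for your export script
    sleep 5
done
On distributions with systemd, a timer unit with OnUnitActiveSec=5s is another option.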
This question already has answers here:
Multithreading in Bash [duplicate]
(3 answers)
Closed 9 years ago.
I am basically trying to write a bash script that suspends some virtual machines running on a host. However, if I write the script sequentially, the VMs are suspended one at a time, and suspending a VM takes some time while its state is saved. How can I make my script suspend the VMs concurrently? In other words, how can I run commands concurrently in a bash script instead of sequentially?
You can background the tasks.
some_bash_command --with --options &
Adding the & sends the command to the background so the script can immediately start the next one.
Put an & after the command that suspends the VM.
For example, if
cmd_to_suspend_vm
were your command to run, you would run:
cmd_to_suspend_vm &
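So a loop over your VMs (a sketch; cmd_to_suspend_vm and the VM names are placeholders) could suspend them all in parallel and then wait for every suspend to finish:
for vm in vm1 vm2 vm3; do
    cmd_to_suspend_vm "$vm" &    # each suspend runs in the background
done
wait                             # block until all backgrounded suspends complete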