Where does the output go when running as a background process? - linux

My process outputs some log information to the console window. When I run it as a background process, where can I find the output logs?

It depends on the process and how you started it. If it writes to stdout (which is probable, given that the output normally goes to the terminal), you can redirect the output to a file with
command > logfile &
If you also want to log error messages from stderr, do
command > logfile 2> errorlogfile &
or
command > logfile 2>&1 &
to get everything in one file.
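If you also want the process to keep running after you close the terminal, a common pattern is to combine that redirection with nohup and then follow the log; a minimal sketch (myprog and logfile are placeholder names):
nohup myprog > logfile 2>&1 &
tail -f logfile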

If it's a systemd service, you can run journalctl -u <service-name>
You can jump to the latest entries by pressing **Shift + G**
Make sure systemd is installed: apt-get install systemd
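A few common journalctl variations, assuming a unit named myapp.service (substitute your own service name):
journalctl -u myapp.service                # full log for the unit
journalctl -u myapp.service -f             # follow new entries, like tail -f
journalctl -u myapp.service -n 100         # only the last 100 lines
journalctl -u myapp.service --since today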

Related

Jenkins: bash script run with nohup is neither working nor writing anything to the log

BACKGROUND
I would like to explain the scenario properly here.
I am running jenkins_2.73.3 in my cloud server with ubuntu 16.04.
Currently, there are 3 users in the server:
root
develop-user (which I had created for various purposes such as testing, deploying, etc.)
jenkins (which was created by Jenkins, of course; I also added this jenkins user to the sudoers group)
PROBLEM
I have a bash script that I am calling from a build step in Jenkins. Within this bash script, there is a nohup command for calling a separate deploy script in the background, such as:
#!/bin/bash
nohup deployScript.sh > $WORKSPACE/app.log 2>&1 & echo $! > save_pid.txt
After the build step is completed, I see that an id is generated inside save_pid.txt but app.log is surprisingly empty. I can't kill any processes with this generated pid. So, that means there isn't any process created in the first place here. Also, the deployScript.sh does not seem to have any effect at all. It's just not working. This happens every time I run the build in Jenkins. I can assure you that there is nothing wrong with the deployScript.sh.
I have tried running this bash script with the develop-user manually without Jenkins and it works perfectly. Contents are written to the log file and I can also use the generated pid to kill the process. I have also tested this in my local environment and it works.
QUESTION
I have been looking at this for days. What might be the root cause here? Where can I look for logs or other info? How is the pid generated while the log file stays empty? Is it a permission issue with the jenkins user? Please help.
You can use the line below inside the Execute Shell build step in Jenkins to run it in the background without the process being killed.
BUILD_ID=dontKillMe <command> &
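For example, an Execute Shell build step might look like the sketch below (deployScript.sh and the log path are the placeholders from the question); BUILD_ID=dontKillMe tells Jenkins' ProcessTreeKiller not to reap the background process when the build finishes (newer Jenkins versions use JENKINS_NODE_COOKIE=dontKillMe for the same purpose):
#!/bin/bash
BUILD_ID=dontKillMe nohup ./deployScript.sh > "$WORKSPACE/app.log" 2>&1 &
echo $! > save_pid.txt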
So, it turned out to be a permission issue, and the script also wasn't executable, as pointed out in the comments above.
So, now the bash script looks like below:
#!/bin/bash
# make the deploy script executable, then run it detached with its own log and save its pid
sudo chmod a+x deployScript.sh
sudo nohup deployScript.sh > $WORKSPACE/app.log 2>&1 & echo $! > save_pid.txt
This works.
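As a quick sanity check, the saved pid can be used to confirm that the detached script is actually running and to stop it later:
ps -p "$(cat save_pid.txt)"    # lists the process if it is still alive
kill "$(cat save_pid.txt)"     # stops it when you are done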

Obsidian scheduler for running linux scripts

I am using Obsidian Scheduler for scheduling various jobs on a Linux box, and I am trying to call shell scripts with a nohup command like
UPDATE 1:
nohup ./script.sh > output.txt &
UPDATE 2:
This is the error when I use nohup.
nohup: failed to run command ‘./test.sh &>./load.log &’: No such file or directory
I don't see anything being written to the output file.
And secondly, how can I verify that it is using the nohup command to execute the script?
Thanks

Can we save the execution log when we run a command using PuTTY/Plink

I am using Plink to run a command on a remote machine. In order to fully automate the process I need to save the execution log somewhere. I am using a bat file:
C:\Ptty\plink.exe root@<IP> -pw <password> -m C:\Ptty\LaunchFile.txt
The C:\Ptty\LaunchFile.txt contains the command that I want to run.
./Launch.sh jobName=<job name> restart.mode=false
Is there a way to save the execution log so that I can monitor it later?
Plink is a console application; actually, that's probably its only purpose. As such, its output can be redirected to a file as with any other command-line tool.
The following example redirects both standard output and error output to a file, output.log:
plink.exe -m script.txt username@example.com > output.log 2>&1
See also Redirect Windows cmd stdout and stderr to a single file.
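Putting that together with the batch file from the question, a sketch of the full line (the paths, user, and host are the ones from the question and are placeholders):
C:\Ptty\plink.exe root@<IP> -pw <password> -m C:\Ptty\LaunchFile.txt > C:\Ptty\output.log 2>&1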
This is one of my ways to log everything when I use putty.exe on Windows.

script command - bash linux terminal

I am running the command
script install-log.txt
the terminal successfully returns
Script started, file is install-log.txt
If I begin typing commands and receiving output to the screen
lsblk
fdisk -l
ls
echo ok
when I check the install-log.txt
nano install-log.txt
it is empty.
I thought all commands were supposed to be saved there until the session is finished?
I am using the Arch Linux installation CD, and wanted to save this log to record my installation setup commands.
You need to terminate the script session by running the 'exit' command. That won't exit your terminal as such. Then you can view your log file.
Here is the duplicate with more detailed info -> Bash script: Using "script" command from a bash script for logging a session
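A minimal session to illustrate; the log is only reliably complete once exit ends the recording:
script install-log.txt    # starts recording the session
lsblk
fdisk -l
exit                      # ends the recording and flushes install-log.txt
cat install-log.txt       # now shows the captured session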

JBoss Server Stopping

I'm having a problem with keeping a JBoss server running. Here's the command I'm using to start it:
sudo /JBOSS_HOME/bin/run.sh conf -b servername.domainname.tld
JBoss starts okay after about 4 minutes or so, and when I ps it, it shows up as a process. However, if I happen to log out of SSH and ps again, it's been stopped. Is there a way to start the server so it doesn't automatically stop when a user logs out of SSH?
I think the problem here is the standard output stream.
Redirect the output to a file and start the process in the background, like the following.
sudo /JBOSS_HOME/bin/run.sh conf -b servername.domainname.tld > log_file &
This may help.
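If the server still stops when you log out, the shell is probably sending it a hangup signal on exit; combining the same redirection with nohup (as in the answers above) is the usual workaround, for example:
sudo nohup /JBOSS_HOME/bin/run.sh conf -b servername.domainname.tld > log_file 2>&1 &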
