bash redirect output to file but result is incomplete - linux

The question of redirecting a command's output has been asked many times already, however I am seeing strange behavior. I am using a bash shell (Debian), version
4.3.30(1)-release, and tried to redirect output to a file, but not everything gets logged in the file.
The binary I am trying to run is sauce-connect v4.4.1 for Linux (the Sauce Labs client that is publicly available on the internet).
If I run
#sudo ./bin/sc --doctor
it shows me the complete lines; it prints:
INFO: resolved to '23.42.27.27'
INFO: resolving 'g2.symcb.com' using DNS server '10.0.0.5'...
(followed by other lines)
INFO: 'google.com' is not in hosts file
INFO: URL https://google.com can be reached
However, if I redirect the same command to a file with the following command
#sudo ./bin/sc --doctor > alloutput.txt 2>&1
and do
#cat alloutput.txt
the same command's output is logged, but truncated, as follows:
INFO: resolved to '23.42.2me@mymachine:/opt/$
The last line is incomplete, and the lines that should follow are missing entirely.
I have tried >> for appending; it has the same problem. Using command &> alloutput.txt does not capture everything either. Can anyone point out how to get all lines of the above command logged completely to the text file?
UPDATE
In the end I managed to use the binary's native logging with --log
alloutput.txt, which gives me the complete, correct output.
However, I am leaving this question open as I am still wondering why output redirection loses some information/lines.

You should try stdbuf -o0, like this:
stdbuf -o0 ./bin/sc --doctor 2>&1 | tee -a alloutput.txt
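Why this works: when stdout is not a terminal (a file or a pipe), the C library typically switches from line buffering to block buffering, so output accumulates in a buffer (often 4 KiB) and is only written out when the buffer fills or the program exits cleanly. If the process is killed or crashes first, the buffered tail is lost, which would also explain the truncated last line in the question; stdbuf -o0 disables that buffering. A minimal demonstration of the effect, using grep (which block-buffers when its output is redirected):
( while sleep 1; do echo tick; done ) | timeout 3 grep tick > out.txt
cat out.txt    # likely empty: grep was killed before its buffer was flushed
( while sleep 1; do echo tick; done ) | stdbuf -o0 timeout 3 grep tick > out.txt
cat out.txt    # now the captured "tick" lines appear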

That is a funny problem; I've never seen that happen before. I am going to go out on a limb here and suggest this, see how it works:
sudo ./bin/sc --doctor 2>&1 | tee -a alloutput.txt

#commandtorun &> alloutput.txt
This command redirects both the error and the regular output to the same file.
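For reference, in bash &> is just shorthand for the classic redirection; these two lines are equivalent:
commandtorun > alloutput.txt 2>&1
commandtorun &> alloutput.txt
Note that this alone does not recover the lost lines from the question; that is a buffering issue, addressed by the stdbuf and tee answers above.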

Related

How to get logs in a file when using the timeout command?

I am trying to capture the logs of the command below into a text file, like this:
timeout 10 glxheads &> test.txt
But unfortunately, I am not getting any logs written to the text file with this approach.
In fact, no simple command run under timeout sends its output to a file.
Note:
The below command works,
glxheads &> test.txt
Could anyone suggest any ideas to get around this issue?
Thanks!
As per the link specified by Dmitri, I was able to resolve this issue by doing the following:
stdbuf -oL -eL timeout 10 glxheads &> test.txt
or using
unbuffer timeout 10 glxheads &> test.txt
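If unbuffer is not installed (it ships with the expect package), the script utility from util-linux should achieve the same thing: it runs the command in a pseudo-terminal, so the program keeps line-buffering as if it were writing to a screen. A sketch:
script -q -c "timeout 10 glxheads" test.txt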

Redirect output of Whatsapp bash script to file interactively for automation purpose

yowsup-cli is a library that allows you to send messages to WhatsApp users once authenticated.
With the command
yowsup-cli -a --interactive <PHONE_NUMBER_HERE> --wait --autoack --keepalive --config yowsup-master/src/yowsup-cli.config
I can interactively send or receive messages.
Once the command is executed you get a prompt like
MY_PHONE_NUMBER@s.whatsapp.net [27-12-2014 18:33]:THIS IS MY MESSAGE,TYPED ON MY PHONE. OPEN DOOR GARAGE
Enter Message or command: (/available, /lastseen, /unavailable)
I'm a total beginner, but I would like to redirect this content that gets printed to the terminal into a file, to analyze it further or to write a script that searches the file for keywords such as "OPEN GARAGE DOOR", so I could automate something.
This file obviously has to stay in sync with the program's output, but I don't know how to do that.
yowsup-cli -a --interactive <PHONE_NUMBER_HERE> --wait --autoack --keepalive --config yowsup-master/src/yowsup-cli.config > /path/to/my_file
doesn't work
Running Ubuntu 12.04.
I know yowsup is a Python library, but I don't know that language. I'm beginning to learn C, and I would like to do this in bash or, if that is not possible, in C.
Thanks
Pipe the output into tee instead of redirecting it into a file:
yowsup-cli -a --interactive <PHONE_NUMBER_HERE> --wait --autoack --keepalive --config yowsup-master/src/yowsup-cli.config 2>&1 | tee -a /path/to/my_file
The reason: with redirection you don't see the command's output, which makes interacting with it hard.
Piping into the tee command will echo all output to the terminal and append it to the given file.
Interestingly, in your command line (using redirection) you can still type blindly, or even type according to the yowsup-cli output you read in another terminal with:
tail -f /path/to/my_file
tail with the -f option prints the last 10 lines of the file as well as any new output from the yowsup-cli command.
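To go from watching the output to acting on it, you can pipe tail -f into a loop that scans each line for your keyword. A rough sketch; the echo is a placeholder for whatever automation command you want to run:
tail -f /path/to/my_file | while read -r line; do
    case "$line" in
        *"OPEN GARAGE DOOR"*) echo "keyword seen: run your garage command here" ;;
    esac
done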

Can we save the execution log when we run a command using PuTTY/Plink

I am using Plink to run a command on remote machine. In order to fully automate the process I
need to save the execution log somewhere. I am using a bat file:
C:\Ptty\plink.exe root@<IP> -pw <password> -m C:\Ptty\LaunchFile.txt
The file C:\Ptty\LaunchFile.txt contains the command that I want to run:
./Launch.sh jobName=<job name> restart.mode=false
Is there a way to save the execution log so that I can monitor it later... ?
Plink is a console application; actually, that's probably its only purpose. As such, its output can be redirected to a file as with any other command-line tool.
The following example redirects both standard and error output to the file output.log:
plink.exe -m script.txt username@example.com > output.log 2>&1
See also Redirect Windows cmd stdout and stderr to a single file.
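Putting it together with the .bat file from the question, the whole invocation with logging would look like this (paths and placeholders as in the question):
C:\Ptty\plink.exe root@<IP> -pw <password> -m C:\Ptty\LaunchFile.txt > C:\Ptty\output.log 2>&1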
Another way I log everything when I use putty.exe on Windows is to enable session logging in PuTTY itself (Session > Logging in the configuration dialog).

Cron / wget jobs intermittently not running - not getting into access log

I've a number of accounts running cron-started php jobs hourly.
The generic structure of the command is this:
wget -q -O - http://some.site.com/cron.php
Now, this used to be running just fine.
Lately, though, on a number of accounts it has started playing up - but only on this one server. Once or twice a day the php file is not run.
The access log is missing the relevant entry, while the cron log shows that the job was run.
We've added an option to the command to log things (-o /tmp/logfile), but it shows nothing.
I'm at a loss, really. I'm looking for ideas what can be wrong, or how to sidestep this issue as it has started taking up way too much of my time.
Has anyone seen anything remotely like this?
Thanks in advance!
Try this command:
wget -d -a /tmp/logfile -O - http://some.site.com/cron.php
With -q you turn off wget's output. With -d you turn on debug output (maybe -v for verbose output is already enough). With -a you append logging messages to /tmp/logfile instead of always creating a new file.
You can also use curl:
curl http://some.site.com/cron.php
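Note that, unlike wget -q, plain curl prints a progress meter when its output is redirected; for a cron job you would typically silence it while keeping error messages, for example:
curl -sS http://some.site.com/cron.php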

Check Whether a Web Application is Up or Not

I would like to write a script to check whether an application is up or not, using Unix shell scripting.
From Googling I found the script wget -O /dev/null -q http://mysite.com, but I am not sure how it works. Can someone please explain? It would be helpful for me.
Run the wget command
The -O option tells wget where to put the data that is retrieved.
/dev/null is a special UNIX file that is always empty; in other words, the data is discarded.
-q means quiet. Normally wget prints lots of info about its download progress, so we turn that off.
http://mysite.com is the URL of the exact web page that you want to retrieve.
Many programmers create a special page for this purpose that is short and contains status data. In that case, do not discard it, but save it to a log file by replacing -O /dev/null with -O mysite.log.
Check whether you can connect to your web server:
Connect to the port where your web server listens.
If it connects properly, your web server is up; otherwise it is down.
You can check further (e.g. whether the index page is correct).
See this shell script.
if wget -O /dev/null -q http://shiplu.mokadd.im; then
    echo "Site is up"
else
    echo "Site is down"
fi
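For use in a monitoring script you may also want the check to fail fast rather than hang on an unresponsive server; wget's standard --tries and --timeout options help here, e.g.:
wget -O /dev/null -q --tries=1 --timeout=10 http://mysite.com || echo "Site is down"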
