How to take a continuous backup of Linux GNOME terminal logs (commands and their output)?

I want to take a continuous backup of the logs being printed in my Linux terminal. Is it possible that whenever something is printed in my terminal, it automatically gets written to a text file with a timestamp?

Use the script command, i.e.
script log.txt
at the start of your session. You can also add this to your bash profile so that logging starts whenever you open a terminal. You need to use
script -a log.txt
to append instead of overwriting. Don't try to cat or tail the log file while still in the session; press CTRL-D to end the session first, then have a look at what got logged.
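For the timestamp part of the question, one approach is to start script from your profile with a timestamped file name. A rough sketch for ~/.bash_profile; the SCRIPT_LOGGING guard variable is made up, and is needed because script spawns a new shell that would otherwise read the profile and start script again:

# start a timestamped log for every new terminal session
if [ -z "$SCRIPT_LOGGING" ]; then
    export SCRIPT_LOGGING=1
    mkdir -p "$HOME/terminal-logs"
    # exec replaces this shell with script, which then forks a fresh shell
    exec script -a "$HOME/terminal-logs/session-$(date +%Y%m%d-%H%M%S).log"
fi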

Related

Windows command prompt capture output of bash script in one step

I trigger a bash script from the Windows command prompt.
postCloneSetup.sh
It opens another window and then returns. The window it spawned stays open and logs output text.
I want to capture the output from the spawned window and return that to the Windows command prompt.
I would prefer to use something like
$(postCloneSetup.sh)
(the Linux syntax for capturing output into the current context) for the Windows command prompt.
I'd prefer not to modify postCloneSetup.sh. I know I could have it write out to a file with
exec &> postCloneSetupLog.log
but then I must wait and manually run
type postCloneSetupLog.log
to see the output in the console. That won't work for integration into a CI engine, which is my goal.
How can I capture the output from the spawned console in one command?
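For what it's worth, if the script can be invoked by bash directly instead of through whatever wrapper spawns the extra window, the output lands in the calling console in one step, which is what a CI engine needs. A sketch under that assumption, without modifying postCloneSetup.sh:

# run the script in the current console; stdout and stderr both
# print inline, so the CI engine's log captures them directly
bash postCloneSetup.sh 2>&1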

gnome-terminal executes commands from file

I need an example of a gnome-terminal command that reads lines of text from a file and executes them one by one, each in a different terminal or tab.
So this would be the process: I run the gnome-terminal command and it reads 10 commands from a file. It then executes those 10 commands in 10 different tabs/terminals, and of course those tabs/terminals remain open. I found this question: Avoid gnome-terminal close after script execution?
The third answer from the top is very helpful. I managed to run one command from a file, but I need one file with 10 command lines to be opened as described above.
Thanks.
I recommend using screen for this, if that's acceptable to you.
You could create a commands.screenrc file like this:
screen bash -c 'command1; echo press any key; read'
screen bash -c 'command2; bash'
screen mutt
screen emacs
screen
You can define as many programs as you want. Start screen with:
screen -c commands.screenrc
I don't know what kind of commands you want to run. If you want to see their output, write it like the first example above: execute the command in a bash shell, which will "pause" after the command has run. Or like the second line, which starts another bash shell after running the command. Otherwise the screen window would exit automatically as soon as the command finishes.
If you are not familiar with screen, you will need to learn some basic key strokes to get around, and to be able to switch between windows. The first few pages of this presentation should be enough to get you started.
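If the ten commands live in a plain file, one per line, a small loop can generate the screenrc for you. A rough sketch, assuming a file called commands.txt (the file name is made up, and commands containing single quotes would need extra escaping):

# build commands.screenrc from a file of commands, one window per line;
# each window runs its command, then drops into bash so it stays open
while IFS= read -r cmd; do
    printf "screen bash -c '%s; bash'\n" "$cmd"
done < commands.txt > commands.screenrc
screen -c commands.screenrc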

How do I pipe the output of an ls on a remote server to the local filesystem via SFTP?

I'm logged into a remote server via SFTP at the command line. The folder I'm in contains hundreds of thousands of files. I need to get a list of these files in a text file so I can access them programmatically, as none of the PHP SFTP clients are able to return such a large list of files.
When I run an ls on the directory (within the SFTP session), it takes about 20 minutes for the file list to finally display.
I don't have write access on this server, so I can't pipe the output to a file on the remote server.
How can I pipe the output to a text file on my local machine ... or get a list of the files to my local machine some other way?
If you're willing to wait the 20 minutes for the data to scroll across your screen, you can capture all of the output using script.
Call 'script' before you start your ssh or sftp session and it will capture all terminal output to your local disk. Type 'exit' to finish the capture.
NAME
     script -- make typescript of terminal session

SYNOPSIS
     script [-akq] [-t time] [file [command ...]]

DESCRIPTION
     The script utility makes a typescript of everything printed on your terminal. It is useful for students who need a hardcopy record of an interactive session as proof of an assignment, as the typescript file can be printed out later with lpr(1).

     If the argument file is given, script saves all dialogue in file. If no file name is given, the typescript is saved in the file typescript.

     If the argument command is given, script will run the specified command with an optional argument vector instead of an interactive shell.

     The following options are available:

     -a      Append the output to file or typescript, retaining the prior contents.

     -k      Log keys sent to program as well as output.

     -q      Run in quiet mode, omit the start and stop status messages.

     -t time
             Specify time interval between flushing script output file. A value of 0 causes script to flush for every character I/O event. The default interval is 30 seconds.

     The script ends when the forked shell (or command) exits (a control-D to exit the Bourne shell (sh(1)), and exit, logout or control-D (if ignoreeof is not set) for the C-shell, csh(1)).

     Certain interactive commands, such as vi(1), create garbage in the typescript file. The script utility works best with commands that do not manipulate the screen. The results are meant to emulate a hardcopy terminal, not an addressable one.

ENVIRONMENT
     The following environment variable is utilized by script:

     SHELL   If the variable SHELL exists, the shell forked by script will be that shell. If SHELL is not set, the Bourne shell is assumed. (Most shells set this variable automatically.)

SEE ALSO
     csh(1) (for the history mechanism).

HISTORY
     The script command appeared in 3.0BSD.

BUGS
     The script utility places everything in the log file, including linefeeds and backspaces. This is not what the naive user expects.

     It is not possible to specify a command without also naming the script file because of argument parsing compatibility issues.

     When running in -k mode, echo cancelling is far from ideal. The slave terminal mode is checked for ECHO mode to check when to avoid manual echo logging. This does not work when in a raw mode where the program being run is doing manual echo.
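Putting that together, a local capture session might look like this; the host and file names are placeholders:

script sftp_listing.txt            # start capturing everything below
sftp user@remote.example.com       # log in as usual
ls                                 # the 20-minute listing scrolls by and is recorded
exit                               # leave sftp
exit                               # end script; the listing is in sftp_listing.txt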
Wu's answer is good if you do it remotely. Here is another option if you are logged onto the remote server and want to send the file back home to yourself:
The proper answer is here: http://scratching.psybermonkey.net/2011/02/ssh-how-to-pipe-output-from-local-to.html
your_command | ssh username@server "cat > filename.txt"
If you have ssh access, that would be very easy:
ssh user@server ls > foo.txt
Otherwise, you can just redirect sftp's STDOUT and STDERR to a file. You have to type password and commands blindly though.
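For example, a sketch of the blind-redirection variant (the file name is a placeholder):

# the whole sftp session's stdout and stderr go to a local file;
# prompts are captured too, so type the commands without seeing them
sftp user@server > listing.txt 2>&1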
In my case the following worked:
ssh user@server ls /path/to/source/folder/ > /path/to/destination/folder/filenames.txt
I ran it in Git Bash. This will first ssh in, then list all the files in the source folder, and then save the file names to the destination text file.
You can give the output file any extension you like (e.g. .json instead of .txt), though the content is a plain-text listing either way.
To append to the file instead of overwriting it, use ">>" instead of ">".

Linux: using the tee command via ssh

I have written a Fortran program (let's call it program.exe) which does some simulation for me. Via ssh I log into some faraway computers to start runs there, whose results I collect after a few days. To stay up to date on how the program is proceeding, I also want to write the shell output to a text file, output.txt (since I can't be logged into the faraway computers all the time). The command should be something like
nohup program.exe | tee output.txt > /dev/null &
This lets me look at output.txt to see the current status even though the program hasn't finished its run yet. The above command works fine on my local machine. I first tried plain '>' redirection, but nothing was written to the text file until the whole program had finished (maybe related to the pipe buffer?), so I used the workaround with tee.
The problem is that when I log into the computer via ssh (ssh -X user@machine), execute the above command, and look at output.txt with the vi editor, nothing appears until the program has finished. If I omit the nohup and '&', I don't get any shell output at all until it has finished. My thought was that it might have something to do with data being buffered by ssh, but I'm rather a Linux newbie. I would be very grateful for any ideas or workarounds!
I would use the screen utility (http://www.oreillynet.com/linux/cmd/cmd.csp?path=s/screen) instead of nohup. That way I can put my program into a detached state (^A ^D), reconnect to the host later, reattach my screen session (screen -r), and monitor my output as if I had never logged out.
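If screen isn't an option, the stall is usually stdio buffering: many programs switch from line buffering to block buffering when stdout is a pipe rather than a terminal. A sketch using stdbuf from GNU coreutils to force line buffering; program.exe stands in for your own binary, and whether it helps depends on the program, since stdbuf only influences C-library stdio buffering:

# force line-buffered stdout so tee sees each line as it is produced
nohup stdbuf -oL ./program.exe | tee output.txt > /dev/null &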

What can I use to capture every command I run in bash (à la history)?

I know history will capture the commands I run, but it is shell-specific. I work with multiple shells and multiple hosts, and would like to write a small script which, after every command I run, dumps that command to a file along with the host name. That way I can implement my own history command which reads from that file, and which can take a host as an argument, which would be handy for me. I'm not sure how to do the first part though, i.e., get every shell command I type to trigger the "dump that command into a file" part. Any ideas?
Thanks
In bash, the PROMPT_COMMAND variable contains a command that is executed just before the PS1 prompt is displayed. So yours could be something like
history | tail -n1 | perl -npe 's/^\s+\d+\s+//' | yourcommand HOST
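Fleshing that out, a sketch you could drop into ~/.bashrc; the function name and log path are made up, and pressing Enter on an empty prompt will re-log the previous command:

# append the last command, with timestamp and host, to a shared history file
log_last_command() {
    printf '%s %s %s\n' "$(date '+%F %T')" "$(hostname)" \
        "$(history 1 | sed 's/^ *[0-9]* *//')" >> "$HOME/.all_hosts_history"
}
PROMPT_COMMAND=log_last_command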
The script utility should also solve your problem. It records everything you type and everything that is printed on the terminal to a file (including terminal control codes, so if you cat that file on the console you even reproduce the original text colors).
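If those control codes get in the way later, one common cleanup trick is col, which folds backspace overwrites out of a typescript (color escape sequences will still remain):

# strip backspace overwrites from a typescript file
col -b < typescript > typescript.clean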
