save output to file instead of screen for nested script - linux

I have a script which calls several functions. These functions contain print commands (to the screen). How can I save the output of the script to a file without printing it to the screen and without changing the code of the functions?
Best,
Wouter

You can use the script command, which stores in a file everything written to the screen:
script -c "myscript arguments ..." /tmp/myscript.log
That will at least allow you to capture what the called functions wrote to /dev/tty; however, there is no simple way to prevent that output from also going to your screen.
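Note that script is mainly needed when the functions write straight to /dev/tty. If they print via ordinary stdout/stderr, a plain redirection already keeps everything off the screen and in a file, with no changes to the functions themselves. A minimal sketch with a stand-in function (myfunc and the log path are invented for illustration):

```shell
#!/bin/sh
# Stand-in for one of the script's functions: it just prints to stdout.
myfunc() {
    echo "hello from myfunc"
}

# Redirect both stdout and stderr of the call to a file; nothing reaches
# the screen and the function's code is untouched.
myfunc > /tmp/myscript.log 2>&1

# The output now lives only in the file:
cat /tmp/myscript.log
```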

Related

Writing a shell script that accepts as command line arguments a series of 3 strings representing file types

So I have a script called process_files, and when a file type is given, a directory should be created based on what was entered on the command line. So for example, when process_files JPG is entered, a JPG directory should be created. However, when I try to do this it also creates the other directories that are in my code. This is the code.
#!/bin/sh
$jpg mkdir jpg
$gif mkdir gif
$docx mkdir docx
$png mkdir png
I don't think that you'll be able to do it in as short a script as that anyway.
Firstly, you need to refer to the arguments that you have passed in a different manner:
https://www.learnshell.org/en/Passing_Arguments_to_the_Script
Then you need to conditionally compare your input arguments to "jpg"/"gif"/"docx"/"png" strings and run a mkdir command if true:
https://linuxize.com/post/how-to-compare-strings-in-bash/
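As a sketch of what those two steps could look like once combined (a case statement is used here; an if/elif chain comparing "$1" to each string works the same way):

```shell
#!/bin/sh
# process_files: create a directory only for each recognized type argument.
process_files() {
    for type in "$@"; do
        case "$type" in
            jpg|gif|docx|png)
                mkdir -p "$type"   # -p: no error if it already exists
                ;;
            *)
                echo "unknown file type: $type" >&2
                ;;
        esac
    done
}
```

Running process_files jpg then creates only a jpg directory and leaves the others alone.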
There are probably fancy shortcuts with the likes of awk, but if you are just beginning shell scripting then it's probably better to go down the explicit, iterative route and keep things simple and obvious in early scripts.
There are many online resources for learning shell scripting and they can quickly cover most scenarios and answer most questions.

How to call external "interactive/TUI" command, interact, and read std output

I am trying to write my first vim script, so I apologize if this question boils down to not understanding the basics.
The main goal is that I want to call out to an external command from inside vim and read the results back into the file.
I know how to do this with simple shell commands, e.g. :r !ls. However, the command I want to interact with is "interactive".
I don't know if this is a meaningful description, but calling this command in the shell opens a TUI; after interacting with the TUI, the command exits and puts things into standard output. I want to read that standard output back into vim.
Possibly it will help to discuss the specific command, which is papis, a CLI citation manager. If you call, e.g., papis list --format '{doc[title]} {doc[author]}' in the shell, it will open up a TUI that allows me to filter down and select a document. After selecting the document, it will put the title and author into the standard output. This is what I want to read into vim.
However, my first few attempts have not been successful. Trying the naive :r !papis list results in an error, even though that command is valid in the shell and would result in the TUI being opened. So I'm obviously missing something.
Can anyone recommend a guide or suggest a possible solution for correctly calling out to TUI-based external commands and reading back their standard output?

Python Terminal Calls Fail to Interact with Files

I am writing a program that handles some data on a server. Throughout the program, many files are made and sent as input into other programs. To do this, I usually make the command string, then run it like so:
cmd = "prog input_file1 input_file2 > outputfile"
os.system(cmd)
When I run the command, however, the programs being called report that they cannot open the files. If I run the python code on my local computer, it is fine. When I loaded it onto the server, it started to fail. I think this is related to issues with permissions, but am not sure how I can fix this. Many of the files, particularly the output files, are being created at run time. The input files have full permissions for all users. Any help or advice would be appreciated!
Cheers!
The python code you list is simple and correct, so the problem is likely not in the two lines of your example. Here are some related areas for you to check out.
Permissions
The user running the python script must have the appropriate permission (read, write, execute). I see from comments that you've already checked this.
What command are you running
If the command is literally typed into your source code like in the example, then you know what command is being run. But if you are generating any part of it (e.g. the list of operands, the name of the output file, other parameters), make sure there are no bugs in the portions of your code that generate the command. For example, before the call to os.system(cmd), consider including a line like print("About to execute: " + cmd) so you can see exactly what will be run.
Directly invoke the command
If all the above looks good, try to execute the command directly at a terminal on your server. What output do you get then? It's possible that the problem is with the underlying command itself rather than your python code.

How to save a command from terminal in a file

I want to save all the commands that are fired in the terminal. Using history and appending it to a file via a cron job is not suitable for my case.
If anyone types ls -lrt in the terminal, then as soon as they press Enter, I want to store it in a file. Any logical explanation would do; I would write the code myself.

How to track file creation and modification

We have put together a Perl script that essentially looks at the argument being passed to it, checks whether it is creating or modifying a file, and then saves that in a MySQL database so that it is easily accessible later. Here is the interesting part: how do I make this Perl script run before all of the commands typed in the terminal? I need to make this foolproof so people don't forget to run it.
Sorry I didn't formulate this question properly. What I want to do is prepend the script to each command, so that each command runs like "./run.pl ls", for example. That way I can track file changes if the command is mv, or if it creates an out file, for example. The script pretty much takes care of that, but I just don't know how to run it seamlessly for the user.
I am running ubuntu server with the bash terminal.
Thanks
If I understood correctly you need to execute a function before running every command, something similar to preexec and precmd in zsh.
Unfortunately bash doesn't have native support for this, but you can do it using the DEBUG trap.
Here is some sample code applying this method.
This page also provides some useful information.
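A minimal sketch of such a DEBUG-trap hook (the function name and the ~/.cmdlog path are arbitrary choices for this example):

```shell
#!/bin/bash
# Log every command just before bash runs it, preexec-style.
log_command() {
    # BASH_COMMAND holds the command bash is about to execute.
    printf '%s\n' "$BASH_COMMAND" >> "$HOME/.cmdlog"
}
trap log_command DEBUG
```

Placed in ~/.bashrc, the trap fires before each command the user types; instead of appending to a log, the hook could just as well invoke the Perl script with the command as its argument.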
You can modify the ~/.bashrc file and launch your script there. Do note that each user would (and should) still have the privilege to modify this file, potentially removing the script invocation.
The /etc/bash.bashrc file is system-wide and only changeable by root.
These .bashrcs are executed when a new instance of bash is created (e.g. new terminal).
It is not the same as sh, the system shell, which is dash on Ubuntu systems.
