Invoking a function with txt file input using GNU parallel - linux

I want to use a shell script function like the one below with GNU parallel.
This is part of my code:
#!/bin/bash
# Figure out script absolute path
pushd `dirname $0` > /dev/null
BIN_DIR=`pwd`
popd > /dev/null
ROOT_DIR=`dirname $BIN_DIR`
export ROOT_DIR
CLASS_NAME=$3
export CLASS_NAME
invoke_driver() {
$ROOT_DIR/DRIVER_DIR $CLASS_NAME $1
}
export -f invoke_driver
parallel invoke_driver :::: 'method_list.txt'
In the 'method_list.txt' file, the method names are listed line by line, like below:
method1
method2
...
The driver file takes only two arguments as input.
The driver in this code is a fuzzing tool which runs endlessly.
So I want to give the function each method as input and run the tool in parallel.
For example, if there are 3 methods in the txt file, I would like to write code that fuzzes each method in parallel.
But when I run this code, an error occurs.
Please let me know how to solve this problem.
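For reference, the standard pattern for running an exported shell function over each line of a file with GNU parallel looks like this (a minimal sketch with a hypothetical greet function and names.txt file; it assumes GNU parallel, not the moreutils one):
greet() {
    echo "hello $1"
}
export -f greet
parallel greet :::: names.txt   # one job per line of names.txt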

Related

How to make a .m file read an input csv file passed as a parameter?

I am new to Matlab and facing difficulty in making a .m file read the input csv file that I am passing as an argument from the command prompt. I understand that a function has to be written to read the input file as a parameter. Here is the code I wrote inside the .m file to accept the input file:
function data=input(filename);
addpath(genpath('./matlab_and_R_scripts'));
tic
D=csvread(filename,1,1);
I want the filename passed as an argument to be read by the function "csvread" and saved in D. I am using the following command to execute the script:
matlab -nodisplay -nosplash -nodesktop -r "input 'exp2_1_DMatrix.csv';run('matlab_filename.m');exit;"
I am able to execute the script without any errors, but it is not reading the input file: the downstream analysis should have saved a new file if the script had been able to read the input and run some functions on it.
Can anyone please suggest how to read the input file in my matlab script and the proper command to pass?
Try using this function. Take care not to use reserved names (input is the name of a MATLAB built-in):
readdat.m
function data=readdat(filename);
addpath(genpath('./matlab_and_R_scripts'));
tic;
data=csvread(filename,1,1);
toc;
For testing, you can execute this function directly in the Command Window, editing it in the Matlab Editor; this is where most functions are tested, most of the time, until you are fully satisfied with your processing.
>> data=readdat(filename);
Looking at data, you can tell whether or not the file was read as it should be.
>> data(:,1)
You can keep running other scripts, such as matlab_filename.m. But the best choice is having a single function that does everything:
processdat.m
function data=processdat(filename)
% Original Function
addpath(genpath('./matlab_and_R_scripts'));
tic;
data=csvread(filename,1,1);
toc;
% Paste in here all the matlab_filename.m code, or do a call:
matlab_filename;
% Do not uncomment this exit here, since you want to keep working until the function does what you need
% exit;
Later, if you want some serious automation, such as scheduled, repeated runs of your code, you can of course arrange a Windows bat command, and in that case you can uncomment the exit; terminator at the end of processdat.m.
processdat.bat
matlab -nodisplay -nosplash -nodesktop -r processdat
Remember to ensure the OS can access matlab, and that Matlab can access processdat. If in doubt, place the proper paths:
processdat.bat
c:\programs\bin\matlab -nodisplay -nosplash -nodesktop -r "run('c:\files\processdat.m')"
I solved the problem by gaining some insight from @Brethlosze's answer. If you want to avoid a named output, the function declaration shouldn't give the output a name but start with an empty []. Here is what I did to pass an input argument in my myScript.m script:
function [] = myScript(input_file, output_file)
addpath(genpath('../matlab_and_R_scripts'));
tic
D=csvread(input_file,1,1);
% Some code operations
save(output_file,'save_what_you_want')
toc
end
And I executed the script from command line using the following command:
matlab -nodisplay -nosplash -nodesktop -r "myScript 'example.csv' 'example.mat'"
The input_file is 'example.csv' and output_file is 'example.mat'.
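For reference, the command syntax above passes the quoted strings as character arguments; it should be equivalent to using MATLAB's function-call syntax inside the -r string:
matlab -nodisplay -nosplash -nodesktop -r "myScript('example.csv','example.mat');exit;"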

Redirect parallel process bash script output to individual log file

I have a requirement where I need to pass multiple arguments to the script to trigger a parallel process for each argument. Now I need to capture each process's output in a separate log file.
for arg in test_{01..05} ; do bash test.sh "$arg" & done
The above piece of code only gives parallel processing for the input arguments. I tried exec > >(tee "/path/of/log/$arg_`date +%Y%m%d%H`.log") 2>&1, and it created a single log file named with just the date, with empty output. Can someone suggest what's going wrong here, or a better way other than using the parallel package?
Try:
date_part=$(date +%Y%m%d%H)
for arg in test_{01..05} ; do bash test.sh "$arg" > "/path/to/log/${arg}_${date_part}.log" & done
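If you also want each process's stderr captured in its log, and the script to block until every job finishes, a small extension of the same loop (a sketch):
date_part=$(date +%Y%m%d%H)
for arg in test_{01..05} ; do bash test.sh "$arg" > "/path/to/log/${arg}_${date_part}.log" 2>&1 & done
wait   # block until all background jobs have finished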
If I use "$arg_`date +%Y%m%d%H`.log", it creates a file with just the date, without the arg.
Yes, because $arg_ is parsed as a variable name
arg_=blabla
echo "$arg_" # will print blabla
echo "${arg_}" # equal to the above
To separate _ from arg, use braces: "${arg}_" expands the variable arg and then appends the string _.
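A quick way to see the difference (a sketch you can paste into a shell):
arg=test_01
echo "$arg_`date +%Y%m%d%H`.log"     # $arg_ is unset, so only the timestamp and .log survive
echo "${arg}_`date +%Y%m%d%H`.log"   # test_01_<timestamp>.log, as intended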

Convert a batch file function to bash

I'm trying to convert a batch file that runs a Python script into a bash script. I need help converting a wait function in the batch file, which waits for an action to complete, into bash.
script.py wait-for-job <actionID> is the actual call that waits for the specific action to complete. The wait function basically assigns a value from the log file to a variable and then passes that variable as a parameter to a python script (script.py).
The log file is written continuously after each action and the last line (from which the action ID is fetched) looks something like this:
02/10/2019 00:00:00 AM Greenwich Mean Time print_action_id():250 INFO Action ID: 123456
The wait function in the batch file is as follows:
:wait
#echo off
for /f "tokens=11" %%i in (C:\Users\DemoUser\Dir\file.log) do ^
set ID=%%i
#echo on
script.py wait-for-job --action-id %ID%
EXIT /B 0
I tried implementing the same thing in bash like below but it did not seem to work (I'm new to shell scripting and I'm sure it's all wrong):
for $a in (tail -n1 /home/DemoUser/Dir/file.log); do
ID=$($a | awk { print $12})
script.py wait-for-job --action-id $ID
done
The following reads each line of the file, pulls out the ID, and uses it to call the Python script. First we declare the paths and variables, then we run a loop.
#!/bin/bash
typeset file=/home/DemoUser/Dir/file.log
typeset py_script=/path/to/script.py
readonly PY=/path/to/python
while IFS= read -r line; do
    ID=$(echo "${line}" | awk '{ print $NF }')   # the action ID is the last field of the log line
    "${PY}" "${py_script}" wait-for-job --action-id "${ID}"
done < "${file}"
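Note that the batch version only ever waits on the last line's ID, because the for loop keeps overwriting ID before the single call below the loop. If that is the behavior you need, a closer one-shot translation would be (a sketch reusing the paths above):
ID=$(tail -n1 /home/DemoUser/Dir/file.log | awk '{ print $NF }')
/path/to/python /path/to/script.py wait-for-job --action-id "$ID"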

Executing a bash script from a Perl program

I'm trying to write a Perl program which will execute a bash script. The Perl script looks like this
#!/usr/bin/perl
use diagnostics;
use warnings;
require 'userlib.pl';
use CGI qw(:standard);
ReadParse();
my $q = new CGI;
my $dir = $q->param('X');
my $s = $q->param('Y');
ui_print_header(undef, $text{'edit_title'}.$dir, "");
print $dir."<br>";
print $s."<br>";
print "Under Construction <br>";
use Cwd;
my $pwd = cwd();
my $directory = "/Logs/".$dir."/logmanager/".$s;
my $command = $pwd."/script ".$directory."/".$s.".tar";
print $command."<br>";
print $pwd."<br>";
chdir($directory);
my $pwd1 = cwd();
print $pwd1."<br>";
system($command, $directory) or die "Cannot open Dir: $!";
The script fail with the following error:
Can't exec "/usr/libexec/webmin/foobar/script
/path/filename.tar": No such file or directory at /usr/libexec/webmin/foobar/program.cgi line 23 (#3)
(W exec) A system(), exec(), or piped open call could not execute the
named program for the indicated reason. Typical reasons include: the
permissions were wrong on the file, the file wasn't found in
$ENV{PATH}, the executable in question was compiled for another
architecture, or the #! line in a script points to an interpreter that
can't be run for similar reasons. (Or maybe your system doesn't support #! at all.)
I've checked that the permissions are correct, the tar file I'm passing to my bash script exists, and also tried from the command line to run the same command I'm trying to run from the Perl script ( /usr/libexec/webmin/foobar/script /path/filename.tar ) and it works properly.
In Perl, calling system with one argument (in scalar context) and calling it with several scalar arguments (in list context) does different things.
In scalar context, calling
system($command)
will start an external shell and execute $command in it. If the string in $command has arguments, they will be passed to the call, too. So for example
$command="ls /";
system($command);
will evaluate to
sh -c "ls /"
where the shell is given the entire string, i.e. the command with all arguments. Also, the $command will run with all the normal environment variables set. This can be a security issue, see here and here for a few examples why.
On the other hand, if you call system with an array (in list context), Perl will not call a shell and give it the $command as argument, but rather try to execute the first element of the array directly and give it the other arguments as parameters. So
$command = "ls";
$directory = "/";
system($command, $directory);
will call ls directly, without spawning a shell in between.
Back to your question: your code says
my $command = $pwd."/script ".$directory."/".$s.".tar";
system($command, $directory) or die "Cannot open Dir: $!";
Note that $command here is something like /path/to/script /path/to/foo.tar, with the argument already being part of the string. If you call this in scalar context
system($command)
all will work fine, because
sh -c "/path/to/script /path/to/foo.tar"
will execute script with foo.tar as argument. But if you call it in list context, it will try to locate an executable named /path/to/script /path/to/foo.tar, and this will fail.
I found the problem.
I changed the system command, removing the second parameter, and now it's working:
system($command) == 0 or die "Cannot open Dir: $!";
In fairness, I did not understand what was wrong with the first example, but now it works fine; if anyone can explain, it would be interesting to understand.
There are multiple ways to execute a bash command or script in Perl:
system
backquotes (``)
exec

How can I run a function from a script in command line?

I have a script that has some functions.
Can I run one of the functions directly from the command line?
Something like this?
myScript.sh func()
Well, while the other answers are right, you can certainly do something else: if you have access to the bash script, you can modify it and simply place at the end the special parameter "$@", which will expand to the arguments of the command line you specify. Since it stands "alone", the shell will try to call them verbatim, and here you could specify the function name as the first argument. Example:
$ cat test.sh
testA() {
echo "TEST A $1";
}
testB() {
echo "TEST B $2";
}
"$#"
$ bash test.sh
$ bash test.sh testA
TEST A
$ bash test.sh testA arg1 arg2
TEST A arg1
$ bash test.sh testB arg1 arg2
TEST B arg2
For polish, you can first verify that the command exists and is a function:
# Check if the function exists (bash specific)
if declare -f "$1" > /dev/null
then
# call arguments verbatim
"$#"
else
# Show a helpful error
echo "'$1' is not a known function name" >&2
exit 1
fi
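With the guard added to the script, a bad name now fails loudly instead of being executed (hypothetical transcript):
$ bash test.sh no_such_func
'no_such_func' is not a known function name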
If the script only defines the functions and does nothing else, you can first execute the script within the context of the current shell using the source or . command and then simply call the function. See help source for more information.
The following command first registers the function in the context, then calls it:
. ./myScript.sh && function_name
Briefly, no.
You can import all of the functions in the script into your environment with source (help source for details), which will then allow you to call them. This also has the effect of executing the script, so take care.
There is no way to call a function from a shell script as if it were a shared library.
Using case
#!/bin/bash
fun1 () {
echo "run function1"
[[ "$#" ]] && echo "options: $#"
}
fun2 () {
echo "run function2"
[[ "$#" ]] && echo "options: $#"
}
case $1 in
fun1) "$@"; exit;;
fun2) "$@"; exit;;
esac
fun1
fun2
This script will run functions fun1 and fun2, but if you start it with the option fun1 or fun2, it'll only run the given function, with args if provided, and exit.
Usage
$ ./test
run function1
run function2
$ ./test fun2 a b c
run function2
options: a b c
I have a situation where I need a function from a bash script which must not be executed beforehand (e.g. by source), and the problem with "$@" is that myScript.sh is then run twice, it seems... So I've come up with the idea of extracting the function with sed:
sed -n "/^func ()/,/^}/p" myScript.sh
And to execute it at the time I need it, I put it in a file and use source:
sed -n "/^func ()/,/^}/p" myScript.sh > func.sh; source func.sh; rm func.sh
Edit: WARNING - seems this doesn't work in all cases, but works well on many public scripts.
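Where the extraction does work, a variant that avoids the temporary file is to source from a process substitution (bash-specific):
source <(sed -n "/^func ()/,/^}/p" myScript.sh)
func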
If you have a bash script called "control" and inside it you have a function called "build":
function build() {
...
}
Then you can call it like this (from the directory where it is):
./control build
If it's inside another folder, that would make it:
another_folder/control build
If your file is called "control.sh", that would accordingly make the function callable like this:
./control.sh build
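This pattern works when control itself dispatches on its first argument, for example with the "$@" trick from the accepted answer. A minimal hypothetical sketch of such a control script:
#!/bin/bash
function build() {
    echo "building..."   # whatever build actually does
}
"$@"   # run the function named by the first command-line argument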
This post is solved, but I'd like to mention my preferred solution. Namely, define a generic one-liner script eval_func.sh:
#!/bin/bash
source $1 && shift && "$@"
Then call any function within any script via:
./eval_func.sh <any script> <any function> <any args>...
An issue I ran into with the accepted solution is that when sourcing my function-containing script within another script, the arguments of the latter would be evaluated by the former, causing an error.
The other answers here are nice, and much appreciated, but often I don't want to source the script in the session (which reads and executes the file in your current shell) or modify it directly.
I find it more convenient to write a one or two line 'bootstrap' file and run that. Makes testing the main script easier, doesn't have side effects on your shell session, and as a bonus you can load things that simulate other environments for testing. Example...
# breakfast.sh
make_donuts() {
echo 'donuts!'
}
make_bagels() {
echo 'bagels!'
}
# bootstrap.sh
source 'breakfast.sh'
make_donuts
Now just run ./bootstrap.sh. The same idea works with your Python, Ruby, or whatever scripts.
Why useful? Let's say you complicated your life for some reason, and your script may find itself in different environments with different states present. For example, either your terminal session, or a cloud provider's cool new thing. You also want to test cloud things in terminal, using simple methods. No worries, your bootstrap can load elementary state for you.
# breakfast.sh
# Now it has to do slightly different things
# depending on where the script lives!
make_donuts() {
if [[ $AWS_ENV_VAR ]]
then
echo '/donuts'
elif [[ $AZURE_ENV_VAR ]]
then
echo '\donuts'
else
echo '/keto_diet'
fi
}
If you let your bootstrap thing take an argument, you can load different state for your function to chew, still with one line in the shell session:
# bootstrap.sh
source 'breakfast.sh'
case $1 in
AWS)
AWS_ENV_VAR="arn::mumbo:jumbo:12345"
;;
AZURE)
AZURE_ENV_VAR="cloud::woo:_impress"
;;
esac
make_donuts # You could use $2 here to name the function you wanna, but careful if evaluating directly.
In terminal session you're just entering:
./bootstrap.sh AWS
Result:
# /donuts
You can call a function from a command-line argument like below:
function irfan() {
echo "Irfan khan"
date
hostname
}
function config() {
ifconfig
echo "hey"
}
$1
Once you have defined the functions, put $1 at the end to accept an argument naming the function you want to call.
Let's say the above code is saved in fun.sh. Now you can call the functions like ./fun.sh irfan and ./fun.sh config on the command line.
