reading command line arguments through pipe to sh - linux

I am running a shell script by piping it to sh. For example:
curl commands.io/count-duplicate-lines-in-a-file | sh
The only way I could figure out how to pass in the filename was to use:
read file </dev/tty
You can check out the script here:
Count duplicate lines in a file
Is there another way to pass in the filename as an argument to the script without first saving it to a file locally, setting permissions and running it?
The idea is you can use Monitor to capture terminal input/output and then re-run it from the command line using curl piped to sh.

Use the -s option:
echo 'echo "$@"' | sh -s 1 2 3 4
Output:
1 2 3 4
Another way is to use process substitution, if your shell supports it:
bash <(echo 'echo "$@"') 1 2 3 4
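Applied back to the original question, the same trick lets you pass the filename on the command line; a sketch, assuming the downloaded script is changed to take the filename from $1 instead of reading /dev/tty (myfile.txt is just a placeholder, and the -- protects against filenames that start with a dash):
curl commands.io/count-duplicate-lines-in-a-file | sh -s -- myfile.txt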

How can I feed input within bash [Executed through the Network]

As the title says: on Linux, how can I feed input to bash when I do sudo bash?
Let's say I have a bash script that reads a name.
The way I execute the script is through sudo using:
cat read-my-name-script.sh | sudo bash
Let's just say this is how I execute the script through the network.
Now I want to fill in the name automatically. Is there a way to feed the input? I tried doing this: cat read-my-name-script.sh < name-input-file | sudo bash, where name-input-file is a file containing the input that the user will be feeding to the script.
I am new to Linux and learning to automate input; I wanted to create an input file that the user can fill in and feed to my script.
This is convoluted, but might do what you want.
sudo bash -c "$(cat read-my-name.sh)" <name-input-file
The -c says the next quoted argument is the set of commands to run (so the script is read as a string on the command line instead of from a file), and the calling shell interpolates the contents of the file inside the double quotes before the sudo command gets evaluated. So if read-my-name.sh contains
#!/bin/bash
read -p "I want your name please"
then the command gets expanded into
sudo bash -c '#!/bin/bash
read -p "I want your name please"' <name-input-file
(where, of course, the shell has by this time removed the outer double quotes altogether; I put single quotes in their place to show how this would look as actually executable, syntactically valid code).
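You can check the pattern locally without sudo; a quick sketch with a throwaway script and input file (the name variable and the greeting are just for illustration; read -p shows no prompt here because stdin is not a terminal):
echo 'read -p "I want your name please" name; echo "Hello, $name"' > read-my-name.sh
echo 'Alice' > name-input-file
bash -c "$(cat read-my-name.sh)" <name-input-file
# prints: Hello, Alice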
I think you need this:
while read -r arg; do sudo bash read-my-name-script.sh "$arg"; done <name-input-file
So each line of name-input-file will be passed as an argument to sudo bash read-my-name-script.sh.
If your argument list is located on an HTTP server, you can do this:
while read -r arg; do sudo bash read-my-name-script.sh "$arg"; done < <(wget -q -O- http://some/address/in/internet/name-input-file)
UPD: add
[[ -f name-input-file ]] && readarray -t args <name-input-file
to read-my-name-script.sh, and use "${args[@]}" as the arguments of any command in the script.
For example, echo "${args[@]}", or cmd "${args[0]}" "${args[1]}" ... "${args[100]}" in any order.
In this case you can use
wget -q -O- http://some/address/in/internet/read-my-name-script.sh | bash
to run your script with the arguments from name-input-file, without saving the script to the local machine.
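Putting the update together, read-my-name-script.sh could look something like this (a sketch; the echo lines are placeholders for whatever the script actually does):
#!/bin/bash
# read the arguments from name-input-file (one per line) into the args array
[[ -f name-input-file ]] && readarray -t args <name-input-file
# the elements can then be used in any order
echo "first argument: ${args[0]}"
echo "all arguments:" "${args[@]}"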

ssh and execute several commands as another user through a heredoc [duplicate]

This question already has answers here:
Usage of expect command within a heredoc
(1 answer)
Pass commands as input to another command (su, ssh, sh, etc)
(3 answers)
Closed 4 years ago.
I have a script that I need to execute through ssh as another user. Is there a way to pass the whole script like this:
ssh -t user@server.com sudo -u user2 sh -c << EOF
cd /home
ls
dir=$(pwd)
echo "$dir"
echo "hello"
....
EOF
Returns: sh: -c: option requires an argument
ssh'ing and sudo'ing separately is not an option, and putting a .sh file directly on the machine is not possible.
sh -c requires a command string as its argument. Since you are reading the commands from standard input (through the heredoc), you need to use the sh -s option:
ssh -t user@server.com sudo -u user2 sh -s << 'EOF'
cd /home
ls
dir=$(pwd)
echo "$dir"
echo "hello"
...
EOF
From man sh:
-c string
If the -c option is present, then commands are read from string. If there are arguments after the string, they are assigned to the positional parameters, starting with $0.
-s
If the -s option is present, or if no arguments remain after option processing, then commands are read from the standard input. This option allows the positional parameters to be set when invoking an interactive shell.
Note that you need to quote the heredoc marker (<< 'EOF') to prevent the parent shell from expanding the content (such as $(pwd) and "$dir") before it is sent to the remote host.
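The difference is easy to see locally, without ssh; a minimal sketch (with the unquoted marker, the calling shell expands $(pwd) before the inner sh ever runs, which over ssh would mean your local directory rather than the remote one):
# unquoted marker: $(pwd) is expanded by the calling shell first
sh -s << EOF
echo "expanded by the outer shell: $(pwd)"
EOF
# quoted marker: the text is passed through verbatim and the inner sh expands it
sh -s << 'EOF'
dir=$(pwd)
echo "expanded by the inner sh: $dir"
EOF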

How to execute commands read from the txt file using shell? [duplicate]

This question already has answers here:
Run bash commands from txt file
(4 answers)
Closed 4 years ago.
I tried to execute commands read from a txt file, but only the first command executes; after that, the script terminates. My script, named shellEx.sh, is as follows:
echo "pwd" > temp.txt
echo "ls" >> temp.txt
exec < temp.txt
while read line
do
exec $line
done
echo "printed"
If I put echo in place of exec, it just prints both pwd and ls. But I want to execute pwd and ls one by one.
The output I am getting is:
$ bash shellEx.sh
/c/Users/Aditya Gudipati/Desktop
But after pwd, ls also needs to execute.
Can anyone please give a better solution for this?
exec in bash is meant in the Unix sense, where it means "stop running this program and start running another in its place". That is why your script exits after the first command.
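A one-line way to see that behavior (the echo is never reached because exec replaces the shell with pwd):
bash -c 'exec pwd; echo "never printed"'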
If you want to execute line as a shell command, you can use:
line="find . | wc -l"
eval "$line"
($line by itself will not allow using pipes, quotes, expansions or other shell syntax)
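Applied to the script from the question, shellEx.sh could be fixed like this (a sketch; it also redirects the file into the loop instead of using exec <):
echo "pwd" > temp.txt
echo "ls" >> temp.txt
while read -r line
do
    eval "$line"    # run each line as a shell command
done < temp.txt
echo "printed"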
To execute the entire file including multiline commands, use one of:
source ./myfile # keep variables, allow exiting script
bash myfile # discard variables, limit exit to myfile
A file with one valid command per line is itself a shell script. Just use the . command to execute it in the current shell.
$ echo "pwd" > temp.txt
$ echo "ls" >> temp.txt
$ . temp.txt

Collect stdout logs for a shell script that runs inside another shell script

I have a shell script called load_data.sh that runs inside a shell script called shell.sh
The contents of shell.sh
xargs --max-procs 10 -n 1 sh load_data.sh < tables.txt
This runs load_data.sh on up to 10 tables from tables.txt at the same time.
Now I want to collect the full logs of load_data.sh, so I did:
xargs --max-procs 10 -n 1 sh load_data.sh < tables.txt |& tee -a logs.txt
But I am getting a mix of all the logs. What I want is for the logs to appear in order: the 1st table's log, then the 2nd table's log, then the 3rd, and so on.
Is it possible to achieve that? If so, how?
You can solve your problem by creating a separate logfile each time your script is run. To get the logfiles created in sequence, you can use the nl utility to number each line of the input.
nl -w3 -n rz tables.txt | xargs -n 2 --max-procs 10 sh -c './load_data.sh "$1" > "log-$0"'
This will produce logfiles in sequence:
log-001
log-002
log-003
..
To turn those back into one file, you can just use cat:
cat log* > result
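Note that the redirection above only captures stdout; since the |& in the question suggests stderr should be collected too, shell.sh could be rewritten along these lines (a sketch; the 2>&1 is an addition, and the zero-padded numbers keep the glob in numeric order):
nl -w3 -n rz tables.txt | xargs -n 2 --max-procs 10 sh -c 'sh load_data.sh "$1" > "log-$0" 2>&1'
cat log-* > logs.txt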

How can I pass arguments to a script executed by sh read from stdin?

I download a shell script from example.com with wget and execute it immediately by piping the stdout of wget to the stdin of sh.
wget -O - http://example.com/myscript.sh | sh -
How can I pass arguments to the script?
You need to use the -s option when invoking bash to pass arguments to the shell script being downloaded:
wget -O - http://example.com/myscript.sh | bash -s 'arg1' 'arg2'
As per man bash:
-s If the -s option is present, or if no arguments remain after option processing, then commands are read from the standard input. This option allows the positional parameters to be set when invoking an interactive shell.
While the accepted answer is correct, it only works in bash and not in sh, as the original poster requested.
To do this in sh you'll have to add --:
curl https://example.com/script.sh | sh -s -- --my-arg
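The -- matters because everything after it becomes a positional parameter instead of being parsed as an option to sh itself; a quick way to see it, with a throwaway inline script:
echo 'echo "got: $1"' | sh -s -- --my-arg
# prints: got: --my-arg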
