Is it possible to write one script that runs in bash/shell and PowerShell?

I need to create ONE integrated script that sets some environment variables, downloads a file using wget and runs it.
The challenge is that it needs to be the SAME script that can run on both Windows PowerShell and also bash / shell.
This is the shell script:
#!/bin/bash
# download a script
wget http://www.example.org/my.script -O my.script
# set a couple of environment variables
export script_source=http://www.example.org
export some_value=floob
# now execute the downloaded script
bash ./my.script
This is the same thing in PowerShell:
wget http://www.example.org/my.script -O my.script.ps1
$env:script_source="http://www.example.org"
$env:some_value="floob"
PowerShell -File ./my.script.ps1
So I wonder if somehow these two scripts can be merged and run successfully on either platform?
I've been trying to find a way to put them in the same script and get bash and PowerShell.exe to ignore errors but have had no success doing so.
Any guesses?

It is possible; I don't know how compatible this is, but it works because PowerShell treats a bare quoted string as text and just writes it to the screen, while Bash strips the quotes and tries to run it as a command, and both shells support the same function-definition syntax. So, put a function name in quotes and only Bash will run it; put "exit" in quotes and only Bash will exit. Then write the PowerShell code after that.
NB. this works because the syntax in both shells overlaps and your script is simple: run commands and deal with variables. If you try to use more advanced syntax (if/then, for, switch, case, etc.) in either language, the other one will probably complain.
Save this as dual.ps1 so PowerShell is happy with it, and chmod +x dual.ps1 so Bash will run it:
#!/bin/bash
function DoBashThings {
  wget http://www.example.org/my.script -O my.script
  # set a couple of environment variables
  export script_source=http://www.example.org
  export some_value=floob
  # now execute the downloaded script
  bash ./my.script
}
"DoBashThings" # This runs the bash script, in PS it's just a string
"exit" # This quits the bash version, in PS it's just a string
# PowerShell code here
# --------------------
Invoke-WebRequest "http://www.example.org/my.script.ps1" -OutFile my.script.ps1
$env:script_source="http://www.example.org"
$env:some_value="floob"
PowerShell -File ./my.script.ps1
then
./dual.ps1
on either system.
Edit: You can include more complex code by prefixing each language's code blocks with a distinct comment marker, then having each language filter out its own code and eval it (the usual security caveats about eval apply), e.g. with this approach (incorporating a suggestion from Harry Johnston):
#!/bin/bash
#posh $num = 200
#posh if (150 -lt $num) {
#posh write-host "PowerShell here"
#posh }
#bash thing="xyz"
#bash if [ "$thing" = "xyz" ]
#bash then
#bash echo "Bash here"
#bash fi
function RunBashStuff {
  eval "$(grep '^#bash' $0 | sed -e 's/^#bash //')"
}
"RunBashStuff"
"exit"
((Get-Content $MyInvocation.MyCommand.Source) -match '^#posh' -replace '^#posh ') -join "`n" | Invoke-Expression
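Run it the same way as before (saved as dual.ps1 and made executable). The output should look roughly like this; note that PowerShell also echoes the bare "RunBashStuff" and "exit" strings, since to it they are just string expressions:
$ ./dual.ps1
Bash here
PS> ./dual.ps1
RunBashStuff
exit
PowerShell here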

While the other answer is great (thank you TessellatingHeckler and Harry Johnston,
and also thank you j-p-hutchins for fixing the error with true),
we can actually do way better:
Work with more shells (e.g. it works for Ubuntu's dash)
Less likely to break in future situations
No need to waste processing time re-reading/eval-ing the script
Waste fewer characters/lines on confusing syntax (we can get away with a mere 41 characters and a mere 3 lines)
Even keep syntax highlighting functional
Copy-Paste Code
Save this as your_thing.ps1 so it runs as PowerShell on Windows and as shell on all other operating systems.
#!/usr/bin/env sh
echo --% >/dev/null;: ' | out-null
<#'
#
# sh part
#
echo "hello from bash/dash/zsh"
echo "do whatver you want just dont use #> directly"
echo "e.g. do #""> or something similar"
# end bash part
exit #>
#
# powershell part
#
echo "hello from powershell"
echo "you literally don't have to escape anything here"
How? (it's actually simple)
We want to start a multi-line comment in PowerShell without causing an error in bash/shell.
PowerShell has multi-line comments (<# ... #>), but as-is they would cause problems in bash/shell. We need <# to be inside a string as far as Bash is concerned, but NOT inside a string as far as PowerShell is concerned.
PowerShell has a stop-parsing token, --%, which lets us write a single quote without starting a string: in PowerShell, echo --% ' blah ' prints ' blah '. This is great because in shell/bash the single quotes do start a string, so the same echo --% ' blah ' prints --% blah there (the quotes are consumed by the shell).
We need a command in order to use PowerShell's stop-parsing token; luckily for us, both PowerShell and Bash have an echo command.
So, in Bash we can echo a string containing <#, while in PowerShell the same code finishes the echo command and then starts a multi-line comment.
Finally, we add >/dev/null so Bash doesn't print out --% every time, and we add | out-null so PowerShell doesn't print out >/dev/null;: ' every time.
The syntax highlighting in an editor tells the story more visually.
PowerShell Highlighting
Everything inside the comment (the whole sh part) is ignored by PowerShell.
The --% stop-parsing token is special.
The | out-null is special.
The remaining parts are just string arguments without quotes
(even the single quote is equivalent to "'").
The <# is the start of a multi-line comment.
Bash Highlighting
For Bash it's totally different.
echo and : are the commands.
The --% isn't special; it's just an argument.
But the ; is special.
The >/dev/null is output redirection.
Then : is just the standard "do nothing" shell command.
Then the ' starts a string argument that ends on the next line.
Caveats?
Almost none. PowerShell legitimately has no downside. The Bash caveats are easy to fix and exceedingly rare:
If you need #> in a bash string, you'll need to escape it somehow,
e.g. changing "#>" to "#"">", or changing ' blah #> ' to ' blah #''> '.
If you have a comment containing #> and for some reason you CANNOT change that comment (this is what I mean by exceedingly rare), you can actually still use #>; you just have to re-add those first two header lines (the echo --% line and the <#' line) right after your #> comment.
One even more exceedingly rare case is where you are using # to remove part of a string (I bet most don't even know this is a bash feature). Example code below:
https://man7.org/linux/man-pages/man1/bash.1.html#EXPANSION
var1=">blah"
echo ${var1#>}
# ^ removes the > from var1
To fix this one: there are alternative ways of removing characters from the beginning of a string, so use one of them instead, for example the sketch below.
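Both of these strip the leading > without ever putting # and > next to each other in the file (a small sketch; both print blah):
var1=">blah"
echo "${var1#'>'}"              # quoting the pattern separates the # and the > in the source
echo "$var1" | sed 's/^>//'     # or strip the leading > with an external tool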

Following up on Jeff Hykin's answer, I have found that the first line as originally written (true --% ; : '), while it is happy in bash, produces this output in PowerShell. Note that the script is still fully functional, just noisy.
true : The term 'true' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling
of the name, or if a path was included, verify that the path is correct and try again.
At C:\Users\jp\scratch\envar.ps1:4 char:1
+ true --% ; : '
+ ~~~~
+ CategoryInfo : ObjectNotFound: (true:String) [], CommandNotFoundException
hello from powershell
I am experimenting with changing the first lines from:
true --% ; : '
<#'
to:
echo --% > /dev/null ; : ' | out-null
<#'
In very limited testing this seems to be working in bash and PowerShell. For reference, I am "sourcing" the scripts, not "calling" them, e.g. . env.ps1 in bash and . ./env.ps1 in PowerShell.

Related

dsacls - Invalid DN Syntax in powershell

I am trying to modify the servicePrincipalName permission within a PowerShell script using the 'dsacls' command.
I am taking all the dynamic parameters as script arguments.
The script is not working when I build the command from the argument variables I received; there is something I am missing with the string manipulation.
$perStr ='"' + $strDN + '"' + ' /G ' + $DomainNetBIOSName + '\' + $SQLUser + ':RPWP;"servicePrincipalName"'
$ret = dsacls ${perStr}
The above gives an error:
Invalid DN Syntax
When I run with hardcoded values it runs fine.
When I have a hard time constructing strings to use with external executables I tend to build the entire command and then use Invoke-Expression to run it. Something like this:
$perStr = '& dsacls --% "{0}" /G {1}\{2}:RPWP;"servicePrincipalName"' -f $strDN, $DomainNetBIOSName, $SQLUser
$ret = Invoke-Expression -Command $perStr
The --% tells PowerShell to stop interpreting anything beyond that point, so it takes all arguments exactly as typed and passes them to the command. See if that works for you; if not, you may want to look at the content of $perStr to make sure that it looks right to you.
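For example, a quick way to sanity-check the constructed string before invoking it (the values below are hypothetical placeholders, not from the original question):
# Hypothetical sample values for illustration
$strDN = "CN=SQLSVC,OU=Service Accounts,DC=example,DC=com"
$DomainNetBIOSName = "EXAMPLE"
$SQLUser = "sqlsvc"
$perStr = '& dsacls --% "{0}" /G {1}\{2}:RPWP;"servicePrincipalName"' -f $strDN, $DomainNetBIOSName, $SQLUser
Write-Host $perStr
# & dsacls --% "CN=SQLSVC,OU=Service Accounts,DC=example,DC=com" /G EXAMPLE\sqlsvc:RPWP;"servicePrincipalName"
$ret = Invoke-Expression -Command $perStr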

Assign full text file path to a variable and use variable as file path in sh file

I am trying to create a shell script for logs and trying to append data to a text file. I have written this sample "test.sh" code for testing:
#!/bin/sh -e
touch /home/sample.txt
SPTH = '/home/sample'.txt
echo "MY LOG FILE" >> "$SPTH"
echo "DUMP started at $(date +'%d-%m-%Y %H:%M:%S')" >> /home/sample.txt
echo "DUMP finished at $(date +'%d-%m-%Y %H:%M:%S')" >> /home/sample.txt
but in the above code all lines work correctly except one line, i.e.
echo "MY LOG FILE" >> "$SPTH"
It is giving error:
test.sh: line 6: : No such file or directory
I want to replace the full file path "/home/sample.txt" with the variable "$SPTH".
I am executing my shell script using
sh test.sh
What am I doing wrong?
Variable assignments in the bash shell do not allow spaces around the =. The line is actually interpreted as a command, with = and the subsequent words passed as arguments to the first word, which is wrong.
Change your code to
SPTH="/home/sample.txt"
That is why SPTH was not assigned the path you intended it to have. There is also no reason to single-quote only part of the value and leave the extension outside the quotes; putting the whole path in double quotes is absolutely fine.
The syntax for the command line is that the first token is a command and tokens are separated by whitespace. So:
SPTH = '/home/sample'.txt
has SPTH as the command, = as the second token, and so on. You might think this is daft, but most shells behave like this for historical reasons.
So you need to remove the whitespace:
SPTH='/home/sample'.txt
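A quick way to see the difference at an interactive prompt (the exact error wording varies slightly between shells):
$ SPTH = '/home/sample'.txt
bash: SPTH: command not found
$ SPTH='/home/sample.txt'
$ echo "$SPTH"
/home/sample.txt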

Read filename with * shell bash

I'm new to Linux and I want to write a bash script that can read the name of a file in a directory that starts with LED plus some numbers (e.g. LED5.5.002).
In that directory there is only one file that starts with LED. The problem is that this file is updated every time, so the next time it will be, for example, LED6.5.012, and so on.
I searched and tried a little bit and came to this solution:
export fspec=/home/led/LED*
LedV=`basename $fspec`
echo $LedV
If I enter those commands one by one in my terminal it works fine, LedV=LED5.5.002, but if I run it in a bash script it gives the result LedV=LED*.
I searched for another solution:
a=/home/led/LED*
LedV=$(basename $a)
echo $LedV
but here again the same: entered one by one it's OK, but in a script LedV=LED*.
It's probably something small, but because of my lack of Linux knowledge I cannot find it. So can someone tell me what is wrong?
Thanks! Jan
Glob (pathname) expansion doesn't happen on scalar assignments, so in
varname=foo*
the expansion of "$varname" will literally be "foo*". It's more confusing when you consider that echo $varname (or in your case basename $varname; either way without the double quotes) will cause the expansion itself to be treated as a glob, so you may well think the variable contains all those filenames.
Array expansions are another story. You might just want
fspec=( /path/LED* )
echo "${fspec[0]##*/}" # A parameter expansion to strip off the dirname
That will work fine for bash. Since POSIX sh doesn't have arrays like this, I like to give an alternative approach:
for fspec in /path/LED*; do
break
done
echo "${fspec##*/}"
$ pwd
/usr/local/src
$ ls -1 /usr/local/src/mysql*
/usr/local/src/mysql-cluster-gpl-7.3.4-linux-glibc2.5-x86_64.tar.gz
/usr/local/src/mysql-dump_test_all_dbs.sql
If you only have one file, you will only get one result:
MyFile=`ls -1 /home/led/LED*`

Understand when to use spaces in bash scripts

I wanted to run a simple bash timer and found this online (user brent7890)
#!/usr/bin/bash
timer=60
until [ "$timer" = 0 ]
do
clear
echo "$timer"
timer=`expr $timer - 1`
sleep 1
done
echo "-------Time to go home--------"
I couldn't copy and paste this code because the server is on another network. I typed it like this (below) and got an error on the line that starts with "until".
#!/usr/bin/bash
timer=60
#Note I forgot the space between [ and "
until ["$timer" = 0 ]
do
clear
echo "$timer"
timer=`expr $timer - 1`
sleep 1
done
echo "-------Time to go home--------"
Where is spacing like this documented? It seems strange that it matters. Bash scripts can be confusing; I want to understand why the space is important.
There are several rules; two basic ones are these:
You must separate all arguments of a command with spaces.
You must separate a command from the argument that follows it with a space.
[ here is a command (test).
If you write ["$timer", that means you are starting a command named [60,
which is, of course, incorrect. The name of the command is [.
The name of the command is always separated from the rest of the command line by a space. (You can have a command with a space in its name, but in that case you must write the name of the command in "" or ''.)
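A quick way to see this at a prompt (the output shown is typical bash behavior):
timer=60
[ "$timer" = 0 ]; echo $?     # "[" is the test command; its arguments are 60, =, 0 and ]; prints 1
["$timer" = 0 ]; echo $?      # no space: bash looks for a command literally named "[60"
# bash: [60: command not found  -> exit status 127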

Bash Shell - The : Command

The colon command is a null command.
The : construct is also useful in the conditional setting of variables. For example,
: ${var:=value}
Without the :, the shell would try to evaluate $var as a command. <=???
I don't quite understand the last sentence in the above statement. Can anyone give me some details?
Thank you
Try
var=badcommand
$var
you will get
bash: badcommand: command not found
Try
var=
${var:=badcommand}
and you will get the same.
The shell (e.g. bash) always tries to run the first word on each command line as a command, even after doing variable expansion.
The only exception to this is
var=value
which the shell treats specially.
The trick in the example you provide is that ${var:=value} works anywhere on a command line, e.g.
# set newvar to somevalue if it isn't already set
echo ${newvar:=somevalue}
# show that newvar has been set by the above command
echo $newvar
But we don't really even want to echo the value, so we want something better than
echo ${newvar:=somevalue}.
The : command lets us do the assignment without any other action.
I suppose what the man page writers meant was
: ${var:=value}
can be used as a shortcut instead of, say,
if [ -z "$var" ]; then
var=value
fi
A command line consisting only of ${var} expands it and runs the result as a command. Adding substitution operators does not change this, so you use : to neutralize it.
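A minimal sketch of the difference at a prompt (using a hypothetical variable called greeting):
unset greeting
: ${greeting:=hello}     # ":" ignores its arguments, but the expansion still assigns greeting
echo "$greeting"         # prints: hello
${greeting:=hello}       # without ":", the shell now tries to run "hello" as a command
# bash: hello: command not found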
Try this:
$ help :
:: :
Null command.
No effect; the command does nothing.
Exit Status:
Always succeeds.
