Some commands are not working inside TCL thread - multithreading

set t6 [thread::create] ; thread::send -async $t6 "set i1 0"
This works fine, without any errors.
set t5 [thread::create] ; thread::send -async $t5 "date"
But this errors out:
invalid command name "date" while executing "date"
What could be the reason?

The date “command” is not actually a command. It's another program, /bin/date (or /usr/bin/date, depending on exactly how your system is set up). Or, on Windows it is a builtin of CMD.EXE.
What's going on is that you're using tclsh interactively; it detects this and marks the current main interpreter as interactive (sets the global tcl_interactive variable to 1) and the unknown command handler sees that and enables extra behaviours, such as expansion of command names and automatic calling out to external programs; it's effectively rewriting date into:
exec {*}[auto_execok date]
The auto_execok command in Tcl does path searching and all the extra stuff required for running inside bash or cmd (it's got a range of options it can do). On my system, auto_execok date returns /bin/date, but there's no particular reason why it might do so on yours; all we know is that it found something.
The other thread is not interactive; it's not talking directly to you. That means you need to send it full commands, not abbreviations, not things that are shorthands. It also doesn't write results to the console by default; you need to add that too. Try this:
set t5 [thread::create]
thread::send -async $t5 "puts \[[list exec {*}[auto_execok "date"]]\]"
This gets messy fast. It's a lot easier to create procedures in the other thread to do most of the work and then call them:
set t5 [thread::create]
thread::send $t5 {
    proc runAndPrint {command args} {
        puts [exec {*}[auto_execok $command] {*}$args]
    }
}
thread::send -async $t5 [list runAndPrint "date"]
Note that I'm using list to build the command to send over when it isn't a literal. I don't need to do that, but it is an extremely good habit to have as it means that your code Just Works As Expected™ when you start passing arguments with spaces and other meta-characters in; you don't have to think about getting quoting right, as list does that for you (by design). It's great for creating a command to execute and it guarantees that there are no surprise substitutions. If you need substitutions, calling a procedure (or lambda term, or method) is so much easier to make reliable; that can uplevel and upvar if it needs to.


Directory depth in recursive script

Hi, I'd like to get some help with my Linux bash homework.
I have to make a script that gets a directory and returns the depth of the deepest subdirectory (+1 for each directory).
I must do it recursively.
I must use 'list_dirs.sh', which takes the variable dir and echoes its subdirs.
That's what I've got so far:
dir=$1
sub=`source list_dirs.sh`
((depth++))
for i in $sub
do
    if [ -n "$sub" ] ; then
        ./depthScript $dir/$i
    fi
done
if ((depth > max)) ; then
    max=$depth
    echo $max
fi
After testing with a dir that's supposed to return 3, I got instead:
1
1
1
1
It seems like my depth counter forgets previous values, and I get output for
each directory. Need some help!
You can use bash functions to create recursive function calls.
Your function would ideally echo 0 in the base case where it is called on a directory with no subdirectories, and echo 1+$(getDepth $subdir) in the case where some subdirectory $subdir exists. See this question on recursive functions in bash for a framework.
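As a sketch of that recursive structure (the function name and layout here are my own, not the OP's list_dirs.sh setup): a directory with no subdirectories has depth 0; otherwise the depth is 1 + the maximum depth among its subdirectories.

```shell
# Illustrative sketch, not the OP's script: recursive depth via a bash function.
getDepth() {
    local dir=$1 sub d max=0
    for sub in "$dir"/*/; do
        [ -d "$sub" ] || continue       # no match: the pattern stays literal, skip
        d=$(( $(getDepth "$sub") + 1 )) # capture the child's echoed depth
        (( d > max )) && max=$d
    done
    echo "$max"
}

getDepth "${1:-.}"
```

Because each call echoes its result and the caller captures it with `$( … )`, no value needs to survive across processes, which sidesteps the variable-scope problem described below.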
When you run a script normally (i.e. it's in your PATH and you just enter its name, or you enter an explicit path to it like ./depthScript), it runs as a subprocess of the current shell. This is important because each process has its own variables. Variables also come in two kinds: shell variables (which are only available in that one process) and environment variables (the values of which get exported to subprocesses but not back up from them). And depending on where you want a variable's value to be available, there are three different ways to define them:
# By default, a variable is a shell variable that's only defined in this process:
shellvar=something
# `export` puts a variable into the environment, so it'll be exported to subprocesses.
# You can export a variable either while setting it, or as a separate operation:
export envvar=something
export anotherenvvar
anotherenvvar=something
# You can also prefix a command with a variable assignment. This makes an
# environment variable in the command process's environment, but not the current
# shell process's environment:
prefixvar=something ./depthScript $dir/$i
Given the above assignments:
shellvar is defined in the current shell process, but not in any other process (including the subprocess created to run depthScript).
envvar and anotherenvvar will be inherited by the subprocess (and its subprocesses, and all subprocesses for later commands), but any changes made to it in those subprocesses have no effect at all in the current process.
prefixvar is available only in the subprocess created to run depthScript (and its subprocesses), but not in the current shell process or any other of its subprocesses.
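A quick, hypothetical demonstration of those three scopes (using bash -c as a stand-in for the subprocess; the variable names are made up):

```shell
shellvar=parent-only                 # shell variable: visible in this process only
export envvar=inherited              # environment variable: copied to child processes
prefixvar=one-command bash -c 'echo "prefix: ${prefixvar:-unset}"'  # prints "prefix: one-command"
bash -c 'echo "shell:  ${shellvar:-unset}"'   # prints "shell:  unset"
bash -c 'echo "env:    ${envvar:-unset}"'     # prints "env:    inherited"
echo "after:  ${prefixvar:-unset}"            # prints "after:  unset"
```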
Short summary: it's a mess because of the process structure, and as a result it's best to just avoid even trying to pass values around between scripts (or different invocations of the same script) in variables. Use environment variables for settings and such that you want to be generally available (but don't need to be changed much). Use shell variables for things local to a particular invocation of a script.
So, how should you pass the depth values around? Well, the standardish way is for each script (or command) to print its output to "standard output", and then whatever's using the script can capture its output to either a file (command >outfile) or a variable (var=$(command)). I'd recommend the latter in this case:
depth=$(./depthScript "$dir/$i")
if ((depth > max)) ; then
    max=$depth
fi
Some other recommendations:
Think your control and data flow through. The current script loops through all subdirectories, then at the end runs a single check for the deepest subdir. But you need to check each subdirectory individually to see if it's deeper than the current max, and at the end report the deepest of them.
Double-quote your variable references (as I did with "$dir/$i" above). Unquoted variable references are subject to word splitting and wildcard expansion, which is the source of much grief. It looks like you'll need to leave $sub unquoted because you need it to be split into words, but this will make the script unable to cope with directory names with spaces. See BashFAQ #20: "How can I find and safely handle file names containing newlines, spaces or both?"
The if [ -n "$sub" ] ; then test is irrelevant. If $sub is empty, the loop will never run.
In a shell script, relative paths (like ./depthScript) are resolved against the working directory of the process that runs the script, not the location of the script itself. If someone runs your script from another directory, ./depthScript will not work. Derive the path from "$BASH_SOURCE" instead. See BashFAQ #28: "How do I determine the location of my script? I want to read some config files from the same place."
When trying to troubleshoot a script, it can help to put set -x before the troublesome section. This makes the shell print each command as it runs, so you can see what's going on.
Run your scripts through shellcheck.net -- it'll point out a lot of common mistakes.

how do I write to a spawned terminal?

For this little script:
package require Tcl 8.4
package require Expect 5.40
spawn gnome-terminal
while {1} {
    puts -nonewline "Enter your name: "
    flush stdout
    set name [gets stdin]
    puts "Hello $name"
}
how can I write to the spawned gnome-terminal so that user input is echoed to both terminals?
You run Expect inside the gnome-terminal, not the other way round. Expect is a command-line program really, and gnome-terminal is really not (it's a graphical terminal emulator). In particular, gnome-terminal ignores its stdin and stdout entirely; it effectively creates those for other programs to use. Meanwhile, Expect controls other programs by talking to their stdin and stdout (with trickery with extra virtual terminals); this means that the interface it uses to its subprocesses is something that gnome-terminal basically ignores from the outside.
Though in this case, why not use Tk to pop up a GUI to ask for the input instead? Rather than putting up a proxy terminal to ask the question, you can ask it directly. This can make for a much richer interface if you desire…

Command Line for Loop to Change Variables in File

I have a script that compiles and runs a piece of IDL code. It looks like this:
arg1=$1
idl << EOF
.rnew testvalue_{arg1}.pro
testvalue_{arg1}.pro
EOF
I want to run a for loop from the command line in which arg1 can take on different values. What I have so far is:
for arg1 in testvalue.sh; do arg1={'value1', 'value2'}; done
I don't think my logic is correct. What am I missing?
First, you need to place $ before the variable name so bash knows to expand it.
Also, you don't need to pass anything into arg1; it's just a copy of $1.
In fact, I don't recommend creating a new variable at all; just use $1 twice.
So the testvalue.sh is:
idl << EOF
.rnew testvalue_$1.pro
testvalue_$1.pro
EOF
And the loop:
for arg in 'value1' 'value2'; do ./testvalue.sh "$arg"; done
I'm a little hazy on the exact details of the question, but it sounds like you want to put 'value1' and 'value2' inside the list of values the for-loop iterates:
for arg1 in value1 value2
do
idl <<EOF
.rnew testvalue_${arg1}.pro
testvalue_${arg1}.pro
EOF
done
Note that I've also changed {arg1} to ${arg1}, as the dollar sign is required to expand the variable.
A for loop in shell scripts will iterate over every value after the in keyword. In the example above it will set arg1 to value1, then execute the contents of the loop, then move on and set arg1 to value2 and execute again.
You can also store the values in a variable:
values_to_test="value1 value2"
for arg1 in $values_to_test
do
idl <<EOF
.rnew testvalue_${arg1}.pro
testvalue_${arg1}.pro
EOF
done
Bear in mind that this applies word splitting and path expansion to values_to_test, so you will need to ensure that none of the values contain question marks, square brackets, asterisks, spaces, tabs, or newlines.
If it worries you, you can disable path expansion (and thus allow use of question marks, square brackets and asterisks) by running set -f in the script before the loop runs.
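For example (a sketch; the values here are made up), with path expansion left on, a value like test* would match files in the current directory, while set -f keeps it literal:

```shell
values_to_test="plain test*"
set -f                          # disable pathname expansion for the loop
for arg1 in $values_to_test; do
    echo "value: $arg1"         # test* stays literal instead of matching files
done
set +f                          # restore normal globbing afterwards
```

Word splitting on whitespace still happens with set -f; only the glob characters are left alone.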

A way to prevent bash from parsing command line w/out using escape symbols

I'm looking for a way (other than ".", '.', \.) to use bash (or any other linux shell) while preventing it from parsing parts of command line. The problem seems to be unsolvable
How to interpret special characters in command line argument in C?
In theory, a simple switch would suffice (e.g. -x ... telling that the string ... won't be interpreted), but it apparently doesn't exist. I wonder whether there is a workaround, hack, or idea for solving this problem. The original problem is a script|alias for a program taking youtube URLs (which may contain special characters (&, etc.)) as arguments. This problem is even more difficult: expanding "$1" while preventing the shell from interpreting the expanded string; essentially, expanding "$1" without interpreting its result.
Use a here-document:
myprogramm <<'EOF'
https://www.youtube.com/watch?v=oT3mCybbhf0
EOF
If you wrap the starting EOF in single quotes, bash won't interpret any special chars in the here-doc.
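To illustrate the difference (the $PLAYLIST variable here is made up): with a quoted delimiter the body passes through verbatim, while an unquoted delimiter would let the shell expand $PLAYLIST before the program sees it.

```shell
# Quoted delimiter: no expansion, the & and $PLAYLIST reach the program as-is.
cat <<'EOF'
https://www.youtube.com/watch?v=oT3mCybbhf0&list=$PLAYLIST
EOF
# With an unquoted delimiter (<<EOF), $PLAYLIST would be expanded,
# typically to an empty string if the variable is unset.
```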
Short answer: you can't do it, because the shell parses the command line (and interprets things like "&") before it even gets to the point of deciding your script/alias/whatever is what will be run, let alone the point where your script has any control at all. By the time your script has any influence in the process, it's far too late.
Within a script, though, it's easy to avoid most problems: wrap all variable references in double-quotes. For example, rather than curl -o $outputfile $url you should use curl -o "$outputfile" "$url". This will prevent the shell from applying any parsing to the contents of the variable(s) before they're passed to the command (/other script/whatever).
But when you run the script, you'll always have to quote or escape anything passed on the command line.
Your spec still isn't very clear. As far as I know the problem is you want to completely reinvent how the shell handles arguments. So… you'll have to write your own shell. The basics aren't even that difficult. Here's pseudo-code:
while true:
    print prompt
    read input
    command = (first input)
    args = (argparse (rest input))
    child_pid = fork()
    if child_pid == 0:       // we are inside the child process
        exec(command, args)  // see the variety of `exec` family functions in POSIX
    else:                    // we are in the parent; child_pid is the actual child pid
        wait(child_pid)      // see the variety of `wait` family functions in POSIX
Your question basically boils down to how that "argparse" function is implemented. If it's just an identity function, then you get no expansion at all. Is that what you want?

Weird scope issue in .bat file

I'm writing a simple .bat file and I've run into some weird behavior. There are a couple places where I have to do a simple if/else, but the code inside the blocks don't seem to be working correctly.
Here's a simple case that demonstrates the error:
@echo off
set MODE=FOOBAR
if "%~1"=="" (
    set MODE=all
    echo mode: %MODE%
) else (
    set MODE=%~1
    echo mode: %MODE%
)
echo mode: %MODE%
The output I'm getting is:
C:\>test.bat test
mode: FOOBAR
mode: test
Why is the echo inside the code block not getting the new value of the variable? In the actual code I'm writing I need to build a few variables and reference them within the scope of the if/else. I could switch this to use labels and gotos instead of an if/else, but that doesn't seem nearly as clean.
What causes this behavior? Is there some kind of limit on variables within code blocks?
You are running into cmd's eager variable expansion: the MODE variable is expanded only once, when the entire if/else block is read. You can see this happening if you omit the @echo off line.
From the set /? documentation:
Finally, support for delayed environment variable expansion has been added. This support is always disabled by default, but may be enabled/disabled via the /V command line switch to CMD.EXE. See CMD /?

Delayed environment variable expansion is useful for getting around the limitations of the current expansion which happens when a line of text is read, not when it is executed. The following example demonstrates the problem with immediate variable expansion:

set VAR=before
if "%VAR%" == "before" (
    set VAR=after
    if "%VAR%" == "after" @echo If you see this, it worked
)

would never display the message, since the %VAR% in BOTH IF statements is substituted when the first IF statement is read, since it logically includes the body of the IF, which is a compound statement. So the IF inside the compound statement is really comparing "before" with "after", which will never be equal. Similarly, the following example will not work as expected:

set LIST=
for %i in (*) do set LIST=%LIST% %i
echo %LIST%

in that it will NOT build up a list of files in the current directory, but instead will just set the LIST variable to the last file found. Again, this is because the %LIST% is expanded just once when the FOR statement is read, and at that time the LIST variable is empty. So the actual FOR loop we are executing is:

for %i in (*) do set LIST= %i

which just keeps setting LIST to the last file found.

Delayed environment variable expansion allows you to use a different character (the exclamation mark) to expand environment variables at execution time. If delayed variable expansion is enabled, the above examples could be written as follows to work as intended:

set VAR=before
if "%VAR%" == "before" (
    set VAR=after
    if "!VAR!" == "after" @echo If you see this, it worked
)

set LIST=
for %i in (*) do set LIST=!LIST! %i
echo %LIST%
Putting setlocal EnableDelayedExpansion at the top of your script enables delayed expansion for the rest of the script, without needing the /V switch to CMD.EXE.
In addition to what has already been answered here, there is a long-form article:
https://rsdn.org/article/winshell/NTCommandProcessor.xml
The Google-translated variant is pretty broken, but you can at least read the raw text:
https://translate.google.com/?sl=ru&tl=en&text=https%3A%2F%2Frsdn.org%2F%3Farticle%2Fwinshell%2FNTCommandProcessor.xml&op=translate
Start reading from the Conditional block section.
Short answer:
The block operator (...) delays %-variable expansion until the outermost enclosing block has been read. You have to exit that outermost block before you can use the %-variable's new value as-is:
@echo off
set "MODE="
(
    (
        set MODE=all
        echo MODE=%MODE%
    )
    echo MODE=%MODE%
)
echo MODE=%MODE%
Or use call prefix to reevaluate it in place:
@echo off
set "MODE="
(
    (
        set MODE=all
        call echo MODE=%%MODE%%
    )
)
Looks like the read and write use different scoping rules.
If you eliminate this line
set MODE=FOOBAR
it will work as expected. So you'll probably need to have a complex series of if/elses to get the variables populated as you'd like.
