I struggled with a specific problem involving Java Threads and the initialisation of an object in Scala. In my original code a Thread freezes. I finally found a solution to my problem, but I do not understand why it works.
I have rewritten the problem into the following code. When run, the code stops at the first line in the Thread that references a variable of the object being initialised. If the reference goes through the self variable, which points to the object being initialised, it executes fine.
object Alpha {
  var r = "A"
  r = "B"
  System.out.println("Not threaded: r = " + r)

  val thread = {
    val self = this
    new Thread(new Runnable {
      def run() = {
        System.out.println(" Started.")
        System.out.println(" Threaded self.r = " + self.r)
        self.r = "C"
        System.out.println(" Threaded self.r = " + self.r)
        // At the following line the thread freezes!
        System.out.println(" Threaded r = " + r)
        r = "D"
        System.out.println(" Threaded r = " + r)
      }
    })
  }

  thread.start()
  thread.join()
}
Calling Alpha results in the following output before execution freezes.
Not threaded: r = B
Started.
Threaded self.r = B
Threaded self.r = C
I would understand if referencing object variables during initialisation were prohibited at all times, but as it is, the behaviour looks somewhat random.
I'm doing something like:
Iterator<String> iterator = requestList.iterator()
(1..threadCount).each {
    Thread.start {
        while (iterator.hasNext()) {
            log.info iterator.next()
            Thread.sleep(50)
        }
    }
}
Given that threadCount = 10 and requestList has ~115 entries, I expect the threads to output the whole list between them, each one asking the iterator for the next element.
However, I hardly get even 10 log lines, usually 8.
Everything runs inside a SoapUI Groovy Script step; instead of log.info I actually plan to trigger a REST request with number N.
What am I doing wrong with these threads?
UPD
Okay, I did something stupid like this to test (and to avoid using one array):
def array1 = all[0..5]
def array2 = all[6..11]

Thread.start {
    for (String r : array1) {
        log.info r
    }
}
Thread.start {
    for (String r : array2) {
        log.info r
    }
}
And now I have no output at all or one log at most, though I expect 12.
How do I create threads that will output data simultaneously?
EVEN MORE
def threadCount = 10
(0..threadCount).each { n ->
    Thread.start {
        (1..10).each {
            log.info "thread" + n + " says| " + it
        }
    }
}
Output is:
thread0 says| 1
thread3 says| 1
thread8 says| 1
thread2 says| 1
thread1 says| 1
thread9 says| 1
thread7 says| 1
thread5 says| 1
thread4 says| 1
thread0 says| 2
And nothing more. Again, what's wrong with me or Groovy? (I hope Groovy is fine.)
In the end, the problem is that SoapUI kills the main thread before all the worker threads have had a chance to grab their next number.
A quick way to live with this is to add a sleep to the main method.
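A minimal sketch of the workaround (assuming the same requestList, threadCount, and log from the question): instead of guessing a sleep duration, the main script can keep references to the started threads and join them, which has the same effect.

Iterator<String> iterator = requestList.iterator()
def workers = (1..threadCount).collect {
    Thread.start {
        while (iterator.hasNext()) {
            log.info iterator.next()
            Thread.sleep(50)
        }
    }
}
// Block the main script until every worker thread has finished,
// so SoapUI does not tear the step down early.
workers*.join()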
I have a large processing task which I believe is ripe for being made more efficient with concurrency and parallelism.
I had a look at the GPars docs and I found them quite confusing, so I hope people here can help.
The first task I would like to do in parallel currently looks like this:
def providerOneProgrammes = providerOneProgrammeService.getProgrammes(timeWindow)
def providerTwoProgrammes = providerTwoProgrammeService.getProgrammes(timeWindow)
Both return a list of objects, and both can be run in parallel.
I would like to execute them together and then wait for both to finish before processing the returned lists (I will then look for matches between the lists, but I'll come to that later).
Thanks
Rakesh
The simplest way to take advantage of GPars here is with callAsync. Here's a simple example:
@Grab(group='org.codehaus.gpars', module='gpars', version='1.0-beta-2')
import groovyx.gpars.GParsPool

def providerOneProgrammeService(timeWindow) {
    println "p1 starts"
    Thread.sleep(4000)
    println "p1 still going"
    Thread.sleep(4000)
    println "p1 ends"
    return "p1 return value"
}

def providerTwoProgrammeService(timeWindow) {
    println "p2 starts"
    Thread.sleep(5000)
    println "p2 still going"
    Thread.sleep(5000)
    println "p2 still going"
    Thread.sleep(5000)
    println "p2 ends"
    return "p2 return value"
}

def results = []
GParsPool.withPool {
    results << this.&providerOneProgrammeService.callAsync("arg1")
    results << this.&providerTwoProgrammeService.callAsync("arg2")
}
println "done ${results*.get()}"
You can get the entire output stream by using .text:
def process = "ls -l".execute()
println "Found text ${process.text}"
Is there a concise equivalent to get the error stream?
You can use waitForProcessOutput which takes two Appendables (docs here)
def process = "ls -l".execute()
def (output, error) = new StringWriter().with { o ->   // For the output
    new StringWriter().with { e ->                      // For the error stream
        process.waitForProcessOutput( o, e )
        [ o, e ]*.toString()                            // Return them both
    }
}
// And print them out...
println "OUT: $output"
println "ERR: $error"
Based on tim_yates' answer, I tried it on Jenkins and found this issue with multiple assignment: https://issues.jenkins-ci.org/browse/JENKINS-45575
So this works and it is also concise:
def process = "ls -l".execute()
def output = new StringWriter(), error = new StringWriter()
process.waitForProcessOutput(output, error)
println "exit value=${process.exitValue()}"
println "OUT: $output"
println "ERR: $error"
Groovy adds the execute method to String to make executing shell commands fairly easy:
println "ls".execute().text
but if an error happens, there is no resulting output. Is there an easy way to get both standard error and standard out, other than writing a bunch of code that creates two threads to read both input streams, uses a parent stream to wait for them to complete, and then converts the results back to text?
It would be nice to have something like;
def x = shellDo("ls /tmp/NoFile")
println "out: ${x.out} err:${x.err}"
OK, solved it myself:
def sout = new StringBuilder(), serr = new StringBuilder()
def proc = 'ls /badDir'.execute()
proc.consumeProcessOutput(sout, serr)
proc.waitForOrKill(1000)
println "out> $sout\nerr> $serr"
displays:
out> err> ls: cannot access /badDir: No such file or directory
"ls".execute() returns a Process object which is why "ls".execute().text works. You should be able to just read the error stream to determine if there were any errors.
There is an extra method on Process that allows you to pass a StringBuffer to retrieve the text: consumeProcessErrorStream(StringBuffer error).
Example:
def proc = "ls".execute()
def b = new StringBuffer()
proc.consumeProcessErrorStream(b)
println proc.text
println b.toString()
// a wrapper closure around executing a string
// can take either a string or a list of strings (for arguments with spaces)
// prints all output, complains and halts on error
def runCommand = { strList ->
    assert (strList instanceof String ||
            (strList instanceof List && strList.every { it instanceof String }))

    def proc = strList.execute()
    proc.in.eachLine { line -> println line }
    proc.out.close()
    proc.waitFor()

    print "[INFO] ( "
    if (strList instanceof List) {
        strList.each { print "${it} " }
    } else {
        print strList
    }
    println " )"

    if (proc.exitValue()) {
        println "gave the following error: "
        println "[ERROR] ${proc.err.text}"
    }
    assert !proc.exitValue()
}
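A hypothetical invocation (the paths are illustrative only):

runCommand("ls /tmp")                 // plain string form
runCommand(["ls", "/tmp/some dir"])   // list form, for arguments containing spaces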
I find this more idiomatic:
def proc = "ls foo.txt doesnotexist.txt".execute()
assert proc.in.text == "foo.txt\n"
assert proc.err.text == "ls: doesnotexist.txt: No such file or directory\n"
As another post mentions, these are blocking calls, but since we want to work with the output, this may be necessary.
To add one more important piece of information to the answers provided above:
For a process
def proc = command.execute();
always try to use
def outputStream = new StringBuffer();
proc.waitForProcessOutput(outputStream, System.err)
//proc.waitForProcessOutput(System.out, System.err)
rather than
def output = proc.in.text;
to capture the output after executing commands in Groovy, as the latter is a blocking call (see this SO question for the reason).
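Put together, a minimal self-contained version of the non-blocking variant might look like this (using "ls -l" only as a placeholder command):

def proc = "ls -l".execute()
def outputStream = new StringBuffer()
// Pump stdout into the buffer and let stderr go straight to System.err
proc.waitForProcessOutput(outputStream, System.err)
println "exit value=${proc.exitValue()}"
println outputStream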
// Runs execStr in the execPath directory, feeds execCommands to its stdin,
// and returns [stdout, stderr] decoded with the given encoding.
def exec = { encoding, execPath, execStr, execCommands ->
    def outputCatcher = new ByteArrayOutputStream()
    def errorCatcher = new ByteArrayOutputStream()

    def proc = execStr.execute(null, new File(execPath))
    def inputCatcher = proc.outputStream

    // Write each command to the process's stdin
    execCommands.each { cm ->
        inputCatcher.write(cm.getBytes(encoding))
        inputCatcher.flush()
    }

    proc.consumeProcessOutput(outputCatcher, errorCatcher)
    proc.waitFor()

    return [new String(outputCatcher.toByteArray(), encoding),
            new String(errorCatcher.toByteArray(), encoding)]
}

def out = exec("cp866", "C:\\Test", "cmd", ["cd..\n", "dir\n", "exit\n"])

println "OUT:\n" + out[0]
println "ERR:\n" + out[1]
command = "ls *"
def execute_state=sh(returnStdout: true, script: command)
but if the command failure the process will terminate
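If the step should survive a failing command (assuming the same Jenkins Pipeline context as the snippet above), one option is returnStatus, which hands back the exit code instead of aborting, with the output captured via a file; a rough sketch:

// returnStatus: true makes sh return the exit code instead of failing the build
def status = sh(returnStatus: true, script: "ls * > output.txt 2>&1")
def output = readFile("output.txt").trim()
echo "exit code: ${status}"
echo output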