In SCons, I am attempting to make a UnitTest system (see code below), based on the great example from here: http://spacepants.org/blog/scons-unit-test
However, due to a problem in SCons 2.0.1 and newer, this causes a dependency cycle, as documented here: http://old.nabble.com/AddPostAction-executes-on-first-build-but-not-subsequent-td18360675.html (and elsewhere).
Does anyone know of a good work-around or replacement solution to this problem?
Code:
def UnitTest(env, target, source, **kwargs):
    curTest = env.Program(target, source, **kwargs)
    env.AddPostAction(curTest, curTest[0].abspath)
    env.Alias('unit_tests', curTest)
    env.AlwaysBuild(curTest)
    return curTest

SConsEnvironment.UnitTest = UnitTest
mandolineTest = env.UnitTest(target='./codeTest',
                             source=mix(['test.cc', 'base.cc']),
                             LIBS=default_libs + ['bgl'],
                             LIBPATH=default_libs_path,
                             CPPPATH=default_includes)
I found a workaround for this problem. By using:
env.AddPostAction(curTest, curTest[0].abspath)
it appears that SCons tries to be clever and adds a build dependency from curTest[0].abspath onto itself, causing the circular-dependency problem. The solution is to "hide" the execution of the command from SCons so that it can't figure out what you are doing:
env.AddPostAction(curTest, lambda *_, **__: os.system(curTest[0].abspath))
For my unit test system (which is slightly different from yours but had the same problem), this has the desired effect of running the unit test whenever any of its dependencies changes, and not running it if nothing relevant has changed.
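Put together, a minimal sketch of the builder with the workaround applied to the code from the question (this assumes import os and that SConsEnvironment is imported from SCons.Script.SConscript at the top of the SConstruct):

import os
from SCons.Script.SConscript import SConsEnvironment

def UnitTest(env, target, source, **kwargs):
    curTest = env.Program(target, source, **kwargs)
    # Wrap the test invocation in a Python callable so SCons cannot turn the
    # executable's own path into a dependency of itself. os.system's exit
    # status is returned, so a failing test also fails the build.
    env.AddPostAction(curTest, lambda *_, **__: os.system(curTest[0].abspath))
    env.Alias('unit_tests', curTest)
    env.AlwaysBuild(curTest)
    return curTest

SConsEnvironment.UnitTest = UnitTest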
I contribute to scikit-image and was using coverage. Now, when I did
coverage run benchmarks/benchmark_name.py
and then generated the report, there were a lot of files that had no link to this benchmark file but were still executed by the command above. One interesting thing I noticed in those files: only the lines containing a function definition (def abc():) and the import statements were marked as run, while the function bodies were not.
Is this how Python brings functions defined in the project into scope? If that's the case, I would like to know the flow in which this happens. Please help.
Thanks.
You're looking at transitive import dependencies. At import time, anything not protected by an if __name__ == '__main__': guard is executed, including the def statements you mentioned (executing a def binds the function name, but does not run the body).
Use coverage run --omit=... and similar options to trim your reporting output.
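As a minimal illustration of the effect (the module and function names here are made up):

# helper.py -- never run directly, only reached through the import chain
print("module-level code runs at import time")   # shows up as covered

def abc():                    # the 'def' line executes at import time...
    return 42                 # ...the body only runs if abc() is actually called

if __name__ == '__main__':
    abc()                     # skipped when helper.py is merely imported

Every module reached through the import chain therefore gets its import-time lines marked as covered. Something like coverage run --omit='*/helper.py' benchmarks/benchmark_name.py (the path pattern is an assumption) keeps such files out of the report.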
I'm completely new to Groovy, so apologies in advance if I'm missing something obvious.
I'm trying to do some simple REST API scripting in Groovy, but first wanted to understand its performance for requests/JSON parsing versus Python. I wrote the following script and am seeing that the imports take ~7 seconds. Is there any way to 'include' those in the script so it doesn't take so long on each run?
def now = new Date()
println now.format("yyyyMMdd-HH:mm:ss.SSS", TimeZone.getTimeZone('UTC'))

@Grab('org.codehaus.groovy.modules.http-builder:http-builder:0.7')
@Grab('oauth.signpost:signpost-core:1.2.1.2')
@Grab('oauth.signpost:signpost-commonshttp4:1.2.1.2')
import groovyx.net.http.RESTClient
import static groovyx.net.http.ContentType.*

for (i = 0; i < 1; i++) {
    def Client = new RESTClient("http://www.mocky.io/v2/59821b4a110000a9103964eb")
    def resp = Client.get(contentType: JSON)
    def myResponseObject = resp.getData()
    println myResponseObject.items[i].id
}

now = new Date()
println now.format("yyyyMMdd-HH:mm:ss.SSS", TimeZone.getTimeZone('UTC'))
I get this output:
~$ time groovy Requests.groovy
20170802-18:36:24.556
10
20170802-18:36:25.290
real 0m7.173s
user 0m4.986s
sys 0m0.329s
Just the first few lines of Grabs and imports take the majority of the runtime, and that's what I'd like to cut down.
It's not the import that takes time, but the @Grab annotation, which comes from Grape - a Groovy dependency management system. These 3 lines:
@Grab('org.codehaus.groovy.modules.http-builder:http-builder:0.7')
@Grab('oauth.signpost:signpost-core:1.2.1.2')
@Grab('oauth.signpost:signpost-commonshttp4:1.2.1.2')
define your script dependencies. Those dependencies are 3rd-party libraries provided as JAR files. Some of them have their own dependencies, which will also be downloaded to satisfy the dependencies you have defined (e.g. http-builder requires Apache's HTTP client and core libs).
Running this script takes some time (about 1 second on my laptop) because Groovy has to resolve all dependencies and add them to the classpath to satisfy all imports. Keep in mind that your script uses a lot more dependencies than those 3, and all of them have to be resolved.
Using Grape is actually a compromise between using 3rd-party libraries in the easiest possible way and some overhead that is delegated to Groovy. Alternatively, you could run your script with:
groovy -classpath ${GROOVY_CLASSPATH} Request.groovy
where ${GROOVY_CLASSPATH} contains the paths to all JAR files you need to run the script successfully. And believe me - you will have to add at least 15 libraries instead of those 3 grapes. Then you will be able to remove all @Grab annotations (they are not needed in this case because the classpath already provides every library the script imports), and your script will execute in the blink of an eye - there will be no overhead from resolving and loading dependencies.
Another alternative is to use Gradle to manage all dependencies and create a so-called "fat JAR" that contains all mandatory dependencies inside - in that case you will be able to run your program with the java command, and all imports will resolve without any dependency-resolution mechanism.
Final conclusion: Grape is a powerful Groovy feature that has its own limitations. It allows you to handle dependency hell pretty easily, but it comes at a cost. I hope this answer helps you make a good choice.
I'm trying to make a plugin which implements the compile & optimize methods.
In the docs, optimizers and compilers are always in separate functions/classes.
Unless I set optimize to true in the config, the compile method is never called.
Is there a way to have both features in the same plugin?
Edit: I tried MyPluginCompiler.prototype.defaultEnv = '*';
Did you specify either an extension (MyPluginCompiler.prototype.extension = 'someExtension';) or a pattern for your plugin? One of these is required for a compiler, so that brunch knows which files to pass to it.
I could not reproduce the issue. I created a repo with a simple plugin that is both a compiler and an optimizer for js files: https://github.com/goshakkk/brunch-plugin-optim-n-comp/blob/master/my-brunch/index.js
You can see it prints both 1 and 2 when building the supplied demo app https://github.com/goshakkk/brunch-plugin-optim-n-comp/tree/master/demo
If that does not fix your issue, please comment on this thread https://github.com/brunch/brunch/issues/1226
I have a code snippet similar to this:
# Compile protobuf headers
env.Protoc(...)
# Move headers to 'include' (compiled via protobuf)
env.Command([include headers...], [headers...], move_func)
# Compile program (depends on 'include' files)
out2 = SConscript('src/SConscript')
Depends(out2, [include headers...])
Basically, I have Protoc() compiling the protobuf files, then the headers are moved to the 'include' directory by env.Command(), and finally the program is compiled through a SConscript file in 'src'.
Since these are header files being moved (which the src compilation depends on), they are not explicitly registered as a dependency by SCons (as far as I understand). Thus the compilation runs before the header files have been moved, and so it fails. I have tried exposing the dependency via Depends() and Requires(), without success.
I understand that in the usual case, scons should "figure-out" dependencies, but I don't know how it could do that here.
Thanks!
You seem to be thinking about your build process in "make" ways, which is the wrong approach when using SCons. You can't order single build steps by putting them in different SConscripts and then including those in a special order. You have to define proper dependencies between your actual sources (C/CPP files, for example) and a target such as a program or PDF file. Then SCons is able to figure out the correct build order, and will traverse the folder structure of your project automatically, entering subfolders more than once if the dependency graph (DAG) requires it.
Defining this kind of dependency between inputs and outputs is usually done using a Builder, and in your case the Install() builder would be a good fit. Please also regard the hints for #2 in the list of "most frequently-asked FAQs" ( https://bitbucket.org/scons/scons/wiki/FrequentlyAskedQuestions ).
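For illustration, a minimal sketch of how Install() could replace the env.Command() move (the .proto name, the header filter, and the exports usage are assumptions, not your actual setup):

# Generate the protobuf outputs with the Protoc builder from the question.
generated = env.Protoc(['messages.proto'])            # placeholder .proto name
gen_headers = [n for n in generated if str(n).endswith('.pb.h')]

# Install() is a real builder, so the copies under 'include' become targets
# that depend on the generated headers -- no manual Depends() is required.
env.Install('include', gen_headers)

# Compilations that pick up the headers via CPPPATH are then ordered correctly.
env.Append(CPPPATH=['include'])
out2 = SConscript('src/SConscript', exports='env')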
Further, I can only recommend reading a little more of the UserGuide ( http://www.scons.org/doc/production/HTML/scons-user.html ) to get a better feeling for how to do things in a more "SConsy" way. If you get stuck, feel free to ask further questions on our mailing list at scons-users@scons.org (see http://www.scons.org/lists.php ).
Finally, if you have a lot of steps that you want to execute serially, and that don't require any special input/output files, SCons is probably not the right tool for your current task. It's designed as a file-oriented build system with automatic parallelization in mind; a simple (Python?) script might be better at purely serial work...
I have a project built with SCons.
The project has multiple components, including client, shell, engine, etc.
Each component uses different compile options, so they are split into different environments.
Both shell and engine require the client library to be built first.
In the environment settings, both shell and engine have something like "-lclient -L[installpath]/lib", and SConscriptClient builds libclient.a in [installpath]/lib.
So I expect SConscriptClient to run before everything else.
so in the code I have something like:
clientbuild = clientEnv.SConscript('SConscriptClient', variant_dir=clientDir)

if hasShell:
    shellbuild = shellEnv.SConscript('SConscriptShell', variant_dir=shellDir)
    Depends(shellbuild, clientbuild)

if hasEngine:
    enginebuild = engineEnv.SConscript('SConscriptEngine', variant_dir=engineDir)
    Depends(enginebuild, clientbuild)
However, it seems SCons is not smart enough to understand the dependencies between client, shell and engine (that is, the Depends() calls don't take effect). It still tries to run SConscriptShell before SConscriptClient.
Is there anything I can do to set the dependencies between SConscript files?
You shouldn't have to explicitly set these dependencies with Depends(). If the client library is a target built by SCons and both the shell and engine link that library, then SCons should be able to implicitly determine the dependencies and build the client first.
Basically, I see 2 issues here:
Why doesn't SCons implicitly figure out the dependencies?
Why isn't it working as is with the explicit calls to Depends()?
If we figure out number 1, then we won't have to figure out number 2. But just to be complete, I think number 2 isn't working because of what the call to SConscript() is returning. In the subsidiary SConscript scripts (SConscriptClient, SConscriptShell, and SConscriptEngine), are you returning the target? If not, I would imagine the clientbuild variable is None. To return the target, use the Return() SCons function and pass it the return value of the Library() builder.
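A sketch of what SConscriptClient might return (the source file name is a placeholder, and this assumes clientEnv is exported to the script as shown further below):

# SConscriptClient (sketch)
Import('clientEnv')
clientLib = clientEnv.Library(target='client', source=['client.cc'])  # builds libclient.a
Return('clientLib')  # the parent's SConscript() call now returns this library node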
As for why SCons can't figure out the dependencies implicitly, we would need to see the subsidiary SConscript build scripts. But I imagine it's because you are specifying the client library "by hand", so SCons doesn't see the dependency.
To build a program with a library so that SCons can see the dependency, use the LIBS construction variable, as follows:
env.Append(LIBS='client')
env.Append(LIBPATH='path/to/client/lib')
env.Program(target='shell', source='shell.cc')
Notice I don't use the -l or -L flags above; SCons will add those in a platform-independent manner. If you specify the library "by hand", like this: '-lclient', then SCons won't see the dependency. This is by design, and is more interesting with include paths: if you have a lot of include paths whose header files will almost never change, then (for performance reasons) you don't want SCons to scan them for changes, so you specify them "by hand".
One additional comment: normally the different environments are passed to the subsidiary SConscript build scripts differently than you are doing it, as follows:
clientEnv = Environment()
# set the clientEnv accordingly
SConscript ('SConscriptClient', variant_dir=clientDir, exports=['clientEnv'] )
SConscriptClient:
Import('clientEnv')
# You may want to clone the clientEnv here, if you want to make
# changes that you don't want seen in the rest of the build
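For completeness, a minimal sketch of that cloning pattern combined with the Return() idea above (the added define and source file name are purely illustrative):

Import('clientEnv')
clientEnv = clientEnv.Clone()                   # local copy; changes stay inside this SConscript
clientEnv.Append(CPPDEFINES=['CLIENT_BUILD'])   # hypothetical client-only setting
clientLib = clientEnv.Library(target='client', source=['client.cc'])
Return('clientLib')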