What triggers scons to build files when I have a custom builder?

I'm going nuts trying to control when files are built in scons. I have a very simple example build tree (see below), with a Poem builder that just takes a .txt file and converts it to lower case in a corresponding .eectxt file.
In my SConstruct and SConscript files, I declare dependencies of 3 .txt files.
But I can't figure out what's putting these into the default build!
sconstest/
    SConstruct
    tiger.txt
    src/
        SConscript
        hope.txt
        jabberwocky.txt
where the *.txt files are poems and my SConstruct and SConscript look like this:
SConstruct:
env = Environment()

def eecummings(target, source, env):
    if len(target) == 1 and len(source) == 1:
        with open(str(source[0]), 'r') as fin:
            with open(str(target[0]), 'w') as fout:
                for line in fin:
                    fout.write(line.lower())
    return None

env['BUILDERS']['Poem'] = Builder(action=eecummings, suffix='.eectxt', src_suffix='.txt')
Export('env')
poems = SConscript('src/SConscript')
tigerPoem = env.Poem('tiger.txt')
src/SConscript:
Import('env')
input = ['jabberwocky.txt', 'hope.txt']
output = [env.Poem(x) for x in input]
Return('output')
What I want to do is to declare the dependency of the .eectxt files from the corresponding .txt files, but not cause them to be built unless I explicitly put them into the Default() build in the SConstruct file, or I request them explicitly at the command line.
How can I do this?

By default, a directory depends on all files and/or targets that reside in it. So running:
scons
will then build all targets under the current directory.
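If you would rather opt in than opt out with Ignore(), you can also take control of the default-target list yourself. Below is a minimal, untested sketch that reuses the names from the question; Default(None) is the documented way to clear the default list:
env = Environment()
# ... define the Poem builder and Export('env') exactly as in the question ...
tigerPoem = env.Poem('tiger.txt')
poems = SConscript('src/SConscript')

Default(None)                           # clear the default target list
env.Alias('poems', [tigerPoem] + poems)
# A bare `scons` should now stop with "No targets specified and no
# Default() targets found"; `scons poems` builds all three poems.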

I figured out how to do what I want, but I still don't understand why I need to do it this way. I'll accept the first decent answer that explains it.
Here's what works, if I add the following to the root SConstruct file:
env.Ignore('.', tigerPoem)
env.Ignore('src', poems)
env.Alias('poems', [tigerPoem] + poems)
This ignores the 3 poems from the default target, and then adds them as targets aliased to "poems", so if I run scons it builds nothing, but if I run scons poems it builds the files.
Why does this work? Why does calling env.Poem(...) add something to the default targets?

Related

How can I add per-file defines to a scons project

I'm in the process of porting a makefile project to scons and I can't figure out how to create a unique #define for each file. I would like to have the base filename for each file defined in order to support some custom debug macros. In the makefile, I'm able to do this with the following definition.
-DBASE_FILE_NAME=\"$(<F)\"
I'm not sure how to do this or if it is even possible in scons and would appreciate any feedback.
After some experimentation, the following seems to work.
import os
from glob import glob
# use Python glob, not scons Glob!
CPP_FILES = glob('./src/*.cpp')
env = Environment(CPPPATH='./include', etc...)
for f in CPP_FILES:
    env.Object(f, CPPDEFINES={'BASE_FILENAME': "\\\"" + os.path.basename(f) + "\\\""})
O_FILES = [os.path.splitext(f)[0] + '.o' for f in CPP_FILES]
env.Program('myprogram', O_FILES)
This lets me define things on a per-file basis without listing the files out individually.
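A possible variation (untested) is to keep the nodes returned by env.Object() instead of reconstructing the .o file names by hand:
# env.Object() returns the object-file node(s), so SCons works out the
# output names itself and the list can go straight into Program().
objects = [env.Object(f, CPPDEFINES={'BASE_FILENAME': '\\"%s\\"' % os.path.basename(f)})
           for f in CPP_FILES]
env.Program('myprogram', objects)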
Perhaps the following? (Haven't tried it, but something along those lines should work)
env.Program('filename.c', CPPDEFINES='BASE_FILE_NAME=\\"$SOURCE\\"')

How to create a Makefile or .pro file that runs a custom build event

I currently use a small program to process Qt form (.ui) files and automatically generate classes which have a common base class and use virtual functions to access the form elements.
On Windows, I run this tool as a custom build step on the .ui form file. The only argument to the tool is the input filename.
To clarify, on Windows, Qt runs uic on the .ui file, creating a ui_filename.h file. I need to run my tool on that file.
How can/should I do this on linux? Ideally I'd build it into the .pro file, but I'm happy to edit the Makefile as well.
I'm not awesome at writing Makefiles so this may be very simple. I am happy to write the command manually for each ui_ or *.ui file but ideally it would happen automatically for all .ui files.
You don't need to write Makefiles manually. Makefiles that call a custom external tool can be generated by qmake from the .pro project file.
You need to create a custom target using QMAKE_EXTRA_TARGETS and then make the main target depend on that custom target (the custom target name should be added to PRE_TARGETDEPS); see, for example, How to modify the PATH variable in Qt Creator's project file (.pro).
The tool should run after the form headers are generated, so the custom target should depend on that file (customtarget1.depends = ui_mainwindow.h):
customtarget1.target = form_scanner
customtarget1.commands = tool_win_bat_or_linux_shell.sh
customtarget1.depends = ui_mainwindow.h
QMAKE_EXTRA_TARGETS += customtarget1
PRE_TARGETDEPS += form_scanner
The above qmake commands create the following Makefile rules:
# the form header depends on mainwindow.ui
ui_mainwindow.h: ..\test\mainwindow.ui
<tab>#build command...
# form scanner depends on ui_mainwindow.h
form_scanner: ui_mainwindow.h
<tab>tool_win_bat_or_linux_shell.sh
# the final target depends on form scanner
$(DESTDIR_TARGET): form_scanner ui_mainwindow.h $(OBJECTS)
If there are many forms it is possible to create many custom targets or create one target that depends on all form files:
for (form, FORMS) {
    # autogenerated form headers are located in root of build directory
    FILE_NAME = $$basename(form)
    # prepend ui_ and replace ending .ui by .h
    FORM_HEADERS += ui_$$replace(FILE_NAME, .ui$, .h)
}
customtarget1.target = form_scanner
customtarget1.commands = tool_win_bat_or_linux_shell.sh
customtarget1.depends = $$FORM_HEADERS
QMAKE_EXTRA_TARGETS += customtarget1
PRE_TARGETDEPS += form_scanner
So, the command tool_win_bat_or_linux_shell.sh is executed only when all form headers are generated.
It is also possible to run the shell script from the project directory $$PWD and pass as command line arguments the form header file names:
customtarget1.commands = $$PWD/tool_win_bat_or_linux_shell.sh $$FORM_HEADERS
That shell script (tool_win_bat_or_linux_shell.sh) can then run some command for each form header:
# for each command line argument
for file in "$@"
do
    echo "$file"
    ls -l "$file"
done

Additional, specific source and target for a Builder

I'm new to SCons and I'm trying to figure out if I could use it for my use case. I have a script whose main action is to take a single input and produce multiple output files in a given directory. However, it also needs one additional input and one additional output, as in
script --special-in some.foo --special-in some.bar input.foo output.dir/
The names of some.* files can be computed from the input file name (here input.foo). And the some.* files produced by one rule are consumed by other rules.
In the documentation I found that one can create custom builders as in
bld = Builder(action = 'foobuild $TARGETS - $SOURCES',
              suffix = '.foo',
              src_suffix = '.input',
              emitter = modify_targets)
where the emitter adds the additional target and source. However, I couldn't find how I should distinguish the main source/target from the special ones, which need to be passed using specific options - I can't use $TARGETS and $SOURCES as in the above example. I could probably use a generator and index into source and target, but this seems a bit hacky. Is there a better way?
From what you describe, you should be using both an emitter and a generator, just as you state at the end of your question. The "main" source/target will be the first element in the source/target lists. This doesn't seem hacky to me, but I may just be used to it...
Answers are always better with a working example...
Here is the SConstruct to do what you describe. I'm not exactly sure how you plan to compute some.foo and some.bar from input.foo, so in this example I compute input.bar and input.baz from input.foo, and just append output.dir to the list of targets.
import os

def my_generator(source, target, env, for_signature):
    command = './script '
    command += ' '.join(['--special-in %s' % str(i) for i in source[1:]])
    command += ' '
    command += ' '.join([str(t) for t in target])
    return command

def my_emitter(target, source, env):
    source += ['%s%s' % (os.path.splitext(str(source[0]))[0], ext)
               for ext in ['.bar', '.baz']]
    target += ['output.dir']
    return target, source

bld = Builder(generator=my_generator,
              emitter=my_emitter)
env = Environment(BUILDERS={'Foo': bld})
env.Foo('output.foo', 'input.foo')
When run on linux...
>> touch input.bar input.baz input.foo
>> echo "#\!/bin/sh" > script && chmod +x script
>> tree
.
├── input.bar
├── input.baz
├── input.foo
├── SConstruct
└── script
0 directories, 5 files
>> scons --version
SCons by Steven Knight et al.:
script: v2.3.4, 2014/09/27 12:51:43, by garyo on lubuntu
engine: v2.3.4, 2014/09/27 12:51:43, by garyo on lubuntu
engine path: ['/usr/lib/scons/SCons']
Copyright (c) 2001 - 2014 The SCons Foundation
>> scons
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
./script --special-in input.bar --special-in input.baz output.foo output.dir
scons: done building targets.
All dependencies and targets will be maintained if you need to feed the outputs from one builder like this into another.
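For example (a hedged sketch reusing the Foo builder from the SConstruct above; the archive.tar target and the tar command line are purely illustrative), the nodes returned by env.Foo() can be passed straight to another builder:
foo_nodes = env.Foo('output.foo', 'input.foo')
# foo_nodes holds the targets added by the emitter (output.foo and output.dir),
# so this step reruns whenever the Foo step reruns.
env.Command('archive.tar', foo_nodes, 'tar cf $TARGET $SOURCES')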
If this doesn't answer your question, please clarify what more you are trying to do.

Multi-input, multi-output compilers with Shake

I'm experimenting with using Shake to build Java code, and am a bit stuck because of the unusual nature of the javac compiler. In general for each module of a large project, the compiler is invoked with all of the source files for that module as input, and produces all of the output files in one pass. Subsequently we typically take the .class files produced by the compiler and assemble them into a JAR (basically just a ZIP).
For example, a typical Java module project is arranged as follows:
a src directory that contains multiple .java files, some of them nested many levels deep in a tree.
a bin directory that contains the output from the compiler. Typically this output follows the same directory structure and filenames, with .class substituted for each .java file, but the mapping is not necessarily one-to-one: a single .java file can produce zero to many .class files!
The rules I would like to define in Shake are therefore as follows:
1) If any file under src is newer than any file under bin then erase all contents of bin and recreate with:
javac -d bin <recursive list of .java files under src>
I know this rule seems excessive, but without invoking the compiler we cannot know the extent of changes in output resulting from even a small change in a single input file.
2) if any file under bin is newer than module.jar then recreate module.jar with:
jar cf module.jar -C bin .
Many thanks!
PS Responses in the vein "just use Ant/Maven/Gradle/" will not be appreciated! I know those tools offer Java compilation out-of-the-box, but they are much harder to compose and aggregate. This is why I want to experiment with a Haskell/Shake-based tool.
Writing rules which produce multiple outputs whose names cannot be statically determined can be a bit tricky. The usual approach is to find an output whose name is statically known and always need that, or if none exists, create a fake file to use as the static output (as per ghc-make, the .result file). In your case you have module.jar as the ultimate output, so I would write:
"module.jar" *> \out -> do
javas <- getDirectoryFiles "" ["src//*.java"]
need javas
liftIO $ removeFiles "" ["bin//*"]
liftIO $ createDirectory "bin"
() <- cmd "javac -d bin" javas
classes <- getDirectoryFiles "" ["bin//*.class"]
need classes
cmd "jar cf" [out] "-C bin ."
There is no advantage to splitting it up into two rules, since you never depend on the .class files (and can't really, since they are unpredictable in name), and if any source file changes then you will always rebuild module.jar anyway. This rule has all the dependencies you mention, plus if you add/rename/delete any .java or .class file then it will automatically recompile, as the getDirectoryFiles call is tracked.

In scons, how can I inject a target to be built?

I want to inject a "Cleanup" target which depends on a number of other targets finishing before it goes off and gzip's some log files. It's important that I not gzip early as this can cause some of the tools to fail.
How can I inject a cleanup target for Scons to execute?
e.g. I have targets foo and bar. I want to inject a new custom target called 'cleanup' that depends on foo and bar and runs after they're both done, without the user having to specify
% scons foo cleanup
I want them to type:
% scons foo
but have scons execute as though the user had typed
% scons foo cleanup
I've tried creating the cleanup target and appending to sys.argv, but it seems that scons has already processed sys.argv by the time it gets to my code so it doesn't process the 'cleanup' target that I manually append to sys.argv.
You shouldn't use _Add_Targets or other undocumented features; you can just add your cleanup target to BUILD_TARGETS:
from SCons.Script import BUILD_TARGETS
BUILD_TARGETS.append('cleanup')
If you use this documented list of targets instead of undocumented functions, SCons won't be confused when doing its bookkeeping. This comment block can be found in SCons/Script/__init__.py:
# BUILD_TARGETS can be modified in the SConscript files. If so, we
# want to treat the modified BUILD_TARGETS list as if they specified
# targets on the command line. To do that, though, we need to know if
# BUILD_TARGETS was modified through "official" APIs or by hand. We do
# this by updating two lists in parallel, the documented BUILD_TARGETS
# list, above, and this internal _build_plus_default targets list which
# should only have "official" API changes. Then Script/Main.py can
# compare these two afterwards to figure out if the user added their
# own targets to BUILD_TARGETS.
So I guess the intended approach is to modify BUILD_TARGETS rather than call internal helper functions.
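As a minimal, untested sketch (using the foo, bar, and cleanup names from the question), the injection can also be made conditional on what the user actually asked for:
from SCons.Script import BUILD_TARGETS

# Only inject 'cleanup' when one of the triggering targets was requested,
# and avoid appending it twice.
if any(t in BUILD_TARGETS for t in ('foo', 'bar')) and 'cleanup' not in BUILD_TARGETS:
    BUILD_TARGETS.append('cleanup')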
One way is to have the gzip tool depend on the output of the log files. For example, if we have this C file, 'hello.c':
#include <stdio.h>

int main()
{
    printf("hello world\n");
    return 0;
}
And this SConstruct file:
#!/usr/bin/python
env = Environment()
hello = env.Program('hello', 'hello.c')
env.Default(hello)
env.Append(BUILDERS={'CreateLog':
                     Builder(action='$SOURCE.abspath > $TARGET', suffix='.log')})
log = env.CreateLog('hello', hello)
zipped_log = env.Zip('logs.zip', log)
env.Alias('cleanup', zipped_log)
Then running "scons cleanup" will run the needed steps in the correct order:
gcc -o hello.o -c hello.c
gcc -o hello hello.o
./hello > hello.log
zip(["logs.zip"], ["hello.log"])
This is not quite what you specified, but the only difference between this example and your requirement is that "cleanup" is the step that actually creates the zip file, so that is the step that you have to run. Its dependencies (running the program that generates the log, creating that program) are automatically calculated. You can now add the alias "foo" as follows to get the desired output:
env.Alias('foo', zipped_log)
In version 1.1.0.d20081104 of SCons, you can use the private internal SCons method:
SCons.Script._Add_Targets( [ 'MY_INJECTED_TARGET' ] )
If the user types:
% scons foo bar
The above code snippet will cause SCons to behave as though the user had typed:
% scons foo bar MY_INJECTED_TARGET
