I'm trying to update the files in a directory with a command like:
env.Command(Dir("./targetdir/"),
            ["./targetdir/file0", "./targetdir/file1", ...],
            "./somescript.sh $TARGET")
SCons keeps telling me that ./targetdir/ is up to date, even though I've modified ./targetdir/file0 by hand.
Isn't SCons supposed to know that, since one source file has changed, the command should be run? Is there something particular about the fact that the target is a directory?
I want to run the command ./somescript.sh ./targetdir/ whenever any of the files in ./targetdir/ changes. How can I do it?
The problem here is that you have no real target. SCons can't store dependency information without having both a target and a source.
So one solution is to use an explicit target.
mycmd = Command('some_target', [], ['script.sh targetdir', Touch('$TARGET')])
or
mycmd = Command('some_target', [], 'script.sh targetdir > $TARGET')
Depends(mycmd, Glob('targetdir/*'))
Now SCons has a target named some_target and knows that it depends on the files in targetdir. IMHO, the best way is to create a special builder/wrapper for this and use variant dirs to store the targets there.
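A rough sketch of that wrapper idea (the helper name DirCommand, the .stamp suffix and the target name below are my own assumptions, not something SCons provides out of the box):
def DirCommand(env, name, directory, action):
    # Use a stamp file as the real target so SCons has something to record,
    # and make it depend on everything currently inside the directory.
    stamp = env.Command(name + '.stamp', [], [action, Touch('$TARGET')])
    env.Depends(stamp, env.Glob(directory + '/*'))
    return stamp

env = Environment()
env.AddMethod(DirCommand)
mycmd = env.DirCommand('targetdir_processed', 'targetdir', './somescript.sh targetdir')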
I don't believe SCons likes the target to be a directory. You should instead specify the individual file(s) as the target.
As a side note, do you intend for the target and source to contain the same files? If this is for lack of an input file for "somescript.sh", typically you can just provide the script as the source. This way SCons will compare the target with the script, as opposed to the target with itself.
env.Command(target = "#targetdir/file0",
            source = "#somescript.sh",
            action = "#somescript.sh $TARGET")
Notice I use "#" in the path, which means relative to the root SConstruct.
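A hedged sketch of how that pattern might be extended to several files (the loop and the use of $SOURCE in the action are my additions; the file names come from the question):
env = Environment()
for name in ['file0', 'file1']:
    # The script is the source, so editing it also triggers a rebuild.
    env.Command(target = '#targetdir/' + name,
                source = '#somescript.sh',
                action = '$SOURCE $TARGET')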
Julia contains a number of methods for making temporary files and directories.
I'm making fairly heavy use of them (and /dev/shm) to interface with libraries that really want to work with actual files (JLD/HDF5, and OpenStack Swift).
I had been assuming they would be deleted when the finalizers on the pointer to their name were called.
But then after exiting Julia it seemed like they were all still there.
Will Linux delete them?
If the app didn't clean up after itself, the OS will delete the files eventually. It depends on system settings when temp files are deleted. For example, it can happen on boot or nightly (via a cron job) or in some other way.
See this answer, for example: How is the /tmp directory cleaned up?
What you are likely looking for, given your surprise that they were not removed on going out of scope, is the do-block versions of mktemp. They are in the very documentation you linked.
mktemp(f::Function[, parent=tempdir()])
Apply the function f to the result of mktemp(parent) and remove the temporary file upon completion.
mktempdir(f::Function[, parent=tempdir()])
Apply the function f to the result of mktempdir(parent) and remove the temporary directory upon completion.
Which you can use like:
mktempdir("/dev/shm") do tdir
    fname = joinpath(tdir, name)  # `name` is whatever file name you want inside the tempdir
    # Do some things with your new temp filename `fname` in your tempdir `tdir`
end
# the directory referenced by `tdir`, and `fname`, have now been deleted.
I have created a package and am now creating my tests within the package. For one test my inputs are a set of files and my outputs will be a different set of files created within the test.
I am saving the input files in the test directory of my package and would like to save the output files there too. Since others may run this test, I do not want to specify the input/output file location using my own path, e.g. /home/myname/.julia/v4.0/MyPackage/test/MyInputFile.txt
How do I specify that the input location is within the package's test folder?
So basically, how do I tell Julia to look in the package's folder under the test directory and not have to worry about specifying the entire path, including the user name etc.?
For example currently I have to say
readtable("/home/myname/.julia/v4.0/MyPackage/test/MyInputFile.txt", separator = '\t', header = false)
But I'd like to just be able to say
readtable("/MyPackage/test/MyInputFile.txt", separator = '\t', header = false)
so that no matter who the user of the package is and where they may store the package, they can still run the test?
I know that LOAD_PATH gives the path Julia looks for packages but I can't find any information on where it looks when importing files.
joinpath(Pkg.dir("MyPackage"), "test") is what you need.
As @GnimucK mentioned in a comment, a better solution is
dirname(@__FILE__)
Why is this better? A package could be installed and used from somewhere else (not the standard package directory). Pkg.dir is "stupid" and does not know better. This is rare, of course, and in most cases it won't matter.
I have some log files generated after each file is compiled.
I am making SCons aware of these files by using an emitter attached to the builder that I'm using to compile that file.
Unfortunately, because I am deleting the empty log files after each build, SCons recompiles the source files because the log files are missing.
I would like to ignore these 'side effect' files using SCons Ignore function.
In my emitter I am doing something like this:
def compiler_emitter(target, source, env):
    target.append(env.File(source[0].name.split('.')[0] + env['ERRSUFFIX']))
    env.Ignore(source[0], target[1])
    return target, source
As a note I always pass only one file to my builder.
In my case the Ignore function is not working.
What would be the best approach to solve this problem in a 'SCons way'?
Try using env.SideEffect() instead of Ignore:
SideEffect(side_effect, target) , env.SideEffect(side_effect, target)
Declares side_effect as a side effect of building target. Both
side_effect and target can be a list, a file name, or a node. A side
effect is a target file that is created or updated as a side effect of
building other targets. For example, a Windows PDB file is created as
a side effect of building the .obj files for a static library, and
various log files are created and updated as side effects of various TeX
commands. If a target is a side effect of multiple build commands,
scons will ensure that only one set of commands is executed at a time.
Consequently, you only need to use this method for side-effect targets
that are built as a result of multiple build commands.
Because multiple build commands may update the same side effect file,
by default the side_effect target is not automatically removed when
the target is removed by the -c option. (Note, however, that the
side_effect might be removed as part of cleaning the directory in
which it lives.) If you want to make sure the side_effect is cleaned
whenever a specific target is cleaned, you must specify this
explicitly with the Clean or env.Clean function.
http://scons.org/doc/production/HTML/scons-man.html
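Based on that, a minimal sketch of the emitter from the question rewritten to use SideEffect instead of Ignore (the Clean call is an optional extra so the log file is removed on scons -c; ERRSUFFIX and the single-source assumption are taken from the question):
import os

def compiler_emitter(target, source, env):
    # Derive the log file name from the single source file, as in the question.
    log_file = env.File(os.path.splitext(source[0].name)[0] + env['ERRSUFFIX'])
    # Declare it as a side effect of the real target instead of an extra target,
    # so a missing log file does not trigger a rebuild.
    env.SideEffect(log_file, target[0])
    # Optionally remove it when the real target is cleaned.
    env.Clean(target[0], log_file)
    return target, source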
I'm trying to set up a build system involving a code generator. The exact files generated are unknown until after the generator is run, but I'd like to be able to run further build steps by pattern matching (run some program on all files with some extension). Is this possible?
Some of the answers here involving code generation seem to assume that the output is known or a listing of generated files is created. This isn't impossible in my case, but I'd like to avoid it since it makes things more complicated.
https://bitbucket.org/scons/scons/wiki/DynamicSourceGenerator seems to indicate that it's possible to add additional targets during Builder actions, but while I could get the build to run and list the generated files, any build steps introduced don't run.
https://bitbucket.org/scons/scons/wiki/NonDeterministicDependencies uses Scanners to add build steps. I put a glob(...) in a scanner, and it succeeds in detecting the generated files, but the files are inexplicably deleted before it actually runs the dependent step.
Is this use case possible? And why is SCons deleting my generated files?
A toy example
source (the file referenced in SConscript)
An example generator, constructs 3 files (not easily known to the build system) and puts them in the argument folder
echo "echo 1" > $1/gen1.txt
echo "echo 2" > $1/gen2.txt
echo "echo 3" > $1/gen3.txt
SConstruct
Just sets up a variant_dir
SConscript('SConscript', variant_dir='build')
SConscript
The goal is for it to:
"Compile" the generator (in this toy example, just copies a file called 'source' and adds execute permissions
Run the "compiled" generator ('source' is a script that generates files)
Perform some operation on each of those generated files by extension. This example just runs the "compile" copy operation on them (for simplicity).
env = Environment()
env.Append(BUILDERS = {'ExampleCompiler' :
                       Builder(action=[Copy('$TARGET', '$SOURCE'),
                                       Chmod('$TARGET', 0o755)])})
generator = env.ExampleCompiler('generator', 'source')
env.Append(BUILDERS = {'GeneratorRun' :
                       Builder(action=[Mkdir('$TARGET'),
                                       '$SOURCE $TARGET'])})
generated_dir = env.GeneratorRun(Dir('generated'), generator)
Everything's fine up to here, where all the targets are explicitly known to the build system ahead of time.
Attempting to use this block of code to glob over the generated files causes SCons to delete (!!) the generated files:
for generated in generated_dir[0].glob('*.txt'):
    generated_run = env.ExampleCompiler(generated.abspath + '.sh', generated)
Attempting to use an action to update the build tree results in additional actions not being run:
def generated_scanner(target, source, env):
    for generated in source[0].glob('*.txt'):
        print("scanned " + generated.abspath)
        generated_target = env.ExampleCompiler(generated.abspath + '.sh', generated)
        Alias('TopLevelAlias', generated_target)

env.Append(BUILDERS = {'GeneratedOperation' :
                       Builder(action=[generated_scanner])})
dummy = env.GeneratedOperation(generated_dir[0].File('#dummy'), generated_dir)
Alias('TopLevelAlias', dummy)
The Alias operations are suggested in the dynamic source generator guide above, but don't seem to do anything. The prints do execute and indicate that the action gets run.
Running some build pattern on special file extensions is possible with SCons. For C/CPP files this is the preferred scheme, for example:
env = Environment()
env.Program('main', Glob('*.cpp'))
The main task of SCons, as a build system, is to do the minimum amount of work such that all your targets are up-to-date. This makes things complicated for the use case you've described above, because it's not clear how you can reach a "stable" situation where no generated files are added and all targets are built.
You're probably better off using a simple Python script directly... I really don't see how using SCons (or any other build system, for that matter) is mission-critical in this case.
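For illustration, a rough sketch of what such a plain Python script could look like for the toy example above (file names follow the example; the mtime check is just one simple way to skip up-to-date outputs):
import glob
import os
import shutil
import subprocess

# Run the generator into the 'generated' directory.
os.makedirs('generated', exist_ok=True)
subprocess.check_call(['sh', './source', 'generated'])

# "Compile" every generated .txt file, mirroring the toy example's copy step.
for txt in glob.glob('generated/*.txt'):
    out = txt + '.sh'
    if not os.path.exists(out) or os.path.getmtime(out) < os.path.getmtime(txt):
        shutil.copy(txt, out)
        os.chmod(out, 0o755)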
Edit:
At some point you have to tell SCons about the created files (*.txt in your example above), and for tracking all dependencies properly, the list of *.txt files has to be complete. This is the task of the Emitter within SCons, which is responsible for returning the list of resulting target and source files for a Builder call. Note that these files don't have to exist physically during the "parse" phase of SCons. Please also have a look at my answer to Scons: create late targets, which goes into some more detail.
Once you have a proper Emitter in place (see also https://bitbucket.org/scons/scons/wiki/ToolsForFools, "Using Emitters") you should be able to use the Glob('*.txt') call, which will detect and track your created files automatically.
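A minimal sketch of such an emitter for the toy example, assuming the generator's output names (gen1.txt to gen3.txt) can be enumerated up front (this enumeration is exactly the part the question wanted to avoid, but it is what makes the dependency list complete):
def generator_emitter(target, source, env):
    gen_dir = target[0]
    # Declare every file the generator will create as an explicit target, so
    # SCons can track it even though it does not exist yet at parse time.
    generated = [gen_dir.File('gen%d.txt' % i) for i in (1, 2, 3)]
    return target + generated, source

env.Append(BUILDERS = {'GeneratorRun' :
                       Builder(action=[Mkdir('$TARGET'),
                                       '$SOURCE $TARGET'],
                               emitter=generator_emitter)})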
Finally, on our page "Talks and Slides" ( https://bitbucket.org/scons/scons/wiki/TalksAndSlides ) you can find my talk from PyCon FR 2014, "Why SCons is Not Slow", which briefly explains how SCons works internally. This might be helpful in understanding this problem better and coming up with a full solution.
I know that whatever data is placed in package/component dir/data will be copied to the install directory. What I mean is: if I have a binary, a readme and a license.txt at package/component dir/data/myapp, package/component dir/data/readme and package/component dir/data/license.txt, and I choose my target installation dir to be “/opt/myfirstapp”, then inside /opt/myfirstapp I will have the 3 files copied: myapp, readme and license.txt.
Having said that, I also have a “/usr” directory within package/component dir/data/; however, this is not the standard “/usr” inside root “/”, it is just a replica. Now inside my replica “/usr” I have some directory hierarchy and some files, like /usr/bin/myapp, /usr/lib/libmyapp.so, /usr/share/icons and many more, in fact a lot. Now I want the replica “/usr” content to be copied to “/usr” (the original usr inside the root folder). I should also make sure that I only add new content to “/usr” (the root /usr) and do not delete any existing content.
The question is clear: some files inside my data directory will have to go to the target install dir, but some selected ones (for example /usr) will have to be copied to other paths. How do I achieve this?
Currently we have the same problem in my company: we need 2 target directories, one for the exe and one for the libraries (well, it's a bit more complex, but in a few words...).
After having spoken with Qt support and gotten the answer that it's actually not possible ("It is possible only after extracting. After extraction, you can use copy or move operation, unfortunately there is currently no other way."), I decided to use the AdminTargetDir as the second target directory. This is because there's no other way to pass dynamic variables to the IFW. So after installation I call a "finalizeInstall_patch.bat" file, passing the TargetDir and AdminTargetDir, and this moves the libraries directory from TargetDir to AdminTargetDir. Why a .bat patch file? Because it's actually not possible to move a directory using the methods provided by the IFW. Qt support just opened a suggestion ticket for our problem: https://bugreports.qt-project.org/browse/QTIFW-595
I hope that this answer will help others with similar problems.
NOTE: There is a way to move a directory (on Windows) by calling addOperation("Execute", "cmd /C move source dest..."), but this brings other problems that are off-topic here.
This worked for us (Qt Installer, macOS):
var args = ["cp", "-R", "@TargetDir@/MyApp.app", "/Applications"];
component.addOperation("Execute", args);