SCons -- target that "deletes" a file?

Is there a simple way in SCons to create a target which is considered up-to-date if the named file is verified not to exist? (And of course, to have a builder which deletes the file if it does exist...)
For instance:
b_file = env.Command("files", ["file1", "file2", "file3"],
                     "build-files -o $TARGET $SOURCES")
env.Depends(b_file, env.FileMustNotExist("file4"))
The idea would be that before building "files", SCons would make sure not only that "file1", "file2", and "file3" are present, but that "file4" does not exist.
I know that for this particular case I could approximate what I want by adding "rm -f file4; " to the beginning of the command, but that's not exactly the same. In particular, if I add the "rm" command to the builder, and then create "file4" outside of SCons, re-running SCons won't delete "file4" unless one of the other sources has changed.
I'd want something such that creating "file4" and re-running SCons would result in simply deleting "file4" and indicating that "files" is now built.

To my knowledge this is not foreseen in the design, and therefore not possible. Note that SCons is a file-oriented build system, so its main task is to create files, not to delete them.
In newer versions there is the Pseudo target, which is intended for build actions that don't create an actual output file...but it's not able to delete files, if I remember correctly.
So, it looks as if the "rm -f file" strategy is still the best way to go. You might want to use Python's os.path.isfile and os.unlink, though, to stay portable across the different platforms.
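A minimal sketch of that idea in an SConstruct, with the deletion attached as a pre-action to the real target (the helper function name is made up for illustration). Note that it shares the limitation described in the question, because the pre-action only runs when "files" itself is rebuilt:

import os

env = Environment()  # assuming a plain construction environment

def remove_stale_file(target, source, env):
    # Delete file4 if it is present, so it never coexists with the built output.
    if os.path.isfile("file4"):
        os.unlink("file4")

b_file = env.Command("files", ["file1", "file2", "file3"],
                     "build-files -o $TARGET $SOURCES")
env.AddPreAction(b_file, remove_stale_file)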

Related

How do I mark a file as config(missingok) in fpm non-interactively?

I've been trying to use fpm to create an rpm, but have run into a problem. After I install the package, there are files I no longer need which are deleted in a post-install script in order to save space. Unfortunately, when the package is uninstalled, it complains about the files not being there, as they are still registered by the rpm as part of the package. When I looked into how to fix this via the rpm, I stumbled on the %config(missingok) macro, which seems ideal. However, it doesn't seem like there is a way to set this via fpm.
My current options for possible solutions are changing the --edit flag from using vi to edit the spec file to using a script (by setting the FPM_EDITOR variable), or touching the files in a pre-remove script to try and trick the rpm into thinking these problematic files still exist. Neither of these options is very appealing.
So my question is this: is there a way to use fpm to either a) remove the files from the rpm's "sight" post-install, or b) mark the files as %config(missingok) via fpm?
Without utilizing the two solutions above, of course.
The usual way of doing this is to rm -f these files at the end of the %install section, instead of doing it in the post-install scriptlet.
This way the useless files will not be packaged in the final rpm.
I have never packaged an rpm with fpm, but looking at the source code I see the command-line switches --exclude and --exclude-file, which should be the ones you're looking for:
option ["-x", "--exclude"], "EXCLUDE_PATTERN",
"Exclude paths matching pattern (shell wildcard globs valid here). " \
"If you have multiple file patterns to exclude, specify this flag " \
"multiple times.", :attribute_name => :excludes do |val|
excludes << val
next excludes
end # -x / --exclude
option "--exclude-file", "EXCLUDE_PATH",
"The path to a file containing a newline-sparated list of "\
"patterns to exclude from input."

how to distribute all files in a data directory with automake

I have some data files used in my package. There are 74 files in a directory. According to the automake manual, section 9.3 "Architecture-independent data files", I can list them individually:
dist_pkgdata_DATA = mydir/file1 mydir/file2 ..... mydir/file74
That's too much typing. Can anyone suggest a good solution?
There is a slightly different solution for such a problem, but it is essentially the same mechanism (listing all files manually). Since programmers are lazy, we don't want to type them all out. Furthermore, each time you need to add new files or remove old ones, you have to update the Makefile.am.
If you don't want to list files at all, the compressed-file approach with its install/uninstall hooks looks like a reasonable way to handle a lot of files, especially if they are essentially "write only".
If you want to have those files normally inspectable, you'll have to do something like:
MYDIR = mydir
EXTRA_DIST = $(MYDIR)

install-data-local:
    test -z $(pkgdatadir) || $(MKDIR_P) $(pkgdatadir)
    find "$(MYDIR)" -type f -exec $(INSTALL_DATA) {} $(pkgdatadir) \;
The "too much typing" part is due to the dist_ prefix -- automake is being told to package up those files for the distribution tarball. EXTRA_DIST on the other hand will include the entire contents of a directory into the tarball, but you have to specify what to do with those files later. And it really doesn't check, either. If mydir has 2 files or 74 files, it's all the same to automake.
Another way to solve it is to have a script run from bootstrap.sh (before autoreconf is invoked) generate a makefile fragment defining dist_pkgdata_DATA, which is then included in Makefile.am. This doesn't fix the "correct number of files" problem either.
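A minimal sketch of that generation step, assuming the fragment is called data.mk (the fragment name and the use of find here are illustrative):

# in bootstrap.sh, before autoreconf runs:
echo "dist_pkgdata_DATA = $(find mydir -type f | sort | tr '\n' ' ')" > data.mk

Makefile.am would then contain a line such as include $(srcdir)/data.mk.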

how to specify in a makefile where to send the results too

Right now I have a makefile that builds a .tex file (LaTeX) in the same directory as itself and spits out a PDF version of that file, along with a bunch of baggage. I was wondering how to specify in the makefile where to send the result; I want to send it to my desktop directory. Is this at all possible? Also, I used the clean rule to get rid of the auto-generated garbage files, but it still spits them out. Any help on that?
PDFLATEX=/usr/texbin/pdflatex
SOURCE=report_Template.tex
RESULT=report_Template.pdf

$(RESULT): $(SOURCE)
    $(PDFLATEX) $(SOURCE)
    $(PDFLATEX) $(SOURCE)

clean:
    rm -f $(RESULT) *.aux *.log *.toc *.out *~
This isn't something make has anything to do with. The commands you run put their output where you tell them to.
It appears that pdflatex creates the output in whatever directory you run it (and the makefile) from, i.e. next to the input in your case.
You can add a cp to the end of that rule to copy the file wherever you want, and/or see if pdflatex has an argument for specifying the output location.
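For instance, a sketch using pdflatex's -output-directory option plus a final cp; the OUTDIR and DESKTOP variables are illustrative, and the desktop path may differ on your system:

OUTDIR  = build
DESKTOP = $(HOME)/Desktop

$(OUTDIR)/$(RESULT): $(SOURCE)
    mkdir -p $(OUTDIR)
    $(PDFLATEX) -output-directory=$(OUTDIR) $(SOURCE)
    $(PDFLATEX) -output-directory=$(OUTDIR) $(SOURCE)
    cp $@ $(DESKTOP)/

This also keeps the .aux/.log/.toc baggage inside $(OUTDIR) instead of next to the sources.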
That being said, if your make target rule doesn't create exactly the target filename, that's a poor rule and might cause you trouble in larger make setups. (This is, in part, what .PHONY rules are for.)
clean is not magic. It is simply a target like any other. You need to run it for it to do anything.

Vim run project specific make with key command?

I'm working on a compiled project in Vim (TypeScript). :make % will build an individual file. I use this to check for errors. This is great for error checking, but it creates compiled files next to the source files that I don't need.
For my actual build process, I have a single command that compiles everything. This is in a Makefile.
I'd like to be able to map a key command to "build my whole project" in a generic way, so if I'm editing any .ts file underneath my project directory, it runs that specific command.
How can I do this?
The trick would be to actually use a Makefile:
all: complete.exe

complete.exe: *.ts
    somecompilation-command $^ -o $@
This way, you can just leave makeprg at 'make':
:set makeprg&
And happily do:
:mak
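To bind that to a key, a minimal sketch would be the following (the choice of <F5>, and the assumption that you start Vim from the project root where the Makefile lives, are both arbitrary):

" Build the whole project with <F5>; assumes the Makefile is in Vim's cwd.
nnoremap <F5> :make<CR>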

How do you force a makefile to rebuild a target?

I have a makefile that builds and then calls another makefile. Since this makefile calls more makefiles that do the actual work, it doesn't really change. Thus make keeps thinking the project is built and up to date.
dnetdev11 ~ # make
make: `release' is up to date.
How do I force the makefile to rebuild the target?
clean = $(MAKE) -f ~/xxx/xxx_compile.workspace.mak clean

build = svn up ~/xxx \
    $(clean) \
    ~/cbp2mak/cbp2mak -C ~/xxx ~/xxx/xxx_compile.workspace \
    $(MAKE) -f ~/xxx/xxx_compile.workspace.mak $(1) \

release:
    $(build )

debug:
    $(build DEBUG=1)

clean:
    $(clean)

install:
    cp ~/xxx/source/xxx_utility/release/xxx_util /usr/local/bin
    cp ~/xxx/source/xxx_utility/release/xxxcore.so /usr/local/lib
Note: Names removed to protect the innocent
Final Fixed version:
clean = $(MAKE) -f xxx_compile.workspace.mak clean;

build = svn up; \
    $(clean) \
    ./cbp2mak/cbp2mak -C . xxx_compile.workspace; \
    $(MAKE) -f xxx_compile.workspace.mak $(1); \

.PHONY: release debug clean install

release:
    $(call build,)

debug:
    $(call build,DEBUG=1)

clean:
    $(clean)

install:
    cp ./source/xxx_utillity/release/xxx_util /usr/bin
    cp ./dlls/Release/xxxcore.so /usr/lib
The -B switch to make, whose long form is --always-make, tells make to disregard timestamps and make the specified targets. This may defeat the purpose of using make, but it may be what you need.
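For instance, against the makefile in the question:

make -B release        # long form: make --always-make release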
You could declare one or more of your targets to be phony.
A phony target is one that is not really the name of a file; rather it is just a name for a recipe to be executed when you make an explicit request. There are two reasons to use a phony target: to avoid a conflict with a file of the same name, and to improve performance.

...

A phony target should not be a prerequisite of a real target file; if it is, its recipe will be run every time make goes to update that file. As long as a phony target is never a prerequisite of a real target, the phony target recipe will be executed only when the phony target is a specified goal.
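Applied to the makefile in the question, that amounts to adding one line near the top (as the "Final Fixed version" above already does):

.PHONY: release debug clean install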
One trick that used to be documented in a Sun manual for make is to use a (non-existent) target '.FORCE'. You could do this by creating a file, force.mk, that contains:
.FORCE:
$(FORCE_DEPS): .FORCE
Then, assuming your existing makefile is called makefile, you could run:
make FORCE_DEPS=release -f force.mk -f makefile release
Since .FORCE does not exist, anything that depends on it will be out of date and rebuilt.
All this will work with any version of make; on Linux, you have GNU Make and can therefore use the .PHONY target as discussed.
It is also worth considering why make considers release to be up to date. This could be because you have a touch release command in amongst the commands executed; it could be because there is a file or directory called 'release' that exists and has no dependencies and so is up to date. Then there's the actual reason...
Someone else suggested .PHONY, which is definitely correct. .PHONY should be used for any rule for which a date comparison between the input and the output is invalid. Since you don't have any targets of the form output: input, you should use .PHONY for ALL of them!
All that said, you probably should define some variables at the top of your makefile for the various filenames, and define real make rules that have both input and output sections, so you can use the benefits of make, namely that you'll only actually compile things that are necessary to compile!
Edit: added an example. Untested, but this is how you do .PHONY:

.PHONY: clean
clean:
    $(clean)
make clean deletes all the already compiled object files.
If I recall correctly, 'make' uses timestamps (file modification time) to determine whether or not a target is up to date. A common way to force a re-build is to update that timestamp, using the 'touch' command. You could try invoking 'touch' in your makefile to update the timestamp of one of the targets (perhaps one of those sub-makefiles), which might force Make to execute that command.
This simple technique will allow the makefile to function normally when forcing is not desired. Create a new target called force at the end of your makefile. The force target will touch a file that your default target depends on. In the example below, I have added touch yourProgram.cpp. I also added a recursive call to make. This will cause the default target to get made every time you type make force.
yourProgram: yourProgram.cpp
    g++ -o yourProgram yourProgram.cpp

force:
    touch yourProgram.cpp
    make
I tried this and it worked for me.
Add these lines to your Makefile:
clean:
    rm *.o output

new: clean
    $(MAKE)    # use the $(MAKE) variable instead of make to get recursive make calls
Save, and now call
make new
and it will recompile everything again.
What happened?
1) 'new' calls 'clean'. 'clean' does 'rm', which removes all object files that have the '.o' extension.
2) 'new' calls 'make'. 'make' sees that there are no '.o' files, so it creates all the '.o' files again; then the linker links all of the .o files into one executable, output.
Good luck
As abernier pointed out, there is a recommended solution in the GNU make manual, which uses a 'fake' target to force rebuilding of a target:
clean: FORCE
    rm $(objects)

FORCE: ;
This will run clean, regardless of any other dependencies.
I added the semicolon to the solution from the manual, otherwise an empty line is required.
As per Miller's Recursive Make Considered Harmful, you should avoid calling $(MAKE)! In the case you show, it's harmless, because this isn't really a makefile, just a wrapper script that might just as well have been written in shell. But you say you continue like that at deeper recursion levels, so you've probably encountered the problems shown in that eye-opening essay.
Of course, with GNU make it's cumbersome to avoid. And even though they are aware of this problem, it's their documented way of doing things.
OTOH, makepp was created as a solution for this problem. You can write your makefiles on a per-directory level, yet they all get drawn together into a full view of your project.
But legacy makefiles are written recursively, so there's a workaround where $(MAKE) does nothing but channel the subrequests back to the main makepp process. Only if you do redundant or, worse, contradictory things between your submakes must you request --traditional-recursive-make (which of course breaks this advantage of makepp). I don't know your other makefiles, but if they're cleanly written, then with makepp the necessary rebuilds should happen automatically, without the need for any of the hacks suggested here by others.
If you don't need to preserve any of the outputs you already successfully compiled:
nmake /A
rebuilds all.
It was already mentioned, but I thought I could add to using touch.
If you touch all the source files to be compiled, the touch command changes the timestamps of the files to the system time at which touch was executed.
The source file timestamp is what make uses to "know" that a file has changed and needs to be re-compiled.
For example: if the project were a C++ project, then do touch *.cpp, run make again, and make should recompile the entire project.
It actually depends on what the target is. If it is a phony target (i.e. the target is NOT related to a file), you should declare it as .PHONY.
If, however, the target is not a phony target but you just want to rebuild it for some reason (an example is when you use the __TIME__ preprocessing macro), you should use the FORCE scheme described in the answers here:
http://www.gnu.org/software/make/manual/html_node/Force-Targets.html#Force-Targets
On my Linux system (CentOS 6.2), there is a significant difference between declaring the target .PHONY and creating a fake dependency on FORCE, when the rule actually does create a file matching the target. When the file must be regenerated every time, it required both the fake dependency FORCE on the file, and .PHONY for the fake dependency.
wrong:

somefile:
    date > $@

right:

somefile: FORCE
    date > $@

FORCE:

.PHONY: FORCE
