I have a makefile that builds and then calls another makefile. Since this makefile calls more makefiles that do the work, it doesn't really change itself, so make keeps thinking the project is built and up to date.
dnetdev11 ~ # make
make: `release' is up to date.
How do I force the makefile to rebuild the target?
clean = $(MAKE) -f ~/xxx/xxx_compile.workspace.mak clean
build = svn up ~/xxx \
    $(clean) \
    ~/cbp2mak/cbp2mak -C ~/xxx ~/xxx/xxx_compile.workspace \
    $(MAKE) -f ~/xxx/xxx_compile.workspace.mak $(1) \
release:
    $(build )
debug:
    $(build DEBUG=1)
clean:
    $(clean)
install:
    cp ~/xxx/source/xxx_utility/release/xxx_util /usr/local/bin
    cp ~/xxx/source/xxx_utility/release/xxxcore.so /usr/local/lib
Note: Names removed to protect the innocent
Final Fixed version:
clean = $(MAKE) -f xxx_compile.workspace.mak clean;
build = svn up; \
    $(clean) \
    ./cbp2mak/cbp2mak -C . xxx_compile.workspace; \
    $(MAKE) -f xxx_compile.workspace.mak $(1); \
.PHONY: release debug clean install
release:
    $(call build,)
debug:
    $(call build,DEBUG=1)
clean:
    $(clean)
install:
    cp ./source/xxx_utillity/release/xxx_util /usr/bin
    cp ./dlls/Release/xxxcore.so /usr/lib
The -B switch to make, whose long form is --always-make, tells make to disregard timestamps and make the specified targets. This may defeat the purpose of using make, but it may be what you need.
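For example, with the makefile from the question:

    make -B release        # equivalently: make --always-make release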
You could declare one or more of your targets to be phony.
A phony target is one that is not really the name of a file; rather it
is just a name for a recipe to be executed when you make an explicit
request. There are two reasons to use a phony target: to avoid a
conflict with a file of the same name, and to improve performance.
...
A phony target should not be a prerequisite of a real target file; if
it is, its recipe will be run every time make goes to update that
file. As long as a phony target is never a prerequisite of a real
target, the phony target recipe will be executed only when the phony
target is a specified goal.
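A small made-up sketch of that last point: if a real file lists a phony target as a prerequisite, its recipe runs on every invocation of make:

    .PHONY: version
    version:
        @echo "regenerating version info"

    # app.o is a real file, but because the phony target 'version' is one of
    # its prerequisites, this recipe runs every time make is invoked
    app.o: app.c version
        cc -c app.c -o app.o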
One trick that used to be documented in a Sun manual for make is to use a (non-existent) target '.FORCE'. You could do this by creating a file, force.mk, that contains:
.FORCE:
$(FORCE_DEPS): .FORCE
Then, assuming your existing makefile is called makefile, you could run:
make FORCE_DEPS=release -f force.mk -f makefile release
Since .FORCE does not exist, anything that depends on it will be out of date and rebuilt.
All this will work with any version of make; on Linux, you have GNU Make and can therefore use the .PHONY target as discussed.
It is also worth considering why make considers release to be up to date. This could be because you have a touch release command in amongst the commands executed; it could be because there is a file or directory called 'release' that exists and has no dependencies and so is up to date. Then there's the actual reason...
Someone else suggested .PHONY which is definitely correct. .PHONY should be used for any rule for which a date comparison between the input and the output is invalid. Since you don't have any targets of the form output: input you should use .PHONY for ALL of them!
All that said, you probably should define some variables at the top of your makefile for the various filenames, and define real make rules that have both inputs and outputs, so you can use the benefits of make, namely that it will only compile the things that actually need to be compiled!
Edit: added example. Untested, but this is how you do .PHONY
.PHONY: clean
clean:
    $(clean)
make clean deletes all the already compiled object files.
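As a rough sketch of those "real rules" (file names are made up, just to illustrate the input/output structure):

    CXX     := g++
    OBJECTS := main.o util.o
    TARGET  := xxx_util

    $(TARGET): $(OBJECTS)
        $(CXX) -o $@ $(OBJECTS)

    # each object depends on its source, so only changed files are rebuilt
    %.o: %.cpp
        $(CXX) -c $< -o $@

    .PHONY: clean
    clean:
        rm -f $(TARGET) $(OBJECTS)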
If I recall correctly, 'make' uses timestamps (file modification time) to determine whether or not a target is up to date. A common way to force a re-build is to update that timestamp, using the 'touch' command. You could try invoking 'touch' in your makefile to update the timestamp of one of the targets (perhaps one of those sub-makefiles), which might force Make to execute that command.
This simple technique will allow the makefile to function normally when forcing is not desired. Create a new target called force at the end of your makefile. The force target will touch a file that your default target depends on. In the example below, I have added touch yourProgram.cpp. I also added a recursive call to make. This will cause the default target to get made every time you type make force.
yourProgram: yourProgram.cpp
    g++ -o yourProgram yourProgram.cpp

force:
    touch yourProgram.cpp
    make
I tried this and it worked for me. Add these lines to your Makefile:
clean:
    rm *.o output

new: clean
    $(MAKE)    # use the variable $(MAKE) instead of make to get recursive make calls
Save it, and now call
make new
and it will recompile everything again
What happened?
1) 'new' calls 'clean'.
'clean' does 'rm', which removes all the object files that have the '.o' extension.
2) 'new' calls 'make'.
'make' sees that there are no '.o' files, so it creates all the '.o' files again. Then the linker links all of the .o files into one executable output.
Good luck
As abernier pointed out, there is a recommended solution in the GNU make manual, which uses a 'fake' target to force rebuilding of a target:
clean: FORCE
    rm $(objects)

FORCE: ;
This will run clean, regardless of any other dependencies.
I added the semicolon to the solution from the manual, otherwise an empty line is required.
As per Miller's Recursive Make Considered Harmful you should avoid calling $(MAKE)! In the case you show, it's harmless, because this isn't really a makefile, just a wrapper script, that might just as well have been written in Shell. But you say you continue like that at deeper recursion levels, so you've probably encountered the problems shown in that eye-opening essay.
Of course with GNU make it's cumbersome to avoid. And even though they are aware of this problem, it's their documented way of doing things.
OTOH, makepp was created as a solution for this problem. You can write your makefiles on a per directory level, yet they all get drawn together into a full view of your project.
But legacy makefiles are written recursively, so there's a workaround where $(MAKE) does nothing but channel the subrequests back to the main makepp process. Only if you do redundant or, worse, contradictory things between your submakes do you need to request --traditional-recursive-make (which of course breaks this advantage of makepp). I don't know your other makefiles, but if they're cleanly written, then with makepp the necessary rebuilds should happen automatically, without the need for any of the hacks suggested here by others.
If you don't need to preserve any of the outputs you already successfully compiled
nmake /A
rebuilds all
It was already mentioned, but I thought I could add to using touch.
If you touch all the source files to be compiled, the touch command changes their timestamps to the system time at which touch was executed.
The source file timestamp is what make uses to "know" that a file has changed and needs to be recompiled.
For example: If the project was a c++ project, then do touch *.cpp, then run make again, and make should recompile the entire project.
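In other words, something along these lines:

    touch *.cpp    # mark every source file as modified
    make           # make now considers the objects out of date and recompiles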
It actually depends on what the target is. If it is a phony target (i.e. the target is NOT related to a file) you should declare it as .PHONY.
If, however, the target is not a phony target but you just want to rebuild it for some reason (an example is when you use the __TIME__ preprocessor macro), you should use the FORCE scheme described in other answers here:
http://www.gnu.org/software/make/manual/html_node/Force-Targets.html#Force-Targets
On my Linux system (CentOS 6.2), there is a significant difference between declaring the target .PHONY and creating a fake dependency on FORCE, when the rule actually does create a file matching the target. When the file must be regenerated every time, it requires both the fake dependency FORCE on the file, and .PHONY for the fake dependency.
wrong:

    result_file:
        date > $@

right:

    # 'result_file' stands in for whatever file the rule creates
    result_file: FORCE
        date > $@

    FORCE:
    .PHONY: FORCE
This is the code that I have written in my makefile. For some reason it is not letting me run make. When I type "make findName", I get "make: 'findName' is up to date."
findName: findName.cpp
    g++ -g findName.cpp -o findName

clean:
    /bin/rm -f findName *.o

backup:
    #tar -zcvf bbrown.assignment4_1.tar.gz *.cpp *.h makeFile readme # will make a tar.gz
    tar -cvf bbrown.findName.tar *.cpp *.sh makeFile readme
A message like "make: 'target' is up to date." means that make has decided it doesn't need to run any commands, based on the timestamps of files involved. The make program considers files (and phony targets) to have a tree of prerequisites, and commands associated with creating a file will only be run if the file doesn't exist or has an older timestamp than one of its prerequisites. In big projects, this helps avoid completely rebuilding everything which could take a lot of time, when only a few source files have actually changed: make will determine which commands are actually needed. But this does require setting up the Makefile with accurate prerequisites.
Your Makefile specifies that the file findName has one prerequisite: findName.cpp. If make successfully creates findName and you then do nothing else but type make again, you'll see the "up to date" message: this is a feature. If you edit and save findName.cpp and then run make, it will repeat the g++ command.
But suppose you also have some header files you're including from findName.cpp, say findName.h. If you edit and save findName.h and then run make, you'll get "up to date", since make doesn't know that findName.h affects findName. You would need to add the prerequisite to fix this:
findName: findName.cpp findName.h
    g++ -g findName.cpp -o findName
There are various ways to automatically deal with header dependencies like that, but they get into more advanced use of make, and/or using other build tools.
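One sketch of such an approach, assuming GCC/Clang's -MMD and -MP options (which write a .d dependency file for each object as a side effect of compilation):

    SRCS := findName.cpp
    OBJS := $(SRCS:.cpp=.o)
    DEPS := $(OBJS:.o=.d)

    findName: $(OBJS)
        g++ -g -o $@ $(OBJS)

    %.o: %.cpp
        g++ -g -MMD -MP -c $< -o $@

    # pull in the generated .d files, which list each object's header prerequisites
    -include $(DEPS)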
I find that I end up in this situation on Ubuntu often, and I was wondering if there's a neat way to solve it.
Suppose I am writing some C++ programs, say a.cpp, b.cpp and c.cpp. During testing, I generate a lot of related files like a.out, .a.out~, .a.un~, etc. If at some point later I realise that I no longer need a.cpp, I can perform rm a.cpp, but then I am left with a clutter of associated junk that is no longer relevant.
I am aware that I can perform a periodic rm .*.un~, but I'm hoping for a better way. Is there a way I can get rm to prompt me at the point of deleting a.cpp with something like
rm: remove regular file 'a.cpp'? y
rm: remove associated file '.a.un~' too?
which I can then say 'y' or 'n' to?
This is an example Makefile.
all:
    g++ -o a a.cpp

clean:
    rm -f *~
Make sure to keep the tab characters in front of the g++ and rm lines.
Compile with make all and clean up with make clean.
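If the leftover files include hidden ones like .a.un~ (as in the question), the glob has to match them explicitly; a rough sketch of a broader clean rule:

    clean:
        rm -f a.out *~ .*.un~ .*.out~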
Is there a simple way in SCons to create a target which is considered up-to-date if the named file is verified not to exist? (And of course, to have a builder which deletes the file if it does exist...)
For instance:
b_file = env.Command("files", ["file1", "file2", "file3"],
"build-files -o $TARGET $SOURCES")
env.Depends(b_file,env.FileMustNotExist("file4"))
The idea would be that before building "files", SCons would make sure not only that "file1", "file2", and "file3" are present, but that "file4" does not exist.
I know that for this particular case I could approximate what I want by adding "rm -f file4; " to the beginning of the command, but that's not exactly the same. In particular, if I add the "rm" command to the builder, and then create "file4" outside of SCons, re-running SCons won't delete "file4" unless one of the other sources has changed.
I'd want something such that creating "file4" and re-running SCons would result in simply deleting "file4" and indicating that "files" is now built.
To my knowledge this is not foreseen in the design, and therefore not possible. Note that SCons is a file-oriented build system, so its main task is to create files, not to delete them.
In newer versions there is the Pseudo target, which is intended for build actions that don't create an actual output file...but it's not able to delete files, if I remember correctly.
So, it looks as if the "rm -f file" strategy is still the best way to go. You might want to use Python's os.path.isfile and os.unlink though, to stay portable across platforms.
Right now I have a makefile that builds the .tex file (LaTeX) in the same directory as it and spits out a PDF version of that file, along with a bunch of baggage. I was wondering how to specify in the makefile where to send the result; I want to send the results to my desktop directory. Is this at all possible? Also, I used the clean target to get rid of the auto-generated garbage files, but it still spits them out. Any help on that?
PDFLATEX=/usr/texbin/pdflatex
SOURCE=report_Template.tex
RESULT=report_Template.pdf
$(RESULT): $(SOURCE)
    $(PDFLATEX) $(SOURCE)
    $(PDFLATEX) $(SOURCE)

clean:
    rm -f $(RESULT) *.aux *.log *.toc *.out *~
This isn't something make has anything to do with. The commands you run put their output where you tell them to.
It appears that pdflatex creates the output next to the input, i.e., in whatever directory you run it (and the makefile) from.
You can add a cp to the end of that rule to copy the file wherever you want and/or see if pdflatex has an argument that can be given for output filename.
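For example, a sketch of the copy approach (assuming the desktop lives at $(HOME)/Desktop):

    DEST = $(HOME)/Desktop

    $(RESULT): $(SOURCE)
        $(PDFLATEX) $(SOURCE)
        $(PDFLATEX) $(SOURCE)
        cp $(RESULT) $(DEST)/

Alternatively, pdflatex has an -output-directory=DIR option that writes the PDF (and the auxiliary files) elsewhere, though then the rule's target name no longer matches the file it actually produces.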
That being said, if your make target rule doesn't create the target filename exactly, that's a poor rule and might cause you trouble in larger make setups. (This is, in part, what .PHONY rules are for.)
clean is not magic. It is simply a target like any other. You need to run it for it to do anything.
I have a few header files in the /my/path/to/file folder. I know how to include these files in a new C program, but every time I need to type the full path to the header file before including it. Can I set some path variable in Linux so that it automatically looks for header files?
You could create a makefile. A minimal example would be:
INC_PATH=/my/path/to/file
CFLAGS=-I$(INC_PATH)
all:
    gcc $(CFLAGS) -o prog src1.c src2.c
From here you could improve this makefile in many ways. The most important, probably, would be to state compilation dependencies (so only modified files are recompiled).
As a reference, here you have a link to the GNU make documentation.
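A sketch of that improvement, reusing the hypothetical file names from above (each object is rebuilt only when its own source changes):

    INC_PATH = /my/path/to/file
    CFLAGS   = -I$(INC_PATH)

    prog: src1.o src2.o
        gcc -o prog src1.o src2.o

    src1.o: src1.c
        gcc $(CFLAGS) -c src1.c

    src2.o: src2.c
        gcc $(CFLAGS) -c src2.c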
If you do not want to use makefiles, you can always set an environment variable to make it easier to type the compilation command:
export MY_INC_PATH=/my/path/to/file
Then you could compile your program like:
gcc -I${MY_INC_PATH} -o prog src1.c src2.c ...
You may want to define the MY_INC_PATH variable in your .bashrc file, or, probably better, create a file in a handy place containing the variable definition. Then you could use source to set that variable in the current shell:
source env.sh
I think, however, that using a makefile is a much preferable approach.
There is a similar question that is likely better solved (if you are interested in a permanent solution): https://stackoverflow.com/a/558819/1408096
Try setting C_INCLUDE_PATH (for C header files) or CPLUS_INCLUDE_PATH (for C++ header files).
Kudos: jcrossley3
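For example (a sketch; GCC reads these variables as additional include search paths):

    export C_INCLUDE_PATH=/my/path/to/file
    gcc -o prog src1.c src2.c    # headers in /my/path/to/file are now found without -I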
I'm not in Linux right now and I can't be bothered to reboot to check if everything's right, but have you tried making symbolic links? For example, if you are on Ubuntu:
$ cd /usr/include
$ sudo ln -s /my/path/to/file mystuff
So then when you want to include stuff, you can use:
#include <mystuff/SpamFlavours.h>