find_dependency(Threads) or include(FindThreads) in a package config file - multithreading

In CMake, we can use find_dependency() in a package's -config.cmake file, since it "forwards the correct parameters for QUIET and REQUIRED which were passed to the original find_package() call." So, naturally, we'll want to do that instead of calling find_package() in such files.
Also, for a dependency on a threads library, CMake offers us the FindThreads module, so that we write include(FindThreads), preceded by some preference variables, and get a bunch of useful variables set. So, that seems preferable to find_package(Threads).
And thus we have a dilemma: What to put in -config.cmake files, for a threads library dependency? The former, or the latter?

Following a discussion in comments with @Tsyarev, it seems that:
find_package(Threads) includes the FindThreads module internally.
... which means it "respects" the preference variables affecting FindThreads behavior.
so it makes sense, functionally and aesthetically, to just use find_package() in your main CMakeLists.txt and find_dependency() in -config.cmake.
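For instance, a minimal sketch, assuming a hypothetical package MyPackage that exports a target mylib:

# In the package's own CMakeLists.txt
set(THREADS_PREFER_PTHREAD_FLAG ON)   # preference variable honored by FindThreads
find_package(Threads REQUIRED)
target_link_libraries(mylib PUBLIC Threads::Threads)

# In MyPackageConfig.cmake
include(CMakeFindDependencyMacro)
find_dependency(Threads)
include("${CMAKE_CURRENT_LIST_DIR}/MyPackageTargets.cmake")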

Related

Scons command/explicit dependency

I have a code snippet similar to this:
# Compile protobuf headers
env.Protoc(...)
# Move headers to 'include' (compiled via protobuf)
env.Command([include headers...], [headers...], move_func)
# Compile program (depends on 'include' files)
out2 = SConscript('src/SConscript')
Depends(out2, [include headers...])
Basically, I have Protoc() compiling the protobuf files, then the headers are moved to the 'include' directory by env.Command(), and finally the program is compiled through a SConscript file in the 'src' directory.
Since the files being moved are headers (which the src compilation depends on), SCons does not register them as an explicit dependency, as far as I understand. Thus, the compilation runs before the header files have been moved, and it fails. I have tried exposing the dependency via Depends() and Requires(), without success.
I understand that in the usual case, scons should "figure-out" dependencies, but I don't know how it could do that here.
Thanks!
You seem to be thinking in "make" ways about your build process, which is the wrong approach when using SCons. You can't order single build steps by putting them in different SConscripts and then including those in a special order. You have to define proper dependencies between your actual sources (C/CPP files, for example) and a target like a program or PDF file. Then SCons is able to figure out the correct build order, and it will traverse the folder structure of your project automatically, entering subfolders more than once if the dependency graph (DAG) dictates this. Defining this kind of dependency between inputs and outputs is usually done using a Builder, and in your case the Install() builder would be a good fit. Please also see the hints for question #2 in the list of "most frequently-asked FAQs" ( https://bitbucket.org/scons/scons/wiki/FrequentlyAskedQuestions ).
Further, I can only recommend reading a little more of the UserGuide ( http://www.scons.org/doc/production/HTML/scons-user.html ) to get a better feeling for how to do things in a more "SConsy" way. If you get stuck, feel free to ask further questions on our mailing list at scons-users@scons.org (see http://www.scons.org/lists.php ).
Finally, if you have a lot of steps that you want to execute serially, and that don't require any special input/output files, SCons is probably not the right tool for your current task. It's designed as a file-oriented build system with automatic parallelization in mind; a simple (Python?) script might be better suited to the purely serial stuff...
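To sketch the Install() approach (the file names here are invented, and env.Protoc stands for the builder already used in the question):

# SConstruct (sketch)
env = Environment()

# Generate sources/headers from the .proto file
generated = env.Protoc('proto/message.proto')

# Copy the generated headers into 'include' via the Install builder,
# so the copies are real targets with recorded dependencies
headers = env.Install('include', [f for f in generated if str(f).endswith('.h')])

# Build the program; CPPPATH plus the C scanner lets SCons see the
# dependency on the installed headers, and it can also be stated explicitly
prog = env.Program('myprog', ['src/main.cpp'], CPPPATH=['include'])
env.Depends(prog, headers)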

How to prevent dead-code removal of utility libraries in Haxe?

I've been tasked with creating conformance tests of user input; the task is fairly tricky and we need very high levels of reliability. The server runs on PHP, the client runs on JS, and I thought Haxe might reduce the duplicated work.
However, I'm having trouble with dead-code removal. Since I am just creating helper functions (utilObject.isMeaningOfLife(42)), I don't have a main program that calls each one. I tried adding @:keep to a utility class, but it was cut out anyway.
I tried to specify that utility class through the -main switch, but I had to add a dummy main() method and this doesn't scale beyond that single class.
You can force all the classes defined in a given package and its sub-packages to be included in the build using a compiler argument:
haxe --macro "include('my.package')" ...
This is a shortcut to the macro.Compiler.include function.
As you can see from the signature, this function allows you to include recursively and also to exclude packages.
static include (pack:String, rec:Bool = true, ?ignore:Array<String>, ?classPaths:Array<String>):Void
I think you don't have to use @:keep for each library class in that case.
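For example (the package, class, and output names here are made up), a sketch could look like this:

// myutils/UtilObject.hx -- no per-class @:keep needed when the package is included
package myutils;

class UtilObject {
    public static function isMeaningOfLife(n:Int):Bool {
        return n == 42;
    }
}

compiled with something like:

haxe --macro "include('myutils')" -cp . -js util.js -dce std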
I'm not sure if this is what you are looking for, I hope it helps.
Otherwise, these checks could be helpful:
Is it actually bad that the code is cut away if you don't use it?
Could it be that some of the code is simply inlined in the final output?
Compile your code using the compiler flag -dce std, as mentioned in the comments.
If you use the static analyzer, try building without it.
Add @:keep and reference the class and function somewhere.
Otherwise, provide a minimal setup if you can reproduce the issue.

Avoiding implicit precompiled header dependencies?

We are using a precompiled header to include library files such as Boost and Windows.
Our precompiled.h is included explicitly at the top of each .cpp file in order to work with the precompiled header options (/Yc, /Yu, and /Fp). I accepted that as necessary.
Recently, however, I found /FI, which forces an include file at the top of the source file. I tried using it to force-include precompiled.h instead of including it explicitly, and sure enough, it worked.
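Roughly, the switch arrangement looks like this (the file names are just placeholders):

rem build the PCH itself
cl /c /Yc"precompiled.h" /Fp"precompiled.pch" precompiled.cpp
rem compile a normal translation unit; /FI injects the header, so the .cpp
rem no longer needs an explicit #include "precompiled.h"
cl /c /Yu"precompiled.h" /Fp"precompiled.pch" /FI"precompiled.h" widget.cpp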
This would allow us to omit the explicit include of the precompiled header (which is an implementation detail, as far as I am concerned), and only specify the actual dependencies of each file.
Unfortunately, it looks like the only way to validate that we aren't relying on implicit dependencies provided by precompiled.h is to periodically run through a build without /FI"precompiled.h" to see which files have a problem.
This is fairly onerous. Is there a better way?

can an RPM spec file "include" other files?

Is there a kind of "include" directive in RPM spec? I couldn't find an answer by googling.
Motivation: I have an RPM spec template which the build process modifies with the version, revision, and other build-specific data. This is currently done with sed. I think it would be cleaner if the spec could #include a build-specific definitions file generated by the build process, so I don't need to search and replace in the spec.
If there is no include, is there an idiomatic way to do this (quite common, I believe) task?
Sufficiently recent versions of rpmbuild certainly do support %include:
%include common.inc
Unfortunately, they aren't very smart about it -- there is no known set of directories in which it will look for the requested files, for example. But it is there, and variables are expanded, for example:
%include %{_topdir}/Common/common.inc
RPM does not support includes.
I have solved similar problems with either the m4 macro processor or by just concatenating parts of the spec (when the "include" was at the beginning).
If you only need to pass a few variables at build time, and not include several lines from another file, you can run
rpmbuild --define 'myvar SOMEVALUE' -bb myspec.spec
and you can use %myvar in the spec.
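With the command above, a spec line like the following would then pick up the value (the Version field is just an illustration):

Version: %{myvar}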
I faced this same issue recently. I wanted to define multiple sub-packages that were similar, but each varied just slightly (they were language-specific RPMs). I didn't want to repeat the same boiler-plate stuff for each sub-package.
Here's a generic version of what I did:
%define foo_spec() %{expand:%(cat '%{myloc}/main-foo.spec')}
%{foo_spec bar}
%{foo_spec baz}
%{foo_spec qux}
The use of %{expand} ensures that %(cat) is only executed a single time, when the macro is defined. The content of the main-foo.spec file is then expanded three times, and each time %1 in the main-foo.spec file expands to each of bar, baz and qux, in turn, allowing me to treat it as a template. You could easily expand this to more than one parameter, if you have the need (I did not).
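For illustration, a main-foo.spec used this way might contain something like the following, with %1 standing in for the per-package suffix (the contents are invented):

%package %1
Summary: %1 language files for myapp
%description %1
Translations and locale data for the %1 language.
%files %1
%{_datadir}/myapp/locale/%1/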
For the underlying issue, there may be two additional solutions that are present in all rpm versions that I am aware of:
Subpackages
macro and rpmrc files.
Subpackages
Another alternative (and perhaps the "RPM way") is to use sub-packages. Maximum RPM also has information and examples of subpackages.
I think the question is trying to structure something like:
two spec files, say rpm_debug.spec and rpm_production.spec
both using %include common.spec
Debug and production could also be client and server, etc. As for the examples of redefining a variable, each subpackage can have its own list of variables.
Limitations
The main advantage of subpackages is that only one build takes place; this may also be a disadvantage. The debug and production example may highlight this. It can be worked around by using strip to create variants, or by compiling twice with different output (perhaps using VPATH with GNU Make). Having to compile large packages when you only need simple variations, like with/without developer info (headers, static libraries, etc.), can make you appreciate this approach.
Macros and Rpmrc
Subpackages don't solve the problem of structural defines that you want for an entire rootfs hierarchy, or a larger collection of RPMs. We have rpmbuild --showrc for this. You can have a large number of variables and macros defined by altering the rpmrc and macros files that rpm and rpmbuild read. From the man page,
rpmrc Configuration
/usr/lib/rpm/rpmrc
/usr/lib/rpm/redhat/rpmrc
/etc/rpmrc
~/.rpmrc
Macro Configuration
/usr/lib/rpm/macros
/usr/lib/rpm/redhat/macros
/etc/rpm/macros
~/.rpmmacros
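For example, a ~/.rpmmacros might define (the values are only illustrative):

%_topdir    %{getenv:HOME}/rpmbuild
%vendor     Example Corp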
I think these two features can solve all the problems that %include can. However, %include is a familiar concept and was probably added to make rpm more full-featured and developer friendly.
Which version are you talking about? I currently have %include filename.txt in my spec file and it seems to work just like the C #include directive.
> rpmbuild --version
RPM version 4.8.1
You can include the *.inc files from the SOURCES directory (%_sourcedir):
Source1: common.inc
%include %{SOURCE1}
In this way they will go automatically into SRPMS.
I've used scripts (name your favorite) to take a template and create the spec file from it. Also, the %files section can take its list from a file created by another process, e.g. Python's bdist_rpm.

Using library with different names within autoconf

I am trying to build an application that depends on OpenSync 0.4 (0.3.9, actually).
In the project's configure.ac the opensync library is written as libopensync1. However, this doesn't build on my Gentoo system. Changing libopensync1 to libopensync does fix the issue for me.
I searched with Google and found that libopensync1 is used in some distributions, while libopensync is used in others. So how can I resolve this issue in configure.ac?
Thanks.
The macro AC_SEARCH_LIBS does what you need. (There is much heated debate about whether or not pkg-config should ever be used. If you choose to rely on it, ptomato gives a reasonable approach.) Simply add this to your configure.ac:
AC_SEARCH_LIBS([osync_mapping_new],[opensync1 opensync],[],
[AC_MSG_ERROR([can't find opensync])])
This will first look for a library named opensync1; if it doesn't find that, it will look for opensync.
The primary drawback of using pkg-config is that most projects that rely on it do not actually check if the data provided by the .pc file is reliable, so configure may succeed but a subsequent build will fail. It is always possible for a user to set PKG_CONFIG=true when running configure and completely eliminate all of the data provided by any associated .pc files, setting LIBS, CFLAGS, etc. by hand the 'old-fashioned' way.
The primary drawback of not using pkg-config is that the user has to set LIBS, CFLAGS, etc. the old-fashioned way. In practice, this is pretty trivial, and all pkg-config has done is move the data from a single CONFIG_SITE file to separately maintained .pc files for each package.
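The 'old-fashioned way' just means something along these lines (the paths are placeholders):

./configure CPPFLAGS="-I/opt/opensync/include" LDFLAGS="-L/opt/opensync/lib" LIBS="-lopensync1"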
If you do use PKG_CHECK_MODULES, follow it up with a call to AC_CHECK_LIB or AC_SEARCH_LIBS to validate the data in whatever .pc file was located by PKG_CHECK_MODULES.
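That combination might look roughly like this (the function name is taken from the AC_SEARCH_LIBS example above):

PKG_CHECK_MODULES([OPENSYNC], [libopensync1])
# verify that what the .pc file reported actually links
save_CFLAGS=$CFLAGS
save_LIBS=$LIBS
CFLAGS="$OPENSYNC_CFLAGS $CFLAGS"
LIBS="$OPENSYNC_LIBS $LIBS"
AC_CHECK_FUNC([osync_mapping_new], [],
  [AC_MSG_ERROR([opensync reported by pkg-config but not usable])])
CFLAGS=$save_CFLAGS
LIBS=$save_LIBS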
I'm assuming that the place at which this occurs inside your configure.ac is inside a PKG_CHECK_MODULES call.
Looking at the libopensync sources, it seems that libopensync1 is the newer name, and libopensync is the old name. So, we'll use pkg-config macros to look for the newer name unless it doesn't exist.
Put this in your configure.ac:
# Check if libopensync1 is known to pkg-config, and if not, look for libopensync instead
PKG_CHECK_EXISTS([libopensync1], [OPENSYNC=libopensync1], [OPENSYNC=libopensync])
Then later in your PKG_CHECK_MODULES call, replace libopensync1 with $OPENSYNC.
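Put together, the relevant fragments might look roughly like this (the program name in the Makefile.am part is made up):

# configure.ac
PKG_CHECK_EXISTS([libopensync1], [OPENSYNC=libopensync1], [OPENSYNC=libopensync])
PKG_CHECK_MODULES([OPENSYNC], [$OPENSYNC])

# Makefile.am
myapp_CFLAGS = $(OPENSYNC_CFLAGS)
myapp_LDADD = $(OPENSYNC_LIBS)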

Resources