Turning off unittest execution of third-party code - Linux

I'm trying to understand how the '-unittest' dmd switch can be used to select which files have their unittests executed.
I have a file, "a.d", containing a unittest block. File "a.d" imports a third-party module (requiring the file "b1.d" and, in turn, "b2.d"), and those files contain their own unittest blocks.
I don't want to run the tests in the third-party code: I just want to run the tests in a.d.
If I compile the third-party code first
dmd -c b1.d b2.d
then try to link them with my code compiled with unittests enabled
dmd -unittest a.d b1.o b2.o
then I get an error saying that the module in b1.d which a.d is trying to import is in a file that cannot be read.
Can anyone show me how to accomplish this?
Thanks!

What you want to do is not possible as written, because a.d imports b1.d and b2.d, which means those modules must be passed to the compiler.
If you want to link plain *.o files instead, it's more complex: you have to write an interface (a *.di file) for them, just as you would for a *.so, so it's not a good idea to use this mechanism to bypass the unittests (although it could work, it's a bit heavy).
A more straightforward way to select unittests arbitrarily is to use the __traits(getUnitTests, ...) trait. It's really the way to go.
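For example, here is a minimal sketch of a hand-rolled runner (module names taken from the question; disabling druntime's default runner this way is a standard pattern, the rest is an assumption about your layout):
module runner;

import core.runtime : Runtime;
import a;

// Replace druntime's default runner, which would otherwise execute every
// compiled-in unittest (including those from b1 and b2) before main.
shared static this()
{
    Runtime.moduleUnitTester = &skipDefaultTests;
}

bool skipDefaultTests() { return true; } // report success without running them

void main()
{
    // __traits(getUnitTests, a) yields each unittest block of module a
    // as a callable symbol; invoke only those.
    foreach (test; __traits(getUnitTests, a))
        test();
}
Build everything with dmd -unittest runner.d a.d b1.d b2.d: the third-party tests are still compiled in, but only a's are executed.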

You are almost there. Just use separate compilation and linking steps, i.e.
dmd -c -unittest a.d
and then:
dmd a.o b1.o b2.o
That's it.

Related

Is it possible to start a program with a missing shared library

I'm running Linux and have a situation like this:
A binary 'bin1' loads 'shared1.so' via dlopen, and 'shared1.so' is linked against 'shared2.so' and 'shared3.so'.
If 'shared2.so' or 'shared3.so' is missing, the program 'bin1' won't run.
For some runs I know I won't touch any code from 'shared2.so', and I want 'bin1' to be able to run even when this library is missing. Can this be done?
You could ship the program with a dummy shared2.so library. You might need to add dummy functions that shared1.so expects to find there. This can be done manually or with an automatic tool like Implib.so.
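For instance, a hand-written stub can be as small as this (the symbol name is hypothetical; define whatever shared1.so actually references):
/* stub_shared2.c: a stand-in for the real shared2.so. Every symbol that
   shared1.so references must be defined, but the bodies can stay empty
   because those code paths are never taken on the runs in question. */
void some_function_from_shared2(void) { }
Compile it with gcc -shared -fPIC -o shared2.so stub_shared2.c and make sure the loader finds it on those runs.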

Setting the include path with bindgen

I'm writing a Rust interface to a small C library, which has headers spread in a few locations. It's not a system library, and is normally used by some executables in the same package; I'm currently including it as a git submodule in my Cargo project.
Building the library seems to be pretty easy; I've opted to use the gcc crate from build.rs:
gcc::Config::new()
    .file("external/foo/dir1/file1.c")
    .file("external/foo/dir2/file2.c")
    .include("external/foo/dir1/")
    .include("external/foo/dir2/")
    .include("external/foo/config_a/")
    .compile("libfoo.a");
Now I was hoping to use the bindgen crate to generate the FFI interface without too much fuss, but it doesn't seem to have a way of setting include paths.
I can create a wrapper.h as suggested by this blog and include several headers, but if dir1/dir1.h includes conf.h directly (which works when building, thanks to .include("external/foo/config_a/")), bindgen can't find it.
I can't find anything in bindgen's API to help here (essentially I want to pass the equivalent of gcc/clang's -I option). Am I missing anything?
The best option I can think of so far is to copy the various headers from the library source into some temporary directory in build.rs and run bindgen on that, but that seems somewhat messy if there's a nicer way.
With the API, you can use Builder::clang_arg to pass arbitrary arguments to clang:
let b = bindgen::builder().header("foo.h").clang_arg("-I/path");
From the command line you can do the same by appending arguments after --, like:
bindgen foo.h -- -I/path
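Putting it together for this case, a build.rs sketch might look like the following (paths from the question; wrapper.h and the output handling are assumptions):
// Mirror the -I directories passed to gcc so that headers included
// from wrapper.h (like conf.h) resolve for libclang as well.
let bindings = bindgen::builder()
    .header("wrapper.h")
    .clang_arg("-Iexternal/foo/dir1/")
    .clang_arg("-Iexternal/foo/dir2/")
    .clang_arg("-Iexternal/foo/config_a/")
    .generate()
    .expect("failed to generate bindings");
bindings
    .write_to_file(std::path::Path::new(&std::env::var("OUT_DIR").unwrap()).join("bindings.rs"))
    .expect("failed to write bindings");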

How do I get Haxe to generate externs?

I am writing Haxe code which I want to compile to an arbitrary target as a module, and then use the results from another module compiled for that same target. I don’t want to handle this the “Haxe way” (which is to fully inline all libraries at compile time). Instead, I want to be able to write distinct Haxe modules and reference them with full type safety, without inlining between the modules. The natural way to do this would be to have both the source Haxe files and a separate directory of “headers” filled with externs describing the public API of my module, with these externs somehow automatically generated so that they don’t need to be manually maintained.
I cannot figure out how to get Haxe to emit externs. It would make sense to me if haxe-externs were an actual “target platform” so that I could do something like:
$ haxe ClassName -hxe externsoutdir
It would make less sense but still be acceptable if one of the -D flags like -D dump (which seems to sort of get one part of the way there) or some imaginary, nonexistent -D dump-externs existed. Then you could generate externs while compiling to your favorite target:
$ haxe ClassName -js outfile.js -D shallow-expose -D dump-externs=externsoutdir
The idea is to take a class definition like this:
@:expose
class ClassName {
    function quack() {
        trace('quack');
    }
}
and emit something like this in a separate directory:
extern class ClassName {
    function quack():Void;
}
so that I can consume it from another module like this:
@:expose
class MyClassName extends ClassName {
    override function quack() {
        super.quack();
        trace('…and again I say “quack”');
    }
}
$ haxe -cp path\to\externsoutdir MyClassName -js outfile.js -D shallow-expose
It would only make sense to generate externs for things annotated with @:expose or some other metadata.
I will figure out how to wrap the emitted modules to load each other correctly. That’s easy. The hard part is generating the extern definitions—shouldn’t Haxe already have a way to do this?
Is there already some tool or built-in way I’m missing to do this? When Googling, all I see are projects that supposedly help with generating externs for existing JavaScript libraries. But that’s not my use case…
Update: --gen-hx-classes was removed sometime around Haxe 4.0.0-rc3. Apparently the functionality still exists secretly as -D gen-hx-classes, but beware: if you rely on this, it seems like it's going away.
I believe the --gen-hx-classes option might be what you're looking for. Oddly, I don't see it in the compiler flags list.
I use it in a modular JavaScript build system that is similar to what you're talking about.
I believe it creates a directory of .hx files that are externs for every class generated by the build (including those from the Haxe standard library). Actually, getting duplicates of the standard-library classes may be a problem you will face.
You may also need to use @:keep (or the related macro) to ensure dead code elimination doesn't remove things the other build will need.
You might also need to exclude a class from one build or the other, e.g. --macro 'exclude("haxe.io.Input")' (or excludeFile, which is actually more performant for a whole list of exclusions).
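As a rough sketch, a two-pass build.hxml has roughly this shape (the source paths and the hxclasses output directory are assumptions, and, per the update above, newer Haxe 4 builds spell the flag -D gen-hx-classes):
# First pass: compile the module and emit .hx externs for its classes.
-cp src
-main ClassName
-js out/classname.js
-D shallow-expose
--gen-hx-classes

--next

# Second pass: compile against the generated externs instead of the sources.
-cp hxclasses
-cp src2
-main MyClassName
-js out/myclassname.js
-D shallow-expose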

Error linking module in OCaml

I am a complete beginner with OCaml programming and I am having trouble linking a module into my program. I am doing some regular-expression checking, and I have written a function that tokenizes a string based on a separator string using the Str module. So I use the functions defined in the library like this:
Str.regexp_string (* and so on *)
However, when I try to compile the ml file, I get an error saying that I have an undefined global Str. We use List functions by typing List.length and so on, just as I did for Str, without having to explicitly include the specific module. I tried
open Str;;
include Str;; (* neither of these works, and I still get the same error *)
However, if in the toplevel I use
#load "str.cma";; (* then the program works without problems *)
I want to include the module in the ml file because in the end I have to link 3 cmo's to get the final executable (which is not run in the toplevel). I know this is a really basic question, but I am having trouble solving it. Thanks in advance.
You don't need to add anything in your file foo.ml.
You do need to tell the compiler where to find the Str module when compiling foo.ml. To do so, add it to the command line used to compile foo.ml:
ocamlc str.cma foo.ml
or
ocamlopt str.cmxa foo.ml
List and other modules from the standard library are accessible by default, so you don't need to tell the compiler about those often-used modules.
Just add str to the libraries field of your dune file.
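For example, a minimal dune stanza might look like this (the executable name is hypothetical):
; The str library in the libraries field makes the Str module
; available at both compile and link time.
(executable
 (name calc)
 (libraries str))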
The module name shouldn't include the file ending like .cma.
Below is what I did when trying to use the unix and threads modules; I think you need to use some combination of the '-custom' and '-cclib' compiler directives:
ocamlc -custom unix.cma thread.ml -cclib -lunix
Look at chapter 7 of this book for help:
http://caml.inria.fr/pub/docs/oreilly-book/html/book-ora063.html
And look at coverage of compiler directives here:
http://caml.inria.fr/pub/docs/manual-ocaml-4.00/manual022.html#c:camlc
ocamlc calc.ml str.cma -o calc
File "calc.ml", line 1:
Error: Error while linking calc.cmo:
Reference to undefined global `Str'
The code is very simple, cut down to avoid cruft.
let split_into_words s =
  Str.split (Str.regexp "[ \n\t]+") s ;;
let _ =
  split_into_words "abc def ghi" ;;
This is on OCaml 4.0.2. Obviously there is a problem here, but I am too much of a beginner to understand what it is. From the toplevel it seems to work fine with #load "str.cma", so there is something here we don't understand. Does anyone know what it is?
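For what it's worth, the difference from the accepted answer above is the argument order: ocamlc processes its arguments left to right, so the library archive has to come before the module that uses it:
ocamlc str.cma calc.ml -o calc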

Is there a standard way of checking for a named executable with autoconf

I have a project using autoconf. On my build server it fails to build because the makefile refers to makeinfo, which is not currently installed on the build server, but the error is not caught until compile time. I'd like to ensure that the problem is caught at configure time and gives a sensible error message.
I can't find a standard macro that will check for the existence of an executable named 'makeinfo'. I could write my own, but I don't want to reinvent the wheel. Is there something generic that I'm missing? Or even a specific check for makeinfo?
How about AC_CHECK_PROG or AC_PATH_PROG?
http://www.gnu.org/software/autoconf/manual/html_node/Generic-Programs.html#Generic-Programs
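For makeinfo specifically, a configure.ac fragment along these lines fails at configure time with a clear message (the MAKEINFO variable name is my choice):
# Look for makeinfo on PATH; abort configure if it is missing.
AC_CHECK_PROG([MAKEINFO], [makeinfo], [makeinfo])
AS_IF([test -z "$MAKEINFO"],
      [AC_MSG_ERROR([makeinfo is required but was not found; install texinfo])])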
