I have a module which binds to a C function using the FFI. How can I make this module use doctest?
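For context, the module looks roughly like this (a sketch; the C prototype and the body of foo are simplified, but bar is the symbol from the error below):

module Foo where

import Foreign.C.Types (CInt (..))

-- bar is the C symbol that GHCi later fails to resolve
foreign import ccall "bar" c_bar :: CInt -> CInt

-- |
-- >>> foo
-- [42]
foo :: [Int]
foo = [fromIntegral (c_bar 0)]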
The error I get when running doctest Foo.hs is something like this:
ByteCodeLink: can't find label
During interactive linking, GHCi couldn't find the following symbol:
bar
This may be due to you not asking GHCi to load extra object files,
archives or DLLs needed by your current session. Restart GHCi, specifying
the missing library using the -L/path/to/object/dir and -lmissinglibname
flags, or simply by naming the relevant files on the GHCi command line.
Alternatively, this link failure might indicate a bug in GHCi.
If you suspect the latter, please send a bug report to:
glasgow-haskell-bugs@haskell.org
### Failure in Foo.hs:41: expression `foo'
expected: [42]
but got:
<interactive>:24:1: Not in scope: `bar'
Examples: 2 Tried: 2 Errors: 0 Failures: 1
Doctest accepts arbitrary GHC flags. To run doctest on FFI code, you need to pass exactly the same flags you would need to start a GHCi session with that code (here, the -L and -l flags suggested in the error message). See, for example, the doctest driver of unix-time.
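A minimal driver might look like this (a sketch, assuming a test/doctests.hs file; the library path, library name and source path are placeholders for whatever your FFI code actually needs):

import Test.DocTest (doctest)

main :: IO ()
main = doctest
  [ "-L/path/to/object/dir"  -- directory containing the compiled C code
  , "-lmissinglibname"       -- the C library that defines bar
  , "src/Foo.hs"
  ]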
I'm trying to use ghci / stack repl on a project where one module has foreign calls linked to a C lib tdsodbc, but I keep getting
ghc: panic! (the 'impossible' happened)
(GHC version 7.10.3 for x86_64-unknown-linux):
Loading temp shared object failed: /tmp/ghc4628_0/libghc_71.so: undefined symbol: SQLPrepareW
(where SQLPrepareW is defined in that C lib). Building with stack works fine. This happens even on other modules that just happen to import the foreign-calling module, even without actually calling the foreign functions. It doesn't happen on load, but as soon as I try to fully evaluate any function in the repl.
How can I tell ghci that some of the functions are defined in libs outside of ghc?
I've tried the -l option (e.g. stack exec ghci -- -ltdsodbc), but the only difference then is that a different function from the same lib is in the error message:
ghc: panic! (the 'impossible' happened)
(GHC version 7.10.3 for x86_64-unknown-linux):
Loading temp shared object failed: /tmp/ghc24107_0/libghc_25.so: undefined symbol: SQLDriverConnectW
Note that it's obviously checking for the lib when using -l, since if I misspell it, it'll say it can't find it:
$ stack exec ghci -- -L/usr/lib/x86_64-linux-gnu/odbc -ltdsodbctypo
Warning (added by new or init): Specified resolver could not satisfy all dependencies. Some external packages have been added as dependencies.
You can suppress this message by removing it from stack.yaml
GHCi, version 7.10.3: http://www.haskell.org/ghc/ :? for help
<command line>: user specified .o/.so/.DLL could not be loaded (libtdsodbctypo.so: cannot open shared object file: No such file or directory)
Whilst trying to load: (dynamic) tdsodbctypo
Additional directories searched: /usr/lib/x86_64-linux-gnu/odbc
This is with
$ stack --version
Version 1.4.0, Git revision e714f1dd3fade19496d91bd6a017e435a96a6bcd (4640 commits) x86_64 hpack-0.17.0
I've also tried stack ghci --ghci-options '-ltdsodbc -fobject-code', but it also panics with undefined symbol: SQLPrepareW.
The nice folks in #haskell on freenode said maybe I should try passing -fobject-code to ghci. That didn't work. I tried :set and :seti to see if it was already set, but ghci didn't show anything about object code. (Doing :unset -fobject-code just gave Some flags have not been recognized: -fno-object-code.)
Then today I happened to look at my ~/.ghci for some other reason, and that did have :set -fobject-code, even though :set/:seti doesn't show that. Removing :set -fobject-code from my ~/.ghci took away the panic attacks, and I can now use functions from modules that import the module that defines foreign functions :)
Actually calling any of the foreign functions from ghci leads to a segfault (catchsegv log for the interested), but at least I can test the pure stuff now …
I have a Haskell library with several executables (tests, benchmarks, etc.), about six in total.
In my current workflow, I separately compile each executable (say, with GHCi) and fix each one up. This is tedious because I have to type out the path to each executable, and moreover have to reload all of the (very large) library, which even with GHCi takes some time.
My first thought to solve this issue was to create a single dummy module that imports the executable "Main" modules. However, this (of course) requires that the "Main" modules have a module name like module Executable1 where .... But then cabal complains when compiling the executable that it can't find a module called "Main" (despite main-is being listed explicitly in the cabal file for each executable).
I also tried ghci Exec1.hs Exec2.hs ..., but it complains module ‘main@main:Main’ is defined in multiple files.
Is there an easy way to load multiple "Main" modules at once with GHCi so I can typecheck them simultaneously?
Cabal’s main-is option only tells Cabal what filename it should pass to GHC; Cabal does not care about that file's module name.
GHC itself has a flag, also called -main-is (documented here), which tells the compiler which module contains the main function.
So this works:
executable foo
main-is: Foo.hs
ghc-options: -main-is Foo
Of course Foo.hs should start with module Foo where… and export main. As usual, the module name and the file name need to match.
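For instance, a minimal Foo.hs:

module Foo where

main :: IO ()
main = putStrLn "Hello from the foo executable"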
This way, all executables can have different module names and you can load them all in GHCi.
If you also want to change the name of the main function, write ghc-options: -main-is Foo.fooMain. I would guess you could even have all executables share the same module but use different main functions this way.
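A sketch of that shared-module variant (the module Exes and its fooMain/barMain functions are my invention, so treat it as untested):

executable foo
  main-is: Exes.hs
  ghc-options: -main-is Exes.fooMain

executable bar
  main-is: Exes.hs
  ghc-options: -main-is Exes.barMain

-- Exes.hs
module Exes where

fooMain :: IO ()
fooMain = putStrLn "foo"

barMain :: IO ()
barMain = putStrLn "bar"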
When parsing source code with the C preprocessor enabled, the parser doesn't like undefined macros like MIN_VERSION_packagename(a,b,c). How can I get cabal/GHC to tell cpp the package info and add the macro definitions?
You can use the very idiomatic (/s) options:
ghc -optP-include -optPdist/build/autogen/cabal_macros.h
I happen to have just been writing a pull request to doctest about this; you may be interested in referencing it:
https://github.com/sol/doctest/pull/109/files#diff-438bc19bd41887f8cacb796eaa990b0aR81
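If you drive doctest from a test suite, you can pass the same flags programmatically; a rough sketch (the exact path to cabal_macros.h depends on your Cabal version and build directory, and src/MyModule.hs is a placeholder):

import Test.DocTest (doctest)

main :: IO ()
main = doctest
  [ "-optP-include"
  , "-optPdist/build/autogen/cabal_macros.h"  -- defines the MIN_VERSION_* macros
  , "src/MyModule.hs"
  ]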
I can't figure out how to get WinGHCi to load and compile my .hs file.
I have a file, C:\Users\Haskell\Source\hello.hs, that only contains the following line:
main = putStrLn "Hello, world!"
If, at the Prelude> prompt, I run
:cd C:\Users\Haskell\Source\
nothing happens, which I'm assuming means the command was successful. However, when I try to run
:load hello.hs
I get a "[1 of 1] Compiling Main. Ok, modules loaded: Main" message. My prompt then changes from "Prelude" to "*Main" and I type:
ghc -o hello hello.hs
After that, I will get a series of errors talking about how ghc, o, hello, hello, and hs are "Not in scope."
I am in the correct directory. Why won't my program run?
One of my problems is that I'm unable to navigate the directories. I know that :!dir lists the files, and I am in the right directory, but :load hello.hs still doesn't work and I keep getting the scope error.
Any help would be appreciated.
EDIT: A user pointed out that if I have gotten to the *Main prompt, then my program has been loaded and compiled and I do not need to run the ghc command. If that is the case, how would I run it? Haskell.org states that, "You can then run the executable (./hello on Unix systems, hello.exe on Windows)," but an exe has not been created.
I find it easier to first navigate to the directory and then invoke ghci. Once at the Prelude prompt you can use :l and the file name.
Or you could start ghci first and then pass :l the fully qualified path to the file.
Edit:
After reading your edits, it is clear your code is compiling fine. Once GHCi says it has compiled, there is no reason to try to do so again with ghc (I don't think you can do that from within ghci anyway).
Now that it is compiled, you can use any of the code and data types defined therein. So to use your main function, just type main at the *Main> prompt.
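A typical session then looks something like this:

Prelude> :cd C:\Users\Haskell\Source\
Prelude> :load hello.hs
[1 of 1] Compiling Main             ( hello.hs, interpreted )
Ok, modules loaded: Main.
*Main> main
Hello, world!

If you do want a standalone hello.exe, run ghc -o hello hello.hs from an ordinary command prompt (outside GHCi/WinGHCi); that is the step that actually produces the executable haskell.org mentions.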
So, I'm trying to use the plugins package to dynamically load a Haskell function from a source file. The source file depends on a package foo with module Foo.Bar. I'm running my project in a Cabal sandbox, where I have foo installed. Both my main program and the module I'm loading with plugins depend on foo. I always get one of the following two errors:
When I have foo installed in ~/.cabal, I get the error:
GHCi runtime linker: fatal error: I found a duplicate definition for symbol
aizmvszmaizmlibzm0zi1_FooziBar_zdfTypeableBazzuds2_closure
whilst processing object file
/home/joey/.cabal/lib/foo-0.1/ghc-7.6.3/HSfoo-0.1.o
This could be caused by:
* Loading two different object files which export the same symbol
* Specifying the same object file twice on the GHCi command line
* An incorrect `package.conf' entry, causing some object to be
loaded twice.
GHCi cannot safely continue in this situation. Exiting now. Sorry.
When I don't have it installed in ~/.cabal, I get a standard "module not found" error. And when I don't have it installed in my sandbox, I get the same module not found error trying to compile my main program code.
The plugins documentation is scarce at best. Any thoughts on how to solve this?
I got this working by using System.Plugins.Make to actually do the compilation, instead of relying on pre-existing object files. It's not a complete solution and doesn't explain the problem, but it works for me for now.
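Roughly what that looks like (a sketch: the file name, symbol name and the plugin's type are placeholders, and you may still need to pass your sandbox's package database as extra arguments to make):

import System.Plugins

main :: IO ()
main = do
  -- compile the plugin source ourselves instead of loading a pre-built .o
  status <- make "Plugin.hs" []   -- extra GHC flags (e.g. a -package-db for the sandbox) go here
  case status of
    MakeFailure errs  -> mapM_ putStrLn errs
    MakeSuccess _ obj -> do
      r <- load obj [] [] "myFunction" :: IO (LoadStatus (Int -> Int))
      case r of
        LoadFailure errs -> mapM_ putStrLn errs
        LoadSuccess _ fn -> print (fn 21)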