Own symbols not found when building cabal test suite - Haskell

I'm trying to add some tests to a little project in Haskell. (I'm using a cabal sandbox on Mac OS.)
Everything is fine: the Haskell code compiles and runs perfectly in cabal repl.
But when I try to build the test suite, it compiles fine but crashes at link time, claiming that it can't find some symbols
which come from my own package.
It seems that it's not trying to link with my own package's library, even though I added the dependency in the cabal test-suite configuration. If I hadn't, it wouldn't even have compiled, would it?
So is it a cabal configuration problem or something more fundamental?

I assume your Cabal file has a library, executable, test-suite layout. In that case you must list the relevant modules in the library section of your Cabal file under exposed-modules or other-modules. In any case, all the modules of your package must be listed in the Cabal file.
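For example, a minimal sketch of such a layout (package and module names here are placeholders, not taken from your project):

-- my-package.cabal (illustrative)
name:                my-package
version:             0.1.0.0
build-type:          Simple
cabal-version:       >=1.10

library
  exposed-modules:     MyPackage.Core
  hs-source-dirs:      src
  build-depends:       base >= 4.5 && < 5
  default-language:    Haskell2010

test-suite spec
  type:                exitcode-stdio-1.0
  main-is:             Spec.hs
  hs-source-dirs:      test
  build-depends:       base, my-package
  default-language:    Haskell2010

The test suite links against the library through the my-package dependency, but only the modules listed under exposed-modules (or other-modules) actually end up in that library, which is why an unlisted module shows up as missing symbols at link time.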

Related

Cabal install tidal ends with warning

I'm trying to install tidal from the command line like this:
cabal install tidal
but it ends with this message:
Warning: You asked to install executables, but there are no executables in
target: tidal. Perhaps you want to use --lib to install libraries instead.
The output of
cabal install tidal --lib
is:
Resolving dependencies...
Up to date
If I check ghc-pkg list, there is no tidal package
...
Has anybody had a similar problem, or am I doing something wrong?
My environment is:
Windows 10 Education
Haskell 8.4.3
Cabal 3.2.0.0
Ghc 8.10.1
Thank you for your help.
Like Stack has done for a while now, cabal-install (as of 3.2) doesn't really install libraries anymore, in the sense of changing the computer's state so that GHC can access the library on it†. Both tools only install executables now. Cabal used to do this for libraries too, but that stopped with the now-default Nix-style builds.
Now (and, really, already before), the way to use a library is instead to simply depend on it and let Cabal figure out behind the scenes whether it needs to be installed. I.e., you add a .cabal file next to your .hs source file with build-depends: tidal in it. Then when you say cabal install ., Cabal will first download and install the library before using it to build your own executable.
†Of course both Stack and Cabal do, technically speaking, install libraries, they just don't register them globally. I.e., cabal knows where it has installed the library, but you're not really supposed to care about that. It's in the spirit of continuous integration: if your code builds now with the particular state of libraries you happen to have installed, that's not very reliable. If it builds with just those libraries that are explicitly listed in a project file, the chances are much better that future-you (or somebody else) will still be able to use your code on another computer without hours of figuring out what libraries to install first.
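A minimal sketch of that build-depends approach (tidal-scratch and Main.hs are illustrative names, not anything from the question or from tidal itself):

-- tidal-scratch.cabal (illustrative)
cabal-version:       2.4
name:                tidal-scratch
version:             0.1.0.0

executable tidal-scratch
  main-is:             Main.hs
  build-depends:       base, tidal
  default-language:    Haskell2010

Running cabal build (or cabal install .) next to this file will then fetch and build tidal as a dependency before compiling Main.hs.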
cabal install --lib tidal doesn't install the library binaries in a location managed by ghc-pkg. The binaries remain in the Cabal "store".
What it does is to create a plaintext GHC package environment file that is picked up by standalone invocations of ghc and ghci and tells them where to look for the extra libraries.
By default (as mentioned in the docs) this package environment file will be created at ~/.ghc/$ARCH-$OS-$GHCVER/environments/default and will be picked up by ghc and ghci invocations made anywhere.
We can also supply an extra --package-env parameter to create the environment file in a local folder, which will only affect ghc and ghci invocations made in that folder. For example:
cabal install --lib --package-env . tidal
cabal projects themselves ignore environment files, as their package environments are constructed from the build-depends section of the cabal file for the sake of reproducibility. But environment files are useful for not having to create a cabal project in the first place, if you only need it for playing with the library in ghci, or if you are compiling simple programs using ghc only.
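For instance, after the per-directory install above, standalone tools run in that folder pick the library up automatically (Sound.Tidal.Context is tidal's top-level module; adjust if your version differs):

$ ghci
ghci> import Sound.Tidal.Context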

GHC linker cannot find lHSsemigroups

I'm attempting to compile a Haskell project on Windows with profiling enabled, using the following command.
ghc --make -O -prof -fprof-auto game_dangerous.hs
I develop the project myself and the same source code compiled and linked fine without profiling. As expected (from previous experience) I ran into a number of errors of the form:
Could not find module `Data.Vector.Mutable'
Perhaps you haven't installed the profiling libraries for package `vector-0.12.0.2'?
I proceeded to iteratively reinstall packages based on the errors encountered using for example:
cabal install -p vector --reinstall
Cabal kept giving me warnings about possibly breaking packages with the reinstalls, but I pressed on since, as far as I could see, every package that could be broken was going to get reinstalled itself as I moved through the tree of dependencies. Also, I've previously followed the same process on another machine and it worked fine. After reinstalling all the required packages, my project now compiles, but the linker fails with this error:
C://Program Files//Haskell Platform//8.6.3//mingw//bin/ld.exe: cannot find -lHSsemigroups-0.18.5-8pPnWqWrcWhEagTFf5Pnk2_p
collect2.exe: error: ld returned 1 exit status
`gcc.exe' failed in phase `Linker'. (Exit code: 1)
However, the build does complete successfully without profiling enabled. Does anyone know what may have gone wrong and how to fix the issue? Thanks in advance.
Steven
I would try making a .cabal file for your program, where you explicitly specify the packages your program depends on, and using cabal v2-build to compile it. It will warn you about missing dependencies until you have included them all in the build-depends section of the .cabal file. You only need to include the dependencies of your program, not the dependencies of the dependencies. After that you can add a cabal.project.local to enable profiling and perhaps other options. Running cabal v2-build should then be enough to build your program, and the packages it depends on, with profiling (and any other options in cabal.project.local) enabled.
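A minimal sketch of such a cabal.project.local, placed next to your .cabal file:

-- cabal.project.local
-- profiling for the local package:
profiling: True
-- profiling for all dependencies in the build plan as well:
package *
  profiling: True

With this file in place, cabal v2-build rebuilds your program and its dependencies with profiling libraries included.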
You need profiling enabled in the packages your program uses in order to profile it. Cabal v2 builds allow multiple instances of the same package to coexist; the instances differ because they were built with different flags and options.
It is possible to achieve the same result using a separate package database for your program, i.e. using ghc-pkg with the --package-db option.
Another option is to use Stack. It solves the same issues, but differently, at the cost of more disk space and some performance penalties in GHC (compared to a GHC built from source, which can be used with cabal).
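If you go the Stack route instead, the profiled build is roughly this (assuming the project has a stack.yaml and a .cabal file):

$ stack build --profile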

Why do I have to remove `ekg` from build-deps for haskell stack/cabal to find my dll on Windows?

I'm building a Haskell 7.10 project that depends on tdsodbc.dll, using stack v1.7.1 on Windows, everything 64-bit. I have tdsodbc.dll in the lib folder of the project, and extra-lib-dirs: lib and extra-libraries: tdsodbc in the .cabal.
But when I compile, I get Missing C library: tdsodbc when stack runs cabal configure. I've tried putting extra-lib-dirs: [lib] in stack.yaml, and I can see from that configure command that it has put --extra-lib-dirs=C:\Users\Kevin\src\theproject\lib on the cabal configure command line, but it still complains that the library is missing.
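For reference, the configuration described above looks roughly like this (paths and names as given in the question):

-- in the .cabal file, inside the affected stanza:
  extra-lib-dirs:      lib
  extra-libraries:     tdsodbc

# in stack.yaml:
extra-lib-dirs: [lib]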
Now the weird part: If I remove ekg from the build-deps of the project (and remove the relevant imports etc.), the project builds just fine! I still have to copy the dll into .stack-work/dist/… to make it run, but why would ekg in build-deps stop cabal from being able to configure it?
I've tried the trick from Cannot get cabal to find the mpi library for haskell-mpi on Windows of putting c/Users/Kevin/src/theproject/lib in LIBRARY_PATH (there's no .a file to mv in my case, and no .h's), but that didn't help. Only removing ekg has helped so far. What could be causing this? The "solution" seems completely irrelevant to the problem :(
EDIT: I tried using plain Haskell Platform 7.10.3 (from https://www.haskell.org/platform/prior.html ), and that configured and built just fine. So the problem only occurs when cabal configure is called from stack.

Re-using Haskell Platform prebuilt libraries within a cabal sandbox?

When I build a Cabal project without a sandbox, Cabal uses existing libraries from my Haskell Platform installation. However, if I try to do the same inside a Cabal sandbox, Cabal forcibly rebuilds all my dependencies into the sandbox.
To save on build times and disk space, it'd be great to be able to instruct Cabal to use existing Haskell Platform libraries instead of rebuilding them. Is this possible?
Example (files in a gist):
executable blog
hs-source-dirs: .
main-is: Test.hs
build-depends: base >= 4.5 && < 5
, text
If I cabal build in the directory containing this .cabal file, my Test module gets built against the Haskell Platform version of text.
However, if I do the same in a sandbox:
cabal clean # (or alternatively clone an empty gist)
cabal sandbox init
cabal build
I get this:
$ cabal build
Package has never been configured. Configuring with default flags. If this
fails, please run configure manually.
Resolving dependencies...
Configuring install-test-0.1...
cabal: At least the following dependencies are missing:
text -any
If I now go and cabal install, the latest text library is built from scratch under my sandbox.
As per cabal-install bug #1695, this is currently not supported. It's something that may eventually be implemented; see multi-instance packages for more information.
If anyone's reading and cares, an alternative that should work for some users is to share a single cabal sandbox among multiple projects. This way you can still keep your Haskell Platform installation separate from the library installations you need during development. More on that in An Introduction to Cabal Sandboxes.
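A sketch of the shared-sandbox idea (the sandbox directory is an arbitrary example):

$ mkdir -p ~/haskell/shared-sandbox
$ cd ~/src/project-a
$ cabal sandbox init --sandbox ~/haskell/shared-sandbox
$ cd ~/src/project-b
$ cabal sandbox init --sandbox ~/haskell/shared-sandbox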

Haskell Bad Interface File

I am trying to take my Haskell project and split it apart into a library and a set of executables that depend on the library. When I try to compile now I get the following error:
src/Main.hs:23:0:
Bad interface file: /Users/<MyHomeDir>/.cabal/lib/Core-0.0.1/ghc-6.12.1/<MyModule>.hi
mismatched interface file ways (wanted "", got "p")
I believe that the p might be the p flag related to packages for ghc. Is this correct? Do I need to add more configuration options somewhere to my cabal file to support this?
I encountered a similar problem when compiling executables with dynamic linking.
I compiled a library and executable by invoking cabal install --ghc-option=-dynamic pkg.
The executable was built with dynamic linking but the library part was unusable.
I assume that using the --ghc-option=-dynamic option caused the static version of the library to be built with dynamic linking as well.
Since Cabal 1.14 I can use the --enable-executable-dynamic option, which works correctly.
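That invocation would look roughly like this (pkg stands for the package being installed, as in the command above):

$ cabal install --enable-executable-dynamic pkg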
That error says GHC found a profiling build of the interface file (the "p" way means profiling, not packages), but you're building Main.hs without profiling enabled. Quick fixes:
enable profiling in the build for Main.hs
build and install <MyModule> with profiling enabled
Either way, that will begin with a command that resembles
$ runghc Setup.hs configure --enable-library-profiling
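followed by the rest of the usual Setup.hs sequence, roughly:

$ runghc Setup.hs build
$ runghc Setup.hs install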
