Haskell Bad Interface File

I am trying to take my Haskell project and split it apart into a library and a set of executables that depend on the library. When I try to compile now I get the following error:
src/Main.hs:23:0:
Bad interface file: /Users/<MyHomeDir>/.cabal/lib/Core-0.0.1/ghc-6.12.1/<MyModule>.hi
mismatched interface file ways (wanted "", got "p")
I believe that the p might be the p flag related to packages for ghc. Is this correct? Do I need to add more configuration options somewhere to my cabal file to support this?

I encountered a similar problem when compiling executables with dynamic linking.
I compiled a library and executable by invoking cabal install --ghc-option=-dynamic pkg.
The executable was built with dynamic linking but the library part was unusable.
I assume that using --ghc-option=-dynamic caused the static version of the library to be built with dynamic linking as well.
Since Cabal 1.14 I can use the --enable-executable-dynamic option, which works correctly.
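For example, with the same placeholder package name as above:
$ cabal install --enable-executable-dynamic pkg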

That's saying GHC found an interface file built the profiling way ("p"), but you're building Main.hs without profiling (the vanilla "" way). Quick fixes:
enable profiling in the build for Main.hs, or
build and install <MyModule> with profiling enabled (--enable-library-profiling also builds the ordinary vanilla way, so a matching interface will exist either way)
Either way, that will begin with a command that resembles
$ runghc Setup.hs configure --enable-library-profiling
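With a modern cabal-install the equivalent is roughly the following (a sketch; these are current cabal-install flags rather than the GHC 6.12-era ones above):
$ cabal configure --enable-profiling
$ cabal build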

Related

GHC linker cannot find lHSsemigroups

I'm attempting to compile a Haskell project on Windows with profiling enabled, using the following command.
ghc --make -O -prof -fprof-auto game_dangerous.hs
I develop the project myself and the same source code compiled and linked fine without profiling. As expected (from previous experience) I ran into a number of errors of the form:
Could not find module `Data.Vector.Mutable'
Perhaps you haven't installed the profiling libraries for package `vector-0.12.0.2'?
I proceeded to iteratively reinstall packages based on the errors encountered using for example:
cabal install -p vector --reinstall
Cabal kept giving me warnings that the reinstalls might break other packages, but I pressed on since, as far as I could see, every package that could be broken was itself going to be reinstalled as I moved through the tree of dependencies. Also, I've previously followed the same process on another machine and it worked fine. After reinstalling all the required packages, my project now compiles, but the linker fails with this error:
C://Program Files//Haskell Platform//8.6.3//mingw//bin/ld.exe: cannot find -lHSsemigroups-0.18.5-8pPnWqWrcWhEagTFf5Pnk2_p
collect2.exe: error: ld returned 1 exit status
`gcc.exe' failed in phase `Linker'. (Exit code: 1)
However, the build does complete successfully without profiling enabled. Does anyone know what may have gone wrong and how to fix the issue? Thanks in advance.
Steven
I would try making a .cabal file for your program, where you explicitly specify the packages it depends on, and use cabal v2-build to compile it. Cabal will warn you about missing dependencies until you have included them all in the build-depends section of the .cabal file. You only need to list your program's direct dependencies, not the dependencies of the dependencies. After that you can add a cabal.project.local to enable profiling (and perhaps other options). Running cabal v2-build should then be enough to build your program, and the packages it depends on, with profiling (and any other options in cabal.project.local) enabled.
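A minimal sketch of the two files, assuming the package name and dependencies from the question (your real build-depends list will be longer):
-- game-dangerous.cabal
cabal-version: >=1.10
name:          game-dangerous
version:       0.1.0.0
build-type:    Simple

executable game_dangerous
  main-is:          game_dangerous.hs
  build-depends:    base, vector
  ghc-options:      -O
  default-language: Haskell2010

-- cabal.project.local (next to the .cabal file)
profiling: True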
You need profiling enabled in the packages your program uses in order to profile the program itself. Cabal v2 builds allow you to have multiple instances of the same package, distinguished by the flags and options used to build them.
It is possible to achieve the same result with a separate package database for your program, i.e. using ghc-pkg with the --package-db option.
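A rough sketch of that approach (the database path is a placeholder, and the exact --package-db spelling varies between tool versions):
ghc-pkg init ~/game-dangerous-pkgdb
cabal install -p vector --package-db=$HOME/game-dangerous-pkgdb
ghc -package-db ~/game-dangerous-pkgdb --make -O -prof -fprof-auto game_dangerous.hs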
Another option is to use stack. It solves the same problems, but differently, at the cost of more disk space and some performance penalty in ghc (compared to a ghc built from source, which can be used with cabal).
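With stack, the rough equivalent is (stack's --profile flag turns on library and executable profiling for the build):
stack build --profile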

Cabal can't find foreign libraries

Recently I spent about a week trying to install the llvm-general-3.5.1.0 package. Basically I am getting this error: link. My situation is identical: Windows 10, GHC 7.10.2, cabal 1.22.4.0. I installed LLVM 3.5.2 from source with CMake and everything went fine. In the llvm/lib directory I have *.lib files (e.g. LLVMAnalysis.lib).
But somehow cabal can't see those libraries and gives this frustrating error:
Configuring llvm-general-3.5.1.0...
setup.exe: Missing dependencies on foreign libraries:
* Missing C libraries: LLVMLTO, LLVMObjCARCOpts, LLVMLinker, LLVMipo,
LLVMVectorize, LLVMBitWriter, LLVMCppBackendCodeGen, LLVMCppBackendInfo,
LLVMTableGen, LLVMDebugInfo, LLVMOption, LLVMX86Disassembler,
LLVMX86AsmParser, LLVMX86CodeGen, LLVMSelectionDAG, LLVMAsmPrinter,
LLVMX86Desc, LLVMX86Info, LLVMX86AsmPrinter, LLVMX86Utils, LLVMJIT,
LLVMIRReader, LLVMAsmParser, LLVMLineEditor, LLVMMCAnalysis,
LLVMMCDisassembler, LLVMInstrumentation, LLVMInterpreter, LLVMCodeGen,
LLVMScalarOpts, LLVMInstCombine, LLVMTransformUtils, LLVMipa, LLVMAnalysis,
LLVMProfileData, LLVMMCJIT, LLVMTarget, LLVMRuntimeDyld, LLVMObject,
LLVMMCParser, LLVMBitReader, LLVMExecutionEngine, LLVMMC, LLVMCore,
LLVMSupport
This problem can usually be solved by installing the system packages that
provide these libraries (you may need the "-dev" versions). If the libraries
are already installed but in a non-standard location then you can use the
flags --extra-include-dirs= and --extra-lib-dirs= to specify where they are.
I really want to use this package on Windows, but nothing seems to work (I tried everything, such as --extra-lib-dirs, and compiled with both MinGW and VS; the same problem).
I can't accept that it won't install. I mean, there must be some way to fix Setup.hs in this cabal package, or something. Does anyone have an idea what could be wrong with cabal in this case, and how I could work around it? I don't know exactly how cabal works; maybe someone with that knowledge will have an idea? Or maybe there is a way to do this without cabal?
OK, I've managed to build it and, I think, found the root of the issue.
First, the steps to build:
1. Get MinGW. My installation of MinGW has gcc 4.8.
2. Get 32-bit MinGHC.
3. Compile LLVM 3.5 with MinGW's gcc and install it somewhere.
4. Copy the contents of the MinGW installation directory into MinGHC Install Dir\ghc-7.10.2\mingw, replacing conflicting files.
5. On the command line, set your PATH so it has the Haskell toolset from MinGHC (I recommend using the switch .bat scripts) and llvm-config.exe.
6. Get the llvm-general package source, either with cabal fetch or by downloading it from Hackage in a browser.
7. Replace the cc-options: -std=c++11 line of llvm-general.cabal with cc-options: -std=gnu++11.
8. Finally, cabal configure and cabal build should work (a concrete sketch follows).
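Concretely, the last few steps might look like this (a sketch; cabal unpack downloads and unpacks the package source in one go):
cabal unpack llvm-general-3.5.1.0
cd llvm-general-3.5.1.0
rem edit llvm-general.cabal here: change -std=c++11 to -std=gnu++11
cabal configure
cabal build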
I have been changing my build environment many times, so if this doesn't work for you, let me know; I probably forgot something.
Now let's go into details.
What we thought was a bug in cabal actually is not. The problem is that both stack and MinGHC (and the Haskell Platform, I guess) ship quite an old gcc: 4.6. This gcc has two defects:
It doesn't support -std=c++11, so LLVM 3.5 can't be built with it. As a consequence, this gcc can't be used by GHC when compiling llvm-general either, because it can't parse the LLVM headers properly.
Even if it could, its linker can't link against LLVM libs compiled by MinGW using gcc 4.8. This is why cabal was telling you it couldn't find the LLVM libs. I hacked Setup.hs so that it wouldn't look for these libs, but would pass -lLLVMSomething to the linker via the -pgml GHC option. That led to a clear error message:
ld.exe: ignoring libLLVMSupport.a ...
ld.exe: can't find -lLLVMSupport
So cabal was actually finding these libs, but dropping them because they couldn't be linked against.
Ideally, the solution would be to update the MinGW distribution used by stack/MinGHC. But as a workaround you can just replace the old gcc with a newer one.
Finally, -std=gnu++11 is used because the current MinGW release is affected by this bug, which prevents compilation of the C++ bits of the package. Whew, that was a long way.

own symbol not found cabal test

I'm trying to add some tests to a little project in Haskell. (I'm using a cabal sandbox on Mac OS.)
Everything is fine: the Haskell code compiles and runs perfectly in cabal repl.
But when I try to build the tests, they compile fine but crash at link time, claiming that the linker can't find some symbols which are from my package.
It seems that it's not trying to link with my own module's library, even though I added the dependency in the cabal test-suite configuration. If I hadn't, it couldn't even have compiled, could it?
So is it a (cabal) configuration problem or something more serious?
I assume your Cabal file has a library, executable, test-suite layout. Then you must list the relevant modules in the library section of your Cabal file, under exposed-modules or other-modules. In any case, all the modules of your package must be listed in the Cabal file.
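A minimal sketch of that layout (the package and module names are made up for illustration):
name:          my-project
version:       0.1.0.0
build-type:    Simple
cabal-version: >=1.10

library
  hs-source-dirs:   src
  exposed-modules:  MyProject.Core
  build-depends:    base
  default-language: Haskell2010

test-suite spec
  type:             exitcode-stdio-1.0
  hs-source-dirs:   test
  main-is:          Spec.hs
  build-depends:    base, my-project
  default-language: Haskell2010
The key point is that the test-suite depends on the library (my-project) and the library lists MyProject.Core, so the test executable links against the library's code and its symbols are found at link time.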

Is there a maximum number of modules that can be compiled via Cabal?

Is there a maximum number of modules that can be compiled via Cabal / via GHC?
I have a rather large project (many auto-generated modules) which does not compile using cabal, yet no error message shows up: cabal just silently exits with exit code 127 (Mac OS X; it happens on 10.6 and 10.8, the only two versions tested; GHC 7.4.2).
EDIT: Output of cabal build -v3:
>>> cabal build -v3
Using internal setup method with build-type Simple and args:
["build","--verbose=3"]
creating dist/build
creating dist/build/autogen
Building java-bindings-1.0...
Preprocessing library java-bindings-1.0...
Building library...
creating dist/build
("/usr/bin/ghc",["--make","-package-name","java-bindings-1.0","-v","-hide-all-packages","-fbuilding-cabal-package","-i","-idist/build","-i.","-idist/build/autogen","-Idist/build/autogen","-Idist/build","-optP-include","-optPdist/build/autogen/cabal_macros.h","-odir","dist/build","-hidir","dist/build","-stubdir","dist/build","-package-id","base-4.5.1.0-81d626fb996bc7e140a3fd4481b338cd","-package-id","java-bridge-0.20130602-5cb59a7e71733b25cf4b8a8ae649598b","-O","-XHaskell98","Java.Applet", (references to the other modules)
/usr/bin/ghc returned ExitFailure 127
Now that I see that output, I believe it may be the number of command-line arguments being passed to ghc; I believe there is a maximum length imposed by the operating system... Is there a way to circumvent this limitation?
Yes: the operating system places a limit on the total length of a command line, and Cabal constructs a very long one internally for the final linking step.
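You can check the limit on a Unix-like system (a quick diagnostic, not a fix):
$ getconf ARG_MAX
That prints the maximum combined size, in bytes, of the argument list and environment passed to exec(), which the very long ghc invocation shown above can exceed.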
I'm not sure what to suggest here, other than reporting the bug against Cabal and maybe trying to fix it yourself. Cabal's code is pretty readable, if you go that route. :)

How can configuration tools like sdl-config be used with a cabalized project?

I have a working SDL/Haskell application that I would like to build using Cabal instead of the current Makefile (because that is "the Haskell way"). The Makefile itself is very simple, and I was hoping that the default cabal build process would let me reconstruct the build command specified in my Makefile. The problem is that it makes use of sdl-config, a utility that gives you all the necessary C compiler options:
wrapper.o: SDLWrapper_stub.h
	ghc -no-hs-main `sdl-config --cflags` -Wall wrapper.c -c
Cabal does not seem to expand that into a shell call when calling GHC. How can I specify that sdl-config's options should be fed into GHC when compiling wrapper.o?
Using the Configure build type in Cabal, you can write a little configure script that substitutes the output of the sdl-config command for a placeholder variable. The placeholders in a $foo.buildinfo.in file are then replaced, yielding a $foo.buildinfo file, which Cabal will include in the build process.
General solution: the configure script
#!/bin/sh
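# substitute the output of sdl-config for the #SDLFLAGS# placeholder below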
SDLFLAGS=`sdl-config --cflags`
echo Found "$SDLFLAGS"
sed 's,#SDLFLAGS#,'"$SDLFLAGS"',' z.buildinfo.in > z.buildinfo
The $foo.buildinfo.in file
cc-options: #SDLFLAGS#
The .cabal file
Build-type: Configure
When you run "cabal configure" the "cc-options" field in z.buildinfo will be created to hold:
cc-options: -I/usr/include/SDL -D_GNU_SOURCE=1 -D_REENTRANT
which cabal will include in the build.
Done.
Specific solution for pkg-config tools
For tools that support the pkg-config-style of configuration, such as sdl or cairo and others, Cabal has specific support already:
pkgconfig-depends: package list
A list of pkg-config packages, needed to build this package. They can be annotated with versions, e.g. gtk+-2.0 >= 2.10, cairo >= 1.0. If no version constraint is specified, any version is assumed to be acceptable. Cabal uses pkg-config to find if the packages are available on the system and to find the extra compilation and linker options needed to use the packages.
If you need to bind to a C library that supports pkg-config (use pkg-config --list-all to find out if it is supported) then it is much preferable to use this field rather than hard code options into the other fields.
So for sdl you just need:
pkgconfig-depends: sdl
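In context, the field sits in the library or executable stanza of your .cabal file (the module name here is hypothetical):
library
  exposed-modules:   Graphics.UI.MyWrapper
  build-depends:     base
  pkgconfig-depends: sdl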
Use the Configure build type in your $PROJ_NAME.cabal file and generate a $PROJ_NAME.buildinfo file from a $PROJ_NAME.buildinfo.in template with a configure script. Look at the source of the SDL library on Hackage for an example. This section of the Cabal user guide provides more details.
One tip: do not forget to mention $PROJ_NAME.buildinfo.in and configure in the extra-source-files field.
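For example, with a hypothetical package named myproj:
extra-source-files:
  configure
  myproj.buildinfo.in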
