I'm doing some experiments on a Haskell module and I have a problem with a source file I wish to modify.
I have many reasons to think that GHC is picking up the library installed (with cabal) on my system rather than the local source files.
I deleted the *.o files locally, and the other source files in this module are not rebuilt by GHC.
Can I force GHC to use the local source files of a module, or to ignore a particular installed package?
Yes, use ghc -hide-package evil-package. Or you can hide the package temporarily with ghc-pkg hide evil-package, and then undo it later with ghc-pkg expose evil-package.
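For example, a sketch (assuming the conflicting package is called evil-package and the local copy of its sources lives under ./src):
ghc --make -hide-package evil-package -i./src Main.hs
With the package hidden, GHC falls back to the -i search path and compiles the local modules instead of the installed ones.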
I have the following directory structure:
libgs - The basic Global Script abstract machine and standard library
gsi - A Global Script interpreter
gs2hs - A Global Script to Haskell compiler
Most, though of course not all, of this code is written in Global Script and translated to Haskell by a Haskell program, hsgs2hs.
As such, the code in gsi and gs2hs both depend on the modules from libgs.
Because of somewhat sloppy code organization on my part, the compiler in gs2hs also depends on the front-end modules (parser, type-checker, etc.) from the gsi directory.
Legal aside: If it matters: my code is freely available online, but is not open-source, and its license does not permit redistribution through Hackage. End legal aside.
I can make this directory structure work by running
ghc --make -i../libgs gsi.hs -o gsi
in the gsi directory, and
ghc --make -i../libgs -i../gsi gs2hs.hs -o gs2hs
in the gs2hs directory.
This has the problem that, every time I do both builds in sequence, GHC recompiles every single module in the libgs directory, and every shared module in the gsi directory, telling me 'flags changed'.
I figure, ok, I should probably be using packages for re-used code in Haskell, right? So I convert libgs to a package:
Add a libgs.cabal file to that directory, listing all the modules as exposed modules.
Add a libgs/install-all script that runs cabal install --lib --package-env $REPO_ROOT/package.env ., and call that script before building gsi and gs2hs
Add -package-env $REPO_ROOT/package.env to the GHC flags in the gsi and gs2hs directories.
No joy!
Now, any change to libgs at all - even just adding a new module to it - causes GHC to recompile every module in the gsi directory, telling me the fundamental module GSI.Value has changed. Even though it actually hasn't; the source code for that module and everything it depends on (which isn't much) is unchanged. Just the hash for the package it's coming from has changed.
How do I stop GHC from recompiling the world constantly, and get it to only recompile things when the result can actually be different?
Add a libgs.cabal file to that directory, listing all the modules as exposed modules.
Ok, good.
Add a libgs/install-all script that runs cabal install --lib --package-env $REPO_ROOT/package.env
Don't do that. As a general rule, never use install --lib; it's usually better to let Cabal figure out when to install libraries. The easiest way is to put the executables in the package itself: you can have a library section in the .cabal file as well as arbitrarily many executable sections (executable gsi, executable gs2hs). A sketch of such a file is below.
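For illustration, a minimal sketch of such a combined libgs.cabal (module names, version bounds, and the hs-source-dirs layout are assumptions, not taken from the actual project):
cabal-version:      >=1.10
name:               libgs
version:            0.1.0.0
build-type:         Simple

library
  hs-source-dirs:   libgs
  exposed-modules:  GSI.Value
  -- ...plus the rest of the libgs modules
  build-depends:    base
  default-language: Haskell2010

executable gsi
  hs-source-dirs:   gsi
  main-is:          gsi.hs
  build-depends:    base, libgs
  default-language: Haskell2010

executable gs2hs
  -- gs2hs reuses the front-end modules that currently live in gsi/
  hs-source-dirs:   gs2hs, gsi
  main-is:          gs2hs.hs
  build-depends:    base, libgs
  default-language: Haskell2010
With this layout, a single cabal build at the top level rebuilds only what actually changed.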
Alternatively, you can keep libgs a package that doesn't care about executables, but have these in their own package each. Then you don't do any builds in the package directories themselves, but instead put a cabal.project file in your main src directory, saying
packages: ./libgs ./gsi ./gs2hs
Then run cabal new-build in that directory. It'll collectively store the required object files in its dist-newstyle directory.
I tried cabal install GLUT which gave the following:
Setup: Missing dependency on a foreign library:
* Missing C library: glut32
This problem can usually be solved by installing the system package that
provides this library (you may need the "-dev" version). If the library is
already installed but in a non-standard location then you can use the flags
--extra-include-dirs= and --extra-lib-dirs= to specify where it is.
So I thought, OK, let's get the sources and point cabal to the directories. I first tried freeglut, running the following:
cabal install GLUT --extra-include-dirs="<path to freeglut>\include"
--extra-lib-dirs="<path to freeglut>\src"
Same thing, so I thought maybe it doesn't work with freeglut, and got glut:
cabal install GLUT --extra-include-dirs="<path to glut>\include"
--extra-lib-dirs="<path to glut>\lib"
When this didn't work either, I downloaded the source, ran cabal install inside the directory, and then runghc Setup configure. Thinking there might be some parse error in the paths, I tried every possible way of writing a file path known to man: quotes, no quotes, backslashes, double backslashes, forward slashes, and every combination of the above. I even placed all the files on my PATH in hopes it would find them. All other options exhausted, I proceeded to sacrifice a goat to satan, but still no dice.
The question is, what do I have to do to convince GHC to find this library? (This is on Windows 7.)
You need to make the libglut32.a import library accessible to the compiler (see this answer for information about what import libraries are). Just copy it under $GHCDIR/mingw/lib. Alternatively, try the Haskell Platform installer, which ships with a pre-compiled version of the GLUT bindings.
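For example, a sketch (the placeholder paths follow the question's convention and are not literal):
copy "<path to glut>\lib\libglut32.a" "<path to GHC>\mingw\lib"
cabal install GLUT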
I've installed ghc 6.12.3, and then the Haskell Platform. I'm trying to compile a test program:
$ ghc test.hs
test.hs:3:0:
Failed to load interface for `Bindings':
Use -v to see a list of the files searched for.
so, naturally, I do
cabal install Bindings
Which works fine, and places the package in ~/.cabal/lib/bindings-0.1.2. The problem is that when I go to compile again with ghc, it still doesn't find the package I've installed with cabal.
compiling in verbose mode gives:
ghc -v test.hs
Using binary package database: /home/ludflu/ghc/lib/ghc-6.12.3/package.conf.d/package.cache
Using binary package database: /home/ludflu/.ghc/x86_64-linux-6.12.3/package.conf.d/package.cache
As suggested by another stackoverflow user, I tried:
ghc-pkg describe rts > rts.pkg
vi rts.pkg # add the /home/ludflu/.cabal/lib to `library-dirs` field
ghc-pkg update rts.pkg
But to no avail. How do I add the .cabal directory to the list of package directories to search?
Thank you!
You can check which packages are installed with ghc-pkg list. It may be that you need to specify the package to ghc with -package <pkgname>, or I believe adding --make will trigger a chasing down of dependencies, including packages.
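For example (the package name here is just the one from the question; adjust as needed):
ghc-pkg list bindings
ghc --make -package bindings test.hs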
Edit: the bindings package is indeed obsolete; see the hackage page. This isn't a package management problem: the only module available is Bindings.Deprecated, which you are perfectly able to load, even though it is an empty module. I believe the relevant parts have been broken out into bindings-<module>, so if you want the bindings functionality you should look to those packages.
http://www.haskell.org/haskellwiki/Cabal-install
One thing to be especially aware of is that cabal installs packages locally by default, whereas the commands
runhaskell Setup configure
runhaskell Setup build
runhaskell Setup install
install globally by default. If you install a package globally, the local packages are ignored. The default for cabal-install can be modified by editing the configuration file.
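For example, to make the Setup route install per-user like cabal-install does, you can pass --user at the configure step (a sketch of the same three commands):
runhaskell Setup configure --user
runhaskell Setup build
runhaskell Setup install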
I was getting the same error with the runhaskell command. I used cabal in the directory containing the .cabal file instead, and that resolved the error.
When compiling a Haskell source file via ghc --make foo.hs, GHC always leaves behind a variety of intermediate files other than foo.exe, namely foo.hi and foo.o.
I often end up having to delete the .hi and .o files to avoid cluttering up the folders.
Is there a command line option to tell GHC not to leave behind its intermediate files? (When asked on #haskell, the best answer I got was ghc --make foo.hs && rm foo.hi foo.o.)
I've gone through the GHC docs a bit, and there doesn't seem to be a built-in way to remove the temporary files automatically -- after all, GHC needs those intermediate files to build the final executable, and their presence speeds up overall compilation when GHC knows it doesn't have to recompile a module.
However, you might find that setting the -outputdir option will help you out; that will place all of your object files (.o), interface files (.hi), and FFI stub files in the specified directory. It's still "clutter," but at least it's not in your working directory anymore.
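For instance (the directory name build is just an example):
ghc --make -outputdir build foo.hs -o foo
This leaves foo in the working directory and puts foo.hi, foo.o, and any stub files under build/.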
GHC now has the options -no-keep-hi-files and -no-keep-o-files. See the GHC user's guide for more information.
My usual workflow is to use cabal rather than ghc directly. This sets the -outputdir option to an appropriate build folder and can do things like build haddock documentation for you. All you need is to define the .cabal file for your project and then say cabal install or cabal build instead of running ghc directly. Since you need to follow this process in the end if you ever want to share your work on Hackage, it is a good practice to get into, and it helps manage package dependencies as well.
You can set -hidir to /dev/null, I think, sending the interface files there. Also, the -fno-code option in general turns off a lot of output. You might just want to use Cabal.
Turns out that using -hidir/-odir/-outputdir is no good; /dev/null is a file, and not a directory. See http://www.haskell.org/pipermail/xmonad/2010-May/010182.html
2 cents to improve the workflow a bit:
We can put the following alias into the .bashrc (or similar) config file:
alias hsc='_hsc(){ ghc -no-keep-hi-files -no-keep-o-files "$@";}; _hsc'
And then just call
$ hsc compose.hs
[1 of 1] Compiling Main ( compose.hs, compose.o )
Linking compose ...
$ ls
compose compose.hs
Can cabal use hsc2hs to create .hs files? How?
I didn't find an answer in the manuals, by googling, or in other projects (I had my hopes up for gtk2hs, but it turns out it doesn't use cabal).
Yes: when you list module Foo in your .cabal file and Cabal finds Foo.hsc on disk, it knows it must run hsc2hs on that module first.
Cabal transparently handles the existence of .hsc files.
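As a minimal sketch (the module name, package layout, and the INT_MAX example are assumptions for illustration): list the module in the .cabal file as usual, e.g. exposed-modules: Bindings.Foo, and provide Bindings/Foo.hsc instead of Bindings/Foo.hs:
-- Bindings/Foo.hsc (hypothetical module; hsc2hs generates Bindings/Foo.hs from it)
module Bindings.Foo (cIntMax) where

import Foreign.C.Types (CInt)

#include <limits.h>

-- hsc2hs replaces the #{const ...} directive with the value of the C macro INT_MAX
cIntMax :: CInt
cIntMax = #{const INT_MAX}
Running cabal build (or configure/build via Setup) invokes hsc2hs automatically; no extra configuration is needed beyond listing the module.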