How to reuse cabal compiled modules when using ghci

I have a fairly large haskell project, and running ghci on some files can require compiling dozens or hundreds of modules before it gets to a prompt, which can take a number of minutes. I'm using cabal, and so I generally have already compiled object files under dist/. But ghci only looks for .o files next to the source .hs files; it does not know about cabal's dist/. Is there any simple and good way to make ghci load those object files rather than recompiling everything on its own?
I'm asking for a simple and good way, because I have complicated and ugly ways to do it. :)
For example, this will make ghci reuse the cabal-compiled modules. I came up with this command line by copying and modifying the way cabal runs ghc, making sure it sets everything the same as the last cabal build, which is necessary for ghci to load the modules.
ghci -package-conf dist/package.conf.inplace -i -idist/build/git-annex/git-annex-tmp -i. -idist/build/autogen -Idist/build/autogen -Idist/build/git-annex/git-annex-tmp -optP-include -optPdist/build/autogen/cabal_macros.h -odir dist/build/git-annex/git-annex-tmp -hidir dist/build/git-annex/git-annex-tmp -stubdir dist/build/git-annex/git-annex-tmp -XHaskell98 dist/build/git-annex/git-annex-tmp/Utility/libdiskfree.o dist/build/git-annex/git-annex-tmp/Utility/libmounts.o
Seems to me it should be possible for a "cabal ghci" to calculate this command line and run it, or perhaps there is a tool I don't know of that already does so.

You can set the -odir and -ohi options to point to the cabal build directory; they are documented at http://www.haskell.org/ghc/docs/7.6.1/html/users_guide/separate-compilation.html#output-files
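For illustration, a rough sketch of that idea using the directory-level flags -odir and -hidir from the same section of the users guide (the dist path is copied from the question above, and SomeModule.hs stands for whatever module you want to load):
ghci -i. -idist/build/git-annex/git-annex-tmp -odir dist/build/git-annex/git-annex-tmp -hidir dist/build/git-annex/git-annex-tmp SomeModule.hs
Whether ghci actually reuses the existing .o files still depends on the other flags matching what cabal used for the last build, as the question notes.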

Related

How can I run GHCi against a compiled package?

I should really know this by now, but I don't. I'm often working on a Cabal-based package and have just run a successful cabal build. Now I want to try some things out in GHCi. If I run cabal repl, then GHC recompiles the whole package into bytecode and runs it in the interpreter. Not what I want at all! If I were just running GHCi directly, I'd use something like -O -fobject-code, but that won't give me the package context. I just want "Give me a repl with the package as it's been compiled, compiling additional things only as necessary." How do I do it?
I don't know the right way, but I do know a workaround that can sometimes be useful. If the thing you care about is a library component, you can ask for a repl for an executable component.
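For example, something along these lines, where my-exe stands for whichever executable component your package defines:
cabal repl exe:my-exe
The in-package library is then built as an ordinary compiled dependency of that executable, and only the executable's own modules are interpreted.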
I believe --repl-options -fobject-code kind of does what you want:
cabal repl --repl-options -fobject-code --repl-options -O --builddir dist-repl
This will give you incremental building of compiled code as you work in GHCi. Caveats:
dist-repl is an alternative directory for the -fobject-code build objects. As of cabal 3.6.2.0 at least, trying to reuse the regular output from cabal build leads to some unnecessary rebuilds and other strange behaviour, as reported at cabal issue #3565. That being so, it's better to compromise and use --builddir to keep a separate set of build objects. Note that cabal clean accepts the --builddir option just fine.
Setting the optimisation level explicitly is necessary, as otherwise the default -O0 from cabal repl will override your package setting.
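As a usage note, the separate build directory is cleaned the same way (dist-repl being the name chosen above):
cabal clean --builddir dist-repl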

Where do I find (and run) an executable compiled with a cabal sandbox?

I'm compiling my myProgram.lhs using a cabal sandbox (set up with cabal sandbox init). I'm using the simplest approach I've come up with:
cabal exec -- ghc myProgram
or (having a rule in Makefile)
cabal exec -- make myProgram
After that, myProgram.o appears in my source directory, but not the executable myProgram.
How do I run the resulting program?
cabal exec -- ./myProgram
doesn't work.
For now, the simplest approach I've come up with to test it is:
cabal exec -- runghc myProgram.lhs
but I don't like this.
Do you know where the resulting executable is?
(I haven't created any cabal file for my project yet. I simply used to compile the program with bare ghc and test it; then, when I needed custom dependencies, I set up the cabal sandbox and installed the dependencies there manually.)
On closer inspection this didn't really look like a problem with cabal exec, and indeed it wasn't!
My history
Around the same time I started using the cabal sandbox, I had explicitly given my module a custom name in the source file (myProgram.lhs). In that case even bare ghc (without cabal exec) wouldn't generate the executable either, as answered in Cabal output is redirected but not generated. (I simply couldn't test the bare ghc command, because the dependencies were in the sandbox, so my module wouldn't compile outside it.)
Explanation
Explanation quoted from that Q&A:
I get the warning
output was redirected with -o, but no output will be generated because there is no main module.
A quote from The Haskell 98 Report:
A Haskell program is a collection of modules, one of which, by convention, must be called Main and must export the value main.
The solution
A solution is to add -main-is MyProgram.main to ghc opts. Then it generates the executable.
./myProgram simply appears in my source directory now, no matter whether I call
ghc -main-is MyProgram.main myProgram
or
cabal exec -- ghc -main-is MyProgram.main myProgram
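For context, a rough sketch of what such a myProgram.lhs might look like (MyProgram being the custom module name mentioned above; in literate Haskell, code lines start with >):
> module MyProgram where
>
> main :: IO ()
> main = putStrLn "hello"
Because the module is not called Main, a plain ghc myProgram stops after producing myProgram.o, and -main-is MyProgram.main is what tells GHC which function to treat as the program's entry point.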

Generating correct link dependencies for GHC and Makefile style builds

I have a Haskell project where a number of executables are produced from mostly the same modules. I'm using a Makefile to enable parallel builds, and it very nearly works the way I want. Here's a stripped down version of my current Makefile, with ideas taken from http://www.haskell.org/ghc/docs/latest/html/users_guide/separate-compilation.html#using-make:
HFLAGS=-O3 -Wall -v0 -fno-ignore-asserts
HASKELLS=bin1 bin2 bin3 bin4 bin5 bin6 bin7 bin8 bin9
all: $(HASKELLS)
%.hi: %.o
@:
%.hi %.o: %.hs
ghc -c $(HFLAGS) $<
$(HASKELLS): %: %.o
ghc --make $(HFLAGS) $@
.hsdepend: *.hs
ghc -M -dep-makefile .hsdepend *.hs
rm -f .hsdepend.bak
include .hsdepend
As you can see, I still use ghc --make for linking (only); this way the individual modules can be compiled in parallel, and ghc --make only invokes the linker.
Unfortunately this is not foolproof: a relink of, say, bin1 is triggered only if bin1.o is newer than the executable, but not if only one of the other object files has been updated. This can happen when a change to a module updates its .o file but leaves the module's interface unchanged, i.e. the .hi file is not touched.
One alternative solution would be to trigger a null ghc --make for every binary every time make is invoked; unfortunately, this is slow and clutters the output (I'd like to see when something was linked and when not).
ghc -M only generates a dependency line for each .o file, but none for the linked executables. The information about which .o files to link into which executable (given the name of the main module binN.hs) obviously is there, but it's not entirely clear to me if it's possible to get to it using any Makefile magic.
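For concreteness, the kind of rule that information would have to be turned into might look like this (Common and Util are placeholder module names; the recipe is the same one the Makefile already uses):
bin1: bin1.o Common.o Util.o
	ghc --make $(HFLAGS) $@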
The only way I can think of to generate rules like that is to write a post-processor for .hsdepend, but that seems excessive.
Can anyone suggest a better solution?
My advice would be, don't bother trying to make this work. It should work, and it would be nice if it worked, but ghc's -M support is currently broken (as in, it doesn't generate proper dependency rules, and omits rules for some non-Haskell files). Actually getting this to work reliably will take a great deal of effort, and in the end will trigger more rebuilds than strictly necessary.
Furthermore, support for parallel builds has been merged into GHC, so when ghc-7.8 is released you'll be able to use plain ghc --make to get parallel builds. Or you could use ghc's HEAD now.
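Once that is available, a single invocation can replace most of the Makefile, at least one binary at a time; a rough sketch (bin1 is one of the targets from the Makefile above, and -j4 asks --make to compile up to four modules in parallel):
ghc --make -j4 -O3 -Wall bin1
Run sequentially for each binary, the shared modules only get compiled once, so the later invocations mostly just link.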

how to execute haskell program in cygwin

I compiled my helloworld.hs and got a helloworld.o file. I tried ./helloworld, but it didn't work, so what is the right way to run the program?
I am using cygwin. I just run $ ghc --make helloworld.hs and get helloworld.hi, helloworld.exe.manifest, and helloworld.o files; I don't know what I need to do next...
Depending on whether you used a Cygwin ghc or a native Windows ghc, you got either a.out (the traditional historical name) or helloworld.exe. If you have a.out, you'll need to rename it to something.exe to execute it on Windows.
You can easily tell ghc how to call the executable: ghc -o helloworld.exe --make helloworld.hs.
By the way ghc --help would have told you:
To compile and link a complete Haskell program, run the compiler like so:
    ghc-6.8.2 --make Main
where the module Main is in a file named Main.hs (or Main.lhs) in the current directory. The other modules in the program will be located and compiled automatically, and the linked program will be placed in the file `a.out' (or `Main.exe' on Windows).
As you haven't specified anything about how you compiled, such as which compiler you're using, we can only guess.
The common way to get a .o (object) file out of ghc is using the -c switch; as the manual says, that means "do not link". The mnemonic is "compile only". Without linking, you have only a portion of a program, and it cannot be executed. Precisely what it needs to be linked against will depend on the particular object file, and some of that is filled in by default if you simply let the compiler run the linker. Linking separately is more complicated.
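A short illustration of the difference, using the file from the question:
ghc -c helloworld.hs      # compile only: produces helloworld.o and helloworld.hi, nothing you can run
ghc --make helloworld.hs  # compile and link: also produces the executable (helloworld.exe on Windows)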

How to stop GHC from generating intermediate files?

When compiling a Haskell source file via ghc --make foo.hs, GHC always leaves behind a variety of intermediate files other than foo.exe, namely foo.hi and foo.o.
I often end up having to delete the .hi and .o files to avoid cluttering up the folders.
Is there a command line option for GHC not to leave behind its intermediate files? (When asked on #haskell, the best answer I got was ghc --make foo.hs && rm foo.hi foo.o.)
I've gone through the GHC docs a bit, and there doesn't seem to be a built-in way to remove the temporary files automatically -- after all, GHC needs those intermediate files to build the final executable, and their presence speeds up overall compilation when GHC knows it doesn't have to recompile a module.
However, you might find that setting the -outputdir option will help you out; that will place all of your object files (.o), interface files (.hi), and FFI stub files in the specified directory. It's still "clutter," but at least it's not in your working directory anymore.
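For instance (the directory name build is just a placeholder):
ghc --make foo.hs -outputdir build
This puts foo.o, foo.hi, and any stub files under build/, while the executable itself still lands in the current directory.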
GHC now has the options -no-keep-hi-files and -no-keep-o-files. See the GHC users guide for more information.
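For example, on a GHC new enough to have them:
ghc --make foo.hs -no-keep-hi-files -no-keep-o-files
This builds the executable and discards the .hi and .o files afterwards.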
My usual workflow is to use cabal rather than ghc directly. This sets the -outputdir option to an appropriate build folder and can do things like build haddock documentation for you. All you need is to define the .cabal file for your project and then say cabal install or cabal build instead of running ghc directly. Since you'll need to follow this process in the end if you ever want to share your work on Hackage, it is a good practice to get into, and it helps manage package dependencies as well.
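For reference, a minimal .cabal file for a single-executable project might look something like this (the package and executable names are placeholders):
name:                foo
version:             0.1.0.0
build-type:          Simple
cabal-version:       >=1.10

executable foo
  main-is:             foo.hs
  build-depends:       base
  default-language:    Haskell2010
Then cabal build puts the intermediate files under its own build directory (dist/ or dist-newstyle/, depending on the cabal version) instead of next to your sources.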
You can set -hidir to /dev/null, I think, sending the interface files there. Also, the -fno-code option in general turns off a lot of output. You might just want to use Cabal.
Turns out that using -hidir/-odir/-outputdir is no good; /dev/null is a file, and not a directory. See http://www.haskell.org/pipermail/xmonad/2010-May/010182.html
2 cents to improve the workflow a bit:
We can put the following alias into the .bashrc (or similar) config file:
alias hsc='_hsc(){ ghc -no-keep-hi-files -no-keep-o-files "$@";}; _hsc'
And then just call
$ hsc compose.hs
[1 of 1] Compiling Main ( compose.hs, compose.o )
Linking compose ...
$ ls
compose compose.hs
