Hooking up a build tool in Cabal (Haskell)

I was trying to use the bnfc tool to generate a bunch of files (lexer, parser, etc.) for me. This works fine. Now I wanted to clean this up a bit, so that I don't have to run bnfc manually and have it generate a number of files that clutter my /src folder.
I tried the Cabal mechanism where you list the tool in the build-tools field of the .cabal file and list the files you expect to be generated in the extra-source-files field. This worked for me for Alex and Happy, as they are recognised as build-tools by Cabal, but bnfc isn't. Is there a way I can hook up bnfc, or any tool in general, with Cabal and have Cabal recognise it as a build tool?

Apparently cabal doesn't know about bnfc (it doesn't appear on the list of recognised build tools).
It looks like there's no way to do it using just the .cabal file, but the cabal sources contain an example of how to hook up a preprocessor in your Setup.hs, under tests/PackageTests/CustomPreProcess/Setup.hs, using the hookedPreProcessors user hook (all hooks are defined in UserHooks.hs).
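A minimal sketch modelled on that test, adapted to invoke bnfc on .cf grammars, might look like the following. It targets the Cabal 1.x hook API (the handler type gained an extra argument in later Cabal versions), and the "cf" suffix and the bnfc flags are assumptions you would adjust for your project:

import Distribution.Simple
import Distribution.Simple.PreProcess
import Distribution.Simple.Utils (rawSystemExit)
import Distribution.PackageDescription (BuildInfo)
import Distribution.Simple.LocalBuildInfo (LocalBuildInfo)

main :: IO ()
main = defaultMainWithHooks
  simpleUserHooks { hookedPreProcessors = [("cf", bnfcPreProcessor)] }

-- Run bnfc on every *.cf module listed in the .cabal file.
-- Note that bnfc emits several files (Abs*, Lex*.x, Par*.y, ...), so it does
-- not fit Cabal's one-input/one-output preprocessor model perfectly.
bnfcPreProcessor :: BuildInfo -> LocalBuildInfo -> PreProcessor
bnfcPreProcessor _bi _lbi = PreProcessor
  { platformIndependent = True
  , runPreProcessor = mkSimplePreProcessor $ \inFile _outFile verbosity ->
      -- "--haskell" is the flag for bnfc's Haskell backend; adjust flags and
      -- output options to match your bnfc version and module layout.
      rawSystemExit verbosity "bnfc" ["--haskell", inFile]
  }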

Related

How to configure syntastic to use build-depends and hs-source-dirs from my test suite in the .cabal file of the package?

Syntastic works great on my system with hdevtools and hlint. But if I'm editing a file under a test directory that imports packages listed only under the test-suite section of the cabal package, it marks my imports as bogus and tells me to include them in my cabal file. The same problem happens with hs-source-dirs: it only picks up the directories under the library or executable sections.
There is no silver bullet. You can either set g:syntastic_haskell_hdevtools_args and friends to the proper flags for your project, or write a wrapper script similar to this and point g:syntastic_haskell_hdevtools_exec to it. Syntastic has no built-in support for looking at cabal files.

How to get cabal and nix to work together

As far as I understand, Nix is an alternative to cabal sandbox.
I finally managed to install Nix, but I still don't understand how it can replace a sandbox.
I understand you don't need cabal when using Nix and the wrapped version of GHC; however, if you want to publish a package you'll need to package it with cabal at some point. Therefore, you need to be able to write and test your cabal configuration within Nix. How do you do that?
Ideally, I would like an environment similar to cabal sandbox but "contained" within Nix. Is that possible? In fact, what I would really like is the equivalent of nested sandboxes, as I usually work on projects made of multiple packages.
Update about my current workflow
At the moment I work on 2 or 3 independent projects (P1, P2, P3), each composed of 2 or 3 cabal packages, say for P1: L11 and L12 (libraries) and E11 (an executable). E11 depends on L12, which depends on L11. I mainly split the executables from the libraries because they are private and kept in a private git repo.
In theory, each project could have its own sandbox (shared between its packages). I tried that (having a common sandbox for L11, L12 and E11), but it quickly becomes annoying: if you modify L11, you can't rebuild it because E11 depends on it, so I have to uninstall E11 first to recompile L11.
That might not be exactly the same case, but I ran into a similar problem.
This would be fine if I only occasionally modified L11, but in practice I change it more than E11.
As the shared sandbox doesn't work, I went back to the one-sandbox-per-package solution. It's working, but it's less than ideal.
The main problem is that if I modify L11, I need to compile it twice (once in L11, and then again in E11). Also, each time I start a new sandbox, as everybody knows, I have to wait a while for every package to be downloaded and recompiled.
So by using Nix, I'm hoping to be able to set up separate cabal "environments" per project, which would solve all the issues above.
Hope this is clearer.
I do all my development using Nix and cabal these days, and I can happily say that they work in harmony very well. My current workflow is very new, in that it relies on features in nixpkgs that have only just reached the master branch. As such, the first thing you'll need to do is clone nixpkgs from Github:
cd ~
git clone git://github.com/nixos/nixpkgs
(In the future this won't be necessary, but right now it is).
Single Project Usage
Now that we have a nixpkgs clone, we can start using the haskellng package set. haskellng is a rewrite of how we package things in Nix, and is of interest to us for being more predictable (package names match Hackage package names) and more configurable. First, we'll install the cabal2nix tool, which can automate some things for us, and we'll also install cabal-install to provide the cabal executable:
nix-env -f ~/nixpkgs -i -A haskellngPackages.cabal2nix -A haskellngPackages.cabal-install
From this point, it's all pretty much clear sailing.
If you're starting a new project, you can just call cabal init in a new directory, as you would normally. When you're ready to build, you can turn this .cabal file into a development environment:
cabal init
# answer the questions
cabal2nix --shell my-project.cabal > shell.nix
This gives you a shell.nix file, which can be used with nix-shell. You don't need to use this very often though - the only time you'll usually use it is with cabal configure:
nix-shell -I ~ --command 'cabal configure'
cabal configure caches absolute paths to everything, so now when you want to build you just use cabal build as normal:
cabal build
Whenever your .cabal file changes you'll need to regenerate shell.nix - just run the command above, and then cabal configure afterwards.
Multiple Project Usage
The approach scales nicely to multiple projects, but it requires a little more manual work to "glue" everything together. To demonstrate how this works, let's consider my socket-io library. This library depends on engine-io, and I usually develop both at the same time.
The first step to Nix-ifying this project is to generate default.nix expressions alongside each individual .cabal file:
cabal2nix engine-io/engine-io.cabal > engine-io/default.nix
cabal2nix socket-io/socket-io.cabal > socket-io/default.nix
These default.nix expressions are functions, so we can't do much right now. To call the functions, we write our own shell.nix file that explains how to combine everything. For engine-io/shell.nix, we don't have to do anything particularly clever:
with (import <nixpkgs> {}).pkgs;
(haskellngPackages.callPackage ./. {}).env
For socket-io, we need to depend on engine-io:
with (import <nixpkgs> {}).pkgs;
let modifiedHaskellPackages = haskellngPackages.override {
      overrides = self: super: {
        engine-io = self.callPackage ../engine-io {};
        socket-io = self.callPackage ./. {};
      };
    };
in modifiedHaskellPackages.socket-io.env
Now we have shell.nix in each environment, so we can use cabal configure as before.
The key observation here is that whenever engine-io changes, we need to reconfigure socket-io to detect these changes. This is as simple as running
cd socket-io; nix-shell -I ~ --command 'cabal configure'
Nix will notice that ../engine-io has changed, and rebuild it before running cabal configure.

Alex, Happy, Cabal, and Re-preprocessing

I am using Alex 3.0.5, Happy 1.18.10, Cabal 1.16.0.2
I have a small compiler project that is built using Cabal. I am exposing the compiler's internals as a library, so I have MyLangLex and MyLangPar in the exposed-modules section. If I delete the .hs files that are generated by Alex and Happy, then running cabal configure and then cabal build will run Alex and Happy first, generate the files, and then proceed with the build, and everything works as expected.
However, if I do not delete these files, Alex and Happy either do not build the files, or they don't put them in the right place. I think Happy runs, because I see a message from Happy; however, when I look at the .hs file that should be generated, it is incorrect (it doesn't have the change in it), and I can tell for sure that the version of the .hs file that Cabal uses in the build is the wrong one, because the behaviour that should have changed does not. I.e. the change to the .y file does not get incorporated into the built program, so I suspect that while Happy is run, Cabal places its output in some temp directory and then uses the old .hs file, which is still there, for the build. But I am not sure about this.
Is the error on my part or is one of the tools misbehaving?
It sounds like you need an "other-modules:" directive in your library section for Lex.x and Par.y:
library
  ...
  build-tools: alex, happy
  other-modules: Compiler.RSL.Syntax.Lex, Compiler.RSL.Syntax.Par
The other-modules directive together with build-tools will instruct cabal to use alex and to create Compiler/RSL/Syntax/Lex.hs from the .x file if it doesn't exist (and the same for Par.hs).
Alternatively, add Compiler.RSL.Syntax.Lex to your 'exposed-modules' list. This tells cabal that the Lex.hs file should exist, and so if it doesn't cabal will look for ways to build it using the tools in the build-tools line.

Why doesn't the cabal tool use Setup.lhs/Setup.hs?

I've added a putStrLn "Hello" line to the main function of my Setup.lhs and was expecting to see it when running cabal configure or cabal build. But I did not.
Then I compiled Setup.lhs with ghc --make, ran ./Setup configure, and the line was shown.
If this is intentional, I don't see the rationale behind it, or even the need for a Setup.lhs file at all. Can you clear these things up for me?
You most likely have
build-type: Simple
in your .cabal file. If you select the Simple build type, you essentially promise that your Setup file does nothing but invoke defaultMain, and the cabal binary will not invoke it. If you want to ensure that your Setup file is run every time, then change the line to
build-type: Custom
You also ask about the rationale for requiring the Setup file anyway: actually, it isn't required if you use the Simple build type. The cabal binary will happily configure and install the package without one. However, it is considered good style to include a Setup file for any package, because it allows users who have the Cabal library available, but not the cabal-install tool, to install the package (and Hackage enforces the presence of a Setup file for this reason).
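With build-type: Custom, cabal compiles and runs your Setup file on every invocation, so a Setup.hs along these lines (a minimal sketch) will print the line during configure, build, and so on:

import Distribution.Simple (defaultMain)

main :: IO ()
main = do
  -- This runs on every cabal configure / build, because the Custom build
  -- type makes cabal compile and invoke this file instead of bypassing it.
  putStrLn "Hello"
  defaultMain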

Run Happstack app without cabal

I'm trying out Happstack. I installed Happstack and created a project: happstack new project web. A new folder 'web' was created with the guestbook project under it. So now I want to run it. The only way I could do it was to run cabal install. But I want to run my app without installing it with cabal! Executing run.sh fails with: Could not find module 'Paths_guestbook'. How can I do that?
Edit:
In general, is there a way to run a Happstack app without rebuilding, like in Snap?
In general, you can always build Cabal projects without installing simply by doing:
$ cabal configure
$ cabal build
The resulting executable will usually be called dist/build/<project>/<project>.
The specific error you're getting is because the code must be built with Cabal to get the Paths_guestbook module, which will contain information about the location of data files used by it. (It may be the case that it's unable to find these data files if you run the executable without installing it; in that case, you'll need a more elaborate solution, such as cabal-dev.)
(I'm not a Happstack user, so I don't know if there's an official way to accomplish this, but this should work for basically any Cabal-based project in general. The repository shows that run.sh was last modified in 2009, so I suspect it has simply bit-rotten. It doesn't do anything special, though, so cabal build should work just fine.)
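For reference, here is roughly what the generated Paths_guestbook module provides. The exported names below are the ones cabal generates (it also generates getBinDir, getLibDir and friends), but the bodies are simplified placeholders rather than the generated code:

-- Sketch of dist/build/autogen/Paths_guestbook.hs (simplified; not the
-- actual generated file).
module Paths_guestbook (version, getDataDir, getDataFileName) where

import Data.Version (Version, makeVersion)
import System.FilePath ((</>))

-- In the generated module this comes from the version field of guestbook.cabal.
version :: Version
version = makeVersion [0, 1]

-- In the generated module this is the data directory chosen at configure time
-- (overridable via the guestbook_datadir environment variable).
getDataDir :: IO FilePath
getDataDir = return "."

getDataFileName :: FilePath -> IO FilePath
getDataFileName name = do
  dir <- getDataDir
  return (dir </> name)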
SHORT VERSION:
The run.sh script seems to be missing an include parameter. Modify it to look like this:
#!/bin/sh
runghc -isrc -isrc-interactive-only src/Main.hs
I have updated the run.sh in darcs to include this change.
LONG VERSION:
Normally that flag is not needed for Happstack applications; you can usually just do runhaskell Main.hs. But in that particular example, Main.hs explicitly imports:
import Paths_guestbook (version)
which is used in the versionInfo function so that the server can report its own version number. The version number in src-interactive-only is hardcoded, though, and will generally be out of date, so it is only correct if you actually build with cabal.
The Paths_guestbook module is normally created automatically when cabal build is run. So, another fix would be to change the run.sh to:
#!/bin/sh
runghc -isrc -idist/build/autogen src/Main.hs
And run cabal configure && cabal build once. After that you will be able to use run.sh (until you do a cabal clean).
Another option would be to set a CPP flag in the .cabal file, and only import Paths_guestbook when the application is being built via cabal.
For example in the happstack.com source code:
http://patch-tag.com/r/stepcut/happstackDotCom/snapshot/current/content/pretty/Main.hs
In line 40 (or so) you will see an #ifdef __CABAL__. happstack.com needs to be able to know where to find static content such as .css files. When doing runhaskell Main.hs in the local directory, it will look for the files in a sub-directory of the local directory. If you do cabal install, it will instead look wherever cabal installs the data files. Or, you can override the default location with command-line arguments. (Which is what the Debian packaging for that app does.)
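A minimal sketch of that conditional import is shown below. The versionInfo name comes from the guestbook example; the way the flag is defined (e.g. a cpp-options: -D__CABAL__ line in the .cabal file) is an assumption about how such a flag would typically be wired up, not something taken from the happstack.com sources:

{-# LANGUAGE CPP #-}
-- __CABAL__ is only defined when the .cabal file passes it to the compiler,
-- e.g. via cpp-options: -D__CABAL__ (that line is an assumption). When run
-- with plain runghc, the #else branches are used and Paths_guestbook is
-- never imported.
module Main where

#ifdef __CABAL__
import Paths_guestbook (version)
import Data.Version (showVersion)
#endif

versionInfo :: String
#ifdef __CABAL__
versionInfo = "guestbook " ++ showVersion version
#else
versionInfo = "guestbook (development build, version unknown)"
#endif

main :: IO ()
main = putStrLn versionInfo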
Unfortunately, the happstack new project command is somewhat bitrotten because the author became a parent and has not had time to work on it in a long time. It will likely be removed from the upcoming Happstack release in order to reduce confusion.
In order to be truly useful, I think the command needs to prompt for a bunch of values and then generate a new project from a set of templates. Similar to how 'cabal init' works. But currently, no one has volunteered the time to make that happen.
To see changes to your source appear automatically without restarting the server, you can use the happstack-plugins library. There is a screencast of it here:
http://happstack.blogspot.com/2010/10/recompile-your-haskell-based-templates.html

Resources