Default GHC flags with Haskell Stack - haskell

I wanted to set some custom compiler flags for all stack-compiled packages on my machine (more aggressive optimizations than the defaults), and following the stack guide, I added some ghc-options (applying to "$everything") to my ~/.stack/config.yaml. These flags were properly applied to stack invocations outside of any project directory.
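For context, the relevant section of my ~/.stack/config.yaml looks roughly like this (the specific flags here are illustrative, not the exact ones I used):

```yaml
# ~/.stack/config.yaml -- flags shown are examples only
ghc-options:
  "$everything": -O2 -fexpose-all-unfoldings
```

The "$everything" key is stack's documented way to apply options to all packages, as opposed to "$locals" or "$targets".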
Within project directories, however, the stack.yaml options don't just take precedence; everything in ~/.stack/config.yaml is completely ignored! I have no ghc-options specified in any project-local stack.yaml files, but those specified ~/.stack/config.yaml have no effect.
The stack documentation would seem to suggest that options should coalesce, as I had expected them to:
stack has two layers of configuration: project and non-project. All of these are stored in stack.yaml files, but the former has extra fields (resolver, packages, extra-deps, and flags). The latter can be monoidally combined so that a system config file provides defaults, which a user can override with ~/.stack/config.yaml, and a project can further customize.
Unless I have a very different idea of what "customize" means, this isn't the behavior I'm observing. Is there something I'm doing incorrectly, or is the documentation misleading in this respect?

Yup, I consider this to be a bug. I have a PR that fixes it, but it hasn't been merged yet: https://github.com/commercialhaskell/stack/pull/3781


Nested git dependencies when using Stack (Haskell)

I have two Haskell libraries lib-a and lib-b, both hosted on private git repos. lib-b depends on lib-a, both build with no problem.
Now I want to import lib-b into another project, and thus add it to the stack configuration with the git directive, like this:
- git: git@github.com:dataO1/lib-b.git
  commit: deadbeef102958393127912734
Stack still seems to need a specific version for lib-a:
In the dependencies for application-0.1.0.0:
lib-a needed, but the stack configuration has no specified version (no package with that name found,
perhaps there is a typo in a package's build-depends or an omission from the stack.yaml packages
list?)
needed due to application-0.1.0.0 -> lib-b-0.1.0.0
The question now is: can stack somehow figure out specific versions for nested git dependencies without explicitly specifying them? If the project grows, I don't want to manually adjust this every time I update lib-a.
Sidenote: I'm using NixOS and the nix directive for all three stack projects.
Stack follows the snapshot-based model of package management, meaning that it has some "globally" specified set of packages (of fixed version) that you can use. In Stack's case this set of packages is called Stackage. The core idea is to have this clearly specified set of packages that you're working with.
So the short answer is no, it cannot figure this out by itself; you have to add the dependencies by hand.
But you only need to specify packages that are not in the snapshot. For example, lib-a is likely to depend mostly on commonly used Haskell packages (base, aeson, ...), and those will already be in Stackage. So even as the project grows, you will only be adding "a few" git refs.
So this doesn't generally tend to be a problem.
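Concretely, adding them by hand means listing both repositories in the top-level project's stack.yaml. A sketch, assuming lib-a lives in a sibling repository (the second URL and both commit hashes are placeholders):

```yaml
extra-deps:
- git: git@github.com:dataO1/lib-b.git
  commit: deadbeef102958393127912734
- git: git@github.com:dataO1/lib-a.git   # assumed location
  commit: 0123456789abcdef0123456789abcdef01234567
```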

hide deps info in a chain of extra-deps

This is a Haskell stack package dependency configuration question.
I created a Haskell library git:LibA
I created another Haskell library git:LibB depending on git:LibA
I created a Haskell application AppC depending on git:LibB
To compile LibB, I need to specify git:LibA in extra-deps section of LibB's stack.yaml with a commit checksum, which is reasonable.
To compile AppC, it seems that I need to specify both the following packages in extra-deps section of AppC's stack.yaml
git:LibB with a commit checksum
git:LibA with a commit checksum
Is there any way to only specify git:LibB in AppC or configure LibB to hide git:LibA information to applications?
Motivation of the question: my current AppC's stack.yaml is error-prone: if I update LibA and the commit checksum in LibB but forget to update the checksum in AppC, then I will not get the new LibA in AppC. In my opinion, the newer checksum of LibB should already contain the information of a newer LibA, and the developer of AppC should not need to update LibA's checksum in AppC but only LibB's checksum.
No, stack doesn't "see" the stack.yaml of any dependencies, only their respective .cabal descriptors.
If you're building multiple internal packages you can just put them all under the same source tree and list them under the packages list in your stack.yaml:
packages:
- LibA
- LibB
- AppC
Note that this doesn't mean you have to put them all into the same VCS if you don't want to; you could use git submodules, or you could even list git locations/hashes in much the same way you do with extra-deps directly.
That's of course not an option if your packages are supposed to be entirely independent, but at that point you'll probably want to look into a more structured solution, such as custom snapshots, anyway.
No, this is not possible. By design, stack uses only the stack.yaml of the project you are building. It does not read or use any stack.yaml file which may exist in any of your dependencies. One advantage of this design is that there is a single place to specify package versions. Otherwise it's not clear how to handle different stack.yaml requesting different versions of the same package.
There are a couple of options if you want more convenience when developing these packages together, and don't care as much about keeping them independent. You can specify relative paths in your stack.yaml, and always build with whatever versions you have locally checked out. Or you can bring all three into a single VCS repo, and use the VCS to manage which changes to A should be linked to which in C.
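For example, with the three packages checked out side by side, AppC's stack.yaml could point at the others via relative paths (directory names here are illustrative):

```yaml
packages:
- .         # AppC itself
- ../LibA   # assumed sibling checkout
- ../LibB
```

With this layout, stack always builds against whatever you have checked out locally, so there are no commit hashes to keep in sync.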

Conditional selection of integer simple in Stack

Due to licensing constraints, we need to use the integer-simple variant of GHC when compiling in Windows platforms. Currently this is specified in our stack.yaml file:
ghc-variant: integersimple
# ...
extra-deps:
- text-1.2.2.1
# Override default flag values for local packages and extra-deps
flags:
  text:
    integer-simple: true
However, this won't work on Linux.
It'd be nice to have a way to conditionally include the code above depending on the host platform. Is there a way of doing this using stack, and if not, how would you go about solving this problem?
The only alternative I can think of right now is having two stack files, but I'd like to avoid that.
TLDR: Use a custom Setup.hs.
It doesn't seem to be possible at present, because a .cabal file doesn't allow passing flags on to other packages (although you can do so via the command line) and stack's YAML doesn't currently allow conditionals.¹
¹ I find the stack and cabal documentation kind of obtuse and disorganised. Add to that the fact that the API seems to change rather often, and it becomes rather easy to see how one might overlook some feature that might solve your problem. Keep an eye open.

Will ghc-options of an executable override ghc-options of linked libraries?

I have a main Haskell executable program with a cabal file. There, I specify the ghc-options.
This executable links against other libraries out in the wilderness. Will the ghc-options of the cabal files of these libraries be ignored?
I'm basically wondering whether the executable's ghc-options will be used for the entire enchilada (main executable + libraries).
Additional bounty notes: Please also expand on chi's comment below, namely, what exactly is the difference between ghc-options for compiling vs. linking. Which are which, and which are never needed in libraries? Maybe you can talk about some of the most important ones, such as the -threaded mentioned below.
Under the normal cabal-install workflow (and the stack workflow built atop it), flags specified in your Cabal file are local to your package, and should not trigger rebuilds. Similarly, options specified with --ghc-options on the command line are local to your package.
To your specific question about -threaded: this flag has no effect on library code (as cabal-install will tell you), only on executables.
A brief listing of GHC flags is available here. In particular, note that -threaded is listed under Linking options, with a further link to Options affecting linking. From this information, we conclude that -threaded is only meaningful for executables because it signals to GHC that we wish to use the threaded runtime. If your package doesn't provide an executable, it has no need for any runtime, threaded or otherwise.
For a high-level explanation of compiling vs. linking: they are two of the steps between source code and executable. Compilation is the process of producing an object file from source code. Linking is the process of connecting the numerous object files that compose your executable. When you compile an executable, it has no idea that a function, say map, exists unless you defined it yourself, so it just compiles under the assumption that it does. The linking step is where all those names are made available and meaningful. In the case of -threaded, we are making the linking process aware of the threaded runtime, which all code calling into the runtime will use.
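In practice, this means -threaded belongs in the executable stanza of a .cabal file, never in a library stanza. A typical (illustrative) stanza might look like:

```
-- Illustrative executable stanza; names and flags are examples
executable my-app
  main-is:        Main.hs
  build-depends:  base
  -- Linking/runtime flags: only meaningful for executables
  ghc-options:    -threaded -rtsopts -with-rtsopts=-N
```

Here -rtsopts and -with-rtsopts=-N are commonly paired with -threaded so the program actually uses multiple capabilities at runtime.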
Since I don't know if you're using the standard cabal workflow, stack, or the new cabal.project workflow, here's a digression to discuss this behavior in the cabal.project case.
This is actually an open bug, right now.
The bug is tracked as issue 3883 on the Cabal GitHub (and somewhat in the related issue 4247).
Relevant to your question, according to current behavior, specifying flags in a ghc-options stanza in a cabal.project file causes those dependencies to be compiled (or recompiled, as the case may be) with those flags.
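For illustration, under the current behavior a cabal.project fragment like the following (package name chosen as an example) causes the named dependency itself to be compiled, or recompiled, with the given flags:

```
-- cabal.project (illustrative)
packages: .

package text
  ghc-options: -O2
```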

Is the stack.yaml file supposed to be checked into version control?

I'm quite new to stack and wondering whether to git commit or .gitignore that file.
What are the implications of either of these choices?
I'd say you should commit stack.yaml, as that makes it much easier to build your package in a reproducible way. That is particularly relevant if your repository is public, and if you use the more exotic kinds of extra-deps in stack.yaml (pointers to Git repositories, secondary cabal packages within your source tree, etc.).
A complementary observation is that we should still provide reasonable version bounds for dependencies in the .cabal file even if we are using stack, as doing otherwise would make life harder for people who don't use stack, or who use a set of packages different from the one specified by stack.yaml.
Yep. stack.yaml has a whole bunch of (not always necessary) fields such as the extra dependencies that matter for consistent builds. Check it in.
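As a concrete example, a committed stack.yaml like the following (all values illustrative) pins everything a fresh clone needs to reproduce the build: the snapshot, the local packages, and any out-of-snapshot dependencies:

```yaml
# Illustrative stack.yaml worth committing; resolver, package
# names, and the git URL/commit below are placeholders
resolver: lts-12.14
packages:
- .
extra-deps:
- acme-missiles-0.3
- git: https://github.com/example/some-lib.git
  commit: 0123456789abcdef0123456789abcdef01234567
```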
