I would like to install some Haskell libraries globally, for example hindent which is used by my editor's Haskell integration. What is the recommended way to do this?
I thought that stack install hindent was the correct way to do this. However, when I later wanted to update my packages, I found there was no way to do so. According to a GitHub issue report I found,
stack is concerned with managing a local build sandbox for a project. It isn't intended to be a global package manager.
There appear to be workarounds such as maintaining a dummy project with artificial dependencies on the packages I would like installed. This sounds like a terrible hack, and I have been unable to find any official documentation on what approach should actually be taken.
Installing Haskell packages using my system package manager (Homebrew) is not an option since they are not packaged.
I would have opened an issue report against Stack; however, the contribution guidelines requested that I instead ask a question here under the haskell-stack tag.
Well, stack install in any project copies executables to ~/.local/bin, thereby making whatever executable you install globally accessible.
The global project is used when running stack outside of any project; its configuration lives in ~/.stack/global-project/stack.yaml.
If you want all of your globally accessible tools to have the same dependencies (perhaps to ensure that the GHC version matches, or something), then you could make a project intended to build all of these tools. It's up to you whether or not it is the "global project" -- there's not much special about it; it's just the default when you run stack outside of a project.
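For illustration, a minimal sketch of such a dedicated tools project (the directory name and resolver are assumptions, and the tools must exist in the chosen snapshot):

# ~/haskell-tools/stack.yaml -- hypothetical tools-only project
resolver: lts-9.5   # pick a snapshot whose GHC matches what your editor expects
packages: []        # no local packages; only tools from the snapshot are built here

Running stack install hindent from that directory builds hindent against that snapshot and copies the binary to ~/.local/bin.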
In order to record "what Haskell executables do I want installed globally", you might consider creating a shell script like
#!/bin/sh
stack install hindent
And then running this whenever you change the versions of the installed tools.
Also, for tools like intero that need to match the GHC version, you can do stack install --copy-compiler-tool intero, and it will be available on the PATH whenever stack is used with that GHC version.
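As a hedged sketch of that workflow (stack path --compiler-tools-bin exists in the Stack versions that support --copy-compiler-tool, as far as I know):

stack install --copy-compiler-tool intero   # installs into a GHC-version-specific tools directory
stack path --compiler-tools-bin             # prints where that directory is
stack exec -- intero --version              # stack exec puts the tools directory on PATH

Because the binary lives under a GHC-specific path, switching to a project with a different GHC just means running the first command again.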
Related
I'm new to Haskell stack and wondering how to find out the name of the package that contains a particular module.
Currently, I want to use Data.Tuple.Extra (fst3) (https://hackage.haskell.org/package/extra-1.7.9/docs/Data-Tuple-Extra.html) and want to know what I should write below:
$ stack install ????
I've already installed the tuple package, which, however, doesn't seem to include the Extra part.
All the Internet resources about installing a package that I've found so far say something along the lines of "To use Blahblah.Anything.Something, you need to install the foofoo package". How is one supposed to know? I searched Stackage, but it only shows the documentation of Data.Tuple.Extra, and I still can't find the name of the package.
Edit: As K.A.Buhr notes in their answer, stack install is the wrong command for the above case. Use stack build instead.
When browsing package documentation on Hackage, the top-left portion of the page header always gives the package name, version number, and description. On the page you link, that header shows the package is extra (version 1.7.9).
You can also use the "Contents" link in the top-right to go to the main page for the extra package, which gives its full list of modules, licensing, links to the package home page and bug tracker, and so on.
As a side note, stack install extra is technically the wrong command to "install" this package. If you want to make the extra package available for use within the Stack global project, the correct command is stack build extra. If you want to use extra within a stack project, then you want to add extra to the dependencies in your package's xxx.cabal or package.yaml file instead and run stack build (no arguments) to build and install it for use in your project.
In contrast, the stack install command is equivalent to stack build --copy-bins which copies any executables in the package to ~/.local/bin so they'll be in your path. See the Stack docs. It's intended to be used for installing programs written in Haskell that are distributed via Stack, so you can do stack install hlint to install the hlint linter, for example.
In this case, because the extra package has no executables, stack install extra and stack build extra will do the same thing, but it's better to get into the habit of using stack build when you aren't intending to install any package binaries, to avoid surprises.
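For concreteness, the package.yaml route mentioned above is a small edit (the base bound is just an illustration; your file will already have its own dependencies block):

dependencies:
- base >= 4.7 && < 5
- extra

After that, stack build (with no arguments) fetches and builds extra, and import Data.Tuple.Extra (fst3) will resolve inside the project.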
I was trying to add hdevtools to my stack project, so I ran stack build hdevtools. The install seemed to work successfully, and my text editor stopped reporting imported libraries installed via stack (like aeson and tasty) as missing.
However, things went wrong when I added this line to the dependencies section of my package.yaml file:
- hdevtools >= 0.1 && < 1
And then tried to run stack build again. I received the following error output:
Error: While constructing the build plan, the following exceptions were encountered:
In the dependencies for my-app-name-0.1.0.0:
hdevtools is a library dependency, but the package provides no library
needed since my-app-name is a build target.
Some different approaches to resolving this:
* Consider trying 'stack solver', which uses the cabal-install solver to attempt
to find some working build configuration. This can be convenient when dealing
with many complicated constraint errors, but results may be unpredictable.
Plan construction failed.
I tried running stack solver, but that threw the exception documented here.
How can I declare hdevtools as a dependency of my project?
Alexis King recommends using stack build --copy-compiler-tool hdevtools in this guide, in the section titled "Setting up editor integration".
This works for the current project, and other projects using the same GHC version, but you will need to run it again when you upgrade to a new GHC version.
More context from King's guide:
As mentioned above, stack install is not what you want. Tools like ghc-mod, hlint, hoogle, weeder, and intero work best when installed as part of the sandbox, not globally, since that ensures they will match the current GHC version your project is using.
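In concrete terms, a sketch of that workflow for hdevtools (the which check is only there to confirm the tool is visible; how your editor invokes it depends on your editor configuration):

stack build --copy-compiler-tool hdevtools   # built against the project's GHC, kept out of package.yaml
stack exec -- which hdevtools                # stack exec puts the compiler-tools directory on PATH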
How can I declare hdevtools as a dependency of my project?
hdevtools is an executable, and Cabal doesn't have a concept of development dependencies (as other package managers such as npm do). So all you can do is install hdevtools globally and make it work.
Is it possible to install a package from source with something similar to stack build package-name? (The latter works with packages on Stackage, but not with custom ones.)
Um, stack build (within the source directory)?
Stack doesn't really have a notion of installing libraries, though; it only installs executables. To "install" locally-sourced packages, you need to specify what you want them installed for: add them as dependencies of another project, via a location: entry in the packages: section of that project's stack.yaml file.
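A sketch of what that looks like with the older stack.yaml syntax being described here (the path is hypothetical):

packages:
- '.'                         # the depending project itself
- location: ../my-local-lib   # hypothetical path to the locally-sourced package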
That's arguably sensible since, one might say, there's nothing you can do with an installed library except invoke it from another Haskell project (or from a REPL, which you can get with stack ghci). I personally don't hold with that, though; I like actually being able to say "install that library now". That is one of the reasons I have always stuck with good old cabal-install rather than Stack. With it, you can just
cabal install
from within the source directory.
Cabal-install has often been criticised: its local installs can easily get out of sync, and then you have weird dependency conflicts and need to rebuild lots of stuff. I never found this to be much of a problem, and in any case it has been addressed in recent Cabal through Nix-style builds, which never produce conflicts.
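For reference, a hedged sketch of the Nix-style commands in a recent cabal-install (the v2- prefix works on cabal 2.4 and later; --lib needs cabal 3.0 or later, as far as I know):

cd path/to/the/package    # hypothetical checkout of the source
cabal v2-build            # builds into the shared ~/.cabal/store without conflicts
cabal v2-install --lib    # exposes the library to GHC via a package environment file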
user$ stack install dictionaries
Error: While constructing the build plan, the following exceptions were encountered:
In the dependencies for dictionaries-0.1.0.0:
binary-0.8.3.0 must match >=0.7.5 && <0.8 (latest applicable is 0.7.6.1)
time-1.6.0.1 must match >=1.5.0 && <1.6 (latest applicable is 1.5.0.1)
With the above command, I want to install the dictionaries package globally.
What are my options here?
I plan to stack unpack dictionaries and then relax the version bounds.
But then how do I install the modified 'local' package globally?
What's the best practice here?
Thanks
Easiest one: Add allow-newer: true to stack.yaml
This is likely the solution in your case. It lifts the upper version bounds that cause failures like yours, though of course the build may still fail for other reasons.
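It is a one-line change (in the project's stack.yaml, or in ~/.stack/global-project/stack.yaml for the global project):

allow-newer: true   # ignore the version bounds declared by dependencies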
Maybe possible one: Change resolver to latest nightly
This kind of failure sometimes occurs on a nightly snapshot and is usually fixed soon by a library update and a newer nightly snapshot. If you are using an old snapshot, switch to the latest one; simply waiting is also an option.
Most general one: Depend on a modified local package
You can do this by adding the package's path to packages: in stack.yaml. Stack will then use that copy instead of the snapshot version.
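For example (the path is a placeholder for wherever you put the unpacked, patched source):

packages:
- '.'
- deps/dictionaries   # local copy produced by stack unpack, with relaxed bounds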
The approach you describe is probably not a good idea. Even if it were possible, how would you handle projects using many different snapshot versions? Local dependencies should be specified per package.
With the above command, I want to install the dictionaries package globally.
(Preliminary note: by "globally", I will assume you mean globally for your user, as opposed to a system-wide installation.)
dictionaries is not in any Stackage snapshot. As far as I'm aware, that means you cannot install it globally, as for libraries that is only an option when the package is in a snapshot. Cf. Stack issue #2656 -- while the planned feature described there sounds like what you are trying to do, there is a caveat:
Should also warn when it isn't used with --copy-bins, and if there are targets that don't have executables, as these both indicate a misunderstanding about how it works.
That being so, my suggestion is to install the package per-project, using the packages field with an extra-dep key -- that is, the "most general" solution in jeiea's answer.
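Concretely, with the older stack.yaml syntax that field refers to, that is a sketch along these lines (the path is a placeholder):

packages:
- '.'
- location: vendor/dictionaries   # unpacked copy of the package, bounds adjusted if needed
  extra-dep: true                 # treated as a dependency rather than a build target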
Is there a way to view the documentation of any package I installed in a cabal sandbox? Currently I have a script that places the documentation in a common path so I can view it with a simple server script, but I imagine there's a better or more accepted way of doing this.
To be clear, I do not mean cabal-dev, but the sandboxing tools in the latest cabal.
If you install packages into the sandbox with documentation enabled (cabal install --enable-documentation, or documentation: True in ./cabal.config), the generated documentation will be put under ./.cabal-sandbox/share/doc/$arch-$os-$compiler/$pkgid. In other words, this works the same way as with the user package DB (one exception is that a local documentation index is not created in the sandbox case; this is a known issue).
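A sketch of that workflow, using the extra package purely as an example:

cabal sandbox init
echo "documentation: True" >> cabal.config   # equivalently, pass --enable-documentation below
cabal install extra
# docs now live under ./.cabal-sandbox/share/doc/<arch>-<os>-<compiler>/extra-<version>/
# open the html/index.html inside that directory, or serve it with your usual script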