Stack downloading package but still cannot use it - Haskell

These are my dependencies in package.yaml:
```
dependencies:
- weasel
- network
- HTTP
- bytestring
```
but I still get the error
```
Could not find module ‘Network.HTTP.Base’
Use -v to see a list of the files searched for.
```
when importing
```
import Network.HTTP.Base
```

First, ensure the package.yaml buffer is saved to disk.
Second, assuming you are using stack, ensure that you are not seeing the following warning:
```
Warning: /Users/dan/scratch/foo/foo.cabal was modified manually. Ignoring
/Users/dan/scratch/foo/package.yaml in favor of the cabal file. If you
want to use the package.yaml file instead of the cabal file, then
please delete the cabal file.
```
If you are seeing this warning, check your foo.cabal file for anything you may want to keep, port it over to your package.yaml file, and then delete the foo.cabal file so that hpack can generate a fresh one.
Third, ensure that the dependencies section pertains to the particular target you are currently trying to build. For example, if the dependencies are listed for the library but not for the test suite, then the corresponding modules will not be available to the test suite. If the dependencies are specified at the top level, then they should indeed be available to all build targets; the sketch below shows the difference.
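A minimal package.yaml sketch of top-level versus per-target dependencies (the test-suite name and the hspec dependency are hypothetical; only HTTP comes from the question):
```
# package.yaml (illustrative)
dependencies:       # top level: available to every target
- base
- HTTP              # provides Network.HTTP.Base

library:
  source-dirs: src

tests:
  spec:             # hypothetical test-suite name
    main: Spec.hs
    source-dirs: test
    dependencies:   # per-target: only the test suite sees these
    - hspec         # hypothetical extra dependency
```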

Related

Problem with loading module ‘Distribution.Simple’

I was trying to build a new project in Haskell (GHC version 8.10.4, stack version 2.7.3, cabal version 3.6.2.0) using stack, but when running the command stack setup I (surprisingly) got the following error:
```
Setup.hs:2:1: error:
    Could not load module ‘Distribution.Simple’
    It is a member of the hidden package ‘Cabal-3.2.1.0’.
    You can run ‘:set -package Cabal’ to expose it.
    (Note: this unloads all the modules in the current scope.)
    Use -v (or `:set -v` in ghci) to see a list of the files searched for.
  |
2 | import Distribution.Simple
  | ^^^^^^^^^^^^^^^^^^^^^^^^^^
```
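For context, the Setup.hs being compiled here is typically just the standard Cabal boilerplate (a sketch; the exact file generated by a template may differ slightly):
```haskell
import Distribution.Simple

main :: IO ()
main = defaultMain
```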
Also, it is worth noting that I looked for different solutions on the Internet, but didn't find one that worked for me. The general idea was to explicitly add Cabal to the package.yaml file, like this:
```
library:
  source-dirs: src
  dependencies:
  - Cabal
```
But that didn't fix my situation. I was able to build the project without that step (I just skipped that part), but I was still interested in solving the issue.
Moreover, when I tried to build the Haskell package timeit (as an example, from the command line), the same error appeared when executing
```
runhaskell Setup.hs configure
```
Interestingly, I didn't have this problem before (it probably appeared after I updated GHC).
Does anybody know how to deal with this issue? Would reinstalling GHC, cabal, or stack help? I would be glad to see any comments and solutions.
Instead of deleting the
```
~/.ghc/*/environments/default
```
file, just add the Cabal package to it, including the version number, which you can find by running:
```
ghc-pkg list
```
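For illustration, the environment file then looks something like this (a sketch, assuming GHC's environment-file syntax with -- comments; the platform directory and the exact package ids vary by machine and must be taken from ghc-pkg list):
```
-- sketch of ~/.ghc/<platform-ghc-version>/environments/default
clear-package-db
global-package-db
package-id base-4.14.1.0
-- the line being added, with the version reported by `ghc-pkg list`:
package-id Cabal-3.2.1.0
```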
I had the same issue.
Deleting the files
```
~/.ghc/*/environments/default
```
(or the whole .ghc folder) solved the stack setup problem.
It seems that they were somehow interfering with stack's operations.

What does 'no specified version' mean in my Cabal build?

The recent Travis CI build of the development version of my Haskell package reports the error
```
MissingH must match >=1.3.0.1, but the stack configuration has no specified version (latest matching version is 1.4.0.1)
```
when building for GHC 8.6.1, even though I have
```
MissingH >=1.3.0.1
```
in my build-depends.
I don't understand this error: it seems contradictory. I have no upper limit on MissingH, so why is it erroring and not using the latest?
You need to add MissingH to your stack.yaml:
```
extra-deps:
- 'MissingH-1.4.0.1'
```
Your package's *.cabal file says which versions of dependencies are compatible with your package. It is a loose specification; not all combinations may actually work (because they may have conflicting bounds on transitive dependencies, or there is some unforeseen breakage with a particular version you haven't tested with).
In contrast, stack.yaml describes a particular snapshot of packages, pinned to specific versions. This precisely says "my package is known to work with those versions". Of course, it is tedious to maintain the version of every dependency, and for that the Stackage team maintains a "resolver", a curated set of package versions known to work together, that you can use to specify the version of many packages at once, by setting the resolver: field of stack.yaml appropriately. A resolver only lists a subset of packages on Hackage, so when one of your dependencies is not in there, you need to add it to your stack.yaml as an extra-dep.
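Concretely, a stack.yaml that pins an extra dependency on top of a snapshot looks like this (the snapshot name is illustrative):
```
resolver: nightly-2018-10-01   # illustrative snapshot name
packages:
- .
extra-deps:                    # versions the snapshot doesn't pin for you
- MissingH-1.4.0.1
```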
Update: following the discussion, here are some more details about configuring Travis.
First, my current preferred solution for CI of Haskell projects is to not bother with stack and instead use https://github.com/haskell-CI/haskell-ci, which generates a Travis script using cabal-install.
Now for a less radical solution.
Currently the Travis script only varies the --resolver option, but as far as I can tell there is no command-line option to add an extra-dep. It seems stack.yaml files are the only way to do that. Furthermore, we only want to specify MissingH as an extra-dep for the latest nightlies, because LTSes already include it.
Thus I suggest the following:
Create a separate stack.yaml for the nightly resolver only. Call it something else, since you already have one, for example stack-nightly.yaml:
```
packages:
- .
extra-deps:
- 'MissingH-1.4.0.1'
```
Set an environment variable to point to stack-nightly.yaml when the resolver is a nightly, maybe:
env:
...
- $RESOLVER=nightly STACK_YAML=stack-nightly.yaml
# Not sure of the syntax.
Otherwise you can use the --stack-yaml command-line option.
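For example, pointing a one-off build at the alternate file (the file name comes from the suggestion above):
```
stack --stack-yaml stack-nightly.yaml build
```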

Create hackage package that can be installed with stack

When running stack sdist in my project directory, the stack.yaml file isn't included in the tarball (this seems to be expected).
Consequently, when I upload the tarball to Hackage and then stack install mypackage, it complains about missing dependencies (extra-deps) which I specified in the stack.yaml file:
```
$ stack install pandoc-placetable
Run from outside a project, using implicit global project config
Using resolver: lts-5.17 from implicit global project's config file: ~/.stack/global-project/stack.yaml
While constructing the BuildPlan the following exceptions were encountered:
-- Failure when adding dependencies:
      spreadsheet: needed (>=0.1.3 && <0.1.4), not present in build plan (latest applicable is 0.1.3.4)
    needed for package: pandoc-placetable-0.4
-- While attempting to add dependency,
    Could not find package spreadsheet in known packages
Recommended action: try adding the following to your extra-deps in /Users/maurobieg/.stack/global-project/stack.yaml
- spreadsheet-0.1.3.4
```
Or, what's the recommended way to make a Hackage package stack-installable if it has further Hackage dependencies?
Update: I just added extra-source-files: stack.yaml to the cabal file, and the stack.yaml is indeed included in the tarball of the newly published version. Nevertheless, stack install pandoc-placetable-0.4.1 still fails with the same error.
I could also just tell people who don't want to install cabal-install on their system to clone from GitHub and then build with stack. Is that the recommended approach for tiny packages? Or should I ask them to add the dependency of pandoc-placetable (i.e. spreadsheet) to their global stack.yaml? That smells like polluting a global file...
As mentioned by @mgsloan in the comments above, there's an open stack issue about using the stack.yaml from a Hackage package.
I guess until it's fixed I'll just tell people to clone from GitHub (or, as mentioned by @MichaelSnoyman, to stack unpack) and then cd into the newly created directory and stack install there, as sketched below.
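That workflow looks like this (the unpacked directory name depends on the released version; stack install will use the package's own stack.yaml if one is present):
```
$ stack unpack pandoc-placetable   # fetch and extract the sdist from Hackage
$ cd pandoc-placetable-0.4         # directory name includes the version
$ stack install                    # build and install from the unpacked source
```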

How to get stack to save dependencies?

I'm using the stack install command to save dependencies for a new project. How do I get it to save those dependencies into stack.yaml? Unless I'm missing something, I can't see where stack records the project dependencies, and I can't find anything in the docs about this.
You still keep your dependencies in a .cabal file. From the Stack FAQ:
A .cabal file is provided for each package, and defines all package-level metadata just like it does in the cabal-install world: modules, executables, test suites, etc. No change at all on this front.
A stack.yaml file references 1 or more packages, and provides information on where dependencies come from.
If you need additional versions of dependencies than the LTS Haskell snapshot you're using, you'll add them to the extra-deps portion of the stack.yaml file.
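As a sketch (the package and snapshot names are illustrative): the .cabal file declares what you depend on, and stack.yaml pins any versions the snapshot doesn't cover:
```
-- my-project.cabal (excerpt)
build-depends: base, some-rare-package
```
```
# stack.yaml
resolver: lts-12.14           # illustrative snapshot
packages:
- .
extra-deps:
- some-rare-package-0.1.0.0   # not in the snapshot, so pinned here
```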

Cabal - Expose all modules while building library

Is it possible to tell Cabal to expose all modules while building a library?
Right now I have to provide a very long list of modules in the exposed-modules section of the cabal configuration file.
The modern answer is stack + hpack instead of an explicit cabal config. hpack can automatically expose package modules and provides many other enhancements; see the sketch below.
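A minimal package.yaml sketch (the package name is hypothetical): hpack generates the .cabal file and lists every module it finds under source-dirs as exposed:
```
# package.yaml (illustrative)
name: my-library
version: 0.1.0.0
dependencies:
- base

library:
  source-dirs: src
  # no exposed-modules list needed: hpack finds all modules under src/
  # add `other-modules:` here to keep specific modules hidden
```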
You have to list all modules in the cabal configuration file. In your case, you just put the list of modules after exposed-modules:. There is no simpler way to write the list of modules.
Cabal cannot automatically find the files that are part of an executable or library, so it relies on the list of modules in the configuration file. Unlike GHC, cabal cannot find modules based on import statements in the source code. If you don't list every module, you may still be able to build the project (because GHC can find the source files), but other commands such as cabal sdist will not include the source files that aren't listed.
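For reference, the explicit list in the .cabal file looks like this (the module names are hypothetical):
```
library
  hs-source-dirs:  src
  exposed-modules: Data.Foo
                   Data.Foo.Internal
                   Data.Bar
  build-depends:   base
```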
