Thinning and renaming packages for ghci and scripting

For scripting and playing around in ghci I want to import Data.Matrix from the package matrix.
My base environment already provides Data.Matrix, but from another package: matrices.
I've been able to work around this successfully with PackageImports:
For ghci I do:
$ stack exec --resolver lts-12.5 --package "matrix" -- ghci
Prelude> :set -XPackageImports
Prelude> import "matrix" Data.Matrix
and for scripts:
#!/usr/bin/env stack
-- stack --package matrix
{-# LANGUAGE PackageImports #-}
import "matrix" Data.Matrix
main = putStrLn $ Data.Matrix.prettyMatrix
                $ Data.Matrix.fromList 1 1 [1]
executed with stack ghc script.hs; ./script.hs
But the documentation says "You probably don’t need to use this ... See also 'Thinning and renaming modules' for an alternative way ..."
There, it is suggested to use e.g. -package "base (Data.Bool as Bool)" so I wanted to try that and figured for my case it would be either something like
-package "base (Data.Matrix as Mx)" to rename the existing which I want to ignore, or
-package "matrix (Data.Matrix as Mx)" to add a custom name for the module that I want.
But I can't even make the example work:
stack exec -package "base (Data.Bool as Bool)" -- ghci
Invalid option `-package'
Did you mean this?
--package
...
stack exec --package "base (Data.Bool as Bool)" -- ghci
The following errors occurred while parsing the build targets:
- Directory not found: (Data.Bool
- Directory not found: Bool)
stack exec -package base (Data.Bool as Bool) -- ghci
bash: syntax error near unexpected token `('
For scripting (I've tried each of the following stack lines separately):
#!/usr/bin/env stack
-- stack -package "base (Data.Bool as Bool)"
-- stack -package base (Data.Bool as Bool)
-- stack --package "base (Data.Bool as Bool)"
-- stack --package base (Data.Bool as Bool)
import Bool
main = putStrLn $ show True
doesn't compile (stack ghc script2.hs)
[1 of 1] Compiling Main ( script2.hs, script2.o )
script2.hs:4:1: error:
Could not find module ‘Bool’
Use -v to see a list of the files searched for.
|
4 | import Bool
| ^^^^^^^^^^^

Everything you find in the GHC manual concerns either options passed directly to the GHC compiler, or pragmas in the Haskell code itself. But you're using Stack for package management, which is an entirely different beast. (Sure enough, it invokes GHC, but it does way more than that.)
When using Stack, there's no reason to dabble with PackageImports (unless you really need to import two modules with the same name from different packages at the same time!). By default, Stack hides every package that isn't explicitly a dependency, so there's no need to rename anything; just use normal Stack options for specifying the packages, and plain imports in the actual Haskell:
#!/usr/bin/env stack
-- stack --resolver lts-12.5 runghc --package matrix
import Data.Matrix as M
main = putStrLn . prettyMatrix
     $ M.fromList 1 1 [1]
Make sure stack actually uses the resolver line, i.e. either
$ chmod +x matrixtest.hs
$ ./matrixtest.hs
or
$ stack matrixtest.hs
but not stack ghc matrixtest.hs or something like that.
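The same idea carries over to the interactive case from the question. A sketch of such a session, assuming the snapshot doesn't already expose a conflicting Data.Matrix (prompts and output are illustrative):
$ stack ghci --resolver lts-12.5 --package matrix
Prelude> import Data.Matrix
Prelude Data.Matrix> putStrLn (prettyMatrix (fromList 1 1 [1]))
( 1 )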

Related

Compiling a haskell script with external dependencies without cabal

I'm relatively new to Haskell and I realize I might be swimming against the stream here, but nonetheless, I'll ask:
Say I have a short Haskell script:
import Data.List.Split (splitOn)
main :: IO ()
main = do
  let orders = splitOn "x" "axbxc"
  putStrLn $ head orders
If I used only standard functions I could compile this with ghc <script.hs>. But because I depend on the split package for the splitOn function, the compilation fails.
Now, I have no difficulties setting up a cabal project with a project.cabal and a Setup.hs file in order to get this to actually compile. However, this feels like a lot of extra boilerplate for a standalone script.
So, is there a way to compile a single .hs file against some external package? Something similar to what in Python would be done by pip install something, "installing the package into the interpreter", i.e. is there a way to install extra packages "into ghc", so that I for instance only need to provide some extra linking flag to ghc?
The Cabal equivalent of the Stack script in bradrn's answer would be:
#!/usr/bin/env cabal
{- cabal:
build-depends: base
             , split
-}
import Data.List.Split (splitOn)
main :: IO ()
main = do
  let orders = splitOn "x" "axbxc"
  putStrLn $ head orders
The script can be run with cabal run, or directly by giving it execute permission. If need be, version bounds can be added as usual to the build-depends at the top of the script.
(Note this isn't literally a solution without Cabal, as doing this with GHC alone, even if it is possible, wouldn't be worth the trouble. In any case, it certainly avoids the boilerplate of needing multiple files.)
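For example, to pin split to its current major series, the metadata block might look like this (the exact bounds are illustrative; the rest of the script stays the same):
#!/usr/bin/env cabal
{- cabal:
build-depends: base >=4 && <5
             , split >=0.2 && <0.3
-}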
If you use Stack, the simplest way to do this is to write a ‘Stack script’, which is a Haskell file with a description of the required packages in the first line (really an invocation of stack specifying the appropriate command line arguments). An example (slightly modified from the docs):
$ cat turtle-example.hs
-- stack --resolver lts-6.25 script --package turtle
{-# LANGUAGE OverloadedStrings #-}
import Turtle
main = echo "Hello World!"
$ stack ./turtle-example.hs
Completed 5 action(s).
Hello World!
$ stack ./turtle-example.hs
Hello World!
This script uses the turtle package; when run, Stack downloads and builds this dependency, after which it is available in the script. (Note that the second time it is run, turtle has already been built so does not need to be rebuilt again.)
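Applied to the program from the question, such a script might look like this (the resolver is just an example; any snapshot that contains split should do):
#!/usr/bin/env stack
-- stack --resolver lts-16.27 script --package split
import Data.List.Split (splitOn)
main :: IO ()
main = do
  let orders = splitOn "x" "axbxc"
  putStrLn $ head orders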
As it happens, the --package option in Stack is not limited to scripts. It can be used with other Stack commands as well! For instance, to compile your program, you should be able to run stack ghc --resolver lts-16.27 --package split -- your-program-name.hs. And stack ghci --package split will give you a GHCi prompt where you can import Data.List.Split.
(Note: This answer focuses on Stack rather than Cabal, simply because I don’t know Cabal very well. However, I believe all this can be done using Cabal as well. For instance, I do know that Cabal has something very similar to the Stack scripts I mentioned above, though I can’t remember the syntax just at the moment.)
EDIT: See duplode’s answer for how to do this with Cabal.
You can install into the default environment for the current user by doing cabal install --lib split. The package should then be available to ghc and ghci without needing any special options.
More information is at the bottom of this section in the Cabal manual. The v2 commands that it uses are the default now so if you have a fairly new cabal you can just use install rather than v2-install.
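A minimal session along these lines, assuming a reasonably recent cabal (prompts and output are illustrative):
$ cabal update
$ cabal install --lib split
$ ghci
ghci> import Data.List.Split
ghci> splitOn "x" "axbxc"
["a","b","c"]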
I think this is the quintessential entry point to the package management battle in Haskell. It's not that there's not enough advice, but that there's so much of it, each piece with its own caveats and assumptions. Climbing that mountain for the sake of splitOn feels to the newbie like they're Doing It Wrong.
After spending far too much time trying each permutation, I've collated the fine answers here, and many, many others from elsewhere, put them to the test, and summarised the results. The full write up is here.
The pertinent summary of solutions is:
Install globally
You can still do global installs with cabal install --lib the-package.
Use Stack as a run command
You can use stack directly, eg: stack exec --package containers --package optparse-generic [...and so on] -- runghc hello.hs
Create a Stack project
The real deal. Run stack new my-project hraftery/minimal, put your code in Main.hs and your dependencies in my-project.cabal (and maybe stack.yaml - check the article), and then run stack build to automagically pull and build all dependencies. (A sketch of such a .cabal file follows at the end of this answer.)
Use a Stack script
Make your Haskell file itself an executable Stack script. Add -- stack --resolver lts-6.25 script --package the-package to the top of your .hs file and set its executable bit.
For my Edit/Test/Run workflow (eg. using VS Code, GHCi and GHC respectively), I found it pretty clear what works in practice. In summary:
Amongst a chorus of discouragement, Global Installs suit what I know of your use case just fine.
Where Global Installs don't make sense (eg. for managing dependency versions or being portable) a Stack project starting from my minimal template is a smooth transition to a more sophisticated and popular method.
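For reference, the dependency section of such a project's .cabal file might look roughly like this (names, versions and layout are illustrative; the hraftery/minimal template may differ in the details):
cabal-version: 2.4
name:          my-project
version:       0.1.0.0
executable my-project
  main-is:          Main.hs
  build-depends:    base >=4 && <5
                  , split
  default-language: Haskell2010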

Could not find module ‘Test.QuickCheck’ on Windows

My ghci version is 8.4.3
I tried
stack install QuickCheck
Something was installed. But when I enter import Test.QuickCheck, it still tells me Could not find module ‘Test.QuickCheck’. How can I fix it?
Firstly, stack install is not the way to make a library available: it only builds packages and copies executables, so it won't help here. Instead, there are a couple of things you can do to use the QuickCheck library:
If you want to use QuickCheck in a command such as stack ghci or stack ghc, you can add it as a --package option, e.g. to run a REPL to play around with QuickCheck you can use stack ghci --package QuickCheck and then write import Test.QuickCheck (a session along these lines is sketched at the end of this answer).
If you want to write a small one-file program using QuickCheck, then you can run stack ghc --package QuickCheck -- MyProgram.hs (using the --package option from the last bullet point). Alternately, you can use stack's scripting functionality and include a line such as this at the top of your program:
-- stack --resolver lts-12.18 script --package QuickCheck
If you want to use QuickCheck in a large project, then add it as a dependency in your my-program.cabal or package.yaml file.
The same guidance applies to any package you may want to use.
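The stack ghci session mentioned above might look like this in practice (output is illustrative):
$ stack ghci --package QuickCheck
Prelude> import Test.QuickCheck
Prelude Test.QuickCheck> quickCheck (\xs -> reverse (reverse xs) == (xs :: [Int]))
+++ OK, passed 100 tests.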
myos>cabal update
myos>cabal install --lib QuickCheck
myos>ghci
ghci> import Test.QuickCheck

Stack: Compile stand-alone source file

Once you've installed Stack, you can use it to install GHC for you. Great!
...now how do I compile a file with it?
To be clear: What you're supposed to do is write a package specification and have Stack build that. But surely there must be a way to trivially compile a tiny helper file inside that project? It seems silly to have to construct an entire project just so I can build a tiny helper program and have it easily runnable from the shell.
I know I can run any Haskell file using Stack. But I cannot for the life of me figure out how to compile the thing...
You can use stack ghc -- <file names> to compile and link a set of files (it's briefly mentioned in Stack's user guide):
You'll sometimes want to just compile (or run) a single Haskell source file, instead of creating an entire Cabal package for it. You can use stack exec ghc or stack exec runghc for that. As simple helpers, we also provide the stack ghc and stack runghc commands, for these common cases.
The -- is to ensure the arguments we pass are sent to ghc, rather than being parsed as arguments to stack exec. It's the same as when passing arguments to an executable you've made using the normal stack toolchain: stack exec myExe -foo passes -foo to stack exec, not to myExe, whereas stack exec myExe -- -foo behaves as desired.
For example:
Bar.hs
module Bar where
bar :: Int
bar = 5
Foo.hs
import Bar
main :: IO ()
main = print bar
Compilation (you don't even need to specify Bar.hs on the command line; it's found and compiled automatically):
> stack ghc -- Foo.hs
[1 of 2] Compiling Bar ( Bar.hs, Bar.o )
[2 of 2] Compiling Main ( Foo.hs, Foo.o )
Linking Foo ...
> ./Foo
5
There's no problem with dependencies either - it looks like all the packages installed locally are available to the build (you can use containers or QuickCheck without any additional build params).
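As a quick sketch of that last point (file and module names are just examples), a standalone file using containers compiles the same way, with no extra flags:
MapDemo.hs
import qualified Data.Map as Map
main :: IO ()
main = print (Map.fromList [(1 :: Int, "one"), (2, "two")])
> stack ghc -- MapDemo.hs
> ./MapDemo
fromList [(1,"one"),(2,"two")]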

Enable package-wide extensions in "stack runghc"

If I have the following in my package.yaml file:
default-extensions:
- LambdaCase
I am able to compile my project which makes use of the LambdaCase syntax like this:
myFunction = \case
  Nothing -> "empty"
  Just x  -> x
However, if the project is run with stack runghc, the LambdaCase extension is not respected.
My project has about 200 modules, so I would rather not have to add {-# LANGUAGE LambdaCase #-} to the top of every file.
Is there a way to enable a project-wide GHC extension with stack runghc analogously to the package-wide default-extensions property in package.yaml?
Yes, stack should probably have better support for this -
see https://github.com/commercialhaskell/stack/issues/3338 .
I'd say the summary is that stack runghc came before stack ghci, and it ended up having a much simpler meaning that does not take cabal files into account at all. It's not clear how to make things consistent and intuitive on the command line without changing the meaning of runghc.
In that issue, I describe a hacky workaround. Copying it here:
Here's a workaround for now. Put the following in ~/.local/bin/stack-run-ghc.sh and make it user executable:
#!/bin/sh
ghc $(echo "$*" | sed 's/--interactive//g')
This takes the arguments, removes --interactive, and calls ghc. With this, I can build stack using ghc via the following:
stack ghci --with-ghc stack-run-ghc.sh --ghci-options src/main/Main.hs
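As a lighter-weight alternative (my suggestion, not from the linked issue): if only one or two extensions are involved, you can pass them straight through to GHC on the command line, since runghc forwards unrecognised flags to GHC, e.g.:
stack runghc -- -XLambdaCase src/Main.hs
(The path to the main module here is hypothetical; adjust it to your project.)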

Force Haskell Stack build to use the Resolver Specified after Shebang

I'm trying to use haskell for scripting in the pattern specified here on the "Script interpreter" section:
https://haskell-lang.org/tutorial/stack-script
When I have comments like this in my script after the shebang:
#!/usr/bin/env stack
{- stack
--resolver nightly-2016-11-26
--install-ghc
runghc
--package http-conduit
-}
{-# LANGUAGE OverloadedStrings #-}
I know I can safely run:
$ stack myfile.hs
And it correctly picks up the resolver version I specified I need.
However, if I also try to build my script, to have the best of both worlds in scripting for development while I figure things out, and compilation once I'm more sure...
username#myhost:~/$ stack exec -- ghc myfile.hs && ./myfile
Run from outside a project, using implicit global project config
Using resolver: lts-7.10 from implicit global project's config file: /home/username/.stack/global-project/stack.yaml
That will error out because it's using resolver 'lts-7.10' from my global config instead of 'nightly-2016-11-26' even though my comments after the shebang specify a specific, non-global resolver to use.
While I know I could run ghc above and explicitly call --resolver on the command line, I would prefer if this were picked up from what's within the file itself. Is this possible?
