Given the following directory structure:
root
├── scripts
│   └── script1.hs
└── source
    ├── librarymodule.hs
    └── libraryconfig.txt
Where "librarymodule.hs" would be a library exporting multiple functions, where the output is influenced by the contents of the libraryconfig.txt file in his directory.
script1.hs is the file needing to use the functions declared in librarymodule.hs.
I can't find a solution on the internet for a structure as given above and hoped someone could help out.
GHC has a -i option. Under root/scripts/, this will add root/source/ to the search path:
ghc -i../source script1.hs
Also consider packaging your library using cabal so you can install it and use it anywhere without worrying about paths.
Here is a minimal example of a library with data-files:
source/
├── mylibrary.cabal
├── LibraryModule.hs
└── libraryconfig.txt
mylibrary.cabal
name:                mylibrary
version:             0.0.1
build-type:          Simple
cabal-version:       >= 1.10
data-files:          libraryconfig.txt

library
  exposed-modules:   LibraryModule
  other-modules:     Paths_mylibrary
  build-depends:     base
  default-language:  Haskell2010
LibraryModule.hs
module LibraryModule where

import Paths_mylibrary -- This module will be generated by cabal

-- Some function that uses the data-file
printConfig :: IO ()
printConfig = do
  n <- getDataFileName "libraryconfig.txt"
  -- Paths_mylibrary.getDataFileName resolves paths for files associated with mylibrary
  c <- readFile n
  print c
See this link for information about the Paths_* module: https://www.haskell.org/cabal/users-guide/developing-packages.html#accessing-data-files-from-package-code
Now running cabal install should install mylibrary.
Then, under scripts/, you can just run ghc script1.hs, and it will pick up the library you installed.
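For illustration, scripts/script1.hs could then look something like this (a minimal sketch that only assumes the printConfig function exported above):

-- scripts/script1.hs
module Main where

import LibraryModule (printConfig)

main :: IO ()
main = printConfig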
Related
So I have a Haskell project (managed using stack) structured like this:
.
├── Main.hs
├── Other1.hs
├── subfolder
│   └── Other2.hs
where the Main module imports both Other1 and Other2, as simply as
import Other1
import Other2
My .cabal file says:
name: (...)
executable Main
  hs-source-dirs:
      .,
      subfolder
  main-is: Main.hs
  other-modules:
      Other1
      Other2
Now, if I run stack build everything works great, all modules are compiled and it looks like nothing can go wrong. But then if I try to execute my program with stack runghc Main, module Other2 (the one in the subfolder) is not found.
Why is that the case? How can I execute my code?
Full reproducible project here: https://github.com/chrissound/215
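One thing worth trying here (a sketch only, not verified against the linked project): stack runghc simply invokes runghc, which knows nothing about the .cabal file's hs-source-dirs, so the extra source directory can be passed to GHC explicitly:

$ stack runghc -- -isubfolder Main.hs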
I have the following simple cabal file, which defines:

- a library (source under src-lib)
- an executable (source under src) in the same project (which depends on the above local library)
cabal-version: 1.12
name:          HaskellNixCabalStarter
version:       0.1.0.0
author:        HaskellNixCabalStarter
maintainer:    HaskellNixCabalStarter
license:       MIT
build-type:    Simple

library
  exposed-modules:
      Hello
  other-modules:
      Paths_HaskellNixCabalStarter
  hs-source-dirs:
      src-lib
  build-depends:
      base >=4.12 && <4.13
  default-language: Haskell2010

executable app
  main-is: Main.hs
  other-modules:
      Paths_HaskellNixCabalStarter
  hs-source-dirs:
      src
  build-depends:
      HaskellNixCabalStarter
    , base >=4.12 && <4.13
  default-language: Haskell2010
I can open a GHCi repl with:
cabal v2-repl app
However, upon GHCi reloading (:r), it will only reload changes in the app executable, and disregard any changes in the library.
This seems like very limiting, if not incorrect, behavior. How can I fix or work around this?
There is a workaround: you either

- run cabal repl and then :load src/Main.hs, or
- with cabal repl app, :load src/Main.hs src-lib/Hello.hs.

Now :reload also reloads changes from dependencies.
In the first case it's the :load that somehow also starts loading/following the dependencies. (Not sure why cabal repl app isn't doing exactly the same.)
In the second case you need to explicitly name the modules you want to follow. Also, the module whose namespace you want to be in has to come first, so :load src/Main.hs ..others...
See this on reddit. It appears that cabal can only have one "unit" loaded, but loading other sources with :load seems to subvert that.
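A minimal session for the second variant could look like this (paths as in the project above):

$ cabal v2-repl app
ghci> :load src/Main.hs src-lib/Hello.hs
ghci> :reload

After editing either src/Main.hs or src-lib/Hello.hs, :reload picks the change up.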
I don't think it can be done (yet?). Evidence:
jeff@jbb-dell:cabal-experim$ tree
.
├── cabal.project
├── P1
│   ├── app
│   │   ├── Lib.hs
│   │   └── Main.hs
│   └── P1.cabal
└── P2
    ├── P2.cabal
    └── src
        └── MyLib.hs

jeff@jbb-dell:cabal-experim$ cabal repl P1 P2
cabal: Cannot open a repl for multiple components at once. The targets 'P1'
and 'P2' refer to different components.
The reason for this limitation is that current versions of ghci do not support
loading multiple components as source. Load just one component and when you
make changes to a dependent component then quit and reload.
I was following the stack guide and I got a new project setup (yay!).
It generated the following file layout:
.
├── app
│   └── Main.hs
├── .gitignore
├── LICENSE
├── helloworld.cabal
├── Setup.hs
├── src
│   └── Lib.hs
├── stack.yaml
└── test
    └── Spec.hs
According to the "Files in helloworld" section of the guide:
The app/Main.hs, src/Lib.hs, and test/Spec.hs files are all Haskell source files that compose the actual functionality of our project (we won't dwell on them here).
I really wish they had dwelled on that for a second, because I have no idea what the distinction between app/Main.hs and src/Lib.hs should be. Which code should I put where?
In what ways am I supposed to divide code between app/, src/, app/Main.hs and src/Lib.hs?
If I'm just writing an application or just writing a library, do I need both files/directories?
This separation of modules into folders can be done any way you want. The naive idea is that you put almost all logic into the Lib folder. Main.hs then just

- imports the required parts from Lib,
- reads command-line arguments, and
- runs stuff,

as sketched below.
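A minimal sketch of that split (the greet function is made up purely for illustration):

-- src/Lib.hs
module Lib (greet) where

-- All of the real logic lives in the library.
greet :: String -> String
greet name = "Hello, " ++ name ++ "!"

-- app/Main.hs
module Main where

import Lib (greet)
import System.Environment (getArgs)

-- The executable only reads arguments and calls into the library.
main :: IO ()
main = do
  args <- getArgs
  putStrLn (greet (unwords args))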
You can rename app to executables and change the corresponding lines in the .cabal file (an example follows). Actually, you can come up with an arbitrary file hierarchy.
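For example, after such a rename the executable stanza could point at the new directory like this (helloworld-exe and the helloworld library dependency follow the stack template's naming; the stanza is abridged):

executable helloworld-exe
  hs-source-dirs:      executables
  main-is:             Main.hs
  build-depends:       base
                     , helloworld
  default-language:    Haskell2010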
In our company project, we use a different but also very popular approach, and our file hierarchy looks like this:
.
├── bench
├── src
│   ├── exec1
│   │   └── Main.hs
│   ├── exec2
│   │   └── Main.hs
│   └── SuperCoolLibrary
│       ├── LibModule1.hs
│       └── LibModule2.hs
├── test
└── Setup.hs
Other stack.yaml, .cabal, etc. files are not shown here.
Actually, if you are writing an application, you can just create one Main.hs file and put all logic inside the main function. You won't believe it but as a Haskell lecturer I saw such code from my students :(
Though I don't suggest you write code that way.
If you are writing a library then you don't need Main.hs files and the main function at all. You can look at a simple example like this library (it allows you to automatically generate command-line options from data types): optparse-generic
I hope this helps clear up your confusion.
The main reason it's typically set up like this even for an application is for writing tests. Say you create a default stack project called foo, the test suite foo-test will depend on the foo library, as will the foo-exe. If you were to put all your functions into app/Main.hs, then those functions cannot be tested from the foo-test test suite.
If you're just playing around and don't care about having a test suite, you could base your stack project on the simple template:
$ stack new foo simple
If you'd like to set up testing, I like tasty. You'd modify your .cabal file something like this:
test-suite foo-test
  type:                exitcode-stdio-1.0
  hs-source-dirs:      test
  main-is:             Spec.hs
  build-depends:       base
                     , foo
                     , tasty
                     , tasty-hunit
                     , tasty-quickcheck
  ghc-options:         -threaded -rtsopts -with-rtsopts=-N
  default-language:    Haskell2010
Then take a look at the example.
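For orientation, test/Spec.hs could start out roughly like this (a sketch; the test case is a placeholder rather than anything from the foo library):

-- test/Spec.hs
import Test.Tasty
import Test.Tasty.HUnit

main :: IO ()
main = defaultMain tests

tests :: TestTree
tests = testGroup "foo"
  [ testCase "arithmetic sanity check" ((2 + 2 :: Int) @?= 4)
  ]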
I have two projects in my user directory ~: project A and project B.
I run stack init and later stack build on project A. Then I have the binaries of the A package in the folder ~/.stack-work/install/x86_64-linux/lts-6.0/7.10.3/bin. The issue is that B needs this version of the A package, so I tried the same build with stack in the B project directory. In ~/B I ran the following command without success:
stack build ~/.stack-work/install/x86_64-linux/lts-6.0/7.10.3/bin
How can I do that? What if I create a third package C, and need something similar?
Excerpts:
The A.cabal content:

name: A
version: 1.1

And the B.cabal content:

name: B
version: 1.0
build-depends: A >= 1.1
Then,
$ stack init
Looking for .cabal or package.yaml files to use to init the project.
Using cabal packages:
- B.cabal

Selecting the best among 8 snapshots...

* Partially matches lts-6.0
    A version 1.0 found
        - A requires ==1.1

This may be resolved by:
    - Using '--omit-packages' to exclude mismatching package(s).
    - Using '--resolver' to specify a matching snapshot/resolver
But I actually have version 1.1 of A built.
You don't need to include the project A's bin directory - that was a red herring.
Organize your files like this:
.
├── stack.yaml
├── project-A
│   ├── LICENSE.txt
│   ├── Setup.hs
│   ├── project-A.cabal
│   └── src
│       └── ...
│
└── project-B
    ├── Setup.hs
    ├── project-B.cabal
    └── src
        └── ...
Your top-level stack.yaml file will look like:
resolver: lts-5.13
packages:
- project-A/
- project-B/
Then in the top-level directory run stack build.
I'll take a stab at answering your question...
How about putting
~/.stack-work/install/x86_64-linux/lts-6.0/7.10.3/bin
in your PATH? If the other project really needs binaries (i.e. programs) built by another project, this would be the way to do it.
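For example (path copied from the question; adjust for your own resolver and GHC version):

export PATH="$HOME/.stack-work/install/x86_64-linux/lts-6.0/7.10.3/bin:$PATH"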
Or, copy the built programs to some directory in your current PATH, e.g. /usr/local/bin or ~/bin.
If this doesn't answer your question, please post the cabal files for both projects.
I found an answer after digging into the stack FAQ. Create a stack.yaml file in the B folder. At first the content could be:
resolver: lts-6.0
packages:
- '.'
- '/home/jonaprieto/A'
extra-deps: []
Then run:
$ stack build
Short:
I have a cabal project that depends on a library built using cabal sandbox add-source. This library exports a preprocessor function BuildInfo -> LocalBuildInfo -> PreProcessor. When I try to use that preprocessor in the Setup.hs for my main cabal project, I get the error:
Couldn't match type ‘Cabal-1.18.1.3:Distribution.PackageDescription.BuildInfo’
               with ‘BuildInfo’
NB: ‘Cabal-1.18.1.3:Distribution.PackageDescription.BuildInfo’
      is defined in ‘Distribution.PackageDescription’
      in package ‘Cabal-1.18.1.3’
    ‘BuildInfo’
      is defined in ‘Distribution.PackageDescription’
      in package ‘Cabal-1.20.0.1’
Less Short:
I currently have the following directory structure in my cabal sandbox:
├── main.cabal
├── Setup.hs
├── Main.hs
├── lib
│   ├── myPP.cabal
│   ├── MyPP.hs
│   └── myPP
│       ├── MyPP.hs
│       ├── myPP.cabal
My main package has myPP as a build-depends. myPP is a library with MyPP as the exposed module and Cabal and ghc as dependencies. The MyPP module exports a function
myPP :: BuildInfo -> LocalBuildInfo -> PreProcessor
I then run
cabal sandbox add-source ./lib/*
cabal install myPP
In my Setup.hs I have:
module Main (main) where

import Distribution.Simple
import Distribution.Simple.PreProcess
import Distribution.Simple.Utils
import Distribution.PackageDescription
import Distribution.Simple.LocalBuildInfo
import MyPP (myPP)

main :: IO ()
main = let hooks = simpleUserHooks
           pp    = ("pp", myPP)
       in  defaultMainWithHooks hooks { hookedPreProcessors = pp : knownSuffixHandlers }
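As an aside, for cabal to compile and use a custom Setup.hs like this at all, main.cabal is assumed to declare:

build-type: Custom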
I then attempt to actually cabal build only to get the error:
Couldn't match type ‘Cabal-1.18.1.3:Distribution.PackageDescription.BuildInfo’
               with ‘BuildInfo’
NB: ‘Cabal-1.18.1.3:Distribution.PackageDescription.BuildInfo’
      is defined in ‘Distribution.PackageDescription’
      in package ‘Cabal-1.18.1.3’
    ‘BuildInfo’
      is defined in ‘Distribution.PackageDescription’
      in package ‘Cabal-1.20.0.1’
Attempt to diagnose the error:
My understanding is that when you have a package that depends on both ghc and Cabal, you get the version of Cabal that ghc was built against. As a result, the myPP package is built with that older version of Cabal. When the custom Setup.hs is run by cabal, it imports its own version of Cabal (the one I have installed), so the two BuildInfo types come from different Cabal versions. This then causes the error. However, I am hoping that there is a better solution than just downgrading my version of Cabal....
You won't like the other solution: build ghc from the current HEAD in order to get a newer Cabal (now at 1.21.1.0). Unfortunately, there's just no way to make two different versions of a package coexist in a single build. Downgrading Cabal is probably the least-painful option.