Flexibility of the hierarchy of module sources allowed in cabal project - haskell

I have a project with source tree:
src/
src/A/
src/A/A.hs
src/B/
src/B/C/
src/B/C/C.hs
...
The two Haskell files divide the source code into modules:
-- File A/A.hs
module A where
...
and
-- File B/C/C.hs
module B.C where
...
The cabal file contains:
other-modules: A, B.C, ...
hs-source-dirs: src/, src/A/, src/B/, src/B/C/, ...
But while the module A can be easily found, cabal complains about B.C:
cabal: can't find source for B/C in ...
I see no rational explanation for why placing a file that defines module A under A/A.hs is OK, but placing B.C under B/C/C.hs isn't. Is there a workaround other than placing C.hs directly under B/ (I would like to maintain some separation of sources)?

The reason for the error is that module B.C should be defined in the file B/C.hs, not B/C/C.hs (the latter would be module B.C.C). This error would have appeared even if you had only one source dir with one source file; it is not caused by the extra entries you added.
Also, a dir that appears in the hs-source-dirs directive should be the root of a module tree, so it is doubtful that you need all of the entries you listed. For instance, src/B/C would be treated as another root, meaning you could define top-level modules in that dir (if you are actually doing that, I would consider it a mistake).
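Concretely, with a single source root, a layout that matches the modules in the question would look like this (a sketch):

src/
  A.hs      -- module A
  B/
    C.hs    -- module B.C

together with hs-source-dirs: src and other-modules: A, B.C in the cabal file.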
What you probably want to do is define multiple top-level source dirs, like this:
A_src/A.hs
B_src/B/C.hs
hs-source-dirs: A_src, B_src
Even better, I would suggest you use stack, which allows you to split a project into separate packages, each with its own source dir (conventionally called src) and its own .cabal file, allowing for richer dependencies between the parts.
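For the stack route, a minimal multi-package stack.yaml might look like this sketch (the resolver and the package directory names are assumptions, not from the question):

# stack.yaml (sketch)
resolver: lts-21.25
packages:
- pkg-a
- pkg-b

Each listed directory would then contain its own src/ tree and its own .cabal file.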

Related

Typechecking multiple 'Main's

I have a Haskell library with several executables (tests, benchmarks, etc), in total about six. When I do some refactoring in the library, I usually need to make some small change to each of the executables.
In my current workflow, I separately compile each executable (say, with GHCi) and fix each one up. This is tedious because I have to type out the path to each executable, and moreover have to reload all of the (very large) library, which even with GHCi takes some time.
My first thought to solve this issue was to create a single dummy module that imports the executable "Main" modules. However, this (of course) requires that the "Main" modules have proper module names, like module Executable1 where .... But then cabal complains when compiling the executable that it can't find a module called "Main" (despite main-is being listed explicitly in the cabal file for each executable).
I also tried ghci Exec1.hs Exec2.hs ..., but it complains module ‘main#main:Main’ is defined in multiple files.
Is there an easy way to load multiple "Main" modules at once with GHCi so I can typecheck them simultaneously?
Cabal’s main-is option only tells Cabal what filename it should pass to GHC. Cabal does not care about its module name.
GHC itself has a flag, also called -main-is (documented in the GHC user's guide), which tells the compiler which module contains the main function.
So this works:
executable foo
  main-is: Foo.hs
  ghc-options: -main-is Foo
Of course, Foo.hs should start with module Foo where… and export main. As usual, the module name and the file name need to match.
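A matching source file might begin like this (a minimal sketch):

-- Foo.hs
module Foo where

main :: IO ()
main = putStrLn "running foo"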
This way, all executables can have different module names, and you can load them all in GHCi.
If you also want to change the name of the main function, write ghc-options: -main-is Foo.fooMain. I would guess you could even have all executables share the same module but use different main functions this way.

Stack GHCI doesn't have modules imported through Stack

I created a new project using the stack new Proj-Name simple command, and in a file I created within the src/ directory I imported a module from outside of GHC's base library: import System.Process. Running stack build was successful, and the file worked as I expected it to. However, when I ran stack ghci within the same directory, it did not have System.Process listed as an importable module.
How do I make it so that all my imported modules are importable within stack ghci?
While the information in Nicholas Montaño's answer is correct, I believe it doesn't reach the root of the issue. The likely cause of the problem is that the newly created source file wasn't declared in the cabal file, leading stack to ignore it. Every source file in a project must be declared in a section of the cabal file, be it exposed-modules (for parts of libraries which you want to expose to the users of your code) or other-modules (for internal modules which you do not want to expose).
When you run stack new ..., even with the simple template, you'll notice that several files are created which allow for stack to work. One of these is a Proj_Name.cabal file, and if you open it, you'll notice that under the executable Proj_Name section of the file there's a main-is: Main.hs line.
The default main-is file will be Main.hs, but it may be anything. Your imports should go in whatever file you want to act as your main file. So in this case, you can simply put the name of the file you created (the one with the System.Process import) in place of Main.hs on that line.
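For example, the stanza might end up looking like this sketch (MyTool is a stand-in name for the file you created, not from the question):

executable Proj-Name
  hs-source-dirs: src
  main-is:        MyTool.hs
  -- alternatively, keep main-is: Main.hs and declare the new module instead:
  -- other-modules:  MyTool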
Following this, run stack build, add whatever dependencies it tells you to under the build-depends: base >= 4.7 && < 5 line in Proj_Name.cabal, which in this case will look like:
build-depends: base >= 4.7 && < 5
, process
Run stack build again (if there are any further issues you might want to consult the stack guide), and now stack ghci should have all the modules that you imported in that file available to it.
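For reference, the source file itself might look like this sketch (the use of readProcess is an assumption; any System.Process import behaves the same way once the process dependency is declared):

-- src/Main.hs (sketch)
module Main where

import System.Process (readProcess)

main :: IO ()
main = do
  -- run `echo hello` and capture its stdout
  out <- readProcess "echo" ["hello"] ""
  putStrLn out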

Load a module in GHCi by module name when module name doesn't match file name

Suppose I am given a source file called MyModule.hs and inside it the module declaration is module My.Module where ... (note: not module MyModule where ...).
I am not permitted to alter this source file or change the directory structure where the file resides.
From reading some docs about importing modules in GHCi, it looks like there are ways to import by file name (e.g. either import or :load), but not any ways to specify a module name that will be searched for in all files of the local directory.
Is there a way to import My.Module in GHCi without doing it by specifying the file's name (only the module name) and without installing it (e.g. not building it with cabal, just quickly tossing it into GHCi by module name)?
You can't when the module name contains a dot, as per the documentation:
For each of these directories, it tries appending basename.extension to the directory, and checks whether the file exists. The value of basename is the module name with dots replaced by the directory separator ('/' or '\', depending on the system), and extension is a source extension (hs, lhs)...
The key part being
The value of basename is the module name with dots replaced by the directory separator ('/' or '\', depending on the system)
So your module name of My.Module will be searched for as My/Module.hs. You would need to have a directory structure like
project/
  My/
    Module.hs
  project.cabal
And from the folder project you could run
$ cabal repl
GHCi, version 7.8.3: http://www.haskell.org/ghc/ :? for help
Loading package ghc-prim ... linking ... done.
Loading package integer-gmp ... linking ... done.
Loading package base ... linking ... done.
> import My.Module
You can do this if your file is named MyModule.hs and your module name is MyModule, but it's just a special case of the rule above.
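If loading by file name is acceptable after all, :load handles the mismatch directly (a session sketch; the output format matches the GHCi version shown above):

> :load MyModule.hs
[1 of 1] Compiling My.Module ( MyModule.hs, interpreted )
Ok, modules loaded: My.Module.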
There are good reasons for this, namely that it enforces a layout that keeps both your project structure and GHC's search algorithm simple. If this rule wasn't in place, what would stop me from having
project/
  MyModule1.hs
  MyModule2.hs
where both .hs files had the module declaration My.Module? Which one would be the correct one to load in GHCi if I ran import My.Module? Because the rule fixes the filename and path, you immediately know that the module X.Y.Z.W.Q.R.S.T is at the path X/Y/Z/W/Q/R/S/T.hs, no searching required. It removes a lot of the ambiguity that could occur with looser module name specifications.

Warning building a kernel module that uses exported symbols

I have two kernel modules (say modA and modB). modA exports a symbol with EXPORT_SYMBOL(symA) and modB uses it. I have the header modA.h for modA:
...
extern void symA(int param);
...
and in modB.c:
#include "modA.h"
...
static int __init modB_init(void)
{
        symA(10);
        return 0; /* module init functions must return 0 on success */
}
...
If I insmod modB, everything works fine: modB is correctly linked into the kernel and the function symA is correctly called. However, when I build modB, the compiler raises a warning: symA is undefined. An LKM is an ELF relocatable object, so why does the compiler raise this warning? How can it be removed?
This issue (and how to compile correctly in this case) is explained in http://www.kernel.org/doc/Documentation/kbuild/modules.txt
Sometimes, an external module uses exported symbols from another
external module. kbuild needs to have full knowledge of all symbols
to avoid spitting out warnings about undefined symbols. Three
solutions exist for this situation.
NOTE: The method with a top-level kbuild file is recommended but may
be impractical in certain situations.
Use a top-level kbuild file
If you have two modules, foo.ko and
bar.ko, where foo.ko needs symbols from bar.ko, you can use a
common top-level kbuild file so both modules are compiled in the
same build. Consider the following directory layout:
./foo/ <= contains foo.ko
./bar/ <= contains bar.ko
The top-level kbuild file would then look like:
#./Kbuild (or ./Makefile):
    obj-y := foo/ bar/
And executing
$ make -C $KDIR M=$PWD
will then do the expected and compile both modules with full
knowledge of symbols from either module.
Use an extra Module.symvers file
When an external module is built,
a Module.symvers file is generated containing all exported symbols
which are not defined in the kernel. To get access to symbols from
bar.ko, copy the Module.symvers file from the compilation of bar.ko
to the directory where foo.ko is built. During the module build,
kbuild will read the Module.symvers file in the directory of the
external module, and when the build is finished, a new
Module.symvers file is created containing the sum of all symbols
defined and not part of the kernel.
Use "make" variable KBUILD_EXTRA_SYMBOLS If it is impractical to
copy Module.symvers from another module, you can assign a space
separated list of files to KBUILD_EXTRA_SYMBOLS in your build file.
These files will be loaded by modpost during the initialization of
its symbol tables.
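For the third option, modB's build file might contain something like this sketch (the path to modA's build directory is an assumption):

# Kbuild file for modB (sketch)
obj-m := modB.o
KBUILD_EXTRA_SYMBOLS := /path/to/modA/Module.symvers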

Why does "cabal sdist" not include all "files needed to build"?

According to the wiki entry,
It packages up the files needed to build the project
I have a simple executables-only .cabal project, which basically contains
Executable myprog
  hs-source-dirs: src
  main-is:        MyMain.hs
and is made up of some additional .hs files below src/ beyond src/MyMain.hs. E.g., src/Utils.hs and a few others.
cabal build has no problems building myprog, and compiles the required additional .hs files below src/, but cabal sdist does not, thus creating a dysfunctional source-tarball. What am I doing wrong? How do I tell cabal to include all source files below hs-source-dirs?
As a side-note, with GNU Autotools, there was a make distcheck target, which would first build a source-tarball, and then try to build the project via the newly generated source-tarball, thus ensuring everything's ok. Is there something similar for cabal, in order to make sure my source-tarball is sound?
You should list the other Haskell files in the .cabal file, inside the Executable stanza.
other-modules: Utils AFewOthers
The distribution only includes source files that are listed in your .cabal file. Cabal has no other way to detect which source files are in your package. You could still build because cabal build calls ghc --make, and ghc will find and compile all the source files it needs.
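Putting it together, the stanza from the question would become something like the following (AFewOthers stands in for the remaining modules under src/):

Executable myprog
  hs-source-dirs: src
  main-is:        MyMain.hs
  other-modules:  Utils, AFewOthers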
