My stack build shows warnings from external packages, like:
happy > /tmp/stack-d6caed253e9f21bf/happy-1.20.0/src/ProduceGLRCode.lhs:224:12: warning: [-Wincomplete-uni-patterns]
happy > Pattern match(es) are non-exhaustive
There's not much I can do about external code, and I don't want to see these warnings. However, I still want to build my project strictly with GHC options -Wall -Werror.
Is there a way to suppress the warnings from external packages? I'm using stack with hpack. Note that the solution has to work from the command line, since updating stack.yaml is not possible due to the large number of projects involved.
Things I've tried:
In stack.yaml:
ghc-options:
  "$targets": -Wall -Werror
  "$everything": -w
Combined with --verbosity warn on the command line, this works, but I couldn't find a way to specify the above ghc-options from the command line.
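Concretely, the working invocation is just a plain build with reduced verbosity, relying on the stack.yaml entry above:
stack build --verbosity warn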
In stack.yaml:
apply-ghc-options: targets
This doesn't work; warnings from external packages are still visible. Besides, I couldn't find a way to specify apply-ghc-options from the command line. A sketch of the combination I tried is shown below.
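As I understand it, apply-ghc-options: targets only scopes the command-line --ghc-options flag to the build targets, so the strict options still have to be passed on the command line:
# stack.yaml
apply-ghc-options: targets
# command line
stack build --ghc-options="-Wall -Werror"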
P.S.: I opened a ticket on GitHub after getting no answer here.
Related
I want to add the ViewPatterns extension to my build, and currently I do it by adding the following lines to package.yaml:
ghc-options:
- -Wall
- -XViewPatterns
I get a warning:
Warning: Instead of 'ghc-options: -XViewPatterns' use 'extensions:
ViewPatterns'
But, when I add an extensions field to package.yaml, I get the following:
WARNING: Ignoring unknown field "extensions" in package description
Also, I could not find any definition of extensions in the official stack documentation.
Do not confuse stack with hpack. package.yaml is actually read by hpack, but the stack build command implicitly calls hpack to convert it into a cabal file, which can be confusing.
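If you want to inspect that conversion, you can regenerate the .cabal file yourself (a sketch, assuming you have the standalone hpack executable installed; stack build does the same thing implicitly):
hpack
and then look at the generated .cabal file.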
Anyway, to specify extensions in package.yaml,
use default-extensions:
default-extensions: ViewPatterns
extensions: is currently unsupported by stack; see the follow-ups in this GitHub issue.
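For reference, a minimal package.yaml sketch combining the two settings (hpack's list form also works for default-extensions):
ghc-options:
- -Wall
default-extensions:
- ViewPatterns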
GHC can be used as a linter via the Neovim plugin ale. The configuration uses the following command to lint a file:
stack ghc -- -fno-code -v0 %t
where %t is the file in question. This is fast and works well; however, it doesn't recognize any of the options specified in the Cabal file, and it fails to run if the file references internal modules.
Is there some way to get the GHC command that would be issued by cabal build or stack build, so that we can recover the extensions, referenced modules, etc. necessary to get this working?
Turning on verbosity should give you the information you're looking for:
stack build --verbose --cabal-verbose
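Once you have dug the relevant flags out of the verbose output, you can feed them back into the lint command by hand; for example (the -isrc and -XViewPatterns flags here are only placeholders for whatever your package actually uses):
stack ghc -- -fno-code -v0 -isrc -XViewPatterns %t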
I have a project which builds a shared library, and I use the following part in my project.cabal file:
executable libsxp.so
  main-is: Somefile.hs
  default-language: Haskell2010
  ghc-options: -shared -dynamic -fPIC -lHSrts-ghc7.10.2
The GHC version is controlled by Stack, so is there a way to obtain the version and append it to form -lHSrts-ghc{version}, or is there some config for it? I tried setting
stack build --ghc-options='-O0 -lHSrts-ghc7.10.2'
but it doesn't seem to pick it up.
Also, to clarify: cabal install is called by Stack, not by me.
Does that cabal file work? If so, then it should be sufficient to do something like this:
executable libsxp.so
  ghc-options: -shared -dynamic -fPIC
  if impl(ghc >= 7.10.2 && < 7.10.3)
    ghc-options: -lHSrts-ghc7.10.2
  else if impl(ghc >= 7.10.3 && < 7.10.4)
    ghc-options: -lHSrts-ghc7.10.3
  else if ...
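If your Cabal version rejects the else if chain, the same thing can be written either by nesting an if inside the else block or, from cabal-version 2.2 onwards, with elif; a sketch of the elif form (the version bounds are just illustrative):
  if impl(ghc >= 7.10.2 && < 7.10.3)
    ghc-options: -lHSrts-ghc7.10.2
  elif impl(ghc >= 7.10.3 && < 7.10.4)
    ghc-options: -lHSrts-ghc7.10.3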
BTW, why does your executable end in .so? I've never seen that in an executable clause.
Are you sure you're using 7.10.2 and not 7.10.3? Try stack exec -- ghc --version
The general principle is described in this answer: https://stackoverflow.com/a/6034881/1663197
Using the configure style in Cabal, you can write a little configure
script that substitutes a variable for the output of the sdl-config
command. The values will then be replaced in a $foo.buildinfo.in file,
yielding a $foo.buildinfo file, that Cabal will include in the build
process.
First you need to switch your Cabal build-type to Configure in project.cabal; the Configure style is described in the Cabal users guide. For build-type Configure, the contents of Setup.hs must be:
import Distribution.Simple
main = defaultMainWithHooks autoconfUserHooks
To handle the GHC runtime version, you can put a #GHC_VERSION# placeholder for it in a project.buildinfo.in file:
ghc-options: -lHSrts-ghc#GHC_VERSION#
Finally, you write a configure shell script that gets the GHC version (as mgsloan suggested) and generates the project.buildinfo file by substituting the #GHC_VERSION# variable in project.buildinfo.in:
#!/bin/sh
GHC_VERSION=$(stack exec -- ghc-pkg field ghc version --simple-output)
sed 's,#GHC_VERSION#,'"$GHC_VERSION"',' project.buildinfo.in > project.buildinfo
This way, when the build starts it will first execute the configure script, then read the project.buildinfo file and merge it with project.cabal.
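One detail that is easy to forget: the configure script needs to be executable, otherwise the build will fail to run it:
chmod +x configure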
It may also be worth populating extra-source-files with configure and project.buildinfo.in, and extra-tmp-files with project.buildinfo, in project.cabal, roughly as sketched below.
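A sketch of the relevant project.cabal fields (using the file names from above):
build-type: Configure
extra-source-files:
  configure
  project.buildinfo.in
extra-tmp-files:
  project.buildinfo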
A more sophisticated solution may be inspired by this answer: https://stackoverflow.com/a/2940799/1663197
While building a DLL under Windows I get the following output:
Linking main.exe ...
Warning: resolving _findPeaksWrapper by linking to _findPeaksWrapper#16
Use --enable-stdcall-fixup to disable these warnings
Creating library file: HSdll.dll.a
Use --disable-stdcall-fixup to disable these fixups
It’s not clear to me where I should be placing the --enable-stdcall-fixup flag. Putting it into the ghc-options field of my .cabal file gives a GHC error, while putting it into cc-options or ld-options seems not to do anything (the warnings are still displayed). Where should this flag go?
Googling indicates that --enable-stdcall-fixup is an option to ld. There are a few different pathways by which cabal's final link step can happen, but in your case it is apparently
Cabal -> ghc (link step) -> gcc -> ld
so to match this you must specify
ghc-options: -optl-Wl,--enable-stdcall-fixup
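For clarity, the flag goes in the ghc-options of the component being linked; a minimal sketch (the stanza name and main module here are placeholders):
executable main.exe
  main-is: Main.hs
  ghc-options: -optl-Wl,--enable-stdcall-fixup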
I have a project that builds a single executable. With cabal, if I use the -fforce-recomp flag like this:
cabal build --ghc-options="-Wall -fforce-recomp"
GHC compiles every single module (all 24 of them in my project) twice. Any errors and warnings I get are exactly the same the second time around. Even if no errors occurred, it still compiles everything twice. Is this a bug?
EDIT: If I bypass cabal and manually invoke ghc -Wall -fforce-recomp --make $PROJECT_NAME in my source directory, I do not get this behavior. So, it probably has something to do with cabal build in a sandbox. Also, I've noticed that between the duplicate compilation runs, only the last one ends with the message "Linking dist/build/..." --- i.e., it compiles, compiles again, and then links.
EDIT2: OK, so I've done a search into my project's directory, and I think it's definitely a cabal sandbox issue. I get *.hi files for all my modules in ./dist/dist-sandbox-c40738d9/build/maxa/maxa-tmp/ and again in ./dist/build/maxa/maxa-tmp/. I have a strong feeling that these are the footprints of the two compilation runs.
EDIT3: Well, I just realized that both compilation runs occur in ./dist/build/maxa/maxa-tmp/ --- the output says so like this:
[14 of 24] Compiling Maxa.State ( src/Maxa/State.lhs, dist/build/maxa/maxa-tmp/Maxa/State.o )
(I wonder how I missed it before I wrote EDIT2...). So it has nothing to do with the cabal sandbox hashcode directory.
EDIT4: Well, there is a project of mine on GitHub that also compiles twice with -fforce-recomp --- you can check out the .cabal file and everything at https://github.com/listx/netherworld.