How to use QuickCheck in Hspec tests?

I built the initial codebase for my Haskell project with cabal init.
I have several tests written with Hspec.
On cabal test it compiles and runs these tests as expected and reports passing/failing.
Now I have included a QuickCheck test, and even when this test fails, the terminal output doesn't mention the QuickCheck test at all.
But in the dist/test/ dir I can see the test log: *** Failed! ...
Is there a way to "include" QuickCheck tests in the test workflow, so that I don't have to look at the test log after every test run?
import Test.Hspec
import Test.QuickCheck

spec :: Spec
spec = do
  describe "myTest" $ do
    it "Something something" $ do
      myTest "" `shouldBe` False
      quickCheckWith stdArgs { maxSuccess = 1000 } prop_myTest -- <== ?

You want the property function.
Example:
spec :: Spec
spec = do
  describe "myTest" $ do
    it "Something something" $
      property prop_myTest
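If you also want to keep the maxSuccess = 1000 setting from the original quickCheckWith call, hspec ships a Test.Hspec.QuickCheck module with modifiers for that. A minimal sketch, assuming prop_myTest is the same property as in the question:

import Test.Hspec
import Test.Hspec.QuickCheck (modifyMaxSuccess, prop)

spec :: Spec
spec =
  describe "myTest" $
    -- run the property 1000 times instead of the default 100
    modifyMaxSuccess (const 1000) $
      prop "Something something" prop_myTest

Here prop s p is shorthand for it s (property p), so a failing property is reported by cabal test like any other failing Hspec example.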

Related

Cabal package difference between readPackageDescription and parsePackageDescription

The Haskell package Cabal-1.24.2 has a module Distribution.PackageDescription.Parse.
The module has two functions: readPackageDescription and parsePackageDescription.
When I run in ghci:
let d = readPackageDescription normal "C:\\somefile.cabal"
I get a parsed GenericPackageDescription.
But when I run in ghci:
content <- readFile "C:\\somefile.cabal"
let d = parsePackageDescription content
I get a parse error:
ParseFailed (FromString "Plain fields are not allowed in between stanzas: F 2 \"version\" \"0.1.0.0\"" (Just 2))
The example file was generated using cabal init.
parsePackageDescription expects the file contents themselves to be passed to it, not the file path they are stored at. You'll want to readFile first... though beware of file encoding issues: http://www.snoyman.com/blog/2016/12/beware-of-readfile
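To make the difference concrete, a minimal sketch against the Cabal-1.24 API from the question (the path and the Show-based printing are just for illustration):

import Distribution.PackageDescription.Parse
       (readPackageDescription, parsePackageDescription)
import Distribution.Verbosity (normal)

main :: IO ()
main = do
  -- readPackageDescription takes a FilePath and does the file IO itself
  gpd <- readPackageDescription normal "C:\\somefile.cabal"
  print gpd

  -- parsePackageDescription takes the file *contents* as a String,
  -- so the caller is responsible for reading (and decoding) the file
  contents <- readFile "C:\\somefile.cabal"
  print (parsePackageDescription contents)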

Compiling Haskell package using c++ and stack/cabal

There's a package clipper (http://hackage.haskell.org/package/clipper) that I want to use to detect intersections of complex polygons. It's an FFI binding to a C++ package. It works fine if you run
cabal build --with-gcc=/usr/bin/g++
but not otherwise. Is there some way to put that gcc option into the cabal file, or otherwise get my stack project to build the dependency with g++?
Setting the $PATH for some reason doesn't work:
% cabal build --with-gcc=g++
Building clipper-0.0.1...
Preprocessing library clipper-0.0.1...
[1 of 1] Compiling Algebra.Clipper ( dist/build/Algebra/Clipper.hs, dist/build/Algebra/Clipper.o )
In-place registering clipper-0.0.1...
% PATH=$(pwd):$PATH gcc
g++: fatal error: no input files
compilation terminated.
% PATH=$(pwd):$PATH cabal build
Building clipper-0.0.1...
Preprocessing library clipper-0.0.1...
In file included from Clipper.hsc:27:0:
cbits/clipper.hpp:29:18: fatal error: vector: No such file or directory
compilation terminated.
compiling dist/build/Algebra/Clipper_hsc_make.c failed (exit code 1)
command was: /usr/bin/gcc -c dist/build/Algebra/Clipper_hsc_make.c -o dist/build/Algebra/Clipper_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -Icbits -Idist/build/autogen -include dist/build/autogen/cabal_macros.h -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/include -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/include/
Similarly, changing the Setup.hs as proposed by @ErikR below didn't help.
% runghc Setup.hs build
"Hello, I am running"
fomg
BuildFlags
{ buildProgramPaths = []
, buildProgramArgs = []
, buildDistPref = Flag "dist"
, buildVerbosity = Flag Normal
, buildNumJobs = NoFlag
, buildArgs = []
}
fimg
BuildFlags
{ buildProgramPaths = [ ( "gcc" , "/usr/bin/g++" ) ]
, buildProgramArgs = []
, buildDistPref = Flag "dist"
, buildVerbosity = Flag Normal
, buildNumJobs = NoFlag
, buildArgs = []
}
Building clipper-0.0.1...
Preprocessing library clipper-0.0.1...
In file included from Clipper.hsc:27:0:
(etc)
Note that it crashes at the buildHook line, so in order to get the flags printed I needed to change the order around.
Try this simpler custom Setup.hs file:
import Distribution.Simple
import System.Environment

main = do
  args <- getArgs
  defaultMainArgs $ ["--with-gcc=c++"] ++ args
Original Answer
Some ideas:
(1) Create a wrapper script called gcc which invokes g++. Put the script early in your PATH so that running gcc will run your script instead of the real gcc. Perhaps only put this wrapper script in place while you are building the clipper package.
(2) Write a custom Setup.hs.
Modify the build-type field in the clipper.cabal file to be Custom instead of Simple
Replace the Setup.hs file with:
import Distribution.Simple
import Distribution.Simple.Setup
import Distribution.Simple.LocalBuildInfo
import Distribution.PackageDescription
import Text.Show.Pretty

main = do
  let hooks = simpleUserHooks

      myBuild :: PackageDescription -> LocalBuildInfo -> UserHooks -> BuildFlags -> IO ()
      myBuild pkgd buildinfo uhooks flags = do
        putStrLn $ ppShow flags
        let flags' = flags { buildProgramPaths = [("gcc", "g++")] ++ buildProgramPaths flags }
        buildHook hooks pkgd buildinfo uhooks flags'
        putStrLn $ ppShow flags'

      hooks' = hooks { buildHook = myBuild }

  defaultMainWithHooks hooks'
Note: The import of Text.Show.Pretty is not necessary, so you can remove it and the ppShow calls if you don't have it installed.
The above Setup.hs should have the same effect as calling cabal with --with-gcc=g++.

Ivory: how to use the ivory-hw package

I'm experimenting with Ivory (http://ivorylang.org, https://github.com/GaloisInc/ivory) and using the ivory-hw module to manipulate some registers in a microcontroller.
cmain :: Def ('[] :-> ())
cmain = voidProc "main" $ body $ do
  setReg regFoo $ do
    clearBit foo_bitbar
    setBit foo_bitbaz
  forever $ return ()

main_module :: Module
main_module = package "main" $ do
  incl cmain

main :: IO ()
main = runCompiler [ main_module ] []
         (initialOpts { constFold = True, outDir = Just "out" })
Building and running gives:
$ exe
*** Procedure main
ERROR: [ No location available ]:
Unbound value: 'ivory_hw_io_write_u32'
exe: Sanity-check failed!
Adding the option scErrors = False to runCompiler turns sanity checks off, and the code runs to completion, generating sources.
However, main.c contains a call to ivory_hw_io_write_u32, but this function is not defined anywhere (which perhaps explains the error). Poking around GitHub, I can find examples that have a file ivory_hw_prim.h.
After some experimentation, I can include this by adding a module for the hw stuff and then adding that as a dependency of my main_module:
hw_module :: Module
hw_module = package "ivory_hw_prim" hw_moduledef

main_module :: Module
main_module = package "main" $ do
  depend hw_module
  incl cmain
and calling runCompiler with hw_artifacts added, so that the header is generated:
main = runCompiler [ main_module ] hw_artifacts
         (initialOpts { scErrors = False, constFold = True, outDir = Just "out" })
This adds ivory_hw_prim.h to the collection of generated files and adds the necessary #include to main.h.
However, this only works if I retain the scErrors = False option to runCompiler, which suggests that I am still not doing this right.
My question is therefore: What is the correct way to use Ivory's HW package?
The solution is to include hw_moduledef in the package:
main_module :: Module
main_module = package "main" $
  incl cmain >> hw_moduledef
(The depend function just includes the header.) Including hw_moduledef in the package "main" makes its definitions visible to the sanity checker.
By the way, the Ivory module system may be improved in the future so that Ivory computes dependencies at compile time, relieving the programmer from having to make these includes explicit.
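Putting this together with the question's compiler driver, the whole thing would then look something like the sketch below. It reuses the question's cmain, hw_moduledef, and hw_artifacts, and assumes the sanity checker now passes so scErrors can be left at its default:

main_module :: Module
main_module = package "main" $ do
  incl cmain
  hw_moduledef   -- make the hw primitives visible to the sanity checker

main :: IO ()
main = runCompiler [ main_module ] hw_artifacts
         (initialOpts { constFold = True, outDir = Just "out" })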

"Could not find module ‘Test.HUnit’" Error when executing Haskell's unittest (HUnit) in CodeRunner

I have simple unit test code for Haskell's HUnit. I use Mac OS X 10.10, and I installed HUnit with cabal install hunit.
module TestSafePrelude where

import SafePrelude ( safeHead )
import Test.HUnit

testSafeHeadForEmptyList :: Test
testSafeHeadForEmptyList =
  TestCase $ assertEqual "Should return Nothing for empty list"
                         Nothing (safeHead ([] :: [Int]))

testSafeHeadForNonEmptyList :: Test
testSafeHeadForNonEmptyList =
  TestCase $ assertEqual "Should return (Just head) for non empty list"
                         (Just 1) (safeHead ([1] :: [Int]))

main :: IO Counts
main = runTestTT $ TestList [testSafeHeadForEmptyList, testSafeHeadForNonEmptyList]
I can execute it with runhaskell TestSafePrelude.hs to get the results:
Cases: 2 Tried: 2 Errors: 0 Failures: 0
Counts {cases = 2, tried = 2, errors = 0, failures = 0}
However, when I run it in CodeRunner, I get an error message saying it can't find the HUnit module.
CodeRunner launches the test in a different shell environment, and this seems to be the issue. If so, what environment variables need to be added? If not, what might be causing the problem?
I also find that ghc-pkg list run from CodeRunner does not include the package database in ~/.ghc, which contains HUnit:
/usr/local/Cellar/ghc/7.8.3/lib/ghc-7.8.3/package.conf.d:
Cabal-1.18.1.4
array-0.5.0.0
...
xhtml-3000.2.1
These are the results when executed in the shell:
/usr/local/Cellar/ghc/7.8.3/lib/ghc-7.8.3/package.conf.d
Cabal-1.18.1.4
array-0.5.0.0
...
/Users/smcho/.ghc/x86_64-darwin-7.8.3/package.conf.d
...
HUnit-1.2.5.2
...
zlib-0.5.4.2
I added both ~/.cabal and ~/.ghc to the path, but it doesn't work.
The problem was a $HOME setup change. I used a different $HOME for CodeRunner, but Haskell searches $HOME/.cabal and $HOME/.ghc for installed packages.
After resetting $HOME to the correct location, everything works fine.
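For completeness, the SafePrelude module the test imports isn't shown in the question; a minimal version consistent with the assertions above would be:

module SafePrelude ( safeHead ) where

-- Total version of head: returns Nothing instead of crashing on an empty list.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x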

Any example of a custom PreProcessor in Haskell?

I've walked through the Cabal Distribution.Simple* packages enough to know that the PreProcessor data type can be used to define custom pre-processors. But the example provided is not so useful, and I don't know how to invoke the pre-processor.
Currently, I just define my own pre-processors in the Setup.hs file.
Are there any complete examples for this feature?
[EDITED]
Check this mailing-list archive I just found. But that solution involves transforming one type of file (identified by its extension) into another.
What I want to do is to inject code into existing .hs files where a custom marker is defined, e.g.
-- <inject point="foo">
-- extra Haskell code goes here
-- </inject>
One of the most important things to do is setting the build-type in your .cabal file to Custom. If it stays at Simple, Cabal will completely ignore the Setup.hs file.
Build-Type: Custom
Here is an example custom preprocessor from my package. It first runs cpphs and then runs hsc2hs:
#!/usr/bin/env runhaskell

> {-# LANGUAGE BangPatterns #-}
> import Distribution.Simple
> import Distribution.Simple.PreProcess
> import Distribution.Simple.Utils
> import Distribution.PackageDescription
> import Distribution.Simple.LocalBuildInfo
> import Data.Char
> import System.Exit
> import System.IO
> import System.Directory
> import System.FilePath.Windows
>
> main =
>   let hooks = simpleUserHooks
>       xpp = ("xpphs", ppXpp)
>   in defaultMainWithHooks hooks { hookedPreProcessors = xpp : knownSuffixHandlers }
>
> ppXpp :: BuildInfo -> LocalBuildInfo -> PreProcessor
> ppXpp build local =
>   PreProcessor {
>     platformIndependent = True,
>     runPreProcessor = mkSimplePreProcessor $ \inFile outFile verbosity ->
>       do info verbosity (inFile ++ " is being preprocessed to " ++ outFile)
>          let hscFile = replaceExtension inFile "hsc"
>          runSimplePreProcessor (ppCpp build local) inFile hscFile verbosity
>          handle <- openFile hscFile ReadMode
>          source <- hGetContents handle
>          length source `seq` hClose handle  -- force the whole read before closing the handle
>          -- 'process' is this package's own line-by-line transformation (not shown here)
>          let newsource = unlines $ process $ lines source
>          writeFile hscFile newsource
>          runSimplePreProcessor (ppHsc2hs build local) hscFile outFile verbosity
>          removeFile hscFile
>          return ()
>   }
This preprocessor will automatically be called by Cabal whenever a file with the extension .xpphs is found.
In your case, just register the preprocessor with a .hs extension. (I'm not sure if Cabal allows this, but if it doesn't, you can simply rename the files containing injection points to .xh or something. That would actually be better, since you then don't process every file in your project.)
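For that last suggestion, a minimal sketch of a Setup.hs that registers a hypothetical .xh extension, written against the same Cabal 1.x hook API as the answer above (newer Cabal versions pass an extra ComponentLocalBuildInfo argument to each handler); injectCode and the .xh extension are made up for illustration:

import Distribution.Simple
import Distribution.Simple.PreProcess
import Distribution.Simple.LocalBuildInfo
import Distribution.PackageDescription

main :: IO ()
main = defaultMainWithHooks simpleUserHooks
  { hookedPreProcessors = ("xh", ppInject) : knownSuffixHandlers }

-- Turn each Foo.xh into Foo.hs, running the injection step over its contents.
ppInject :: BuildInfo -> LocalBuildInfo -> PreProcessor
ppInject _build _local = PreProcessor
  { platformIndependent = True
  , runPreProcessor = mkSimplePreProcessor $ \inFile outFile _verbosity -> do
      source <- readFile inFile
      writeFile outFile (injectCode source)
  }

-- Hypothetical: scan for the "-- <inject point=...>" markers and splice in
-- the extra code; left as a stub here.
injectCode :: String -> String
injectCode = id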
