Ivory: how to use the ivory-hw package

I'm experimenting with Ivory (http://ivorylang.org, https://github.com/GaloisInc/ivory) and using the ivory-hw module to manipulate some registers in a microcontroller.
cmain :: Def ('[] :-> ())
cmain = voidProc "main" $ body $ do
  setReg regFoo $ do
    clearBit foo_bitbar
    setBit foo_bitbaz
  forever $ return ()
main_module :: Module
main_module = package "main" $ do
  incl cmain
main :: IO ()
main = runCompiler [ main_module ] []
         (initialOpts { constFold = True, outDir = Just "out" })
Building and running gives:
$ exe
*** Procedure main
ERROR: [ No location available ]:
Unbound value: 'ivory_hw_io_write_u32'
exe: Sanity-check failed!
Adding the option scErrors = False to runCompiler turns sanity checks off and the code runs to completion generating sources.
However, main.c contains a call to ivory_hw_io_write_u32 but this function is not defined anywhere (perhaps explaining the error). Poking about github, I can find examples that have a file ivory_hw_prim.h.
After some experimentation, I can include this by adding a module for the hw stuff and then adding that as a dependency to my main_module:
hw_module :: Module
hw_module = package "ivory_hw_prim" hw_moduledef

main_module :: Module
main_module = package "main" $ do
  depend hw_module
  incl cmain
and calling the runCompiler with hw_artifacts added to generate the header:
main = runCompiler [ main_module ] hw_artifacts
         (initialOpts { scErrors = False, constFold = True, outDir = Just "out" })
This adds ivory_hw_prim.h to the collection of files generated and includes the necessary include in main.h.
However, this only works by retaining the scErrors = False option to runCompiler which suggests that I am still not doing this right.
My question is therefore: What is the correct way to use Ivory's HW package?

The solution is to include hw_moduledef in the package:
main_module :: Module
main_module = package "main" $
  incl cmain >> hw_moduledef
(The depend function just includes the header.) Including hw_moduledef in the package "main" makes its definitions visible to the sanity-checker.
By the way, the Ivory module system may be improved in the future so that Ivory computes dependencies at compile time, relieving the programmer of having to make explicit includes.
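Putting the fix together, the whole program should compile cleanly without scErrors = False. A sketch, assuming the usual module layout of the ivory, ivory-hw, and ivory-backend-c packages (the import paths here are my assumption, not taken from the question):

```haskell
import Ivory.Language
import Ivory.HW (hw_moduledef, hw_artifacts)
import Ivory.Compile.C.CmdlineFrontend

-- cmain as defined in the question
-- ...

main_module :: Module
main_module = package "main" $
  -- hw_moduledef makes ivory_hw_io_write_u32 etc. visible to the sanity checker
  incl cmain >> hw_moduledef

main :: IO ()
main = runCompiler [main_module] hw_artifacts
         (initialOpts { constFold = True, outDir = Just "out" })
```
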

Related

Haskell-Stack failing to build project with DateTime dependency despite entry in stack.yaml

So I'm trying to add the datetime-0.3.1 package, and I added what I think is the correct reference in the stack.yaml file. I tried using stack solver, but that doesn't seem to exist anymore. I also looked for some equivalent of pip so I could just do stack install datetime-0.3.1 or something similar, but that doesn't appear to be something stack does.
The code:
module FhirDataTypes (
  FhirId (..),
  toFhirId
) where
import Data.Maybe (Maybe(..))
import Data.List (length)
import Coding as Coding
import Data.Decimal
import FhirUri (FhirUri(..))
import FhirString (FhirString(..))
import SimpleQuantity (SimpleQuantity(..))
import Data.DateTime
newtype FhirId = FhirId FhirString deriving (Show)

toFhirId :: FhirString -> Maybe FhirId
toFhirId fs@(FhirString s)
  | length s > 64 = Nothing
  | otherwise     = Just $ FhirId fs
data Money = Money { value :: Decimal
                   , currency :: Code
                   }

data Range = Range { low :: SimpleQuantity
                   , high :: SimpleQuantity
                   }

data Ratio = Ratio { numerator :: Quantity
                   , denominator :: Quantity
                   }

data Period = Period { start :: DateTime
                     , end :: DateTime
                     }
The error I'm getting:
PS C:\util\haskell\fhir-practice> stack build
Error: While constructing the build plan, the following exceptions were encountered:
In the dependencies for fhir-practice-0.1.0.0:
DateTime needed, but the stack configuration has no specified version (no package with that name found, perhaps there is a typo in
a package's build-depends or an omission from the stack.yaml packages list?) needed since fhir-practice is a build target.
Some different approaches to resolving this:
Plan construction failed.
My stack.yaml file:
flags: {}
packages:
- .
extra-deps:
- network-uri-2.6.1.0#sha256:62cc45c66023e37ef921d5fb546aca56a9c786615e05925fb193a70bf0913690
- Decimal-0.4.2
- datetime-0.3.1
resolver: lts-13.24
stack install is mostly used for installing binaries globally, not for project-specific packages.
You probably want to use the time package, not datetime, as the former is actively maintained. Moreover, in your case, time is already present in LTS-13.24, so you shouldn't need to add it to extra-deps. The extra-deps field is only for dependencies (including transitive ones) that are not present in your resolver.
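For example, the Period record from the question ports directly to time's UTCTime (this adaptation is mine, not the asker's code):

```haskell
import Data.Time (UTCTime)

-- Same shape as the question's Period, but using the time package,
-- which ships with LTS-13.24 and needs no extra-deps entry.
data Period = Period { start :: UTCTime
                     , end   :: UTCTime
                     } deriving (Show, Eq)
```
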

Compiling Haskell package using c++ and stack/cabal

There's a package clipper http://hackage.haskell.org/package/clipper that I want to use to detect intersection of complex polygons. It's an FFI to a C++ package. It works fine if you run
cabal build --with-gcc=/usr/bin/g++
but not otherwise. Is there some way to put that gcc option into the cabal file or otherwise get my stack project to build the dependency with g++?
Setting the $PATH for some reason doesn't work:
% cabal build --with-gcc=g++
Building clipper-0.0.1...
Preprocessing library clipper-0.0.1...
[1 of 1] Compiling Algebra.Clipper ( dist/build/Algebra/Clipper.hs, dist/build/Algebra/Clipper.o )
In-place registering clipper-0.0.1...
% PATH=$(pwd):$PATH gcc
g++: fatal error: no input files
compilation terminated.
% PATH=$(pwd):$PATH cabal build
Building clipper-0.0.1...
Preprocessing library clipper-0.0.1...
In file included from Clipper.hsc:27:0:
cbits/clipper.hpp:29:18: fatal error: vector: Dosiero aŭ dosierujo ne ekzistas ("No such file or directory")
compilation terminated.
compiling dist/build/Algebra/Clipper_hsc_make.c failed (exit code 1)
command was: /usr/bin/gcc -c dist/build/Algebra/Clipper_hsc_make.c -o dist/build/Algebra/Clipper_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -Icbits -Idist/build/autogen -include dist/build/autogen/cabal_macros.h -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/include -I/usr/local/haskell/ghc-7.10.2-x86_64/lib/ghc-7.10.2/include/
Similarly, changing the Setup.hs as proposed by ErikR below didn't help.
% runghc Setup.hs build
"Hello, I am running"
fomg
BuildFlags
{ buildProgramPaths = []
, buildProgramArgs = []
, buildDistPref = Flag "dist"
, buildVerbosity = Flag Normal
, buildNumJobs = NoFlag
, buildArgs = []
}
fimg
BuildFlags
{ buildProgramPaths = [ ( "gcc" , "/usr/bin/g++" ) ]
, buildProgramArgs = []
, buildDistPref = Flag "dist"
, buildVerbosity = Flag Normal
, buildNumJobs = NoFlag
, buildArgs = []
}
Building clipper-0.0.1...
Preprocessing library clipper-0.0.1...
In file included from Clipper.hsc:27:0:
(etc)
Note that it crashes at the buildHook line, so in order to get the flags printed I needed to change the order around.
Try this simpler custom Setup.hs file:
import Distribution.Simple
import System.Environment
main = do
  args <- getArgs
  defaultMainArgs $ ["--with-gcc=c++"] ++ args
Original Answer
Some ideas:
(1) Create a wrapper script called gcc which invokes g++. Put the script early in your PATH so that running gcc will run your script instead of the real gcc. Perhaps you have this wrapper script in place only when you are building the clipper package.
(2) Write a custom Setup.hs.
Modify the build-type field in the clipper.cabal file to be Custom instead of Simple
Replace the Setup.hs file with:
import Distribution.Simple
import Distribution.Simple.Setup
import Distribution.Simple.LocalBuildInfo
import Distribution.PackageDescription
import Text.Show.Pretty
main = do
  let hooks = simpleUserHooks

      myBuild :: PackageDescription -> LocalBuildInfo -> UserHooks -> BuildFlags -> IO ()
      myBuild pkgd buildinfo uhooks flags = do
        putStrLn $ ppShow flags
        let flags' = flags { buildProgramPaths =
                               [("gcc", "g++")] ++ buildProgramPaths flags }
        buildHook hooks pkgd buildinfo uhooks flags'
        putStrLn $ ppShow flags'

      hooks' = hooks { buildHook = myBuild }

  defaultMainWithHooks hooks'
Note: The import of Text.Show.Pretty is not necessary, so you can remove it and the ppShow calls if you don't have it installed.
The above Setup.hs should have the same effect as calling cabal with --with-gcc=g++.

"Could not find module ‘Test.HUnit’" Error when executing Haskell's unittest (HUnit) in CodeRunner

I have simple unit test code for Haskell's HUnit. I use Mac OS X 10.10, and I installed HUnit with cabal install hunit.
module TestSafePrelude where

import SafePrelude (safeHead)
import Test.HUnit

testSafeHeadForEmptyList :: Test
testSafeHeadForEmptyList =
  TestCase $ assertEqual "Should return Nothing for empty list"
    Nothing (safeHead ([] :: [Int]))

testSafeHeadForNonEmptyList :: Test
testSafeHeadForNonEmptyList =
  TestCase $ assertEqual "Should return (Just head) for non empty list"
    (Just 1) (safeHead ([1] :: [Int]))

main :: IO Counts
main = runTestTT $ TestList [testSafeHeadForEmptyList, testSafeHeadForNonEmptyList]
I can execute it with runhaskell TestSafePrelude.hs to get the results:
Cases: 2 Tried: 2 Errors: 0 Failures: 0
Counts {cases = 2, tried = 2, errors = 0, failures = 0}
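(For reference, the SafePrelude module under test is not shown in the question; a minimal definition matching the test's import would be — this is my assumption of its contents:)

```haskell
-- SafePrelude.hs (assumed contents; the asker's module is not shown):
-- module SafePrelude (safeHead) where

safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x
```
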
However, when I run it in CodeRunner, I get an error message saying it can't find the HUnit module.
CodeRunner launches the test on a different shell environment, and this seems to be the issue. If so, what environment variables need to be added? If not, what might be causing the problem?
I also find that ghc-pkg list run from CodeRunner does not search the directories in ~/.ghc, which contain HUnit.
/usr/local/Cellar/ghc/7.8.3/lib/ghc-7.8.3/package.conf.d:
Cabal-1.18.1.4
array-0.5.0.0
...
xhtml-3000.2.1
This is the results when executed in shell:
/usr/local/Cellar/ghc/7.8.3/lib/ghc-7.8.3/package.conf.d
Cabal-1.18.1.4
array-0.5.0.0
...
/Users/smcho/.ghc/x86_64-darwin-7.8.3/package.conf.d
...
HUnit-1.2.5.2
...
zlib-0.5.4.2
I added both ~/.cabal and ~/.ghc to the path, but it doesn't work.
The problem was the $HOME setting. I used a different $HOME for CodeRunner, but Haskell searches $HOME/.cabal and $HOME/.ghc for installed packages.
After resetting $HOME to the correct location, everything works fine.

How to make interface and implementation files separately?

I would like to make interface (class, or instance) and implementation files in Haskell separately, as follows:
file1: (For interface)
class X where
  funcX1 = doFuncX1
  funcX2 = doFuncX2
  ....
instance Y where
  funcY1 = doFuncY1
  funcY2 = doFuncY2
  ...
file 2: (For implementation)
doFuncX1 = ...
doFuncX2 = ...
doFuncY1 = ...
...
How can I do that when file1 must be imported in file2 and vice versa?
You don't need any such cumbersome separation in Haskell. Just mark what you want to be public in the module export list (module Foo ( X(..) ... ) where ...) and build your project with cabal. If you want to ship a library without releasing the source code, you can simply publish only the dist folder with the binary interface files and the Haddock documentation. That's much more convenient than nasty e.g. .h and .cpp files that need to be kept in sync manually.
But of course, nothing prevents you from putting implementations in a separate, non-public file. You just don't need "vice versa" imports for this; only perhaps a common file with the necessary data type declarations. E.g.
Public.hs:
module Public (module Public.Datatypes) where

import Public.Datatypes
import Private.Implementations

instance X Bar where { funcX1 = implFuncX1; ... }
Public/Datatypes.hs:
module Public.Datatypes where

data Bar = Bar { ... }

class X bar where { funcX1 :: ... }
Private/Implementations.hs:
module Private.Implementations (implFuncX1, ...) where

import Public.Datatypes

implFuncX1 :: ...
implFuncX1 = ...
But usually it would be better to simply put everything in Public.hs.
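Filled in with illustrative names (everything beyond the skeleton above is invented for this example), and collapsed into one file here with the module headers shown as comments, the layout might look like:

```haskell
-- Public/Datatypes.hs would contain:
-- module Public.Datatypes where
data Bar = Bar { barValue :: Int }

class X bar where
  funcX1 :: bar -> Int

-- Private/Implementations.hs would contain:
-- module Private.Implementations (implFuncX1) where
implFuncX1 :: Bar -> Int
implFuncX1 = barValue

-- Public.hs would contain:
-- module Public (module Public.Datatypes) where
instance X Bar where
  funcX1 = implFuncX1
```

Importers of Public see Bar, the class X, and the instance, but never the name implFuncX1.
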

Any example of a custom PreProcessor in Haskell?

I've walked through the cabal Distribution.Simple* packages and know that the PreProcessor data type can be used to define custom pre-processors. But the example provided is not very useful: I don't know how to invoke the pre-processor.
Currently, I just define my own pre-processors in the Setup.hs file.
Are there any complete examples for this feature?
[EDITED]
Check this mailing-list archive I just found. But that solution involves transforming one type of file (identified by its extension) into another.
What I want to do is to inject code into existing .hs files where a custom mark is defined, e.g.
-- <inject point="foo">
-- extra Haskell code goes here
-- </inject>
One of the most important things to do is to set the build-type in your .cabal file to Custom. If it stays Simple, Cabal will completely ignore the Setup.hs file.
Build-Type: Custom
Here is an example custom preprocessor from my package. It first runs cpphs and then runs hsc2hs:
#!/usr/bin/env runhaskell
> {-# LANGUAGE BangPatterns #-}
> import Distribution.Simple
> import Distribution.Simple.PreProcess
> import Distribution.Simple.Utils
> import Distribution.PackageDescription
> import Distribution.Simple.LocalBuildInfo
> import Data.Char
> import System.Exit
> import System.IO
> import System.Directory
> import System.FilePath.Windows
> main = let hooks = simpleUserHooks
>            xpp   = ("xpphs", ppXpp)
>        in defaultMainWithHooks hooks { hookedPreProcessors = xpp : knownSuffixHandlers }
>
> ppXpp :: BuildInfo -> LocalBuildInfo -> PreProcessor
> ppXpp build local =
>   PreProcessor {
>     platformIndependent = True,
>     runPreProcessor = mkSimplePreProcessor $ \inFile outFile verbosity ->
>       do info verbosity (inFile ++ " is being preprocessed to " ++ outFile)
>          let hscFile = replaceExtension inFile "hsc"
>          runSimplePreProcessor (ppCpp build local) inFile hscFile verbosity
>          handle <- openFile hscFile ReadMode
>          source <- hGetContents handle
>          length source `seq` hClose handle  -- force the lazy read before closing
>          let newsource = unlines $ process $ lines source
>          writeFile hscFile newsource
>          runSimplePreProcessor (ppHsc2hs build local) hscFile outFile verbosity
>          removeFile hscFile
>          return ()
>   }
This preprocessor will automatically be called by Cabal when any file with the extension .xpphs is found.
In your case, just register the preprocessor with a .hs extension. (I'm not sure whether Cabal allows this; if it doesn't, you can simply rename the files with the injection points to .xh or something. That would actually be better, since you then don't process every file in your project.)
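The process function in the literate snippet above is not shown; for the injection use case, here is a sketch of what it might do. The marker format follows the question, but inject and its table argument are names I made up:

```haskell
import Data.List (isInfixOf, isPrefixOf)

-- Replace everything between "-- <inject point=\"name\">" and
-- "-- </inject>" with the code registered for that point name.
inject :: [(String, [String])] -> [String] -> [String]
inject table = go
  where
    go [] = []
    go (l : ls)
      | "-- <inject" `isPrefixOf` l =
          let (_, rest) = break ("-- </inject>" `isPrefixOf`) ls
              body = concat [ code | (name, code) <- table
                            , ("point=\"" ++ name ++ "\"") `isInfixOf` l ]
          in body ++ go (drop 1 rest)   -- drop the closing marker
      | otherwise = l : go ls
```
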
