Is there a way to get the path of the file where the function is defined?
For example:
rootappdir
|- Foo.hs
|- Bar.hs
module Foo where
getThisDir :: IO FilePath
getThisDir = ...
Prelude> getThisDir
absolute/path/to/rootappdir/Foo.hs
If it is possible with an even simpler function of type FilePath, that's even better.
Maybe we would need to use the preprocessor?
You need to use Template Haskell to do this.
{-# LANGUAGE TemplateHaskell #-}
import Data.Functor
import Language.Haskell.TH
import System.Directory
import System.FilePath
filePath :: String
filePath = $(do
    dir <- runIO getCurrentDirectory
    filename <- loc_filename <$> location
    litE $ stringL $ dir </> filename)
I suppose you can't get this information at run time, but you can get it at compile time through Template Haskell, using the function Language.Haskell.TH.location or qLocation.
If you need logging functionality, you can use the monad-logger package; you can find an example of using qLocation there.
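For instance, here is a minimal sketch using location; the splice captures only the file name GHC was given for the module, without prepending the compile-time working directory the way the answer above does:

{-# LANGUAGE TemplateHaskell #-}

import Language.Haskell.TH (litE, loc_filename, location, stringL)

-- Captured at compile time: the path of this source file as passed to GHC.
thisFile :: FilePath
thisFile = $(litE . stringL . loc_filename =<< location)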
Related
I am aware of this thread and the agreed-upon ghci :browse command, but I am looking for something similar to run from a script.hs file:
Say I have a module that I can import into my script.hs. How do I then view the list of functions I have just gained access to?
What I've settled on for now
Adapting this thread that suggests the now-deprecated ghc-mod command-line program, I am
calling the terminal command ghc -e ':browse <module, e.g. Data.List>'
from my script.hs using Shelly.
My full script:
#!/usr/bin/env runghc
{-# LANGUAGE OverloadedStrings #-}
import Safe (headDef)
import Shelly
import System.Environment (getArgs)
import qualified Data.Text as T
mdl :: IO String
mdl = getArgs >>= return . headDef "Data.List"
runShelly :: String -> IO ()
runShelly mdl = shelly $ silently $ do
    out <- run "ghc" ["-e", T.pack (":browse " ++ mdl)]
    let lns = T.lines out
    liftIO $ mapM_ (putStrLn . T.unpack) lns
main :: IO ()
main = mdl >>= runShelly
This way I can pass the module name on the command line as <script> <module> and get back the functions, one per line. It defaults to Data.List if I pass no arguments.
So that's a solution, but surely there must be handier introspection facilities than this?
Similar question (Is there a way to see the list of functions in a module, in GHCI?), though not the result that I seek.
Is there a way to get a list of what is exported by a module?
Of course in GHCi you can import the module, type Some.Module., hit Tab for auto-completion, and it will show what I seek. But I want to capture that output. Roughly speaking, I want a String -> [String].
Purpose? Suppose I have a source file with a naked import Some.Module. Question: which names used in that file belong to Some.Module? A simple way would be to output the list of what the module exports, feed that to grep, and return the contenders, without needing to load that source file in GHCi (which might be complicated or not even possible). And everything becomes a lot clearer.
If there's a smarter approach to that, I'm listening. I've heard of solutions involving GOA and lambdabot, but I have no idea whether they apply or how to make use of them.
As @HTNW mentioned in a comment, if you can run ghc on your actual file, you can use -ddump-minimal-imports. Otherwise, if you want to actually get the list of exports from another module, assuming that you're using GHC, the easiest way to do this is probably to look at the .hi interface files. ghc has some built-in support for printing a human-readable representation of an interface file (--show-iface) once you know the path to one, but as the wiki page notes, "This textual format is not particularly designed for machine parsing". You can also access the information you might want via the GHC API. A small example of doing something like that follows.
We start with a bunch of imports for doing IO and for the GHC API:
import Control.Monad.IO.Class
import System.IO
import System.Environment
import GHC
import GHC.Paths (libdir)
import DynFlags
import Outputable
import Name
import Pretty (Mode(..))
With that bureaucracy out of the way, main starts by firing up the GHC Monad:
main :: IO ()
main = defaultErrorHandler defaultFatalMessager defaultFlushOut $ do
  runGhc (Just libdir) $ do
We're not actually generating any code so we can set hscTarget = HscNothing during the DynFlags setup boilerplate:
    dflags <- getSessionDynFlags
    let dflags' = dflags { hscTarget = HscNothing }
    setSessionDynFlags dflags'
With that out of the way we can find the module we want from the package database (using the first command-line argument as the name):
    mn <- head <$> (liftIO $ getArgs)
    m <- lookupModule (mkModuleName mn) Nothing
We can use getModuleInfo to get a module info structure:
    mmi <- getModuleInfo m
    case mmi of
      Nothing -> liftIO $ putStrLn "Could not find module interface"
If we did find the interface, everything we need for this is in the modInfoExports. If we needed more, we could also get the actual ModIface:
      Just mi -> mapM_ (printExport dflags') (modInfoExports mi)
Actually printing out an export is a bit tedious, as it requires working with Names; a simple example printExport might just use the pretty-printing functions, but these are intended more for human-readable output than machine-readable output:
printExport :: DynFlags -> Name -> Ghc ()
printExport dflags n =
  liftIO $ printSDocLn PageMode dflags stdout (defaultUserStyle dflags)
         $ pprNameUnqualified n
A particularly simple way for interactive use is :browse. Load up a ghci that has access to the appropriate package, then
> :browse Some.Module
class Some.Module.Foo a where
  Some.Module.foo :: a -> a
  {-# MINIMAL foo #-}
Some.Module.bar :: Int
All the qualification can get a bit much, especially if there are many functions that operate on types defined in the same module. To reduce the clutter, you can bring Some.Module into scope first:
> :m + Some.Module
> :browse Some.Module
class Foo a where
  foo :: a -> a
  {-# MINIMAL foo #-}
bar :: Int
I am writing a tool for which I want a modular architecture. By that I mean that users would be able to write down a list of the modules they want loaded at start-up, and my tool would load the corresponding .o files for them.
Here is the code I managed to write up until now:
module Core where

import Control.Monad (forM_)
import Data.Monoid ((<>))
import Data.Text (Text, pack, unpack)
import System.Directory (getHomeDirectory)
import System.Plugins.DynamicLoader

loadPlugins :: [Text] -> IO ()
loadPlugins plugins = do
    home <- getHomeDirectory
    -- addDLL "/home/tchoutri/.stack/programs/x86_64-linux/ghc-tinfo6-8.4.3/lib/ghc-8.4.3/base-4.11.1.0/libHSbase-4.11.1.0-ghc8.4.3.so"
    let paths = fmap (\x -> (pack home) <> "/.local/lib/polynot/polynot-" <> x <> ".o") plugins
    forM_ paths $ \path -> load path
  where
    load path = do
        m <- loadModuleFromPath (unpack path) (Just $ unpack path)
        resolveFunctions
        loadFunction m "runPlugin"
The plugin I'm trying to load at this moment is very simple:
{-# LANGUAGE OverloadedStrings #-}
module Polynot.Plugin.Twitter where
runPlugin :: IO ()
runPlugin = putStrLn "[Twitter] 'sup"
It is compiled with stack ghc -- --make -dynamic -fPIC -O3 twitter.hs. It is then renamed to polynot-twitter.o and placed in ~/.local/lib/polynot/.
The compilation goes well, and when I run stack exec -- polynot, I get this error:
polynot: user error (Unable to get qualified name from: /home/tchoutri/.local/lib/polynot/polynot-twitter.o)
A quick Google search showed me that the only occurrences of this error message are in the library's source code. :/
Moreover, I use the git version of dynamic-loader.
(I may be mistaken about my choice for a modular architecture, I totally accept that. If you have a better approach I could use, you can totally comment on it :)
I wasn't able to duplicate your error. I get a Prelude.head: empty list exception instead.
However, my guess is that it has to do with the functions in dynamic-loader expecting to load modules from a hierarchical directory structure that matches the module hierarchy.
In a nutshell, if I store the plugin in:
~/.local/lib/polynot/Polynot/Plugin/Twitter.o
and use loadModule like so:
loadModule "Polynot.Plugin.Twitter"
(Just "/home/buhr/.local/lib/polynot") (Just "o")
then it works okay for me.
The Main.hs I used was the following:
{-# LANGUAGE OverloadedStrings #-}
import Control.Monad (forM_)
import Data.Monoid ((<>))
import Data.Text (pack, unpack, Text)
import System.Directory (getHomeDirectory)
import System.Plugins.DynamicLoader
loadPlugins :: [Text] -> IO ()
loadPlugins plugins = do
    home <- getHomeDirectory
    let basedir = (pack home) <> "/.local/lib/polynot"
    forM_ plugins (load basedir)
  where
    load dir plugin = do
        m <- loadModule (unpack plugin) (Just $ unpack dir) (Just "o")
        resolveFunctions
        entry <- loadFunction m "runPlugin"
        entry

main = do
    putStrLn "starting!"
    loadPlugins ["Polynot.Plugin.Twitter"]
    putStrLn "done!"
I'm trying to figure out how to use the Shelly (Shelly.Pipe) library.
So far I've got:
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ExtendedDefaultRules #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}
import Control.Applicative
import Data.List(sort)
import Shelly.Pipe
import Data.Text.Lazy as LT
default (LT.Text)
findExt ext = findWhen (pure . hasExt ext)
main = shelly $ verbosely $ do
    cd bookPath
    findExt "epub" "."
I can find all the epub files, but then I have no idea how to operate on each of the epub files.
For example, I want to run the ebook-convert command on those file names wrapped in the Sh monad.
By the way, examples are really scarce on the internet...
And it is very confusing that there are two similar modules, Shelly and Shelly.Pipe. The functions in these two share the same names but have different types:
In Shelly.Pipe:
find :: FilePath -> Sh FilePath
find = sh1s S.find
In Shelly:
find :: FilePath -> ShIO [FilePath]
Really frustrating!
PS: With help from John Wiegley,
I finally got the code working.
I'm posting the code below for people who might use it.
Pay attention to the use of unpack.
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ExtendedDefaultRules #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}
import Control.Applicative
import Data.List(sort)
import Control.Monad
import Shelly
import System.Directory
import Data.Text
import System.FilePath
default (Text)
bookPath = "/dir/to/books"

main = shelly $ verbosely $ do
    -- fnames cannot be processed by ordinary pure String functions;
    -- they need to be converted ("escaped") first
    fnames <- Shelly.find bookPath
    -- processBookFileName :: String -> String
    forM_ fnames $ \n -> liftIO $ putStrLn $ processBookFileName $ unpack $ toTextIgnore n
From what I can gather, you don't want to use the Shelly.Pipe module, just the Shelly module. The ShIO monad implements MonadIO, which allows you to execute arbitrary IO actions while inside ShIO. This would let you do something like
{-# LANGUAGE OverloadedStrings #-}

import Control.Monad (forM_)
import Shelly

convertEpub :: FilePath -> IO ()
convertEpub fname = undefined

main = shelly $ do
    cd "projects/haskell/testing"
    liftIO $ putStrLn "Hello, world! I'm in Shelly"
    fnames <- findWhen (pure . hasExt "hs") "."
    liftIO $ forM_ fnames $ \fname -> do
        putStrLn $ "Processing file " ++ show fname
        convertEpub fname
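To actually run ebook-convert on each file, as asked in the question, one possible sketch of convertEpub is to shell out to Calibre's ebook-convert. This assumes ebook-convert is on the PATH, a plain String FilePath (as in newer Shelly versions), and that writing a .mobi next to the source file is acceptable:

import System.FilePath (replaceExtension)
import System.Process (callProcess)

-- Sketch only: convert one epub by invoking Calibre's ebook-convert.
-- The .mobi output format and naming scheme are assumptions; adjust as needed.
convertEpub :: FilePath -> IO ()
convertEpub fname = callProcess "ebook-convert" [fname, replaceExtension fname "mobi"]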
I'm looking to have my Haskell program read settings from an external file, to avoid recompiling for minor changes. Being familiar with YAML, I thought it would be a good choice. Now I have to put the two pieces together. Google hasn't been very helpful so far.
A little example code dealing with reading and deconstructing YAML from a file would be very much appreciated.
If I'm interested in what packages are available, I go to Hackage, look at the complete package list, and then just search in the page for the keyword. Doing that brings up these choices (along with a few other less compelling ones):
yaml: http://hackage.haskell.org/package/yaml
HsSyck: http://hackage.haskell.org/package/HsSyck
and a wrapper around HsSyck called yaml-light: http://hackage.haskell.org/package/yaml-light
Both yaml and HsSyck look updated relatively recently, and appear to be used by other packages in widespread use. You can see this by checking the reverse deps:
http://packdeps.haskellers.com/reverse/HsSyck
http://packdeps.haskellers.com/reverse/yaml
Of the two, yaml has more reverse deps, but that is because it is part of the yesod ecosystem. One library that depends on HsSyck is yst, which I happen to know is actively maintained, so that indicates to me that HsSyck is fine too.
The next step in making my choice would be to browse through the documentation of both libraries and see which had the more appealing API for my purposes.
Of the two, it looks like HsSyck exposes more structure but not much else, while yaml goes via the JSON encodings provided by aeson. This indicates to me that the former is probably more powerful, while the latter is more convenient.
A simple example:
First you need a test.yml file:
db: /db.sql
limit: 100
Reading YAML in Haskell
{-# LANGUAGE DeriveGeneric #-}
import GHC.Generics
import Data.Yaml
data Config = Config { db :: String
                     , limit :: Int
                     } deriving (Show, Generic)

instance FromJSON Config

main :: IO ()
main = do
    file <- decodeFile "test.yml" :: IO (Maybe Config)
    putStrLn (maybe "Error" show file)
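decodeFile only tells you that something went wrong via Nothing; if you want to see the actual parse error, the same package's decodeFileEither (used in one of the answers below) works the same way. A minimal sketch of the above main using it:

main :: IO ()
main = do
    result <- decodeFileEither "test.yml" :: IO (Either ParseException Config)
    case result of
        Left err  -> putStrLn (prettyPrintParseException err)
        Right cfg -> print cfg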
With yamlparse-applicative you can describe your YAML parser in a static analysis-friendly way, so you can get a description of the YAML format from a parser for free. I'm going to use Matthias Braun's example format for this one:
{-# LANGUAGE ApplicativeDo, RecordWildCards, OverloadedStrings #-}
import Data.Yaml
import Data.Aeson.Types (parse)
import YamlParse.Applicative
import Data.Map (Map)
import qualified Data.Text.IO as T
data MyType = MyType
  { stringsToStrings :: Map String String
  , mapOfLists :: Map String [String]
  } deriving Show

parseMyType :: YamlParser MyType
parseMyType = unnamedObjectParser $ do
  stringsToStrings <- requiredField' "strings_to_strings"
  mapOfLists <- requiredField' "map_of_lists"
  pure MyType{..}

main :: IO ()
main = do
  T.putStrLn $ prettyParserDoc parseMyType
  yaml <- decodeFileThrow "config/example.yaml"
  print $ parse (implementParser parseMyType) yaml
Note that main is able to print out a schema before even seeing an instance:
strings_to_strings: # required
  <key>: <string>
map_of_lists: # required
  <key>: - <string>

Success
  (MyType
     { stringsToStrings = fromList
         [ ("key_one","val_one")
         , ("key_two","val_two")
         ]
     , mapOfLists = fromList
         [ ("key_one",["val_one","val_two","val_three"])
         , ("key_two",["val_four","val_five"])
         ]
     })
Here's how to parse specific objects from your YAML file using the yaml library.
Let's parse parts of this file, config/example.yaml:
# A map from strings to strings
strings_to_strings:
  key_one: val_one
  key_two: val_two

# A map from strings to list of strings
map_of_lists:
  key_one:
    - val_one
    - val_two
    - val_three
  key_two:
    - val_four
    - val_five

# We won't parse this
not_for: us
This module parses strings_to_strings and map_of_lists individually and puts them into a custom record, MyType:
{-# LANGUAGE OverloadedStrings, LambdaCase #-}

module YamlTests where

import Data.Yaml ( decodeFileEither
                 , (.:)
                 , parseEither
                 , prettyPrintParseException
                 )
import Data.Map ( Map )
import Control.Applicative ( (<$>) )
import System.FilePath ( FilePath
                       , (</>)
                       )

data MyType = MyType { stringsToStrings :: Map String String
                     , mapOfLists :: Map String [String]
                     } deriving Show

type ErrorMsg = String

readMyType :: FilePath -> IO (Either ErrorMsg MyType)
readMyType file =
  (\case
      (Right yamlObj) -> do
        stringsToStrings <- parseEither (.: "strings_to_strings") yamlObj
        mapOfLists <- parseEither (.: "map_of_lists") yamlObj
        return $ MyType stringsToStrings mapOfLists
      (Left exception) -> Left $ prettyPrintParseException exception
  )
    <$> decodeFileEither file

yamlTest = do
  parsedValue <- readMyType $ "config" </> "example.yaml"
  print parsedValue
Run yamlTest to see the parsing result.
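With the example.yaml above, the printed value should look roughly like this (the same data as in the yamlparse-applicative answer, wrapped in Right):

Right (MyType {stringsToStrings = fromList [("key_one","val_one"),("key_two","val_two")], mapOfLists = fromList [("key_one",["val_one","val_two","val_three"]),("key_two",["val_four","val_five"])]})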