More specifically, given an arbitrary package name, I need to retrieve from inside a running Haskell program the same library-dirs field that can be obtained with the ghc-pkg describe command.
Here's what I could come up with by peeking into the ghc-pkg source code.
The getPkgInfos function returns the package definitions for all installed packages (hopefully including user-installed packages). With this in your hands, you can retrieve the library directories and other package information. See the documentation for details.
The GHC_PKGCONF variable needs to point to the global package config file on systems where it isn't located in the usual place. ghc-pkg solves this problem by receiving a command-line flag via a wrapper script (on Ubuntu, for instance).
import qualified Config
import qualified System.Info
import Data.List
import Distribution.InstalledPackageInfo
import GHC.Paths
import System.Directory
import System.Environment
import System.FilePath
import System.IO.Error
getPkgInfos :: IO [InstalledPackageInfo]
getPkgInfos = do
global_conf <-
catch (getEnv "GHC_PKGCONF")
(\err -> if isDoesNotExistError err
then do let dir = takeDirectory $ takeDirectory ghc_pkg
path1 = dir </> "package.conf"
path2 = dir </> ".." </> ".." </> ".."
</> "inplace-datadir"
</> "package.conf"
exists1 <- doesFileExist path1
exists2 <- doesFileExist path2
if exists1 then return path1
else if exists2 then return path2
else ioError $ userError "Can't find package.conf"
else ioError err)
let global_conf_dir = global_conf ++ ".d"
global_conf_dir_exists <- doesDirectoryExist global_conf_dir
global_confs <-
if global_conf_dir_exists
then do files <- getDirectoryContents global_conf_dir
return [ global_conf_dir ++ '/' : file
| file <- files
, isSuffixOf ".conf" file]
else return []
user_conf <-
try (getAppUserDataDirectory "ghc") >>= either
(\_ -> return [])
(\appdir -> do
let subdir = currentArch ++ '-':currentOS ++ '-':ghcVersion
user_conf = appdir </> subdir </> "package.conf"
user_exists <- doesFileExist user_conf
return (if user_exists then [user_conf] else []))
let pkg_dbs = user_conf ++ global_confs ++ [global_conf]
return.concat =<< mapM ((>>= return.read).readFile) pkg_dbs
currentArch = System.Info.arch
currentOS = System.Info.os
ghcVersion = Config.cProjectVersion
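As a hypothetical usage example (the helper name printLibraryDirs is mine, not part of ghc-pkg), this looks up a package by name and prints its library-dirs, which is what the original question asks for; it additionally needs an import of Distribution.Package for PackageName and pkgName:

-- hypothetical helper built on getPkgInfos from above
printLibraryDirs :: String -> IO ()
printLibraryDirs name = do
    pkgs <- getPkgInfos
    -- keep only the packages whose name matches the one we were given
    let matches = filter ((PackageName name ==) . pkgName . package) pkgs
    -- each match carries a list of library directories
    mapM_ (mapM_ putStrLn . libraryDirs) matches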
I wrote this code myself, but it was largely inspired by ghc-pkg (with some pieces copied verbatim). The original code is licensed under a BSD-style license; I think this can be distributed under the cc-wiki license that all Stack Overflow content is under, but I'm not really sure. Anyway, as with anything else, I did some initial testing and it seems to work, but use it at your own risk.
The format of the installed packages database is Distribution.InstalledPackageInfo.
import Distribution.InstalledPackageInfo
import Distribution.Package
import Distribution.Text
import GHC.Paths
import System.Environment
import System.FilePath
main = do
name:_ <- getArgs
packages <- fmap read $ readFile $ joinPath [libdir, "package.conf"]
let matches = filter ((PackageName name ==) . pkgName . package) packages
mapM_ (print . libraryDirs) (matches :: [InstalledPackageInfo_ String])
This doesn't obey the user's package configuration, but should be a start.
Ask Duncan Coutts on the haskell-cafe or cabal mailing lists. (I'm serious. That is a better forum for Cabal questions than Stack Overflow.)
Sometimes you just have to point people at a different forum.
If you're using cabal to configure and build your program/library, you can use the autogenerated Paths_* module.
For example, if you have a foo.cabal file, cabal will generate a Paths_foo module (see its source under dist/build/autogen) which you can import. This module exports a function getLibDir :: IO FilePath which has the value you're looking for.
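For example (a minimal sketch, assuming the package is actually named foo, so the generated module is Paths_foo):

-- Paths_foo is generated by cabal at build time for a package named foo
import Paths_foo (getLibDir)

main :: IO ()
main = getLibDir >>= putStrLn  -- prints the configured library directory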
I am writing a tool for which I want a modular architecture. By that I mean that users would be able to write down a list of the modules they want loaded at start-up, and my tool would load the corresponding .o files for them.
Here is the code I managed to write up until now:
module Core where
import Control.Monad (forM_)
import Data.Monoid ((<>))
import Data.Text (Text, pack, unpack)
import System.Directory (getHomeDirectory)
import System.Plugins.DynamicLoader
loadPlugins :: [Text] -> IO ()
loadPlugins plugins = do
home <- getHomeDirectory
-- addDLL "/home/tchoutri/.stack/programs/x86_64-linux/ghc-tinfo6-8.4.3/lib/ghc-8.4.3/base-4.11.1.0/libHSbase-4.11.1.0-ghc8.4.3.so"
let paths = fmap (\x -> (pack home) <> "/.local/lib/polynot/polynot-" <> x <> ".o") plugins
forM_ paths $ \path -> load path
where
load path = do
m <- loadModuleFromPath (unpack path) (Just $ unpack path)
resolveFunctions
loadFunction m "runPlugin"
The plugin I'm trying to load at this moment is very simple:
{-# LANGUAGE OverloadedStrings #-}
module Polynot.Plugin.Twitter where
runPlugin :: IO ()
runPlugin = putStrLn "[Twitter] 'sup"
It is compiled with stack ghc -- --make -dynamic -fPIC -O3 twitter.hs, and then renamed to polynot-twitter.o in ~/.local/lib/polynot/.
The compilation goes well, and when I run stack exec -- polynot, I get this error:
polynot: user error (Unable to get qualified name from: /home/tchoutri/.local/lib/polynot/polynot-twitter.o)
A quick google search showed me that the only instances of this error appear in the source code. :/
Moreover, I use the git version of dynamic-loader.
(I may be mistaken about my choice of a modular architecture, and I totally accept that. If you have a better approach I could use, feel free to comment on it. :)
I wasn't able to duplicate your error. I get a Prelude.head: empty list exception instead.
However, my guess is that it has to do with the functions in dynamic-loader expecting to load modules from a hierarchical directory structure that matches the module hierarchy.
In a nutshell, if I store the plugin in:
~/.local/lib/polynot/Polynot/Plugin/Twitter.o
and use loadModule like so:
loadModule "Polynot.Plugin.Twitter"
(Just "/home/buhr/.local/lib/polynot") (Just "o")
then it works okay for me.
The Main.hs I used was the following:
{-# LANGUAGE OverloadedStrings #-}
import Control.Monad (forM_)
import Data.Monoid ((<>))
import Data.Text (pack, unpack, Text)
import System.Directory (getHomeDirectory)
import System.Plugins.DynamicLoader
loadPlugins :: [Text] -> IO ()
loadPlugins plugins = do
home <- getHomeDirectory
let basedir = (pack home) <> "/.local/lib/polynot"
forM_ plugins (load basedir)
where
load dir plugin = do
m <- loadModule (unpack plugin) (Just $ unpack dir) (Just "o")
resolveFunctions
entry <- loadFunction m "runPlugin"
entry
main = do
putStrLn "starting!"
loadPlugins ["Polynot.Plugin.Twitter"]
putStrLn "done!"
I need to have something like
-- Main.hs
module Main where
main :: IO ()
main = do
<import Plugin>
print Plugin.computation
With a Plugin like
-- Plugin.hs
module Plugin where
computation :: Int
computation = 4
However, I need the plugin to be compiled alongside the main application. They need to be deployed together. Only the import (not the compilation) of the module should happen dynamically.
I found Dynamically loading compiled Haskell module - GHC 7.6 along the way, and it works just fine with GHC 8.0.2, except that it requires the source file of the plugin to be in the current working directory when executing the application.
Edit (07.12.2017)
Is it possible to load a module from a String instead of a file using the GHC API? http://hackage.haskell.org/package/ghc-8.2.1/docs/GHC.html#t:Target suggests that it's possible, but the documentation has many holes and I can't find a way to actually do this. If this can be accomplished, I can use file-embed to include the plugin source file into the compiled binary.
Example:
module Main where
-- Dynamic loading of modules
import GHC
import GHC.Paths ( libdir )
import DynFlags
import Unsafe.Coerce
import Data.Time.Clock (getCurrentTime)
import StringBuffer
pluginModuleNameStr :: String
pluginModuleNameStr = "MyPlugin"
pluginSourceStr :: String
pluginSourceStr = unlines
[ "module MyPlugin where"
, "computation :: Int"
, "computation = 4"
]
pluginModuleName :: ModuleName
pluginModuleName = mkModuleName pluginModuleNameStr
pluginSource :: StringBuffer
pluginSource = stringToStringBuffer pluginSourceStr
main :: IO ()
main = do
currentTime <- getCurrentTime
defaultErrorHandler defaultFatalMessager defaultFlushOut $ do
result <- runGhc (Just libdir) $ do
dflags <- getSessionDynFlags
setSessionDynFlags dflags
let target = Target { targetId = TargetModule $ pluginModuleName
, targetAllowObjCode = True
, targetContents = Just ( pluginSource
, currentTime
)
}
setTargets [target]
r <- load LoadAllTargets
case r of
Failed -> error "Compilation failed"
Succeeded -> do
setContext [IIDecl $ simpleImportDecl pluginModuleName]
result <- compileExpr ("MyPlugin.computation")
let result' = unsafeCoerce result :: Int
return result'
print result
This, however, results in
<command-line>: panic! (the 'impossible' happened)
(GHC version 8.0.2 for x86_64-apple-darwin):
module ‘MyPlugin’ is a package module
Edit (08.12.2017)
I can compile the "plugin" directly into the final binary by writing the source to a temp file and then loading it like in the linked post (Dynamically loading compiled Haskell module - GHC 7.6). However, this does not play well if the plugin imports packages from Hackage:
module Main where
import Control.Monad.IO.Class (liftIO)
import DynFlags
import GHC
import GHC.Paths (libdir)
import System.Directory (getTemporaryDirectory, removePathForcibly)
import Unsafe.Coerce (unsafeCoerce)
pluginModuleNameStr :: String
pluginModuleNameStr = "MyPlugin"
pluginSourceStr :: String
pluginSourceStr = unlines
[ "module MyPlugin where"
, "import Data.Aeson"
, "computation :: Int"
, "computation = 4"
]
writeTempFile :: IO FilePath
writeTempFile = do
dir <- getTemporaryDirectory
let file = dir ++ "/" ++ pluginModuleNameStr ++ ".hs"
writeFile file pluginSourceStr
return file
main :: IO ()
main = do
moduleFile <- writeTempFile
defaultErrorHandler defaultFatalMessager defaultFlushOut $ do
result <- runGhc (Just libdir) $ do
dflags <- getSessionDynFlags
setSessionDynFlags dflags
target <- guessTarget moduleFile Nothing
setTargets [target]
r <- load LoadAllTargets
liftIO $ removePathForcibly moduleFile
case r of
Failed -> error "Compilation failed"
Succeeded -> do
setContext [IIDecl $ simpleImportDecl $ mkModuleName pluginModuleNameStr]
result <- compileExpr "MyPlugin.computation"
let result' = unsafeCoerce result :: Int
return result'
print result
Is there a way to load packages when, for instance, MyPlugin contains the statement import Data.Aeson? If I add it to the plugin string, it fails with
/var/folders/t2/hp9y8x6s6rs7zg21hdzvhbf40000gn/T/MyPlugin.hs:2:1: error:
Failed to load interface for ‘Data.Aeson’
Perhaps you meant Data.Version (from base-4.9.1.0)
Use -v to see a list of the files searched for.
haskell-loader-exe: panic! (the 'impossible' happened)
(GHC version 8.0.2 for x86_64-apple-darwin):
Compilation failed
CallStack (from HasCallStack):
error, called at app/Main.hs:40:19 in main:Main
The reason for my request is database support: we use Persistent to access a database, and the dynamic import is needed to support multiple databases (MySQL, PostgreSQL and SQLite) while still allowing the end user to install only one of the three database servers (in other words: not requiring the user to install all of them if they only use, for instance, PostgreSQL). The database-specific module should only be loaded when the user actually configures the main application to use it.
If I don't import Database.Persist.MySQL, then the application does not require MySQL to be installed. Otherwise, the application fails with, for instance,
dyld: Library not loaded:
/usr/local/opt/mysql/lib/libmysqlclient.20.dylib
on macOS.
By the looks of it, a file with a matching module name must exist; it doesn't seem to matter what the file's content is.
On Linux I can even make it a symlink to /dev/null and things work, but a symlink to itself doesn't.
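A minimal sketch of that workaround, as I understand the observation above (it reuses pluginModuleNameStr and the Target construction from the earlier snippet; the dummy-file trick is an assumption based on this observation, not a documented API):

-- assumption: an empty dummy source file named after the module is enough to
-- stop GHC from treating MyPlugin as a package module; the real source is
-- still supplied via targetContents exactly as in the snippet above
ensureDummyModuleFile :: IO ()
ensureDummyModuleFile = writeFile (pluginModuleNameStr ++ ".hs") ""
-- call ensureDummyModuleFile before setTargets [target] and load LoadAllTargets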
To learn a bit about Turtle, I thought it would be nice to modify an example from the tutorial. I chose to remove the redundant "FilePath" from each line of the output, thinking it would be a simple exercise.
And yet, despite the author's efforts to make his library easy to use, I nearly failed to use it to solve this simple problem.
I tried everything I saw that looked like it would allow me to somehow lift >>= from IO into Shell: MonadIO, FoldM, liftIO, _foldIO, with no success. I grew frustrated, and only by reading the Turtle source code was I able to find something that seems to work ("no obvious defects" comes to mind).
Why is this so hard? How does one logically arrive at a solution using the API of this library?
#!/usr/bin/env stack
-- stack --resolver lts-8.17 --install-ghc runghc --package turtle --package lens
{-# LANGUAGE OverloadedStrings #-}
import Turtle
import Control.Lens
import Control.Foldl as Foldl
import Filesystem.Path.CurrentOS
import Data.Text.IO as T
import Data.Text as T
main = do
homedir <- home
let paths = lstree $ homedir </> "projects"
let t = fmap (Control.Lens.view _Right . toText) paths
customView t
customView s = sh (do
x <- s
liftIO $ T.putStrLn x)
You don't lift >>= from IO into Shell. Shell already has a Monad instance that comes with its own >>= function. Instead you either lift IO actions into Shell with liftIO or run the shell with fold or foldM. Use sh to run the Shell when you don't care about the results.
I believe your example can be simplified to
main = sh $ do
  homedir <- home
  filepath <- lstree $ homedir </> "projects"
  case (toText filepath) of
    Right path -> liftIO $ T.putStrLn path
    Left approx -> return () -- This shouldn't happen
As for the difficulty with getting a string back from a FilePath, I don't think that can be blamed on the Turtle author. I think it can be simplified to
stringPath :: FilePath -> String
stringPath filepath =
  case (toText filepath) of -- try to use the human readable version
    Right path -> T.unpack path
    Left _ -> encodeString filepath -- fall back on the machine readable one
Combined this would simplify the example to
main = sh $ do
homedir <- home
filepath <- lstree $ homedir </> "projects"
liftIO $ putStrLn (stringPath filepath)
or
main = view $ do
homedir <- home
filepath <- lstree $ homedir </> "projects"
return $ stringPath filepath
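If you want the results back in IO instead of printing inside the Shell, the fold mentioned earlier works too. A minimal sketch reusing stringPath from above (assuming import Turtle and import qualified Control.Foldl as Foldl):

-- collect every path produced by the Shell into a list, then print from IO
main = do
    homedir <- home
    paths <- fold (lstree (homedir </> "projects")) Foldl.list
    mapM_ (putStrLn . stringPath) paths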
I have many Haskell packages and I have enabled a flag that makes them generate Haddock documentation. Now these documents are under directories like /usr/share/doc/{package-name}-{version}/html/.
Is there a tool to organize them? I want something like the "all packages by name" page on Hackage,
so that local links to all these installed packages can be found on one page.
It would be even better if hoogle could be told to use these documents. For now, my hoogle search results all point to the corresponding pages on Hackage.
Since my question has not yet been answered, I wrote a quick and dirty program to answer my first question:
import System.Directory
import System.IO
import System.Environment
import System.Exit
import System.Path
import System.FilePath.Posix
import Control.Applicative
import Control.Monad
import Data.Maybe
import Data.List
import Text.Printf
-- | make markdown table row
makeTableRow :: String -> FilePath -> String
makeTableRow dirName htmlPath = intercalate "|" [ dirName
, link "frames"
, link "index"
, link "doc-index"]
where
link s = printf "[%s](%s)" s $ htmlPath </> s ++ ".html"
scanAndMakeTable :: String -> IO [String]
scanAndMakeTable relDocPath = do
    (Just docPath) <- absNormPath' <$> getCurrentDirectory <*> pure relDocPath
    dirs <- getDirectoryContents docPath
    items <- liftM catMaybes
           . mapM (asHaskellPackage docPath)
           . sort $ dirs
    return $ headers1:headers2:map (uncurry makeTableRow) items
  where
    headers1 = "| " ++ intercalate " | " (words "Package Frames Contents Index") ++ " |"
    headers2 = intercalate " --- " $ replicate 5 "|"
    absNormPath' a p = addMissingRoot <$> absNormPath a p
    -- sometimes the leading '/' is missing in absNormPath results
    addMissingRoot s@('/':_) = s
    addMissingRoot s = '/' : s
asHaskellPackage :: String -> String -> IO (Maybe (String,FilePath))
asHaskellPackage docPath dirName = do
-- a valid haskell package has a "haddock dir"
-- in which we can at least find a file with ".haddock" as extension name
b1 <- doesDirectoryExist haddockFileDir
if b1
then do
b2 <- any ((== ".haddock") . takeExtension)
<$> getDirectoryContents haddockFileDir
return $ if b2 then Just (dirName,haddockFileDir) else Nothing
else return Nothing
where
-- guess haddock dir
haddockFileDir = docPath </> dirName </> "html"
main :: IO ()
main = do
args <- getArgs
case args of
[docPath'] -> scanAndMakeTable docPath' >>= putStrLn . unlines
_ -> help
where
help = hPutStrLn stderr "Usage: <program> <path-to-packages>"
>> exitFailure
By observing the structure of these documentation directories, I recognize a haddock directory by testing:
whether there is a subdirectory called html, and
whether that html subdirectory contains at least one file with .haddock as its extension.
Running the program with runghc <source-file> /usr/share/doc/ >document-nav.md should generate a markdown file containing links to the documents. Afterwards just feed it to pandoc or some other markdown-to-HTML converter and open the resulting HTML file in a browser to navigate through the package documents.
I find myself doing more and more scripting in Haskell. But there are some cases where I'm really not sure how to do it "right",
e.g. copying a directory recursively (a la Unix cp -r).
Since I mostly use Linux and Mac OS, I usually cheat:
import System.Cmd
import System.Exit
copyDir :: FilePath -> FilePath -> IO ExitCode
copyDir src dest = system $ "cp -r " ++ src ++ " " ++ dest
But what is the recommended way to copy a directory in a platform independent fashion?
I didn't find anything suitable on hackage.
This is the rather naive implementation I use so far:
import System.Directory
import System.FilePath((</>))
import Control.Applicative((<$>))
import Control.Exception(throw)
import Control.Monad(when,forM_)
copyDir :: FilePath -> FilePath -> IO ()
copyDir src dst = do
whenM (not <$> doesDirectoryExist src) $
throw (userError "source does not exist")
whenM (doesFileOrDirectoryExist dst) $
throw (userError "destination already exists")
createDirectory dst
content <- getDirectoryContents src
let xs = filter (`notElem` [".", ".."]) content
forM_ xs $ \name -> do
let srcPath = src </> name
let dstPath = dst </> name
isDirectory <- doesDirectoryExist srcPath
if isDirectory
then copyDir srcPath dstPath
else copyFile srcPath dstPath
where
doesFileOrDirectoryExist x = orM [doesDirectoryExist x, doesFileExist x]
orM xs = or <$> sequence xs
whenM s r = s >>= flip when r
Any suggestions on what the right way to do it really is?
I updated this with the suggestions of hammar and FUZxxl.
...but still it feels kind of clumsy to me for such a common task!
It's possible to use the Shelly library in order to do this, see cp_r:
cp_r "sourcedir" "targetdir"
Shelly first tries to use native cp -r if available. If not, it falls back to a native Haskell IO implementation.
For further details on the type semantics of cp_r, see this post I wrote describing how to use cp_r with String and/or Text.
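A complete program might look roughly like this (a sketch; shelly runs the Sh action from IO, and the string literals rely on OverloadedStrings):

{-# LANGUAGE OverloadedStrings #-}
import Shelly (shelly, cp_r)

-- recursively copy "sourcedir" into "targetdir"
main :: IO ()
main = shelly $ cp_r "sourcedir" "targetdir"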
Shelly is not platform independent, since it relies on the Unix package, which is not supported under Windows.
I couldn't find anything that does this on Hackage.
Your code looks pretty good to me. Some comments:
dstExists <- doesDirectoryExist dst
This does not take into account that a file with the destination name might exist.
if or [not srcExists, dstExists] then print "cannot copy"
You might want to throw an exception or return a status instead of printing directly from this function.
paths <- forM xs $ \name -> do
[...]
return ()
Since you're not using paths for anything, you can change this to
forM_ xs $ \name -> do
[...]
The filesystem-trees package provides the means for a very simple implementation:
import System.File.Tree (getDirectory, copyTo_)
copyDirectory :: FilePath -> FilePath -> IO ()
copyDirectory source target = getDirectory source >>= copyTo_ target
The MissingH package provides recursive directory traversals, which you might be able to use to simplify your code.
I assume that the function copyDirRecur in Path.IO, with variants to include/exclude symlinks, may be a newer and maintained solution. It requires converting the FilePath to a Path x Dir, which is achieved with parseRelDir or parseAbsDir respectively, but I think having a more precise data type than FilePath is worthwhile to avoid hard-to-track errors at run-time.
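A sketch of what that might look like (assuming the path and path-io packages; parseAbsDir throws at run-time if the argument isn't a valid absolute directory path):

import Path (parseAbsDir)
import Path.IO (copyDirRecur)

-- recursively copy src into dst using the type-safe Path API
copyDirAbs :: FilePath -> FilePath -> IO ()
copyDirAbs src dst = do
    srcDir <- parseAbsDir src  -- throws if src isn't an absolute directory path
    dstDir <- parseAbsDir dst
    copyDirRecur srcDir dstDir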
There are also some functions for copying files and directories in the Cabal library, specifically in Distribution.Simple.Utils. copyDirectoryRecursive is one, and there are other related functions near it in that module.
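For instance, a sketch (copyDirectoryRecursive takes a Verbosity as its first argument):

import Distribution.Simple.Utils (copyDirectoryRecursive)
import Distribution.Verbosity (normal)

-- recursively copy "sourcedir" into "targetdir", logging at normal verbosity
main :: IO ()
main = copyDirectoryRecursive normal "sourcedir" "targetdir"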