The Threepenny-gui changelog (https://hackage.haskell.org/package/threepenny-gui-0.6.0.1/changelog) reads: "The functions loadFile and loadDirectory have been removed, as I felt that the jsStatic option is sufficient for most use cases."
My question is: how can we reload an image that is updated during execution without loadFile?
With Threepenny-gui 0.5 I used the following code:
redraw :: UI.Element -> IORef CompTree -> Maybe Vertex -> UI ()
redraw img treeRef mcv
  = do tree <- UI.liftIO $ readIORef treeRef
       UI.liftIO $ writeFile ".Hoed/debugTree.dot" (shw $ summarize tree mcv)
       UI.liftIO $ system $ "dot -Tpng -Gsize=9,5 -Gdpi=100 .Hoed/debugTree.dot "
                            ++ "> .Hoed/wwwroot/debugTree.png"
       url <- UI.loadFile "image/png" ".Hoed/wwwroot/debugTree.png"
       UI.element img # UI.set UI.src url
       return ()
When, with Threepenny-gui 0.6, I set jsStatic to Just "./.Hoed/wwwroot", the following code (obviously) results in my GUI only showing the initial image that already was there when my application started:
redraw :: UI.Element -> IORef CompTree -> Maybe Vertex -> UI ()
redraw img treeRef mcv
  = do tree <- UI.liftIO $ readIORef treeRef
       UI.liftIO $ writeFile ".Hoed/debugTree.dot" (shw $ summarize tree mcv)
       UI.liftIO $ system $ "dot -Tpng -Gsize=9,5 -Gdpi=100 .Hoed/debugTree.dot "
                            ++ "> .Hoed/wwwroot/debugTree.png"
       UI.element img # UI.set UI.src "static/debugTree.png"
       return ()
My full code for Threepenny-gui 0.5 is here: https://github.com/MaartenFaddegon/Hoed/blob/master/Debug/Hoed/DemoGUI.hs
(Author here.) Apparently, I didn't consider your use case when removing these functions. :-) I can add them back if you like; could you make an issue on GitHub?
There are various methods on the JavaScript side to reload a file at a certain URL. See for instance the question "Refresh image with a new one at the same url".
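One common trick along those lines is cache busting: append a changing query parameter to the image URL so the browser fetches it again instead of reusing its cached copy. Below is a minimal sketch of redraw adapted this way; the parameter name t and the use of getPOSIXTime are only illustrative, and it assumes the static file server simply ignores the query string (the usual behavior).
import Data.Time.Clock.POSIX (getPOSIXTime)

redraw :: UI.Element -> IORef CompTree -> Maybe Vertex -> UI ()
redraw img treeRef mcv
  = do tree <- UI.liftIO $ readIORef treeRef
       UI.liftIO $ writeFile ".Hoed/debugTree.dot" (shw $ summarize tree mcv)
       UI.liftIO $ system $ "dot -Tpng -Gsize=9,5 -Gdpi=100 .Hoed/debugTree.dot "
                            ++ "> .Hoed/wwwroot/debugTree.png"
       -- A fresh timestamp makes the URL unique on every redraw, so the
       -- browser requests the regenerated PNG instead of serving its cache.
       t <- UI.liftIO getPOSIXTime
       UI.element img # UI.set UI.src
         ("static/debugTree.png?t=" ++ show (round t :: Integer))
       return ()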
I want to be able to prompt the user for input (let's say a FilePath), but also to offer a mutable/interactive string as a default, so instead of having the user type the full path, I can prompt with:
C:\Users\John\project\test
and have them backspace 4 times and type final to yield C:\Users\John\project\final, rather than typing the entire path.
However, printing a default string with putStr or System.IO.hPutStr stdout prints the default to the terminal but does not let me edit any of it. For example:
import System.IO

main = do
  hSetBuffering stdout NoBuffering
  putStr "C:\\Users\\John\\project\\test"
  l <- getLine
  doSomethingWith l
I suspect Data.Text.IO's interact may be able to do what I want but I could not get it to work.
Any suggestions would be greatly appreciated.
getLine doesn’t offer any facility for line editing. For this you can use a library like haskeline instead, for example:
import System.Console.Haskeline

main :: IO ()
main = runInputT defaultSettings $ do
  mInput <- getInputLineWithInitial "Enter path: "
              ("C:\\Users\\John\\project\\test", "")
  case mInput of
    Nothing    -> outputStrLn "No entry."
    Just input -> outputStrLn $ "Entry: " ++ show input
An alternative is to invoke the program with a wrapper that provides line editing, such as rlwrap. For building a more complex fullscreen text UI, there is also brick, which provides a simple text editing component in Brick.Widgets.Edit.
I'm switching (or trying to) from the brilliant tup to Haskell's shake as my build system.
Only I can't figure out how to get shake to rebuild files on changes.
I could of course use inotify or a wrapper like filewatcher or even watchman.
Since I'm using shake though, I was wondering how to integrate with twitch, which shares the do syntax but otherwise doesn't provide much in the way of documentation.
The ultimate goal is to use pandoc for multi format documents.
The only reason tup was inadequate is that it doesn't support targets.
First of all, you should write your own shake build rules. Then, when a source file changes, you run your build rules to produce your targets.
Like this:
main = defaultMain $ do
  "src/*.md" |> const build

build = shakeArgs shakeOptions{shakeFiles="out"} $ do
  want ["out/foo.html", "out/foo.pdf"]

  "out/*.html" %> \out -> do
    let src = "src" </> dropDirectory1 out -<.> "md"
    cmd_ "pandoc -o" [out] src

  "out/*.pdf" %> \out -> do
    let src = "src" </> dropDirectory1 out -<.> "md"
    cmd_ "pandoc -o" [out] src
When a markdown file in the src directory changes, out/foo.html and out/foo.pdf will be updated.
If you want to cut down the work shake does, you can do something like this:
main = defaultMain $ do
  "src/*.md" |> build . dependentTargets

build targets = shakeArgs shakeOptions{shakeFiles="out"} $ do
  want targets
  ...

dependentTargets src
  | "*.md" ?== src = ["out/foo.html", "out/foo.pdf"]
  | otherwise      = []
The twitch package recommends enabling the OverloadedStrings extension so that code like this compiles:
"src/*.md" |> ...
But this can lead to ambiguous code in other parts of the program. To fix that, you can explicitly convert the String to a Dep like this:
import Data.String
fromString "src/*.md" |> ...
You can improve this code by redefining the (|>) operator:
import Data.String
import Twitch hiding ((|>))
pattern |> callback = addModify callback $ fromString pattern
"src/*.md" |> ...
I use shake for building a web site and have wrapped it in twitch to rerun the shake build when files change. The main call for the watching functions (it uses forkIO to watch two directories, and each can run shake) is bracketed; it also starts the web server.
mainWatch :: SiteLayout -> Port -> Path Abs Dir -> IO ()
mainWatch layout bakedPort bakedPath = bracketIO
  (do -- first
      shake layout
      watchDough     <- forkIO (mainWatchDough layout)  -- calls shake
      watchTemplates <- forkIO (mainWatchThemes layout) -- calls shake
      scotty bakedPort (site bakedPath)
      return (watchDough, watchTemplates))
  (\(watchDough, watchTemplates) -> do -- last
      putIOwords ["main2 end"]
      killThread watchDough
      killThread watchTemplates
      return ())
  (\watch -> do -- during
      return ())
Hope this can be adapted to your case!
I would like to remove output files that no longer have a source, but without doing a full clean.
Is there support for partially cleaning an incremental build? In this case, I guess I could compare against the set of source files consumed in previous builds and define how to clean the outputs whose sources are gone.
main = shakeArgs shakeOptions { shakeVerbosity = Diagnostic } $ do
  want [".build"]

  phony ".build" $ do
    files <- getDirectoryFiles "." ["//*.txt"]
    let goals = map (-<.> "") files
    need goals

  "*" %> \out -> do
    Stdout o <- cmd $ "sort " ++ (out ++ ".txt")
    writeFile' out o
Using shakeArgsPrune you can define a function that gets passed the live files afterwards. You can then write something like:
import Development.Shake
import Development.Shake.FilePath
import Development.Shake.Util
import System.Directory.Extra
import Data.List
import System.IO

pruner :: [FilePath] -> IO ()
pruner live = do
  present <- listFilesRecursive "output"
  mapM_ removeFile $ map toStandard present \\ map toStandard live

main :: IO ()
main = shakeArgsPrune shakeOptions pruner $ do
  ... rules go here ...
This deletes all files in output that are not generated and up-to-date according to the build system as it stands. For a complete example see
http://neilmitchell.blogspot.co.uk/2015/04/cleaning-stale-files-with-shake.html.
The shakeArgsPrune function is only available in shake-0.15.1 and above, but is based on the shakeLiveFiles feature which has been available for longer and can be used directly if you so desire.
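For reference, a minimal sketch of using shakeLiveFiles directly might look like the following (live.txt is an arbitrary file name, and the rules and the output directory are placeholders): run the build with shakeLiveFiles set, then read the list back and delete anything under output that is not on it.
import Development.Shake
import Development.Shake.FilePath (toStandard)
import System.Directory (removeFile)
import System.Directory.Extra (listFilesRecursive)
import Data.List ((\\))

main :: IO ()
main = do
  -- After the build finishes, shake writes the list of live files to live.txt.
  shakeArgs shakeOptions{shakeLiveFiles = ["live.txt"]} $ do
    -- ... rules go here ...
    return ()
  -- Remove everything under output that the build no longer considers live.
  live    <- map toStandard . lines <$> readFile "live.txt"
  present <- map toStandard <$> listFilesRecursive "output"
  mapM_ removeFile (present \\ live)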
DISCLAIMER: I am somewhat new to Haskell.
I am writing an interpreter, or, in this context, a REPL. For that purpose I am using haskeline, which is nice for REPLs. It has the capability of storing the command line history within a file, which is also nice.
One problem I came across while working with it, though, is that it does not seem to expand "~" to the home directory, which means that I have to retrieve the home directory manually.
I could do it like this (and currently do):
-- | returns a fresh settings variable
addSettings :: Env -> Settings IO
addSettings env = Settings { historyFile    = Just getDir
                           , complete       = completeWord Nothing " \t" $
                                                return . completionSearch env
                           , autoAddHistory = True
                           }
  where
    getDir :: FilePath
    getDir = unsafePerformIO getHomeDirectory ++ "/.zepto_history"
But that uses unsafePerformIO, which makes me cringe. Do you know of a good and clean workaround that does not involve rewriting the whole function? This could be a haskeline feature I do not know of, or something I just did not see.
Telling me there is no way around rewriting and rethinking it all is fine, too.
EDIT:
I know unsafePerformIO is bad, that's why it makes me cringe. If you are new to Haskell and reading this question right now: Just pretend it is not there.
A better approach would be to generate the Settings object inside IO, instead of the other way around, so to speak:
addSettings :: Env -> IO (Settings IO)
addSettings env = do
  getDir <- fmap (++ "/.zepto_history") getHomeDirectory
  return $ Settings
    { historyFile    = Just getDir
    , complete       = completeWord Nothing " \t" $ return . completionSearch env
    , autoAddHistory = True
    }
This will no doubt require some changes in your current software, but this would be considered the "right" way to go about this.
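For illustration, the call site then builds the settings in IO before entering the input loop; makeEnv and repl below are hypothetical stand-ins for however your program constructs its Env and runs its REPL:
main :: IO ()
main = do
  env      <- makeEnv             -- hypothetical: however your Env is created
  settings <- addSettings env
  runInputT settings repl         -- hypothetical repl :: InputT IO ()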
I'm writing CGI scripts in Haskell. When the user hits ‘submit’, a Haskell program runs on the server, updating (i.e. reading in, processing, overwriting) a status file. Reading then overwriting sometimes causes issues with lazy IO, as we may be able to generate a large output prefix before we've finished reading the input. Worse, users sometimes bounce on the submit button and two instances of the process run concurrently, fighting over the same file!
What's a good way to implement
transactionalUpdate :: FilePath -> (String -> String) -> IO ()
where the function (‘update’) computes the new file contents from the old file contents? It is not safe to presume that ‘update’ is strict, but it may be presumed that it is total (robustness to partial update functions is a bonus). Transactions may be attempted concurrently, but no transaction should be able to update if the file has been written by anyone else since it was read. It's ok for a transaction to abort in case of competition for file access. We may assume a source of systemwide-unique temporary filenames.
My current attempt writes to a temporary file, then uses a system copy command to overwrite. That seems to deal with the lazy IO problems, but it doesn't strike me as safe from races. Is there a tried and tested formula that we could just bottle?
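For concreteness, here is a rough sketch of that temp-file-and-copy approach, with System.Directory.copyFile standing in for the system copy command and a fixed ".tmp" suffix standing in for a systemwide-unique temporary name; as noted, it sidesteps the lazy IO problem but is not race-free.
import Control.Exception (evaluate)
import System.Directory (copyFile)

copyUpdate :: FilePath -> (String -> String) -> IO ()
copyUpdate file upd = do
  old <- readFile file
  _   <- evaluate (length old)   -- force the whole read so the handle is closed
  let tmp = file ++ ".tmp"
  writeFile tmp (upd old)        -- the original stays untouched while the new version is written
  copyFile tmp file              -- then overwrite the original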
The most idiomatic unixy way to do this is with flock:
http://hackage.haskell.org/package/flock
http://swoolley.org/man.cgi/2/flock
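As a rough illustration of the locking pattern (shown here with the filelock package's System.FileLock module rather than the flock package linked above, simply because its API is compact): take an exclusive lock on a sentinel file for the duration of the read-modify-write.
import Control.Exception (evaluate)
import System.FileLock (SharedExclusive (Exclusive), withFileLock)

transactionalUpdate :: FilePath -> (String -> String) -> IO ()
transactionalUpdate file upd =
  withFileLock (file ++ ".lock") Exclusive $ \_ -> do
    old <- readFile file
    _   <- evaluate (length old)   -- finish reading before reopening the file for writing
    writeFile file (upd old)
There is also a non-blocking variant, withTryFileLock, if aborting on contention is preferred.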
Here is a rough first cut that relies on the atomicity of the underlying mkdir. It seems to fulfill the specification, but I'm not sure how robust or fast it is:
import Control.DeepSeq
import Control.Exception
import System.Directory
import System.IO

transactionalUpdate :: FilePath -> (String -> String) -> IO ()
transactionalUpdate file upd = bracket acquire release update
  where
    acquire = do
      let lockName = file ++ ".lock"
      createDirectory lockName
      return lockName
    release = removeDirectory
    update _ = nonTransactionalUpdate file upd

nonTransactionalUpdate :: FilePath -> (String -> String) -> IO ()
nonTransactionalUpdate file upd = do
  h <- openFile file ReadMode
  s <- upd `fmap` hGetContents h
  s `deepseq` hClose h
  h <- openFile file WriteMode
  hPutStr h s
  hClose h
I tested this by adding the following main and throwing a threadDelay in the middle of nonTransactionalUpdate:
import System.Environment (getArgs)

main = do
  [n] <- getArgs
  transactionalUpdate "foo.txt" ((show n ++ "\n") ++)
  putStrLn $ "successfully updated " ++ show n
Then I compiled and ran a bunch of instances with this script:
#!/bin/bash
rm foo.txt
touch foo.txt
for i in {1..50}
do
./SO $i &
done
A process printed a successful update message if and only if the corresponding number was in foo.txt; all the others printed the expected error: SO: foo.txt.notveryunique: createDirectory: already exists (File exists).
Update: You actually do not want to use unique names here; it must be a consistent name across the competing processes. I've updated the code accordingly.