Sound library for Haskell - audio

I need to find a sound library for Haskell. I have followed the instructions for some of those presented on the Haskell wiki (http://www.haskell.org/haskellwiki/Sound_data_structures), but I couldn't get any of them to work. All I need is to play audio files, regardless of their format.
I am developing a game with FunGen, and it's time for music. I may have some problems making them work together, but I can't even play a simple sound example.
Most of my problems are due to cabal installation, or dependencies I can't find anywhere. I am using Windows.

Being on Windows is a major limitation for Haskell audio. Most packages are bindings to C code, and they tend to be either Linux-centric (JACK, ALSA) or at best nominally cross-platform but in practice difficult to build and use on Windows. You'll need to build or install the C library first, before installing the Haskell binding.
Have you tried OpenAL? It's probably the best suited to what you want to do. If you install the C OpenAL libraries, the Haskell binding should be a fairly simple next step. There are a few add-on packages that are meant to simplify some common tasks, like ALUT and Alure.
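Here is a minimal playback sketch with the ALUT binding, closely following the HelloWorld example that ships with the ALUT package. It assumes the C OpenAL/ALUT libraries and the Haskell ALUT package are installed; music.wav is a placeholder file name, and with older OpenAL bindings the genObjectName call is spelled [source] <- genObjectNames 1:

import Control.Monad (when)
import Sound.ALUT

playFile :: FilePath -> IO ()
playFile fileName = do
  buf <- createBuffer (File fileName)   -- decode the file into an AL buffer
  source <- genObjectName               -- one playback source
  buffer source $= Just buf
  play [source]
  let waitWhilePlaying = do             -- poll until playback has finished
        sleep 0.1
        state <- get (sourceState source)
        when (state == Playing) waitWhilePlaying
  waitWhilePlaying

main :: IO ()
main = withProgNameAndArgs runALUT $ \_progName _args ->
  playFile "music.wav"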
Otherwise, most other solutions involve several packages that may not work well together. hsndfile and hsndfile-vector are good for reading audio files (you'll need to have libsndfile installed), but they don't play sound. portaudio will play audio (again, you'll need the C portaudio library), but it hasn't been updated in a while (and some updates are badly needed). You'll also need to be moderately familiar with a C toolchain (e.g. MinGW or Cygwin) to install the necessary C libraries first.
Another way might be to bind to a dedicated audio language such as Supercollider or Csound. This would be a rather heavyweight solution, but the bindings tend to be better-maintained and at least Csound should be easily installable on Windows (disclosure: I wrote the hCsound package). Sox (C library/executable) might also work for you (I've never tried it on Windows, but it claims to work).

All I need is to play audio files, regardless of their format.
Your best bet for this will be hsndfile, mentioned in John L.'s excellent answer above. But you may want to make your life easier initially: the WAV format is very simple, essentially raw PCM data with a minimal header prepended. Try to get a WAV file to play, and then broaden your horizons from there.
You will be fighting uphill against a lack of sample code -- and you may find yourself needing foreign memory pointers and other not-so-straightforward techniques -- but there's a decent sample for opening and reading a WAV file here.
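If you go the hsndfile route, reading a WAV file into a vector might look something like the sketch below (assuming the hsndfile and hsndfile-vector packages plus the C libsndfile library; sound.wav is a placeholder):

import qualified Data.Vector.Storable as V
import qualified Sound.File.Sndfile as SF
import qualified Sound.File.Sndfile.Buffer.Vector as BV

main :: IO ()
main = do
  -- readFile yields the header info and (maybe) the interleaved samples
  (info, mbuf) <- SF.readFile "sound.wav"
  putStrLn $ "rate: " ++ show (SF.samplerate info)
          ++ ", channels: " ++ show (SF.channels info)
  case mbuf of
    Nothing  -> putStrLn "no sample data"
    Just buf -> print (V.length (BV.fromBuffer buf :: V.Vector Double))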
Also, when developing on Windows, learn to love the -fno-ghci-sandbox flag. It tells GHCi that you're connecting to a library that uses its own thread-local storage. It's often the only way to get certain external libraries to work in GHCi.
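For example, you would start a session like this (Main.hs standing in for your own module):

ghci -fno-ghci-sandbox Main.hs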

Related

Anjuta/Glade Tutorials or Better IDE?

I am attempting to develop a GUI application for Tails. I'm doing the initial development on Debian 8 since development directly in Tails can be a pain.
I started out using Anjuta, but the documentation is essentially non-existent. The Anjuta website has nothing at all about how Glade is integrated or how to use it. I can't even track down documentation on how to change the main window title. The only tutorial I found has you start a project and build it using the default files that are generated for a GTKmm project.
Is there a good book or online tutorial out there for doing GUI development in Anjuta?
This is maybe not a complete answer, but it's too long to fit in a comment. I use Anjuta fairly regularly, but I share your feeling about the missing documentation (which is, by the way, not unique to Anjuta). I appreciate Anjuta (and Glade) very much, so don't take the following as criticism of either program.
I would recommend you consider using PyGTK for GUI creation. It is a lot more productive. You can design the GUI in Glade -- exactly as you would for C/C++ -- and then implement the code in Python, which you can also edit and manage from Anjuta. There are plenty of code examples, for example on the nullege code search engine.
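For instance, loading a Glade-built UI from Python can be as short as the sketch below, assuming you saved the UI in GtkBuilder format (Glade 3 can do this); the file name and the main_window id are placeholders for whatever you set in Glade:

import gtk

builder = gtk.Builder()
builder.add_from_file("mainwindow.glade")   # UI description exported from Glade
window = builder.get_object("main_window")  # widget id assigned in Glade
window.connect("destroy", gtk.main_quit)
window.show_all()
gtk.main()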
About the workflow in Anjuta (for C/C++): it is based mainly on the Autotools system, so you should really read up a little on make, Makefiles, and related tools. Though in principle Anjuta manages this, you will sooner or later hit a problem, and some knowledge of Autotools will take you a long way (see also this tutorial or this one; this slide series is interesting, probably because it is more graphical, and there are even some video tutorials, like this one).
There is no real necessity to use Glade from inside Anjuta. In fact, Glade has gone through a long process of distancing itself from 'code generation'. It now only generates an XML description of the UI, which can be loaded separately. I find the screen space left for Glade inside Anjuta insufficient for comfortable work anyway.
So, in conclusion: if you mainly need a GUI, consider Python + GTK. If you do need C or C++, Anjuta is a great IDE, but look at GTK development examples (like this one). Following those, the use of Anjuta should be a lot clearer.
EDIT:
Very useful answer. I have some underlying legacy code that has to be
C++. Is there a way to mix Python and C++ in Anjuta, or do you know of
any guideposts or tutorials for such?
You can open a C++ project in Anjuta -- maybe even import your legacy code directly as a Makefile project. You can also add new files to your C/C++ project and create them as Python files. I've never tried to do that, though, and I'm not sure how Anjuta would treat them in, for example, the Makefile(s). I don't have large projects mixing languages at the moment, but for small projects I like Geany, because it doesn't get in the way. You do have to maintain the Makefiles manually.

Mac FireBreath plugin port to Linux: dependencies

I have a FireBreath plugin for Mac that I need to port to Linux.
I am trying to find the replacement dependencies for the Linux version. Here are the dependencies in my CMake file for the Mac:
find_library(COCOA_F Cocoa)
find_library(FOUNDATION_F Foundation)
find_library(APPKIT_F AppKit)
find_library(COREDATA_F CoreData)
find_library(AGL_F AGL)
find_library(CARBON_F Carbon)
find_library(AUDIOTOOLBOX_F AudioToolbox)
find_library(COREAUDIO_F CoreAudio)
find_library(AUDIOUNIT_F AudioUnit)
find_library(QUARTZCORE_F QuartzCore)
find_library(QUICKTIME_F QuickTime)
find_library(OPENGL_F OpenGL)
find_library(QTKIT_F QTKit)
So far:
OpenGL => freeglut3?
Cocoa => GLFW?
Any suggestions for the other dependencies?
It is very unlikely that there are many direct replacements for these. Rather than looking for direct replacements for each library, figure out what functionality you need and find libraries that provide it.
For example, Carbon covers so many different types of things that it's impossible to guess what libraries on linux you'd actually need. CoreAudio is a little more clear, as it deals with sound, but are you doing sound playback or some other type of sound manipulation?
It is better to look for libraries that provide the functionality you need. For example, for sound playback you'll likely want ALSA; OpenCV or Video4Linux will give you webcam access, etc.
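For illustration only, the Linux branch of the CMake file might then use the stock find modules; which ones you actually need depends on the functionality you identify (ALSA, OpenGL and X11 here are just plausible examples):

if (UNIX AND NOT APPLE)
    find_package(OpenGL REQUIRED)   # instead of the OpenGL/AGL frameworks
    find_package(ALSA REQUIRED)     # one option for audio playback
    find_package(X11 REQUIRED)      # windowing, instead of Cocoa/Carbon
endif ()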
Once you have broken it down to what functionality you need, a bit of Google searching will almost certainly answer your questions without needing to rely on outside help, and you can save your questions for specific problems you run into while doing the port.

Visual OCaml wanted, to write a minesweeper (rather than using OCaml's Graphics library)

In OCaml, is there an easier way to write graphics-based toy programs like a minesweeper (like the one that comes with Windows 95)? The only way I have found is to start from scratch using OCaml's Graphics library. Are there better ways?
There are bindings to the SDL library, which provides more features than Graphics.
There are actually several of them, and I'm not exactly sure which is best:
SdlCaml is part of the GLCaml project
the OCamlSDL library
I think SdlCaml is more bare-metal (probably partly automatically generated), and OCamlSDL is an older (but still occasionally updated) library with a larger user base.
Note, however, that Graphics is simple to use to start with, and you can still move to something more sophisticated later. If you run into speed-of-rendering issues, you have to use double buffering, as explained in the manual.
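For example, a minimal double-buffered frame with the standard Graphics module looks like this (the window size and the rectangle are arbitrary):

let () =
  Graphics.open_graph " 400x300";
  Graphics.auto_synchronize false;    (* draw to the backing store only *)
  Graphics.clear_graph ();
  Graphics.fill_rect 150 100 100 100; (* whatever the game needs to draw *)
  Graphics.synchronize ();            (* flip the finished frame onto the screen *)
  ignore (Graphics.read_key ())       (* wait for a key before exiting *)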

How to create an audio journal in Ubuntu 11.10

I understand that this question might be a little broad, but I want to make an audio recording and playback program to act as a journal. What is the best way to go about this?
I have never done any programming with sound before and really have no idea where to even start, but I do understand that I am probably going to need codecs and interface my program with specific libraries available in Ubuntu. So, to make things a little more specific:
What libraries should I look at for recording and playback?
Are there any reliable resources that show how to effectively use these libraries?
The first question takes precedence of course, as I can use Google to find the answers to the second question myself.
A couple of Google searches didn't really provide specifics for what I am wanting to do here, so I thought I would ask the experts. However, if the question is unsuitable, let me know and I will remove it.
If this is just for you to use, I would pick a scripting language with reasonable library coverage (Perl, Python, Ruby) and then glue together the various modules needed.
In terms of architecture, you could just have a folder of audio recordings and a command-line tool that, when run, starts recording what you say until you press Ctrl-D, then plays the recording back, asks "OK? [y/n]", and saves it to the folder if you answer yes. You could then write a little web page generator that creates an index.html for the recordings.
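For illustration, here is a rough Python 2 sketch (Ubuntu 11.10's default Python) that shells out to ALSA's arecord/aplay, which ship with Ubuntu. It stops recording on Ctrl-C rather than Ctrl-D, since that is the signal arecord itself handles, and the journal directory is an arbitrary choice:

#!/usr/bin/env python
import os, subprocess, time

JOURNAL_DIR = os.path.expanduser("~/audio-journal")  # arbitrary location

def record_entry():
    if not os.path.isdir(JOURNAL_DIR):
        os.makedirs(JOURNAL_DIR)
    path = os.path.join(JOURNAL_DIR, time.strftime("%Y-%m-%d_%H%M%S") + ".wav")
    print "Recording... press Ctrl-C to stop."
    try:
        subprocess.call(["arecord", "-f", "cd", path])  # CD-quality WAV
    except KeyboardInterrupt:
        pass                                            # arecord finalizes the file on SIGINT
    subprocess.call(["aplay", path])                    # play it back for review
    if raw_input("OK? [y/n] ").strip().lower() != "y":
        os.remove(path)

if __name__ == "__main__":
    record_entry()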

Why use build tools like Autotools when we can just write our own makefiles?

Recently, I switched my development environment from Windows to Linux. So far, I have only used Visual Studio for C++ development, so many concepts, like make and Autotools, are new to me. I have read the GNU make documentation and have a rough idea of how it works, but I am still confused about Autotools.
As far as I know, makefiles are used to make the build process easier.
Why do we need tools like Autotools just for creating the makefiles? Since everyone knows how to create a makefile, I don't see the real use of Autotools.
What is the standard? Do we need to use tools like this or would just handwritten makefiles do?
You are talking about two separate but intertwined things here:
Autotools
GNU coding standards
Within Autotools, you have several projects:
Autoconf
Automake
Libtool
Let's look at each one individually.
Autoconf
Autoconf scans an existing source tree to find its dependencies and creates a configure script that will run under almost any kind of shell. The configure script allows the user to control the build behavior (e.g. --with-foo, --without-foo, --prefix, --sysconfdir, etc.) as well as performing checks to ensure that the system can compile the program.
configure generates a config.h file (from a template) which programs can include to work around portability issues. For example, if HAVE_LIBPTHREAD is not defined, fall back to fork() instead.
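For illustration, the check in configure.ac would be something like AC_CHECK_LIB([pthread], [pthread_create]), which defines HAVE_LIBPTHREAD in config.h, and the program can then choose at compile time (run_worker is a made-up helper, and the fork() fallback is just a stand-in):

#include "config.h"

#ifdef HAVE_LIBPTHREAD
#include <pthread.h>
static void run_worker(void *(*task)(void *))
{
    pthread_t t;
    pthread_create(&t, NULL, task, NULL);  /* run the task in a thread */
    pthread_join(t, NULL);
}
#else
#include <unistd.h>
#include <sys/wait.h>
static void run_worker(void *(*task)(void *))
{
    if (fork() == 0) {                     /* no pthreads: use a child process */
        task(NULL);
        _exit(0);
    }
    wait(NULL);                            /* reap the child */
}
#endif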
I personally use Autoconf on many projects. It usually takes people some time to get used to m4. However, it does save time.
You can have makefiles inherit some of the values that configure finds without using automake.
Automake
From a short template that describes which programs will be built and which objects need to be linked to build them, Automake automatically creates Makefiles that adhere to the GNU coding standards. This includes dependency handling and all of the required GNU targets.
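A complete Makefile.am for a small program can be as short as this (the program and file names are placeholders):

bin_PROGRAMS = mygame
mygame_SOURCES = main.c audio.c audio.h

automake turns this into a Makefile.in with all the standard targets and automatic header dependency tracking; configure later turns that into the final Makefile.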
Some people find this easier. I prefer to write my own makefiles.
Libtool
Libtool is a very cool tool for simplifying the building and installation of shared libraries on any Unix-like system. Sometimes I use it; other times (especially when just building static link objects) I do it by hand.
There are other options too, see StackOverflow question Alternatives to Autoconf and Autotools?.
Build automation & GNU coding standards
In short, you really should use some kind of portable build configuration system if you release your code to the masses. What you use is up to you. GNU software is known to build and run on almost anything. However, you might not need to adhere to such (sometimes extremely pedantic) standards.
If anything, I'd recommend giving Autoconf a try if you're writing software for POSIX systems. Just because Autotools produce part of a build environment that's compatible with GNU standards doesn't mean you have to follow those standards (many don't!) :) There are plenty of other options, too.
Edit
Don't fear m4 :) There is always the Autoconf macro archive, with plenty of examples and drop-in checks. Write your own or use what's tested. Autoconf is far too often confused with Automake; they are two separate things.
First of all, the Autotools are not an opaque build system but a loosely coupled tool-chain, as tinkertim already pointed out. Let me just add some thoughts on Autoconf and Automake:
Autoconf is the configuration system that creates the configure script based on feature checks that are supposed to work on all kinds of platforms. A lot of system knowledge has gone into its m4 macro database during the 15 years of its existence. On the one hand, I think the latter is the main reason Autotools have not been replaced by something else yet. On the other hand, Autoconf used to be far more important when the target platforms were more heterogeneous and Linux, AIX, HP-UX, SunOS, ..., and a large variety of different processor architectures had to be supported. I don't really see its point if you only want to support recent Linux distributions and Intel-compatible processors.
Automake is an abstraction layer for GNU Make and acts as a Makefile generator from simpler templates. A number of projects eventually got rid of the Automake abstraction and reverted to writing Makefiles manually because you lose control over your Makefiles and you might not need all the canned build targets that obfuscate your Makefile.
Now to the alternatives (and I strongly suggest an alternative to Autotools based on your requirements):
CMake's most notable achievement is replacing Autotools in KDE. It's probably the closest you can get if you want Autoconf-like functionality without the m4 idiosyncrasies. It brings Windows support to the table and has proven to be applicable in large projects. My beef with CMake is that it is still a Makefile generator (at least on Linux), with all its inherent problems (e.g. Makefile debugging, timestamp signatures, implicit dependency order).
SCons is a Make replacement written in Python. It uses Python scripts as build control files allowing very sophisticated techniques. Unfortunately, its configuration system is not on par with Autoconf. SCons is often used for in-house development when adaptation to specific requirements is more important than following conventions.
If you really want to stick with Autotools, I strongly suggest reading Recursive Make Considered Harmful (archived) and writing your own GNU Makefile configured through Autoconf.
The answers already provided here are good, but I'd strongly recommend not taking the advice to write your own makefile if you have anything resembling a standard C/C++ project. We need the autotools instead of handwritten makefiles because a standard-compliant makefile generated by automake offers a lot of useful targets under well-known names, and providing all these targets by hand is tedious and error-prone.
Firstly, writing a Makefile by hand seems a great idea at first, but most people will not bother to write more than the rules for all, install, and maybe clean, whereas automake generates dist, distcheck, clean, distclean, uninstall, and all these little helpers. These additional targets are a great boon to the sysadmin who will eventually install your software.
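With an automake-generated Makefile, all of the following just work (the prefix, package name and staging directory are placeholders):

./configure --prefix=$HOME/opt/mygame
make
make distcheck                    # builds mygame-1.0.tar.gz and test-builds it from scratch
make DESTDIR=/tmp/stage install   # staged install, handy for packaging
make uninstall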
Secondly, providing all these targets in a portable and flexible way is quite error-prone. I've done a lot of cross-compilation to Windows targets recently, and the autotools performed just great, in contrast to most hand-written makefiles, which were mostly a pain in the ass to compile. Mind you, it is possible to create a good Makefile by hand. But don't overestimate yourself; it takes a lot of experience and knowledge about a bunch of different systems, and automake creates great Makefiles for you right out of the box.
Edit: And don't be tempted to use the "alternatives". CMake and friends are a horror to the deployer because they aren't interface-compatible with configure and friends. Every halfway competent sysadmin or developer can do great things like cross-compilation, or simple things like setting a prefix, off the top of his head or with a simple --help on a configure script. But you are damned to spend an hour or three when you have to do such things with BJam. Don't get me wrong, BJam is probably a great system under the hood, but it's a pain in the ass to use because there are almost no projects using it and very little, incomplete documentation. autoconf and automake have a huge lead here in terms of established knowledge.
So, even though I'm a bit late with this advice for this question: Do yourself a favor and use the autotools and automake. The syntax might be a bit strange, but they do a way better job than 99% of the developers do on their own.
For small projects or even for large projects that only run on one platform, handwritten makefiles are the way to go.
Where autotools really shine is when you are compiling for different platforms that require different options. Autotools is frequently the brains behind the typical
./configure
make
make install
compilation and install steps for Linux libraries and applications.
That said, I find autotools to be a pain and I've been looking for a better system. Lately I've been using bjam, but that also has its drawbacks. Good luck finding what works for you.
Autotools are needed because Makefiles are not guaranteed to work the same across different platforms. If you handwrite a Makefile, and it works on your machine, there is a good chance that it won't on mine.
Do you know what unix your users will be using? Or even which distribution of Linux? Do you know where they want software installed? Do you know what tools they have, what architecture they want to compile on, how many CPUs they have, how much RAM and disk might be available to them?
The *nix world is a cross-platform landscape, and your build and install tools need to deal with that.
Mind you, the auto* tools date from an earlier epoch, and there are many valid complaints about them, but the several projects to replace them with more modern alternatives are having trouble developing a lot of momentum.
Lots of things are like that in the *nix world.
Autotools is a disaster.
The generated ./configure script checks for features that have not been present on any Unix system for the last 20 years or so, and it spends a huge amount of time doing so.
Running ./configure takes ages. Although modern server CPUs can have dozens of cores, and there may be several such CPUs per server, ./configure is single-threaded. We still have enough years of Moore's law left that the number of CPU cores will keep going up, so the time ./configure takes will stay approximately constant, whereas parallel build times shrink by a factor of 2 every 2 years thanks to Moore's law. Actually, I would say the time ./configure takes might even increase, due to increasing software complexity taking advantage of improved hardware.
The mere act of adding just one file to your project requires you to run automake, autoconf and ./configure, which takes ages, and then you'll probably find that, since some important files have changed, everything will be recompiled. Add just one file, and make -j${CPUCOUNT} recompiles everything.
And about make -j${CPUCOUNT}: the generated build system is a recursive one, and recursive make has long been considered harmful.
Then when you install the software that has been compiled, you'll find that it doesn't work. (Want proof? Clone protobuf repository from Github, check out commit 9f80df026933901883da1d556b38292e14836612, install it to a Debian or Ubuntu system, and hey presto: protoc: error while loading shared libraries: libprotoc.so.15: cannot open shared object file: No such file or directory -- since it's in /usr/local/lib and not /usr/lib; workaround is to do export LD_RUN_PATH=/usr/local/lib before typing make).
The theory is that by using autotools, you could create a software package that can be compiled on Linux, FreeBSD, NetBSD, OpenBSD, DragonflyBSD and other operating systems. The fact? Every non-Linux system that builds packages from source has numerous patch files in its repository to work around autotools bugs. Just take a look at e.g. FreeBSD /usr/ports: it's full of patches. So, it would have been just as easy to create a small patch for a non-autotools build system on a per-project basis as to create one for an autotools build system. Or perhaps even easier, as standard make is much easier to use than autotools.
The fact is, if you create your own build system based on standard make (and make it inclusive rather than recursive, following the recommendations of the "Recursive Make Considered Harmful" paper), things work in a much better manner. Also, your build time goes down by an order of magnitude, perhaps even two, if your project is a very small one of 10-100 C files and you have dozens of cores per CPU and multiple CPUs. It's also much easier to interface custom automatic code-generation tools with a custom build system based on standard make than to deal with the m4 mess of autotools. With standard make, you can at least type a shell command into the Makefile.
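For illustration, a minimal inclusive (non-recursive) GNU Makefile with compiler-generated header dependencies might look like this; all names are placeholders, and recipe lines must begin with a tab:

SRCS := $(wildcard src/*.c)
OBJS := $(SRCS:.c=.o)
CFLAGS += -O2 -MMD -MP            # -MMD -MP make gcc/clang emit .d dependency files

mygame: $(OBJS)
	$(CC) -o $@ $(OBJS) $(LDLIBS)

.PHONY: clean
clean:
	rm -f $(OBJS) $(OBJS:.o=.d) mygame

-include $(OBJS:.o=.d)            # pull in the generated header dependencies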
So, to answer your question: why use autotools? Answer: there is no reason to do so. Autotools has been obsolete since commercial Unix became obsolete, and the advent of multi-core CPUs has made autotools even more obsolete. Why programmers haven't realized that yet is a mystery. I'll happily use standard make in my build systems, thank you. Yes, it takes some amount of work to generate the dependency files for C header inclusion, but that work is saved by not having to fight with autotools.
I don't feel I'm enough of an expert to answer this, but I'll still give you a bit of an analogy from my experience.
To some extent it is similar to why we should write embedded code in C (a high-level language) rather than in assembly language.
Both serve the same purpose, but the latter is lengthier, more tedious, more time-consuming, and more error-prone (unless you know the ISA of the processor very well).
The same is the case with the Automake tool versus writing your own makefile.
Writing Makefile.am and configure.ac is a lot simpler than writing each project's Makefile yourself.
