Where to start reading GHC's source code - haskell

I'm trying to learn how various aspects of GHC, like type inference, pattern matching, and other code transformations, are implemented.
However, the codebase is fairly large, and the file names use a lot of acronyms (simpl, stg, stranal...). What do these mean, and how is the code organized?

As hammar says, the GHC commentary is probably the best place to start for learning about GHC itself. This does assume some prior knowledge of compilers in general, but if your primary interest is in modifying GHC you can probably get by with just a basic CS background.
If you're interested more generally in the principles behind GHC, e.g. if you want to learn how to write your own compiler, you'd probably be better served by perusing Simon Peyton-Jones' myriad publications on relevant topics, including an entire book on implementing functional languages and a "tutorial" book that goes through the implementation of a non-strict functional language.

The GHC commentary is a good place to start.

Related

Is there any Template Haskell tutorial for someone who doesn't know Lisp?

I wanted to learn Template Haskell, but all the tutorials I find either assume that you have learned Lisp and know what Lisp macros are, or that you know some CS theory jargon - things such as splices, quasiquotations, etc. - or some theoretical results about macros.
I can't code a single line of Lisp (and, though I intend to do so some day, I don't have the time to learn it right now). Haskell is my very first functional language and I learned it to the point that I can regularly code in it, use monads and applicatives, understand the type system, etc... but I don't know much (I also want to learn, but I'm too stupid for it... :P) about the theoretical CS stuff behind it. So I'm oblivious to the jargon I typically find in TH tutorials.
So, the question is: is there a tutorial about TH for someone who codes Haskell not as a professional computer scientist, but just as a guy who uses programming for his daily chores, and who learned Haskell as his first functional language? Maybe an introduction to macros and metaprogramming that uses TH as the example?
Thanks all. :)
No, I don't think there are any great introductory tutorials to Template Haskell. The best way to learn is to look at examples, or:
First stab at Template Haskell
The user's manual
Igloo's paper :: PS
I never found Lisp to be a requirement; however, there is terminology to learn, as with any domain-specific library.
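To give a concrete taste of that terminology, here is a minimal, made-up sketch (the function name power is invented for illustration): a quotation [| ... |] builds a representation of Haskell code as an ordinary value, and a splice $( ... ) runs such a computation at compile time and pastes the generated code into the program.

    {-# LANGUAGE TemplateHaskell #-}
    module Power where

    import Language.Haskell.TH

    -- power n builds, at compile time, the code of a function that
    -- multiplies its argument by itself n times.
    power :: Int -> Q Exp
    power 0 = [| const 1 |]
    power n = [| \x -> x * $(power (n - 1)) x |]

In another module (GHC's stage restriction means you can't splice a definition from the same module), $(power 3) 2 expands at compile time into nested multiplications and evaluates to 8. No Lisp knowledge is required, just the idea of building code from code.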
The best introductory tutorials to Template Haskell I know of are two documents by Bulat Ziganshin. The links from the Haskell Wiki seem to be broken at the moment; however, you can access them via archive.org:
TH Tutorial
TH Documentation explanation
Also the original paper Template metaprogramming for Haskell by Tim Sheard and Simon Peyton Jones might be helpful:
Abstract
We propose a new extension to the purely functional programming language Haskell that supports compile-time meta-programming. The purpose of the system is to support the algorithmic construction of programs at compile-time.
The ability to generate code at compile time allows the programmer to implement such features as polytypic programs, macro-like expansion, user directed optimization (such as inlining), and the generation of supporting data structures and functions from existing data structures and functions.
Our design is being implemented in the Glasgow Haskell Compiler, ghc.

What language to learn after Haskell? [closed]

As my first programming language, I decided to learn Haskell. I'm an analytic philosophy major, and Haskell allowed me to quickly and correctly create programs of interest, for instance, transducers for natural language parsing, theorem provers, and interpreters. Although I've only been programming for two and a half months, I found Haskell's semantics and syntax much easier to learn than more traditional imperative languages, and feel comfortable (now) with the majority of its constructs.
Programming in Haskell is like sorcery, however, and I would like to broaden my knowledge of programming. I would like to choose a new programming language to learn, but I do not have enough time to pick up an arbitrary language, drop it, and repeat. So I thought I would pose the question here, along with several stipulations about the type of language I am looking for. Some are subjective, some are intended to ease the transition from Haskell.
Strong type system. One of my favorite parts of programming in Haskell is writing type declarations. This helps structure my thoughts about individual functions and their relationship to the program as a whole. It also makes informally reasoning about the correctness of my program easier. I'm concerned with correctness, not efficiency.
Emphasis on recursion rather than iteration. I use iterative constructs in Haskell, but implement them recursively. However, it is much easier to understand the structure of a recursive function than a complicated iterative procedure, especially when using combinators and higher-order functions like maps, folds and bind.
Rewarding to learn. Haskell is a rewarding language to work in. It's a little like reading Kant. My experience several years ago with C, however, was not. I'm not looking for C. The language should enforce a conceptually interesting paradigm, which in my entirely subjective opinion, the C-likes do not.
Weighing the answers: These are just notes, of course. I'd just like to reply to everyone who gave well-formed responses. You have been very helpful.
1) Several responses indicated that a strong, statically typed language emphasizing recursion means another functional language. While I want to continue working strongly with Haskell, camccann and larsmans correctly pointed out that another such language would "ease the transition too much." These comments have been very helpful, because I am not looking to write Haskell in Caml! Of the proof assistants, Coq and Agda both look interesting. In particular, Coq would provide a solid introduction to constructive logic and formal type theory. I've spent a little time with first-order predicate and modal logic (Mendellsohn, Enderton, some of Hinman), so I would probably have a lot of fun with Coq.
2) Others heavily favored Lisp (Common Lisp, Scheme and Clojure). From what I gather, both Common Lisp and Scheme have excellent introductory material (On Lisp and The Reasoned Schemer, SICP). The material in SICP causes me to lean towards Scheme. In particular, Scheme through SICP would cover a different evaluation strategy, the implementation of laziness, and a chance to focus on topics like continuations, interpreters, symbolic computation, and so on. Finally, as others have pointed out, Lisp's treatment of code/data would be entirely new. Hence, I am leaning heavily towards option (2), a Lisp.
3) Third, Prolog. Prolog has a wealth of interesting material, and its primary domain is exactly the one I'm interested in. It has a simple syntax and is easy to read. I can't comment more at the moment, but after reading an overview of Prolog and skimming some introductory material, it ranks with (2). And it seems like Prolog's backtracking is always being hacked into Haskell!
4) Of the mainstream languages, Python looks the most interesting. Tim Yates makes the language sound very appealing. Apparently, Python is often taught to first-year CS majors; so it's either conceptually rich or easy to learn. I'd have to do more research.
Thank you all for your recommendations! It looks like a Lisp (Scheme, Clojure), Prolog, or a proof assistant like Coq or Agda are the main languages being recommended.
I would like to broaden my knowledge of programming. (...) I thought I would pose the question here, along with several stipulations about the type of language I am looking for. Some are subjective, some are intended to ease the transition from Haskell.
Strong type system. (...) It also makes informally reasoning about the correctness of my program easier. I'm concerned with correctness, not efficiency.
Emphasis on recursion rather than iteration. (...)
You may be easing the transition a bit too much here, I'm afraid. The very strict type system and purely functional style are characteristic of Haskell and pretty much anything resembling a mainstream programming language will require compromising at least somewhat on one of these. So, with that in mind, here are a few broad suggestions aimed at retaining most of what you seem to like about Haskell, but with some major shift.
Disregard practicality and go for "more Haskell than Haskell": Haskell's type system is full of holes, due to nontermination and other messy compromises. Clean up the mess and add more powerful features and you get languages like Coq and Agda, where a function's type contains a proof of its correctness (you can even read the function arrow -> as logical implication!). These languages have been used for mathematical proofs and for programs with extremely high correctness requirements. Coq is probably the most prominent language of the style, but Agda has a more Haskell-y feel (as well as being written in Haskell itself).
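You can get a rough, watered-down taste of that idea in Haskell itself with GADTs and DataKinds. This is a hypothetical sketch (Vec and vhead are invented names), not real Coq or Agda, but it shows a type carrying a fact about a value:

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}
    module Vec where

    data Nat = Zero | Succ Nat

    -- A list whose length is tracked in its type.
    data Vec (n :: Nat) a where
      VNil  :: Vec 'Zero a
      VCons :: a -> Vec n a -> Vec ('Succ n) a

    -- The type is the proof: vhead only accepts a provably non-empty
    -- vector, so there is no runtime check and no "head of empty list".
    vhead :: Vec ('Succ n) a -> a
    vhead (VCons x _) = x

In Agda or Coq the index can be an arbitrary proposition and the checker also insists on totality, which is what makes "the type contains a proof of correctness" literally true there rather than only approximately true as in Haskell.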
Disregard types, add more magic: If Haskell is sorcery, Lisp is the raw, primal magic of creation. Lisp-family languages (also including Scheme and Clojure) have nearly unparalleled flexibility combined with extreme minimalism. The languages have essentially no syntax, writing code directly in the form of a tree data structure; metaprogramming in a Lisp is easier than non-meta programming in some languages.
Compromise a bit and move closer to the mainstream: Haskell falls into the broad family of languages influenced heavily by ML, any of which you could probably shift to without too much difficulty. Haskell is one of the strictest when it comes to correctness guarantees from types and use of functional style, where others are often either hybrid styles and/or make pragmatic compromises for various reasons. If you want some exposure to OOP and access to lots of mainstream technology platforms, either Scala on the JVM or F# on .NET have a lot in common with Haskell while providing easy interoperability with the Java and .NET platforms. F# is supported directly by Microsoft, but has some annoying limitations compared to Haskell and portability issues on non-Windows platforms. Scala has direct counterparts to more of Haskell's type system and Java's cross-platform potential, but has a more heavyweight syntax and lacks the powerful first-party support that F# enjoys.
Most of those recommendations are also mentioned in other answers, but hopefully my rationale for them offers some enlightenment.
I'm going to be That Guy and suggest that you're asking for the wrong thing.
First you say that you want to broaden your horizons. Then you describe the kind of language that you want, and its horizons sound incredibly like the horizons you already have. You're not going to gain very much by learning the same thing over and over.
I would suggest you learn a Lisp — i.e. Common Lisp, Scheme/Racket or Clojure. They're all dynamically typed by default, but feature some sort of type hinting or optional static typing. Racket and Clojure are probably your best bets.
Clojure is more recent and has more Haskellisms like immutability by default and lots of lazy evaluation, but it's based on the Java Virtual Machine, which means it has some odd warts (e.g. the JVM doesn't support tail call elimination, so recursion is kind of a hack).
Racket is much older, but has picked up a lot of power along the way, such as static type support and a focus on functional programming. I think you'd probably get the most out of Racket.
The macro systems in Lisps are very interesting and vastly more powerful than anything you'll see anywhere else. That alone is worth at least looking at.
From the standpoint of what suits your major, the obvious choice seems like a logic language such as Prolog or its derivatives. Logic programming can be done very neatly in a functional language (see, e.g., The Reasoned Schemer), but you might enjoy working with the logic paradigm directly.
An interactive theorem proving system such as Twelf or Coq might also strike your fancy.
I'd advise you to learn Coq, which is a powerful proof assistant with syntax that will feel comfortable to the Haskell programmer. The cool thing about Coq is that programs written in it can be extracted to other functional languages, including Haskell. There is even a package (Meldable-Heap) on Hackage that was written in Coq, had properties proven about its operation, and was then extracted to Haskell.
Another popular language that offers more power than Haskell is Agda - I don't know Agda beyond knowing it is dependently typed, on Hackage, and well respected by people I respect, but those are good enough reasons for me.
I wouldn't expect either of these to be easy. But if you know Haskell and want to move forward to a language that gives more power than the Haskell type system then they should be considered.
As you didn't mention any restrictions besides your subjective interests, and you emphasize "rewarding to learn" (well, OK, I'll ignore the static typing restriction), I would suggest learning a few languages of different paradigms, preferably ones that are "exemplary" for each of them.
A Lisp dialect for the code-as-data/homoiconicity thing and because they are good, if not the best, examples of dynamic (more or less strict) functional programming languages
Prolog as the predominant logic programming language
Smalltalk as the one true OOP language (also interesting because of its usually extremely image-centric approach)
maybe Erlang or Clojure if you are interested in languages forged for concurrent/parallel/distributed programming
Forth for stack oriented programming
(Haskell for strictly functional, statically typed, lazy programming)
Especially Lisps (CL not as much as Scheme) and Prolog (and Haskell) embrace recursion.
Although I am not a guru in any of these languages, I did spend some time with each of them, except Erlang and Forth, and they all gave me eye-opening and interesting learning experiences, as each one approaches problem solving from a different angle.
So, though it may seem as if I ignored the part about your having no time to try a few languages, I rather think that time spent with any of these will not be wasted, and you should have a look at all of them.
How about a stack-oriented programming language? Cat hits your high points. It is:
Statically typed with type inference.
Makes you re-think common imperative-language concepts like looping. Conditional execution and looping are handled with combinators.
Rewarding - forces you to understand yet another model of computation. Gives you another way to think about and decompose problems.
Dr. Dobb's published a short article about Cat in 2008, though the language has changed slightly since then.
If you want a strong(er)ly typed Prolog, Mercury is an interesting choice. I've dabbled in it in the past and I liked the different perspective it gave me. It also has moded-ness (which parameters need to be free/fixed) and determinism (how many results are there?) in the type system.
Clean is very similar to Haskell, but has uniqueness typing, which is used as an alternative to monads (more specifically, the IO monad). Uniqueness typing also does interesting things for working with arrays.
I'm a bit late, but I see that no one has mentioned a couple of paradigms and related languages that may interest you for their high level of abstraction and generality:
rewriting systems, like Maude or ELAN;
Constraint Handling Rules (CHR).
Despite its failure to meet one of your big criteria (static* typing), I'm going to make a case for Python. Here are a few reasons I think you should take a look at it:
For an imperative language, it is surprisingly functional. This was one of the things that struck me when I learned it. Take list comprehensions, for example. It has lambdas, first-class functions, and many functionally-inspired compositions on iterators (maps, folds, zips...). It gives you the option of picking whatever paradigm suits the problem best.
IMHO, it is, like Haskell, beautiful to code in. The syntax is simple and elegant.
It has a culture that focuses on doing things in a straightforward way, rather than focusing too minutely on efficiency.
I understand if you are looking for something else though. Logic programming, for instance, might be right up your alley, as others have suggested.
* I assume you mean static typing here, since you want to declare the types. Technically, Python is a strongly typed language, since you can't arbitrarily interpret, say, a string as a number. Interestingly, there are Python derivatives that allow static typing, like Boo.
I would recommend Erlang. It is not a strongly typed language, and you should try it. It is a very different approach to programming, and you may find that there are problems for which strong typing is not The Best Tool(TM). Erlang does provide tools for static type verification (typer, dialyzer), so you can use stronger typing on the parts that benefit from it. It can be an interesting experience for you, but be prepared: it will feel very different. If you are looking for a "conceptually interesting paradigm", you can find several in Erlang: message passing, memory separation instead of sharing, distribution, OTP, and error handling and error propagation instead of error "prevention". Erlang may be far from your current experience, but it is still brain-tickling if you have experience with C and Haskell.
Given your description, I would suggest Ocaml or F#.
The ML family are generally very good in terms of a strong type system. The emphasis on recursion, coupled with pattern matching, is also clear.
Where I am a bit hesitant is on the rewarding to learn part. Learning them was rewarding for me, no doubt. But given your restrictions and your description of what you want, it seems you are not actually looking for something much more different than Haskell.
If you hadn't put in those restrictions, I would have suggested Python or Erlang, both of which would take you out of your comfort zone.
In my experience, strong typing + emphasis on recursion means another functional programming language. Then again, I wonder if that's very rewarding, given that none of them will be as "pure" as Haskell.
As other posters have suggested, Prolog and Lisp/Scheme are nice, even though both are dynamically typed. Many great books with a strong theoretical "taste" to them have been published about Scheme in particular. Take a look at SICP, which also conveys a lot of general computer science wisdom (meta-circular interpreters and the like).
Factor will be a good choice.
You could start looking into Lisp.
Prolog is a cool language too.
If you decide to stray from your preference for a type system, you might be interested in the J programming language. It is outstanding for how it emphasizes function composition. If you like point-free style in Haskell (there is a small sketch of it after this answer), the tacit form of J will be rewarding. I've found it extraordinarily thought-provoking, especially with regard to semantics.
True, it doesn't fit your preconceptions as to what you'd like, but give it a look. Just knowing that it's out there is worth discovering. The sole source of complete implementations is J Software, jsoftware.com.
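For readers who haven't met point-free style in Haskell, here is a tiny, made-up example of what the answer above refers to (shoutWords is an invented name): the same function written with and without naming its argument.

    import Data.Char (toUpper)

    -- Pointful: the argument s is named explicitly.
    shoutWords' :: String -> [String]
    shoutWords' s = map (map toUpper) (words s)

    -- Point-free ("tacit"): built purely by composition, no named argument.
    shoutWords :: String -> [String]
    shoutWords = map (map toUpper) . words

J takes this composition-of-functions style much further, which is why it can feel familiar yet strange coming from Haskell.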
Go with one of the mainstream languages. Given the resources available, the future marketability of your skill, and the rich developer ecosystem, I think you should start with either Java or C#.
Great question-- I've been asking it myself recently after spending several months thoroughly enjoying Haskell, although my background is very different (organic chemistry).
Like you, C and its ilk are out of the question.
I've been oscillating between Python and Ruby as the two practical workhorse scripting languages today (mules?) that both have some functional components to keep me happy. Without starting any Rubyist/Pythonist debates here, my personal pragmatic answer to this question is:
Learn the one (Python or Ruby) that you first get an excuse to apply.

domain specific languages and compilers

I was looking over the contents of Martin Fowler's recent book, Domain Specific Languages, and I noticed an ANTLR example - that got me thinking that writing compilers will become more and more popular, since people's needs in this area will increase.
So, will compiler theory remain as arid (being subjective here) as it has been until now, or is there any chance that we'll get more applied, programmer-oriented materials?
Even though DSLs may seem to create more opportunities for creating new compilers, I don't think they will make the challenges of writing a compiler any easier. You can either use compiler tools like yacc to generate code to handle your DSL syntax, or you can hand-carve your own parser with an eye towards better internal efficiency than what the yacc generators spit out.
Either way, you have to have sufficient knowledge of how to define and manipulate a language grammar to make your DSL work and to avoid loopholes and can't-get-there-from-here problems.
Spiffy tools help to implement the solution, but they don't solve the problem for you. To quote my high school chemistry teacher: "Sure! Bring your calculators to class! Calculators only help you get the wrong answer faster!"
So, will compiler theory remain as arid (being subjective here) as it has been until now, or is there any chance that we'll get more applied, programmer-oriented materials?
I'd say that compiler theory is actually pretty rich, but it may not be centered around C-style languages. If you want to look at some powerful tools commonly used by academic language designers, I suggest that you check out functional programming languages (ML, Scheme, LISP, Haskell, OCaml, Scala, Clojure, etc.). Personally I prefer Haskell with Parsec, but there are many options. I think the common consensus is that the structure of these languages is more conducive to language design and implementation, at least in a theoretical sense.
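As a rough illustration of the Haskell-plus-Parsec approach (the grammar below is a toy invented for illustration, not any real DSL), a parser is just an ordinary value built from combinators:

    import Text.Parsec
    import Text.Parsec.String (Parser)

    -- A toy expression language: integers, +, * and parentheses.
    expr, term, factor :: Parser Int
    expr   = chainl1 term   ((+) <$ char '+')
    term   = chainl1 factor ((*) <$ char '*')
    factor = read <$> many1 digit
             <|> between (char '(') (char ')') expr

    main :: IO ()
    main = print (parse expr "<demo>" "2*(3+4)")   -- Right 14

Whether that is "easier" than yacc is debatable, but the whole parser is plain Haskell, so the rest of the compiler (analysis, code generation) can share its data types directly.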
Like Kristopher said above, programmers don't necessarily make the best language designers. I've seen some really cool DSLs and I've seen some pretty awful ones (my opinion, of course, YMMV). Knowledge of language concepts is a must for designing any language, DSL or otherwise (type theory, category theory, various code analyses, machine optimization, etc.). Not to mention, if you're designing a DSL, you have to have a fairly intimate knowledge of the domain you're targeting.
Tools off the shelf like yacc, ANTLR, flex, and cup can make building your compiler easier like buying wood from a lumberyard to build your house is easier than going off into the woods and cutting down trees. Both get you the material for the structure, but you still have to know how to build the house. We will definitely see more DSLs in the near future and these tools will help. Will the DSLs be worth using or even useable, however? The tools won't make a difference here, at least in my opinion. Language design employs a lot of real computer science and/or mathematics. Good language designers will have to at least be familiar with both, and good language implementers must be familiar with language design tools.
As high-quality DSLs get easier to build, we are more likely to see more of them. There are several obstacles:
Choosing a good problem domain for a DSL. It has to be broad enough to appeal to more than just the author, and narrow enough to have good solutions (C# doesn't count).
Implementing a DSL well. Lots of people seem to think if they have a parser they are done. Actually, you need a lot of technology: parsing, analysis, code generation, ... (See DMS Software Reengineering Toolkit for an engine that contains what I think is needed to produce DSLs effectively)
Acceptance of the DSL by the community. It's amazing how many people insist on coding in just the programming language they know, and nothing else.
There was an explosion of programming languages in the 70's and 80's. Then Java came along and killed everything off. Now we are in another phase of people inventing lots of languages. So, I'd say it is cyclical, and there is really nothing "new" going on.
However, one aspect that remains constant is that most programmers aren't very good at designing languages. Tools like yacc and ANTLR make some of the implementation easier, but they don't make would-be language designers any better at language design.
There already are some useful tools; look at Xtext, EMFText, JetBrains MPS, the Intentional Domain Workbench and Microsoft's former Oslo project with the M language. All these tools make defining languages easier, although that ease has its cost; for a DSL, though, you may have somewhat different requirements than for a regular general-purpose programming language.

Why is the "Dynamic" part of Dynamic languages so good?

Jon Skeet posted this blog post, in which he states that he is going to be asking why the dynamic part of dynamic languages is so good. So I thought I'd preemptively ask on his behalf: what makes them so good?
The two fundamentally different approaches to types in programming languages are static types and dynamic types. They enable very different programming paradigms and they each have their own benefits and drawbacks.
I'd highly recommend Chris Smith's excellent article What to Know Before Debating Type Systems for more background on the subject.
From that article:
A static type system is a mechanism by which a compiler examines source code and assigns labels (called "types") to pieces of the syntax, and then uses them to infer something about the program's behavior. A dynamic type system is a mechanism by which a compiler generates code to keep track of the sort of data (coincidentally, also called its "type") used by the program. The use of the same word "type" in each of these two systems is, of course, not really entirely coincidental; yet it is best understood as having a sort of weak historical significance. Great confusion results from trying to find a world view in which "type" really means the same thing in both systems. It doesn't. The better way to approach the issue is to recognize that:
Much of the time, programmers are trying to solve the same problem with static and dynamic types.
Nevertheless, static types are not limited to problems solved by dynamic types.
Nor are dynamic types limited to problems that can be solved with static types.
At their core, these two techniques are not the same thing at all.
The main thing is that you avoid a lot of the redundancy that comes from making the programmer "declare" this, that, and the other. A similar advantage could be obtained through type inference (Boo does that, for example), but not quite as cheaply and flexibly. As I wrote in the past...:
complete type checking or inference requires analysis of the whole program, which may be quite impractical -- and stops what Van Roy and Haridi, in their masterpiece "Concepts, Techniques and Models of Computer Programming", call "totally open programming". Quoting a post of mine from 2004: """ I love the explanations of Van Roy and Haridi, p. 104-106 of their book, though I may or may not agree with their conclusions (which are basically that the intrinsic difference is tiny -- they point to Oz and Alice as interoperable languages without and with static typing, respectively); all the points they make are good. Most importantly, I believe, the way dynamic typing allows real modularity (harder with static typing, since type discipline must be enforced across module boundaries), and "exploratory computing in a computation model that integrates several programming paradigms".
"Dynamic typing is recommended", they conclude, "when programs must be as flexible as possible". I recommend reading the Agile Manifesto to understand why maximal flexibility is crucial in most real-world application programming -- and therefore why, in said real world rather than in the more academic circles Dr. Van Roy and Dr. Haridi move in, dynamic typing is generally preferable, and not such a tiny issue as they make the difference to be. Still, they at least show more awareness of the issues, in devoting 3 excellent pages of discussion about it, pros and cons, than almost any other book I've seen -- most books have clearly delineated and preformed preferences one way or the other, so the discussion is rarely as balanced as that ;).
I'd start with recommending reading Steve Yegge's post on Is Weak Typing Strong Enough, then his post on Dynamic Languages Strike Back. That ought to at least get you started!
Let's do a few advantage/disadvantage comparisons:
Dynamic Languages:
Type decisions can be changed with minimal code impact.
Code can be written/compiled in isolation. I don't need an implementation or even formal description of the type to write code.
Have to rely on unit tests to find any type errors.
Language is more terse. Less typing.
Types can be modified at runtime.
Edit and continue is much easier to implement.
Static Languages:
Compiler tells of all type errors.
Editors can offer prompts like Intellisense much more richly.
More strict syntax which can be frustrating.
More typing is (usually) required.
Compiler can do better optimization if it knows the types ahead of time.
To complicate things a little more, consider that languages such as C# are going partially dynamic (in feel anyway) with the var construct or languages like Haskell that are statically typed but feel dynamic because of type inference.
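A tiny, invented illustration of that last point (the function shout is a hypothetical example): no type annotations anywhere, yet the program is fully statically checked because GHC infers the types itself.

    import Data.Char (toUpper)

    -- GHC infers shout :: String -> String on its own; a call such as
    -- shout 42 is still rejected at compile time.
    shout word = map toUpper word ++ "!"

    main = putStrLn (shout "hello")   -- prints HELLO!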
Dynamic programming languages basically do things at runtime that other languages do at compile time. This includes extending the program by adding new code, by extending objects and definitions, or by modifying the type system, all during program execution rather than compilation.
http://en.wikipedia.org/wiki/Dynamic_programming_language
Here are some common examples
http://en.wikipedia.org/wiki/Category:Dynamic_programming_languages
And to answer your original question:
They're slow, you need to use a basic text editor to write them - no IntelliSense or code prompts - and they tend to be a big pain in the ass to write and maintain. BUT the most famous one (JavaScript) runs on practically every browser in the world - that's a good thing, I guess. Let's call it "broad compatibility". I think you could probably get a dynamic language interpreter for most operating systems, but you certainly couldn't get a compiler for non-dynamic languages for most operating systems.

Most interesting non-mainstream language? [closed]

I'm interested in compilers, interpreters and languages.
What is the most interesting, but forgotten or unknown, language you know about? And more importantly, why?
I'm interested in compiled, interpreted and VM languages alike, but not esoteric languages like Whitespace or BF. Open source would be a plus, of course, since I plan to study and hopefully learn from it.
I love compilers and VMs, and I love Lua.
Lua is not as well supported as many other scripting languages, but with a mindset like yours I'm sure you will fall in love with Lua too. I mean, it's like Lisp (it can do anything Lisp can, as far as I know), has lots of the main features from Ada, plus it's got metaprogramming built right in, with functional programming and object-oriented programming loose enough to make any kind of domain language you might want. Besides, the VM's code is simple C, which means you can easily dig right into it and appreciate it even at that level.
(And it's open-source MIT license)
I am a fan of the D programming language. Here is a Wikipedia article and an intro from the official site.
Some snippets from the wikipedia article:
The D programming language, also known simply as D, is an object-oriented, imperative, multiparadigm system programming language by Walter Bright of Digital Mars. It originated as a re-engineering of C++, but even though it is predominantly influenced by that language, it is not a variant of C++. D has redesigned some C++ features and has been influenced by concepts used in other programming languages, such as Java, C# and Eiffel. A stable version, 1.0, was released on January 2, 2007. An experimental version, 2.0, was released on June 17, 2007.
on features:
D is being designed with lessons learned from practical C++ usage rather than from a theoretical perspective. Even though it uses many C/C++ concepts it also discards some, and as such is not strictly backward compatible with C/C++ source code. It adds to the functionality of C++ by also implementing design by contract, unit testing, true modules, garbage collection, first class arrays, associative arrays, dynamic arrays, array slicing, nested functions, inner classes, closures[2], anonymous functions, compile time function execution, lazy evaluation and has a reengineered template syntax. D retains C++'s ability to do low-level coding, and adds to it with support for an integrated inline assembler. C++ multiple inheritance is replaced by Java style single inheritance with interfaces and mixins. D's declaration, statement and expression syntax closely matches that of C++.
I guess a lot depends on what you mean by 'non-mainstream'.
Would lisp count as non-mainstream?
I would suggest having a look at Erlang - it's been getting a bit of press recently, so some of the learning resources are excellent. If you've used OO and/or procedural languages, Erlang will definitely bend your mind in new and exciting ways.
Erlang is a pure functional language, with ground-up support for concurrent, distributed and fault-tolerant programs. It has a number of interesting features, including the fact that variables aren't really variables at all - they cannot be changed once declared, and are in fact better understood as a form of pattern.
There is some talk around the blogosphere about building on top of the Erlang platform (OTP) and machine support for other languages like Ruby - Erlang would then become a kind of virtual machine for running concurrent apps, which would be a pretty exciting possibility.
I've recently fallen in love with Ocaml and functional languages in general.
Ocaml, for instance, offers the best of all possible worlds. You get code that compiles to executable native machine language as fast as C, or universally portable byte code. You get an interpreter to bring REPL-speed to development. You get all the power of functional programming to produce perfectly orthogonal structures, deep recursion, and true polymorphism. Atop all of this is support for Object-Orientation, which in the context of a functional language that already provides everything OOP promises (encapsulation, modularization, orthogonal functions, and polymorphic recyclability), means OOP that is forced to actually prove itself.
Smalltalk (see discussion linked here). Sort of the grand-daddy of the dynamic languages (with the possible exception of Lisp and SNOBOL). Very nice to work with and sadly trampled by Java and now the newer languages like Python and Ruby.
FORTH was a language designed for low-level code on early CPUs. Its most notable feature was RPN, stack-based math operations - the same type of math used on early HP calculators. For example, 1+2+3+4 would be written as 1 2 3 4 + + +.
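To make the RPN idea concrete, here is a small, hypothetical stack evaluator (the name rpn and the whole snippet are invented for illustration), written in Haskell rather than Forth since that's the language this thread revolves around:

    -- Evaluate a space-separated RPN expression, using a list as the stack.
    rpn :: String -> Integer
    rpn = head . foldl step [] . words
      where
        step (y:x:rest) "+" = (x + y) : rest
        step (y:x:rest) "*" = (x * y) : rest
        step stack      tok = read tok : stack   -- anything else: push a number

    main :: IO ()
    main = print (rpn "1 2 3 4 + + +")   -- 10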
Haskell and REBOL are both fascinating languages, for very different reasons.
Haskell can really open your eyes as a developer, with concepts like monads, partial application, pattern matching, algebraic types, etc. It's a smorgasbord for the curious programmer.
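Two of those concepts in miniature (an invented example, with made-up names Shape and area): an algebraic data type and pattern matching over its constructors.

    -- An algebraic data type: a Shape is either a Circle or a Rect.
    data Shape
      = Circle Double
      | Rect Double Double

    -- Pattern matching picks the code path from the constructor.
    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Rect w h) = w * h

    main :: IO ()
    main = mapM_ (print . area) [Circle 1, Rect 2 3]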
REBOL is no slouch either. It's deceptively simple at first, but when you begin to delve into concepts like contexts, PARSE dialects, and Bindology, you realize there's much more than meets the eye. The nice thing about REBOL is that it's much easier to get started with it than with Haskell.
I can't decide which I like better.
Boo targets the .NET framework and is open source. Inspired by Python.
Try colorForth.
PROLOG is a rule-based language with backtracking. You can produce very human-readable (as in prose) code.
I find constraint languages interesting, but it is hard to know what constitutes forgotten or unknown. Here are some languages I know about (this is certainly not an exhaustive list of any kind):
Ciao, YAP, SWI-Prolog, and GNU Prolog are all Prolog implementations. I think they are all open source. Ciao, GNU Prolog, and probably the others as well, as is common in Prolog implementations, support other constraint types - integer programming, for example.
Mozart and Mercury are both, as I understand it, alternative logic programming languages.
Alice is more in the ML family, but supports constraint programming using the GECODE C++ library.
Drifting a little bit off topic....
Maude is an interesting term rewrite language.
HOL and Coq are both mechanized proof systems which are commonly used in the languages community.
Lambda-the-Ultimate is a good place to talk about and learn more about programming languages.
I would have to say Scheme, especially in its R6RS incarnation.
Modula-2 is the non-mainstream language that I've found most interesting. Looks mainstream, but doesn't quite work like what we're used to. Inherits a lot from Pascal, and yet is different enough to provide interesting learning possibilities.
Have a look at Io at http://www.iolanguage.com/
or Lisaac at: https://gna.org/projects/isaac/
or Self at: http://self.sourceforge.net/
or Sather (now absolutely forgotten)
or Eiffel http://www.eiffel.com
Why? Here are a few reasons. Io is absolutely minimalistic and does not even have "control flow elements" as syntactic entities. Lisaac is a follow-up to Eiffel with many simplifications, AFAICT. Self is a follow-up to Smalltalk, and Io has taken quite a lot from Self too. The basic idea there is that the distinction between class and object has been given up. Sather is an answer to Eiffel with a few other rules and better support for functional programming (right from the start).
And Eiffel is definitely a hallmark for statically typed OO languages. Eiffel was the first language with support for design by contract, generics (aka templates) and one of the best ways to handle inheritance. It was and is one of the simpler languages still. For my part, I found the best libraries for Eiffel.....
Its creator just has one problem: he did not accept other contributions to the OO field.....
Regards
I recently learned of the existence of Icon from this question.
I have since used it in answers to several questions. (1, 2, 3, 4)
It's interesting because of its evaluation strategy - it is the only imperative language I know that supports backtracking. It allows some nice succinct code for many things :)
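For a flavour of what built-in backtracking buys you, here is a hypothetical Haskell analogue using the list monad (the pythagorean example is invented for illustration): each <- tries every alternative, and a failed guard quietly backtracks to the next choice.

    import Control.Monad (guard)

    -- All Pythagorean triples with sides up to n.
    pythagorean :: Int -> [(Int, Int, Int)]
    pythagorean n = do
      a <- [1 .. n]
      b <- [a .. n]
      c <- [b .. n]
      guard (a * a + b * b == c * c)   -- failure backtracks
      return (a, b, c)

    main :: IO ()
    main = print (pythagorean 20)

Icon bakes this generate-and-backtrack behaviour into ordinary expression evaluation, which is what makes it feel so different from other imperative languages.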
Learning any language that requires you to rethink your programming habits is a must. A sure sign is the pace at which you skim through the documentation of a language's core (not its library): fast means fruitless here.
My short list would be, in my order of exposure and what were the concepts I learned from them:
Assembly, C: great for learning pointers and their arithmetic.
C++: same as C with an introduction to generics, as long as you can stand the incredibly verbose syntax.
Ruby/Lua: scripting languages, dynamically typed, writing bindings for existing C libraries.
Python/C#/Java: skipped; these languages look to me like a rehash of notions originating elsewhere, with a huge standard library. Sure, the whole packages are nice, but you won't learn new concepts here.
OCaml: type inference done right, partial application, compiler-inferred genericity, immutability as the default, how to handle nulls elegantly (see the small sketch after this list).
Haskell: laziness by default, monads.
My €.02.
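A small, made-up Haskell snippet for two of the items above (portOf and the config list are invented for illustration): partial application, and handling "null" with an option type - OCaml's versions look almost the same.

    import Data.Maybe (fromMaybe)

    -- Partial application: lookup takes two arguments; giving it only the
    -- key yields a new function that waits for the association list.
    portOf :: [(String, Int)] -> Maybe Int
    portOf = lookup "port"

    main :: IO ()
    main = do
      let config = [("retries", 3), ("port", 8080)]
      -- Absence is an explicit Maybe value, never a null pointer.
      print (fromMaybe 80 (portOf config))   -- 8080
      print (fromMaybe 80 (portOf []))       -- 80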
I can't believe Logo is so forgotten. Ok, it's Logo. Sort of like lisp, but with slightly uglier syntax. Although working with lists in Logo, one encounters the delightfully named 'butfirst' and 'butlast' operations. =P
ML. Learning it and using it forces you to think about programming problems differently. It also grants one patience, in most cases. Most.
How about Go? It's brand new, so it's unknown and not mainstream (yet).
It's interesting because the syntax looks like what happens after you put C and Pascal into a jar and make 'em fight.
Well, once it was called MUMPS, but now it's called InterSystems Caché:
http://www.intersystems.com/cache/
First answer - Scheme. It's not too widely used, but definitely seems like a solid language to use, especially considering the robustness of DrScheme (which in fact compiles Scheme programs to native binary code).
After that - Haskell is incredibly interesting. It's a language which does lazy evaluation right, and the consequences are incredible (including such things as a one-line definition of the Fibonacci sequence).
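That one-liner, for the curious, is presumably something like the standard definition - it only works because evaluation is lazy, so the list can be defined in terms of itself:

    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

    main :: IO ()
    main = print (take 10 fibs)   -- [0,1,1,2,3,5,8,13,21,34]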
Going more mainstream, Python is still not really widely accepted in the business circles, but it definitely should be, by now...
Ken Kahn's ToonTalk, a cartoon language with hard-core theoretic underpinnings:
http://www.toontalk.com/
Prograph: http://en.wikipedia.org/wiki/Prograph ... seems Prograph lives on as Marten:
http://andescotia.com/products/marten/
Self's IDE was/is a thing of beauty, talk about Flow (in the Csíkszentmihályi sense)...
Overall, though, I'd have to say Haskell is the most interesting, for the potential advances in computing that it represents.
Harbour, for dynamic typing. A great option for business apps.
Reia!
http://wiki.reia-lang.org/wiki/Reia_Programming_Language
It's Erlang that makes sense; it's beautiful and I'm in love. It's so unknown that it doesn't even have a Wikipedia page!
The first major (non-BASIC) language that I learned was Dream Maker, from http://www.byond.com.
It's somewhat similar to C++ or Java, but it's largely pre-built for designing multiplayer online games. It's very much based on inheritance.
It's an interesting language, especially as a starting language: it gets gratifying results quicker, and let's be honest, most people who are first learning to program are interested in one thing... games.
I find Factor, Oz and OCaml quite interesting. In fact, I have started using Factor for personal projects.
Rebol, of course!
It's so simple but so powerful; learn it at http://reboltutorial.com
I've recently looked up a lot about Windows PowerShell.
While not necessarily just a language, it's an awesome shell that has a built-in scripting language - basically a super-beefed-up command-line shell.
Unlike Unix shells, where everything is string text (which definitely has its benefits), PowerShell commands (cmdlets) use objects. It's based on the .NET framework, so those of you who are familiar with that have probably already figured out that anything PowerShell returns can be piped, and the properties and methods of that object can be used. It's fun to say "everything is an object!" again, just like when OOP was getting big.
Very neat stuff. For the first time, Windows is implementing some of the Unix command-line interface tools similar to grep and the whole bunch.
If you're interested in VMs, you should look at Parrot... there are a bunch of languages supported, and that's pretty neat.
OCaml is a good language if you want to learn how to implement a compiler...

Resources