What is the tersest/densest commonly-used programming language currently available? - programming-languages

I refer you to the following video, which describes how to implement Conway's Game of Life in APL, using a few dozen keystrokes:
http://www.youtube.com/watch?v=a9xAKttWgP4
This video was featured prominently in the Return of Uncle Bob Martin podcast, in which Scott Hanselman complains that "his hands hurt" from programming in languages that require too many keystrokes.
Of course, none of us are going to replace our keyboard just to learn an old, obsolete programming language (or are we?), but I have heard that programmers can be two to three times more productive, depending on the language they are working in. Is it because they are working in "denser" languages?
What are the densest commonly used (practical) programming languages currently available? Do they improve productivity because they are dense?

just look at threads tagged code-golf
:)
From a definition of code golf ...
It seems that someone gives us a problem to solve, tags the question code-golf, and the winner is whoever completes the solution in the fewest characters.
Like in golf, where the low score wins, the solution with the fewest characters "wins". While the best solution is certainly not always the one with the fewest characters or fewest lines of code, it can be a fun way to exercise your programming muscles.

I think this question can only be answered if you consider the sort of support libraries that a language has available. For example I can do things in PHP using very few lines of code because there's loads of help for network requests, graphics processing, array and string handling etc., etc.
Using jQuery also means I am typing less when writing scripts. So the question isn't as simple as you suggest.
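The same point holds in any batteries-included language. As a rough sketch of the idea in Python (the URL is only a placeholder), fetching a page and counting its words takes a handful of lines because the standard library does the heavy lifting:
# Fetch a page and count its words; short only because the standard
# library already handles the networking and decoding.
from urllib.request import urlopen
with urlopen("https://example.com") as response:  # placeholder URL
    text = response.read().decode("utf-8", errors="replace")
print(len(text.split()), "words")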

It has to be J.

Nowadays both Perl and many functional languages can be very terse, although APL is still considered the champion there.
In terms of productivity, there is a level at which terseness helps (Python and Ruby are considered more productive than Java/C# because they are more terse), and then there is a level at which terseness makes the code very hard to read (APL is famous for that, as are short Perl scripts). One needs a balance between the two. Also, there are a number of autocompleting editors that allow, for example, longer variable names without requiring a lot of extra typing.
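As a small illustration of that balance, here is the same computation written twice in Python (the data is made up for the example), once densely and once spelled out:
numbers = [1, 2, 3, 4, 5, 6]  # made-up sample data
# Dense: one line, close to the mathematical description.
total = sum(n * n for n in numbers if n % 2 == 0)
# Spelled out: longer, but every step is visible to the reader.
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n
print(total)  # 56 either way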

Related

How do I compare programming languages for projects at work?

I am wondering what specific questions I should keep in mind when I am comparing programming languages for use on given work projects. For instance, I am told logic programming languages like Prolog are good for natural language processing. I'm not sure why exactly; I assume it is true because experts say so, but I don't know the considerations that guide them to that conclusion. So I am looking for a simple heuristic, a checklist of questions I can apply to evaluate programming languages and use to explain my decisions, so that I can say "Language X is good for Y because it does Z."
The only way I know of to figure out which programming language is most appropriate for a given problem, is to know lots of programming languages. After all, if you don't know screwdrivers exist, how will you know not to use a hammer when you encounter a screw?
Unfortunately, there are thousands (maybe tens of thousands) of programming languages, so learning even a significant portion of them is just not realistic.
However, programming languages implement paradigms, and Peter van Roy's famous poster only lists about 34 of those. He deliberately decided to ignore several aspects, including anything related to typing, so the real number is probably higher than that, but we can expect it to be well below 100.
That's still a lot, but thankfully paradigms aren't atomic either; they are composed of concepts. The poster lists about a dozen of those (again ignoring typing and a couple of other things). Significantly fewer than the paradigms.
Learning a significant portion of concepts is entirely feasible. Once you know them, you can look at a problem and see which concepts would be useful to have to build a solution. Then you look at which paradigms contain those concepts and which languages implement those paradigms. Pick one, learn it, use it, solve the problem.
And since you already know the concepts (and thus the paradigms) the language implements, you only need to learn the syntax, not the semantics. There aren't actually that many different syntaxes in the wild (C, C++, Objective-C, Objective-C++, D, Go, Java, C#, ECMAScript, PHP, Vala and many others share a lot of syntax, for example, as do Smalltalk, Self, Newspeak and Objective-C, SML, OCaml and F#, and so on), so chances are, you'll pick that up very quickly. (Besides, with today's modern IDEs that's much less of an issue anyway.)
One small point to bear in mind: if you are an expert in language X and you are asked to develop a program in domain Y for which language Z is supposed to be ideal -- will you deliver sooner and better by writing in the language you are an expert in, even if it is not (by some measures) ideal for the problem domain? Or will you deliver better and sooner by first learning a new language?
I think your search for a simple heuristic is in vain.
Start with what your team is familiar with. While there's a lot to be said for the philosophy that a great developer can pick up almost any language in short order, there's a practical side too: if you have a ton of .NET or Java coders, you're best served starting from that base.
Now, within both stacks you have options on things like functional programming (F#, Erlang, etc.) and other languages on the runtime your team is most familiar with. But it really does boil down to the culture, infrastructure, and (most importantly) the experience and flexibility of the individual developers on your team.
There are several factors to consider:
What is your local expertise? If you have a company full of C programmers, it's probably not worth retraining everybody to be Lisp programmers.
What are your libraries? If you have libraries that you want to use, make sure that they are compatible with your language of choice.
If you are starting a new project with a wide-open field of options, I would recommend taking a few sample problems out of the application domain. Nothing too complex, but nothing trivial either. Then, implement (or have someone from your team implement) these samples in each candidate language. Then, choose the one that is clear, easy, and appropriate.

Is similarity to "natural language" a convincing selling point for a programming language? [closed]

Look, for example, at AppleScript (and there are plenty of others, some admittedly quite good), which advertise their use of the natural-language metaphor. Code is apparently more readable because it can be, or is intended to be, constructed in English-like sentences, or so they say. I'm sure there are people who would like nothing better than to program using only English sentences. However, I have doubts about the viability of a language that takes that paradigm too far (excepting niche cases).
So, after a certain reasonable point, is natural-languaginess a benefit or a misfeature? What if the concept is carried to an extreme -- will code necessarily be more readable? Or might it be unnecessarily long, difficult to work with, and just as capable of producing hilarity on the scale of obfuscated Perl, obfuscated C, and eye-twisting Bash script logorrhea?
I am aware of some specialty cases like "Inform" that are almost pure English, but these have a niche that they're not likely to venture out from. I hear and read about how great it would be for code to read more like English sentences, but are there discussions of the possible disadvantages? If everyday language is so clear, simple, clean, lovely, concise, understandable, why did we invent mathematical notation in the first place?
Is it really easier to describe complex instructions accurately and precisely to a machine in natural language, or isn't something closer to mathematical markup a much better choice? Where should that line be drawn? And finally, are you attracted to languages that are touted as resembling English sentences? Should this whole question have just been a one liner:
naturalLanguage > computerishLanguage ? booAndHiss : cheerLoudly;
Well, of course, natural languages are rarely clear, simple, clean, lovely, concise, or understandable, which is one of the reasons most programming is done in languages far from natural.
My answer to this would be that the ideal programming language lies somewhere between a natural language and a very formal language.
On the one extreme, there's the formal, minimal, mathematical languages. Take for example Brainfuck:
,>++++++[<-------->-],[<+>-]<. // according to Wikipedia, this means addition
Or, what's somewhat preferable to the above mess, any type of lambda calculus.
λxy.x
λxy.y
This is one possible way of expressing the Boolean truth values in lambda calculus. Doesn't look very neat, especially when you build logical operators (such as AND being e.g. λpq.pqp) around them.
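For readers more comfortable with a mainstream language, here is a rough transcription of those terms into Python; it is only a sketch to show how the encoding behaves, not how anyone would actually write booleans:
# Church-encoded booleans: a boolean is a function that picks one of two arguments.
TRUE = lambda x: lambda y: x        # λxy.x
FALSE = lambda x: lambda y: y       # λxy.y
AND = lambda p: lambda q: p(q)(p)   # λpq.pqp
to_bool = lambda b: b(True)(False)  # convert back to a native bool for inspection
print(to_bool(AND(TRUE)(TRUE)))     # True
print(to_bool(AND(TRUE)(FALSE)))    # False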
I claim that most people could not write production code in such a minimalistic, hard-to-grasp language.
The problem on the other end of the spectrum, namely natural languages as they are spoken by humans, is that languages with too much complexity and flexibility allow the programmer to express vague and indefinite things that can mean nothing to today's computers. Let's take this sample program:
MAYBE IT WILL RAIN CATS AND DOGS LATER ON. WOULD YOU LIKE THIS, DEAR COMPUTER?
IF SO, PRINT "HELLO" ON THE SCREEN.
IF YOU HATE RAIN MORE THAN GEORGE DOES, PRINT SOME VAGUE GARBAGE INSTEAD.
(IN THE LATTER CASE, IT IS UP TO YOU WHERE YOU OUTPUT THAT GARBAGE.)
Now this is an obvious case of vagueness. But sometimes you would get things wrong with more reasonable natural language programs, such as:
READ AN INTEGER NUMBER FROM THE TERMINAL.
READ ANOTHER INTEGER NUMBER FROM THE TERMINAL.
IF IT IS LARGER THAN ZERO, PRINT AN ERROR.
Which number is IT referring to? And what kind of error should be printed? (You forgot to specify it.) You would have to be really careful to be extremely explicit about what you mean.
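Compare that with a minimal sketch of the same program in a formal language (Python here); every choice the English version left vague has to be spelled out explicitly:
first = int(input("Enter an integer: "))
second = int(input("Enter another integer: "))
if second > 0:  # "IT" now unambiguously means the second number
    print("Error: the second number must not be larger than zero.")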
It's already too easy to misunderstand other humans. How do you expect a computer to do better?
Thus, a computer language's syntax and grammar have to be strict enough not to allow ambiguity. A statement must evaluate in a deterministic way. (There may be corner cases; I'm talking about the general case here.)
I personally prefer languages with a very limited set of keywords. You can quickly learn such a language, and you don't have to choose between 10,000 ways of achieving one goal simply because there are 10,000 keywords for doing the same thing (as in: GO/WALK/RUN/TROD/SLEEPWALK/etc. TO THE FRIDGE AND GET ME A BEER!). It means that if you need to think about 10,000 different ways of doing something, it won't be due to the language, but due to the fact that there are 9,999 stupid ways to do it and 1 elegant solution that just shines more than all the others.
Note that I wrote all natural language examples in upper-case. That's because I sort of had good old GW-BASIC and COBOL in mind while I wrote this. There've been some examples of programming languages that lean on natural language, and I think history has shown that they are, in general, somewhat less widespread than e.g. terse C-style languages.
I recently read that according to Gartner there are over 400 billion lines of COBOL source code in active use worldwide today.
That doesn't prove anything other than that banks and governments are fond of their legacy code, but you could construe it as a testament to the success of English-like programming languages. I'm not aware of any other programming language that is so close to English and so verbose.
Aside from that, I tend to agree with the other respondents: Programmers prefer not to type so much, and in general a language based on mathematics-like shorthand is both more expressive and more precise than one based on English.
There's a point where terse, expressive code looks like line noise. Perl, APL and J come to mind as examples with "illegible one-liners." Programmers are humans, and it may be beneficial to leave them with some similarity to natural language to give their brains something familiar to hold on to. Thus, I advocate a happy medium that's reminiscent of, but not too close to, natural language.
"When a programming language is created that allows programmers to program in simple English, it will be discovered that programmers cannot speak English." ~ Unknown
In my (not so) humble opinion, no.
Natural language is full of ambiguities. Normally we do not think of them because humans can easily disambiguate them, based on many criteria often unavailable to the computer. First off, we have knowledge about the world (elephants don't fit in pajamas), but we also use more senses than just hearing when we speak to each other, body language to name one. The intonation and manner in which things are said also help a lot to disambiguate. It is harder to catch irony or sarcasm in written text, which is more or less a transcription of what we would say, more so in the case of IM, less so in the case of well-written articles. In general there are loads and loads of ambiguities in natural language, for instance where PPs (prepositional phrases) attach:
"Workers [dumped [sacks [with flour]]]"
"Workers [dumped [sacks] [with a fork-lift]]]"
Any human immediately tells where the PP will attach: it's reasonable to have sacks with flour in them, and it's reasonable to use a fork-lift to dump something. Another very troublesome area is the word "and", which messes up the grammar horrendously, or all the references we use, the pronouns in general, but also more complex references, e.g. "Bill bought a Dodge Viper, sadly the car was a lemon".
So we have three options: keep the ambiguities in and try to deal with them, accepting very many disambiguation errors and very, very slow parsing (no LALR or LL will work here); or try to make an artificial grammar resembling natural language while keeping it deterministic, which is more reasonable but still horrible: we then have a language that falsely resembles English but isn't, which is confusing. We get none of the benefits of a proper syntax and none of the benefits of natural language, but an oversized, overly wordy monstrosity with a difficult and unintuitive grammar, difficult to learn and slow to write.
The third option is realizing that we need a succinct way of expressing ourselves which can also be processed by a computer, not resembling any natural language but focusing on being an unambiguous description of an algorithm. This increases readability, especially if we compare it to a very precise natural-language counterpart. This is why many people prefer to also read the pseudo-code when dealing with difficult problems or advanced algorithms: it relieves us of the trouble of dealing with ambiguities, and is better suited to expressing computer instructions.
The issue isn't so much that it's easier to describe complex ideas using one approach or the other, but it certainly is easier for machines to understand machine languages. The biggest issue is, as always, ambiguity. Computers are terrible at handling it, so most grammars for programming languages need to be constructed to either remove all ambiguity, or the general language must be constructed so that ambiguity isn't actually a problem (this is tricky).
Any programming language that allows for ambiguity would be terribly error prone; and any natural language that doesn't allow ambiguity would be terribly verbose and convoluted (I'm looking at you, Lojban [ok, maybe Lojban isn't so bad, still…]).
The propensity some people show for preferring natural languages as programming languages might essentially be rooted in the desire to eventually be able to feed a physics textbook into a parser, whereupon it'll do your homework when asked.
Of course, that's not to say that programming languages shouldn't have hints of natural language: Especially for OOP it makes good sense to have calling grammar resemble natural grammar, like in Obj-C, which is sort of a game of mad libs:
[pot makeCoffee:strong withSugar:NO];
Doing the same in Brainfuck would be, well, a brainfuck; three full pages of code to flip a switch will do that to you.
In essence, the best languages are (probably) the ones that resemble natural languages without pretending to be one. (Avoiding the uncanny valley of programming languages, if there is such a thing, if you will. [Subclauses! Yay!])
A natural language is too ambiguous to be used as a programming language. It has to be artificially constrained to eliminate ambiguities.
But that defeats the purpose of having a "natural" programming language, because you keep its verbosity and get none of its advantages in expressiveness.
I think the fourth language I coded professionally in (after Fortran, Pascal and COBOL) was Natural, a pretty obscure 4GL of 1980s vintage for developing mainframe systems against an ADABAS database.
It was called Natural, I believe, because it had pretensions to be so. Supposedly management-readable like COBOL, but minus the fluff.
Which should tell you that attempts at 'natural' programming languages have a commercial history of over 30 years now (more if you count COBOL), but they have pretty much lost out to languages that don't pretend to be 'natural' but do allow the programmer to define the problem succinctly. When I first started coding, the 1GL -> 2GL -> 3GL evolution wasn't that old, and the progression to 4GL (defined then as more English-like programming languages) for mainstream work seemed an obvious next step. It hasn't worked out that way. If anything, getting up to speed with coding has got harder, because there are more abstract concepts to learn.
SQL was designed with natural language in mind originally. Fortunately it hasn't held on too tightly to this and advances since its conception are less "naturalistic".
But anybody who has tried to write a complicated query in SQL will tell you that it's not that easy. You have to worry about the scope of some keywords over your query. You end up with an incredibly hard-to-understand query that does some crazy shit, and you re-write it every time you need to change something because that's easier.
Natural language programming is a bad idea. The further you get from assembly, the more mistakes you can make, not in terms of logical errors or anything like that, but in terms of having the wrong assumption about how the script interpreter/bytecode interpreter/compiler makes your code run on the CPU.
It seems to be a great feature for beginners, or people who program as a "secondary activity". But I doubt you could reach the complexity and versatility of actual programming languages with natural language.
If there was a programming language that actually adhered to all of the conventions of the natural language it mimics, then that would be fantastic.
In reality, however, a lot of so-called "natural" programming languages have far stricter syntax than English, which means that although they are easily readable, it is debatable whether they are actually all that easy to write.
What makes sense in English is often a syntax error in AppleScript.
Everyday language isn't so clear, simple, clean, lovely, concise and understandable - to a computer. However, to a human, readability counts for a lot, and the closer you get to a natural language, the easier it is to read. That's why we're not all using assembly language.
If you have a completely natural language, there are a lot of things that need to be handled - the sentence needs to be parsed, each word must be understood - and there is plenty of room for ambiguity. That's generally not a good thing for a programming language, because then we're venturing into psychic programming - the computer has to figure out what you were thinking, which is not at all easy to get.
However, if you can make something sufficiently close to natural language - and yes, Inform 7 is probably the best example - so sentences look natural, but still have some structure you need to follow - then the code is almost instantly readable, even to people that don't know the language. There's usually also less specialized syntax to remember - because you're really just talking (a slightly modified form of) English - but if you have to do something out of the ordinary, then you might have to jump through some hoops to do that.
In practice, most languages don't bother with this, because that makes it easier for them to allow you to be precise. However, some will still hover closer to the "natural language". This can be a good thing: if you have to translate some pseudocode algorithm to a language, you don't need to manipulate it as much to make it work, reducing the risk that you make an error in the translation.
As an example, let's compare C and Pascal. This Pascal code:
for i := 1 to 10 do begin
  j := j + 1;
end;
is equivalent to this C code:
for (i = 1; i <= 10; i++) {
j = j + 1;
}
If you had no prior knowledge of either syntax, the Pascal version is generally going to be simpler to read, if only because it's not as complex as a C for.
Let's also consider operators. Pascal and C both share +, - and *. They also both have /, but with different semantics: In C, / does an integer division if both operands are integers; in Pascal, it always does a "real" division and uses div for integer division. That means that you have to take the types into account when figuring out what actually happens in that line of code.
C also has a bunch of other operators: &&, ||, &, |, ^, <<, >> - in Pascal, those operators are instead named and, or, and, or, xor, shl, shr. Instead of relying on some semi-arbitrary sequence of characters, it's spelled out more. It's instantly obvious that xor is - well, XOR - unlike the C version, where there's no obvious correlation between ^ and XOR.
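As an aside, Python 3 makes the same design choice as Pascal here, keeping "real" and integer division visibly distinct instead of overloading one symbol; a tiny sketch:
print(7 / 2)   # 3.5 -- true ("real") division, regardless of operand types
print(7 // 2)  # 3   -- floor (integer) division, spelled differently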
Of course, this is to some degree a matter of opinion: I much prefer a Pascal-like syntax to a C-like syntax, because I think it's more readable, but that doesn't mean everyone else does: A more natural language is usually going to be more verbose, and some people simply dislike that extra level of verbosity.
Basically, it's a matter of choosing what makes the most sense for the problem domain: if the problem domain is very limited (like with Inform), then a natural language makes perfect sense. If it's a very generic domain (like with C), then you either need far more advanced processing than we are currently capable of, or a lot of verbosity to fill in the details - and in that case, you have to choose a balance depending on what sort of users will be using the languages (for regular people, you need more naturalness, for people who know programming, they're usually comfortable enough with less natural languages and will prefer something closer to that end).
I think the question is: who reads and who writes the application code in question? I think that, regardless of the language or architecture, a trained software developer should be writing the code, and should be analyzing it as bugs arise.

Is Lua a language that a non-developer can learn quickly?

Let's say a tester is to do some programming to create automated tests ... is Lua really easy to learn for someone who is not a developer?
It depends on the particular non-developer in question. Some people will utterly block on any programming language at all. Some will easily grok many languages and basic programming concepts. There is no silver bullet for putting the power of programming in the hands of someone who is untested on it.
That being said, my personal feeling is that Lua is as good of a place to start as any other programming language.
The Lua language has an active and usually novice-friendly community. It has a long history of use on the boundary between non-programmers and programmers. The language reference manual and standard text book are among the best written examples I've seen in my career. The full text of the reference manual is online, and the first edition of Programming in Lua is as well, although the second edition of PiL reflects the differences in the language that happened after PiL was first published and is well worth the investment to purchase.
One of Lua's strengths is the ease with which it can be integrated into an existing system to construct a configuration and scripting interface to an application. That makes the development cost to adopt it relatively low. Its small size makes the impact on an application release remarkably low as well. Thus getting an existing system to the point where it can be scripted enough with Lua to use Lua as a basis for testing will likely be a straightforward task with few hidden obstacles.
Lua is very forgiving, which many people associate with "easy". You do not have to enter semicolons, you do not have to scope variables, and you can write all of your functions in the global scope. Of course, doing these things only makes your life easier when writing. When debugging, even a new programmer may soon see why taking these shortcuts is not such a good idea.
I also believe that you can write very simple, easy-to-use APIs in Lua, and you could also create very complex APIs, which may involve object-oriented concepts (such as the difference between . and :) or functional APIs with closures and passing functions around as arguments, etc. Whether the user is able to properly use and understand the language to do the task at hand depends on the API as much as, or more than, on the language.
I do believe Lua is an easier language to learn than many others, like Ruby and Python (and obviously Perl). Lua's grammar and syntax are more consistent than Ruby's for instance; in Ruby you have so many reserved keywords, plus all sorts of symbols (curly-brackets for blocks and pipes for local variables etc), plus it gives you too many options (you can either use curly-brackets for blocks, or you can use the keywords do and end to start and end blocks).
I believe that for non-programmers Lua is much easier especially because of the reasons outlined above. As for programmers, I've read many people say this very same thing and I agree: programming in Lua is very pleasant. I believe that's also because of what I said above.
It probably is, because it's very similar to Python:
The number of universities using Python in their introductory Comp Sci courses is probably the highest of any language (empirically, through Google). Second is probably Java and Scheme.
The number of Python libraries is astronomical. And the number of people that know the language is quite high, so if you hire a new person there is a good chance they have seen the language before.
Ironically, I have grown to not like the language, so I am not saying this because I am a Python fanboy.
As long as you clearly explain to the testers the pitfalls that they may face when debugging in Lua, it shouldn't be harder than learning the programming basics of any other language.
What goes through my mind is the situation where the tester made a typo and wrote a different, yet almost unnoticeable, name for a variable. The new variable will be created with the given value but the old variable won't be modified. That sort of thing can be pretty hard to debug when people are not extremely aware of it.
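For illustration, here is the same shape of bug sketched in Python (the names are invented for the example); in Lua a misspelled assignment would silently create a new global in just the same way:
retry_count = 0
def record_failure():
    global retry_count
    retry_cuont = retry_count + 1  # typo: creates 'retry_cuont', leaves 'retry_count' at 0
record_failure()
print(retry_count)  # still 0 -- and no error is raised anywhere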
I've been a programmer for more than ten years.
I've learned and used different programming languages.
I've heard about Lua on different occasions, but I've never used it before.
Recently, I decided to learn Lua because our customers are using it.
After spending days on reading the PiL, it turned out for me that Lua is a powerful, flexible, yet sophisticated programming language.
From a software developer point of view, I don't think it's easy for me to be a good coder in Lua in a short time.
But if you just want to 'do something' with Lua, especially if you come from a non-programmer background, you may feel pleased with Lua, in which it may be much easier to write some ready-to-use code than in a 'traditional' language such as Java, C/C++, or Python.

What is a good programming language to start my Grade 1 son learning? [duplicate]

Possible Duplicates:
How to get kids into programming
Suggestions on starting a child programming.
Is there a really simple programming language that I can use to teach my 6 year old son concepts of programming, syntax and logic?
I'm probably the only one here with this opinion, but I think 6 is too young to start a child on programming. Those years are critical for development of a whole host of skills, including social skills that are not computer-related (that, indeed, may be antithetical to computer use) and intellectual ones that actually will contribute to computing skills later on (I'm talking about math and problem-solving skills).
I've started introducing my kids to programming at the ages of 8 and 10, but I don't expect them to take a serious interest in it until their middle school years (starting at age 11/12). In general my kids spend much, much less time in front of a computer than their classmates. They both lead their classes academically and are well socially adjusted.
Logo. Designed specifically by Seymour Papert to teach children how to program, how to deal with recursion, etc., all without using those words to put people off. Particularly when linked to turtle graphics to give a readily available and recognisable output and feedback.
Because it was designed to cover all the fundamentals of programming it does not necessarily major in anything, but the idea is to give the children all the core fundamentals.
Try Scratch.
Take a look at Small Basic from Microsoft.
By providing a small and easy to learn programming language in a friendly and inviting development environment, Small Basic makes programming a breeze. Ideal for kids and adults alike, Small Basic helps beginners take the first step into the wonderful world of programming.
I think the quote sums it up, really! :)
Yes, there is Plain English Programming Language
Check out www.pythonturtle.org
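For a taste of the same turtle idea with nothing extra installed, Python's standard turtle module works too; a minimal sketch that draws a square, Logo-style:
import turtle
t = turtle.Turtle()
for _ in range(4):
    t.forward(100)  # move 100 pixels forward
    t.right(90)     # turn 90 degrees clockwise
turtle.done()       # keep the window open until it is closed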
Guido van Robot is a Logo-like application that uses Python.
I suggest Python via Snake Wrangling for Kids:
“Snake Wrangling for Kids” is a printable electronic book, for children 8 years and older, who would like to learn computer programming. It covers the very basics of programming, and uses the Python 3 programming language to teach the concepts.
Personally I think Tcl is perfect as a beginning language, especially for young people. It has an interactive console for instant gratification, and tk is by far one of the easiest GUI toolkits on the planet. One or two lines of code to see a window on a screen. Just a couple lines of code to create a canvas and draw rudimentary shapes, etc.
I know many people don't like Tcl, but I think that's more out of ignorance than anything else. And I mean that in a good way -- if you don't understand Tcl but know more traditional languages, it's hard to see the beauty in such a simple yet powerful language. The whole definition of the language fits in a single man page, so it's easy to grasp the fundamentals.
Finally, as a teaching tool it lets you recreate just about any language construct you wish. You can not only show them for and while loops, you can also create repeat/until loops, or any other type of looping, to emulate other languages.
I started learning programming in the hey-day of Pascal, a language which many would say is designed for learning. Here's a quote from Wikipedia:
Criticism
While very popular (although more so in the 1980s and early 1990s than now), implementations of Pascal which closely followed Wirth's initial definition of the language were widely criticized for being unsuitable for use outside of teaching.
Take that for what you will =)
Turbo Pascal? :) GW-BASIC? And next, Python :)
Well, Python has a very English-like syntax that makes it relatively easy to pick up. Python's IDLE works as a read-eval-print loop, so you don't have to go through compiling or anything. You can type code in line-by-line and get instant feedback. It also has an interactive help mode. If he needed to know what some function does, and you weren't there to help him, he could just type help(someFunction).
There was a comment about how it can become confusing when you mix tabs and spaces in Python. In response to that comment, most editors have an option to automatically replace tabs with X spaces. In IDLE, it's as simple as Format->Toggle Tabs to make it so whenever you press the tab key, it inserts 8 spaces instead of a tab.
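To illustrate the instant feedback described above, a first IDLE session might look something like this (the values are arbitrary):
>>> 2 + 3
5
>>> name = "Sam"
>>> "Hello, " + name
'Hello, Sam'
>>> help(len)  # built-in interactive help for any function
Help on built-in function len in module builtins:
...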
LabVIEW is completely visual. It's mainly used to program robots. It's extremely logic-oriented. However, there's quite a big price tag on it.
Smalltalk. It was created for educational use.
I have to agree that six years old sounds a bit young though... if they don't want to learn, don't try to force them.
I think the framework is important too. Your kid should be able to create a game without too much ado. Python + pygame springs to mind.
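As a rough sketch of what such a starter project could look like (it assumes pygame is installed, e.g. via pip install pygame; the window size and colours are arbitrary):
import pygame
pygame.init()
screen = pygame.display.set_mode((400, 300))
clock = pygame.time.Clock()
x = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    x = (x + 2) % 400  # move a ball across the screen
    screen.fill((0, 0, 0))
    pygame.draw.circle(screen, (255, 200, 0), (x, 150), 20)
    pygame.display.flip()
    clock.tick(60)     # cap at 60 frames per second
pygame.quit()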
http://en.wikipedia.org/wiki/Logo_(programming_language)

Resources for learning a new language quickly?

The title may seem slightly self-contradictory, and I accept that you can't really learn a language quickly. However, an experienced programmer who already has knowledge of a few languages and different styles (functional, OO, imperative, etc.) often wants to get started quickly. I've seen a few websites doing effective "translations" in the form of "just show me syntax equivalence". I can't remember the sites now, but for related languages (e.g. Perl/PHP) it's quite common.
Is there a better resource that covers more languages? Is there a resource that covers idioms as well as syntax? I think this would be incredibly useful for doing small amounts of work on existing code bases where you are not familiar with the language. Looking at the existing code, as we know, is not always a good indicator of quality. Likewise, for a "learn by doing" weekend project I always have the urge to write reasonably idiomatic, clean code from the start. Such a resource could also link to known good example projects of varying sizes for those who prefer to learn by reading. Reading a well-written, medium-sized code base can also be much more practical when access to development environments might be limited.
I think it's possible to find tutorials and summaries for individual languages that provide some of this functionality in disparate web locations but I'm hoping there is a good, centralised, comparative place that the busy programmer can turn to.
You generally have two main things to overcome:
Syntax
Reference
Syntax you can pick up fairly quickly with a language tutorial and a stack of sample code.
Reference (library/API calls) you need to find a proper guide to; perhaps the language reference, or perhaps google...
With those two in place, following a walkthrough (to get you used to using the development environment) will have you pretty much ready - you'll be able to look up what you want to say (reference), and know how to say it (syntax).
This, of course, applies principally to procedural/OOP languages; for languages that require a paradigm switch (ML/Haskell) you should go to lectures ;)
(and for the weirder moments, there's SO!)
In the past my favourite approach was "learning by doing". For example, I knew a little bit of C++ and a lot of C#/.NET, but I had to write an FTP tool in Python.
So I sat for an hour or so over a tutorial to see the syntax differences, then I developed the form itself and looked at the generated code. Then I searched for an open-source Python FTP client and took pieces of code from it (not copy and paste: write it yourself to see, feel and remember the code!).
After a few hours I got it.
So: the mix is the best. A book, a piece of good code, the willingness to learn, and a free night with plenty of coffee.
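For reference, a first cut of such an exercise in Python could start from the standard library's ftplib (the host name below is only a placeholder):
# A minimal FTP listing tool using only the standard library.
from ftplib import FTP
HOST = "ftp.example.com"  # placeholder -- substitute a real server
with FTP(HOST) as ftp:
    ftp.login()            # anonymous login
    ftp.cwd("/")           # change to the root directory
    ftp.retrlines("LIST")  # print a directory listing line by line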
At the risk of sounding cheesy, I would start with the language's website tutorial and/or FAQ, followed by asking more specific questions here. SO is my centralized location for programming knowledge.
I remember when I learned Perl. I was asked to modify some Perl code at work and I'd never seen the language before. I had experience with several other languages, however, so it wasn't hard to figure out the syntax with the online Perl docs in one window and the code in another, side-by-side. I don't know that solely reading existing code is necessarily the best way to learn. In my case, I didn't know Perl but I could tell that the person who originally wrote the code didn't know Perl either. I'm not sure I could've distinguished between good Perl and really confusing Perl. It would've been nice to be able to ask questions here at the time.
Language isn't important. What is important is learning your way around designing algorithms and the proper application of design patterns. Focus on the technique, not the language that implements a certain technique. Once you understand the proper development techniques, any programming language will become really easy, no matter how obscure it is...
When you put a focus on a language, you're restricting your own knowledge.
http://devcheatsheet.com/ seems to be a step in the right direction: it aggregates cheat sheets/quick references and they are (somewhat) manually reviewed. It's also wide-ranging. It still comes up a bit short in terms of an "idiomatic" quick reference: for example, the page on Ruby doesn't mention yield.
Rosetta Code appears to be an excellent resource that includes hints on coding idiomatically and moves from simple (like for-loops) to things like drawing. I haven't checked out how comprehensive it is, but there are a large number of languages and tasks listed. The drawbacks re: original question are:
Some of the linking is not accurate (navigating Python -> ForLoop will take you to the top of the ForLoop page, not the Python section). It's a wiki, so this can be improved.
Ideally you could "slice" the wiki however you choose, to see e.g. the top 20 tasks for two languages side-by-side.
http://hyperpolyglot.org/ seems to be an almost perfect match for what I was looking for. The quality is not always there, or idiom can be lacking, but it has the same intention and is pretty comprehensive.
