Is it better to use a "natural" language to write code? - nlp

I recently saw a programming language called Supernova, and its web page says:
The Supernova Programming language is
a modern scripting language and the
First one presents the concept of
programming with direct Fiction
Description using
Clear subset of pure Human Language.
and you can write code like:
i want window and the window title is Hello World.
i want button and button caption is Close.
and button name is btn1.
btn1 mouse click. instructions are
you close window
end of instructions
My question is not about the language itself, but rather: do we need such languages, and do they make writing code easier or not?

The code may look like natural language, but it's really just regular computer code with different keywords. In your example, I want is probably synonymous with new. It's not like you can use natural language directly and say make me a window instead (and if you could, things would get even uglier...).
Let's take a closer look at your code and its language implications:
i want window and the window title is Hello World.
i want means new; and denotes the beginning of the argument list. the <type_name> <member_name> is sets the instance variable member_name on the object being created. Note that you have to write the type_name twice.
i want button and button caption is Close.
and button name is btn1.
. ends a statement. However, you can 'chain' method calls on an object by starting the next statement with and. Also, how do you refer to a variable named Close instead of the string "Close"? Heck, we even have this problem in regular English: what's the difference between 'Say your name' and 'Say "your name"'?
btn1 mouse click. instructions are
you close window
end of instructions
mouse click is an identifier containing a space, should be mouseClick. instructions are defines a lambda (see the is vs. are keyword confusion causing trouble yet?). you close window calls window.close(). end of instructions is end of a lambda. All of these are longer than they need to be.
Remember all that? And those are only my guesses at the syntax, which could be completely wrong. Still seem simple? If so, try writing a larger program without breaking any of those rules, AND the additional rules you'll need to define things like conditional logic, loops, classes, generics, inheritance, or whatever else you'll need. All you're doing is changing the symbols in regular programming languages to 'natural language' equivalents that are harder to remember, unnecessarily verbose, and more ambiguous.
Try this translation:
var myWindow = new Window( title="Hello World" );
var btn1 = new Button( caption="Close", name="btn1" );
myWindow.addButton( btn1 );
btn1.onMouseClick = function() {
    myWindow.close();
};
See how each line maps to its counterpart in the previous example, but states the intent more directly? Natural language may be good for execution by humans, but it is terribly difficult to use for precise specifications.
The more you try to make English communicate these ideas easily and clearly, the more it's going to look like programming languages we already have. In short, programming languages are as close to natural language as we can get without losing clarity and simplicity. :D

Since the fundamental difficulty of programming is getting your thoughts ordered enough to tell the computer what to do, making the language more “natural” is highly unlikely to make it more accessible to non-programmers; the language itself was never the real problem in the first place. What's more, all that extra clutter of natural language doesn't help any programmers (worth the name) with what they're doing either, so why add it?
Or can we have a real natural language programming language, complete with “Um”, “Er”, and “Oh, I don't really know”? :-)

Edsger W. Dijkstra, On the foolishness of "natural language programming". I have nothing to add.

The programming language you have shown us above is extremely verbose (it seems even more so than COBOL).
This comes with several problems:
It takes a lot of code to do simple things.
Code grows unmaintainable fairly fast.
It takes a long time to find out what the code does.

Whether they're better or not is opinion, but that looks like some mutated cross of COBOL and BASIC, which is most definitely epically bad.
So in my opinion, no. I think somewhat to-the-point languages that still use readable verbs/adjectives/names are better (C++, C#, PHP, etc are my preferred languages).
Some languages start to get too high-level and/or verbose, making the actual logic so abstracted it's hard to know what does what. Some are too low-level and brief, forcing you to explicitly state everything you want done. A balance between readability and brevity, with power and flexibility, is what is best for development.

I'll be brave and offer a slightly different opinion here.
I haven't seen anything of this language other than what has been presented in the question, but looking at that, I'd have to say it probably is much more readable for a non-programmer. For a programmer, on the other hand, it won't hurt readability, but it won't help either, because we're used to reading code.
On the actual development side of things, I think it would be horrible to have to type so many extra bits, especially if, like us programmers, you're used to succinct keywords and constructions.
But let's imagine a far-off future where speech recognition is actually accurate. I think a language like this would be far easier to code by talking to your computer; I'd hate to have to specify every parenthesis and such. Not having to do that would help keep your train of thought.
In conclusion:
non-programmer readability: great.
programmer readability: no improvement.
codability typing: no, just...no.
codability speech: probably much easier.

person.eBusiness[text.this.author].like(language.supernova.idea.reverse);
language.English[ambiguous[very]];
person.eBusiness.suggest(create(language.Codetalk));
language.Codetalk.grammar.inspiration=language.programming.grammar;
language.Codetalk[new,better,ambiguous[not]];
if(person.all.use(language.Codetalk)){
    person.all.understand(person.all.communication);
};
question(language.Codetalk.idea[good]);

In my opinion there is no really useful advantage in using that kind of "human language", because you still need a syntax and special words. You have to learn both of them, and given that, it would not be much more difficult to learn a "programming language", which offers many advantages because it is oriented toward the machine's structure rather than the human way of thinking.
You will need to think the way the machine works if you want to write good programs, and a programming language is definitely more powerful than human language for expressing that way of thinking.

One problem with these languages is: say you write significant portions of your app in this language and then need different people to maintain, extend, or otherwise change the code.
Who are you going to get to do this? Nobody off the street is going to know it, and everyone else is going to face a steep learning curve.
And suppose you have a question about how to accomplish a task in the language; where do you turn?

Well, the perfect programming language would be an exact copy of the English language. You could just tell your computer to do some stuff the same way you might order a coffee or give homework to your class. However, such a language would be extremely difficult to implement (an advanced A.I. would be necessary).

Related

Creating programming languages and compiler designing. Are they related?

Alright, I guess this question has been asked a lot of times here.
I want to create a programming language, not necessarily starting today, but over a span of 2-3 years. I'm not a very good programmer, but I'm improving. What I wanted to ask is: how closely related are creating a language and writing a compiler?
Since a compiler translates a language from one form into another, I guess it's all about writing a compiler for a particular piece of text. So if I learn compiler design, will I be able to write my own programming language?
You can design a programming language without knowing anything about implementing compilers, and vice versa. The language designer can write a specification for the language, and a compiler implementor can then take that and create the compiler.
However, if this is a personal project, then you will probably have to learn how to do both. A programming language for which there is no compiler is purely theoretical, and it is difficult to figure out how good a programming language is without writing and executing real programs with it. Even if you do find someone willing to implement the compiler for you, you might not want to have to wait for that person every time you have a new idea to try, so you will want to know how to do it yourself.
Implementing a compiler is a pretty advanced programming project, so if you are just getting started as a programmer, you have a steep learning curve ahead of you. You might want to start by looking at the tutorials and examples for LLVM, although that might not actually be a suitable compiler infrastructure for your language.
Naruto, it depends on what kind of "language" you want to create. If it is a simple, just-for-learning language, and you choose the grammar and so on, you won't need to know a lot about programming. But if you are going to tackle a serious one, you will have to study at least one programming language in depth, not only to use it, but to grasp several of its concepts, for example OO, generics, lambda expressions, etc.
Believe me, this is not a task of months, but a serious journey. Anyway, I wish you luck ;)
Intimately related. You really don't have a language unless you have a way to interpret/compile it into an executable form.
It depends on what you mean by "compiler". Compilers/interpreters usually consist of two big parts: a parser part, which reads a text in your language and builds an internal structure (AST) out of it, and a code generation/interpretation part, which reads the AST and translates it to machine or byte codes. While you definitely will need to know how to write a parser for your language, code generation is less important, at least, at the early stages. You can start by simply translating your language to C and see where you go from there.
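As a rough illustration of those two halves, here is a minimal sketch in Python (the toy grammar and every name in it are made up for illustration): a tokenizer and recursive-descent parser build an AST, and a tiny evaluator stands in for the code-generation stage:

import re

def tokenize(src):
    # Split e.g. "1 + 2 * 3" into number and operator tokens.
    return re.findall(r"\d+|[+*]", src)

def parse_term(tokens):
    # term := number ('*' number)*   -- '*' binds tighter than '+'
    node = ("num", int(tokens.pop(0)))
    while tokens and tokens[0] == "*":
        tokens.pop(0)
        node = ("*", node, ("num", int(tokens.pop(0))))
    return node

def parse_expr(tokens):
    # expr := term ('+' term)*
    node = parse_term(tokens)
    while tokens and tokens[0] == "+":
        tokens.pop(0)
        node = ("+", node, parse_term(tokens))
    return node

def evaluate(node):
    # Interpretation is just a walk over the AST; a compiler would
    # instead emit machine code, bytecode, or C from the same tree.
    if node[0] == "num":
        return node[1]
    op, lhs, rhs = node
    return evaluate(lhs) + evaluate(rhs) if op == "+" else evaluate(lhs) * evaluate(rhs)

ast = parse_expr(tokenize("1 + 2 * 3"))
print(ast)            # ('+', ('num', 1), ('*', ('num', 2), ('num', 3)))
print(evaluate(ast))  # 7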

Is similarity to "natural language" a convincing selling point for a programming language? [closed]

Look, for example, at AppleScript (and there are plenty of others, some admittedly quite good), which advertises its use of the natural language metaphor. Code is apparently more readable because it can be/is intended to be constructed in English-like sentences, or so they say. I'm sure there are people who would like nothing better than to program using only English sentences. However, I have doubts about the viability of a language that takes that paradigm too far (excepting niche cases).
So, after a certain reasonable point, is natural-languaginess a benefit or a misfeature? What if the concept is carried to an extreme -- will code necessarily be more readable? Or might it be unnecessarily long, difficult to work with, and just as capable of producing hilarity on the scale of obfuscated Perl, obfuscated C, and eye-twisting Bash script logorrhea?
I am aware of some specialty cases like "Inform" that are almost pure English, but these have a niche that they're not likely to venture out from. I hear and read about how great it would be for code to read more like English sentences, but are there discussions of the possible disadvantages? If everyday language is so clear, simple, clean, lovely, concise, understandable, why did we invent mathematical notation in the first place?
Is it really easier to describe complex instructions accurately and precisely to a machine in natural language, or isn't something closer to mathematical markup a much better choice? Where should that line be drawn? And finally, are you attracted to languages that are touted as resembling English sentences? Should this whole question have just been a one liner:
naturalLanguage > computerishLanguage ? booAndHiss : cheerLoudly;
Well, of course, natural languages are rarely clear, simple, clean, lovely, concise, or understandable, which is one of the reasons that most programming is done in languages far from natural.
My answer to this would be that the ideal programming language lies somewhere between a natural language and a very formal language.
At one extreme, there are the formal, minimal, mathematical languages. Take, for example, Brainfuck:
,>++++++[<-------->-],[<+>-]<. // according to Wikipedia, this means addition
Or, what's somewhat preferable to the above mess, any type of lambda calculus.
λxy.x
λxy.y
This is one possible way of expressing the Boolean truth values in lambda calculus. Doesn't look very neat, especially when you build logical operators (such as AND being e.g. λpq.pqp) around them.
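If it helps to see those encodings run, here is a direct transliteration into Python lambdas (just a sketch; to_bool is a hypothetical helper for inspecting results):

TRUE  = lambda x: lambda y: x        # λxy.x
FALSE = lambda x: lambda y: y        # λxy.y
AND   = lambda p: lambda q: p(q)(p)  # λpq.pqp

# Hypothetical helper: collapse a Church boolean to a Python bool.
to_bool = lambda b: b(True)(False)

print(to_bool(AND(TRUE)(TRUE)))   # True
print(to_bool(AND(TRUE)(FALSE)))  # False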
I claim that most people could not write production code in such a minimalistic, hard-to-grasp language.
The problem on the other end of the spectrum, namely natural languages as they are spoken by humans, is that languages with too much complexity and flexibility allow the programmer to express vague and indefinite things that can mean nothing to today's computers. Let's take this sample program:
MAYBE IT WILL RAIN CATS AND DOGS LATER ON. WOULD YOU LIKE THIS, DEAR COMPUTER?
IF SO, PRINT "HELLO" ON THE SCREEN.
IF YOU HATE RAIN MORE THAN GEORGE DOES, PRINT SOME VAGUE GARBAGE INSTEAD.
(IN THE LATTER CASE, IT IS UP TO YOU WHERE YOU OUTPUT THAT GARBAGE.)
Now this is an obvious case of vagueness. But sometimes you would get things wrong with more reasonable natural language programs, such as:
READ AN INTEGER NUMBER FROM THE TERMINAL.
READ ANOTHER INTEGER NUMBER FROM THE TERMINAL.
IF IT IS LARGER THAN ZERO, PRINT AN ERROR.
Which number is IT referring to? And what kind of error should be printed? (You forgot to specify it.) You would have to be really careful to be extremely explicit about what you mean.
It's already too easy to misunderstand other humans. How do you expect a computer to do better?
Thus, a computer language's syntax and grammar has to be strict enough so that it doesn't allow ambiguity. A statement must evaluate in a deterministic way. (There are maybe corner cases; I'm talking about the general case here.)
I personally prefer languages with a very limited set of keywords. You can quickly learn such a language, and you don't have to choose between 10,000 ways of achieving one goal simply because there's 10,000 keywords for doing the same thing (as in: GO/WALK/RUN/TROD/SLEEPWALK/etc. TO THE FRIDGE AND GET ME A BEER!). It means if you need to think about 10,000 different ways of doing something, it won't be due to the language, but due to the fact that there are 9,999 stupid ways to do it, and 1 elegant solution that just shines more than all the others.
Note that I wrote all natural language examples in upper-case. That's because I sort of had good old GW-BASIC and COBOL in mind while I wrote this. There've been some examples of programming languages that lean on natural language, and I think history has shown that they are, in general, somewhat less widespread than e.g. terse C-style languages.
I recently read that according to Gartner there are over 400 billion lines of COBOL source code in active use worldwide today.
That doesn't prove anything other than that banks and governments are fond of their legacy code, but you could construe it as a testament to the success of English-like programming languages. I'm not aware of any other programming language that is so close to English and so verbose.
Aside from that, I tend to agree with the other respondents: Programmers prefer not to type so much, and in general a language based on mathematics-like shorthand is both more expressive and more precise than one based on English.
There's a point where terse, expressive code looks like line noise. Perl, APL and J come to mind as examples with "illegible one-liners." Programmers are humans, and it may be beneficial to leave them with some similarity to natural language to give their brains something familiar to hold on to. Thus, I propagate a happy medium that's reminiscent of but not too close to natural language.
"When a programming language is created that allows programmers to program in simple English, it will be discovered that programmers cannot speak English." ~ Unknown
In my (not so) humble opinion, no.
Natural language is full of ambiguities. Normally we do not think of them, because humans can easily disambiguate them based on many criteria that are often unavailable to the computer. First off, we have knowledge about the world (elephants don't fit in pajamas), but we also use more senses than just hearing when we speak to each other, body language to name one. The intonation and manner in which things are said also help a lot to disambiguate. It is harder to catch irony or sarcasm in written text, which is more or less a transcription of what we would say (more so in the case of IM, less in the case of well-written articles). In general, there are loads and loads of ambiguities in natural language, for instance in where the PPs, prepositional phrases, attach:
"Workers [dumped [sacks [with flour]]]"
"Workers [dumped [sacks] [with a fork-lift]]]"
Any human immediately tells where the PP attaches: it's reasonable to have sacks with flour in them, and it's reasonable to use a fork-lift to dump something. Another very troublesome area is the word "and", which messes up the grammar horrendously, as do all the references we use, pronouns in general, but also more complex references, e.g. "Bill bought a Dodge Viper; sadly the car was a lemon".
So we have three options. We can keep the ambiguities in and try to deal with them, accepting very many disambiguation errors and very, very slow parsing; no LALR or LL parser will work here. Or we can try to make an artificial grammar resembling natural language while keeping it deterministic, which is more reasonable but still horrible: we end up with a language that falsely resembles English but isn't, which is confusing. We get none of the benefits of a proper syntax and none of the benefits of natural language, just an oversized, wordy monstrosity with a difficult and unintuitive grammar that is hard to learn and slow to write.
The third way is realizing that we need a succinct way of expressing ourselves that can also be processed by a computer, one that doesn't resemble any natural language but focuses on being an unambiguous description of an algorithm. This increases readability, especially when compared to a very precise natural-language counterpart. This is why many people prefer to read the pseudo-code when dealing with difficult problems or advanced algorithms: it relieves us of the trouble of dealing with ambiguities and is better suited to expressing computer instructions.
The issue isn't so much that it's easier to describe complex ideas using one approach or the other, but it certainly is easier to understand machine languages (at least for machines). The biggest issue is, as always, ambiguity. Computers are terrible at understanding it, so most grammars for programming languages are constructed to remove all ambiguity, or else the language as a whole must be constructed so that ambiguity isn't actually a problem (this is tricky).
Any programming language that allows for ambiguity would be terribly error prone; and any natural language that doesn't allow ambiguity would be terribly verbose and convoluted (I'm looking at you, Lojban [ok, maybe Lojban isn't so bad, still…]).
The propensity some people show for preferring natural languages as programming languages might essentially be rooted in the desire to eventually be able to feed a physics textbook into a parser, whereupon it will do your homework when asked.
Of course, that's not to say that programming languages shouldn't have hints of natural language: Especially for OOP it makes good sense to have calling grammar resemble natural grammar, like in Obj-C, which is sort of a game of mad libs:
[pot makeCoffee:strong withSugar:NO];
Doing the same in BrainFuck would be, well, a brainfuck; three full pages of code to flip a switch will do that to you.
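For comparison, Python gets a similar named-slot readability from plain keyword arguments; this is only a sketch, and Pot, make_coffee, strength and sugar are hypothetical names:

class Pot:
    def make_coffee(self, strength="strong", sugar=False):
        # The call site names each argument, much like the Obj-C message.
        print(f"Brewing {strength} coffee, sugar: {sugar}")

pot = Pot()
pot.make_coffee(strength="strong", sugar=False)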
In essence, the best languages are (probably) the ones that resemble natural languages without pretending to be one. (Avoiding the uncanny valley of programming languages, if there is such a thing. [Subclauses! Yay!])
A natural language is too ambiguous to be used as a programming language. It would have to be artificially constrained to eliminate ambiguities.
But that defeats the purpose of having a "natural" programming language, because you get its verbosity and none of its advantages in expressiveness.
I think the fourth language I coded professionally in (after Fortran, Pascal and Cobol) was Natural. Which is a pretty obscure 4GL of 1980's vintage for developing mainframe systems against an ADABAS database.
Called Natural, I believe, because it had pretensions to be so. Supposedly management-readable like COBOL, but minus the fluff.
Which should tell you that attempts at 'natural' programming languages have a commercial history of over 30 years now (more if you count COBOL), but they have pretty much lost out to languages that don't pretend to be 'natural' yet do allow the programmer to define the problem succinctly. When I first started coding, the 1GL -> 2GL -> 3GL evolution wasn't that old, and the progression to 4GL (defined then as more English-like programming languages) for mainstream work seemed the obvious next step. It hasn't worked out that way. If anything, getting up to speed with coding has become harder, because there are more abstract concepts to learn.
SQL was originally designed with natural language in mind. Fortunately it hasn't held on too tightly to this, and advances since its conception have been less "naturalistic".
But anybody who has tried to write a complicated query in SQL will tell you that it's not that easy. You have to worry about the scope of certain keywords over your query. You end up with an incredibly hard-to-understand query that does some crazy shit, and you rewrite it every time you need to change something, because that's easier.
Natural language programming is a bad idea. The further you get from assembly, the more mistakes you can make, not in terms of logical errors or anything like that, but in terms of having the wrong assumptions about how the script interpreter/bytecode interpreter/compiler makes your code run on the CPU.
It seems to be a great feature for beginners, or for people who program as a "secondary activity". But I doubt you could reach the complexity and versatility of actual programming languages with natural language.
If there was a programming language that actually adhered to all of the conventions of the natural language it mimics, then that would be fantastic.
In reality, however, a lot of so-called "natural" programming languages have far stricter syntax than English, which means that although they are easily readable, it is debatable whether they are actually all that easy to write.
What makes sense in English is often a syntax error in AppleScript.
Everyday language isn't so clear, simple, clean, lovely, concise and understandable - to a computer. However, to a human, readability counts for a lot, and the closer you get to a natural language, the easier it is to read. That's why we're not all using assembly language.
If you have a completely natural language, there are a lot of things that need to be handled - the sentence needs to be parsed, each word must be understood - and there is plenty of room for ambiguity. That's generally not a good thing for a programming language, because then we're venturing into psychic programming - the computer has to figure out what you were thinking, which is not at all easy to get.
However, if you can make something sufficiently close to natural language - and yes, Inform 7 is probably the best example - so sentences look natural, but still have some structure you need to follow - then the code is almost instantly readable, even to people that don't know the language. There's usually also less specialized syntax to remember - because you're really just talking (a slightly modified form of) English - but if you have to do something out of the ordinary, then you might have to jump through some hoops to do that.
In practice, most languages don't bother with this, because that makes it easier for them to allow you to be precise. However, some will still hover closer to the "natural language". This can be a good thing: if you have to translate some pseudocode algorithm to a language, you don't need to manipulate it as much to make it work, reducing the risk that you make an error in the translation.
As an example, let's compare C and Pascal. This Pascal code:
for i := 1 to 10 do begin
    j := j + 1;
end;
is equivalent to this C code:
for (i = 1; i <= 10; i++) {
    j = j + 1;
}
If you had no prior knowledge of either syntax, the Pascal version is generally going to be simpler to read, if only because it's not as complex as a C for.
Let's also consider operators. Pascal and C both share +, - and *. They also both have /, but with different semantics: In C, / does an integer division if both operands are integers; in Pascal, it always does a "real" division and uses div for integer division. That means that you have to take the types into account when figuring out what actually happens in that line of code.
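As an aside, Python 3 settled the same question the Pascal way, with two visually distinct operators, so the symbol alone tells you which division you get:

print(7 / 2)   # 3.5 -- '/' is always real division
print(7 // 2)  # 3   -- '//' is integer (floor) division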
C also has a bunch of other operators: &&, ||, &, |, ^, <<, >> - in Pascal, those operators are instead named and, or, and, or, xor, shl, shr. Instead of relying on some semi-arbitrary sequence of characters, it's spelled out more. It's instantly obvious that xor is - well, XOR - unlike the C version, where there's no obvious correlation between ^ and XOR.
Of course, this is to some degree a matter of opinion: I much prefer a Pascal-like syntax to a C-like syntax, because I think it's more readable, but that doesn't mean everyone else does: A more natural language is usually going to be more verbose, and some people simply dislike that extra level of verbosity.
Basically, it's a matter of choosing what makes the most sense for the problem domain: if the problem domain is very limited (like with Inform), then a natural language makes perfect sense. If it's a very generic domain (like with C), then you either need far more advanced processing than we are currently capable of, or a lot of verbosity to fill in the details - and in that case, you have to choose a balance depending on what sort of users will be using the languages (for regular people, you need more naturalness, for people who know programming, they're usually comfortable enough with less natural languages and will prefer something closer to that end).
I think the question is, who reads and who writes the application code in question? I think, regardless of the language or architecture, a trained software developer should be writing the code, and analyze the code as bugs arise.

New or not so well-known paradigms, syntax features and behaviours of programming languages?

I've designed some educational programming languages and interpreters for them, but my problem has always been that they end up "normal" and "boring", mostly similar to some kind of existing language (ASM and BASIC).
I find it really hard to come up with new ideas for syntax features, "neat things", and new or heavily modified programming paradigms. I always thought that it was hard to come up with good new things, as opposed to fun/useless new things, which is what this case calls for.
I wondered if you could help me out with your creativity:
What features, in terms of language syntax and built-in functions, and maybe even new paradigms, can I work into my language to keep it useless but more fun, enjoyable, interesting and/or different to program in?
I always thought that it was hard to come up with good new things
You were right. This is why John Backus, Ken Iverson, Niklaus Wirth, Robin Milner, Kristen Nygaard and Ole-Johan Dahl, Alan Kay, and Barbara Liskov all won Turing Awards—they contributed good new ideas to the design of programming languages.
If you want to add a dash of interest to your own designs, these are excellent people to steal from.
Both ASM and BASIC are imperative languages, so you might want to consider features of functional programming languages, especially lambdas and maps. You might also want to consider interesting flows of control, for example, being able to throw an exception and then later, as a result of catching the exception and making a certain call, resume from the point that the exception was thrown (albeit using a modified environment). Also, co-routines, or other forms of language-level parallelism are often interesting.
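A quick Python sketch of the first and last of those suggestions (lambdas with map, and a generator standing in for a bare-bones co-routine):

# Lambdas and maps: apply an anonymous function across a sequence.
squares = list(map(lambda n: n * n, range(5)))
print(squares)  # [0, 1, 4, 9, 16]

# A generator suspends at each yield and resumes right there on the
# next call to next(), which is the essence of a co-routine.
def counter():
    n = 0
    while True:
        yield n
        n += 1

c = counter()
print(next(c), next(c), next(c))  # 0 1 2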
In addition to Michael's comment on functional languages, look at closures and blocks (like they're done in Objective-C). Those let you treat functions or pieces of code as first-class objects that you can pass around and call on demand. Some cool stuff can be done with that, and it's also shaping up to becoming the paradigm for programming massively multi-core systems.
You could also look into currying, which means binding some of a function's parameters, so you can then use it on fewer arguments. That way, you could create a base-b logarithm function, which you could curry to create functions for the base-2, base-10, etc. logarithm.
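In Python, that currying idea might look like this sketch using functools.partial (log_base is a hypothetical name):

import math
from functools import partial

def log_base(b, x):
    # Base-b logarithm via the change-of-base rule.
    return math.log(x) / math.log(b)

log2 = partial(log_base, 2)    # bind b = 2
log10 = partial(log_base, 10)  # bind b = 10
print(log2(8))      # 3.0
print(log10(1000))  # ~3.0, up to floating-point error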
And something less functional (as in language): look at Ruby's way of treating everything as an object (even numbers), you can do quite a bit with that. Like an object-oriented runtime with introspection, an interpreter "for free," etc. Implementing OOP stuff is easier than you'd think.
A lot of stuff has been done in the last 30-odd years; don't restrict yourself to 70s-style programming! ;) If you're looking for inspiration, check out Ruby, Python, Scala, Objective-C, JavaScript (read Douglas Crockford's JavaScript: The Good Parts), etc.
The Esolang wiki gives a good sample of the weirds and wonderfuls of all kinds of esoteric programming languages, including many user creations. Perhaps some inspiration for something sane lies therein.
Look at Forth. It is something original. Too original.
INTERCAL has plenty of unusual language features B-)
I've always thought it would be neat to apply CSP to a stack-based language. Could get pretty interesting.
See Wikipedia: Programming Languages. There are many useful links, especially in the Taxonomies section.
So much of the "new" is really just "forgotten old". I will hold my thoughts on some of the "popular" programming languages of the day.
There are many things that could be explored and active research is being done on some of them. Some of the things I think would be useful are:
real continuations in a non-functional language
here is an attempt to add them on to C++: http://mainisusuallyafunction.blogspot.co.nz/2012/02/continuations-in-c-with-fork.html
languages that let the user create new syntax elements
FORTH and J might be starting points.
Pogoscript is interesting as well, because flow control constructs like if/elseif/else and while/wend aren't special and can be created in user code.
custom user-defined operators actually aren't new: I think Haskell, Nemerle, Kaleidoscope and several others already do this, but even that wouldn't be "boring"

Should programming languages be intuitive?

What features could be added to a new programming language to make it more "intuitive"? When it comes to websites and desktops, we favor high usability, almost intuitive usability. It is becoming increasingly expected that your application should "just work". For a certain class of applications, the idea that one has to RTFM is a mark against the effectiveness of the application. People tend to expect the application to just work the way they "think" it should work. One could argue that this is a worthy standard that designers should strive for.
Can the same usability rigor apply to programming languages and developer environments? I realize there are tools like IntelliSense that provide hints, and a good IDE provides a lot of assist. But what about the core language itself? What could be added (or removed) that makes certain programming techniques or algorithms more obvious to implement? How does one make regular expressions or recursion more intuitive? Or is this just folly?
Take a more concrete example: liquid layouts in HTML, CSS, or Flex and MXML. In HTML and CSS, the box model is anything but intuitive, given the different implementations of Internet Explorer and the other browsers. And unless someone reads the documentation or studies the concept of the box model, it would be difficult to "just get it" when designing a layout on one's first stab at CSS. I would argue this is why tables thrived in the early days. The box model was implicit in the concept of a table cell. With the help of tools like Dreamweaver, one could get one's mind around percentage widths and layout within the constraints of table cells. Then CSS came into maturity, and a whole set of valid reasons emerged for why tables are not for layout. But to achieve the same effects, designers had to really study the CSS implementations and the box model, and inject a new layer of abstraction into their thinking.
In another example, I find that when programming lots of things in ActionScript and MXML, the whole concept of fluid layouts and percentage-based widths of elements is not very obvious and doesn't always follow intuition. I understand the basic problem: the Adobe Flash player and the layout need to understand things in absolute pixel terms. When it comes to the potential width of a component, I understand why percentages are not immediately obvious to implement at the core level of the code. Theoretically speaking, the Flash Player needs to know (or calculate) the exact width of a component so that it can provide the proper geometry to the video card when doing a draw on the screen. But when you introduce some concept of percentages, you introduce the theoretical possibility of an infinite width. And to find "infinity - 1" pixels is not something a computer can directly do without some layer of abstraction and calculation. The viewport must be referenced. The program must know its boundaries. So absolute widths are the norm, although humans might prefer to design in terms of percentages.
When it comes to programming languages, can there be expressions and features that assist intuition when thinking about a programming task? Or are we better off "thinking like a computer" and just RTFM'ing when we need to understand how to implement some feature or layout in code?
If you could change the syntax or semantics of your programming language of choice, what would you add, change, or remove to improve the "intuitiveness" of it?
Addendum: the reason for asking this question is inspired by seeing examples of what "novices" were able to achieve in Smalltalk in Alan Kay's lecture "Doing with Images Makes Symbols".
"If you could change the syntax or semantics of your programming language of choice what would you add, change, or remove to improve the "intuitiveness" of it?
"
Programming is hard. Really hard. Syntax changes don't matter much. IDEs are irrelevant to the fundamental challenge of programming.
The thing that is often baffling is the semantics of the language.
I don't know what "intuitive" means with respect to a thing as abstract as a programming language. Indeed, "intuition" is probably a bad thing. Coming to a programming language with intuition means preconceived notions, biases and intellectual junk will take over.
I would never expect to "just get it" for anything on any level anywhere. Programming requires clear thinking -- not "intuition" -- not "expectation".
The only thing we can ever do is read the manual and understand the unique, distinct, novel semantics of the new thing we're confronted with.
I do know this: elegant simplicity is essential. Orthogonality of features. Clarity. Precision. Absence of exceptions or special cases. Above all, simplicity.
Layering on language features is fundamentally bad.
Covering language problems by layering in a complex IDE is worse.
See http://www.cs.utexas.edu/~EWD/transcriptions/EWD08xx/EWD854.html
"when faced with something new and unfamiliar we try to relate it to what we are familiar with. In the course of the process we invent the analogies that enable us to do so.
It is clear that the above way of trying to understand does not work too well when we are faced with something so radically new, so without precedent, that all analogies we can come up with are too weak and too shallow to be of great help. A radically new technology can create such circumstances and the wide-spread misunderstanding about programming strongly suggests that this has happened with the advent of the automatic computer. "
In short, "intuition" and "intellectual baggage" is the problem of the programmer. The best way to understand a technology is to approach it as something fresh, new and otherwise unknown.
Bottom Line.
The complexity is inherent.
You have two choices.
Develop intellectual tools (i.e., abstraction, summarization, etc.) to cope with it.
Get a job in another field.
Asking for the inherently complicated world of computing to morph into something any one person finds "intuitive" can't happen. Computing is too complicated to be "intuitive".
Another field I've seen that addresses the complexity of the "syntax" of programming languages is that of Visual Programming Languages. The basic idea behind VPLs is to take the constructs of programming languages (decisions, subroutines, functions, etc.) and represent them graphically, typically as a data-flow diagram. One such language that's gaining popularity recently is the Microsoft Visual Programming Language. I have not used it, and cannot make claims as to its power, but I have used LabVIEW to great effect and I can say that you can do pretty much anything you can think of, even in LabVIEW -- but you do have to think of it in a very different way.
That said, I find I have a personal preference for code rather than VPLs.
One step folks are taking that has as much to do with the base class library as it does the language itself -- although to be honest, the two are often synonymous -- is the concept of a Fluent API. The basic idea is to make code "read like a sentence", the hope being that this makes the code more flexible and maintainable.
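A minimal sketch of that fluent style in Python (Query, where, order_by and the example fields are all hypothetical); each method returns self, so the calls chain into a sentence-like line:

class Query:
    def __init__(self, table):
        self.parts = ["SELECT * FROM " + table]

    def where(self, condition):
        self.parts.append("WHERE " + condition)
        return self  # returning self is what lets the calls chain

    def order_by(self, column):
        self.parts.append("ORDER BY " + column)
        return self

    def build(self):
        return " ".join(self.parts)

# Reads almost like a sentence:
print(Query("users").where("age > 21").order_by("name").build())
# SELECT * FROM users WHERE age > 21 ORDER BY name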

Creating a Mobile Programming Language

I'm thinking about creating a small language that is very easy to type on a mobile phone (J2ME).
What is the most appropriate language to implement in order to run it inside a mobile phone (always J2ME)? Appropriate meaning: small/easy syntax, easy to type on a mobile phone.
Is it Lisp? Some sort of Basic/Python/Ruby (I think not...)? Or something new (can you propose a new syntax?)?
I am the author of just such a language: Hecl, at http://www.hecl.org . In order to make creating applications easier, I also created a site where you can build simple apps through a web interface: http://www.heclbuilder.com . I also wrote an article discussing the implementation of the language:
http://www.welton.it/articles/hecl_implementation
Other languages that are worth looking at include Lua, and Javascript, both of which have mobile implementations.
If you include editor support (nesting structures, indented display, balancing, ...) then some form of LISP would be relatively straightforward to implement and use. I've seen screenshots (but can't find them now) of a LISP-based language for live interactive-performance programming. It used indented, shaded rectangular areas on the screen (instead of parentheses) to show nesting of structure.
I would think the design of the editor would be the biggest consideration, not the language. For instance, supporting some kind of "intellisense"-like autocompletion would be vital for saving thumbstrokes. Some kind of language sensitivity in the editor would help a lot too. For instance, when a C user types "for" the autocomplete should show an option for filling out the syntax of a loop:
for (;;) {
}
You might want to look into Hecl: http://www.hecl.org/
I'm not sure what's easy to type on a mobile phone, but the language I know with the most computing power per character is APL. As a source of syntactic or design ideas, you might prefer its modern successor, the J programming language.
On a mobile phone, you should also consider languages like Scratch (Smalltalk-based), because the non-typing interface would be easy to use.
On smartphones with drag-and-drop capability it would work well, too.
On the other hand, the IDE would be a lot heavier on CPU and other resources.
Forth is usually considered a legitimate contender for these kinds of requirements. And it's about as terse as can be imagined. Extensible, small and malleable. Built-in small screen editor, too.
If you want super-compact, try nano-False http://www.aldweb.com/pages/winikoff/#false
It isn't very usable, although more so than the deliberately painful Brainfuck and Whitespace. Think of it as Forth with the easy syntax made more concise ;-)
I found Quartus Forth reasonably easy to use, provided you can think in stacks, and with more Intellisense support for the API it would have been much more productive. For prototyping little algorithms on the Palm I preferred Plua or LispMe. The LispMe environment is worth studying anyway, because it made good use of lists for finding keywords and so eased GUI programming.
The big decision you have to make is whether you expect users to just use a phone numeric keypad or be able to type in reasonable approximations to a full keyboard. One of the huge benefits of the Palm was the high-quality full-size folding keyboards which I sadly miss (and hope someone makes an iPhone accessory to connect). If you don't have a full keyboard, make use of selectors for verbs so they can use picking actions rather than having to type in words. Consider the amount of code typed in traditional code for the framework classes and methods compared to the user code.
When I go about dreaming about a language, I think about what features are important to me at the time I'm dreaming. Only once you figure out what features are important to you can you come up with the best answer to what syntax. For example, if you want named parameters, it greatly influences your design choice about how method calls look (a la Objective-C or Python).
Designing a language can be a really fun task. I encourage you to step back and ask yourself "Do I really like how this is done in X?" (substituting some language name). If that's something you've always loved, steal it. If not, look elsewhere. Create your ultimate mashup of what you love, and leave out what you hate!
Lisp would be difficult to type because of all the ()s, although joel.neely's answer demonstrates one way of working around that problem.
So if you want to use an existing language you might want to look at which ones use least unusual characters.
Then there's the screen size issue. The more verbose the language the less code you're going to be able to fit onto the screen at once. What kind of devices are you aiming at? Smartphones with big screens (a limited audience) or 240x240 pixel feature phones?
Bear in mind that the interpreter/VM for your language will have to fit into a small amount of memory and performance may not be very good.
Brainfuck has only 8 characters -- very easy to type on a mobile phone.
Of course, understanding it and doing stuff with it... not so easy. But it satisfies the requirement...
Basic is very easy.
I would stay away from Lisp. Unless you want to give your mobile users a headache on top of the headache they already have from radio waves.
