World's First Computer Programming _Language_? [closed] - history

OK -- a bit of an undefined question (is the pattern of plugs in an ENIAC plugboard a language?), but contenders include:
Konrad Zuse's Plankalkül (1940s) - never implemented (generally accepted as the first).
Whatever Ada Lovelace (1840s) programmed in (not Ada) -- if she is the first programmer, as everyone says, she must have used the first programming language, no? Again, probably never implemented - but did Babbage have anything that could be called a language?
Turing's description of his Turing machine (1936 paper). In the paper he actually writes programs and simulates their execution mathematically - that makes it as good as (and earlier than) Plankalkül in my book.
Autocode for the Manchester Mark 1 computer (1952) -- compiled, high level, beats Fortran to the punch (?). Written by Alick Glennie for Turing's machine.
Fortran (early 1950s) -- beats out Lisp by a couple of years and undoubtedly passes the sniff test. But was it earlier than Mark 1 Autocode?

The PBS series Connections made the argument that the holes punched in tiles to control the patterns created on looms (circa 1700s?) were the first programming "language".
These were followed by player piano scrolls: codes, on paper, which are read by, and control the operation of, a machine. That's a programming language, isn't it?

DNA -- or does it have to involve silicon computers? ;-)

Since Ada Lovelace is widely regarded as the first programmer, I'd investigate what she called the set of symbols she was using.
Update: You can read the notation that Lovelace used in her notes on "Sketch of the Analytical Engine Invented by Charles Babbage" by L. F. Menabrea. Lovelace was the translator, but her notes describing the programming of the Analytical Engine ended up being about four times longer than the original publication.

I think we need to agree on a definition of "programming language" to answer this question in any useful way. Is directly manipulating machine code a programming language?

Konrad Zuse's Plankalkül (1940s) - never implemented
There was actually an implementation of the language published by Rojas et al. somewhere around the year 2000.

DNA -- or does it have to involve silicon computers? ;-)
Well, if you go down that road then the correct answer has to be RNA which existed before DNA. But then, do we have a Blind Programmer? ;-)

In the beginning there was Ada Lovelace. Then Bill said 'Let there be C#', and there was light!

Assuming a definition of "programming language" as "a textual notation used to describe/control the intended behavior of a digital computer", I think there's only one possible answer: raw (numerical) machine code.
Many of the other answers (e.g. recipes for cooking) are clever, but aren't about programming per se, but about description/control in a different context or more general sense.

I would say that the first programming language actually used was the machine language of the first stored-program computer, which I believe was the Manchester "Baby": http://www.computer50.org/

The language the analytical engine would have used was its own machine code, entered via punch cards indicating the operation to be performed and the columns (effectively registers) to perform it to. See these notes for some details.

Programming, at least in the imperative sense, comes down to combinations of sequence, alternation, and repetition. One might consider recipe authors as programmers, and therefore very early ones. Think about a recipe: it contains sequence (slice this, then chop that, then heat so and so...), alternation (if you want it moist then bake for 40 minutes, else if you want it "cakey" bake for 55 minutes), and repetition (while not stiff, knead the dough; repeat stirring until the batter is smooth). Recipes go back thousands of years.
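As a toy illustration of that claim (my own sketch, not from the original answer; every step and quantity is made up), here is a short Haskell "recipe" built from exactly those three ingredients:

    import Control.Monad (unless)

    data Texture = Moist | Cakey deriving (Eq)

    -- Sequence: the steps in the do-block happen one after another.
    -- Alternation: the if/then/else picks a baking time.
    -- Repetition: stirring recurses until the (pretend) batter is smooth.
    bake :: Texture -> IO ()
    bake texture = do
      putStrLn "slice this"
      putStrLn "chop that"
      putStrLn "heat so and so"
      if texture == Moist
        then putStrLn "bake for 40 minutes"
        else putStrLn "bake for 55 minutes"
      stir 0

    stir :: Int -> IO ()
    stir n = unless (n >= 3) $ do
      putStrLn "stir the batter"
      stir (n + 1)

    main :: IO ()
    main = bake Cakey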

Related

Why is Verilog not considered a programming language? [closed]

In class the professor said that students shouldn't say that they learned to program in Verilog. He said something like: Verilog isn't used to program, it's used to design. So how is Verilog different from other programming languages?
Verilog, just like VHDL, is meant to describe hardware. By contrast, programming languages such as C or C++ provide a high-level description of software programs, that is, a series of instructions that a microprocessor executes.
In practice, Verilog and VHDL do not offer the same features as programming languages, even though they look very much alike. For instance, a for loop in C/C++ describes the sequential execution of a given snippet of code; instead, a for ... generate loop in Verilog/VHDL describes multiple parallel instances of the same hardware building block (say, an AND logic gate). To be precise, there also exists a plain for loop in Verilog, but again, it has to be "synthesizable", that is, the compiler must be able to generate logic that fits the description.
Typically, a beginner in Verilog/VHDL will be tempted to "translate" a given function/algorithm from a C/C++ style of pseudocode directly to Verilog/VHDL: surprisingly, it might sometimes work, but it always leads to a dramatically poor design. One must really be aware of these differences in order to become a good Verilog/VHDL programmer.
Verilog is a hardware description language. Programming languages are generally understood to be languages for telling existing hardware what to do, not for reconfiguring said hardware.
I don't know anything about Verilog, but I just did some quick googling and the wiki pages seem to do a pretty good job of explaining the conceptual differences your teacher seemed to be alluding to. Like some of the other posters here, though, I wouldn't dismiss it as not a programming language. There's a strong tendency among programmers to believe that if it isn't application programming or assembly programming then it's not really programming, but in short that's BS. Everything above machine code is basically the same to me: if it's a file I give to a computer and it tells the computer how to do something, it's programming the computer (I guess the problem is drawing a line between users and developers; we like to feel special). Unless we plan to roll back to punch cards sometime soon, if you describe behavior in a syntactically strict (well-defined) way and it modifies what the computer does (what it outputs for a given input), then you've done some programming in one sense or another.
http://dictionary.reference.com/browse/programming
From the wiki page:
http://en.wikipedia.org/wiki/Dataflow_language
Dataflow programming focuses on how things connect, unlike imperative programming, which focuses on how things happen. In imperative programming a program is modeled as a series of operations (things that "happen"); the flow of data between these operations is of secondary concern to the behavior of the operations themselves. However, dataflow programming models programs as a series of (sometimes interdependent) connections, with the operations between these connections being of secondary importance.
(I think the key here is the qualifier on the type of programming, not that one is a "programming language" and the other is a "design language"; from what I understand they're both programming languages, they just have distinct purposes and implementations -- see the sketch at the end of this answer.) When I think of design I basically think of this:
http://dictionary.reference.com/browse/design
and that is not a program, although a program may utilize designs (and probably should; these are generally referred to as design patterns, but that's not what you're doing)
Linked in from: http://en.wikipedia.org/wiki/Verilog
To your teacher's point, this language would likely be used to solve different problems from your everyday Java/C program, and via different means, but to say it is not a program seems wrong.
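To make the quoted imperative/dataflow contrast concrete, here is a small Haskell sketch (my own illustration, not from the Wikipedia article): the first version is a series of operations that "happen" against mutable state, while the second is nothing but the wiring between transformations.

    import Data.IORef (newIORef, modifyIORef, readIORef)
    import Data.Char (toUpper)

    -- Imperative style: a sequence of operations mutating an accumulator;
    -- the flow of data is a side effect of the steps.
    shoutImperative :: [String] -> IO [String]
    shoutImperative xs = do
      out <- newIORef []
      mapM_ (\x -> modifyIORef out (++ [map toUpper x ++ "!"])) xs
      readIORef out

    -- Dataflow style: the program is just the connections between stages.
    shoutDataflow :: [String] -> [String]
    shoutDataflow = map (++ "!") . map (map toUpper)

    main :: IO ()
    main = do
      print =<< shoutImperative ["hi", "there"]
      print (shoutDataflow ["hi", "there"])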
Because it is an HDL, it is used to define hardware, and anything done in Verilog (not really anything, but synthesizable things) will be synthesized into actual hardware. So you can't just use software features like classes and other OOP concepts, because they can't create any hardware.
In C, by contrast, everything is compiled into an executable file, which is loaded into RAM when you execute the program.
Another basic difference is that everything in hardware is concurrent, so if you have written a=b+1 and c=d+1 in Verilog, then in the synthesized hardware both modules will work simultaneously. But in C everything is sequential, so in the same C program both instructions will actually be executed one after the other by your processor.
It is a programming language, not for programming software but for describing a hardware design - and the output is not necessarily an "application" as we understand it.
The language has a formal syntax.
Verilog contains features to describe logic netlists (RTL) and features to facilitate simulation of them. Describing an RTL description as a program may convey that the person describing it as such does not thoroughly understand logic design or synthesis. Describing a testbench stimulus as a program would be appropriate.
Verilog/VHDL is used to create and design application-specific systems on a chip, which are embedded into electronic devices.
C/C++ is used to design software that runs on a computer.
I am going to tackle this question in a different way. What is the purpose of a programming language? Can the output of a program affect the real world and your goals and expectations? If yes, then of course Verilog is a programming language. console.log has only as much meaning as what it translates to in the real world; e.g. console.log("you have a million units") carries no force without some authority behind it. So Verilog is a programming language in a certain sense.

How to counter the "one true language" perspective? [closed]

How do you work with someone when they haven't been able to see that there is a range of other languages out there beyond "The One True Path"?
I mean someone who hasn't realised that the modern software professional has a range of tools in his toolbox. The person whose knee-jerk reaction is, for example, "We must do this in C++!" "Everything must be done in C++!"
What's the best approach to open people up to the fact that "not everything is a nail"? How may I introduce them to having a well-equipped toolbox, selecting the best tool for the job at hand?
As long as there are valid reasons for it to be done in C++, I don't see anything wrong with this monolithic approach.
Of course a good programmer must have many different tools in his/her toolbox, but these tools don't need to be new languages; it can simply be about learning new programming paradigms.
In my experience, actually, learning many different languages doesn't make you much of a better programmer at all.
This is also true with finding the right language for the job. Yeah ok, if you're doing concurrency you might want a functional language rather than an Object Oriented language, but what are the gains of using another programming language?
At the end of the day; "Maintenance".
If it can be maintained without undue problems then the debate may well be moot and comes down to preference or at least company policy/adopted technology.
If that is satisfied then the debate becomes "Can it be built efficiently to be cost effective and not cause integration problems?"
Beyond that it's simply the screwdriver/build a house argument.
Give them a task which can be done much more easily in some other language/technology, and which is hard to do in the language/technology that they suggest for everything.
This way they will eventually search for alternatives as it gets harder and harder for them to accomplish the task using the language/technology that they know.
Lead by example, give them projects that play to their strengths, and encourage them to learn.
If they are given a task that is obviously better suited for some other technology and they choose to use a less effective language, don't accept the work. Tell them it's not an appropriate solution to the problem. Think of it as no different than them choosing Cobol to take the place of a shell script -- maybe it works, but it will be hard to maintain over time, take too long to develop, require expensive tools, etc.
You also need to take a hard look at the work they do and decide if it's really a big deal or not if it's done in C++. For example, if you have plenty of staff that knows that language and they finished the task in a decent amount of time, what's the harm? On the other hand, if the language they choose slows them down or will lead to long term maintenance problems they need to be aware of that.
There are plenty of good programmers who only know one language well. That fact in and of itself can't be used to determine if they are a valuable member of a team. I've known one-language guys who were out of this world, and some that I wouldn't have on a team if they worked for free.
Don't hire them.
Put them in charge of a team of COBOL programmers.
Ask them to produce a binary that outputs an infinite Fibonacci sequence.
Then show them the few lines (or one line, depending on the implementation) it takes in Haskell, and that it too can be compiled into a binary, so there are better ways forward.
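For example, one commonly cited Haskell definition of the infinite Fibonacci sequence looks like the sketch below; compiled with GHC it yields a binary that prints the sequence until interrupted.

    -- The classic lazily evaluated, self-referential Fibonacci list.
    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

    -- Print the infinite list term by term; stop it with Ctrl-C.
    main :: IO ()
    main = mapM_ print fibs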
How may I introduce them to having a well-equipped toolbox, selecting the best tool for the job at hand?
I believe that the opposite of "one true language" is "polyglot programming", and I will then refer to another answer of mine:
Is polyglot programming important?
I actually doubt that anybody can nowadays realize a project in one and only one language (even though there might be exceptions). The easiest way to show them the usefulness of specific tools and languages is to show them that they are already using several, e.g. SQL, build files, various XML dialects, etc.
Though I embrace the polyglot perspective, I also believe that in many areas "less is more". There is a balance to find between the number of languages/tools, the learning curve, and the overall productivity.
The challenge is to decide which small set of languages/tools fit nicely together in your domain and will push productivity and creativity to new limits.
Give them a screwdriver and tell them to build a house?

What is the tersest/densest commonly-used programming language currently available?

I refer you to the following video, which describes how to implement Conway's Game of Life in APL, using a few dozen keystrokes:
http://www.youtube.com/watch?v=a9xAKttWgP4
This video was featured prominently in the Return of Uncle Bob Martin podcast, in which Scott Hanselman complains that "his hands hurt" from programming in languages that require too many keystrokes.
Of course, none of us are going to replace our keyboard just to learn an old, obsolete programming language (or are we?), but I have heard that programmers can be two to three times more productive, depending on the language they are working in. Is it because they are working in "denser" languages?
What are the densest commonly used (practical) programming languages currently available? Do they improve productivity because they are dense?
Just look at threads tagged code-golf :)
From a definition of code golf...
It seems that someone gives us a problem to solve, tags the question code-golf, and the winner is whoever completes the solution in the fewest characters.
Like in golf, where the low score wins, the fewest characters "wins". While the solution with the fewest characters or fewest lines of code is certainly not necessarily the best in every case, it can be a fun way to exercise your programming muscles.
I think this question can only be answered if you consider the sort of support libraries that a language has available. For example I can do things in PHP using very few lines of code because there's loads of help for network requests, graphics processing, array and string handling etc., etc.
Using jQuery also means I'm typing less when writing scripts. So the question isn't as simple as you suggest.
It has to be J.
Nowadays both Perl and many functional languages can be very terse, although APL is still considered the champion in that.
In terms of productivity, there is a level where terseness can help (Python and Ruby are considered more productive than Java/C# because they are more terse), and then there is a level where terseness makes the code very hard to read (APL is famous for that, as are short Perl scripts). One needs a balance between the two. Also, there are a number of autocompleting editors that allow, for example, longer variable names without requiring a lot of extra typing.
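As a rough illustration of that balance (my own example, written in Haskell since that language comes up elsewhere on this page), the same word-frequency function can be written tersely in point-free style or spelled out more explicitly; both do the same thing but read very differently:

    import Data.List (group, sort)

    -- Terse, point-free version.
    wordFreqTerse :: String -> [(String, Int)]
    wordFreqTerse = map (\ws -> (head ws, length ws)) . group . sort . words

    -- More explicit version of the same computation.
    wordFreq :: String -> [(String, Int)]
    wordFreq text =
      let grouped = group (sort (words text))
      in  [ (head ws, length ws) | ws <- grouped ]

    main :: IO ()
    main = print (wordFreqTerse "the quick fox and the lazy dog and the cat")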

Real world Haskell programming [closed]

Having been an imperative developer for some years now, I had never had the urge to learn functional programming.
A couple of months ago I at last decided to learn Haskell. It's quite a cool language, but I'm puzzled about how an event-driven real app would be programmed in such a language. Do you know of a good tutorial about it?
Note: when I say "real app" I'm not talking about a real-world, production-ready app. I just mean a little sample app, just to get the grasp of it. I think something like a simplified version of the Windows calculator would be great, and then perhaps something a bit more complex.
When you say "real world" examples you are presumably thinking about problems that are inherently sequential or stateful or do lots of I/O, right?
So, how about games?
Frag is a Quake clone, implemented for an undergraduate thesis (Functional Programming and 3D Games, Mun Hon Cheong, 2005). Here's a video of it in action.
Super Monao Bros. (formerly known as Super Nario Bros.) is, well, you can probably figure out which game it is a clone of. (This is the author's English language weblog.)
Purely Functional Retrogames is a 4-part series of blog articles about how to write games in a purely functional language, explained using Pacman as the example. (Part 2, Part 3, Part 4.)
Or, what about an X Window Manager, an extensible Emacs clone text editor or an IDE?
Then, there is the book, which even has your question already in the title: Real World Haskell and which is also available for free!
Another thing you might want to look at, is Functional Reactive Programming. (It is used in Frag, for example.) The interesting thing about FRP is that it allows you to look at the problem of, say, GUI programming from a very different angle. If you read the GUI chapter in the RWH book, you will see that it talks about how you can write a GUI application just like in C, only better. FRP OTOH allows you to write it in a totally different way that wouldn't even be possible in C.
A lot of times (I'm not saying that this is the case in your question, but it is a recurring pattern) when someone says "but can Haskell be used in the real world", what they are really saying is "I know how to do this in C, and in Haskell I cannot do it in exactly the same way, therefore it must be impossible in Haskell, therefore Haskell is not ready for the real world". But what they are missing out on, is that there might be a totally different and much better way to solve the problem. (It's like saying "Erlang doesn't have threads, therefore it cannot possibly be used to implement concurrent systems.") And FRP is just one example.
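To give a taste of the FRP idea mentioned above, here is a toy sketch (an illustration only, not the API of any real FRP library such as the one Frag uses): a time-varying value is modelled as a function of time, and new values are built by composing existing ones instead of mutating widget state in callbacks.

    -- A Behavior is a value that varies over time.
    type Time = Double

    newtype Behavior a = Behavior { at :: Time -> a }

    -- A clock behavior, and a label derived from it by pure composition.
    clock :: Behavior Time
    clock = Behavior id

    label :: Behavior String
    label = Behavior (\t -> "elapsed: " ++ show (at clock t) ++ "s")

    -- Sampling the label at a few points in time stands in for rendering.
    main :: IO ()
    main = mapM_ (putStrLn . at label) [0, 1, 2]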
For a lightning talk today I have assembled this list of show-case Haskell applications, deliberately excluding anything that only targets programmers:
darcs (since 2002, 35 000 loc): Distributed version control system with an innovative focus on changes instead of states.
xmonad (since 2007, 30 000 loc): Well known tiling window manager with a huge library of layout and other plugins. Made it into the list despite its configuration file being a Haskell file.
hledger (since 2007, 9000 loc): Text-file based double-ledger accounting tool, a clone of ledger.
Raincat (since 2008, 2000 loc): Platform game with a cat that does not want to get wet.
arbtt (since 2009, 2000 loc): My automatic rule-based time tracker. Made it into the list as a shameless plug; probably not that popular. It now has a proper web page contributed by Waldir Pimenta.
detexify (since 2010, 500 loc): The back end of the very useful LaTeX character command finder is written in Haskell.
git-annex (since 2010, 28 000 loc): Manages your files and their location, a mixture of Dropbox and git. Written by the famous Joey Hess, who made a living from it via Kickstarter. He is currently running a second round of funding!
Nikki and the Robots (since 2010, 18 000 loc): Platform game with Nikki and, well, his robots. It was produced as a commercial independent game and sold via a pay-what-you-like scheme, but the company unfortunately closed down.
hoodle (since 2011, 13 000 loc): Note-taking and PDF annotation software like xournal.
Chordify (since 2012, ? loc): Analyses music, e.g. from a YouTube video, and calculates the corresponding guitar chords. Closed software, but supposedly written in Haskell.
(Also featured on my blog, and on the slides of the talk, with nice representative pictures of each program.)
xmonad is event driven (literally). It has a listener loop that wakes up on events, modifying an internal state modelling the X server, which is then rendered to the screen.
http://xmonad.org
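A minimal sketch of that event-driven pattern, in the spirit of the description above (all types and names here are invented for illustration; this is not xmonad's actual code): block on an event source, update a pure model of the state, render it, and recurse.

    import Data.IORef (newIORef, readIORef, writeIORef)

    data Event = KeyPress Char | Quit
    newtype Model = Model { windowCount :: Int }

    -- Pure update of the internal model in response to an event.
    step :: Event -> Model -> Model
    step (KeyPress _) m = m { windowCount = windowCount m + 1 }
    step Quit         m = m

    render :: Model -> IO ()
    render m = putStrLn ("windows: " ++ show (windowCount m))

    -- The listener loop: wake up on an event, update, render, repeat.
    loop :: IO Event -> Model -> IO ()
    loop nextEvent model = do
      ev <- nextEvent
      case ev of
        Quit -> return ()
        _    -> do
          let model' = step ev model
          render model'
          loop nextEvent model'

    main :: IO ()
    main = do
      -- A canned event stream so the sketch runs standalone.
      ref <- newIORef [KeyPress 'a', KeyPress 'b', Quit]
      let nextEvent = do
            e : es <- readIORef ref
            writeIORef ref es
            return e
      loop nextEvent (Model 0)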
I once found this IRC bot written in Haskell:
http://www.haskell.org/haskellwiki/Roll_your_own_IRC_bot
Here are some links as you requested.
This one explains a lot of things that don't "make sense" to an imperative programmer about Haskell:
Haskell Tutorial for C Programmers
This one is a very good, easy-to-follow tutorial:
Learn You a Haskell for Great Good
Raytracer written in Haskell
Haskell Raytracer
You can download the Glasgow Haskell Compiler from here:
GHC
You should check out Real World Haskell. The book is freely available and shows how Haskell can be applied to real-world problems. I wouldn't call it a tutorial, though, as it is much more comprehensive.
Check out functional reactive programming.

Rule of thumb for capitalizing the letters in a programming language [closed]

I was wondering if anyone knew why some programming languages that I see most frequently spelled in all caps (like an acronym), are also commonly written in lower case. FORTRAN, LISP, and COBOL come to mind but I'm sure there are many more.
Perhaps there isn't any reason for this, but I'm curious to know if any of these changes are due to standards or decisions by their respective communities. Or are people just getting too lazy to hit the caps lock key? (I know I am)
The Lisp community switched from LISP to Lisp with the invention of lowercase keyboards, shortly after mankind killed the last dinosaur.
FORTRAN, LISP, and COBOL ARE acronyms:
FORTRAN: Formula Translation/Translator
LISP: LISt Processing
COBOL: COmmon Business-Oriented Language
BASIC: Beginner's All-purpose Symbolic Instruction Code
Amongst others.
Some of it has to do with the version of the language (e.g. FORTRAN 77 vs. Fortran 90). From the Fortran Wikipedia entry (emphasis mine):
The names of earlier versions of the language through FORTRAN 77 were conventionally spelled in all-caps (FORTRAN 77 was the version in which the use of lowercase letters in keywords was strictly nonstandard). The capitalization has been dropped in referring to newer versions beginning with Fortran 90. The official language standards now refer to the language as "Fortran." Because the capitalisation (or lack thereof) of the word FORTRAN was never 100% consistent in actual usage, and because many hold impassioned beliefs on the issue, this article, rather than attempt to be normative, adopts the convention of using the all-caps FORTRAN in referring to versions of FORTRAN through FORTRAN 77 and the title-caps Fortran in referring to versions of Fortran from Fortran 90 onward. This convention is reflected in the capitalization of FORTRAN in the ANSI X3.9-1966 (FORTRAN 66) and ANSI X3.9-1978 (FORTRAN 77) standards and the title caps Fortran in the ANSI X3.198-1992 (Fortran 90) standard.
When I see FORTRAN, I think fixed-spacing, punch cards, non-dynamic memory, and a bad taste in my mouth. Fortran means things like user-defined types, modules, array intrinsic functions, and isn't so bad.
To add another to the list, MATLAB is supposed to be spelled with all caps. Since it is short for "matrix laboratory", some people tend to write it as MatLab. Others just write it as Matlab or matlab, but these are all technically incorrect.
Apparently, a lot of people don't know why, and capitalize every short programming language name they find. A sad example is Lua, too often written LUA for no reason.
Note that some language names have internal capitalization, due to the way they were built, and, well, just because the company making them wanted them that way. For example JavaScript and PostScript, or ActionScript (do I see a pattern there?). Or you have a strange mix, like ECMAScript (yes, I see a pattern!).
Because they are acronyms for stuff but at the end of the day it doesn't matter. They're just names.
E.g.
LISP = "LISt Processing"
Whereas Java, for example, is just named Java - it doesn't stand for anything. It used to be called Oak because the guy who named it had an oak tree outside his office.
Wikipedia a language and you'll find your answers.
The rule is that there is no rule. It's not like there is an Academy or other governing body that rules on what people get to call their programming language and how they have to spell it. Everyone makes up their own rules for their own language.
Typically, if you're writing about this stuff, you either follow a house style guide or look up the official name.
The reason you see so many different uses is that most people don't care or are just ignorant. I still wonder who ever told anyone to refer to a Macintosh computer as "MAC", yet that spelling is pervasive. Some people just love their shift keys, I guess.
If it is in all caps it is (supposed to be) an acronym.
    if variableName == VARIABLENAME
        print "USE CAPS"
    else
        print "Follow your team's coding standards"
