I know that the ALGOL language is super-uber-extremely important as a theoretical language, and it also had a variety of implementations, as per Wikipedia.
However, what's unclear is, was ALGOL (pure ALGOL, not any of its derivatives like Simula) ever actually used for any "real" programming in any way?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).
Personally, the only ALGOL programming I have ever done was on paper, thus the curiosity.
Algol58 seems to have been the most successful in terms of important applications.
From Wikipedia:
JOVIAL is an acronym for "Jules' Own Version of the International Algorithmic Language." The "International Algorithmic Language" was a name originally proposed for ALGOL 58. It was developed by Jules Schwartz in 1959 to compose software for the electronics of military aircraft.
Then:
Notable systems using JOVIAL include the Milstar Communications Satellite, Advanced Cruise Missile, B-52, B-1B, B-2 bombers, C-130, C-141, and C-17 transport aircraft, F-111, F-15, F-16 (prior to Block 50), and F-117 fighter aircraft, LANTIRN, U-2 aircraft, E-3 Sentry AWACS aircraft, Navy Aegis cruisers, Army Multiple Launch Rocket System (MLRS), Army UH-60 Black Hawk helicopters, F100, F117, and F119 jet engines, the NORAD air defense & control system (Hughes HME-5118ME system) and RL-10 rocket engines. Airborne radar systems with embedded JOVIAL software include the APG-70, APG-71 and APG-73.
ALGOL 68 was used in parts of the DRA for the same purpose. Cf. Wikipedia:
The Defence Research Agency (normally known as DRA) was an executive agency of the UK Ministry of Defence (MOD) from April 1991 until April 1995. At the time the DRA was Britain's largest science and technology organisation.
DRA's Algol68 compiler was finally open-sourced in April 1999 and is now available for Linux, for download from SourceForge. (However, the "Algol68g" interpreter is easier to use.)
ICL's Algol68 was/is S3. It was developed by the UK company International Computers Limited (ICL) for its 2900 Series mainframes. It is a system programming language based on ALGOL 68, but with data types and operators aligned to those offered by the 2900 Series. It was the implementation language of the operating system VME.
There are (at least) two other British operating systems - Flex and the Cambridge CAP computer's - written in Algol68 variants, and also one Soviet OS: Эльбрус-1 (Elbrus-1). But I have yet to find any of their source code. (If anyone can find this source code, please let me know.)
BTW: I believe that VME is still running - in production - as a Linux/Unixware guest VM, mostly at Commonwealth of Nations Customs/Immigration services.
Also, over the same period the USSR was using Algol68, cf. the history link. Algol68 is used in Russian telephone exchanges, and Algol58 was used in the landing system of the Russian "Buran/Буран" space shuttle.
ALGOL 68 was internationalized in 1968, and I suspect there are other Algol projects in other countries - especially in Germany, the Netherlands, Japan and China - but I have no details.
If you want to actually try out Algol68 and/or contribute your code, check out Rosetta Code's ALGOL 68 repository, then as a class project try one of the "Tasks not implemented".
Nothing like responding to 2-year-old threads. I program in ALGOL almost daily. I am a programmer on a Unisys ClearPath mainframe, and the majority of the system code is written in ALGOL or variants. The Burroughs B5500 was really designed around the language, so it is a pretty efficient language/compilation process. Granted, this version is ALGOL with some extensions, like limited classes (structure blocks), etc.
i := 80;
while i > 0 do
  begin
    scan ptrRay:ptrRay for i:i until in ALPHA;
    scan ptrEnd:ptrRay for i:i while in ALPHA;
    if i > 0 then
      begin
        replace nextToken by ptrRay for (offset(ptrEnd) - offset(ptrRay));
      end;
  end;
That code scans for ALPHA-only tokens. It uses the OFFSET function, which is a little more costly than doing the residual-count math yourself (i, starti, etc.).
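For anyone who has never seen the Burroughs pointer-scan idiom, here is a rough Python sketch of what that loop does (the function and names are mine, not part of Extended ALGOL; it just mirrors the "scan until in ALPHA / scan while in ALPHA" pattern):

```python
def scan_alpha_tokens(text):
    """Collect runs of alphabetic characters, roughly mirroring the
    Burroughs SCAN ... UNTIL IN ALPHA / SCAN ... WHILE IN ALPHA idiom."""
    tokens = []
    i = 0
    while i < len(text):
        # "scan ... until in ALPHA": skip ahead to the next alphabetic char
        while i < len(text) and not text[i].isalpha():
            i += 1
        start = i
        # "scan ... while in ALPHA": advance past the run of alphabetic chars
        while i < len(text) and text[i].isalpha():
            i += 1
        if i > start:
            # "replace nextToken by ptrRay for (offset difference)":
            # the token is the span between the two pointers
            tokens.append(text[start:i])
    return tokens
```

For example, `scan_alpha_tokens("ab1 cd!ef")` yields `["ab", "cd", "ef"]`. The difference is that the ALGOL version works with raw pointers into a buffer and a residual count, which is what makes it so cheap on that hardware.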
Like Tom, I program in ALGOL almost daily - and I'm also on a Unisys Clearpath. ALGOL has been the primary source of my mortgage repayments for more years than I care to remember.
When I started programming, Algol was the only compiler available. Yes, it was mainstream till we got a Fortran compiler.
To follow up on themis' answer, the entire Burroughs "large system" family (5000, 5500, 5700, 6500, 6700...) was really designed to run Algol well. The operating system, compilers, and major system utilities were written in Algol; if that's not "real" programming, what is?
To be precise, over the life of the product family Burroughs extended Algol into a superset called ESPOL. When Burroughs brought out the "small systems" family (1700, 1800, 1900 series), they defined another Algol-like language called SDL (Systems Development Language), in which the operating software of that line was written. Access to SDL was restricted for security reasons. A variant of SDL was subsequently created with a few of the "privileged" features removed. The resulting language, called UPL (User Programming Language), was available for customer use.
Some of us still remember when the phrase "Algol-like language" was used to describe any programming language with block-oriented control structures and variable scoping. Widely-known examples of Algol-like languages included PL/I, Pascal, and (...wait for it...) C.
Algol was the major programming language for the Burroughs B5000.
However, what's unclear is, was Algol (pure Algol, not any of its derivatives like Simula) ever actually used for any "real" programming in any way?
Please avoid the term "real" programming. "Real" - as opposed to what? Imaginary?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).
Yes. It was used for a certain number of projects, on which a certain number of developers worked.
Only, what is often misinterpreted today is this: in those days, computers weren't exactly a household commodity. Hell, they weren't that 30 years ago, let alone 60.
Programming was done in computer centres which were either in government ownership (military, academic, institutes of various kinds) or in private enterprises (large companies). And programming wasn't a profession - it was something which engineers, mathematicians, scientists and the like used to do when their work on paper was done ... or they had specialized operators who did it for them. Often women, who may or may not have had a scientific background in that particular field - they were "language translators", for lack of a better term (and my bad English).
Programming theory and research were in their beginnings ... vendors were few (and naturally uncooperative with each other) ... each of them used their own extensions, and often programs written for one didn't work well on another vendor's systems.
There wasn't a "right way" to do something ... you had this and that, and you used whatever trick you could figure out to work around your problem.
But, I've wandered off. Let me get back to the number of people. This also goes for several other languages; Fortran and COBOL, for example.
People say, "very few use it". That's simply not true. What is true is that a small percentage of people use it today, but a larger percentage of people used to use it.
As I said, in those days only the scientific and engineering community used to do it. And their number was relatively small, compared to the total population. Nowadays, everybody uses computers, but the absolute number of engineers, mathematicians and the like is pretty much the same. So it seems that nobody uses those languages anymore ... while in reality, for certain specialized languages (well, nowadays this goes for Fortran and COBOL, more than Algol), the number of users is pretty much constant.
Personally, the only Algol programming I have ever done was on paper, thus the curiosity.
I know I didn't answer your question, but I just wanted to clear this up. Algol was a little "before my time".
My first programming experience was on a Burroughs B5500 owned by Northern Natural Gas Company starting in 1970. I started out in COBOL but switched to ALGOL (actually used both) when they needed additional support for a large Oil & Gas Lease Information system that was written almost entirely in ALGOL. At the time there were two programming departments, Business Systems and Scientific Computing. The Scientific Computing department programmed in ALGOL and FORTRAN while the Business Systems department was mostly COBOL with a smattering of ALGOL. Northern advanced from the B5500 to B6500, B6700, B6900, B7800, and B7900 while I was there. I eventually transferred to the Technical Support department and got into making and supporting MCP patches to customize the system for Northern's needs. That was fun!
Short answer to the question: yes. Northern had a number of application systems written in ALGOL. Of course, it was Burroughs' version of ALGOL (Extended ALGOL).
Burroughs B5500 Extended Algol was used heavily for research in astrophysics, linguistics, and statistics at my university (Monash University, Australia) in the late 60s. It was also used in commercial applications that helped pay the bills for the computer center.
As I write this I am running Algol programs in the latest release of the Burroughs B5500 emulator from the team at retro-b5500 in Tasmania. The emulator runs entirely in the browser and faithfully models the processors, disks, tapes, card readers, line printers, card punches and datacom gear!
You can read about the project at http://retro-b5500.blogspot.com/ and http://code.google.com/p/retro-b5500 and you can write Algol programs for arguably the finest Algol machine ever made (except perhaps its successor the B6700.)
One of the postdocs from Monash wrote a reverse compiler from IBM Assembler to Burroughs COBOL in Algol, which was used to migrate all the billing applications at the state-run Gas & Fuel Corporation from IBM 360s to Burroughs 6700s.
Back in 1970, I helped develop a Jovial compiler for the Royal Dutch Navy. One of its big advantages was that it was written in Jovial, hence we all got to become pretty good Jovial experts. In fact, as part of the test cycle we would compile the compiler through the latest incarnation of itself and run all our test sets on that. If it passed, we would release the first compiler. Thus each release had the capability of compiling itself, and that compiler could pass all tests. As every found bug was always added to our self-checking test set, the quality of the compiler improved and improved. By the time we left the project we had no known bugs ... the once and only time that ever happened to me.
I programmed in Algol/Jovial back in the 70's for the military. I loved the language. You couldn't do recursion in Fortran and I often could make a program much easier by using the correct data structure and a little recursion.
After I had left that assignment, I found that the other developers on the project didn't want to maintain the Jovial code and tried to replicate what I had done in Fortran. It just didn't work and was much slower.
I learned about compiler theory by digging into the source code for the Jovial compiler. Ah... those were the days.
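To illustrate the point about recursion: a linked data structure such as a tree is trivial to walk recursively, while in a language without recursion (like the FORTRAN of that era) the same walk needs a manually managed stack. A small sketch in Python (my example, not the original Jovial code):

```python
class Node:
    """A binary tree node - the kind of linked data structure that is
    natural with recursion and painful with an explicit stack."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def tree_sum(node):
    # Without recursion, this walk requires pushing and popping nodes
    # on a stack you manage yourself.
    if node is None:
        return 0
    return node.value + tree_sum(node.left) + tree_sum(node.right)
```

For example, `tree_sum(Node(1, Node(2), Node(3)))` returns 6. The "correct data structure and a little recursion" approach the answer describes is exactly this style.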
Algol was well implemented on the Elliott 4100 machine and was used extensively to develop early refinery process simulations at the BP Research Centre in the late 60s. However, at that time input/output was not well defined (it varied between machines), and at BP it was quickly overtaken by Fortran IV, as programs written in strict Fortran IV would run on almost any machine - IBM, Univac, Atlas, etc.
Are there any good reasons to learn languages such as Ada and COBOL? Is there any future in programming in those languages? I'm interested in them and I'm currently learning them just for fun.
It's always worthwhile learning new languages. Even if they're never useful to you professionally, chances are they'll teach you something about programming you didn't know before, or at the very least broaden your outlook.
As for prospects: from a quick bit of reading around, it seems Ada is still somewhat in favour for critical systems in the aviation industry, and Cobol still has its place in business. I know an engineer in his mid-20s who writes all his code in Fortran 77, as that's what industry wants!
While the number of employers looking for these languages might be low, because there are a limited number of people who know them, the salary for developers who specialise in them can be quite high. When mission-critical apps developed in them could cost millions to replace, having to pay more than usual for a coder to maintain the existing system is easily accepted.
Ada is used in the aerospace/defense industry. COBOL is used in the financial industry. Fortran is used in engineering. The question "is there any future" is borderline subjective/argumentative since all of those languages are still in active use.
Fortran is old, but is used in scientific programming. Ada is the basis for VHDL, a very important language in electrical engineering. You could also say that C is "old", and it's used pretty much everywhere.
Cobol and Algol are both still in widespread use. You won't find them running at your latest and greatest tech firms, but you can bet your car insurance company processes claims with them. Your health insurance company most certainly uses them. Reports of Cobol's death have been highly exaggerated.
You will have difficulty finding colleges and other places that will actually teach you Cobol or Algol. So finding developers for these so-called dead languages is getting harder and harder. It's very tough to tell a kid coming out of high school, who has been programming in Java, iOS, and Perl for half his life, that Cobol is where the money is at.
Cobol/Algol developers are becoming harder and harder to come by, so if you have either language in your back pocket, it is only going to help you out. Algol is, in my opinion, a much harder language to get good at. You can teach anyone with half a brain how to program in Cobol.
These languages are not going away any time soon at all. As long as companies like IBM and Unisys provide compilers for them on the mainframes, they will continue to thrive. So grab a book and an open source compiler and brush up. Plenty of people out there looking for Cobol/Algol developers.
Many of these 'old' languages are actively in use today. Lisp, for instance, is gaining popularity again in the form of Clojure. Smalltalk is becoming popular again with the Seaside MVC framework.
In addition, many of the hottest development languages borrow heavily from Lisp and Smalltalk, both of which pioneered object-oriented methodologies long before C++ came along. JavaScript, Ruby, Perl 6 and Perl 5 Moose (an object system) all use mixins, which were first used in Lisp and Smalltalk. Metaclasses, first used in Common Lisp and Smalltalk-80, are making a resurgence in Perl 5 Moose, Objective-C (iPhone development), Python and Groovy.
Much like learning Latin, it can be intriguing to understand where and how many English and other current languages' words had their roots. Also, if you know Latin and valuable new books/papers/scrolls are found that need translation, you suddenly become valuable too.
Honestly, I'd say learning them is great for a historical perspective, especially if you're a language designer, but not very much else.
There are roles out there for COBOL programmers, but in general they are looking for experienced developers. From what I have seen you are unlikely to get a first programming role in COBOL - in general, they are looking for people experienced with similar application domains and who are familiar with building an understanding of legacy systems. Knowing the limitations of the language can be useful for understanding why certain things you are asking for when connecting to mainframes are either considered difficult or problematic.
A rather strange question: I'm often asking myself with what programming languages things were created. I recently found this toy mini computer I played with when I was 13 or so at home. (Note: It is not one of those toy "notebooks", it's really small and came as an extra with a magazine)
"Features":
Hardware:
LCD with a small field of pixels where the games were going on, besides that some stats such as score, highscore etc.
Sounds and horrible music when started
A really small "keyboard" with a wire
Software:
At least 14 or so games, ranging from Snake through Tetris and Breakdown to some abomination of a car racing game
A calculator
Game selecting menu
An alarm clock
Inside there is a really small circuit board, I don't want to open the thing up now, though.
Were the games and "operating system" of this thing actually programmed using a language?
If yes, what language could it be?
If not with a programming language, how else was it created?
If I were to hazard a guess I'd say they used C, it's often used with Microcontrollers in devices like that.
The question is really architectural. Is there a microprocessor in there at all? If so it's likely to have been programmed in quite a low-level language - assembler or C are quite common. However, there might conceivably not be a processor; it might be implemented as custom silicon, either an FPGA or (unlikely) an ASIC, either of which you'd program in VHDL or Verilog.
Anybody's guess. A frequently used tactic when trying to cram a lot of software into a mass-market device (where saving 10c on storage can matter) is to use some kind of bytecode interpreter, where the bytecodes are designed to save space, even if they execute fairly slowly. FORTH used to be popular for this purpose, but there are an awful lot of one-off bytecodes in the world. One that has survived for adventure gaming is the Infocom Z-Machine.
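As a toy illustration of the space-saving bytecode tactic, here is a minimal stack-based interpreter in Python (the opcodes and encoding are invented for this sketch; they are not from FORTH, the Z-Machine, or any real device):

```python
# Each opcode is a single byte, so programs stay tiny even though
# the dispatch loop makes execution slower than native code.
PUSH, ADD, MUL, PRINT, HALT = range(5)

def run(code):
    """Execute a bytecode program and return everything it printed."""
    stack, pc, out = [], 0, []
    while True:
        op = code[pc]
        pc += 1
        if op == PUSH:                 # next byte is an immediate operand
            stack.append(code[pc])
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == PRINT:              # emit top of stack
            out.append(stack.pop())
        elif op == HALT:
            return out

# (2 + 3) * 4, encoded in 10 bytes of "ROM"
program = [PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT]
```

Here `run(program)` returns `[20]`. The real win on a mass-market toy is that the whole game logic can be packed into a few kilobytes of bytecode while the interpreter itself is a small fixed cost.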
It's clearly an embedded microcontroller.
While in principle it could have been programmed in almost any language, I would be surprised if it were written in anything other than assembly language or C.
My understanding is that all operating systems before 1972 and practically all embedded systems before 1980 or so were written entirely in assembly language, perhaps (as Norman Ramsey pointed out) with a one-off domain specific language (DSL) on top.
Assembly decreased in popularity and the C programming language became the most popular microcontroller programming language.
Even up to around 2000, practically all embedded systems used at least a little assembly language to handle the things that no available higher-level language could handle.
Even today, of the thousands of embedded system microcontrollers available, the vast majority of them have no more than 4 programming languages available off-the-shelf: assembly language, C, BASIC, and Forth.
(I'm hoping that Python will become available for more microcontrollers -- the Pyastra and PyMite dialects already cover a couple of the most popular microcontrollers).
http://www.faqs.org/faqs/microcontroller-faq/primer/
I'm very curious about old programming languages, especially COBOL, and as Wikipedia couldn't really tell me much about this topic, I decided to ask it here:
Was COBOL the first programming language really being used in financial, stock and banking systems?
Where exactly was COBOL used?
Was it used more frequently than Fortran or BASIC, for example?
I don't know if you lived at that time, but how did people react to the rise of COBOL? Did they expect it to be the future?
When has COBOL actually stopped being used to create new, big systems?
Are you sure that there are still important legacy apps written in COBOL out there? I can't believe that somehow.
Previous SO questions have gone a long way toward answering your questions. Please review:
What are Fortran and COBOL used for today
Why is COBOL still a preferred language in the business world
Reasons to start a new project in COBOL
What makes COBOL such a hated language
Was COBOL the first programming language used in financial, stock and banking systems
Well-known languages that co-existed with early COBOL are Fortran and Lisp. These languages were not much used outside of research and university facilities.
The landscape was highly fragmented within the world of business computing. A number of proprietary low-to-medium-level languages existed, but they generally only ran on one vendor's machines. A few examples were FLOW-MATIC, AIMACO and COMTRAN, all of which heavily influenced the development of COBOL. From this chaos emerged a strong desire for a machine-independent, common language for developing business applications.
According to Jean E. Sammet (The Early History of COBOL), the US Department of Defense spearheaded and funded the early development of COBOL.
Where is COBOL used
Largely in the financial (banks/government) and insurance industries. Outside of these sectors, COBOL is pretty much unheard of.
Is it used more frequently than Fortran or BASIC
I believe Fortran actually pre-dates COBOL by a little bit. Fortran is primarily suited to high-performance numerical applications (astronomy, physics and the like). COBOL is primarily suited to financial and record-keeping applications - the stuff of business and commerce (hence the name: COmmon Business Oriented Language). The two were never in "competition", so asking which is more frequently used is kind of like comparing apples to oranges.
Putting the "apples" and "oranges" aside, it is hard to say how many lines of production code exist for either of these languages. Estimates vary from millions to billions. However, I don't think anybody would claim that the active code base is insignificant.
BASIC (excluding "Visual Basic") was largely a personal computer language. There have been a few ports to larger machines (e.g. VAX BASIC - oh, that was fun) but I don't think these ever caught on. I would be surprised if there are any significant production systems written in BASIC today. Just say "BASIC" to any "old timer" and their mind will flood with fond memories. Other than that, it is pretty much gone.
When did COBOL stop being used
The COBOL legacy is huge. As such, there is a lot of legacy maintenance going on today, and it will go on for many years to come.
Is there any new development? I would say less and less every year, but it is nowhere near coming to an end. I work in a very large shop and we actively develop new COBOL applications. I don't believe we are alone.
Those that still actively develop systems in COBOL are not a bunch of "backwoods" idiots who don't know any better. They do it because COBOL "delivers the goods" for the least cost per transaction processed. Believe me, if any other technology could do it cheaper, faster and more reliably, COBOL would be gone tomorrow!
One can only get an appreciation for how widespread COBOL is by working in the financial, government or insurance industries - and then only in an area where they have to push a lot of data around. If you work outside of this environment, it is as if the language died a hundred years ago!
How did people react to the rise of COBOL?
In a couple of words: not well.
COBOL came into existence at just about the same time that the academic world made huge breakthroughs in language theory and compiler design. COBOL missed that boat and has been denigrated by everybody with an academic interest in computing ever since. I went through university in the 70s, and even at that time the word "COBOL" made us all cringe. The hate for COBOL runs very deep.
Even the developers of COBOL could not have predicted the long-term success of the language. The original COBOL was specified by a "short range committee" so that it could be implemented with reasonable time and effort. The final "touches" were to be made by a "long range committee". The "long range committee" never materialized, and this is what we got!
The death of COBOL has been predicted as imminent since the 60s. It is still with us and going strong.
Why? I think there are three big reasons:
Why? I think there are three big reasons:
Code stability. COBOL carries its legacy fairly well; major upgrades are rare. This may not be a selling point if you are in the business of developing code. However, if you are the one paying for it, COBOL gets high marks on this one.
Performance. COBOL applications are generally developed where volume and/or throughput are critical (e.g. processing monthly bank statements, tax returns, etc.).
Track record. Organizations that use COBOL generally know their track record. They have a certain comfort level with cost/time estimates for major development projects using COBOL and related technologies. Taking on a new language and supporting technology to implement mission-critical applications involves additional and unknown risks (and unknown benefits).
Notice that all the reasons I have cited for COBOL's continued existence are driven by cost and risk minimization. There is nothing from a developer's point of view that makes developing in COBOL interesting. Blame corporate accountants for COBOL's continued success.
On the brighter side, there are a few frameworks (e.g. Bassett Frame Technology and XVCL) that can make COBOL development today tolerable - even, dare I say, interesting.
Was COBOL the first programming language really being used in financial, stock and banking systems?
For practical purposes this was all done in assembler, but Cobol was the first high level language to move into these domains.
Where exactly was COBOL used?
Any place where money changed hands, inventory was tracked, et al. Your use of the word "was" implies that it is not used now. Cobol is involved every time you swipe a credit card, ship a package, make a phone call ... it is everywhere. Still.
Was it used more frequently than Fortran or BASIC, for example?
Yes, very much so. Fortran is well geared for science geeks and engineers -- a noble calling -- but they don't exist in the numbers of the sales and marketing geeks, the Cobol domain.
Does anyone use BASIC? Doesn't that suck?
I don't know if you lived at that time, but how did people react to the rising COBOL? Did they expect it to be the future?
People like credit cards. People like online access to their bank accounts. People like Voice Response Systems that give their balance and last 5 transactions. People like ATMS. People like fast airline and hotel bookings.
The only people who don't like what Cobol does for them are programmers that have never put the time and effort into understanding Cobol (but they hate it anyway).
When has COBOL actually stopped being used to create new, big systems?
Ummm, never. Cobol is still actively developed and used all over the world. It isn't sexy and no Computer Science professor is going to tell you that it is "the next big thing" -- but if they knew what they were talking about they would be making money in the real world...
Are you sure that there are still important legacy apps written in COBOL out there? I can't believe that somehow.
MasterCard. Visa. Naa...
No idea really, but LEO was used for payroll. It used a language similar to COBOL called CLEO.
COBOL is used all over the place. Mostly banks and big mainframe departments.
Difficult to say. It was certainly popular back in the day.
Back in the days when COBOL had its heyday, the alternatives served alternative niches - e.g. Fortran for scientific work, Algol for academia, Cobol for finance. Did they expect it to be the future? Possibly.
5,6. It still is used. Search for COBOL jobs, you'll get quite a few hits for banking and financial companies that are looking for programmers, architects, etc. Pays quite well too by all accounts.
Answering the last part:
Yes, there are certainly new COBOL applications being written at banks everyday. Large financial institutions usually have a mainframe or two around, as they (traditionally) have a much better uptime than standard servers, and can move a lot of data reliably.
Additionally, the people still doing COBOL are pretty darn good at what they do.
If you're dealing with billions of dollars of electronic transactions, reliability is worth paying for, even if it's not new or sexy. Then again, I can't hot-swap the processor out of my webserver; hot-swapping any of the parts out of a mainframe is usually possible, and that's actually a pretty tech-sexy feature, if I do say so myself.
I managed to go almost 15 years in my career without touching a single line of COBOL, or even of seeing it. Until I got my last job, which is some enterprise middleware that links COBOL to web services and non-mainframe databases. My very first customer engagement at this new job was with a big-ass company with lots and lots of COBOL they wanted to integrate with newer systems.
Learning it has been a pain, mostly because there are few good PC-based COBOL engines left, but it's not really hard at all. And that is why it's still around: it does a job, and does it well. It's showing its age a bit in how it interacts with SOA frameworks, but even that problem is going away.
@Neal:
> BASIC (excluding "Visual Basic") was largely a personal computer language. There have been a few ports to larger machines (eg. VAX BASIC - Oh that was fun) but I don't think this ever caught on.
BASIC also started on big machines. I remember programming with BASIC and a paper tape on a CDC back in 1974, in an environment similar to this one: http://www.museumwaalsdorp.nl/computer/en/comp742E.html.
RE code stability of COBOL: updates are rare, but they are very disruptive and are actively resisted by the installed base. When forced, conversions are often done in a compatibility mode, and the testing alone can burn the entire SD budget for a year. OO COBOL is a case in point, as the real costs of conversion will exceed its benefits unless a total redesign is attempted. Consulting shops love this, as they bill for time, but for the organization it has the potential to literally put them out of business. One of the great myths of this OO COBOL exercise is the 'portability' of the COBOL skill set, but in fact it is the OOP/OOD skill which is lacking and must be taught to the legacy programmers. Learning a new paradigm is always harder than learning a new tool (a language), and in point of fact the exercise makes no sense and is entertained only by that bastion of folly known as management - as carefully misled by the vendor community, devoted as they are to the creation of 'value' for their shareholders. It is often an easy sale, and fools generally do deserve to be fleeced.
RE execution speed: this is not really worth a detailed response. Platforms are fast, and it is compilers that determine execution speed. I have examined the asm output from COBOL compilers, and it is not any better than that of a good C compiler. More to the point, classic COBOL's lack of type safety, failure to support scope, failure to support parameterized procedures, failure to support explicit type conversion, etc., leads to the mistaken impression that because it does not do any of this it is faster. In fact, most of this requires only compile-time support, and the rest does not add much overhead (and what little it does can be optimized out), whereas it does make code reuse prohibitively expensive, make testing a nightmare, and produce brittle code.
It will only go away when it costs too much to fix. This may or may not happen, but it is more likely that the organizations will fail first, due to a major software issue that would have been trapped by a type-safe language. (OOP will provide type safety, but that will require that COBOL programmers and business analysts learn to use types.)
Does anybody know of any research or benchmarks of how long it takes to develop the same application in a variety of languages? Really I'm looking for Java vs. C++ but any comparisons would be useful. I have the feeling there is a section in Code Complete about this but my copy is at work.
Edit:
There are a lot of interesting answers to this question but it seems like there is a lack of really good research. I have made a proposal over at meta about this problem.
Pratt & Whitney, purveyors of jet engines for civilian and military applications, did a study on this many years ago, without actually intending to do the study.
They went on the same metrics kick everyone else went on in the 1990s. They collected a bunch of data about their jet engine controller projects, including timecard data. They crunched it. The poor sap who got to crunch the data noticed something in the results: the military projects uniformly had twice the programmer productivity and one-fourth the defect density of the civilian projects.
This, by itself, is significant. It means you only need half as many programmers, and you aren't going to spend quite as much time fixing bugs. What is even more important is that this was an apples-to-apples comparison. A jet engine controller is a jet engine controller.
He then went looking for candidate explanations. All of the usual candidates: individual experience, team size, toolsets, software processes, requirements stability, everything, were trotted out, and they were ruled out when it was seen that the story on those items was uniformly the same on both sides of the aisle. At the end of the day, only one statistically significant difference showed up.
The civilian projects were written in every language you could think of. The military projects were all written in Ada.
IN EVERY SINGLE CASE, against every other comer, for jet engine controllers at Pratt & Whitney, using Ada gave double the productivity and one-fourth the defect density.
I know what the flying code monkeys are going to say. "You can do good work in any language." In theory, that's true. In practice, however, it appears that, at least at Pratt & Whitney, language made a difference.
Last I heard about this, Pratt & Whitney upper management decreed that ALL jet engine controller projects would be done in Ada.
No, I don't have a citation. No paper was ever written. My source on this story was the poor sap who crunched the numbers. Here's a similar study from 1995:
http://archive.adaic.com/intro/ada-vs-c/cada_art.html
This, incidentally, was BEFORE Boeing did the 777, and BEFORE the 777 brake subcontractor story happened. But that's another story.
One of the few funded scientific studies that I'm aware of on cross-language productivity, from the early 90s, funded by ARPA and the ONR,
Haskell vs. Ada vs. C++ vs. Awk vs. ... An Experiment in Software Prototyping Productivity, Hudak & Jones, 1994.
We describe the results of an experiment in which several conventional programming languages, together with the functional language Haskell, were used to prototype a Naval Surface Warfare Center (NSWC) requirement for a Geometric Region Server. The resulting programs and development metrics were reviewed by a committee chosen by the Navy. The results indicate that the Haskell prototype took significantly less time to develop and was considerably more concise and easier to understand than the...
This article (a PDF) has some benchmarks (note that it's from 2000) between C, C++, Java, Perl, Python, Rexx, and Tcl.
Some common wisdom I believe holds true (also somewhere within the article):
The number of lines written per hour is independent of the language
Opinion: more important is what is faster for a given developer, for example yourself. What you are used to will usually be faster. If you have 20 years of experience with C++ pitfalls and never miss an uninitialized variable, C++ will be faster for you than Java would be.
If you remember all the parameters of CreateWindowEx() by heart, raw Win32 will be faster for you than MFC or WinForms.
A couple of anecdotal data points:
On Project Euler, which invites programming solutions to mathematical problems,
the shortest solutions are almost invariably written in J or K, relatives of APL; there are occasionally MATLAB solutions in the same range. It can be argued, though, that these languages are specialized for math.
The runners-up were Ruby solutions. A lot of algorithmic work can be wrapped in very little code, and it's much more legible than J/K.
Python and Haskell solutions also did very well, LOC-wise.
The question asked about "fastest development," not "shortest code." But it's conceivable that shorter solutions are faster to come up with - certainly for slow typists!
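To illustrate the kind of brevity being discussed, here is a sketch (my own, not from any of the cited solutions) of Project Euler's first problem in Python, where a single expression suffices:

```python
# Project Euler problem 1: sum of all multiples of 3 or 5 below 1000.
# High-level languages let this be a single generator expression.
total = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
print(total)  # 233168
```

In J or K the equivalent is shorter still, typically a one-liner of a dozen characters or so.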
There's an annual competition among roboticists. Contestants are given specs for some hardware, a practical problem to solve in software, and limited time to do so. Again, very domain-specific, of course. Programmers have their choice of tools, including language. Every year, the winning team (often a single person) has used Forth.
This admittedly limited sample suggests that "development speed" and "the effect of language on speed" are often very dependent on the problem domain.
See also
Are there statistical studies that indicates that Python is "more productive"?
for some discussions about this kind of question.
It would make more sense to benchmark the programmers, not the languages. The time to write a program in any mainstream language depends more on the ability of the programmer in that language than on qualities of that specific language.
I think most benchmarks and statements on this topic will mean very little.
Benchmarks can always be gamed; see the history of "Pet Store".
A language that's good at solving one kind of problem might not apply as well to another.
What matters most is the skill of your team, its knowledge of a particular technology, and how well you know the domain you're trying to solve.
UPDATE: Control software for jet engines and helicopters is a very specialized subset of computing problems, characterized by very rigorous, complete, detailed specs and by QA processes meant to ensure the multi-million-dollar aircraft does not crash.
I can second the (very good) citation by John Strohm of Pratt & Whitney control software written in Ada. The control software for Kaman helicopters sold to Australia was also written in Ada.
But this does not lead to the conclusion that if you decided to write your next web site in Ada that you'd have higher productivity and fewer defects than you would if you chose C# or Java or Python or Ruby. All languages are not equally good in all problem domains.
Language/framework comparison for web applications
The Plat_Forms project provides some information of this type for web applications.
There are three studies with different tasks (done in 2007, 2011, 2012), all of the following format: Several teams of three professional developers implemented the same application under controlled conditions within two days.
It covers Java, Perl, PHP, and Ruby and has multiple teams for each language.
The evaluation reports much more than only development time.
Findings of iteration one for instance included
that experience with the language and framework appeared to be more relevant than which framework was used.
that Java tended to induce teams to make laborious constructions while Perl induced them to make pragmatic (and quite handy) constructions.
Findings of iteration two included
that Ruby on Rails was more productive in this type of project (which due to its duration was more rapid prototyping than full-blown development of a mature application)
and that the one exception to the above rule was the one team using Symfony, a PHP framework with concepts similar to Ruby on Rails (but still a very different base language underneath).
Look under http://www.plat-forms.org or search the web for "Plat_Forms".
There is plenty more detail in the reports, in particular the thick techreport on iteration 1.
Most programs have to interface with some other framework, so it tends to be a good idea to pick the language that has libraries specifically for what you are trying to do. For instance, are you trying to build a distributed, redundant messaging system? If so, I would use Erlang. Are you trying to make a quick-and-dirty data-driven website? Use Ruby on Rails. You get the idea. Real-time DirectX where performance is key: C++/C/asm.
If you are writing something algorithm-based, I would look at a functional language like Haskell, although it has a steep learning curve.
This question is a little old-fashioned. Focusing on development time solely based on the choice of language is of limited value. There are many other factors that have as much or more impact than the language itself:
The libraries or frameworks available / used.
The level of quality required (i.e. defect count).
The type of application (e.g. GUI, server, driver, etc.)
The level of maintainability required.
Developer experience in the language.
The platform or OS the application is built on.
As an example, many would say Java is the better choice over C++ to build enterprise (line of business) applications. This is not normally because of the language itself, but instead it is perceived that Java has better (or more mature) web server and database frameworks available to it. This may or may not be true, but that is beside the point.
You may even find that building an application using the same language on different operating systems or platforms gives greatly differing development times. For example, using C++ on Linux to build a GUI application may take longer than a Windows-based GUI application in C++, because of the less extensive and mature GUI libraries available on Linux (once again, this is debatable).
According to Norvig, Lutz Prechelt published just such an article in the October 1999 CACM: "Comparing Java vs. C/C++ Efficiency Issues to Interpersonal Issues".
Norvig includes a link to that article. Unfortunately, the ACM, despite having a bitmap graphic proclaiming their goal of "Advancing Computing as a Science & Profession", couldn't figure out how to maintain stable links on their webpage, so it's just a 404 now. Perhaps your local library could help you out.
That Ada story might be an embellished version of this: http://www.adaic.com/whyada/ada-vs-c/cada_art.html
Erlang vs C++/Corba
"... As the Erlang DCC is less than a quarter of the size of a similar C++/CORBA implementation, the product development in Erlang should be fast, and the code maintainable. We conclude that Erlang and associated libraries are suitable for the rapid development of maintainable and highly reliable distributed products."
Paper here
There's a reason why there are no real comparisons in that aspect, except for anecdotal evidence (which can be found in favor of almost any language).
Actually writing code takes a relatively small portion of a developer's time. Even if a language lets you cut coding time in half, it will be barely noticeable by the time the project ends. Design, program structure, and development process are all much more important, and then there are libraries, tools, and experience with them.
Some languages are better suited for certain development processes than the others, so if you've settled on design and process you can decide which language will be more efficient - but not before.
(didn't notice there's a similar answer already, so feel free to ignore this)
I have always wondered: what programming languages were used to go to the moon?
I realize there may not be a single answer/language, but it interests me.
How many people worked on the code for these systems? How was it tested?
Not a full answer, but a bit more info:
"The on-board Apollo Guidance Computer (AGC) was about 1 cubic foot with 2K of 16-bit RAM and 36K of hard-wired core-rope memory with copper wires threaded or not threaded through tiny magnetic cores. The 16-bit words were generally 14 bits of data (or two op-codes), 1 sign bit, and 1 parity bit. The cycle time was 11.7 microseconds. Programming was done in assembly language and in an interpretive language, in reverse Polish."
http://www.hq.nasa.gov/alsj/a11/a11.1201-fm.html
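The "interpretive language, in reverse Polish" means operands are pushed on a stack and operators consume them. As a rough modern sketch of the idea (a toy illustration, not the AGC Interpreter's actual instruction set or data format):

```python
# Minimal reverse-Polish (postfix) evaluator -- a toy sketch of the
# stack-based evaluation scheme, NOT the actual AGC Interpreter.
def eval_rpn(tokens):
    stack = []
    ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, '/': lambda a, b: a / b}
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # operators pop their operands...
            a = stack.pop()
            stack.append(ops[tok](a, b))  # ...and push the result
        else:
            stack.append(float(tok))      # operands are pushed as-is
    return stack.pop()

print(eval_rpn("3 4 + 2 *".split()))  # (3 + 4) * 2 = 14.0
```

The appeal on a machine with 2K of RAM is that no expression parser or operator-precedence logic is needed at runtime, only a stack.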
Added: the BBC has recently published a wonderful article about the AGC including interviews with the designers, and with the "little old ladies" who wove the "rope core." It doesn't cover how the software was designed, coded or tested, but you'll probably find it interesting all the same!
Additionally, the source code for the main and landing modules can be found here
The Apollo Guidance Computer was programmed in assembly language.
From "Digital Apollo: Human and Machine in Spaceflight" by David A. Mindell, MIT Press (c) 2008
pg. 149
Apollo's software derived from the basic design of the Mars mission. Designer Hugh Blair-Smith created a language called "Basic", a low-level assembly language of about forty instructions (distinct from the high-level BASIC programming language developed at Dartmouth at about the same time). On top of Basic was "Interpreter," the brainchild of Hal Laning, a language that was really a collection of routines to do the higher-level mathematical functions associated with guidance and control, in the high-precision data format.
Not exactly the Moon, but Lisping at JPL.
I remember reading that the exact same software was written by (at least) two different, disjoint teams. The computers would then compare their answers and check for any discrepancies... not sure what it would do if it found any, but at least they would know there was a problem. I think they actually used four different computers and took the majority vote, so if one computer was wrong, it was ignored.
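The voting scheme described above can be sketched in a few lines. This is a hedged illustration of majority voting over redundant channels in general, not the actual flight logic:

```python
# Toy sketch of majority voting across redundant computers.
# Each "channel" is modeled simply as a value it reported.
from collections import Counter

def majority_vote(answers):
    """Return the most common answer and whether it won a strict majority."""
    winner, count = Counter(answers).most_common(1)[0]
    return winner, count > len(answers) / 2

# Four redundant channels; one faulty channel is outvoted and ignored.
print(majority_vote([42, 42, 41, 42]))  # (42, True)
```

With four channels, one faulty computer can always be outvoted; with only two (the dual disjoint teams), a disagreement can be detected but not resolved, which matches the "at least they would know there was a problem" observation.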