Coding for high reliability/availability/security - what standards do I read?

I've heard that the automotive industry has something called MISRA C.
What are the relevant standards for other high reliability/availability/security industries, such as
Space
Aircraft
Banking/financial
Automotive
Medical
Defense/Military
???
-Adam

Check out the Goddard Space Flight Center and its coding standards. One of its C standards, which I've adopted in my own code, is that headers must be self-contained, and they provide a simple way to enforce that: a module's header must be the first file included in the module's source file, so if the header is not self-contained, it won't compile.
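For illustration, here is a minimal sketch of that rule in C (file and function names are hypothetical):

/* foo.h - self-contained: it includes everything its own
   declarations need, here <stddef.h> for size_t. */
#ifndef FOO_H
#define FOO_H

#include <stddef.h>

size_t foo_count(const char *s);

#endif /* FOO_H */

/* foo.c - the module's own header is included FIRST, so if foo.h
   silently relied on some other header having been included before
   it, this file would fail to compile and the omission would be
   caught immediately. */
#include "foo.h"

#include <string.h>

size_t foo_count(const char *s)
{
    return strlen(s);
}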

If you're asking specifically about coding, MISRA C presents guidelines for avoiding common mistakes in C.
However, there's a lot more to good software than coding. The "bible" of the aviation industry for software development is DO-178B. It tells you what questions need to be addressed in the various design phases and how the answers should be documented. It's an ENORMOUS amount of paperwork, but if you're trying to keep planes in the air, you want the weakest point to be the human (pilot), not the software.
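To make that concrete, here is one classic class of mistake that MISRA-style rules guard against (a hedged sketch; rule numbers vary between MISRA editions, so none are cited here):

#include <stdbool.h>

bool status_ok(int status)
{
    bool ok;
    /* MISRA-style C: a comparison, never an assignment, in the
       condition (guards against the classic "if (status = 0)" typo),
       braces on every branch, and an explicit else terminating the
       chain so every path is accounted for. */
    if (status == 0)
    {
        ok = true;
    }
    else
    {
        ok = false;
    }
    return ok;
}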

For programming high reliability systems in Ada, there is ISO/IEC TR 15942, "Information technology — Programming languages — Guide for the use of the Ada programming language in high integrity systems":
Introduction
As a society, we are increasingly reliant upon high integrity systems: for safety systems (such as fly-by-wire aircraft), for security systems (to protect digital information) or for financial systems (e.g., cash dispensers). As the complexity of these systems grows, so do the demands for improved techniques for the production of the software components of the system. These high integrity systems must be shown to be fully predictable in operation and have all the properties required of them. This can only be achieved by analysing the software, in addition to the use of conventional dynamic testing. There is, currently, no mainstream high level language where all programs in that language are guaranteed to be predictable and analysable. Therefore for any choice of implementation language it is essential to control the ways that the language is used by the application. The Ada language [ARM] is designed with specific mechanisms for controlling the use of certain aspects of the language. Furthermore:
The semantics of Ada programs are well-defined, even in error situations. Specifically, the effect of a program can be predicted from the language definition with few implementation dependencies or interactions between language features.
The strong typing within the language can be used to reduce the scope (and cost) of analysis to verify key properties.
The Ada language has been successfully used on many high integrity applications. This demonstrates that validated Ada compilers have the quality required for such applications.
Guidance can be provided to facilitate the use of the language and to encourage the development of tools for further verification.
Ada is therefore ideally suited for implementing high integrity software, and this document provides guidance in the controls that are required on the use of Ada to ensure that programs are predictable and analysable.

You may find it instructive to look at some of the requirements of Carrier Grade Linux. While they (as the name suggests!) specify Linux requirements, they do so for use in the high-availability segment of telecommunications equipment.

The FDA has General Principles of Software Validation, Design Control Guidance For Medical Device Manufacturers, Guidance for Industry, FDA Reviewers and Compliance on Off-The-Shelf Software Use in Medical Devices, etc.

NIST provides a whole slew of related documents; you can dive in and peruse their work, but there is a lot of it, and it's all quite verbose, so I don't have a specific one to point you at.
If you can be more specific about your needs, I might be able to narrow it down a bit...
In addition, Carnegie Mellon is pretty much the definitive source when it comes to development processes for reliability; their standards are easy enough to find, but also quite verbose.
Also, specific industries often have their own standards, depending also on the country. For instance: credit card industry - PCI-DSS; banking industry in the EU - Basel II; medical - HIPAA (though that's pretty high-level); anything US-government related - various NIST docs; etc.

Related

How the hardware platform impacts upon the choice for the programming language?

Long story short: the teacher who taught me throughout the last year has only recently left and has been replaced with a new one. This new teacher has given me an assignment that involves things (like this) that we were never previously taught. So this task has shown up on the assignment and I have no idea how to do it. I can't get hold of the teacher because he's ill and not coming in for the next few days. And even when I do ask him to explain further, he gets into a right mood and makes me feel completely stupid.
Describe how the hardware platform impacts upon the choice for the programming language
Looking at my activity here on SO, you can tell that I'm into programming, I'm into developing things, and I'm into learning, so I'm not just trying to get one of you guys to do my homework for me.
Could someone here please explain how I would answer a question like this.
Some considerations below, but not a full answer by any means.
If your hardware platform is a small embedded device of some kind, then your choice of programming language is going to be directed towards lower-level unmanaged languages - you probably won't be able to (or want to) load a managed-language runtime like the Java JVM or .NET CLR. This is down to memory and storage requirements. Similarly, interpreted languages will be out of the question, as you won't have space for the interpreter.
If you're on a larger machine, it's more a question of compatibility. A managed language must run on a platform where its runtime is supported. In the case of .NET, that's Windows, or other platforms if you substitute the Microsoft CLR with the Mono runtime. In the case of Java, that's a far wider range of platforms.
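Circling back to the embedded case above, here is a minimal sketch (in C, with illustrative names and sizes) of the constraint-driven idiom such devices push you toward: memory budgeted statically at compile time, with no heap and no language runtime:

#include <stdint.h>

#define SAMPLE_BUF_LEN 64u                   /* sized to the target's RAM */

static uint16_t sample_buf[SAMPLE_BUF_LEN];  /* no malloc, no GC          */
static uint32_t sample_count;

/* Store a sensor reading; capacity is bounded by design, so memory
   use is known exactly at link time. */
void sample_store(uint16_t value)
{
    if (sample_count < SAMPLE_BUF_LEN)
    {
        sample_buf[sample_count] = value;
        sample_count++;
    }
}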
This is by no means a definitive answer, but my first thought would be embedded systems. A task I perform on an embedded system, or other low-powered battery-operated computer, would need to be handled completely differently from one performed on a computer with access to mains electricity.
One simple impact would be battery life.
If I use wasteful algorithms on an embedded system, the battery life will be affected.
Hope that helps stir the brain juices!
Clearly, the speed and amount of memory of the device will impact the choice. The more primitive and weak the platform is, the harder it is to run code developed with very high level languages. Code written with them may just not work at all (e.g. when there isn't enough memory) or be too slow or it will require serious optimizations (i.e. incur more work), perhaps affecting negatively the feature set or quality.
Also, some languages and software may rely heavily on or benefit from the availability of page translation in the CPU. If the CPU doesn't have it, certain checks will have to be done in software instead of being done automatically in hardware, and that will affect the performance or the language/software choice.
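As a tiny illustration of a check moving from hardware to software: without page translation and memory protection, an out-of-range access does not fault, so the guard has to be written explicitly (a purely hypothetical C sketch):

#include <stdbool.h>
#include <stdint.h>

#define POOL_SIZE 4096u
static uint8_t pool[POOL_SIZE];

/* With an MMU, a wild offset would trap in hardware; without one,
   this explicit range check is the only thing standing in the way. */
bool pool_write(uint32_t offset, uint8_t value)
{
    if (offset >= POOL_SIZE)
    {
        return false;
    }
    pool[offset] = value;
    return true;
}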

The unmentioned parts of COBOL's history [closed]

I'm very curious about old programming languages, especially COBOL, and as Wikipedia couldn't really tell me much about this topic, I decided to ask it here:
Was COBOL the first programming language really being used in financial, stock and banking systems?
Where exactly was COBOL used?
Was it used more frequently than Fortran or BASIC, for example?
I don't know if you lived at that time, but how did people react to the rising COBOL? Did they expect it to be the future?
When has COBOL actually stopped being used to create new, big systems?
Are you sure that there are still important legacy apps written in COBOL out there? I can't believe that somehow.
Previous SO questions have gone a long way toward answering your questions. Please review:
What are Fortran and COBOL used for today
Why is COBOL still a preferred language in the business world
Reasons to start a new project in COBOL
What makes COBOL such a hated language
Was COBOL the first programming language used in financial, stock and banking systems
Well known languages that co-existed with early COBOL are Fortran and Lisp. These languages were not much used outside of research and university facilities.
The landscape was highly fragmented within the world of business computing. A number of proprietary low-to-medium level languages existed, but generally only ran on one vendor's machines. A few examples were FLOW-MATIC, AIMACO and COMTRAN, all of which heavily influenced the development of COBOL. From this chaos emerged a strong desire to have a machine-independent and common language for developing business applications.
According to Jean E. Sammet (The Early History of COBOL), the US Department of Defense spearheaded and funded the early development of COBOL.
Where is COBOL used
Largely in the financial (banks/government) and insurance industries. Outside of these sectors, COBOL is pretty much unheard of.
Is it used more frequently than Fortran or BASIC
I believe Fortran actually pre-dates COBOL by a little bit. Fortran is primarily suited for high-performance numerical applications (astronomy, physics and the like). COBOL is primarily suited for financial and record-keeping applications - the stuff of business and commerce (hence the name: COmmon Business Oriented Language). The two were never in "competition", so asking which is more frequently used is kind of like comparing apples to oranges.
Putting the "apples" and "oranges" aside, it is hard to say how many lines of production code exist for either of these languages. Estimates vary from billions to millions. However, I don't think anybody would claim that the active code base is insignificant.
BASIC (excluding "Visual Basic") was largely a personal computer language. There have been a few ports to larger machines (e.g. VAX BASIC - oh, that was fun) but I don't think this ever caught on. I would be surprised if there are any significant production systems written in BASIC today. Just say "BASIC" to any old-timer and their mind will flood with fond memories. Other than that, it is pretty much gone.
When did COBOL stop being used
The COBOL legacy is huge. As such, there is a lot of legacy maintenance going on today, and it will go on for many years to come.
Is there any new development? I would say less and less every year, but it is nowhere near coming to an end. I work in a very large shop and we actively develop new COBOL applications. I don't believe we are alone.
Those that still actively develop systems in COBOL are not a bunch of "backwoods" idiots who don't know any better. They do it because COBOL "delivers the goods" for the least cost per transaction processed. Believe me, if any other technology could do it cheaper, faster and more reliably, COBOL would be gone tomorrow!
One can only get an appreciation for how widespread COBOL is by working in the financial, government or insurance industries - and then only in an area where they have to push a lot of data around. If you work outside of this environment, it is like the language died a hundred years ago!
How did people react to the rise of COBOL?
In a couple of words: not well.
COBOL came into existence at just about the same time that the academic world made huge breakthroughs in language theory and compiler design. COBOL missed that boat and has been denigrated by everybody with an academic interest in computing ever since. I went through university in the 70s, and even at that time the word "COBOL" made us all cringe. The hate for COBOL runs very deep.
Even the developers of COBOL could not have predicted the long-term success of the language. The original COBOL was specified by a "short range committee" so that it could be implemented with reasonable time and effort. The final "touches" were to be made by a "long range committee". The long range committee never materialized, and this is what we got!
The death of COBOL has been predicted as imminent since the 60s. It is still with us and going strong.
Why? I think there are three big reasons:
Code stability. COBOL carries its legacy fairly well; major upgrades are rare. This may not be a selling point if you are in the business of developing code. However, if you are the one paying for it, COBOL gets high marks on this one.
Performance. COBOL applications are generally developed where volume and/or throughput are critical (e.g. processing monthly bank statements, tax returns, etc.).
Track record. Organizations that use COBOL generally know their track record. They have a certain comfort level with cost/time estimates for major development projects using COBOL and related technologies. Taking on a new language and supporting technology to implement mission-critical applications involves additional and unknown risks (and unknown benefits).
Notice that all the reasons I have cited for COBOL's continued existence are driven by cost and risk minimization. There is nothing from a developer's point of view that makes developing in COBOL interesting. Blame corporate accountants for COBOL's continued success.
On the brighter side, there are a few frameworks (e.g. Bassett Frame Technology and XVCL) that can make COBOL development today tolerable, even, dare I say, interesting.
Was COBOL the first programming language really being used in financial, stock and banking systems?
For practical purposes this was all done in assembler, but Cobol was the first high level language to move into these domains.
Where exactly was COBOL used?
Any place where money changed hands, inventory was tracked, et al. Your use of the word "was" implies that it is not now used. Cobol is involved every time you swipe a credit card, ship a package, make a phone call... it is everywhere. Still.
Was it used more frequently than Fortran or BASIC, for example?
Yes, very much so. Fortran is well geared for science geeks and engineers -- a noble calling -- but they don't exist in the numbers of the sales and marketing geeks, the Cobol domain.
Does anyone use BASIC? Doesn't that suck?
I don't know if you lived at that time, but how did people react to the rising COBOL? Did they expect it to be the future?
People like credit cards. People like online access to their bank accounts. People like Voice Response Systems that give their balance and last 5 transactions. People like ATMS. People like fast airline and hotel bookings.
The only people who don't like what Cobol does for them are programmers that have never put the time and effort into understanding Cobol (but they hate it anyway).
When has COBOL actually stopped being used to create new, big systems?
Ummm, never. Cobol is still actively developed and used all over the world. It isn't sexy and no Computer Science professor is going to tell you that it is "the next big thing" -- but if they knew what they were talking about they would be making money in the real world...
Are you sure that there are still important legacy apps written in COBOL out there? I can't believe that somehow.
MasterCard. Visa. Naa...
No idea really, but LEO was used for payroll. It used a language similar to COBOL called CLEO.
COBOL is used all over the place. Mostly banks and big mainframe departments.
Difficult to say. It was certainly popular back in the day.
Back when COBOL had its heyday, the alternatives served other niches - e.g. Fortran for scientific, Algol for academic, Cobol for financial. Did they expect it to be the future? Possibly.
5, 6. It still is used. Search for COBOL jobs and you'll get quite a few hits for banking and financial companies looking for programmers, architects, etc. It pays quite well too, by all accounts.
Answering the last part:
Yes, there are certainly new COBOL applications being written at banks everyday. Large financial institutions usually have a mainframe or two around, as they (traditionally) have a much better uptime than standard servers, and can move a lot of data reliably.
Additionally, the people still doing COBOL are pretty darn good at what they do.
If you're dealing with billions of dollars of electronic transactions, reliability is worth paying for, even if it's not new or sexy to do so. Then again, I can't hot-swap the processor out of my web server; hot-swapping any of the parts out of a mainframe is usually possible, and that's actually a pretty tech-sexy feature, if I gotta say so myself.
I managed to go almost 15 years in my career without touching a single line of COBOL, or even of seeing it. Until I got my last job, which is some enterprise middleware that links COBOL to web services and non-mainframe databases. My very first customer engagement at this new job was with a big-ass company with lots and lots of COBOL they wanted to integrate with newer systems.
Learning it has been a pain, mostly because there are few good PC-based COBOL engines left, but it's not really hard at all. And that is why it's still around: it does a job, and does it well. It's showing its age a bit in how it interacts with SOA frameworks, but even that problem is going away.
@Neal:
> BASIC (excluding "Visual Basic") was largely a personal computer language. There have been a few ports to larger machines (e.g. VAX BASIC - oh, that was fun) but I don't think this ever caught on.
BASIC also started on big machines. I remember programming with BASIC and a paper tape on a CDC back in 1974, in an environment similar to this one: http://www.museumwaalsdorp.nl/computer/en/comp742E.html.
RE code stability of COBOL: updates are rare, but they are very disruptive, and are actively resisted by the installed base. When forced, conversions are often done in a compatibility mode, and the testing alone can burn the entire SD budget for a year. OO COBOL is a case in point, as the real costs of conversion will exceed its benefits unless a total redesign is attempted. Consulting shops love this as they bill for time, but for the organization it has the potential to literally put them out of business. One of the great myths of this OO COBOL exercise is the 'portability' of the COBOL skill set, but in fact it is the OOP/OOD skill which is lacking and must be taught to the legacy programmers. Learning a new paradigm is always harder than learning a new tool (language), and in point of fact the exercise makes no sense and is entertained only by that bastion of folly known as management - as carefully misled by the vendor community, devoted as they are to the creation of 'value' for their shareholders. It is often an easy sale, and fools generally do deserve to be fleeced.
RE execution speed: this is not really worth a detailed response. Platforms are fast, and it is compilers that determine execution speed. I have examined the asm output from COBOL compilers and it is not any better than that of a good C compiler. More to the point, classic COBOL's lack of type safety, failure to support scope, failure to support parametrized procedures, failure to support explicit type conversion, etc., leads to the mistaken impression that because it does not do any of this it is faster. In fact, most of this requires only compile-time support, and the rest does not add much overhead (and what little it does can be optimized out), whereas it does make code reuse prohibitively expensive, make testing a nightmare, and produce brittle code.
It will only go away when it costs too much to fix. This may or may not happen, but it is more likely that organizations will fail due to a major software issue which would have been trapped by a type-safe language before then. (OOP will provide type safety, but that will require that COBOL programmers and business analysts learn to use types.)

Was ALGOL ever used for "mainstream" programming? [closed]

I know that ALGOL language is super-uber-extremely important as a theoretical language, and it also had a variety of implementations as per Wikipedia.
However, what's unclear is, was ALGOL (pure ALGOL, not any of its derivatives like Simula) ever actually used for any "real" programming in any way?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).
Personally, the only ALGOL programming I have ever done was on paper, thus the curiosity.
Algol58 seems to have been the most successful in terms of important applications.
From Wikipedia:
JOVIAL is an acronym for "Jules Own Version of the International Algorithmic Language." The "International Algorithmic Language" was a name originally proposed for ALGOL 58. It was developed to compose software for the electronics of military aircraft by Jules Schwartz in 1959.
Then:
Notable systems using JOVIAL include the Milstar Communications Satellite, Advanced Cruise Missile, B-52, B-1B, B-2 bombers, C-130, C-141, and C-17 transport aircraft, F-111, F-15, F-16 (prior to Block 50), and F-117 fighter aircraft, LANTIRN, U-2 aircraft, E-3 Sentry AWACS aircraft, Navy Aegis cruisers, Army Multiple Launch Rocket System (MLRS), Army UH-60 Black Hawk helicopters, F100, F117, and F119 jet engines, the NORAD air defense & control system (Hughes HME-5118ME system) and RL-10 rocket engines. Airborne radar systems with embedded JOVIAL software include the APG-70, APG-71 and APG-73.
ALGOL 68 was used in parts of the DRA for the same purpose. Cf. Wikipedia:
The Defence Research Agency (normally known as DRA) was an executive agency of the UK Ministry of Defence (MOD) from April 1991 until April 1995. At the time the DRA was Britain's largest science and technology organisation.
DRA's ALGOL 68 compiler was finally open-sourced in April 1999 and is now available for Linux for download from SourceForge. (However, the "Algol68g" interpreter is easier to use.)
ICL's ALGOL 68 was/is S3. It was developed by the UK company International Computers Limited (ICL) for its 2900 Series mainframes: a system programming language based on ALGOL 68, but with data types and operators aligned to those offered by the 2900 Series. It was the implementation language of the operating system VME.
There are (at least) two other British operating systems - Flex and that of the Cambridge CAP computer - written in ALGOL 68 variants. And also one Soviet OS: Эльбрус-1 (Elbrus-1), but I have yet to find any of their source code. (If anyone can find and distribute this source code, please let me know.)
BTW: I believe that VME is still running - in production - as a Linux/UnixWare guest VM, mostly at Commonwealth of Nations customs/immigration services.
Also over the same period the USSR was using ALGOL 68, cf. the history link. ALGOL 68 is used in Russian telephone exchanges. And ALGOL 58 was used in the landing system of the Russian "Buran/Буран" space shuttle.
ALGOL 68 was internationalized in 1968. I suspect there are other ALGOL projects in other countries, especially in German, Dutch, Japanese and Chinese, but I have no details.
If you want to actually try out ALGOL 68 and/or contribute your code, check out Rosetta Code's ALGOL 68 repository, then as a class project try one of the "Tasks not implemented".
Nothing like responding to 2 year old threads. I program in ALGOL almost daily. I am a programmer on a Unisys ClearPath mainframe and the majority of the system code is written in ALGOL or variants. The Burroughs B5500 was really designed around the language so it is a pretty efficient language/compilation process. Granted, this version is ALGOL with some extensions like limited classes (structure blocks), etc.
i := 80;
while i > 0 do
    begin
        scan ptrRay:ptrRay for i:i until in ALPHA;
        scan ptrEnd:ptrRay for i:i while in ALPHA;
        if i > 0 then
            begin
                replace nextToken by ptrRay for (offset(ptrEnd) - offset(ptrRay));
            end;
    end;
That code scans for ALPHA-only tokens. It uses the OFFSET function, which is a little more costly than doing the residual-count math yourself (i, starti, etc.).
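For readers who don't know Burroughs ALGOL's scan/replace statements, a rough C analogue of the same ALPHA-token scan may help (a sketch of the idea, not a translation of the exact pointer and residual-count semantics):

#include <ctype.h>
#include <string.h>

/* Skip to the next run of alphabetic characters ("scan until in
   ALPHA"), measure the run ("scan while in ALPHA"), and copy it out
   (the "replace" step). Returns where the scan stopped. */
const char *next_alpha_token(const char *p, char *token, size_t cap)
{
    while (*p != '\0' && !isalpha((unsigned char)*p))
    {
        p++;
    }
    const char *start = p;
    while (*p != '\0' && isalpha((unsigned char)*p))
    {
        p++;
    }
    size_t len = (size_t)(p - start);
    if (len >= cap)
    {
        len = cap - 1u;
    }
    memcpy(token, start, len);
    token[len] = '\0';
    return p;
}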
Like Tom, I program in ALGOL almost daily - and I'm also on a Unisys Clearpath. ALGOL has been the primary source of my mortgage repayments for more years than I care to remember.
When I started programming, Algol was the only compiler available. Yes, it was mainstream till we got a Fortran compiler.
To follow up on themis' answer, the entire Burroughs "large system" family (5000, 5500, 5700, 6500, 6700...) was really designed to run Algol well. The operating system, compilers, and major system utilities were written in Algol; if that's not "real" programming, what is?
To be precise, over the life of the product family Burroughs extended Algol into a superset called ESPOL. When Burroughs brought out the "small systems" family (1700, 1800, 1900 series), they defined another Algol-like language called SDL (Systems Development Language) in which the operating software of that line was written. Access to SDL was restricted for security reasons. A variant of SDL was subsequently created with a few of the "privileged" features removed. The resulting language, called UPL (User Programming Language), was available for customer use.
Some of us still remember when the phrase "Algol-like language" was used to describe any programming language with block-oriented control structures and variable scoping. Widely-known examples of Algol-like languages included PL/I, Pascal, and (...wait for it...) C.
Algol was the major programming language for the Burroughs B5000.
However, what's unclear is, was Algol (pure Algol, not any of its derivatives like Simula) ever actually used for any "real" programming in any way?
Please, avoid the term "real" programming. "Real" - as opposed to what? Imaginary?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).
Yes. It was used for a certain number of projects on which worked a certain number of developers.
What is often misinterpreted today is this: in those days, computers weren't exactly a household commodity. Hell, they weren't even 30 years ago, let alone 60.
Programming was done in computer centres which were either in government ownership (military, academic, institutes of various kinds) or in private enterprises (large companies). And programming wasn't a profession - it was something which engineers, mathematicians, scientists and the like did once their work on paper was done... or they had specialized operators who did it for them. Often women, who may or may not have had a scientific background in that particular field - they were "language translators", for lack of a better term (and pardon my bad English).
Programming theory and research were at their beginnings... vendors were few (and naturally uncooperative with each other)... each of them used their own extensions, and often programs written for one didn't work well on another vendor's systems.
There wasn't a "right way" to do something... you had this and that, and you used whatever trick you could figure out to work around your problem.
But I've wandered off. Let me get back to the number of people. This also goes for several other languages; Fortran and COBOL, for example.
People say, "very few use it". That's simply not true. What is true is that a small percentage of people use it today, but a larger percentage of people used to use it.
As I said, in those days only the scientific and engineering community used to do it. And their number was relatively small, compared to the total population. Nowadays everybody uses computers, but the absolute number of engineers, mathematicians and the like is pretty much the same. So it seems that nobody uses those languages anymore... while in reality, for certain specialized languages (nowadays this goes for Fortran and COBOL more than Algol), the number of users is pretty much constant.
Personally, the only Algol programming I have ever done was on paper, thus the curiosity.
I know I didn't answer your question, but just wanted to clear this up. Algol was a little before my time.
My first programming experience was on a Burroughs B5500 owned by Northern Natural Gas Company starting in 1970. I started out in COBOL but switched to ALGOL (actually used both) when they needed additional support for a large Oil & Gas Lease Information system that was written almost entirely in ALGOL. At the time there were two programming departments, Business Systems and Scientific Computing. The Scientific Computing department programmed in ALGOL and FORTRAN while the Business Systems department was mostly COBOL with a smattering of ALGOL. Northern advanced from the B5500 to B6500, B6700, B6900, B7800, and B7900 while I was there. I eventually transferred to the Technical Support department and got into making and supporting MCP patches to customize the system for Northern's needs. That was fun!
Short answer to the question: yes. Northern had a number of application systems written in ALGOL. Of course, it was Burroughs' version of ALGOL (Extended ALGOL).
Burroughs B5500 Extended Algol was used heavily for research in astrophysics, linguistics, and statistics at my university (Monash University, Australia) in the late 60s. It was also used in commercial applications that helped pay the bills for the computer center.
As I write this I am running Algol programs in the latest release of the Burroughs B5500 emulator from the team at retro-b5500 in Tasmania. The emulator runs entirely in the browser and faithfully models the processors, disks, tapes, card readers, line printers, card punches and datacom gear!
You can read about the project at http://retro-b5500.blogspot.com/ and http://code.google.com/p/retro-b5500 and you can write Algol programs for arguably the finest Algol machine ever made (except perhaps its successor the B6700.)
One of the postdocs from Monash wrote a reverse compiler from IBM Assembler to Burroughs COBOL in Algol, which was used to migrate all the billing applications at the state-run Gas & Fuel Corporation from IBM 360s to Burroughs 6700s.
Back in 1970, I helped develop a Jovial compiler for the Royal Dutch Navy. One of its big advantages was that it was written in Jovial, hence we all got to become pretty good Jovial experts. In fact, as part of the test cycle we would compile the compiler through the latest incarnation of itself and run all our test sets on that. If it passed, we would release the first compiler. Thus each release had the capability of compiling itself, and that compiler could pass all tests. As every bug found was added to our self-checking test set, the quality of the compiler improved and improved. By the time we left the project we had no known bugs... the once and only time that ever happened to me.
I programmed in Algol/Jovial back in the 70's for the military. I loved the language. You couldn't do recursion in Fortran and I often could make a program much easier by using the correct data structure and a little recursion.
After I had left that assignment, I found that the other developers on the project didn't want to maintain the Jovial code and tried to replicate what I had done in Fortran. It just didn't work and was much slower.
I learned about compiler theory by digging into the source code for the Jovial compiler. Ah... those were the days.
Algol was well implemented on the Elliott 4100 machine and was used extensively to develop early refinery process simulations at the BP Research Centre in the late 60s. However, at that time input/output was not well defined (it varied between machines), and at BP it was quickly overtaken by Fortran IV, as programs written in strict Fortran IV would run on almost any machine variation - IBM, Univac, Atlas, etc.

Development time in various languages

Does anybody know of any research or benchmarks of how long it takes to develop the same application in a variety of languages? Really I'm looking for Java vs. C++ but any comparisons would be useful. I have the feeling there is a section in Code Complete about this but my copy is at work.
Edit:
There are a lot of interesting answers to this question but it seems like there is a lack of really good research. I have made a proposal over at meta about this problem.
Pratt & Whitney, purveyors of jet engines for civilian and military applications, did a study on this many years ago, without actually intending to do the study.
They went on the same metrics kick everyone else went on in the 1990s. They collected a bunch of data about their jet engine controller projects, including timecard data. They crunched it. The poor sap who got to crunch the data noticed something in the results: the military projects uniformly had twice the programmer productivity and one/fourth the defect density as the civilian projects.
This, by itself, is significant. It means you only need half as many programmers, and you aren't going to spend quite as much time fixing bugs. What is even more important is that this was an apples-to-apples comparison. A jet engine controller is a jet engine controller.
He then went looking for candidate explanations. All of the usual candidates: individual experience, team size, toolsets, software processes, requirements stability, everything, were trotted out, and they were ruled out when it was seen that the story on those items was uniformly the same on both sides of the aisle. At the end of the day, only one statistically significant difference showed up.
The civilian projects were written in every language you could think of. The military projects were all written in Ada.
IN EVERY SINGLE CASE, against every other comer, for jet engine controllers at Pratt & Whitney, using Ada gave double the productivity and one/fourth the defect density.
I know what the flying code monkeys are going to say. "You can do good work in any language." In theory, that's true. In practice, however, it appears that, at least at Pratt & Whitney, language made a difference.
Last I heard about this, Pratt & Whitney upper management decreed that ALL jet engine controller projects would be done in Ada.
No, I don't have a citation. No paper was ever written. My source on this story was the poor sap who crunched the numbers. Here's a similar study from 1995:
http://archive.adaic.com/intro/ada-vs-c/cada_art.html
This, incidentally, was BEFORE Boeing did the 777, and BEFORE the 777 brake subcontractor story happened. But that's another story.
One of the few funded scientific studies that I'm aware of on cross-language productivity, from the early 90s, funded by ARPA and the ONR:
Haskell vs. Ada Vs. C++ vs Awk vs ... An Experiment in Software Prototyping Productivity, Hudak & Jones, 1994.
We describe the results of an experiment in which several conventional programming languages, together with the functional language Haskell, were used to prototype a Naval Surface Warfare Center (NSWC) requirement for a Geometric Region Server. The resulting programs and development metrics were reviewed by a committee chosen by the Navy. The results indicate that the Haskell prototype took significantly less time to develop and was considerably more concise and easier to understand than the...
This article (a PDF) has some benchmarks (note that it's from 2000) between C, C++, Java, Perl, Python, Rexx and Tcl.
Some common wisdom I believe holds true (also somewhere within the article):
The number of lines written per hour is independent of the language
Opinion: more important is what is faster for a given developer, for example yourself. What you are used to will usually be faster. If you have 20 years of C++ pitfalls behind you and never miss an uninitialized variable, C++ will be faster for you than Java would be.
If you remember all the parameters of CreateWindowEx() by heart, raw Win32 will be faster for you than MFC or WinForms.
A couple of anecdotal data points:
On Project Euler, which invites programming solutions to mathematical problems:
the shortest solutions are almost invariably written in J or K, relatives of APL; there are occasionally MATLAB solutions in the same range. It can be argued, though, that these languages are specialized for math.
Runners-up were Ruby solutions. A lot of algorithmic work can be wrapped in very little code, and it's much more legible than J or K.
Python and Haskell solutions also did very well, LOC-wise.
The question asked about "fastest development", not "shortest code". But it's conceivable that shorter solutions are faster to come up with - certainly for slow typists!
There's an annual competition among roboticists. Contestants are given some specs for some hardware, a practical problem to solve in software, and limited time to do so. Again very domain specific, of course. Programmers have their choice of tools, including language of course. Every year, the winning team (often a single person) used Forth.
This admittedly limited sample suggests that "development speed" and "effect of language on speed" is often very dependent on the problem domain.
See also
Are there statistical studies that indicates that Python is "more productive"?
for some discussions about this kind of question.
It would make more sense to benchmark the programmers, not the languages. The time to write a program in any mainstream language depends more on the ability of the programmer in that language than on qualities of that specific language.
I think most benchmarks and statements on this topic will mean very little.
Benchmarks can always be gamed; see the history of "Pet Store".
A language that's good at solving one kind of problem might not apply as well to another.
What matters most is the skill of your team, its knowledge of a particular technology, and how well you know the domain you're trying to solve.
UPDATE: Control software for jet engines and helicopters is a very specialized subset of computing problems. It's characterized by very rigorous, complete, detailed specs and QA that means the multi-million dollar aircraft cannot crash.
I can second the (very good) citation by John Strohm of Pratt & Whitney control software written in Ada. The control software for Kaman helicopters sold to Australia was also written in Ada.
But this does not lead to the conclusion that if you decided to write your next web site in Ada that you'd have higher productivity and fewer defects than you would if you chose C# or Java or Python or Ruby. All languages are not equally good in all problem domains.
Language/framework comparison for web applications
The Plat_Forms project provides some information of this type for web applications.
There are three studies with different tasks (done in 2007, 2011, 2012), all with the following format: several teams of three professional developers implemented the same application under controlled conditions within two days. The studies cover Java, Perl, PHP, and Ruby, with multiple teams for each language. The evaluation reports much more than only development time.
Findings of iteration one, for instance, included:
that experience with the language and framework appeared to be more relevant than which framework it was;
that Java tended to induce teams to make laborious constructions, while Perl induced them to make pragmatic (and quite handy) constructions.
Findings of iteration two included:
that Ruby on Rails was more productive in this type of project (which, due to its duration, was more rapid prototyping than full-blown development of a mature application);
that the one exception to the above rule was the one team using Symfony, a PHP framework with concepts similar to Ruby on Rails (but still a very different base language underneath).
Look under http://www.plat-forms.org or search the web for "Plat_Forms". There is plenty more detail in the reports, in particular the thick tech report on iteration 1.
Most programs have to interface with some other framework. It tends to be a good idea to pick the language that has libraries specifically for what you are trying to do. For instance, are you trying to build a distributed, redundant messaging system? If so, I would use Erlang. Are you trying to make a quick and dirty data-driven website? Use Ruby and Rails. You get the idea. For real-time DirectX where performance is key: C++/C/assembly.
If you are writing something algorithm-based, I would look to a functional language like Haskell, although it has a very steep learning curve.
This question is a little old fashioned. Focusing on development time solely based on the choice of language is of limited value. There are so many other factors that have equal or greater impact than the language itself:
The libraries or frameworks available / used.
The level of quality required (i.e. defect count).
The type of application (e.g. GUI, server, driver, etc.)
The level of maintainability required.
Developer experience in the language.
The platform or OS the application is built on.
As an example, many would say Java is the better choice over C++ to build enterprise (line of business) applications. This is not normally because of the language itself, but instead because it is perceived that Java has better (or more mature) web server and database frameworks available to it. This may or may not be true, but that is beside the point.
You may even find that building an application using the same language on different operating systems or platforms gives greatly differing development times. For example, using C++ on Linux to build a GUI application may take longer than a Windows-based GUI application using C++, because of the less extensive and mature GUI libraries available on Linux (once again, this is debatable).
According to Norvig, Lutz Prechelt published just such an article in the October 1999 CACM: "Comparing Java vs. C/C++ Efficiency Issues to Interpersonal Issues".
Norvig includes a link to that article. Unfortunately, the ACM, despite having a bitmap graphic proclaiming their goal of "Advancing Computing as a Science & Profession", couldn't figure out how to maintain stable links on their webpage, so it's just a 404 now. Perhaps your local library could help you out.
That Ada story might be an embellished version of this: http://www.adaic.com/whyada/ada-vs-c/cada_art.html
Erlang vs C++/Corba
"... As the Erlang DCC is less than a quarter of the size of a similar C++/CORBA implementation, the product development in Erlang should be fast, and the code maintainable. We conclude that Erlang and associated libraries are suitable for the rapid development of maintainable and highly reliable distributed products."
Paper here
There's a reason why there are no real comparisons in that aspect, except for anecdotal evidence (which can be found in favor of almost any language).
Actually writing code takes a relatively small portion of a developer's time. Even if a language lets you cut coding time in half, that will be barely noticeable by the time the project ends. Design, program structure, and the development process are all much more important, and then there are libraries, tools and experience with them.
Some languages are better suited for certain development processes than the others, so if you've settled on design and process you can decide which language will be more efficient - but not before.
(didn't notice there's a similar answer already, so feel free to ignore this)

What languages are used for real time systems programming? [closed]

I didn't find any useful information about programming languages for real-time systems. All I found was Real Time Systems and Programming Languages: Ada 95, Real-Time Java and Real-Time C/POSIX (some PDF here), which seems to talk about extensions of Java and C for real-time systems (I don't have the book to read). Also, the book was published in 2001, so the information may be obsolete now.
So I'm dubious about whether these languages are used in real-world applications, or whether real-time systems in the real world are made in other languages, like DSLs.
If the second option is true for you, what are the outstanding characteristics of the language you use?
I am an avionics software engineer.
I was able to participate in several development projects.
The languages I used in those projects are: C, C++, and Real-time Java.
C is great. C++ is not so bad, but C/C++ require strict coding standards for safety considerations such as DO-178B.
I think Real-time Java is the way to go, but I don't see many avionics applications yet.
The Korean jet trainer T-50 will have a mission computer running an RT Java application serving the HUD and MFD displays, and all of the mission-critical functions.
The Real-Time Specification for Java now has several commercial-grade implementations:
Sun's JavaRTS
IBM's WebSphere Real-Time
Aonix PERC
aicas JamaicaVM
Apogee Aphelion
These products span the continuum from compilation to native code (Aonix) to J2ME (aicas, apogee), to full J2SE (Sun, IBM). Most, if not all, have seen deployments in small numbers of safety- or mission-critical systems, but momentum is building. Examples include Eglin AFB's space surveillance radar modernization and the US Navy's use of RTSJ in the DDG-1000/Zumwalt destroyer. Sun also claims deployment in the financial transaction processing domain.
If you are interested in RTSJ, I suggest Peter Dibble's Real-Time Platform Programming, or Professor Wellings' Concurrent and Real-Time Programming in Java.
On a related note, there is also work underway to provide a Safety-Critical profile for the Java programming language, built as a subset of RTSJ. Also, an expert group has formed to explore a Distributed RTSJ DRTSJ, but the work is stalled.
The book covers use of Ada 95, the Java Real-Time System and realtime POSIX extensions (programmed in C). None of these is directly a domain specific language.
Ada 95 is a programming language commonly used in the late 90s and (AFAIK) still widely used for realtime programming in defence and aerospace industries. There is at least one DSL built on top of Ada - SparkAda - which is a system of annotations which describe system characteristics to a program verification tool.
This interview of April 6, 2006 indicates some of the classes and virtual machine changes which make up the Java Real-Time System. It doesn't mention any domain specific language extensions. I haven't come across use of Java in real-time systems, but I haven't been looking at the sorts of systems where I'd expect to find it (I work in aerospace simulation, where it's C++, Fortran and occasionally Ada for real-time in-the-loop systems).
Realtime POSIX is a set of extensions to the POSIX operating system facilities. As OS extensions, they don't require anything specific in the language. That said, I can think of one C based DSL for describing embedded systems - SystemC - but I've no idea if it's also used to generate the embedded systems.
Not mentioned in the book is Matlab, which in the last few years has gone from a simulation tool to a model-driven development system for real-time systems.
Matlab/Simulink is, in effect, a DSL for linear programming, state machines and algorithms. Matlab can generate C or HDL for real-time and embedded systems. It's very rare to see an avionics, EW or other defence-industry real-time job advertised which doesn't require some Matlab experience. (I don't work for MathWorks, but it's hard to overemphasize how ubiquitous Matlab really is in the industry.)
Real-time applications can be made in almost any language. The environment (operating system, runtime and runtime libraries) must, however, be compliant with real-time constraints. In most cases, real-time means that there's always a deterministic time in which something happens, with that deterministic time usually being a very small value in the microsecond/millisecond range.
Real-time systems depend solely on this criterion, as the specifications usually say something like "Every x (period of time) (do something | check something)". Usually this applies if the system interfaces with external sensors and controls life-saving or life-threatening systems.
I was working on an in-car navigation and infotainment system, developed mostly in C/C++ with an operating system configured specifically to meet the real-time constraints of providing real-time navigation and media playback.
But this is not all there is to real-time systems: usually the algorithms across the entire system are selected to have deterministic runtimes according to Big-O notation, mostly using linear or constant time. Everything else is considered non-deterministic and thus not usable for real-time systems.
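As a concrete sketch of that "every x period, do something" shape in C, using POSIX real-time clock APIs of the kind the book discusses (the 10 ms period and the work function are illustrative assumptions; sleeping to an absolute deadline avoids the drift a relative sleep would accumulate):

#include <time.h>

#define PERIOD_NS 10000000L             /* 10 ms control period */

extern void do_control_step(void);      /* hypothetical application work */

void control_loop(void)
{
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (;;)
    {
        do_control_step();              /* must finish well inside the period */

        /* advance the absolute deadline, then sleep until it */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L)
        {
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
}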
All of the real-time systems I have worked with were predominantly written in C with some bits of assembler, or written mostly in assembler with little bits of C. (Depending on whether we're talking the 90s and beyond, or the 80s, respectively.) However, some of the real-time systems I've worked with have used -- not exactly DSLs -- special homegrown code generators.
Real-time oriented language?
What is real-time
First we have to define what real-time means.
Of course, depending on how your tool interacts with the physical environment, pure real-time may not be achievable, mostly because there will be a lot of third-party dependencies.
If you are building embedded devices using microcontrollers like the Arduino, the language will be constrained by the hardware; with more complex hardware like the Raspberry Pi, the language choice is very wide.
Granularity
This depends on what you are measuring. If you're working with:
weather temperatures, one reading every 10 minutes could be enough
people's height or weight, one or maybe four readings per day
server status, anything between 1 second (for fine debugging) and roughly 1 hour (for a quiet, unimportant secondary server)
atomic collision counts: something much finer...
Event-based reading
The right (better) way of collecting data is based on value-change events... whenever the device permits it.
Your tool should not poll values from the device; the device should push values to your tool when they change.
This can be done with a hardware interrupt trigger, or with a port protocol like RS-232, staying listening on some serial port, for example - see the sketch below.
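For illustration, a minimal sketch of that event-driven pattern on a POSIX system: block in select() on the serial port's file descriptor and react only when the device actually sends data. The device path is an assumption, and real code would also configure the line with tcsetattr() and handle errors:

#include <fcntl.h>
#include <stdio.h>
#include <sys/select.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDONLY | O_NOCTTY);   /* hypothetical port */
    if (fd < 0)
    {
        return 1;
    }

    for (;;)
    {
        fd_set rds;
        FD_ZERO(&rds);
        FD_SET(fd, &rds);

        /* block until the device sends something: no polling loop */
        if (select(fd + 1, &rds, NULL, NULL, NULL) > 0)
        {
            char buf[64];
            ssize_t n = read(fd, buf, sizeof buf);
            if (n > 0)
            {
                fwrite(buf, 1, (size_t)n, stdout);      /* hand off new values */
            }
        }
    }
}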
Monitoring environment
The last thing to be aware of is how a legitimate user will interact with the tool.
If you're building an embedded standalone device, like a robot, you may use graphics libraries to interact with a touch screen.
If you're building a web-based monitor, you may have to keep in mind that the client could be an old 800x600 monochrome screen, on a poor internet connection, with a small processor... But depending on the final goal, if you control the clients, you can require strong hardware and strong internet connections. Either way, you have to watch for connection loss and account for communication delay between server and client. These are mostly third-party dependencies.
Which programming language?
From there, the language choice is wide and clearly depends on:
your knowledge
the granularity required (using event-based reading too, of course)
the amount of time you have to build the tool (money ;)
deadlines, co-workers...
the kind of device
the kind of monitoring
some other political reasons
You could build a real-time monitoring engine using only bash and SQL; I've seen sophisticated engines built on PostgreSQL alone... I've personally built a web-based solar energy monitor using Perl, MySQL and JavaScript.
I cannot believe no one has mentioned LabVIEW, a programming language which is widely used for real-time safety-critical systems. It has extensive libraries and well-known design patterns for architecting and implementing RT systems.
Also, National Instruments makes various hardware (cRIO, PXI, etc.) designed for real-time applications.
We use LabVIEW for fracking (hydraulic fracturing), which is done in safety-critical environments.
PLCs run ladder and FBD code, which is really a real-time DSL in the sense that your options are so limited that it is difficult to program in a way that would result in unpredictable runtime performance.
A really purposeful application of the C language to real-time programming - and all related issues (such as parallel programming) - is offered by my Kickstarter
http://www.kickstarter.com/projects/767046121/crawl-space-computing-with-connel
It is called "Wide Programming" and I've been doing it most of my life. The rewards include a software library and a book - designed to be useful.
The company I've been working for since 2003 has been developing and deploying a SCADA/MES platform. The original implementation started in 1993, using Modula-2 on OS/2. Later (1998) it was ported to Ada 95 and Windows. Currently (2019) we use the Ada compiler by AdaCore. Our system has been ported and deployed to 32/64-bit Windows, HP-UX, OpenVMS (and lately even to the Raspberry Pi). We have multiple installations in central Europe (gas industry, refineries, factories, power plants).
We feel Ada's features give our system a high degree of reliability and prevent a lot of errors that would easily occur if we used languages like C.
See also my blog
https://www.ipesoft.com/en/blog/what-language-is-the-d2000-written

Resources