What did they program this toy with? - programming-languages

A rather strange question: I often ask myself what programming languages things were created with. At home I recently found this toy mini computer I played with when I was 13 or so. (Note: it is not one of those toy "notebooks"; it's really small and came as an extra with a magazine.)
"Features":
Hardware:
An LCD with a small field of pixels where the games ran, plus some stats such as score, high score, etc.
Sounds and horrible music when started
A really small "keyboard" with a wire
Software:
At least 14 or so games, ranging from Snake and Tetris to Breakdown and some abomination of a car racing game
A calculator
Game selecting menu
An alarm clock
Inside there is a really small circuit board, I don't want to open the thing up now, though.
Do you think the games and "operating system" of this thing were actually programmed using a programming language?
If yes, what language could it be?
If not with a programming language, how else was it created?

If I were to hazard a guess I'd say they used C; it's often used with microcontrollers in devices like that.

The question is really architectural. Is there a microprocessor in there at all? If so it's likely to have been programmed in quite a low-level language - assembler or C are quite common. However, there might conceivably not be a processor; it might be implemented as custom silicon, either an FPGA or (unlikely) an ASIC, either of which you'd program in VHDL or Verilog.

Anybody's guess. A frequently used tactic when trying to cram a lot of software into a mass-market device (where saving 10c on storage can matter) is to use some kind of bytecode interpreter, where the bytecodes are designed to save space, even if they execute fairly slowly. FORTH used to be popular for this purpose, but there are an awful lot of one-off bytecodes in the world. One that has survived for adventure gaming is the Infocom Z-Machine.
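To make the space-saving idea concrete, here is a minimal sketch in C of the kind of dispatch loop such an interpreter might use. The opcodes, the handlers, and the sample program are invented for illustration; they are not taken from any particular device or bytecode.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical one-byte opcodes; a real device would define its own set. */
    enum { OP_HALT = 0x00, OP_PUSH = 0x01, OP_ADD = 0x02, OP_PRINT = 0x03 };

    /* The program is a compact byte stream: push 2, push 3, add, print, halt. */
    static const uint8_t program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };

    int main(void) {
        int32_t stack[16];
        int sp = 0;                   /* stack pointer */
        const uint8_t *pc = program;  /* "program counter" into the byte stream */

        for (;;) {
            switch (*pc++) {
            case OP_PUSH:  stack[sp++] = *pc++;              break; /* operand is the next byte */
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]);    break;
            case OP_HALT:  return 0;
            default:       return 1;  /* unknown opcode */
            }
        }
    }

The whole sample program above is seven bytes; that density, not execution speed, is what makes the approach attractive when storage is the expensive part.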

It's clearly an embedded microcontroller.
While in principle it could have been programmed in almost any language, I would be surprised if it were written in anything other than assembly language or C.
My understanding is that all operating systems before 1972 and practically all embedded systems before 1980 or so were written entirely in assembly language, perhaps (as Norman Ramsey pointed out) with a one-off domain specific language (DSL) on top.
Assembly then declined in popularity, and C became the most popular microcontroller programming language.
Even up to around 2000, practically all embedded systems used at least a little assembly language to handle the things that no available higher-level language could handle.
Even today, of the thousands of embedded-system microcontrollers available, the vast majority have no more than four programming languages available for them off-the-shelf:
assembly language, C, BASIC, and Forth.
(I'm hoping that Python will become available for more microcontrollers -- the Pyastra and PyMite dialects already cover a couple of the most popular microcontrollers).
http://www.faqs.org/faqs/microcontroller-faq/primer/
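For a sense of what "assembly or C on a tiny microcontroller" looks like in practice, here is a rough, hypothetical C sketch of the kind of polling main loop such a toy might run. The register names and addresses (LCD_DATA, KEYPAD, TIMER_FLAG) are made up for illustration and do not correspond to any real part.

    #include <stdint.h>

    /* Hypothetical memory-mapped I/O registers; the addresses are invented. */
    #define LCD_DATA   (*(volatile uint8_t *)0x40)
    #define KEYPAD     (*(volatile uint8_t *)0x41)
    #define TIMER_FLAG (*(volatile uint8_t *)0x42)

    static uint8_t framebuffer[8];   /* one byte per row of a tiny 8x8 LCD matrix */

    static void draw(void) {
        /* Push the framebuffer out to the display, one row per write. */
        for (uint8_t row = 0; row < 8; row++)
            LCD_DATA = framebuffer[row];
    }

    int main(void) {                 /* bare metal: no OS, so this loop never returns */
        for (;;) {
            uint8_t keys = KEYPAD;   /* poll the buttons */
            (void)keys;              /* ... update game state from `keys` here ... */
            draw();
            while (TIMER_FLAG == 0)  /* crude frame pacing off a hardware timer flag */
                ;
        }
    }

Most likely the "operating system" on such a toy is nothing more than a loop like this plus the menu code, all linked into a single ROM image.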

Related

Ex Commodore 64 programmer wants to get back into programming - any suggestions?

When I was a kid I wrote hundreds of programs in BASIC but then as I got older I got out of it (when I discovered girls). Now I want to get back into it again and I don't want to let my prior knowledge & experience go to waste - is there a modern language that is at least somewhat similar? Every time I try to search I get pushed toward Visual BASIC but I would rather learn a modern language that's more widely used. Any suggestions? Thank you in advance!
Start from scratch.
Programming in a modern language (Object or Functional) is different enough from programming basic on a C64 that you will probably carry over more bad habits than good ones.
I would pick a language you like the look and feel of, but mostly think of what you want to do:
Java is probably the "safe" bet, especially if you want to start a career in programming or if you want to work on Android development.
If you want to program for Windows / Microsoft devices then C#
If you want to write for the Mac or iOS devices then Swift.
If you like the idea of functional programming then Clojure is a good bet.
If you want to do web development then JavaScript and maybe Ruby
If you want to work on things like machine learning or statistics then Python to start and then maybe R
If you want to be cutting edge and maybe work on some DevOps kind of things I would suggest Go
With all of these I would suggest also learning some flavor of SQL
Languages I personally would generally avoid either because they are overly complex or tend to teach bad programming practices:
Objective C, C++, Perl, Lisp, Ruby
If you want to explore some other more esoteric languages I recommend two books:
Seven Languages in Seven Weeks
Seven More Languages in Seven Weeks
Keep in mind that just because you might start from scratch, it doesn't mean your prior experience goes to waste; it just may not be as useful as you'd like.
I was in this exact position about eight years ago; whilst I could do some assembly and BASIC, these skills were (and are) generally not required in a modern context. So I went to study a Foundation Degree in Enterprise Computing in the UK (MMU affiliated) because it included Java. Due to a government change in 2010 that cut funding to higher education, the third year of that course was scrapped for all affiliated establishments, so I spent a year at the University of Derby on its Games Programming degree, which was all programming in C, C++, MIPS assembly, C#, and Java.
I found the following useful:
6502 is good if you want to learn more modern assembly like MIPS; Z80 is probably good for x86/64, though that is an educated guess rather than fact (I use both 65x and Z80 in personal projects today, mixed with C, when I get the chance);
C is the most beautiful language that I've ever used. I did C programming on Windows and for the PSP. I've since made Sinclair ZX81 games with C and done a bit of experimental programming for the Commodore 64 and Sinclair ZX Spectrum. I love C;
Object-oriented programming took me a while to get my head around. At first, I thought an object was simply a container for computer RAM. Maybe this is a good base to think about it, maybe not;
Going to University is a good thing because you will always learn something if you apply yourself;
8-bit BASIC can still teach you a thing or two if you can transpose your logic without the bad practices that 8-bit BASIC encourages;
I had the most difficulty understanding databases, mostly the relational algebra side but also all the other database stuff. I finally got my head around M:M relationships sometime last year, after years of looking at it. If you struggle with SQL/database stuff, don't give up;
I now work as a PHP web application developer with bespoke OO and procedural frameworks. I have also worked with simpler off-the-shelf solutions such as Magento, CodeIgniter, Joomla! and ExpressionEngine (built on CodeIgniter).

Was ALGOL ever used for "mainstream" programming? [closed]

I know that the ALGOL language is super-uber-extremely important as a theoretical language, and it also had a variety of implementations, as per Wikipedia.
However, what's unclear is, was ALGOL (pure ALGOL, not any of its derivatives like Simula) ever actually used for any "real" programming in any way?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).
Personally, the only ALGOL programming I have ever done was on paper, thus the curiosity.
Algol58 seems to have been the most successful in terms of important applications.
From Wikipedia:
JOVIAL is an acronym for "Jules Own Version of the International Algorithmic Language." The "International Algorithmic Language" was a name originally proposed for ALGOL 58. It was developed to compose software for the electronics of military aircraft by Jules Schwartz in 1959.
Then:
Notable systems using JOVIAL include the Milstar Communications Satellite, Advanced Cruise Missile, B-52, B-1B, B-2 bombers, C-130, C-141, and C-17 transport aircraft, F-111, F-15, F-16 (prior to Block 50), and F-117 fighter aircraft, LANTIRN, U-2 aircraft, E-3 Sentry AWACS aircraft, Navy Aegis cruisers, Army Multiple Launch Rocket System (MLRS), Army UH-60 Black Hawk helicopters, F100, F117, and F119 jet engines, the NORAD air defense & control system (Hughes HME-5118ME system) and RL-10 rocket engines. Airborne radar systems with embedded JOVIAL software include the APG-70, APG-71 and APG-73.
ALGOL 68 was used in parts of the DRA for the same purpose. Cf. Wikipedia:
The Defence Research Agency (normally known as DRA) was an executive agency of the UK Ministry of Defence (MOD) from April 1991 until April 1995. At the time the DRA was Britain's largest science and technology organisation.
DRA's ALGOL 68 compiler was finally open-sourced in April 1999 and is now available for Linux, for download from SourceForge. (However, the "Algol68g" interpreter is easier to use.)
ICL's ALGOL 68 was/is S3. It was developed by the UK company International Computers Limited (ICL) for its 2900 Series mainframes; it is a systems programming language based on ALGOL 68 but with data types and operators aligned to those offered by the 2900 Series. It was the implementation language of the operating system VME.
There are (at least) two other British operating systems written in ALGOL 68 variants: Flex and the OS of the Cambridge CAP computer. There is also one Soviet OS, Эльбрус-1 (Elbrus-1), but I have yet to find any of their source code. (If anyone can find and distribute this source code, please let me know.)
BTW: I believe that VME is still running, in production, as a Linux/Unixware guest VM, mostly at Commonwealth of Nations customs/immigration services.
Also, over the same period the USSR was using ALGOL 68; cf. the history link. ALGOL 68 is used in Russian telephone exchanges, and ALGOL 58 was used in the landing system of the Russian "Buran/Буран" space shuttle.
ALGOL 68 was internationalized in 1968. I suspect there are other ALGOL projects in other countries, especially in German, Dutch, Japanese, and Chinese, but I have no details.
If you want to actually try out ALGOL 68 and/or contribute your code, check out Rosetta Code's ALGOL 68 repository, then as a class project try one of the "Tasks not implemented".
Nothing like responding to two-year-old threads. I program in ALGOL almost daily. I am a programmer on a Unisys ClearPath mainframe, and the majority of the system code is written in ALGOL or variants. The Burroughs B5500 was really designed around the language, so it is a pretty efficient language/compilation process. Granted, this version is ALGOL with some extensions like limited classes (structure blocks), etc.
i := 80;
while i > 0 do
  begin
    scan ptrRay:ptrRay for i:i until in ALPHA;
    scan ptrEnd:ptrRay for i:i while in ALPHA;
    if i > 0 then
      begin
        replace nextToken by ptrRay for (offset(ptrEnd) - offset(ptrRay));
      end;
  end;
That code scans for ALPHA-only tokens. It uses the OFFSET function, which is a little more costly than doing the residual-count math yourself (i, starti, etc.).
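For readers without access to Burroughs ALGOL, a rough C analogue of that ALPHA-token scan might look like the sketch below. It only illustrates the idea (skip until alphabetic, then skip while alphabetic, then take the span between the two pointers); it does not reproduce the exact semantics of SCAN/REPLACE or the residual count.

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *p = "  foo 42 bar  ";
        char token[64];

        while (*p) {
            while (*p && !isalpha((unsigned char)*p))  /* like: scan ... until in ALPHA */
                p++;
            const char *start = p;
            while (*p && isalpha((unsigned char)*p))   /* like: scan ... while in ALPHA */
                p++;
            if (p > start) {                           /* found an alphabetic token */
                size_t len = (size_t)(p - start);      /* offset(ptrEnd) - offset(ptrRay) */
                memcpy(token, start, len);
                token[len] = '\0';
                printf("token: %s\n", token);
            }
        }
        return 0;
    }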
Like Tom, I program in ALGOL almost daily - and I'm also on a Unisys Clearpath. ALGOL has been the primary source of my mortgage repayments for more years than I care to remember.
When I started programming, Algol was the only compiler available. Yes, it was mainstream till we got a Fortran compiler.
To follow up on themis' answer, the entire Burroughs "large system" family (5000, 5500, 5700, 6500, 6700...) was really designed to run Algol well. The operating system, compilers, and major system utilities were written in Algol; if that's not "real" programming, what is?
To be precise, over the life of the product family Burroughs extended Algol into a superset called ESPOL. When Burroughs brought out the "small systems" family (1700, 1800, 1900 series), they defined another Algol-like language called SDL (Systems Development Language) in which the operating software of that line was written. Access to SDL was restricted for security reasons. A variant of SDL was subsequently created with a few of the "privileged" features removed. The resulting language, called UPL (User Programming Language), was available for customer use.
Some of us still remember when the phrase "Algol-like language" was used to describe any programming language with block-oriented control structures and variable scoping. Widely-known examples of Algol-like languages included PL/I, Pascal, and (...wait for it...) C.
Algol was the major programming language for the Burroughs B5000.
However, what's unclear is, was Algol (pure Algol, not any of its derivatives like Simula) ever actually used for any "real" programming in any way?
Please, avoid the term "real" programming. "Real", as opposed to what? Imaginary?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).
Yes. It was used for a certain number of projects on which worked a certain number of developers.
Only, what is often misinterpreted today is this: in those days, computers weren't exactly a household commodity. Hell, they weren't 30 years ago, let alone 60.
Programming was done in computer centres which were either in government ownership (military, academic, institutes of various kinds) or in private enterprises (large companies). And programming wasn't a profession; it was something which engineers, mathematicians, scientists and the like used to do once their work on paper was done ... or they had specialized operators who did it for them. Often women, who may or may not have had a scientific background in that particular field; they were "language translators", for lack of a better term (and my bad English).
Programming theory and research were at their beginnings ... vendors were few (and naturally uncooperative with each other) ... each of them used their own extensions, and often programs written for one didn't work well on another vendor's systems.
There wasn't a "right way" to do something ... you had this and that, and you used whatever trick you could figure out to work around your problem.
But, I've wandered off. Let me get back to the number of people. This also goes for several other languages; Fortran and COBOL, for example.
People say, "very few use it". That's simply not true. What is true is that a small percentage of people use it today, but a larger percentage of people used to use it.
As I said, in those days only the scientific and engineering community used to do it. And their number was relatively small compared to the total population. Nowadays, everybody uses computers, but the absolute number of engineers, mathematicians and the like is pretty much the same. So it seems that nobody uses those languages anymore ... while in reality, for certain specialized languages (well, nowadays this goes for Fortran and COBOL more than ALGOL) the number of users is pretty much constant.
Personally, the only Algol programming I have ever done was on paper, thus the curiosity.
I know I didn't answer your question, but I just wanted to clear this up. ALGOL was a little "before my time".
My first programming experience was on a Burroughs B5500 owned by Northern Natural Gas Company starting in 1970. I started out in COBOL but switched to ALGOL (actually used both) when they needed additional support for a large Oil & Gas Lease Information system that was written almost entirely in ALGOL. At the time there were two programming departments, Business Systems and Scientific Computing. The Scientific Computing department programmed in ALGOL and FORTRAN while the Business Systems department was mostly COBOL with a smattering of ALGOL. Northern advanced from the B5500 to B6500, B6700, B6900, B7800, and B7900 while I was there. I eventually transferred to the Technical Support department and got into making and supporting MCP patches to customize the system for Northern's needs. That was fun!
Short answer to the question: yes. Northern had a number of application systems written in ALGOL. Of course it was Burroughs's version of ALGOL (Extended ALGOL).
Burroughs B5500 Extended Algol was used heavily for research in astrophysics, linguistics, and statistics at my university (Monash University, Australia) in the late 60s. It was also used in commercial applications that helped pay the bills for the computer center.
As I write this I am running Algol programs in the latest release of the Burroughs B5500 emulator from the team at retro-b5500 in Tasmania. The emulator runs entirely in the browser and faithfully models the processors, disks, tapes, card readers, line printers, card punches and datacom gear!
You can read about the project at http://retro-b5500.blogspot.com/ and http://code.google.com/p/retro-b5500 and you can write Algol programs for arguably the finest Algol machine ever made (except perhaps its successor the B6700.)
One of the postdocs from Monash wrote a reverse compiler from IBM Assembler to Burroughs COBOL in Algol, which was used to migrate all the billing applications at the state-run Gas & Fuel Corporation from IBM 360s to Burroughs 6700s.
Back in 1970, I helped develop a Jovial compiler for the Royal Dutch Navy. One of its big advantages was that it was written in Jovial, hence we all got to become pretty good Jovial experts. In fact, as part of the test cycle we would compile the compiler through the latest incarnation of itself and run all our test sets on that. If it passed, we would release the first compiler. Thus each release was capable of compiling itself, and that compiler could pass all tests. As every bug found was added to our self-checking test set, the quality of the compiler improved and improved. By the time we left the project we had no known bugs... the one and only time that ever happened to me.
I programmed in Algol/Jovial back in the '70s for the military. I loved the language. You couldn't do recursion in Fortran, and I could often make a program much simpler by using the correct data structure and a little recursion.
After I had left that assignment, I found that the other developers on the project didn't want to maintain the Jovial code and tried to replicate what I had done in Fortran. It just didn't work and was much slower.
I learned about compiler theory by digging into the source code for the Jovial compiler. Ah... those were the days.
Algol was well implemented on the Elliott 4100 machine and was used extensively to develop early refinery process simulations at the BP Research Centre in the late 60s. However, at that time input/output was not well defined (it varied between machines), and at BP it was quickly overtaken by Fortran IV, as programs written in strict Fortran IV would run on almost any machine variation: IBM, Univac, Atlas, etc.

What fast low-level languages can you recommend?

I have become interested in C-like languages for performance computing. Can you recommend some alternative programming languages which have the following attributes:
must be close to the hardware (bit fiddling, pointers or some alternative safe method like references)
no managed code (no jvm/.net languages)
has to be really fast (like C)
must be above ASM level (and yes I am interested in macro languages on top of ASM)
can be obscure, not very widespread
I am mainly interested in little-known languages.
How about Assembly language, or the D programming language?
If you don't know about it and are interested just in broadening your horizons, take a look at Forth. Reading about Forth always makes me feel C is high-level.
Well, I've always preferred C and/or C++ because there are multiple flavours (MSVC, glibc, etc.), it runs on many different platforms (e.g. mobile devices, Windows, Linux) and devices, and it can be written cross-platform (different processor architectures) and even for high-end graphics (e.g. DirectX).
You get "decent" access to platform resources (conditions vary), it can be as fast as you choose to hone it, and it's a tad easier (IMHO) to write than ASM. There's also a pretty decent range of support tools and code analysis tools to make things a little easier.
Also C and C++ have been around for quite some time, so it's got (even today) an excellent and enthusiastic community!
You don't explicitly state that it can't be C in your question, so I'll go ahead and recommend C. It fulfills your three bulleted desires, and you won't have to worry about different versions of the language (like each different kind of assembler).
Forth!
Forth can be faster than conventionally compiled machine code on some architectures: the compiled code is extremely dense and therefore makes optimal use of code caching.
Assembly would be the closest to the hardware and therefore the fastest.
Ada was originally designed for embedded systems (among other things).
OpenCL might be interesting. It's sort of like OpenGL shader language (a subset of C with extensions), but for general purpose parallel array computing.
You could start programming FPGAs in VHDL, Verilog, SystemC ...
Variations on a theme
FORTRAN is older than C, and is still one of the major players in numerical computing. Until 1990 (when the language was substantially modernized), the language didn't have any form of pointer (checked or not). This lack meant that there was no way to manage memory dynamically; it also made aliasing analysis easy for the compiler, which is one of the things that makes Fortran code fast.
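The aliasing point can be seen from the C side too: unless told otherwise, a C compiler must assume that two pointer arguments might overlap, which blocks vectorization and reordering; C99's restrict keyword is the opt-in that gives C roughly the guarantee Fortran gets for free on its array arguments. A minimal sketch:

    /* Without restrict, the compiler must assume dst and src could overlap,
     * so it cannot freely vectorize or reorder the loads and stores.
     * With restrict, the caller promises they do not overlap -- roughly the
     * guarantee Fortran has always given the compiler for dummy array arguments. */
    void scale(double *restrict dst, const double *restrict src, double k, int n) {
        for (int i = 0; i < n; i++)
            dst[i] = k * src[i];
    }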
ALGOL was the first structured programming language. Although it had limited success with programmers, it had a strong influence on language designers.
Ada is an imperative language with a strong type system and good modularity, which makes it good for low-level programming with strong assurance requirements (it was sponsored by the US government with military and avionics applications in mind). It was inspired by Pascal, like Modula-2 and Modula-3.
Going further from the mainstream of low-level imperative programming, there is FORTH. FORTH can be compiled for, and even interpreted on, devices with very little memory; it finds a lot of use on low-end embedded systems, including microcontrollers. The language is based on reverse polish notation, made famous by HP calculators (in fact, the language of HP calculators is strongly influenced by FORTH). Many implementations don't have variables: all data is kept on one or more stacks.
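To see why "all data on a stack" pairs so naturally with reverse Polish notation, here is a tiny sketch in C (not real Forth, just the underlying idea): the expression 3 4 + becomes three operations on a data stack.

    #include <stdio.h>

    static int stack[32];
    static int sp = 0;                 /* data-stack pointer */

    static void push(int v) { stack[sp++] = v; }
    static int  pop(void)   { return stack[--sp]; }

    int main(void) {
        /* The RPN expression "3 4 +" expressed as stack operations. */
        push(3);
        push(4);
        push(pop() + pop());           /* '+' pops two values and pushes their sum */
        printf("%d\n", pop());         /* prints 7 */
        return 0;
    }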
Just for fun, I'll mention INTERCAL, the granddaddy of esoteric languages.
Stuff that will blow your mind
Esoteric languages can be instructive, and quite a few work close to the machine (usually a virtual machine, but in principle you could implement them for an actual computer if you were crazy enough). You could look at brainfuck (a sort of intermediate stage between Turing machines and C), or the many single-instruction languages, or befunge (what if memory were a two-dimensional array?).
Cyclone looks a lot like C. The syntax is the same, and Cyclone has pointers, untagged structures and unions, goto statements and manual memory management. And yet it's a safe language: you can't have a dangling pointer or a buffer overflow. And you have access to high-level features such as pattern matching, exceptions, polymorphism, abstract types and optional automatic memory management (not just garbage collection, but also regions). Cyclone is both useful and instructive; for a C die-hard, it can be a good way of discovering what makes a safe language. Cyclone compiles to C, so you can run your programs anywhere you have a C compiler.
Going in a different direction, if you want to be close to the hardware, while still not actually designing hardware, have a look at synchronous languages such as Lustre and Esterel. These languages are used to program high-assurance realtime systems such as nuclear plants, airplanes and railway signaling. They give up Turing completeness and gain the assurance that programmers can know exactly how fast their program will run and how much memory it will require. If you think C is close to the machine, finding out what a language that is really close to the machine looks like may come as a shock.
You can't get much closer than assembly language, unless you get a job with a chip-maker and start writing microcode!
If you're on Windows I think you can get hold of Microsoft MASM (the macro assembler), which will allow you to get up and running quickly. I used it a long time ago and it's not a bad product.
It seems a bit awkward to answer my own question, but I have found two languages:
Pyrex
Vala
They may not fulfill all of the constraints, but they are great for performance computing and both translate to C.

Do certain languages have intrinsic processor architectures by-design

I'm curious to know if certain languages are, by design, better suited for certain processor architectures. When I say architectures, I don't mean ARM/PPC/MIPS but more stack, accumulator, or register based architectures.
For example, I can think of Forth, which is a stack architecture. Any others?
Yes, definitely... it goes the other way as well: many hardware architectures are designed to accommodate certain languages.
RISC architectures are very much a response to the fact that people moved from assembly to compiled languages like C/C++.
Burroughs B5000 had Algol instead of assembler.
There are several different Forth chips.
Lisp machines were designed to run Lisp efficiently.
Java processors run Java bytecode in hardware.
Some ARM processors have (optional) Java acceleration technology.
Probably many more good examples are available.
Yes, they do. For example, the Occam programming language was originally targeted specifically at the Transputer architecture.
Perhaps this is a bit of a smart-ass answer, but:
The assembly languages of the processors involved are tightly linked to the architecture, so, yes, there do exist some languages where it is true.
Whether higher-level languages exhibit the same is perhaps more interesting.
I saw a talk on Google Video by Simon Peyton Jones that touched on this. He mentioned that back in the day people were very interested in building hardware specialized to execute a particular language, but people figured out a better way to solve the problem: make the compiler smarter. Take a look at Haskell. GHC produces some ridiculously fast code from high-level constructs, yet Haskell is so much unlike x86 assembler that the two look alien to each other. The same kind of thing happened with Java and Lisp: Java and Lisp are both very fast on modern computers and take decent advantage of our processors, but Java was originally compiled for a weird stack-based bytecode, and long ago people built Lisp machines.
Here's the video, by the way. Most of it is not relevant to the current question but you may find it interesting, it's about "why functional programming is important" and how to make unit tests the easy way.
http://video.google.com/videoplay?docid=-4991530385753299192&hl=en
It's only fairly recently (the last decade or so?) that compilers have become smart enough to make Haskell and Java almost as fast as C, even though neither of them exposes much of the underlying architecture. Heck, GHC doesn't even use the stack; how wacky is that?
The best-known example is of course C.
C was written in the early 1970s to suit the DEC PDP-11
For example, on the PDP-7 the programming language B had only one data type, but when it was ported to the PDP-11, which had data types of different sizes, data types for variables were added to the language.
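That history is still visible in C's type system: B had a single machine-word type, and C added differently sized types to match the PDP-11's bytes and words, with the exact sizes left implementation-defined ever since. A trivial sketch to see what your own platform chose:

    #include <stdio.h>

    int main(void) {
        /* The sizes below vary by platform; on the original PDP-11, char was
         * one byte and int was a 16-bit word. */
        printf("char:  %zu byte(s)\n", sizeof(char));
        printf("short: %zu byte(s)\n", sizeof(short));
        printf("int:   %zu byte(s)\n", sizeof(int));
        printf("long:  %zu byte(s)\n", sizeof(long));
        return 0;
    }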
Most languages target the von Neumann architecture, which is the basis of most CPUs.
Occam for the Transputer, mentioned by Neil Butterworth, is a notable exception.
VHDL is another exception, based on a data-flow concept, but it is not a programming language; it is a hardware description and simulation language.

Which language would you use in your OS?

This is probably more of a subjective question, but which language (not API like .NET or JDK) would you use should you write your own operating system? Which language provides flexibility, simplicity, and possibly a low-level interface to the hardware? I was thinking Java or C...
C, of course.
Haskell.
Once you have flipped the right hardware bits, C is a terrible language to use for the rest of the OS. Things like the scheduler, filesystems, drivers, etc. are complex high-level algorithms, and you don't want to be writing those in assembly language (or C; same thing). It's too hard to get right. (The VM subsystem and memory manager may need to be written in something low-level, as you will need to bootstrap your high-level language's runtime somehow.)
Anyway, this isn't just a crazy idea that I am coming up with for SO. Here is an OS written in Haskell: http://programatica.cs.pdx.edu/House/
Lisp is another good choice; the original Lisp machines were infinitely more tweakable (at runtime) than "modern" OSes like UNIX and Windows.
Sometimes history forgets good ideas (often in the name of "maximum performance"), and that makes me sad.
D would be an interesting choice. From its own description:
D is a systems programming language. Its focus is on combining the power and high performance of C and C++ with the programmer productivity of modern languages like Ruby and Python. Special attention is given to the needs of quality assurance, documentation, management, portability and reliability.
The D runtime assumes the existence of a garbage collector, which would not be appropriate for the very lowest levels of the kernel. However, it would be appropriate for many of the higher layers.
Build the basic components like task schedulers, drivers, etc. with assembly, then build the higher-level components like applications and tools with C.
I believe this is how Windows XP was built too (unsure about Windows Vista and Windows 7).
Definitely... C
C, ASM, C#
Singularity
Low-level in something like Haskell or D. Productivity over performance, in my opinion. You can rewrite slow parts in C++ or even assembly later if the need arises.
High-level in Python or Ruby. Ideally I'd also have a really fast JIT-capable VM for that language, but that's not going to happen for either language for a while. Lua would be a good alternative if speed gets in the way.
The kernel has to be written in a low-level language, C is by far the best choice for this, because it is so memory efficient. The higher levels could be built with a combination of Java or, more ideally, Objective-C, and scripting languages like Python and Ruby, or Lua.
Honestly, I would either use C or some hierarchy of languages that I had either designed or fit together completely seamlessly. What I would be looking for is a seamless experience that starts at the bare metal level and then I could move to higher and higher level languages as I moved up the problem space. I would probably chose something like:
C - for bare metal stuff like drivers, kernel, etc
Java/C# - for application-level things like administration consoles, OS apps
Python/PowerShell - for scripting activities like common administrative tasks (creating a new user, etc)
Personally, I think C/C#/PowerShell is more tightly integrated and the type of experience I'd be looking for. Of course, if I ever got so ambitious as to write an OS, I would have a lot of spare time on my hands and would probably really enjoy tackling the language stack first. So maybe it would be L/L#/LScript ...
BitC seems to have this in mind. Despite its name, it seems to be the midpoint between assembly language and Lisp. The goal was to make a language with a strong correspondence to machine language but with an intermediate representation that supports stronger correctness inferences than is possible with most other common languages. The language was created as part of the Coyotos project, an operating system with lofty goals of security and reliability. Formal verification is made significantly more feasible by the IR used in BitC.
Ada:
Ada is a structured, statically typed, imperative, and object-oriented high-level computer programming language, extended from Pascal and other languages. It was originally designed by a team led by Jean Ichbiah of CII Honeywell Bull under contract to the United States Department of Defense (DoD) from 1977 to 1983 to supersede the hundreds of programming languages then used by the DoD. Ada is strongly typed and compilers are validated for reliability in mission-critical applications, such as avionics software.
Ada, because it was not only specifically developed for such projects, but it also provides support for several very useful high level features (such as support for strong typing, concurrency and abstraction) that are simply not available in standard C.
So that, even as a project grows, you don't have to work around language limitations (think encapsulation, abstraction, namespaces in C).
Don't get me wrong, C works obviously for a great many of projects, but once a project has gained a certain size (think Linux kernel, gcc, GNOME), you will inevitably appreciate certain features of more high level languages to make the development process less tedious and also less obfuscated.
In C, however, these features usually end up being (pretty poorly) emulated by excessive and almost perverse use of the preprocessor (this can, for example, be seen in the gcc code base), so that you get to see lots of nested macros that, from an implementation point of view, actually emulate features found in other programming languages.
In addition, Ada is the only programming language that I am aware of that actually provides standardized support for source code analysis, using ASIS; having such a facility in place is, however, a prerequisite for being able to maintain and transform/re-engineer a code base in the long run (think refactoring).
Having an interface like ASIS available, means that you can actually implement "semantic patching", where you can automatically rename a file, function or variable/data structure and it will actually work.
Java?? No, Java runs on a virtual machine, which needs an OS to run on top of.
Maybe C and some ASM ;)
I would go with D to see whether it can do it.
I would only pick the following 3 out of practicality.
C (good old fashioned)
C++ (C with stuff tacked on. Windows is partially written in this)
Java (the medium-level language that just might have a capable garbage collector with controllable pauses, thanks to G1).
If I were going to start a new OS I'd do it with the subset of C++ recommended by the embedded industry. You can use things like classes, use it "as a better C", and be just as fast. Just avoid things that have massive overhead. You can even use some template features, if you stick to a certain subset that basically doesn't have any overhead. Look on embedded.com for features in C++ that have little to no overhead but will allow you to organize your code better than you ever could in C.
Oberon? I guess I miss Pascal too much sometimes. C paid the bills for quite a while, but I don't really love it.
Lisp of course!
[Image title text: "Some say the world will end in fire; some say in segfaults."]
For an OS, you want speed at the lowest levels. So assembly, C, C++, Objective-C, or Java seem to be the current choices. Although it's just recently that Java got fast, and it's hard for me to imagine an OS with garbage collection.
If I were writing my own, it would be a mix of assembly and C.
A C or C++ microkernel with a JIT for a highly dynamic language like Ruby or maybe a language with native support for the Prototype pattern. Even device drivers in that language.
Not because it's practical but because it's really cool. Cool in the way that NeXTStep was cool for using Obj-C for pretty much everything.
http://www.dwheeler.com/sloc/redhat71-v1/redhat71sloc.html - share of languages in Linux's source code.
C, for a number of reasons. Other candidates, like D, are great. However, C has this advantage: there's a lot of open-sourced C code available that you could reuse in your project (much more than there is for other languages appropriate for systems programming).
I would be torn between using some existing low-level language and writing my own, based on C# but with much better generics support.
In the second case I would make each method generic, but all the constraints would be resolved by the compiler, to allow "duck typing" like in Scala while keeping the language static. Static virtual methods would also shrink the codebase.
I've had that idea for a long time, but it never seems to be doable in real timeframe, so who knows maybe in the future. :-)
Some would say Java.
Note that Open Firmware is written partly in Forth, and it's very low level.
Have an open mind.
"The kernel has to be written in a low-level language, C is by far the best choice for this, because it is so memory efficient. "
Um... What about FORTH?
FORTH can be low level and high level, so you could have a whole operating system written in FORTH from the ground up, and still have a nice easy REPL scripting environment on top, also in FORTH.
However, any decent operating system should support lots of languages on top, from C all the way to Python, Ruby, and JavaScript. Making FORTH the basis for it all has a lot of benefits, though.
Edit: I'd only ever attempt this for an embedded environment with a single known hardware set. Trying to write an OS that could compete with Linux or Windows is a fool's errand.
If this isn't a hypothetical question, and you're looking to create your own OS, I'd probably go with C because most of the examples out there are written in C.
Also (and I haven't built an OS yet, so take this with a grain of salt), I'm thinking that the C runtime libraries would be a lot easier to port to your new OS than, say, .NET.
Pascal + Oberon: they have the power of C and C++ but they're not as daunting to use. Both these languages are grossly underappreciated.

Resources