Standard (simple?) benchmark code/test? - programming-languages

Is there some kind of standard benchmarking system or outline or something? I am looking at Go, LLVM, D and other languages and I wanted to know how they fare in execution time, memory usage, etc.
I found https://benchmarksgame-team.pages.debian.net/benchmarksgame/ but the code is NOT THE SAME. In one example the C++ source is < 100 lines while the C source is > 650. I hardly call that fair. Another test in its source makes the stupid mistake of putting a lock inside the loop while the other languages put it outside.
So I wanted to know about some tests I might consider looking at/running that use no nonstandard or even complex libs, ideally implemented completely inside a single source file. Something fair.

For several years the benchmarks game website featured this on the Help page -
What does "not fair" mean? (A fable)
They raced up, and down, and around and around and around, and forwards and backwards and sideways and upside-down.
Cheetah's friends said "it's not fair" - everyone knows Cheetah is the fastest creature but the races are too long and Cheetah gets tired!
Falcon's friends said "it's not fair" - everyone knows Falcon is the fastest creature but Falcon doesn't walk very well, he soars across the sky!
Horse's friends said "it's not fair" - everyone knows Horse is the fastest creature but this is only a yearling, you must stop the races until a stallion takes part!
Man's friends said "it's not fair" - everyone knows that in the "real world" Man would use a motorbike, you must wait until Man has fueled and warmed up the engine!
Snail's friends said "it's not fair" - everyone knows that a creature should leave a slime trail, all those other creatures are cheating!
Dalmatian's tail was banging on the ground. Dalmatian panted and between breaths said "Look at that beautiful mountain, let's race to the top!"
At that time "it's not fair" comments were mostly special pleading intended to gain an advantage for programming language X to the disadvantage of programming language Y.
But the issues your question raises are a little different.
Firstly, look at the n-body programs on the benchmarks game website. Even though the programs are written in different languages there's very little difference in the way the programs are coded. So far no one has found an effective way to make use of quad-core for this small n-body problem - so there are no special multi-core programs. The programs do not use non-standard or complex libraries. The programs are completely implemented inside a single source file.
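To make that concrete, the heart of those n-body programs is just a double loop over pairs of bodies plus a position update. Here is a rough C sketch of that core step (illustrative only, not the benchmarks game source; the bodies and constants below are made up):

    #include <math.h>
    #include <stdio.h>

    typedef struct { double x, y, z, vx, vy, vz, mass; } body;

    /* One time step of the classic O(n^2) gravitational update. */
    static void advance(body b[], int n, double dt)
    {
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                double dx = b[i].x - b[j].x;
                double dy = b[i].y - b[j].y;
                double dz = b[i].z - b[j].z;
                double d2 = dx * dx + dy * dy + dz * dz;
                double mag = dt / (d2 * sqrt(d2));
                b[i].vx -= dx * b[j].mass * mag;
                b[i].vy -= dy * b[j].mass * mag;
                b[i].vz -= dz * b[j].mass * mag;
                b[j].vx += dx * b[i].mass * mag;
                b[j].vy += dy * b[i].mass * mag;
                b[j].vz += dz * b[i].mass * mag;
            }
        }
        for (int i = 0; i < n; i++) {
            b[i].x += dt * b[i].vx;
            b[i].y += dt * b[i].vy;
            b[i].z += dt * b[i].vz;
        }
    }

    int main(void)
    {
        /* Two illustrative bodies; the real benchmark initializes several planets. */
        body b[2] = {
            { 0, 0, 0, 0, 0, 0, 1.0 },
            { 1, 0, 0, 0, 1, 0, 1e-3 },
        };
        for (int step = 0; step < 1000; step++)
            advance(b, 2, 0.01);
        printf("%f %f %f\n", b[1].x, b[1].y, b[1].z);
        return 0;
    }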
I said there's very little difference in the way the n-body programs are coded but does that really mean the programs are the same? Soon after the project had been revived, 6 or 7 years ago, I remember an Ada programmer half-joked about comparing apples to oranges because the assembly language from the Ada programs wasn't the same as the assembly language from the C programs - so obviously like wasn't being compared to like :-)
otoh the Ada source code would have to be written in a different way than the C source code was written, to make the Ada compiler produce the same assembly language as the C compiler produced.
otoh if the assembly language produced by both compilers really was line-by-line the same, why would there be a performance difference?
When there's very little difference in the way the programs are coded then at first glance the comparison appears to be fair, but forcing different languages to be coded like language X may favour language X.
As Yannick Versley noted, the point of using a different language is for the different approaches that language provides. In other words, there's more than one way to do the same thing.
Look at the mandelbrot programs on the benchmarks game website - the simplest C program is half the size of the fastest C program; the simplest C program is sequential and uses doubles, the fastest C program uses all 4 cores through OMP and GCC intrinsics.
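For a sense of scale, the "simplest" kind of mandelbrot program is little more than the loop below (a C sketch in that spirit, not an actual benchmarks game entry - the real programs write a binary bitmap, and the fast ones layer OpenMP and SIMD intrinsics on top of the same inner loop):

    #include <stdio.h>

    /* Plain-text PBM bitmap of the Mandelbrot set: print 1 for points
     * that have not escaped after max_iter iterations, 0 otherwise. */
    int main(void)
    {
        const int n = 200;        /* illustrative size */
        const int max_iter = 50;

        printf("P1\n%d %d\n", n, n);
        for (int y = 0; y < n; y++) {
            for (int x = 0; x < n; x++) {
                double cr = 2.0 * x / n - 1.5;
                double ci = 2.0 * y / n - 1.0;
                double zr = 0.0, zi = 0.0;
                int i;
                for (i = 0; i < max_iter; i++) {
                    double zr2 = zr * zr, zi2 = zi * zi;
                    if (zr2 + zi2 > 4.0)
                        break;
                    zi = 2.0 * zr * zi + ci;   /* z = z*z + c, imaginary part */
                    zr = zr2 - zi2 + cr;       /* z = z*z + c, real part */
                }
                printf("%d ", i == max_iter ? 1 : 0);
            }
            printf("\n");
        }
        return 0;
    }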
Other languages take different approaches to use all 4 cores - does that mean we should only compare sequential programs and ignore the reality of multi-core computing?
Other language implementations may not provide an equivalent to GCC intrinsics - does that mean we should only compare programs that use doubles? But other language implementations take different approaches in the way they represent doubles - does that mean we should ignore all floating point programs?
The problem is that programming languages (and programming language implementations) are more different than apples to oranges, but we still ask - Will my program be faster if I write it in language X? - and still wish for a simpler answer than - It depends how you write it!
The different tasks and different programs on the benchmarks game website show that some of the performance comparison answers are confusing and complicated - the details matter, a lot.

Benchmarking is not entirely about being fair - it's about choosing something for your own workload, within your constraints.
If you want to use the alioth shootout site, you can still get interesting information if you exclude solutions that are too verbose or too slow (the exact balance depends on what you want to do - do you write code that runs for five seconds, or code that will occupy a dozen computers for five months?). Look at the most concise examples for one particular problem to see the general problem structure - then see what typical optimizations people applied to make the code run faster.
Having a benchmark with THE SAME code misses the point, because different languages need different things to do well. Java has GC, which means it will do well on the binary-trees test, whereas you need custom memory allocation in C/C++ to compete with that (and that particular benchmark is structured so that standard malloc does really poorly); for spectral-norm you need non-boxed double arrays...
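To make the binary-trees point concrete, here is the kind of thing a competitive C entry does instead of calling malloc once per node - a minimal pool-allocation sketch (purely illustrative, not an actual benchmarks game program):

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct node { struct node *left, *right; } node;

    /* Carve nodes out of one big block instead of malloc'ing each one;
     * a GC'd language gets roughly this allocation pattern for free. */
    typedef struct { node *slab; size_t used; } pool;

    static node *pool_alloc(pool *p)
    {
        return &p->slab[p->used++];
    }

    static node *build(pool *p, int depth)
    {
        node *n = pool_alloc(p);
        if (depth > 0) {
            n->left  = build(p, depth - 1);
            n->right = build(p, depth - 1);
        } else {
            n->left = n->right = NULL;
        }
        return n;
    }

    static int check(const node *n)            /* count nodes by walking the tree */
    {
        return n->left ? 1 + check(n->left) + check(n->right) : 1;
    }

    int main(void)
    {
        int depth = 18;                                    /* illustrative */
        size_t nodes = ((size_t)1 << (depth + 1)) - 1;
        pool p = { malloc(nodes * sizeof(node)), 0 };

        node *root = build(&p, depth);
        printf("depth %d: %d nodes\n", depth, check(root));
        free(p.slab);                          /* freeing the whole tree is one call */
        return 0;
    }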
If you want to come up with your own solutions, have a go at Project Euler - there are a lot of problems that do not depend on complex libraries, yet are challenging to optimize. Otherwise, try to come up with scoring criteria that you consider adequate to filter or rank the existing contributions in the shootout (or outside it - for example, there are ShedSkin and Cython solutions to some of the problems, which are "unofficial" because these languages are not included).

Related

Haskell for mission-critical systems [duplicate]

I've been curious to understand whether it is possible to apply the power of Haskell to the embedded realtime world, and in Googling found the Atom package. I'd assume that in the complex case the code might have all the classical C bugs - crashes, memory corruptions, etc. - which would then need to be traced back to the original Haskell code that caused them. So, this is the first part of the question: "If you have experience with Atom, how did you deal with the task of debugging low-level bugs in the compiled C code and fixing them in the original Haskell code?"
I searched for some more examples of Atom; this blog post mentions the resulting C code being 22 KLOC (and obviously no code is included :)), and the included example is a toy. This and this reference have a bit more practical code, but this is where it ends. And the reason I put "sizable" in the subject is that I'm most interested in whether you might share your experiences of working with generated C code in the range of 300 KLOC+.
As I am a Haskell newbie, there may obviously be other ways that I did not find due to my unknown unknowns, so any other pointers for self-education in this area would be greatly appreciated - and this is the second part of the question: "what would be some other practical methods (if any) of doing real-time development in Haskell?". If multicore is also in the picture, that's an extra plus :-)
(About the usage of Haskell itself for this purpose: from what I read in this blog post, the garbage collection and laziness in Haskell make it rather nondeterministic scheduling-wise, but maybe in two years something has changed. The Real World Haskell programming question on SO was the closest I could find to this topic.)
Note: "real-time" above would be closer to "hard realtime" - I'm curious whether it is possible to ensure that the pause time, when the main task is not executing, is under 0.5 ms.
At Galois we use Haskell for two things:
Soft real time (OS device layers, networking), where 1-5 ms response times are plausible. GHC generates fast code, and has plenty of support for tuning the garbage collector and scheduler to get the right timings.
For true real-time systems, EDSLs are used to generate code for other languages that provide stronger timing guarantees, e.g. Cryptol, Atom and Copilot.
So be careful to distinguish the EDSL (Copilot or Atom) from the host language (Haskell).
Some examples of critical systems, and in some cases real-time systems, either written in or generated from Haskell, produced by Galois:
EDSLs
Copilot: A Hard Real-Time Runtime Monitor -- a DSL for real-time avionics monitoring
Equivalence and Safety Checking in Cryptol -- a DSL for cryptographic components of critical systems
Systems
HaLVM -- a lightweight microkernel for embedded and mobile applications
TSE -- a cross-domain (security level) network appliance
It will be a long time before there is a Haskell system that fits in small memory and can guarantee sub-millisecond pause times. The community of Haskell implementors just doesn't seem to be interested in this kind of target.
There is healthy interest in using Haskell or something Haskell-like to compile down to something very efficient; for example, Bluespec compiles to hardware.
I don't think it will meet your needs, but if you're interested in functional programming and embedded systems you should learn about Erlang.
Andrew,
Yes, it can be tricky to debug problems through the generated code back to the original source. One thing Atom provides is a means to probe internal expressions, then leaves it up to the user how to handle these probes. For vehicle testing, we build a transmitter (in Atom) and stream the probes out over a CAN bus. We can then capture this data, format it, and view it with tools like GTKWave, either in post-processing or in real time. For software simulation, probes are handled differently. Instead of getting probe data from a CAN protocol, hooks are made into the C code to lift the probe values directly. The probe values are then used in the unit testing framework (distributed with Atom) to determine whether a test passes or fails and to calculate simulation coverage.
I don't think Haskell, or other garbage-collected languages, are very well suited to hard-realtime systems, as GCs tend to amortize their runtimes into short pauses.
Writing in Atom is not exactly programming in Haskell, as Haskell here can be seen as purely a preprocessor for the actual program you are writing.
I think Haskell is an awesome preprocessor, and using DSELs like Atom is probably a great way to create sizable hard-realtime systems, but I don't know if Atom fits the bill or not. If it doesn't, I'm pretty sure it is possible (and I encourage anyone who does!) to implement a DSEL that does.
Having a very strong pre-processor like Haskell for a low-level language opens up a huge window of opportunity to implement abstractions through code-generation that are much more clumsy when implemented as C code text generators.
I've been fooling around with Atom. It is pretty cool, but I think it is best for small systems. Yes, it runs in trucks and buses and implements real-world, critical applications, but that doesn't mean those applications are necessarily large or complex. It really is for hard-real-time apps and goes to great lengths to make every operation take the exact same amount of time. For example, instead of an if/else statement that conditionally executes one of two code branches that might differ in running time, it has a "mux" statement that always executes both branches before conditionally selecting one of the two computed values (so the total execution time is the same whichever value is selected).
It doesn't have any significant type system other than built-in types (comparable to C's) that are enforced through GADT values passed through the Atom monad. The author is working on a static verification tool that analyzes the output C code, which is pretty cool (it uses an SMT solver), but I think Atom would benefit from more source-level features and checks.
Even in my toy-sized app (an LED flashlight controller), I've made a number of newbie errors that someone more experienced with the package might avoid, but that resulted in buggy output code that I'd rather have had caught by the compiler instead of through testing. On the other hand, it's still at version 0.1.something, so improvements are undoubtedly coming.
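To illustrate the "mux" idea outside Atom's own syntax, here is roughly the shape of it in plain C (a hand-written sketch, not Atom output; the names are made up):

    #include <stdint.h>
    #include <stdio.h>

    /* Branch-free select: both candidate values are computed every time,
     * then one is picked, so the time taken does not depend on the condition
     * the way an if/else with differently sized arms would. */
    static int32_t mux(int cond, int32_t if_true, int32_t if_false)
    {
        return cond ? if_true : if_false;   /* often compiles to a conditional move */
    }

    int main(void)
    {
        int32_t duty = 40;
        int overheated = 1;

        int32_t reduced = duty / 2;                /* both branches evaluated... */
        int32_t normal  = duty;
        duty = mux(overheated, reduced, normal);   /* ...then one result selected */

        printf("duty = %d\n", duty);
        return 0;
    }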

Does this language have its niche | future?

I am working on a new language, targeted at web development, embedding into applications, distributed applications, and high-reliability software (but that is for the distant future).
It also aims to reduce development expenses in the long term - more time to write safer code and less support later. And finally, it enforces many things that real teams have to enforce - like one cross-platform IDE, one code style, one web framework.
In short, the key syntax/language features are:
Open source, non-restrictive licensing. Cross-platform, of course.
Tastes like C++ but simpler: Pythonic syntax with strict & static type checking. Easier to learn, no multiple inheritance and other things which nobody knows anyway :-)
LLVM bytecode/compilation backend gives near-C speed.
It has both garbage collection & explicit object destruction.
Real OS threads, native support of multicore computers. Multithreading is part of language, not a library.
Types have the same width on any platform: int (32), long (64), etc.
Built-in pre- and postconditions, asserts, tiny unit tests. You write a method and can write all these things in one place, so related things stay together. If you worry that your class source code will be bloated with this, it's the IDE's job to hide what you don't need right now.
Java-like exception handling (i.e. you have to handle all exceptions)
I guess I'll leave web & cluster features for now...
What do you think? Are there any existing similar languages which I missed?
To summarize: your language has no real selling points. It just does what a dozen other languages already do, with syntax and semantics just slightly off, depending on where the programmer comes from. This may be a good thing, as it makes the language easier to pick up, but you also have to convince people to go to the trouble of switching. All this stuff has to be built and debugged and documented again, tools have to be programmed, people have to learn it and convince their pointy-haired bosses to use it, etc. "So it's language X with a few features from Y and nicer syntax? But it won't make my application's code 15% shorter and cleaner, it won't free me from boilerplate X, etc. - and it won't work with my IDE." The last one is important. Tools matter. If there are no good tools for a language, people will shy away from it, and rightfully so.
And finally, it enforces many things that real teams have to enforce - like one crossplatform IDE, one codestyle, one web framework.
Sounds like a downside! How does the language "enforce one X"? How do you convince programmers that this coding style is the one true style? Why shouldn't somebody go and replace the dog-slow, hardly maintained, severely limited IDE you "enforce" with something better? How could one web framework possibly fit all applications? Programmers rarely like to be forced into X, and they are sometimes right.
Also, your language will have to talk to others. So do you have ready-made standard solutions for multithreading and web development in mind? Maybe you should start with an FFI instead. Python can use extensions written in C or C++, use dynamic libraries through ctypes, and with Cython it's amazingly simple to wrap any given C library with a Python interface. Do you have any idea how many important libraries are written in C? Unless your language can use these, people can hardly get (real-world) stuff done with it. Just think of GUIs. Most major GUI toolkits are C or C++. And Java has hundreds of libraries (the other JVM languages profit much from Java interop) for many, many purposes.
Finally, on performance: LLVM can give you native code generation, which is a huge plus (performance-wise, but also because the result is standalone), but the LLVM optimizers are limited, too. Don't expect it to beat C. Especially not hand-tuned C compiled via icc on Intel CPUs ;)
"Are there any existing similar
languages which I missed?"
D? Compared to your features:
The compiler has a dual license - GPL and Artistic
See example code here.
LDC targets LLVM. Support for D version 2 is under development.
Built-in garbage collection or explicit memory management.
core.thread
Types
Unit tests / Pre and Post Contracts
try/catch/finally exception handling plus scope guarantees
Responding to a few of your points individually (I've omitted what I consider either unimportant or good):
targeted for web development
Most people use php. Not because it's the best language available, that's for sure.
embeding into applications
Lua.
distributed applications, high-reliability software (but this is for distant future).
Have you carefully studied Erlang, both its design and its reference implementation?
it enforces many things that real teams have to enforce - like one crossplatform IDE, one codestyle, one web framework.
If your language becomes successful, people will make other IDEs, other code styles, other web frameworks.
Multithreading is part of language, not a library.
Really good languages for multithreading forbid side effects inside threads. Yes, in practice that pretty much means Erlang only.
Types have the same width on any platform. int(32), long(64) e.t.c
Sigh... There's only one reasonable width for integers outside of machine-level languages like C: infinite.
Designing your own language will undoubtedly teach you something. But designing a good language is like designing a good cryptosystem: lots of amateurs try, but it takes an expert to do it well.
I suggest you read some of Norman Ramsey's answers here on programming language design, starting with this thread.
Given your interest in distributed applications, knowing Erlang is a must. As for sequential programming, the minimum is one imperative language and one functional language (ideally both Lisp/Scheme and Haskell, but F# is a good start). I also recommend knowing at least one high-level language that doesn't have objects, just so you understand that not having objects can often make the programmer's life easier (because objects are complex).
As for what could drive other people to learn your language... Good tools/libraries/frameworks can't hurt (FORTRAN, php), and a big company setting the example can't hurt (Java, C#). Good design doesn't seem to be much of a factor (a ha-ha-only-serious joke has it that what makes a language successful is using {braces} to delimit blocks: C, C++, Java, C#, php)...
What you've given us is a list of features, with no coherent philosophy or explanation as to how they will work together. None of the features are unique. At best, you're offering incremental improvements over what's already there. I'd expect there are already languages kicking around with what you've described; it's just that they're still fairly obscure, because they didn't make it.
Languages have inertia. People have to learn new languages, and sometimes new tools. They need incentive to do so, and 20% improvement in a few features doesn't cut it.
What you need, at a minimum, is a killer app and a form of elevator pitch. (The "elevator pitch" is what you tell the higher-ups about your project when you're in the elevator with them, in current US business parlance.) You need to have your language be obviously worth learning for some purpose, and you need to be able to tell people why it's worth learning before they think "just another language by somebody who wanted to write a language" and go away.
You need to form a language community. That community needs to have some localization at first: people who work in X big company, people who want to do Y, whatever. Decide on what that community is likely to be, and come up with one big reason to switch and some reasons to believe that your language can deliver what it promises.
No.
Every buzzword you have included in your feature list is an enormous amount of work to be spec'd, implemented, documented, and tested.
How many people will be actively developing the language? I guess the web is full of failed programming language projects. (Same is true for non-mainstream OSes)
Have a look at what .NET/Visual Studio or Java/Eclipse have accomplished. That's thousands of person-years of specification, development, testing, documentation, feedback, bug fixes, and service packs.
During my last job I heard about somebody who wrote his own programming framework, because it was "better". The resulting program code (both in the framework and in the applications) is certainly unmaintainable once the original programmer quits, or is "hit by a bus", as the saying goes.
As the list sounds like Java++ or Mono++, you'd probably be more successful in engaging in an existing project, even if it won't have your name tag on it.
Perhaps you missed one key term. Performance.
In any case, unless this new language has some really out-of-this-world features (e.g. a 100% increase in performance over other web development languages), I think it will be yet another fish in the pond.
Currently I'm responsible for maintaining a framework developed/owned by my company. It's a nightmare. Unless there is a mainstream community working on such a thing full time, it's really an elephant. I do not appreciate my company's decision to develop its own framework (because it's supposed to be "faster") day and night.
The language tastes good, in my opinion. I don't want to use Java for a simple website, but I would like to have types and things like that. ASP.NET is a problem because of licensing; I can't afford those licenses for a single website... The features look good too.
Remember to allow a lot of operator overloading: I think that is the biggest thing PHP is actually missing. It allows classes to behave much more like basic types :)
When you have something to test I'd love to help you with it! Thanks
Well, if you have to reinvent the wheel, you can go for it :)
I am not going to give you any examples of languages or language features, but I will give you one advice instead:
The supporting framework is the most important thing. People will tend to love it or hate it, depending on how easy it is to write good code that gets the job done. Therefore, please do usability tests before releasing it. I mean: ask several people how they would do a certain task and design the API accordingly. Then test the beta API on other coders and listen carefully to their comments.
Regards and good luck :)
There's always space for another programming language. Apart from getting the design right, I think the biggest problem is coming across as just another wannabe language. So you may want to look at your marketing: you need a big sponsor who can integrate your language into their products, or you need to generate a buzz around it; the easiest way is astroturfing. Good luck.
http://en.wikipedia.org/wiki/List_of_programming_languages
NB the names G and G++ aren't taken. Oh and watch out for the patent trolls.
Edit
Oops G / G++ are taken... still there are plenty more letters left.
This sounds more like a "systems" language rather than a "web development language". The major languages in this category (other than C++/C) are D and Go.
My advice to you would be to not start from scratch but examine the possibility of creating tools or libraries for those languages, and seeing just how far you can push them.

What does "powerful" mean, when discussing programming languages?

In the context of programming language discussion/comparison, what does the term "power" mean?
Does it have a well defined meaning? Even a poorly defined meaning?
Say if someone says "language X is more powerful than language Y" or asks the same as a question, what do they mean - or what information are they trying to find out?
It does not have a well-defined meaning. In these types of discussions, "language X is more powerful than language Y" usually means little more than "I like language X more than language Y." On the other end of the spectrum, you'll also usually have someone chime in about how any Turing-complete language can accomplish the same tasks as any other Turing-complete language, so that neither is strictly more powerful than the other.
I think a good meaning for it is expressivity. When a language is highly expressive, it means less code is required to express concepts. To me, this doesn't just mean that you have to write less code to accomplish the same tasks, but also that the code is easily readable by humans. Of course, generally (to a point), having fewer lines of code to read makes the task of reading and understanding easier for humans.
Having a "powerful" standard library comes into play here along the same lines. If a language comes equipped with thorough, complete libraries, then idiomatic code in that language will be able to benefit from the existing library code and not have to repeat or reinvent common functionality in application code. The end result is, again, having to write and read less code to accomplish the same tasks.
I keep saying "generally" and "to a point", because once a language gets too terse, it gets more difficult for humans to decipher. I suppose at this extreme, a language may still be considered "more powerful" (or even "too powerful"). So I guess I'm saying my personal interpretation of "powerful" includes some aspects of "useful" and "readable" in it as well.
C is powerful, because it is low level and gives you access to hardware. Python is powerful because you can prototype quickly. Lisp is powerful because its REPL gives you fantastic debugging opportunities. SQL is powerful because you say what you want and the DBMS will figure out the best way to do it for you. Haskell is powerful because each function can be tested in isolation. C++ is powerful because it has ten times the number of syntactic constructs that any one person ever needs or uses. APL is powerful since it can squeeze a ten-screen program into ten characters. Hell, COBOL is powerful because... why else would all the banks be using it? :)
"Powerful" has no real technical meaning, but lots of people have made proposals.
A couple of the more interesting ones:
Paul Graham wants to call a language "more powerful" if you can write the same programs in fewer lines of code (or some other sane, sensible measure of program size).
Matthias Felleisen has written a very serious theoretical study called On the Expressive Power of Programming Languages.
As someone who knows and uses many programming languages, I believe that there are real differences between languages, and that "power" can be a convenient shorthand to describe ways in which one language might be better than another. Nevertheless, whenever I hear a discussion or claim that one language is more powerful than another, I tend to keep one hand firmly on my wallet.
The only meaningful way to describe "power" in a programming language is "can do what I require with the least amount of resources" where "resources" is defined as "whatever costs I'd rather not pay" and could, thus, be development time, CPU time, memory space, money, etc.
So basically the definition of "power" is purely subjective and rendered meaningless in any objective discussion.
Powerful means "high in power". "Power" is something that increases your ability to do things. "Things" vary in shape, size and other things. Loosely speaking, therefore, "powerful" when applied to a programming language means that it helps you perform your tasks quickly and efficiently.
This makes "powerful" somewhat well defined but not constant across domains. A language powerful in one domain might be crippling in another. For example, C is very powerful if you want to do systems-level programming, since it gives you direct access to the machine and hardware, and structures that let you code much faster than you would in assembly. C compilers also produce tight code that runs fast. However, once you move to web applications, C can become very "unpowerful" and crippling, since it's so much effort to get something up and running and you have to worry about a lot of extraneous details like memory, etc.
Sometimes, languages are "powerful" in multiple domains. This gives them a general "powerful" tag (or badge, since we are on SO here). PG's claim is that with LISP, this is the case. That might be true or might not be.
At the end of the day, "powerful" is a loaded word, so you should evaluate who is saying it, why he's saying it, and what it means to your work.
There are really only two meanings people are worried about:
"Powerful" in the sense of "takes less resources (time, money, programmers, LOC, etc.) to achieve the same/better result", and "powerful" in the sense of "is capable of doing a wide range of tasks".
Some languages are extremely resource-effective for a small range of tasks. Others are not so resource-effective but can be applied to a wide range of tasks (e.g. C, which is often used in OS development, creation of compilers and runtime libraries, and work with microcontrollers).
Which of these two meanings someone has in mind when they use the term "powerful" depends on the context (and even then is not always clear). Indeed often it is a bit of both.
Typically there are two distinct meanings:
Expressive, meaning the code tends to be very short and understandable
Low level, meaning you have very fine-grained control over the hardware.
For most languages, these two definitions are at opposite ends of the spectrum: Python is very expressive but not very low level; C is very low level but not very expressive. Depending on which definition you pick, either language is powerful or not powerful.
Nothing. Absolutely nothing.
To high-level programmers it might mean a lot of built-in datatypes, or maybe abstractions to easily create or follow design patterns.
Paul Graham is a very high-level guy; here is what he has to say:
http://www.paulgraham.com/avg.html
Java guys might tell you something about portability, the power to reach every platform.
C/UNIX programmers may tell you that it's speed and efficiency, complete control over every inch of memory.
VHDL/Verilog programmers will tell you it's complete control over every clock and gate, so as not to waste any electricity or time.
But in my opinion a "powerful language" supports all of the features you need to complete your task. Documentation may be important, or perhaps it is portability, or the ability to do graphics. It could be anything: writing a GUI in assembly is just stupid, and so is trying to design an embedded processor in Flash.
Choosing a language that suits your needs perfectly will always feel like power.
I view the term as marketing fluff, with no one well-defined meaning.
Consider, say, assembler, C, and C++. On occasion one drops from C++ "down" to C for particular needs, and in turn from C down to assembler. So that makes assembler the most powerful, because it's the only language that can do everything. Or, to argue the other way, a single line of C++ can replace several lines of C (hiding polymorphic dispatch via function pointers, for example) and a single line of C replaces many lines of assembler. So C++ is more powerful because one line does "more".
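To spell out the function-pointer example, here is the usual C idiom that a single C++ virtual call replaces (a rough sketch, not taken from any particular codebase):

    #include <stdio.h>

    /* A hand-rolled "vtable": in C++ this whole mechanism is one line,
     * a virtual member function called as shape->area(). */
    typedef struct shape {
        double (*area)(const struct shape *self);
    } shape;

    typedef struct { shape base; double r; } circle;
    typedef struct { shape base; double w, h; } rect;

    static double circle_area(const shape *s)
    {
        const circle *c = (const circle *)s;   /* shape is the first member */
        return 3.141592653589793 * c->r * c->r;
    }

    static double rect_area(const shape *s)
    {
        const rect *r = (const rect *)s;
        return r->w * r->h;
    }

    int main(void)
    {
        circle c = { { circle_area }, 2.0 };
        rect   r = { { rect_area }, 3.0, 4.0 };
        const shape *shapes[] = { &c.base, &r.base };

        for (int i = 0; i < 2; i++)
            printf("area = %f\n", shapes[i]->area(shapes[i]));  /* dispatch via pointer */
        return 0;
    }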
I think the term had some currency when products such as early databases and spreadsheets had in-built languages, some quite restricted. So vendors would tout their language as being "powerful" because it was less restricted.
It can have several meanings. In the very basic sense there's power as far as what is computable. In that sense the most powerful languages are Turing Complete which includes pretty much every general purpose programming language (as opposed to most markup languages and domain specific languages which are often not Turing complete).
In a more pragmatic sense it often refers to how concisely (and readably) you can do certain things. Basically how easy is it to do certain tasks in one language compared to another.
Which language is more powerful (besides being somewhat subjective) depends heavily on what you're trying to do. If your requirement is to get something running on a small device with 64 KB of memory, you're likely not going to be using Java. Most likely the right language would be C or C++ (or, if you're really hardcore, assembly). If you need a very simple CRUD app done in one day, maybe something like Ruby on Rails would be the way to go (I know Rails is a framework and Ruby is the language, but these days the available libraries and frameworks factor greatly into picking a language).
I think that, perhaps coincidentally, the physics definition of power is relevant here: "The rate at which work is performed."
Of course, a toaster does not perform very quickly the work of putting out fires. Similarly, the power of a programming language is not universal, but specific to the domain or task to which it is being applied. C is a powerful language for writing device drivers or implementations of higher-level languages; Python is a powerful language for writing general-purpose applications; XPath is a powerful language for writing queries on structured data sets.
So given a problem domain, the power of a language can be said to be the rate at which a competent programmer is able to use it to solve problems in that domain.
A precise answer can be attempted, but only by not assuming that the elements that define "powerful" (in the context of languages) come from so many dimensions.
See how many there could be, and a lot will still be missing:
runtime speed
code size
expressiveness
supported paradigms
development / debugging time
domain specialization
standard libs
codebase
toolchain ecosystem
portability
community
support / documentation
popularity
(add more here)
These and more parameters together draw a picture of what "programming in some language" would be like, at a given level. That will only be a definition, though; the real knowledge comes with the actual practice of using the language, but I digress.
The question comes down to which parameter best represents the intrinsic quality of a language. If you refer to a language in itself, its ultimate, intrinsic purpose is to "express things", and thus the most representative parameter is rightfully expressiveness; it is also the one that resonates most frequently when someone talks about how powerful a language is.
The moment you try to widen the question/answer to cover more than the expressiveness of the language "as a language, as a tongue", you are really talking about different kinds of "environment": social environment, development environment, commercial environment, etc.
Depending on the complexity of the environment to be defined, you'll have to mix in more parameters that come from multiple, vast, overlapping and sometimes contradictory dimensions, and eventually the point of reaching a definition will be lost or the question will have to be narrowed.
This approximation still won't answer "what is an expressive language", but, again, a common understanding is given by the definitions that Vineet points out well in his answer, and that Forest remarks on in the comments. I agree; for me, "expression" is "conveying meaning".
I remember many instructors in college calling whatever language they were teaching "powerful".
Leads me to think:
Powerful = a relative term comparing the latest way to code something vs. the original or previous way.
I find it useless to use the word "powerful" in regards to discussing anything software related. Every time my professor in college would introduce a new concept such as polymorphism he would say "so this is a really powerful feature". After a while I got annoyed. If everything is powerful then nothing is. It's all the same. You can write code to do anything. Does it really matter how much code is required to do it? You can say it's short or efficient, but "powerful" is just useless. Nuclear energy is powerful. Code is words.
I think that power would normally refer to how quickly a language can process data. For example, I found that in Python, as soon as a list exceeds a length of approx. 2000 it becomes unbearably slow, whereas in C++ a list can easily contain 20,000 entries without doing so.

Trivial mathematical problems as language benchmarks

Why do people insist on using trivial mathematical problems like finding numbers in the Fibonacci sequence for language benchmarks? Don't these usually get optimized to relativistic speeds? Isn't the brunt of the bottlenecks usually in I/O, system API calls, operations on strings and structures, processing large quantities of data, abstract object-oriented stuff, etc?
It is a throwback to the old days, when compiler technology for what we would now call basic math was still evolving rapidly.
Now, compiler evolution is more focused on exploiting new instructions for niche operations, 64-bit math, and so on.
Micro-benchmarks such as the ones you mention were useful, though, when evaluating the efficiency of the HotSpot compiler when Java was first launched, and in evaluating the efficiency of .NET versus C/C++.
Your suggestion that I/O and system calls are the likely bottlenecks is correct, at least for some space of problems. But I notice you suggested string operations. One person's irrelevant micro-benchmark is another person's critical performance metric.
EDIT: ps, I also remember using linpack and other micro-benchmarks to compare versions of the JVM, and to compare vendors of the JVM. From v4 to v5 there was a big jump in perf, I guess the JIT compiler got more effective. Also, IBM's JVM was ahead of Sun's at that time, on Windows-x86.
Because if you want to benchmark the language/compiler, these "math problems" are good indicators of the "bare speed" of the generated code. Either they use the iterative solution, which is a tight loop and indicates how well the compiler can push instructions to the processor, or they use the recursive solution, which indicates how it handles recursive calls of short functions (inlining, tail recursion, etc.) (although the Ackermann function is usually used for that too).
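For example, the two usual shapes of a Fibonacci micro-benchmark look like this in C (illustrative only):

    #include <stdio.h>

    /* Iterative version: a tight loop, mostly measuring how well the
     * compiler schedules a trivial dependency chain. */
    static unsigned long fib_iter(int n)
    {
        unsigned long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            unsigned long t = a + b;
            a = b;
            b = t;
        }
        return a;
    }

    /* Naive recursive version: mostly measuring call overhead, inlining,
     * and how cheaply the implementation handles short functions. */
    static unsigned long fib_rec(int n)
    {
        return n < 2 ? (unsigned long)n : fib_rec(n - 1) + fib_rec(n - 2);
    }

    int main(void)
    {
        printf("%lu %lu\n", fib_iter(40), fib_rec(30));
        return 0;
    }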
Usually, the benchmark suite for a language contains tests benchmarking other parts as well - e.g. gzip compression, text searching, object creation, virtual function calls, exception throw/catch benchmarks.
The other things you've noticed, syscalls and I/O, are usually not included because:
syscalls are in fact not that slow - applications don't spend a significant portion of their time in the kernel, except for tests specifically targeted at them or when something is seriously wrong with the program
syscall and I/O performance does not depend on the language, but rather on the OS & hardware
I'd think a simple, well-established algorithm would remove the possibility that the benchmark is biased (whether through ignorance or malice) to favor one language. It is very difficult to write a complex program in two different languages exactly the same way. Testing something like the efficiency of a multithreaded application in C# vs Java, for example, would require developers skilled in multithreaded development in both languages, and there would still be questions as to whether the benchmark app properly represents the general case, or if it misrepresents a special case that only one language handles well.
Back when the sieve of Eratosthenes was a popular benchmark for C compilers, I thought it would be funny if one of the compiler authors recognized the sieve code and replaced it with a pre-computed lookup table.
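For reference, the sieve itself is only a handful of lines, which is exactly why recognizing it and substituting a precomputed table would have been feasible (a generic C version, not any particular vendor's benchmark code):

    #include <stdio.h>

    #define LIMIT 10000

    int main(void)
    {
        static char composite[LIMIT + 1];   /* zero-initialized: nothing crossed out yet */
        int count = 0;

        for (int i = 2; i <= LIMIT; i++) {
            if (!composite[i]) {
                count++;                                  /* i is prime */
                for (int j = 2 * i; j <= LIMIT; j += i)
                    composite[j] = 1;                     /* cross out its multiples */
            }
        }
        printf("%d primes up to %d\n", count, LIMIT);
        return 0;
    }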

What fast low-level languages can you recommend?

I have become interested in C-like languages for performance computing. Can you recommend some alternative programming languages which have the following attributes:
must be close to the hardware (bit fiddling, pointers or some alternative safe method like references)
no managed code (no jvm/.net languages)
has to be really fast (like C)
must be above ASM level (and yes I am interested in macro languages on top of ASM)
can be obscure, not very widespread
I am mainly interested in little-known languages.
How about Assembly language, or the D programming language?
If you don't know about it and are interested just in broadening your horizons, take a look at Forth. Reading about Forth always makes me feel C is high-level.
Well, I've always preferred C and/or C++ because there are multiple flavours (MSVC, glibc, etc.), it runs on many different platforms (e.g. mobile devices, Windows, Linux) and devices, it can be written cross-platform (different processor architectures), and it even works for high-end graphics (e.g. DirectX).
You get "decent" access to platform resources (conditions vary), it can be as fast as you choose to hone it, and it's a tad easier (IMHO) to write than ASM. There's also a pretty decent range of support tools and code analysis tools to make things a little easier.
Also C and C++ have been around for quite some time, so it's got (even today) an excellent and enthusiastic community!
You don't explicitly state that it can't be C in your question, so I'll go ahead and recommend C. It fulfills your bulleted requirements, and you won't have to worry about different versions of the language (like you would with each different kind of assembler).
Forth!
Forth can be faster than machine language on some architectures. The compiled code is extremely dense and therefore makes optimal use of the code cache.
Assembly would be the closest to the hardware and therefore the fastest.
Ada was originally designed for embedded systems (among other things).
OpenCL might be interesting. It's sort of like OpenGL shader language (a subset of C with extensions), but for general purpose parallel array computing.
You could start programming FPGAs in VHDL, Verilog, System C ...
Variations on a theme
FORTRAN is older than C, and is still one of the major players in numerical computing. Until 1990 (when the language was substantially modernized), the language didn't have any form of pointer (checked or not). This lack meant that there was no way to manage memory dynamically; it also made aliasing analysis easy for the compiler, which is one of the things that makes Fortran code fast.
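The aliasing point is worth spelling out: in C the programmer has to promise what Fortran's argument rules give the compiler by default. A small sketch using C99's restrict (the function and data here are made up for illustration):

    #include <stdio.h>

    /* Without restrict, the compiler must assume x and y might overlap,
     * which blocks vectorization and forces reloads inside the loop.
     * With restrict, the arrays are promised to be distinct - roughly the
     * guarantee Fortran gives for its procedure arguments by default. */
    static void axpy(int n, double a, const double *restrict x, double *restrict y)
    {
        for (int i = 0; i < n; i++)
            y[i] += a * x[i];
    }

    int main(void)
    {
        double x[4] = { 1, 2, 3, 4 };
        double y[4] = { 0, 0, 0, 0 };
        axpy(4, 2.0, x, y);
        printf("%g %g %g %g\n", y[0], y[1], y[2], y[3]);
        return 0;
    }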
ALGOL was the first structured programming language. Although it had limited success with programmers, it had a strong influence on language designers.
Ada is an imperative language with a strong type system and good modularity, which makes it good for low-level programming with strong assurance requirements (it was sponsored by the US government with military and avionics applications in mind). It was inspired by Pascal, like Modula-2 and Modula-3.
Going further from the mainstream of low-level imperative programming, there is FORTH. FORTH can be compiled for, and even interpreted on, devices with very little memory; it finds a lot of use on low-end embedded systems, including microcontrollers. The language is based on reverse polish notation, made famous by HP calculators (in fact, the language of HP calculators is strongly influenced by FORTH). Many implementations don't have variables: all data is kept on one or more stacks.
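To get a feel for the stack-based, RPN style, here is a toy RPN evaluator written in C rather than FORTH (purely illustrative; real FORTH compiles its words instead of scanning a string like this):

    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Evaluate a whitespace-separated RPN expression such as "2 3 + 4 *".
     * In FORTH the same computation is simply written  2 3 + 4 *  and the
     * data stack below is the language's central data structure. */
    static double eval_rpn(const char *p)
    {
        double stack[64];
        int top = 0;

        while (*p) {
            if (isspace((unsigned char)*p)) { p++; continue; }
            if (isdigit((unsigned char)*p)) {
                char *end;
                stack[top++] = strtod(p, &end);   /* push a number */
                p = end;
            } else {
                double b = stack[--top];          /* pop two operands... */
                double a = stack[--top];
                switch (*p++) {                   /* ...apply the operator, push the result */
                case '+': stack[top++] = a + b; break;
                case '-': stack[top++] = a - b; break;
                case '*': stack[top++] = a * b; break;
                case '/': stack[top++] = a / b; break;
                }
            }
        }
        return stack[0];
    }

    int main(void)
    {
        printf("%g\n", eval_rpn("2 3 + 4 *"));   /* prints 20 */
        return 0;
    }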
Just for fun, I'll mention INTERCAL, the granddaddy of esoteric languages.
Stuff that will blow your mind
Esoteric languages can be instructive, and quite a few work close to the machine (usually a virtual machine, but in principle you could implement them for an actual computer if you were crazy enough). You could look at brainfuck (a sort of intermediate stage between Turing machines and C), or the many single-instruction languages, or befunge (what if memory was a two-dimensional array?).
Cyclone looks a lot like C. The syntax is the same, and Cyclone has pointers, untagged structures and unions, goto statements and manual memory management. And yet it's a safe language: you can't have a dangling pointer, or a buffer overflow. And you have access to high-level features such as pattern matching, exceptions, polymorphism, abstract types and optional automatic memory management (not just garbage collection, but also regions). Cyclone is both useful and instructive; for a C die-hard, it can be a good way of discovering what makes a safe language. Cyclone can compile to C, so you can run your programs anywhere you have a C compiler.
Going in a different direction, if you want to be close to the hardware while still not actually designing hardware, have a look at synchronous languages, such as Lustre and Esterel. These languages are used to program high-assurance realtime systems such as nuclear plants, airplanes and railway signaling. These languages give up Turing completeness and gain the assurance that programmers can know exactly how fast their program will run and how much memory it will require. If you think C is close to the machine, finding out what a language that is really close to the machine looks like may come as a shock.
You can't get much closer than assembly language, unless you get a job with a chip-maker and start writing microcode!!!
If you're on Windows I think you can get hold of Microsoft MASM (macro assembler), which will let you get up and running quickly. I used it a long time ago and it's not a bad product.
It seems a bit awkward to answer my own question, but I have found two languages:
Pyrex
Vala
They may not fulfill all of the constraints, but they are great for performance computing and both translate to C.
