Can any language potentially be used to create any program? - programming-languages

I've heard that given a programmer with enough time and skill in any particular language and enough lines of code, then any program could be created with any given language. I know it's certainly not going to be cost-efficient, for instance, to rewrite Adobe Photoshop in BASIC, but could a good enough and patient enough programmer potentially create any program in any language?

If a language is Turing complete, then theoretically you can write any program in it. Even this has some limitations, though, such as user interfaces and OS APIs. For example, Brainfuck is Turing complete, but there's no way to build a GUI in it because you can't access video memory, and there's no threading support. Any purely computational task, however, is possible.
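To make the "any computational task" claim concrete, here is a hedged sketch of a minimal Brainfuck interpreter in Python (the fixed tape size and byte-oriented I/O are simplifying assumptions of this sketch):

    def run_bf(code, stdin=b""):
        tape, ptr, pc = [0] * 30000, 0, 0
        out, inp = [], list(stdin)
        jumps, stack = {}, []
        for i, c in enumerate(code):          # precompute matching brackets
            if c == '[':
                stack.append(i)
            elif c == ']':
                j = stack.pop()
                jumps[i], jumps[j] = j, i
        while pc < len(code):
            c = code[pc]
            if c == '>':   ptr += 1
            elif c == '<': ptr -= 1
            elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256
            elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
            elif c == '.': out.append(tape[ptr])
            elif c == ',': tape[ptr] = inp.pop(0) if inp else 0
            elif c == '[' and tape[ptr] == 0: pc = jumps[pc]   # skip loop
            elif c == ']' and tape[ptr] != 0: pc = jumps[pc]   # repeat loop
            pc += 1
        return bytes(out)

    # 8 * 8 + 1 = 65, the ASCII code for 'A':
    print(run_bf("++++++++[>++++++++<-]>+."))   # b'A'

Eight instructions suffice for arbitrary computation; what's missing is exactly the environment access (video memory, threads) the answer describes.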

This depends on exactly how you define "any program" and "any language".
Let's start with the first one: "any program". There are many programs (in fact, infinitely many) that cannot be written at all, regardless of the programming language. One of the most famous is the so-called Halting Problem: write a program H which, when given any program P and any input x, determines whether P(x) will eventually halt. Alan Turing proved many decades ago that it is impossible for such a program to exist. Ergo, you cannot write this program in any programming language.
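To make Turing's argument concrete, here is a hedged sketch of the standard diagonalization proof, as Python-flavoured pseudocode (halts and paradox are hypothetical, purely for illustration):

    # Suppose, for contradiction, that someone hands us a correct function
    #   halts(program, input) -> bool
    # that decides whether 'program' halts on 'input'. Then we could write:
    def paradox(program):
        if halts(program, program):   # would 'program' halt on its own source?
            while True:               # ...then loop forever
                pass
        else:
            return                    # ...otherwise halt immediately

    # Feeding paradox its own source yields a contradiction:
    #   if halts(paradox, paradox) is True,  paradox(paradox) loops forever;
    #   if halts(paradox, paradox) is False, paradox(paradox) halts.
    # Either way halts() answered wrongly, so no such program H can exist.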
Now, let's talk about "any language". There are actually different classes of languages. Some are more powerful than others. For example, regular expressions (which are a kind of programming language) cannot compute arbitrary functions. They are limited in their computational power. However, most general-purpose programming languages are what is generally called "Turing-complete".
Brief bit of history: in order to prove the undecidability of the Halting Problem, Alan Turing invented a hypothetical machine called the Turing Machine. A TM is basically a hypothetical computer with infinite memory that computes a particular function. It turns out that you can build a Universal Turing Machine which can emulate any other Turing Machine.
At about the same time, Alonzo Church invented the Lambda Calculus. The LC is also an abstract mathematical model of computation, but completely different. People started wondering: which of those two models is more powerful? Is there anything that a UTM can compute that LC cannot and vice versa? Can the LC solve the Halting Problem?
As it turns out, you can write an emulator for a UTM in LC and you can build a TM which interprets LC. This means that a TM can compute everything LC can compute (by simply running it in the interpreter) and LC can compute everything a UTM can compute (by simply running it in the emulator). So, we have
LC ≤ UTM ∧ UTM ≤ LC ⇒ LC = UTM
In English: LC and UTM are exactly equally powerful. In fact, it turns out that every model of computation and every machine and every programming language we have ever found is at most as powerful as LC and UTM, and indeed as every other such model. This leads to the so-called Church-Turing thesis, which states that all sufficiently powerful models of computation are equally powerful and that there can be no model of computation that is more powerful than a UTM or the LC. (There can be models of computation that are less powerful, such as regular expressions, total functions, or a language with only bounded loops.)
We call such models of computation Turing-complete. And, BTW, you don't need much to be Turing-complete.
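As one hedged illustration of how little is needed: the two combinators S and K alone are Turing-complete (this is the SKI combinator calculus), and in Python they fit in two lines:

    # Every computable function can be built from compositions of just these two:
    S = lambda x: lambda y: lambda z: x(z)(y(z))
    K = lambda x: lambda y: x

    # Example: the identity function arises as I = S K K,
    # since S(K)(K)(z) = K(z)(K(z)) = z.
    I = S(K)(K)
    print(I(42))   # 42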
So, with that out of the way, we can now define what we mean by "any program" and "any language":
If a program can be written in one Turing-complete language, then it can be written in any Turing-complete language.

It's all a question of time, isn't it?
The lack of suitable libraries and APIs for BASIC might make the Adobe Photoshop project take forever, and it might not run very smoothly when finished, but it is theoretically doable.

The language has to be Turing-complete, and it would also need a way to access the native OS for many different operations like files, sockets, etc.

Sure (sorta). There is a trade-off: performance. Also, some languages may not be able to access certain functions of the system, making them unable to do certain tasks on the machine. Some languages are just... weaker than others, and it would take a while.

You could also re-create Windows 95 by typing it in bit for bit in a hex editor, but what's the point?

You have to be careful with what you mean by "any program". For example, if you were asked to write a program that creates a text file on disk containing "Hello world", and you were asked to write it in pure Javascript, it would be impossible because pure Javascript has no facility to write anything to disk.
For a thorough discussion of this idea, you may wish to read about computability.

I guess if your language has the means to access all the necessary input and output, then yes.

If you added "... on a powerful enough computer, and given that the language has libraries that can handle whatever the program needs to do", then the answer would still be no. Some languages can drive programmers so crazy that they kill themselves. If I were ever forced to go back to Visual Basic 3 (no classes or collections), I wouldn't know how to rewrite Notepad.

I think if you add "creative enough", count exploits as programming, and define "any program" as any program that already exists, then the answer is yes.

Related

Is natural language Turing complete?

I'm pretty sure a human language (e.g. English) is powerful enough to simulate a Turing machine, which would make it Turing complete. However, that would imply natural languages are no more or less expressive than programming languages, which seems questionable.
First of all, "is language X Turing complete" is only a well-defined question given a well-defined semantics for language X. It is nearly impossible to define one for natural languages, due to their complex nature and reliance on context and intuition. Most (all?) natural languages don't even have a well-defined syntax.
That aside, your main confusion is based on the assumption that it's not possible for a computational model to be strictly more powerful than a Turing machine, i.e. to be able to simulate a Turing machine but also to express computations that a Turing machine cannot. This is not true. For example, we can extend Turing machines with oracles and we get a computational model that's strictly more powerful than plain Turing machines.
In the same vein we could define a programming language MagicLang that can do everything an ordinary programming language can do plus solve the halting problem. Defining a semantics for such a language is easy: just take the semantics of the language we used as a basis and add a function bool halts(string src, string input) with the semantics "returns true if the program described by the source code src successfully terminates after a finite amount of time when given the input input". So that's easy. What's hard, or rather impossible, is implementing this language.
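To see why such a halts() primitive would be strictly more powerful, here is a hedged sketch (Python-flavoured pseudocode; the searcher program and is_prime are illustrative, not any real API) of how MagicLang could settle an open question like Goldbach's conjecture:

    # The searcher halts exactly when it finds an even number >= 4 that is
    # NOT the sum of two primes, i.e. exactly when Goldbach's conjecture fails.
    searcher_src = """
    def is_prime(k):
        return k >= 2 and all(k % d for d in range(2, k))

    n = 4
    while any(is_prime(p) and is_prime(n - p) for p in range(2, n)):
        n += 2   # this even number checks out; try the next one
    """

    # With a working halts(), a single call resolves the conjecture:
    #   halts(searcher_src, "") == True   ->  a counterexample exists
    #   halts(searcher_src, "") == False  ->  Goldbach's conjecture is true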
Now one may argue that natural language can also describe the halting problem and our brain can "execute" natural language, i.e. it can answer the question "does this program halt". So if we could build a computer that could do everything our brain can do, it should be able to do this as well. But the thing is our brain can't solve the halting problem with 100% accuracy. Our brain can't even execute regular programs with 100% accuracy. Just remember how often you've stepped through a program in your head and came up with a different result than reality. Our brain is very good at learning, making intuitive connections and applying heuristics, but those things always come with the risk of giving the wrong result.
So could a computer do the same thing? Yes, we can use heuristics and machine learning to approach otherwise unsolvable problems and with that normal programming languages can attempt to solve every problem that can be described in natural language (even the undecidable ones). But just like the brain, those programs will sometimes give wrong results. In fact they will give wrong results much more often as our machine learning algorithms and heuristics aren't nearly as advanced as those of the human brain.
If a software language is sufficiently complex that it can be used to define arbitrary extensions to itself (such as defining arbitrary new functions), then it's clearly Turing-complete.
Using natural language I can, given sufficient time, teach another human terminology and concepts to extend their understanding and ability to discuss arbitrary subjects that they previously couldn't -- I could teach them copyright law, or astrophysics, for example (if they didn't already know them). So, while this may be more of an analogy than an exact identity, there does seem to be a Turing-completeness-like property to natural languages: they can be used to define and transmit arbitrary extensions to themselves. (Admittedly, not every human is really cut out to learn astrophysics -- but then any non-idealized Turing machine has only some finite amount of memory, so it's always possible to define a program that it can't run because it doesn't have enough memory.)

What does "powerful" mean, when discussing programming languages?

In the context of programming language discussion/comparison, what does the term "power" mean?
Does it have a well defined meaning? Even a poorly defined meaning?
Say if someone says "language X is more powerful than language Y" or asks the same as a question, what do they mean - or what information are they trying to find out?
It does not have a well-defined meaning. In these types of discussions, "language X is more powerful than language Y" usually means little more than "I like language X more than language Y." On the other end of the spectrum, you'll also usually have someone chime in about how any Turing-complete language can accomplish the same tasks as any other Turing-complete language, so that neither is strictly more powerful than the other.
I think a good meaning for it is expressivity. When a language is highly expressive, it means less code is required to express concepts. To me, this doesn't just mean that you have to write less code to accomplish the same tasks, but also that the code is easily readable by humans. Of course, generally (to a point), having fewer lines of code to read makes the task of reading and understanding easier for humans.
Having a "powerful" standard library comes into play here along the same lines. If a language comes equipped with thorough, complete libraries, then idiomatic code in that language will be able to benefit from the existing library code and not have to repeat or reinvent common functionality in application code. The end result is, again, having to write and read less code to accomplish the same tasks.
I keep saying "generally" and "to a point", because once a language gets too terse, it gets more difficult for humans to decipher. I suppose at this extreme, a language may still be considered "more powerful" (or even "too powerful"). So I guess I'm saying my personal interpretation of "powerful" includes some aspects of "useful" and "readable" in it as well.
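As a hedged, single-language illustration of that spectrum, compare three Python ways of summing the squares of the even numbers in a list:

    nums = [1, 2, 3, 4, 5, 6]

    # Verbose, C-style:
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n

    # Idiomatic and expressive:
    total = sum(n * n for n in nums if n % 2 == 0)

    # Arguably too terse for comfortable reading:
    total = sum(map(lambda n: n * n, filter(lambda n: not n % 2, nums)))

All three compute 56; which one counts as "most powerful" depends on whether you value brevity or readability more.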
C is powerful, because it is low level and gives you access to hardware. Python is powerful because you can prototype quickly. Lisp is powerful because its REPL gives you fantastic debugging opportunities. SQL is powerful because you say what you want and the DBMS will figure out the best way to do it for you. Haskell is powerful because each function can be tested in isolation. C++ is powerful because it has ten times the number of syntactic constructs that any one person ever needs or uses. APL is powerful since it can squeeze a ten-screen program into ten characters. Hell, COBOL is powerful because... why else would all the banks be using it? :)
"Powerful" has no real technical meaning, but lots of people have made proposals.
A couple of the more interesting ones:
Paul Graham wants to call a language "more powerful" if you can write the same programs in fewer lines of code (or some other sane, sensible measure of program size).
Matthias Felleisen has written a very serious theoretical study called On the Expressive Power of Programming Languages.
As someone who knows and uses many programming languages, I believe that there are real differences between languages, and that "power" can be a convenient shorthand to describe ways in which one language might be better than another. Nevertheless, whenever I hear a discussion or claim that one language is more powerful than another, I tend to keep one hand firmly on my wallet.
The only meaningful way to describe "power" in a programming language is "can do what I require with the least amount of resources" where "resources" is defined as "whatever costs I'd rather not pay" and could, thus, be development time, CPU time, memory space, money, etc.
So basically the definition of "power" is purely subjective and rendered meaningless in any objective discussion.
"Powerful" means "high in power". "Power" is something that increases your ability to do things. "Things" vary in shape, size and other things. Loosely speaking, therefore, "powerful" when applied to a programming language means that it helps you to perform your tasks quickly and efficiently.
This makes "powerful" somewhat well defined, but not constant across domains. A language powerful in one domain might be crippling in another. E.g., C is very powerful if you want to do systems-level programming, since it gives you direct access to the machine and hardware, and structures that let you code much faster than you would in assembly. C compilers also produce tight code that runs fast. However, once you move to web applications, C can become very "unpowerful" and crippling, since it's so much effort to get something up and running and you have to worry about a lot of extraneous details like memory, etc.
Sometimes, languages are "powerful" in multiple domains. This gives them a general "powerful" tag (or badge, since we are on SO here). PG's claim is that with Lisp, this is the case. That might be true or might not be.
At the end of the day, "powerful" is a loaded word, so you should evaluate who is saying it, why they're saying it, and what it means to your work.
There are really only two meanings people are worried about:
"Powerful" in the sense of "takes less resources (time, money, programmers, LOC, etc.) to achieve the same/better result", and "powerful" in the sense of "is capable of doing a wide range of tasks".
Some languages are extremely resource-effective for a small range of tasks. Others are not so resource-effective but can be applied to a wide range of tasks (e.g. C, which is often used in OS development, creation of compilers and runtime libraries, and work with microcontrollers).
Which of these two meanings someone has in mind when they use the term "powerful" depends on the context (and even then is not always clear). Indeed often it is a bit of both.
Typically there are two distinct meanings:
Expressive, meaning the code tends to be very short and understandable
Low level, meaning you have very fine-grained control over the hardware.
For most languages, these two definitions are at opposite ends of the spectrum: Python is very expressive but not very low level; C is very low level but not very expressive. Depending on which definition you pick, either language is powerful or not powerful.
Nothing, absolutely nothing.
To high-level programmers it might mean a lot of built-in data types, or maybe abstractions to easily create or follow design patterns.
Paul Graham is a very high-level guy; here is what he has to say:
http://www.paulgraham.com/avg.html
Java guys might tell you something about portability, the power to reach every platform.
C/UNIX programmers may tell you that it's speed and efficiency: complete control over every inch of memory.
VHDL/Verilog programmers will tell you it's complete control over every clock and gate, so as not to waste any electricity or time.
But in my opinion a "powerful language" supports all of the features for you to complete your task. Documentation may be important, or perhaps it is portability, or the ability to do graphics. It could be anything; writing a GUI in assembly is just stupid, and so is trying to design an embedded processor in Flash.
Choosing a language that suits your needs perfectly will always feel like power.
I view the term as marketing fluff, with no one well-defined meaning.
Consider, say, assembler, C, and C++. On occasion one drops from C++ "down" to C for particular needs, and in turn from C down to assembler. That makes assembler the most powerful, because it's the only language that can do everything. Or, to argue the other way, a single line of C++ code can replace several lines of C (hiding polymorphic dispatch via function pointers, for example), and a single line of C replaces many lines of assembler. So C++ is more powerful, because one line does "more".
I think the term had some currency when products such as early databases and spreadsheets had in-built languages, some quite restricted. So vendors would tout their language as being "powerful" because it was less restricted.
It can have several meanings. In the very basic sense there's power as far as what is computable. In that sense the most powerful languages are Turing Complete which includes pretty much every general purpose programming language (as opposed to most markup languages and domain specific languages which are often not Turing complete).
In a more pragmatic sense it often refers to how concisely (and readably) you can do certain things. Basically how easy is it to do certain tasks in one language compared to another.
What language is more powerful (besides being somewhat subjective) depends heavily on what you're trying to do. If your requirements are to get something running on a small device with 64k of memory, you're likely not going to be using Java. Most likely the right language would be C or C++ (or, if you're really hardcore, assembly). If you need a very simple CRUD app done in 1 day, maybe something like Ruby on Rails would be the way to go (I know Rails is a framework and Ruby is the language, but these days the libraries and frameworks available factor greatly into picking a language).
I think that, perhaps coincidentally, the physics definition of power is relevant here: "The rate at which work is performed."
Of course, a toaster does not perform very quickly the work of putting out fires. Similarly, the power of a programming language is not universal, but specific to the domain or task to which it is being applied. C is a powerful language for writing device drivers or implementations of higher-level languages; Python is a powerful language for writing general-purpose applications; XPath is a powerful language for writing queries on structured data sets.
So given a problem domain, the power of a language can be said to be the rate at which a competent programmer is able to use it to solve problems in that domain.
A precise answer is hard to reach, because the elements that define "powerful" (in the context of languages) come from many different dimensions.
Consider how many there could be, and plenty will still be missing:
runtime speed
code size
expressiveness
supported paradigms
development / debugging time
domain specialization
standard libs
codebase
toolchain ecosystem
portability
community
support / documentation
popularity
(add more here)
Together, these and other parameters draw a picture of what "programming in some language" would be like. That is only a definition, though; the only real knowledge comes with the actual practice of using the language, but I digress.
The question comes down to which parameter represents the intrinsic quality of a language. If you refer to a language in itself, its ultimate, intrinsic purpose is to express things, and thus the most representative parameter is rightfully expressiveness; it is also the one that resonates most often when someone talks about how powerful a language is.
The moment you try to widen the question/answer to cover more than the expressiveness of the language "as a language, as a tongue", you are really talking about different kinds of "environment": social environment, development environment, commercial environment, etc.
Depending on the complexity of the environment to be defined, you'll have to mix in more parameters that come from multiple, vast, overlapping and sometimes contradictory dimensions, and eventually the point of the definition will be lost or the question will have to be narrowed.
This approximation still won't answer "what is an expressive language", but, again, a common understanding is given by the definitions that Vineet points out well in his answer, and that Forest remarks on in the comments. I agree; for me, "expression" is "conveying meaning".
I remember many instructors in college calling whatever language they were teaching "powerful".
Leads me to think:
Powerful = a relative term comparing the latest way to code something vs. the original or previous way.
I find it useless to use the word "powerful" when discussing anything software related. Every time my professor in college would introduce a new concept such as polymorphism he would say "so this is a really powerful feature". After a while I got annoyed. If everything is powerful then nothing is. It's all the same. You can write code to do anything. Does it really matter how much code is required to do it? You can say it's short or efficient, but "powerful" is just useless. Nuclear energy is powerful. Code is words.
I think that power would normally refer to how quickly a language can process data. For example, I found that in Python, as soon as a list exceeds a length of approx. 2000 it becomes unbearably slow, whereas in C++ a list can easily contain 20,000 entries without doing so.

Non-deterministic programming languages

I know in Prolog you can do something like
someFunction(List) :-
    someOtherFunction(X, List),
    doSomethingWith(X).
    % and so on
This will not iterate over every element in List; instead, it will branch off into different "machines" (by using multiple threads, backtracking on a single thread, creating parallel universes or what have you), with a separate execution for every possible value of X that causes someOtherFunction(X, List) to return true!
(I have no idea how it does this, but that's not important to the question)
My question is: What other non-deterministic programming languages are out there? It seems like non-determinism is the simplest and most logical way to implement multi-threading in a language with immutable variables, but I've never seen this done before - Why isn't this technique more popular?
Prolog is actually deterministic—the order of evaluation is prescribed, and order matters.
Why isn't nondeterminism more popular?
Nondeterminism is unpopular because it makes it harder to reason about the outcomes of your programs, and truly nondeterministic executions (as opposed to semantics) are hard to implement.
The only nondeterministic languages I'm aware of are
Dijkstra's calculus of guarded commands, which he wanted never to be implemented
Concurrent ML, in which communications may be synchronized nondeterministically
Gerard Holzmann's Promela language, which is the language of the model checker SPIN
SPIN does actually use the nondeterminism and explores the entire state space when it can.
And of course any multithreaded language behaves nondeterministically if the threads are not synchronized, but that's exactly the sort of thing that's difficult to reason about—and why it's so hard to implement efficient, correct lock-free data structures.
Incidentally, if you are looking to achieve parallelism, you can achieve the same thing by a simple map function in a pure functional language like Haskell. There's a reason Google MapReduce is based on functional languages.
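For what it's worth, the same map-shaped parallelism can be sketched in Python too; a minimal example, assuming the work function is pure (square here is just an illustrative stand-in for real work):

    from multiprocessing import Pool

    def square(n):
        # A pure function: no shared state, so the order of evaluation
        # across worker processes cannot change the result.
        return n * n

    if __name__ == "__main__":
        with Pool() as pool:
            print(pool.map(square, range(10)))   # [0, 1, 4, ..., 81]

Because each call is independent, the runtime is free to schedule the work nondeterministically while the result stays deterministic.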
The Wikipedia article points to Amb which is a Scheme-derivative with capacities for non-deterministic programming.
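For a flavour of what amb does, here is a minimal hedged sketch in Python, modelling its "pick a value that makes the rest of the program succeed" behaviour by brute-force search (amb_solve is an illustrative name, not Scheme's actual operator):

    from itertools import product

    def amb_solve(choices, constraint):
        # Try every combination of choices and return the first one that
        # satisfies the constraint, mimicking amb's "angelic" nondeterminism.
        for assignment in product(*choices):
            if constraint(*assignment):
                return assignment
        raise ValueError("amb: no consistent choice exists")

    # Find x, y with x * y == 12 and x < y.
    print(amb_solve([range(1, 10), range(1, 10)],
                    lambda x, y: x * y == 12 and x < y))   # (2, 6)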
As far as I understand, the main reason why programming languages do not do that is because running a non-deterministic program on a deterministic machine (as are all existing computers) is inherently expensive. Basically, a non-deterministic Turing machine can solve complex problems in polynomial time, for which no polynomial algorithm for a deterministic Turing machine is known. In other words, non-deterministic programming fails to capture the essence of algorithmics in the context of existing computers.
The same problem impacts Prolog. Any efficient, or at least not-awfully-inefficient Prolog application must use the "cut" operator to avoid exploring an exponential number of paths. That operator works only as long as the programmer has a good mental view of how the Prolog interpreter will explore the possible paths, in a deterministic and very procedural way. Things which are very procedural do not mix well with functional programming, since the latter is mostly an effort of not thinking procedurally at all.
As a side note, in between deterministic and non-deterministic Turing machines, there is the "quantum computing" model. A quantum computer, assuming that one exists, does not do everything that a non-deterministic Turing machine can do, but it can do more than a deterministic Turing machine. There are people who are currently designing programming languages for the quantum computer (assuming that a quantum computer will ultimately be built). Some of those new languages are functional. You may find a host of useful links on this Wikipedia page. Apparently, designing a quantum programming language, functional or not, and using it, is not easy and certainly not "simple".
One example of a non-deterministic language is Occam, based on CSP theory. The combination of the PAR and ALT constructs can give rise to non-deterministic behaviour in multiprocessor systems, implementing fine grain parallel programs.
When using soft channels, i.e. channels between processes on the same processor, the implementation of ALT will make the behaviour close to deterministic†, but as soon as you start using hard channels (physical off-processor communication links) any illusion of determinism vanishes. Different remote processors are not expected to be synchronised in any way and they may not even have the same core or clock speed.
†The ALT construct is often implemented with a PRI ALT, so you have to explicitly code in fairness if you need it to be fair.
Non-determinism is seen as a disadvantage when it comes to reasoning about and proving programs correct, but in many ways once you've accepted it, you are freed from many of the constraints that determinism forces on your reasoning.
As long as the sequencing of communication doesn't lead to deadlock, which can be done by applying CSP techniques, then the precise order in which things are done should matter much less than whether you get the results that you want in time.
It was arguably this lack of determinism which was a major factor in preventing the adoption of Occam and Transputer systems in military projects, dominated by Ada at the time, where knowing precisely what a CPU was doing at every clock cycle was considered essential to proving a system correct. Without this constraint, Occam and the Transputer systems it ran on (the only CPUs at the time with a formally proven IEEE floating point implementation) would have been a perfect fit for hard real-time military systems needing high levels of processing functionality in a small space.
In Prolog you can have both non-determinism and concurrency. Non-determinism is what you described in your question concerning the example code. You can imagine that a Prolog clause is full of implicit amb statements. It is less known that concurrency is also supported by logic programming.
History says:
The first concurrent logic programming language was the Relational Language of Clark and Gregory, which was an offshoot of IC-Prolog. Later versions of concurrent logic programming include Shapiro's Concurrent Prolog and Ueda's Guarded Horn Clause language GHC.
https://en.wikipedia.org/wiki/Concurrent_logic_programming
But today we might just go with threads inside logic programming. Here is an example to implement a findall via threads. This can also be modded to perform all kinds of tasks on the collection, or maybe even produce agent networks towards distributed artificial intelligence.
I believe Haskell has the capability to construct a non-deterministic machine. Haskell may at first seem too difficult and abstract for practical use, but it's actually very powerful.
There is a programming language for non-deterministic problems called "Control Network Programming". If you want more information, go to http://controlnetworkprogramming.com. The site is still in progress, but you can read some info about it.
Java 2K
Note: before you click the link and end up disappointed: this is an esoteric language and has nothing to do with parallelism.
The Sly programming language under development at IBM Research is an attempt to include the non-determinism inherent in multi-threaded execution in the execution of certain types of algorithms. Looks to be very much a work in progress though.

What is a computer programming language?

At the risk of sounding naive, I ask this question in search of a deeper understanding of the concept of programming languages in general. I write this question for my own edification and the edification of others.
What is a useful definition of a computer programming language and what are its basic and necessary components? What are the key features that differentiate languages (functional, imperative, declarative, object oriented, scripting, etc...)?
One way to think about this question: imagine you are looking at the hardware of a modern desktop or laptop computer. Assume that the C language or any of its variants do not exist. How would you describe to others all the things needed to make the computer expressive and functional in terms of what we expect of personal computers today?
Tangentially related, what is it about computer languages that allow other languages to exist? For example take a scripting language like Javascript, Perl, or PHP. I assume part of the definition of these is that there is an interpreter most likely implemented in C or C++ at some level. Is it possible to write an interpreter for Javascript in Javascript? Is this a requirement for a complete language? Same for Perl, PHP, etc?
I would be satisfied with a list of concepts that can be looked up or researched further.
Like any language, programming languages are simply a communication tool for expressing and conveying ideas. In this case, we're translating our ideas of how software should work into a structured and methodical form that computers (as well as other humans who know the language, in most cases) can read and understand.
What is a useful definition of a computer programming language and what are its basic and necessary components?
I would say the defining characteristic of a programming language is as follows: things written in that language are intended to eventually be transformed into something that is executed. Thus, pseudocode, while perhaps having the structure and rigor of a programming language, is not actually a programming language. Likewise, UML can express many powerful ideas in an abstract manner just like a programming language can, but it falls short because people don't generally write UML to be executed.
How would you describe to others all the things needed to make the computer expressive and functional in terms of what we expect of personal computers today?
Even if the word "programming language" wasn't part of the shared vocabulary of the group I was talking to, I think it would be obvious to the others that we'd need a way to communicate with the computer. Just as no one expects a car to drive itself (yet!) without external instructions in the form of interaction with the steering wheel and pedals, no one could expect the hardware to function without being told what to do. As noted above, a programming language is the conduit through which we can make that communication happen.
Tangentially related, what is it about computer languages that allow other languages to exist?
All useful programming languages have a property called Turing completeness. If one language in the Turing-complete set can do something, then any of them can; they are said to be computationally equivalent.
However, just because they're equally "powerful" doesn't mean they're equally nice to work with for humans. This is why many people are willing to sacrifice the unparalleled micromanagement you get from writing assembly code in exchange for the expressiveness and power you get with higher-level languages, like Ruby, Python, or C#.
Is it possible to write an interpreter for Javascript in Javascript? Is this a requirement for a complete language? Same for Perl, PHP, etc?
Since there is a Javascript interpreter written in C, it follows that it must be possible to write a Javascript interpreter in Javascript, since both are Turing-complete. However, again, note that Turing-completeness says nothing about how hard it is to do something in one language versus another -- only whether it is possible to begin with. Your Javascript-interpreter-inside-Javascript might well be horrendously inefficient, consume absurd amounts of memory, require enormous processing power, and be a hideously ugly hack. But Turing-completeness guarantees it can be done!
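To make the self-hosting idea concrete in miniature, here is a hedged sketch in Python: an interpreter for a tiny expression language. A real Javascript-in-Javascript interpreter is this same recursive pattern scaled up enormously (the tuple encoding here is just an illustrative choice):

    def evaluate(expr):
        # Expressions are numbers or nested tuples like ('+', 1, ('*', 2, 3)).
        if isinstance(expr, (int, float)):
            return expr
        op, left, right = expr
        a, b = evaluate(left), evaluate(right)
        return {'+': a + b, '-': a - b, '*': a * b, '/': a / b}[op]

    print(evaluate(('+', 1, ('*', 2, 3))))   # 7

The interpreter runs inside another language's interpreter, which is exactly the layering Turing-completeness guarantees can always be repeated.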
While this doesn't directly answer your question, I am reminded of the Revenge of the Nerds essay by Paul Graham about the evolution of programming languages. It's certainly an interesting place to start your investigation.
Not a definition, but I think there are essentially two strands of development in programming languages:
Those working their way up from what the machine can do to something more expressive and less tied to the machine (Assembly, Fortran, C, C++, Java, ...)
Those going down from some mathematical or theoretical computer science concept of computation to something implementable on a real machine (Lisp, Prolog, ML, Haskell, ...)
Of course, in reality the picture is not as neat, and both strands influence each other by borrowing the best ideas.
Slightly long rant ahead.
A computer language is actually not all that different from a human language. Both are used to express ideas and concepts in commonly understood terms. Among different human languages there are syntactic differences, but you can express the same thing in every language (does that make human languages Turing complete? :)). Some languages are better suited for expressing certain things than others.
For example, although technically not completely correct, the Inuit language seems quite suited to describing various kinds of snow. Japanese, in my experience, is very suitable for expressing one's feelings and state of mind, thanks to a large, concise vocabulary in that area. German is pretty good for being very precise, thanks to largely unambiguous grammar.
Different programming languages have different specialities as well, but they mostly differ in the level of detail required to express things. The big difference between human and programming languages is mostly that programming languages lack a lot of vocabulary and have very few "grammatical" rules. With libraries you can extend the vocabulary of a language though.
For example:
Make me coffee.
Very easy to understand for a human, but only because we know what each of the words mean.
coffee : a drink made from the roasted and ground beanlike seeds of a tropical shrub
drink : a liquid that can be swallowed
swallow : cause or allow to pass down the throat
... and so on and so on
We know all these definitions by heart, but we had to learn them at some point.
In the same way, a computer can be "taught" to "understand" words as well.
Coffee::make()->giveTo($me);
This could be a perfectly valid expression in a computer language, if the computer "knows" what Coffee, make() and giveTo() mean and if $me is defined. It expresses the same idea as the English sentence, just with a different, more rigorous syntax.
In a different environment you'd have to say slightly different things to get the same outcome. In Japanese for example you'd probably say something like:
コーヒーを作ってもらっても良いですか?
Kōhī o tsukuttemoratte mo ii desu ka?
Which would roughly translate to:
if ($Person->isAgreeable('Coffee::make()')) {
    return $Person->return(Coffee::make());
}
Same idea, same outcome, but the $me is implied and if you don't check for isAgreeable first you may get a runtime error. In computer terms that would be somewhat analogous to Ruby's implied behaviour of returning the result of the last expression ("grammatical feature") and checking for available memory first (environmental necessity).
If you're talking to a really slow person with little vocabulary, you probably have to explain things in a lot more detail:
Go to the kitchen.
Take a pot.
Fill the pot with water.
...
Just like Assembler. :o)
Anyway, the point being, a programming language is actually a language just like a human language. Their syntax is different and specialized for the problem domain (logic/math) and the "listener" (computers), but they're just ways to transport ideas and concepts.
EDIT:
Another point about "optimization for the listener" is that programming languages try to eliminate ambiguity. The "make me coffee" example could, technically, be understood as "turn me into coffee". A human can tell what's meant intuitively; a computer can't. Hence in programming languages everything usually has one and only one meaning. Where it doesn't, you can run into problems, the "+" operator in Javascript being a common example.
1 + 1 -> 2
'1' + '1' -> '11'
See "Programming Considered as a Human Activity." EWD 117.
http://www.cs.utexas.edu/~EWD/transcriptions/EWD01xx/EWD117.html
Also See http://www.csee.umbc.edu/331/current/notes/01/01introduction.pdf
Human expression which:
describes mathematical functions
makes the computer turn switches on and off
This question is very broad. My favorite definition is that a programming language is a means of expressing computations:
Precisely
At a high level
In ways we can reason about them
By computation I mean what Turing and Church meant: the Turing machine and the lambda calculus have equivalent expressive power (which is a theorem), and the Church-Turing hypothesis (which is a conjecture) says roughly that there's no more powerful notion of computation out there. In other words, the kinds of computations that can be expressed in any programming languages are at best the kinds that can be expressed using Turing machines or lambda-calculus programs—and some languages will be able to express only a subset of those calculations.
This definition of computation also encompasses your friendly neighborhood hardware, which is pretty easy to simulate using a Turing machine and even easier to simulate using the lambda calculus.
Expressing computations precisely means the computer can't wiggle out of its obligations: if we have a particular computation in mind, we can use a programming language to force the computer to perform that computation. (Languages with "implementation defined" or "undefined" constructs make this task more difficult. Programmers using these languages are often willing to settle for—or may be unknowingly settling for—some computation that is only closely related to the computation they had in mind.)
Expressing computation at a high level is what programming languages are all about. An important reason that there are so many different programming languages out there is that there are so many different high-level ways of thinking about problems. Often, if you have an important new class of problems to solve, you may be best off creating a new programming language. For example, Larry Wall's writing suggests that solving a class of problems called "systems administration" was a motivation for him to create Perl.
(Another reason there are so many different programming languages out there is that creating a new language is a lot of fun, and anyone can learn to do it.)
Finally, many programmers want languages that make it easy to reason about programs. For example, today a student of mine implemented a new algorithm that made his program run over six times faster. He had to reason very carefully about the contents of C arrays to make sure that the new algorithm would do the same job the old one did. Luckily C has decent tools for reasoning about programs, for example:
A change in a[i] cannot affect the value of a[i-1].
My student also applied a reasoning principle that isn't valid in C:
The sum of a sequence of unsigned integers will be at least as large as any integer in the sequence.
This isn't true in C because the sum might overflow. One reason some programmers prefer languages like Standard ML is that in SML, this reasoning principle is always valid. Of languages in wide use, Haskell probably has the strongest reasoning principles; Richard Bird has developed equational reasoning about programs to a high art.
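To illustrate the pitfall, here is a hedged Python sketch that emulates C's 32-bit unsigned wraparound (the helper u32_sum is purely illustrative):

    def u32_sum(xs):
        total = 0
        for x in xs:
            # Emulate C's unsigned 32-bit arithmetic: results wrap modulo 2**32.
            total = (total + x) & 0xFFFFFFFF
        return total

    xs = [0xFFFFFFF0, 0x20]   # both values fit comfortably in 32 bits
    print(u32_sum(xs))        # 16, smaller than either term in the sequence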
I will not attempt to address all the tangential details that follow your opening question. But I hope you will get something out of an answer that aims to give a deeper understanding, as you asked, of a fundamental question about programming languages.
One thing a lot of "IT" types forget is that there are 2 types of computer programming languages:
Software programming languages: C, Java, Perl, COBOL, etc.
Hardware programming languages: VHDL, Verilog, System Verilog, etc.
Interesting.
I'd say the defining feature of a programming language is the ability to make decisions based on input. Effectively, if and goto. Everything else is lots and lots of syntactic sugar. This is the idea that spawned Brainfuck, which is actually remarkably fun to (try to) use.
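As a hedged sketch of that "if and goto" claim, here is a counting loop in Python written as a flat instruction list driven by a program counter; the numeric labels are purely illustrative:

    i, pc = 0, 0          # pc is the "program counter"; assigning to it is a goto
    while pc != 3:        # label 3 is the exit
        if pc == 0:
            pc = 3 if i >= 3 else 1   # conditional goto: if i >= 3, jump to exit
        elif pc == 1:
            print(i)                  # the loop body
            pc = 2
        elif pc == 2:
            i += 1
            pc = 0                    # unconditional goto: back to the test
    print("done")         # prints 0, 1, 2, then done

The structured while/for/functions we actually write are, in this view, the syntactic sugar layered over that test-and-jump core.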
There are places where the line blurs; for example, I doubt people would consider XSLT to really be a programming language, but it's Turing-complete. I've even solved a Project Euler problem with it. (Very, very slowly.)
Three main properties of languages come to mind:
How is it run? Is it compiled to bare metal (C), compiled to mostly bare metal with some runtime lookup (C++), run on a JIT virtual machine (Java, .NET), bytecode-interpreted (Perl), or purely interpreted (uhh..)? This doesn't comment much on the language itself, but speaks to how portable the code may be, what sort of speed I might expect (and thus what broad classes of tasks would work well), and sometimes how flexible the language is.
What paradigms does it support? Procedural? Functional? Is the standard library built with classes or functions? Is there reflection? Is there, ideally, support for pretty much whatever I want to do?
How can I represent my data? Are there arrays, and are they fixed-size or not? How easy is it to use strings? Are there structs or hashes built in? What's the type system like? Are there objects? Are they class-based or prototype-based? Is everything an object, or are there primitives? Can I inherit from built-in objects?
I realize the last one is a very large collection of potential questions, but it's all related in my mind.
I imagine rebuilding the programming language landscape entirely from scratch would work pretty much how it did the first time: iteratively. Start with assembly, the list of direct commands the processor understands, and wrap it with something a bit easier to use. Repeat until you're happy.
Yes, you can write a Javascript interpreter in Javascript, or a Python interpreter in Python (see: PyPy), or a Python interpreter in Javascript. Such languages are called self-hosting. Have a look at Perl 6; this has been a goal for its main implementation from the start.
Ultimately, everything just has to translate to machine code, not necessarily C. You can write D or Fortran or Haskell or Lisp if you want. C just happens to be an old standard. And if you write a compiler for language Foo that can ultimately spit out machine code, by whatever means, then you can rewrite that compiler in Foo and skip the middleman. Of course, if your language is purely interpreted, this will probably result in a stack overflow...
As a friend taught me about computer languages, a language is a world. A world of communication with that machine. It is a world for implementing ideas, algorithms, and functionality, as Alonzo and Alan described. It is the technical equivalent of the mathematical structures that the aforementioned scientists built. It is a language with expressions and also limits. However, as Ludwig Wittgenstein said, "The limits of my language mean the limits of my world"; there are always limitations, and that's how one chooses the language that best fits one's needs.
This is a generic answer... some thoughts, actually, rather than an answer.
There are many definitions to this but what I prefer is:
Computer programming is programming that helps to solve a particular technical task/problem.
There are 3 key phrases to look out for:
You: The computer will do what you (the programmer) tell it to do.
Instruct: Instruction is given to the computer in a language that it can understand. We will discuss that below.
Problem: At the end of the day, computers are tools (complex ones). They are there to make our lives simpler.
The answer can be lengthy but you can find more about computer programming

what is a programming language?

Wikipedia says:
A programming language is a machine-readable artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that specify the behavior of a machine, to express algorithms precisely, or as a mode of human communication.
But is this true? It occurred to me in the shower this morning that a programming language might just be a set of conventions, something that both a human and an appropriately arranged compiler can interpret. If that's the case, then isn't this definition of a programming language misleading? If that isn't the case, then what's the difference between a compiler and the language it compiles?
Thanks!
z.
A programming language is exactly that set of conventions, but I don't see why that makes the Wikipedia entry misleading, really. If it makes you feel better, you might edit it to read something like:
A programming language is a machine-readable artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to define programs that specify the behavior of a machine, to express algorithms precisely, or as a mode of human communication.
I understand what you are saying, and you are right. Describing a programming language as a "machine-readable artificial language designed to express computations that can be performed by a machine" is unnecessarily specific. Programming languages can be more broadly generalized as established descriptions of tasks (or "a set of conventions") that allow one entity to control the behavior of another. What we traditionally identify as programming languages are just a layer of abstraction between machine code and programmers, and are specifically designed for electronic computers.
Programming languages are not limited to traditional computers (see the K'NEX Computer), and aren't even necessarily limited to computational devices at all. For example, when I am pleased with my dog's behavior, he gets a treat. When I am displeased, he gets nothing. Over time the dog learns the treat/no treat programming and I can use the treats to control his behavior (to an extent).
I don't see what is different between what you are asking...
It occurred to me in the shower this morning that a programming language might just be a set of conventions, something that both a human and an appropriately arranged compiler can interpret.
... and the Wikipedia definition.
The key is that a programming language is just "a machine-readable artificial language".
A compiler does indeed act as an effective specification of a language in terms of a reduction to machine code - however, as it's generally difficult to understand a language by reading the compiler's source, one generally considers a programming language in terms of an abstract processing model that the compiler implements. This abstract model is what one means when one refers to the programming language.
That said, there are indeed many languages (Hi there, PHP!) in which the compiler is the only specification of the language in existence. These languages tend to change unpredictably at times as compiler bugs are fixed or introduced.
Programming languages are an abstraction layer that helps insulate the programmer from having to talk in electrical signals to the computer. The creators of the language have done all the hard work in creating a structure (language) or standard (grammar, conjugation, etc.) that then can be interpreted by a compiler in terms that the computer understands.
All programming languages are really nothing more than domain specific languages for machine code or manipulating the registers and memory of a processing entity.
This is probably the true explanation of what a programming language really is:
Step 1: Think of a language and its grammar, which is a set of rules for making syntactically valid statements using the language. For example, a language called GRID has tiles {0,1} as its alphabet and grammar rules that make sure every GRID statement has equal length and height.
Step 2 (definition of program): GRID, so far, is useless. I'd dare to think of any valid statement of GRID as just data. We need to add something else to GRID: a successor function. So GRID = {grammar, alphabet, successor function}. To make this clear, let's use the rules of "The Game of Life" as the successor function.
Step 3: The Game of Life is actually Turing Complete, so GRID={Grammar, alphabet, successor function = GOL} can perform any computation that is computable.
A programming language is nothing but a language with a successor function. The environment that evaluates a valid statement of the language(program) does nothing but follow those successor functions. Variables, for example, are things whose successor functions = (STAY THE SAME)
Computers are just very fast environments ;)
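A hedged Python sketch of such a successor function, using the Game of Life rules suggested above (the sparse set-of-live-cells representation is just one implementation choice):

    from itertools import product

    def step(live):
        # One application of Conway's rules: count each cell's live neighbours,
        # then keep births (exactly 3) and survivors (2 neighbours and alive).
        counts = {}
        for (x, y) in live:
            for dx, dy in product((-1, 0, 1), repeat=2):
                if (dx, dy) != (0, 0):
                    counts[(x + dx, y + dy)] = counts.get((x + dx, y + dy), 0) + 1
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    # A "blinker": three cells in a row oscillate between horizontal and vertical.
    state = {(0, 0), (1, 0), (2, 0)}
    print(step(state))   # {(1, -1), (1, 0), (1, 1)}

Running the program is nothing more than applying the successor function over and over, which is the point of the answer above.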
Wikipedia's definition might have been taken out of context. For one thing, only programs written in machine code are machine-readable. Otherwise, you need a compiler to convert C++, Java or even assembly code to machine code so the computer can carry out your instructions. Unless you include comments that are only readable to humans, or unless you are strictly discussing a topic within the realm of your program, programming is insufficient for human communication.
