Are all those markup languages programming languages?
For example, are XML or HTML programming languages?
The term "programming language" isn't defined rigorously enough for this to have a good answer. It really depends on the context in which the term is being used. In many contexts (usually if languages like C++, D or Java are being heavily mentioned) a "programming language" has to be Turing complete. However, XML and HTML do meet a more lenient set of criteria. They have a grammar and are textual means of expressing to a computer what you want it to do. Therefore, in some contexts they could be considered declarative programming languages.
Markup languages could more accurately be called data description languages: they describe your data, while programming languages are used to instruct the computer to perform logic.
It depends on the language. There are Turing-complete markup languages (including XML-ish ones), of course; they aren't common, though, because they are ugly.
As the terms are most often used, I'd say HTML and plain XML both fall outside the realm of programming languages.
That said, quite a bit of HTML includes bits and pieces of JavaScript, which would have to be considered a programming language by almost any definition. Likewise, the "X" in XML is short for eXtensible. That (indirectly) means you can attach nearly any meaning you want to something stored in XML. In this case, the structure of the source code and the structure of the XML can be direct reflections of each other. Nonetheless, these are really examples of the markup language acting as a container for source code (or object code, though that's somewhat less common) written in some separately defined programming language, which doesn't change the fact that the markup language itself isn't really a programming language.
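To make that concrete, here's a tiny hedged sketch in Python (the element names are invented for this example): the XML merely carries the code as data, and a separately implemented language does all the actual work.

    import xml.etree.ElementTree as ET

    # The markup only describes structure; the text inside <script> is
    # source code in a separately defined language (here, Python itself).
    document = """
    <task name="greet">
      <script>
    print("hello from code carried inside markup")
      </script>
    </task>
    """

    root = ET.fromstring(document)
    code = root.find("script").text
    exec(code.strip())  # the markup never "ran" anything; Python did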
There are more borderline cases (e.g., defining animations in HTML with CSS), but while they approach the border, at least right offhand none occurs to me that really crosses it to the point that you could unequivocally call it a programming language.
For languages that appear at academic conferences like POPL or ICFP, the language's semantics (in the form of operational or denotational semantics) are often well specified. I was trying to find documented formal semantics for popular languages (e.g., C, Python, JavaScript) but couldn't find any.
When such languages with "heavy" features (heavy relative to languages designed as proofs of concept) are being developed, do the designers (or committee members) of those languages add features without specifying their semantics? And is that the case for most popular programming languages?
If so, I think it makes sense practically because not every person who wants to contribute to developing a language needs to be a PL researcher. But I was wondering about what kind of real-world trade-offs exist.
The semantics of some dynamic programming languages are emergent, because they are minimal in their syntactic core and are mostly defined by their libraries (the language that is actually used for programming is much larger than what is defined by the syntax). Examples are:
LISP
Perl
Tcl
Some languages are defined with so much syntactic ambiguity that the semantics end up being defined by the particular implementation. Examples:
Early C++
C++ with STL
AG Natural
Any programming language with macro capabilities, or that is normally used with a macro preprocessor, ends up with its semantics redefined by the macros used (as in domain-specific languages). Dynamic languages that allow changes to parsing behavior at runtime are likewise defined at runtime.
In object-oriented languages (and other languages that dispatch depending on the type of the objects) the semantics of an expression depend on the types of the objects involved, and those may depart significantly from the semantics of equivalent expressions for built-in and standard types.
Almost all languages are defined with a normative notation such as BNF. This site has references to many.
Part of this is to remove ambiguities and ensure syntactic consistency. It would be difficult to build compilers or renderers without them.
Parts of this went into the design of HTML 5.2, which explains some of the reasoning.
Possible Duplicate:
What is a DSL and where should I use it?
I've heard the term used a lot... what exactly does it mean for a language to be "domain-specific"?
Also, what does it mean for a language (e.g. Groovy) to support domain-specific languages?
For your first question, a bit of googling will be sufficient.
As for the second question: you can implement DSLs in any language. You can even implement eDSLs in almost any language, but some languages are much better at it than others. The key feature is metaprogramming: the ability to generate code in your host language, which means you can plug a compiler for your eDSL in anywhere. Features that facilitate compiler construction are also useful, e.g., out-of-the-box parsing tools, an extensible or just flexible host-language syntax, algebraic data types for representing ASTs, pattern matching for simplifying compiler transformations, etc. There is a continuum of possibilities, with entirely static and unextensible languages on one side and absolutely flexible languages on the other.
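As a rough illustration, here's a minimal eDSL sketch in Python (all the names are invented for this example): operator overloading lets ordinary host-language expressions build an AST, which a tiny "compiler" can then walk however it likes.

    # A tiny embedded DSL: writing expressions in the host language
    # builds an AST instead of computing a value immediately.
    class Expr:
        def __add__(self, other): return BinOp("+", self, lift(other))
        def __mul__(self, other): return BinOp("*", self, lift(other))

    class Lit(Expr):
        def __init__(self, value): self.value = value

    class Var(Expr):
        def __init__(self, name): self.name = name

    class BinOp(Expr):
        def __init__(self, op, left, right):
            self.op, self.left, self.right = op, left, right

    def lift(v):
        return v if isinstance(v, Expr) else Lit(v)

    def evaluate(e, env):
        # the "back end": here an interpreter, but it could emit code
        if isinstance(e, Lit): return e.value
        if isinstance(e, Var): return env[e.name]
        ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
        return ops[e.op](evaluate(e.left, env), evaluate(e.right, env))

    formula = Var("x") * Var("x") + 1   # a program in the embedded DSL
    print(evaluate(formula, {"x": 3}))  # 10

Languages with real macros (Lisp) or very flexible syntax can push the embedded language much further from the host's look and feel, which is the continuum described above.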
A "domain specific language" is one in which a class of problems (or solutions to problems) can be expressed succinctly, usually because the vocabulary aligns with the that of the problem domain, and the notation is similar (where possible) to that used by experts that work in the domain.
What this really means is a grammar representing what you can say, and a set of semantics that defines what those said things mean. This makes DSLs just like other conventional programming languages (e.g., Java) in terms of how they are implemented. In fact, you can think of such conventional languages as "DSL"s that are good at describing procedural solutions to problems (but not necessarily good at describing the problems themselves). The implication is that you need the same set of machinery to process DSLs as you do to process conventional languages, and that's essentially compiler machinery.
Groovy has some of this machinery (by design) which is why it can "support" DSLs.
See Domain Specific Languages for a discussion about DSLs in general, and a particular kind of metaprogramming machinery that is very helpful for implementing them.
At the risk of sounding naive, I ask this question in search of a deeper understanding of the concept of programming languages in general. I write this question for my own edification and the edification of others.
What is a useful definition of a computer programming language and what are its basic and necessary components? What are the key features that differentiate languages (functional, imperative, declarative, object oriented, scripting, etc...)?
One way to think about this question: imagine you are looking at the hardware of a modern desktop or laptop computer. Assume that the C language and all its variants do not exist. How would you describe to others all the things needed to make the computer expressive and functional in terms of what we expect of personal computers today?
Tangentially related: what is it about computer languages that allows other languages to exist? For example, take a scripting language like Javascript, Perl, or PHP. I assume part of the definition of these is that there is an interpreter, most likely implemented in C or C++ at some level. Is it possible to write an interpreter for Javascript in Javascript? Is this a requirement for a complete language? Same for Perl, PHP, etc.?
I would be satisfied with a list of concepts that can be looked up or researched further.
Like any language, programming languages are simply a communication tool for expressing and conveying ideas. In this case, we're translating our ideas of how software should work into a structured and methodical form that computers (as well as other humans who know the language, in most cases) can read and understand.
What is a useful definition of a computer programming language and what are its basic and necessary components?
I would say the defining characteristic of a programming language is as follows: things written in that language are intended to eventually be transformed into something that is executed. Thus, pseudocode, while perhaps having the structure and rigor of a programming language, is not actually a programming language. Likewise, UML can express many powerful ideas in an abstract manner just like a programming language can, but it falls short because people don't generally write UML to be executed.
How would you describe to others all the things needed to make the computer expressive and functional in terms of what we expect of personal computers today?
Even if the term "programming language" weren't part of the shared vocabulary of the group I was talking to, I think it would be obvious to the others that we'd need a way to communicate with the computer. Just as no one expects a car to drive itself (yet!) without external instructions in the form of interaction with the steering wheel and pedals, no one could expect the hardware to function without being told what to do. As noted above, a programming language is the conduit through which we can make that communication happen.
Tangentially related, what is it about computer languages that allow other languages to exist?
All useful programming languages have a property called Turing completeness. If one language in the Turing-complete set can do something, then any of them can; they are said to be computationally equivalent.
However, just because they're equally "powerful" doesn't mean they're equally nice to work with for humans. This is why many people are willing to sacrifice the unparalleled micromanagement you get from writing assembly code in exchange for the expressiveness and power you get with higher-level languages, like Ruby, Python, or C#.
Is it possible to write an interpreter for Javascript in Javascript? Is this a requirement for a complete language? Same for Perl, PHP, etc?
Since there is a Javascript interpreter written in C, it follows that it must be possible to write a Javascript interpreter in Javascript, since both are Turing-complete. However, again, note that Turing-completeness says nothing about how hard it is to do something in one language versus another -- only whether it is possible to begin with. Your Javascript-interpreter-inside-Javascript might well be horrendously inefficient, consume absurd amounts of memory, require enormous processing power, and be a hideously ugly hack. But Turing-completeness guarantees it can be done!
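You can see the principle in miniature with Python, which ships with the machinery to treat its own source code as data, compile it, and run it (not a full self-hosted interpreter, but the same idea):

    # Python processing a program written in Python.
    source = 'total = sum(range(10))\nprint("total =", total)'
    code_object = compile(source, "<string>", "exec")
    exec(code_object)  # prints: total = 45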
While this doesn't directly answer your question, I am reminded of the Revenge of the Nerds essay by Paul Graham about the evolution of programming languages. It's certainly an interesting place to start your investigation.
Not a definition, but I think there are essentially two strands of development in programming languages:
Those working their way up from what the machine can do to something more expressive and less tied to the machine (Assembly, Fortran, C, C++, Java, ...)
Those going down from some mathematical or theoretical computer science concept of computation to something implementable on a real machine (Lisp, Prolog, ML, Haskell, ...)
Of course, in reality the picture is not as neat, and both strands influence each other by borrowing the best ideas.
Slightly long rant ahead.
A computer language is actually not all that different from a human language. Both are used to express ideas and concepts in commonly understood terms. Among different human languages there are syntactic differences, but you can express the same thing in every language (does that make human languages Turing complete? :)). Some languages are better suited for expressing certain things than others.
For example, although technically not completely correct, the Inuit language seems quite suited to describing various kinds of snow. Japanese, in my experience, is very suitable for expressing one's feelings and state of mind, thanks to a large, concise vocabulary in that area. German is pretty good for being very precise, thanks to a largely unambiguous grammar.
Different programming languages have different specialities as well, but they mostly differ in the level of detail required to express things. The big difference between human and programming languages is mostly that programming languages lack a lot of vocabulary and have very few "grammatical" rules. With libraries you can extend the vocabulary of a language though.
For example:
Make me coffee.
Very easy to understand for a human, but only because we know what each of the words mean.
coffee : a drink made from the roasted and ground beanlike seeds of a tropical shrub
drink : a liquid that can be swallowed
swallow : cause or allow to pass down the throat
... and so on and so on
We know all these definitions by heart, but we had to learn them at some point.
In the same way, a computer can be "taught" to "understand" words as well.
Coffee::make()->giveTo($me);
This could be a perfectly valid expression in a computer language, provided the computer "knows" what Coffee, make() and giveTo() mean, and $me is defined. It expresses the same idea as the English sentence, just with a different, more rigorous syntax.
In a different environment you'd have to say slightly different things to get the same outcome. In Japanese for example you'd probably say something like:
コーヒーを作ってもらっても良いですか?
Kōhī o tsukuttemoratte mo ii desu ka?
Which would roughly translate to:
if ($Person->isAgreeable('Coffee::make()')) {
return $Person->return(Coffee::make());
}
Same idea, same outcome, but the $me is implied and if you don't check for isAgreeable first you may get a runtime error. In computer terms that would be somewhat analogous to Ruby's implied behaviour of returning the result of the last expression ("grammatical feature") and checking for available memory first (environmental necessity).
If you're talking to a really slow person with little vocabulary, you probably have to explain things in a lot more detail:
Go to the kitchen.
Take a pot.
Fill the pot with water.
...
Just like Assembler. :o)
Anyway, the point being, a programming language is actually a language just like a human language. Their syntax is different and specialized for the problem domain (logic/math) and the "listener" (computers), but they're just ways to transport ideas and concepts.
EDIT:
Another point about "optimization for the listener" is that programming languages try to eliminate ambiguity. The "make me coffee" example could, technically, be understood as "turn me into coffee". A human can tell what's meant intuitively; a computer can't. Hence, in programming languages everything usually has one and only one meaning. Where it doesn't, you can run into problems, the "+" operator in Javascript being a common example:
1 + 1 -> 2
'1' + '1' -> '11'
See "Programming Considered as a Human Activity." EWD 117.
http://www.cs.utexas.edu/~EWD/transcriptions/EWD01xx/EWD117.html
Also See http://www.csee.umbc.edu/331/current/notes/01/01introduction.pdf
Human expression which:
describes mathematical functions
makes the computer turn switches on and off
This question is very broad. My favorite definition is that a programming language is a means of expressing computations:
Precisely
At a high level
In ways we can reason about them
By computation I mean what Turing and Church meant: the Turing machine and the lambda calculus have equivalent expressive power (which is a theorem), and the Church-Turing hypothesis (which is a conjecture) says roughly that there's no more powerful notion of computation out there. In other words, the kinds of computations that can be expressed in any programming language are at best the kinds that can be expressed using Turing machines or lambda-calculus programs, and some languages will be able to express only a subset of those calculations.
This definition of computation also encompasses your friendly neighborhood hardware, which is pretty easy to simulate using a Turing machine and even easier to simulate using the lambda calculus.
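To make the lambda-calculus side concrete, here's a hedged sketch using nothing but Python's lambdas: Church numerals encode numbers as functions, and arithmetic falls out of function application alone.

    # Church numerals: the number n is "apply f to x, n times".
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    def to_int(n):
        # decode by counting applications of "+1" starting from 0
        return n(lambda k: k + 1)(0)

    two = succ(succ(zero))
    three = succ(two)
    print(to_int(add(two)(three)))  # 5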
Expressing computations precisely means the computer can't wiggle out of its obligations: if we have a particular computation in mind, we can use a programming language to force the computer to perform that computation. (Languages with "implementation defined" or "undefined" constructs make this task more difficult. Programmers using these languages are often willing to settle for—or may be unknowingly settling for—some computation that is only closely related to the computation they had in mind.)
Expressing computation at a high level is what programming languages are all about. An important reason that there are so many different programming languages out there is that there are so many different high-level ways of thinking about problems. Often, if you have an important new class of problems to solve, you may be best off creating a new programming language. For example, Larry Wall's writing suggests that solving a class of problems called "systems administration" was a motivation for him to create Perl.
(Another reason there are so many different programming languages out there is that creating a new language is a lot of fun, and anyone can learn to do it.)
Finally, many programmers want languages that make it easy to reason about programs. For example, today a student of mine implemented a new algorithm that made his program run over six times faster. He had to reason very carefully about the contents of C arrays to make sure that the new algorithm would do the same job the old one did. Luckily C has decent tools for reasoning about programs, for example:
A change in a[i] cannot affect the value of a[i-1].
My student also applied a reasoning principle that isn't valid in C:
The sum of a sequence of unsigned integers will be at least as large as any integer in the sequence.
This isn't true in C because the sum might overflow. One reason some programmers prefer languages like Standard ML is that in SML, this reasoning principle is always valid. Of languages in wide use, Haskell probably has the strongest reasoning principles; Richard Bird has developed equational reasoning about programs to a high art.
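Here's a sketch of the failing principle, simulating C's 32-bit unsigned wraparound in Python (the helper name is invented for illustration):

    # C's unsigned arithmetic is modulo 2**32, so a sum can come out
    # smaller than its operands, and the reasoning principle fails.
    MASK = 0xFFFFFFFF  # 2**32 - 1, the largest 32-bit unsigned value

    def uadd32(a, b):
        return (a + b) & MASK

    print(uadd32(0xFFFFFFFF, 1))  # 0, smaller than either operand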
I will not attempt to address all the tangential details that follow your opening question. But I hope you will get something out of an answer that aims to give a deeper understanding, as you asked, of a fundamental question about programming languages.
One thing a lot of "IT" types forget is that there are 2 types of computer programming languages:
Software programming languages: C, Java, Perl, COBOL, etc.
Hardware programming languages: VHDL, Verilog, System Verilog, etc.
Interesting.
I'd say the defining feature of a programming language is the ability to make decisions based on input. Effectively, if and goto. Everything else is lots and lots of syntactic sugar. This is the idea that spawned Brainfuck, which is actually remarkably fun to (try to) use.
There are places where the line blurs; for example, I doubt people would consider XSLT to really be a programming language, but it's Turing-complete. I've even solved a Project Euler problem with it. (Very, very slowly.)
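For flavor, here is a minimal (and certainly unofficial) Brainfuck interpreter sketched in Python; a tape, a data pointer, and matched conditional jumps are the entire language, which is about as close to bare "if and goto" as it gets.

    # A minimal Brainfuck interpreter: eight commands, one tape.
    def run(program, input_bytes=b""):
        tape, ptr, pc = [0] * 30000, 0, 0
        out, inp = [], list(input_bytes)
        stack, jumps = [], {}
        for i, c in enumerate(program):    # match brackets up front
            if c == "[":
                stack.append(i)
            elif c == "]":
                j = stack.pop()
                jumps[i], jumps[j] = j, i
        while pc < len(program):
            c = program[pc]
            if c == ">": ptr += 1
            elif c == "<": ptr -= 1
            elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
            elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
            elif c == ".": out.append(chr(tape[ptr]))
            elif c == ",": tape[ptr] = inp.pop(0) if inp else 0
            elif c == "[" and tape[ptr] == 0: pc = jumps[pc]
            elif c == "]" and tape[ptr] != 0: pc = jumps[pc]
            pc += 1
        return "".join(out)

    print(run("+" * 65 + "."))  # prints: A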
Three main properties of languages come to mind:
How is it run? Is it compiled to bare metal (C), compiled to mostly bare metal with some runtime lookup (C++), run on a JIT virtual machine (Java, .NET), bytecode-interpreted (Perl), or purely interpreted (uhh..)? This doesn't comment much on the language itself, but speaks to how portable the code may be, what sort of speed I might expect (and thus what broad classes of tasks would work well), and sometimes how flexible the language is.
What paradigms does it support? Procedural? Functional? Is the standard library built with classes or functions? Is there reflection? Is there, ideally, support for pretty much whatever I want to do?
How can I represent my data? Are there arrays, and are they fixed-size or not? How easy is it to use strings? Are there structs or hashes built in? What's the type system like? Are there objects? Are they class-based or prototype-based? Is everything an object, or are there primitives? Can I inherit from built-in objects?
I realize the last one is a very large collection of potential questions, but it's all related in my mind.
I imagine rebuilding the programming language landscape entirely from scratch would work pretty much how it did the first time: iteratively. Start with assembly, the list of direct commands the processor understands, and wrap it with something a bit easier to use. Repeat until you're happy.
Yes, you can write a Javascript interpreter in Javascript, or a Python interpreter in Python (see: PyPy), or a Python interpreter in Javascript. Such languages are called self-hosting. Have a look at Perl 6; this has been a goal for its main implementation from the start.
Ultimately, everything just has to translate to machine code, not necessarily C. You can write D or Fortran or Haskell or Lisp if you want. C just happens to be an old standard. And if you write a compiler for language Foo that can ultimately spit out machine code, by whatever means, then you can rewrite that compiler in Foo and skip the middleman. Of course, if your language is purely interpreted, this will probably result in a stack overflow...
As a friend taught me about computer languages, a language is a world: a world of communication with the machine, a world for implementing ideas, algorithms, and functionality, as Alonzo Church and Alan Turing described. It is the technical equivalent of the mathematical structures those scientists built. It is a language with expressions and also limits. As Ludwig Wittgenstein said, "The limits of my language mean the limits of my world"; there are always limitations, and that's why one chooses the language that best fits one's needs.
This is a generic answer... some thoughts, really, rather than an answer.
There are many definitions to this but what I prefer is:
Computer programming is you instructing a computer to solve a particular technical task/problem.
There are 3 key phrases to look out for:
You: The computer will do what you (the programmer) tell it to do.
Instruct: Instruction is given to the computer in a language that it can understand. We will discuss that below.
Problem: At the end of the day, computers are (complex) tools. They are there to make our lives simpler.
The answer can be lengthy, but you can find out more about computer programming elsewhere.
Wikipedia says:
A programming language is a machine-readable artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that specify the behavior of a machine, to express algorithms precisely, or as a mode of human communication.
But is this true? It occurred to me in the shower this morning that a programming language might just be a set of conventions, something that both a human and an appropriately arranged compiler can interpret. If that's the case, then isn't this definition of a programming language misleading? If that isn't the case, then what's the difference between a compiler and the language it compiles?
A programming language is exactly that set of conventions, but I don't see why that makes the Wikipedia entry misleading, really. If it makes you feel better, you might edit it to read something like:
A programming language is a machine-readable artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to define programs that specify the behavior of a machine, to express algorithms precisely, or as a mode of human communication.
I understand what you are saying, and you are right. Describing a programming language as a "machine-readable artificial language designed to express computations that can be performed by a machine" is unnecessarily specific. Programming languages can be more broadly generalized as established descriptions of tasks (or "a set of conventions") that allow one entity to control the behavior of another. What we traditionally identify as programming languages are just a layer of abstraction between machine code and programmers, and are specifically designed for electronic computers.
Programming languages are not limited to traditional computers (see the K'NEX Computer), and aren't even necessarily limited to computational devices at all. For example, when I am pleased with my dog's behavior, he gets a treat. When I am displeased, he gets nothing. Over time the dog learns the treat/no treat programming and I can use the treats to control his behavior (to an extent).
I don't see what is different between what you are asking...
It occurred to me in the shower this morning that a programming language might just be a set of conventions, something that both a human and an appropriately arranged compiler can interpret.
... and the Wikipedia definition.
The key is that a programming language is just "a machine-readable artificial language".
A compiler does indeed act as an effective specification of a language in terms of a reduction to machine code - however, as it's generally difficult to understand a language by reading the compiler's source, one generally considers a programming language in terms of an abstract processing model that the compiler implements. This abstract model is what one means when one refers to the programming language.
That said, there are indeed many languages (Hi there, PHP!) in which the compiler is the only specification of the language in existence. These languages tend to change unpredictably at times as compiler bugs are fixed or introduced.
Programming languages are an abstraction layer that helps insulate the programmer from having to talk in electrical signals to the computer. The creators of the language have done all the hard work in creating a structure (language) or standard (grammar, conjugation, etc.) that then can be interpreted by a compiler in terms that the computer understands.
All programming languages are really nothing more than domain specific languages for machine code or manipulating the registers and memory of a processing entity.
This is probably the true explanation of what a programming language really is:
Step 1: Think of a language and its grammar, which is a set of rules for making syntactically valid statements in the language. For example, a language called GRID has the tiles {0,1} as its alphabet, and grammar rules that make sure every GRID statement has equal width and height.
Step 2 (definition of program): GRID, so far, is useless. I'd venture to think of any valid statement of GRID as just data. We need to add something else to GRID: a successor function. So GRID = {grammar, alphabet, successor function}. To make this clear, let's use the rules of Conway's Game of Life as the successor function.
Step 3: The Game of Life is actually Turing complete, so GRID = {grammar, alphabet, successor function = GOL} can perform any computation that is computable.
A programming language is nothing but a language with a successor function. The environment that evaluates a valid statement of the language (a program) does nothing but follow those successor functions. Variables, for example, are things whose successor function = (STAY THE SAME).
Computers are just very fast environments ;)
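As a hedged sketch of what GRID's successor function might look like (Python, with the grid treated as a torus for simplicity):

    # One step of the Game of Life: maps a valid GRID statement
    # (a square of 0s and 1s) to its successor statement.
    def successor(grid):
        rows, cols = len(grid), len(grid[0])

        def live_neighbors(r, c):
            return sum(grid[(r + dr) % rows][(c + dc) % cols]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr, dc) != (0, 0))

        nxt = []
        for r in range(rows):
            row = []
            for c in range(cols):
                n = live_neighbors(r, c)
                row.append(1 if n == 3 or (grid[r][c] == 1 and n == 2) else 0)
            nxt.append(row)
        return nxt

    blinker = [[0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0],
               [0, 1, 1, 1, 0],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0]]
    for row in successor(blinker):
        print(row)  # the horizontal bar flips to a vertical one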
Wikipedia's definition might have been taken out of context. For one thing, only programs written in machine code are machine-readable; for anything else, you need a compiler to convert C++, Java or even assembly code to machine code so the computer can carry out your instructions. And unless you include comments that are only readable to humans, or you are strictly discussing a topic within the realm of your program, a programming language is insufficient for human communication.
So, I am writing some sort of statistics program (actually I am redesigning it into something more elegant) and I thought I should use a language that was created for that kind of stuff (dealing with huge amounts of statistical data, connections between them, and some sort of genetic/neural programming).
To tell you the truth, I just want an excuse to dive into Lisp/Smalltalk (aren't Smalltalk/Lisp/Clojure the same? Like Python and Ruby? Semantics-wise?), but I also want a language that's easily understood by other people who are fond of the BASIC language (that's why I didn't choose Lisp - yet :D).
I also checked Prolog and it seems a pretty cool language (it's easy to express relations between data, and it's easier than Lisp), but I'd like to hear what you think.
Edit:
I always confuse Common Lisp with Smalltalk; sorry for lumping those two languages together. Also, what I meant by "other people that are fond of the BASIC language" is that I don't prefer a language with semantics like Lisp's (for people with no CS background), and I find Prolog a little more intuitive (but that's my opinion after just messing around a little with both of them).
Is there any particular reason not to use R? It's sort of a build vs. buy (or in this case download) decision. If you're doing a statistical computation, R has many packages off the shelf. These include many libraries and interfaces for various types of data sources. There are also interface libraries for embedding R in other languages such as Python, so you can build a hybrid application with a GUI in Python (for example) and a core computation engine using R.
In this case, you could possibly reduce the effort needed for implementation and wind up with a more flexible application.
If you've got your heart set on learning another language, by all means, do it. There are several good free (some as in speech, some as in beer) implementations of Smalltalk, Prolog and LISP.
If you're putting a user interface on the system, Smalltalk might be the better option. If you want to create large rule sets as part of your application, Prolog is designed for this sort of thing. Various people have written about the Lisp epiphany that influences the way you think about programming, but I can't really vouch for this from experience: I've only really used AutoLISP for writing automation scripts in AutoCAD.
At the risk of offending some, I have a hard time reconciling "easily understood by other people that are fond of the BASIC language" with any of the languages you mentioned. That's not intended as a criticism, but as an observation that each of the languages you mention has a style and natural idiom that's quite different from that of BASIC.
Smalltalk - pure OO from the ground up, usually (e.g., Squeak) coupled with an integrated environment that is simultaneously the IDE and the runtime. In other words, you enter the Smalltalk VM and work inside it rather than just writing a text that is "source code".
LISP - much closer to functional programming (although with imperative overtones); the prefix notation is the first barrier to most people who "like" other languages, but the concept and use of macros is a much more substantial one.
Clojure - The combination of LISP, OO, and JVM integration makes this one even less BASIC-like.
Python and Ruby - I lump these together (at the risk of further annoying fans of either ;-) because they are both OO languages with distinct notations that will take an outsider a bit of a learning curve. The use of indentation alone for control nesting in Python and the Perl-like use of special characters in Ruby are frequent points of complaint from newcomers. Although both can be written in an imperative style, that would be considered non-standard by seasoned users.
Prolog - This is the most unlike BASIC of all languages mentioned. All of the other languages you mentioned can be (ab)used in a semi-procedural style, but that is essentially impossible in Prolog. It requires a thorough understanding of, and comfort with, recursion to do anything non-trivial.
Code written with a "native accent" in essentially all of these languages (but especially Prolog, IMHO) will make use of idioms and concepts that are outside the norm for conventional BASIC programming. Put another way, if you pick one of these and then write code "with a BASIC accent" you've pretty much wasted the benefits that the language can offer.
I believe that all of them are worth learning for the concepts they can teach (or at least reinforce, depending on your background). But the similarity to Language X (for a wide range of values of X) is not what you'll get.
I can answer you partially
(aren't Smalltalk/Lisp/Clojure the same? - like python and ruby? -semantics-wise)
No, they are not. Smalltalk is an OO language with message passing instead of method calls. Lisp is Lisp ;-) That means a truly functional language with a powerful macro system, OO support (in Common Lisp) of a kind never seen in other languages, and many more features. Clojure is a Lisp-like language without many Lisp features but with good integration into the JVM; it doesn't support tail-call optimization, for example. And Python and Ruby are classic imperative OO languages with some limited functional ability. Note the word limited: Guido doesn't like functional programming, for example, and removed some functional features in versions 2.5 and 2.6.
If you are familiar with imperative procedural programming as in Python and you want to change your paradigm, you should make your decision carefully.
Prolog is a very different language. It can be very hard to grasp, mainly because it relies heavily on recursion to do very basic tasks. If you are really willing, then give it a go. It can be very powerful because it allows you to express relationships and solve complicated problems simply; typical examples are the Towers of Hanoi or quicksort. It will change the way you think, which can be difficult if you are used to imperative languages.
If you're interested in Prolog then there's a free version of Visual Prolog available and the commercial version is reasonably priced.
It's a strongly typed offshoot of Prolog, so it isn't your classic implementation of the language, but it has a respectable history: Borland marketed its DOS ancestor as Turbo Prolog back in the late '80s.
It's also Windows-only, but it can be used to create standard Windows DLLs, so you can link your code into a "normal" Windows programming language. I've never used the package in anger myself, but I did a couple of Prolog courses at uni, so I have downloaded it from time to time to play with and look for possible uses, and it looks solid enough. Might be just the set of cogs you're looking for.