Is it possible to make a proxy between IDE and language server?

Say, I want to slightly change the behavior of some language.
Can this be done? Has it been done before?
UPD: TypeScript has very limited possibilities for working with property names (for example, you cannot use TypeScript to create a property name that is derived from another one, e.g. uppercased or with a prefix prepended), whereas JavaScript (which is what TypeScript compiles to) is very flexible. So I wanted to use a proxy to modify some of the TypeScript language server's LSP messages that carry autocompletion options.

Yes, but how to do that is operating-system specific. From the compiler's point of view, the proxy would appear as your IDE; from the IDE's point of view, it would appear as the language server. LSP is a network protocol (JSON-RPC messages with Content-Length framing, carried over stdio or a socket), so a proxy only needs to relay, and optionally rewrite, the messages that pass through it.
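To make that concrete, here is a minimal sketch in Python of such a proxy (the server command name and the uppercasing rewrite are assumptions for illustration, not part of any real tool). It spawns the real language server, relays the framed JSON-RPC messages, and rewrites completion labels in flight; a real proxy would relay both directions concurrently rather than just server-to-IDE:

import json
import subprocess
import sys

def read_message(stream):
    # One LSP message: Content-Length header, blank line, JSON body.
    length = None
    while True:
        line = stream.readline()
        if not line or line in (b"\r\n", b"\n"):
            break
        if line.lower().startswith(b"content-length:"):
            length = int(line.split(b":", 1)[1])
    if length is None:
        return None
    return json.loads(stream.read(length))

def write_message(stream, msg):
    body = json.dumps(msg).encode("utf-8")
    stream.write(b"Content-Length: %d\r\n\r\n" % len(body))
    stream.write(body)
    stream.flush()

# Assumed server command; substitute whatever server your IDE launches.
server = subprocess.Popen(["typescript-language-server", "--stdio"],
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE)

while True:  # server-to-IDE direction only, for brevity
    msg = read_message(server.stdout)
    if msg is None:
        break
    result = msg.get("result")
    if isinstance(result, dict) and "items" in result:  # a CompletionList
        for item in result["items"]:
            item["label"] = item["label"].upper()  # example rewrite
    write_message(sys.stdout.buffer, msg)

The IDE is then configured to launch the proxy script instead of the real server.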
I want to slightly change the behavior of some language.
So you want to change the semantics of some language (you don't say which one). Then LSP is not the best place to do so.
For example, in some simple cases, with GCC, writing your GCC plugin is more appropriate.
In most cases, changing the "behavior" is really changing the language itself. Then you might need to make your own implementation of that language. Sometimes you can patch an existing free-software implementation of the original language. In other cases, you'll need to write your language implementation yourself. Then read the Dragon Book and consider compiling or transpiling your language to C in your implementation. Be sure to specify the language on paper first.
Don't confuse a programming language (which is a specification, generally written in English - perhaps with some formalization in some specific notation, e.g. n1570 for C11, R5RS for Scheme) with its implementation (which is a software). Read Scott's Programming Language Pragmatics.
Don't confuse an IDE with a language implementation. For example, all C or C++ compilers I know about don't have any IDE and are command line programs (e.g. GCC, Clang, etc...), and most of them don't even know about LSP. The IDE might start the C++ compiler, but it is not a compiler. You can code (in C, C++, Java, C#, Ocaml, ....) with a plain source code editor (even a simple language-agnostic one such as Notepad on Windows, or Leafpad or nano on Linux).
Most programming languages define source programs as a set of "translation units" (practically, "source files"), each being a sequence of characters with some complex syntax and semantics. How these source files are created is outside the scope of the programming language specification (you could use editors, or write your own generators for these source files, ...).
LSP is a proposal for a protocol between IDEs and language implementations.
Notice that autocompletion (mentioned in your comment) is not a feature of the programming language (and is not part of the semantics of C or C++). It might be a feature of some IDEs (not all of them). It could be done in a language neutral way (e.g. in Emacs).

Related

How do functional language compilers work? [duplicate]

I've heard of the idea of bootstrapping a language, that is, writing a compiler/interpreter for the language in itself. I was wondering how this could be accomplished and looked around a bit, and saw someone say that it could only be done by either
writing an initial compiler in a different language.
hand-coding an initial compiler in Assembly, which seems like a special case of the first
To me, neither of these seem to actually be bootstrapping a language in the sense that they both require outside support. Is there a way to actually write a compiler in its own language?
Is there a way to actually write a compiler in its own language?
You have to have some existing language to write your new compiler in. If you were writing a new, say, C++ compiler, you would just write it in C++ and compile it with an existing compiler first. On the other hand, if you were creating a compiler for a new language, let's call it Yazzleof, you would need to write the new compiler in another language first. Generally, this would be another programming language, but it doesn't have to be. It can be assembly, or if necessary, machine code.
If you were going to bootstrap a compiler for Yazzleof, you generally wouldn't write a compiler for the full language initially. Instead you would write a compiler for Yazzle-lite, the smallest possible subset of Yazzleof (well, a pretty small subset at least). Then in Yazzle-lite, you would write a compiler for the full language. (Obviously this can occur iteratively instead of in one jump.) Because Yazzle-lite is a proper subset of Yazzleof, you now have a compiler which can compile itself.
There is a really good writeup about bootstrapping a compiler from the lowest possible level (which on a modern machine is basically a hex editor), titled Bootstrapping a simple compiler from nothing. It can be found at https://web.archive.org/web/20061108010907/http://www.rano.org/bcompiler.html.
The explanation you've read is correct. There's a discussion of this in Compilers: Principles, Techniques, and Tools (the Dragon Book):
Write a compiler C1 for language X in language Y
Use the compiler C1 to write compiler C2 for language X in language X
Now C2 is a fully self-hosting environment.
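Here is a toy demonstration of those two steps in Python; every name in it is invented. The "language" Tiny is just Python plus one extra keyword (let x = e), C1 is a Tiny-to-Python translator written in Python, and C2 is the same translator written in Tiny. Reaching a fixed point, where the compiler's output for its own source stops changing, is the classic sanity check (GCC's three-stage bootstrap does the same comparison):

import re

# Stage 0 / C1: a Tiny-to-Python translator written in plain Python.
def translate(tiny_source):
    # Compile Tiny to Python by rewriting "let x = e" into "x = e".
    return re.sub(r"^(\s*)let\s+", r"\1", tiny_source, flags=re.MULTILINE)

# C2: the same translator, written in Tiny itself (note the "let").
translator_in_tiny = '''
import re
def translate(tiny_source):
    let result = re.sub(r"^(\\s*)let\\s+", r"\\1", tiny_source, flags=re.MULTILINE)
    return result
'''

# Use C1 to compile C2...
stage1 = translate(translator_in_tiny)
namespace = {}
exec(stage1, namespace)

# ...then let the freshly built compiler compile its own source again.
stage2 = namespace["translate"](translator_in_tiny)
assert stage1 == stage2  # fixed point: the compiler now sustains itself
print("bootstrap reached a fixed point")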
The way I've heard of is to write an extremely limited compiler in another language, then use that to compile a more complicated version, written in the new language. This second version can then be used to compile itself, and the next version. Each time it is compiled the last version is used.
This is the definition of bootstrapping:
the process of a simple system activating a more complicated system that serves the same purpose.
EDIT: The Wikipedia article on compiler bootstrapping covers the concept better than me.
A super interesting discussion of this is in Unix co-creator Ken Thompson's Turing Award lecture.
He starts off with:
What I am about to describe is one of many "chicken and egg" problems that arise when compilers are written in their own language. In this case, I will use a specific example from the C compiler.
and proceeds to show how he wrote a version of the Unix C compiler that would always allow him to log in without a password, because the C compiler would recognize the login program and add in special code.
The second pattern is aimed at the C compiler. The replacement code is a Stage I self-reproducing program that inserts both Trojan horses into the compiler. This requires a learning phase as in the Stage II example. First we compile the modified source with the normal C compiler to produce a bugged binary. We install this binary as the official C. We can now remove the bugs from the source of the compiler and the new binary will reinsert the bugs whenever it is compiled. Of course, the login command will remain bugged with no trace in source anywhere.
Check out podcast Software Engineering Radio episode 61 (2007-07-06) which discusses GCC compiler internals, as well as the GCC bootstrapping process.
Donald E. Knuth actually built WEB by writing the compiler in it, and then hand-compiled it to assembly or machine code.
As I understand it, the first Lisp interpreter was bootstrapped by hand-compiling the constructor functions and the token reader. The rest of the interpreter was then read in from source.
You can check for yourself by reading the original McCarthy paper, Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I.
Every example of bootstrapping a language I can think of (C, PyPy) was done after there was a working compiler. You have to start somewhere, and reimplementing a language in itself requires writing a compiler in another language first.
How else would it work? I don't think it's even conceptually possible to do otherwise.
Another alternative is to create a bytecode machine for your language (or use an existing one if its features aren't very unusual) and write a compiler to bytecode, either in the bytecode, or in your desired language using another intermediate - such as a parser toolkit which outputs the AST as XML, then compile the XML to bytecode using XSLT (or another pattern-matching language and tree-based representation). It doesn't remove the dependency on another language, but could mean that more of the bootstrapping work ends up in the final system.
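As a rough illustration of the bytecode-machine route, here is a tiny stack VM in Python with a hand-compiled program; the instruction set is invented for the example, and a real compiler emitting this bytecode could be written in any language, as described above:

def run(bytecode):
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":
            print(stack.pop())

# Hand-compiled form of: print(2 + 3 * 4)
program = [
    ("PUSH", 2),
    ("PUSH", 3),
    ("PUSH", 4),
    ("MUL", None),
    ("ADD", None),
    ("PRINT", None),
]
run(program)  # prints 14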
It's the computer science version of the chicken-and-egg paradox. I can't think of a way not to write the initial compiler in assembler or some other language. If it could have been done, I should think Lisp could have done it.
Actually, I think Lisp almost qualifies. Check out its Wikipedia entry. According to the article, the Lisp eval function could be implemented on an IBM 704 in machine code, with a complete compiler (written in Lisp itself) coming into being in 1962 at MIT.
Some bootstrapped compilers or systems keep both the source form and the object form in their repository:
OCaml is a language which has both a bytecode compiler (i.e. a compiler to OCaml bytecode, run by an interpreter) and a native compiler (to x86-64 or ARM, etc. assembler). Its svn repository contains both the source code (files */*.{ml,mli}) and the bytecode (file boot/ocamlc) form of the compiler. So when you build it, it first uses the bytecode (of a previous version of the compiler) to compile itself. Later the freshly compiled bytecode is able to compile the native compiler. So the OCaml svn repository contains both *.ml[i] source files and the boot/ocamlc bytecode file.
The Rust compiler downloads (using wget, so you need a working Internet connection) a previous version of its binary to compile itself.
MELT is a Lisp-like language to customize and extend GCC. It is translated to C++ code by a bootstrapped translator. The generated C++ code of the translator is distributed, so the svn repository contains both *.melt source files and melt/generated/*.cc "object" files of the translator.
J.Pitrat's CAIA artificial intelligence system is entirely self-generating. It is available as a collection of thousands of [A-Z]*.c generated files (also with a generated dx.h header file) with a collection of thousands of _[0-9]* data files.
Several Scheme compilers are also bootstrapped. Scheme48, Chicken Scheme, ...

Looking for a new language that supports both interpreted and native compilation modes

I currently program in Perl, Python, C#, C, C++, Java, and a few other languages, and I'm looking for a new language to use as a primary when doing personal projects.
My current criteria are:
can be run as an interpreted language (i.e., run without having to wait to compile it);
can be compiled to native code;
is strongly typed (even if optionally);
supports macros/templating/code morphing/wtf you want to call it;
has a decent number of libraries for it, or easily accessible to it.
Ideas? Suggestions?
I would suggest that Haskell suits your criteria.
Can be run as an interpreted language? Yes, via GHCI.
Can be compiled to native code? Yes.
Is strongly typed? Very much so. Perhaps even the most strongly typed language today, with the exception of some theorem provers like Agda.
Support macros/templating/morphing? Yes, if you use Template Haskell. This is an optional extension of the language, however, so most libraries don't use macros. I haven't used Template Haskell myself, so I can't comment on whether it's any good.
Has decent library support? The standard library is not bad. There is also Hackage, an open repository of Haskell libraries a bit in the style of CPAN.
Additionally, it sounds like you already know a lot of imperative/object-oriented languages. IMHO, if you learn another one of those languages, it will probably be a slightly different permutation of features you've already seen somewhere else. Adding another programming paradigm like functional programming to your toolbox will probably be a better learning experience. Though I guess whether that's an advantage or not depends on whether you want to learn new things or be productive quickly.
Common Lisp fits: there is optional typing, efficient native compilation is available, a powerful REPL makes it a perfect choice for scripting, and there is powerful macro metaprogramming.
OCaml fits as well, with CamlP4 for metaprogramming.
Scala? It does run scripts, although they are compiled (transparently) first. I'm not sure what you mean by code morphing etc, but it's pretty good for DSLs. It meets all your other requirements - compiled as much as Java is, strongly typed, and has a reasonable number of its own libraries as well as all of Java's. I'm still a beginner with it, but I like it so far.

How can a programming language be "implemented"?

Maybe this is just a little misunderstanding, but how can a programming language be implemented?
I'm not talking about how to implement my own programming language, but about the word "implemented" itself.
I mean, you can implement a compiler or an interpreter, but a programming language?
What does it mean if I read "C++ is implemented in C" or "Python was implemented in C"?
I think a language is more something like a protocol for how someone thinks things should be implemented. For example, if he wants to display a message box, he can say the command for this is ShowMessageBox(string) and implement a compiler that will translate this into something that works on a computer (aside from the selected programming paradigms he imagines).
I think this question leads to the question "what is a programming language in reality"? A compiler, an interpreter or just a documented language standard about how things should be implemented in a language?
[EDIT]
Answer: Languages are never implemented, only compilers/interpreters etc. It's that simple.
Here's a very academic answer (from a longtime academic).
First I'll reframe the question:
What does it mean for a programming language to be implemented?
I'll start with "what is a programming language":
A programming language is a formal language (a set of utterances we can characterize precisely through algorithmic rules) such that a sentence in the language has a computational meaning. There are a variety of ways to give computation meaning; two of the most popular are that a computation stands for a function (from values to values, or from machine states to machine states) and that a computation stands for a machine that makes "state transitions" and interacts with the outside world.
A language is implemented when a means is provided to read in an utterance and perform the computation, that is, calculate the function or perform the behavior. The means is the implementation.
Typical implementations include (see the interpreter sketch after this list):
Direct interpretation of the language syntax. This model is rare but FORTH probably comes closest to it.
Translation of the syntax into virtual-machine code, also called bytecode, which is itself another language and which is interpreted. It is popular to write bytecode interpreters in C. Lua, Perl, Python, and Ruby are implemented more or less this way.
Translation of the syntax into hardware machine instructions, which is itself another language, and which is interpreted by your CPU. C and C++ are typically (but not always) implemented this way.
Direct interpretation of the language in hardware. IA-32 machine code and AMD64 machine code are implemented this way.
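To make the distinction concrete, here is a minimal sketch in Python of the interpretation end of this spectrum: an evaluator that gives a computational meaning to an utterance directly, without translating it into another language first (the AST encoding is invented for illustration):

def evaluate(node):
    kind = node[0]
    if kind == "num":
        return node[1]
    if kind == "add":
        return evaluate(node[1]) + evaluate(node[2])
    if kind == "mul":
        return evaluate(node[1]) * evaluate(node[2])
    raise ValueError("unknown node: %r" % (kind,))

# The AST for "2 + 3 * 4", as a parser would produce it:
ast = ("add", ("num", 2), ("mul", ("num", 3), ("num", 4)))
print(evaluate(ast))  # 14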
When a person says "Language X is implemented in Y", they are usually saying that a translator for X or an interpreter for X's bytecode is written in language Y.
One of the great secrets of compiler writers is the ability to write the compiler for language X in language X itself. If this interests you, get Andrew Appel's paper Axiomatic Bootstrapping: A Guide for Compiler Hackers.
Sometimes the answer to this question is not obvious. Squeak Smalltalk writes both a translator and a bytecode interpreter in Smalltalk, then translates the interpreter to C, which is translated to machine code. What is Squeak implemented in? Smalltalk.
Poke a professor; get a lecture.
You are right, those statements don't make any sense. It's pretty obvious that whoever made those statements doesn't understand the difference between a programming language and a compiler (or interpreter).
This is a surprisingly common problem. For example, sometimes people talk about interpreted languages or compiled languages. That's the same thing: languages aren't interpreted or compiled, they just are. Interpretation and compilation are traits of the implementation not the language.
Another goodie: Python has a GIL. No, it doesn't: one implementation of Python has a GIL, all the other implementations don't, and the Python Language itself certainly doesn't. Or: Ruby has green threads. Again, not true: Ruby has threads. Period. Whether any particular language implementation chooses to implement them as green threads, native threads, platform threads or whatever, is a trait of that particular implementation, not of Ruby. And of course my favorite: Ruby 1.9 is faster than Ruby 1.8. This doesn't even make sense: Ruby 1.9 and Ruby 1.8 are programming languages, i.e. a bunch of abstract mathematical rules. You cannot run a programming language, therefore a programming language can never be "faster" or "slower" than another one.
The most blatant confusion about the difference between programming languages and implementations is the Computer Language Benchmark Game, which claims to benchmark languages against each other but in fact benchmarks implementations.
All of these are just different expressions of the fact that apparently some people seem to be fundamentally incapable of grasping the concept of abstraction. Or at least the concept of having an abstract language and a concrete implementation of that language.
If we go back to the statement that "Python is implemented in C", it should now be obvious that that statement is not just wrong. If the statement were wrong that would imply that the statement even makes sense, i.e. that there is some possible world out there, in which it could at least theoretically be right. But that's not the case. The statement is neither wrong nor right, it simply doesn't make sense. If English were a typed language, it would be a type error.
Python is a programming language. Programming languages aren't implemented in anything. They are just implemented. Compilers and interpreters are implemented in languages. But even if you interpret the statement this way, it isn't true: Jython is implemented in Java, IronPython is implemented in C#, PyPy is implemented in RPython and Python, Pynie is implemented in PGE, NQP and PIR. (Oh, and all of those implementations have compilers, so there goes your "Python is an interpreted language".) Similar with Ruby: Rubinius is implemented in Ruby and C++, JRuby and XRuby are implemented in Java, IronRuby and Ruby.NET are implemented in C#, HotRuby is implemented in ECMAScript, Red Sun is implemented in ActionScript, RubyGoLightly is implemented in Go, Cardinal is implemented in PGE, NQP and PIR, SmallRuby is implemented in Smalltalk/X, MagLev is implemented in GemStone Smalltalk and Ruby, YARI is implemented in Io. And for C++: Clang (which is the C, C++ and Objective-C front-end for LLVM) is implemented in C++ (all three front-ends are implemented in C++).
"C++ is implemented in C". I understand this as "C++ compiler is written in C language". Quite simple, without too much philosophy.
Generally, C++ compiler can be written in any language, including C++ itself (except of the first compiler version).
"Python was implemented in C" means that at least one Python compiler (in this case the most commonly used one) is written using C. The developers of that implementation of Python made a deliberate decision not to use C++. As a statement it is incomplete as Python has also been implemented in Java, in C# and in Python.
The main relevance is that it gives you some idea of the systems you might be able to port the language onto: anything targeted by a C compiler should (at least in theory) be capable of running the C implementation of Python, but if they'd chosen to use C++ there would be a smaller set of systems that could run it.
C++ usually isn't implemented in C these days: I believe it is usually implemented in C++. It is quite common for languages to be implemented in the same language (or a subset of the language) as it means you are no longer dependent on some other unrelated language being available for the target. To bootstrap onto a new system you cross compile from some other system.
If you compile gcc for a new platform the build process involves compiling the source code once with whatever compiler is already available (perhaps an older gcc), then compiling it a second time with the newly compiled compiler, then compiling it a third time with the output from the second compilation. If the second and third versions aren't identical you get a build error. If they are identical then you've got a pretty good indication that it compiled correctly.
A programming language is a standard. Its interpreter or compiler is an implementation of this standard.
To build a new language, you don't necessarily need to do it in low-level machine code (assembly, for instance). So using another language to accomplish your goal (creating a new language here) is perfectly normal. So when we say "Python was implemented in C", it just means that C was used to create that language. For instance, C can be compiled on many different architectures, so the programmers don't have to take care of the different types of computers (it's portable).
A language is just a way to express yourself to the computer. Today, it can be done in various ways. But when you use the same syntax as the language and create your own framework, it's called a library or framework. A programming language is just a notation for writing programs. If the notation changes, you have a different language. Like French or Spanish comes from Latin. (French is implemented in Latin ;)
Why are there so many different languages? Because the goal of a language is to solve complex problems. So, depending on what you are trying to accomplish, choosing the appropriate language can be an important decision.
The statement "Language X is implemented in Language Y" makes sense and is true if and only if there exists a canonical implementation of Language X and that implementation is written in Language Y. In common usage, either the first or the most popular implementation is often assumed to be canonical.
For example, Perl is one of the few languages with a definitive canon. "Python is implemented in C" makes sense if CPython is taken to be the canonical implementation of Python, and "C++ is implemented in C" is true for CFront, the original implementation of "C with classes" by Bjarne Stroustrup.
The direct answer:
"Implemented" in the context you are talking about just means "written", and "language" actually means "compiler".
The original C++ compiler was as I understand it written in C. There is nothing (apart from knowledge and time) to stop you from writing a C++ compiler in another language.
Implementation is the code that makes software work. Often we talk about the implementation of a function as in: "the function has not been implemented yet."
e.g.

#include <stdexcept>

void foo()
{
    // function has not been implemented yet
    throw std::logic_error("foo: not implemented");
}
This often happens during the design phase of a program, because the call needs to be there in order to write/debug/concept-test the calling code, but we haven't got round to implementing it (writing the code to go inside the function).

How to create a language these days?

I need to get around to writing that programming language I've been meaning to write. How do you kids do it these days? I've been out of the loop for over a decade; are you doing it any differently now than we did back in the pre-internet, pre-windows days? You know, back when "real" coders coded in C, used the command line, and quibbled over which shell was superior?
Just to clarify, I mean, not how do you DESIGN a language (that I can figure out fairly easily) but how do you build the compiler and standard libraries and so forth? What tools do you kids use these days?
One consideration that's new since the punched card era is the existence of virtual machines already bountifully provided with "standard libraries." Targeting the JVM or the .NET CLR instead of ye olde "language walled garden" saves you a lot of bootstrapping. If you're creating a compiled language, you may also find Java byte code or MSIL an easier compile target than machine code (of course, if you're in this for the fun of creating a tight optimising compiler then you'll see this as a bug rather than a feature).
On the negative side, the idioms of the JVM or CLR may not be what you want for your language. So you may still end up building "standard libraries" just to provide idiomatic interfaces over the platform facility. (An example is that every language and its dog seems to provide its own method for writing to the console, rather than leaving users to manually call System.out.println or Console.WriteLine.) Nevertheless, it enables an incremental development of the idiomatic libraries, and means that the more obscure libraries for which you never get round to building idiomatic interfaces are still accessible even if in an ugly way.
If you're considering an interpreted language, .NET also has support for efficient interpretation via the Dynamic Language Runtime (DLR). (I don't know if there's an equivalent for the JVM.) This should help free you up to focus on the language design without having to worry so much about the optimisation of the interpreter.
I've written two compilers now in Haskell for small domain-specific languages, and have found it to be an incredibly productive experience. The parsec library makes playing with syntax easy, and interpreters are very simple to write over a Haskell data structure. There is a description of writing a Lisp interpreter in Haskell that I found helpful.
If you are interested in a high-performance backend, I recommend LLVM. It has a concise and elegant byte-code and the best x86/amd64 generating backend you can find. There is an optional garbage collector, and some experimental backends that target the JVM and CLR.
You can write a compiler in any language that produces LLVM bytecode. If you are adventurous enough to learn Haskell but want LLVM, there are a set of Haskell-LLVM bindings.
What has changed considerably but hasn't been mentioned yet is IDE support and interoperability:
Nowadays we pretty much expect Intellisense, step-by-step execution and state inspection "right in the editor window", new types that tell the debugger how to treat them and rather helpful diagnostic messages. The old "compile .x -> .y" executable is not enough to create a language anymore. The environment is nothing to focus on first, but affects willingness to adopt.
Also, libraries have become much more powerful; no one wants to implement all that in yet another language. Try to borrow: make it easy to call existing code, and make it easy to be called by other code.
Targeting a VM - as itowlson suggested - is probably a good way to get started. If that turns out to be a problem, it can still be replaced by native compilers.
I'm pretty sure you do what's always been done.
Write some code, and show your results to the world.
As compared to the olden times, there are some tools to make your job easier though. Might I suggest ANTLR for parsing your language grammar?
Speaking as someone who just built a very simple assembly-like language and interpreter, I'd start out with the .NET framework or similar. Nothing can beat the powerful syntax of C# + the backing of the entire .NET community when attempting to write most things. From here I designed a simple bytecode format and assembly syntax and proceeded to write my interpreter + assembler.
Like I said, it was a very simple language.
You should not accept wimpy solutions like using the latest tools. You should bootstrap the language by writing a minimal compiler in Visual Basic for Applications or a similar language, then write all the compilation tools in your new language and then self-compile it using only the language itself.
Also, what is the proposed name of the language?
I think recently there have not been languages with ALL CAPITAL LETTER names like COBOL and FORTRAN, so I hope you will call it something like MIKELANG with all capital letters.
Not so much an implementation but a design decision which affects implementation - if you make every statement of your language have a unique parse tree without context, you'll get something for which it's easy to hand-code a parser, and that doesn't require large amounts of work to provide syntax highlighting for. Similarly, simple things like using a different symbol for module namespaces and object namespaces (unlike Java, which uses . for both package and class namespaces) mean you can parse the code without loading every module that it refers to.
Standard libraries - include the equivalent of everything in C99 standard libraries other than setjmp. Add whatever else you need for your domain. Work out an easy way to do this, either something like SWIG or an in-line FFI such as Ruby's [can't remember module name] and Python's ctypes.
Building as much of the language in the language itself is an option, but projects which start out that way tend either to give up (Rubinius moved to using C++ for parts of its standard library) or to remain research projects (Mozilla Narcissus).
I am actually a kid, haha. I've never written an actual compiler before or designed a language, but I have finished The Red Dragon Book, so I suppose I have somewhat of an idea (I hope).
It would depend firstly on the grammar. If it's LR or LALR I suppose tools like Bison/Flex would work well. If it's more LL, I'd use Spirit, which is a component of Boost. It allows you to write the language's grammar in C++ in an EBNF-like syntax, so no muddling around with code generators; the C++ compiler compiles the grammar for you. If any of these fail, I'd write an EBNF grammar on paper, and then proceed to do some heavy recursive descent parsing, which seems to work; if C++ can be parsed pretty well using RDP (as GCC does it), then I suppose with enough unit tests and patience you could write entire compilers using RDP.
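For a flavor of what hand-written recursive descent looks like, here is a minimal sketch in Python (the grammar and all names are made up for the example): one function per nonterminal, with operator precedence encoded in which function calls which:

import re

# Grammar:
#   expr   -> term (("+" | "-") term)*
#   term   -> factor (("*" | "/") factor)*
#   factor -> NUMBER | "(" expr ")"
class Parser:
    def __init__(self, text):
        self.tokens = re.findall(r"\d+|[-+*/()]", text)
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, expected=None):
        token = self.peek()
        if expected is not None and token != expected:
            raise SyntaxError("expected %r, got %r" % (expected, token))
        self.pos += 1
        return token

    def expr(self):
        value = self.term()
        while self.peek() in ("+", "-"):
            if self.eat() == "+":
                value += self.term()
            else:
                value -= self.term()
        return value

    def term(self):
        value = self.factor()
        while self.peek() in ("*", "/"):
            if self.eat() == "*":
                value *= self.factor()
            else:
                value /= self.factor()
        return value

    def factor(self):
        if self.peek() == "(":
            self.eat("(")
            value = self.expr()
            self.eat(")")
            return value
        return float(self.eat())

print(Parser("2 + 3 * (4 - 1)").expr())  # 11.0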
Once I have a parser running and some sort of intermediate representation, it then depends on how it runs. If it's some bytecode or native-code compiler, I'll use LLVM or libJIT to process it. LLVM is more suited for general compilation, but I like the libJIT API and documentation better. Alternatively, if I'm really lazy, I'll generate C code and let GCC do the actual compilation. Another alternative is to target an existing VM, like Parrot or the JVM or the CLR. Parrot is the VM being designed for Perl. If it's just an interpreter, I'll walk the syntax tree.
A radical alternative is to use Prolog, which has syntax features which remarkably simulate EBNF. I have no experience with it though, and if I am not wrong (which I am almost certainly going to be), Prolog would be quite slow if used to parse heavy duty programming languages with a lot of syntactical constructs and quirks (read: C++ and Perl).
All this I'll do in C++, if only because I am more used to writing in it than C. I'd stay away from Java/Python or anything of that sort for the actual production code (writing compilers in C/C++ helps to make them portable), but I could see myself using them as prototyping languages, especially Python, which I am partial towards. Of course, I've never actually done any of this before, so I'm not one to say.
On lambda-the-ultimate there's a link to Create Your Own Programming Language by Marc-André Cournoyer, which appears to describe how to leverage some modern tools for creating little languages.
Just to clarify, I mean, not how do you DESIGN a language (that I can figure out fairly easily)
Just a hint: look at some quite different languages first, before designing a new language (i.e. languages with a very different evaluation strategy). Haskell and Oz come to mind. Though you should also know Prolog and Scheme. A year ago I also was like "hey, let's design a language that behaves exactly as I want", but fortunately I looked at those other languages first (or you could also say unfortunately, because now I don't know how I want a language to behave anymore...).
Before you start creating a language you should read this:
Hanspeter Moessenboeck, The Art of Niklaus Wirth
ftp://ftp.ssw.uni-linz.ac.at/pub/Papers/Moe00b.pdf
There's a big shortcut to implementing a language that I don't see in the other answers here. If you use one of Łukasiewicz's "unparenthesized" forms (i.e. forward Polish or reverse Polish) you don't need a parser at all! With reverse Polish, the dependencies go right-to-left, so you simply execute each token as it's scanned. With forward Polish, it's the reverse of that, so you actually execute the program "backwards", simplifying subexpressions until reaching the starting token.
To understand why this works, you should investigate the 3 primary tree-traversal algorithms: pre-order, in-order, post-order. These three traversals are the inverse of the parsing task that a language reader (i.e. parser) has to perform. Only the in-order notation "requires" a recursive descent to re-construct the expression tree. With the other two, you can get away with just a stack.
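A minimal sketch in Python of the reverse Polish case: just a stack and one pass over the tokens, with no parser anywhere:

import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_rpn(source):
    stack = []
    for token in source.split():
        if token in OPS:
            b, a = stack.pop(), stack.pop()  # dependencies go right-to-left
            stack.append(OPS[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

print(eval_rpn("3 4 + 2 *"))  # (3 + 4) * 2 = 14.0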
This may require more "thinking" and less "implementing".
BTW, if you've already found an answer (this question is a year old), you can post that and accept it.
Real coders still code in C. Just that it's a little sharper.
Hmmm... language design? or writing a compiler?
If you want to write a compiler, you'd use Flex + Bison. (google)
Not an easy answer, but..
You essentially want to define a set of lexical rules, written as text, that describe your tokens, and then a parser that checks streams of those tokens against grammar rules and assembles the matches into fragments (parse trees).
http://www.mactech.com/articles/mactech/Vol.16/16.07/UsingFlexandBison/
People can spend years on this. The above article talks about using two tools (Flex and Bison) that can be used to turn text into code you can feed to a compiler.
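The lexing half of that pipeline is easy to prototype without Flex; here is a rough Python analogue of a lex-style specification (the token names and rules are invented), where the rules are just data and a small driver applies them to produce a token stream:

import re

TOKEN_RULES = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[-+*/=]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join("(?P<%s>%s)" % rule for rule in TOKEN_RULES))

def tokenize(text):
    # Unknown characters are silently skipped in this sketch;
    # a real lexer would report them as errors.
    for match in MASTER.finditer(text):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(tokenize("x = 40 + 2")))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '40'), ('OP', '+'), ('NUMBER', '2')]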
First I spent a year or so actually thinking about how the language should look. At the same time I helped in developing Ioke (www.ioke.org) to learn language internals.
I chose Objective-C as the implementation platform as it's a fast (enough), simple and rich language. It also provides a test framework, so an agile approach is a go. It also has a rich standard library I can build upon.
Since my language is simple on syntactic level (no keywords, only literals, operators and messages) I could go with Ragel (http://www.complang.org/ragel/) for building scanner. It's fast as hell and simple to use.
Now I have a working object model, scanner and simple operator shuffling plus standard library bootstrap code. I can even run a simple programs - as long as they fit in one file that is :)
Of course older techniques are still common (e.g. using Flex and Bison), but many newer language implementations combine the lexing and parsing phases by using a parser based on a parsing expression grammar (PEG). This works for recursive-descent parsers created using combinators, or memoizing packrat parsers. Many compilers are built using the ANTLR framework as well.
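A sketch of the combinator style in Python (all names invented): each parser is a function from (text, position) to a result, and combinators like seq and choice build bigger parsers out of smaller ones, which is essentially how PEGs compose:

import re

def pattern(regex_source):
    rx = re.compile(regex_source)
    def parse(text, pos):
        m = rx.match(text, pos)
        return (m.group(), m.end()) if m else None
    return parse

def seq(*parsers):
    def parse(text, pos):
        values = []
        for p in parsers:
            result = p(text, pos)
            if result is None:
                return None
            value, pos = result
            values.append(value)
        return (values, pos)
    return parse

def choice(*parsers):
    def parse(text, pos):
        for p in parsers:
            result = p(text, pos)
            if result is not None:
                return result
        return None
    return parse

# greeting <- ("hello" / "hi") " " name
name = pattern(r"[A-Za-z]+")
greeting = seq(choice(pattern(r"hello"), pattern(r"hi")), pattern(r" "), name)
print(greeting("hello world", 0))  # (['hello', ' ', 'world'], 11)

A packrat parser adds memoization keyed on (parser, position) so that the backtracking choice runs in linear time.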
Use Bison/Flex, which are the GNU versions of yacc/lex. This book is extremely helpful.
The reason to use Bison is that it catches any conflicts in the language. I used it and it made my life many years easier (OK, so I'm in my second year, but the first six months were a few years ago, writing it in C++, and the parsing/conflicts/results were terrible! :(.)
If you want to write a compiler obviously you need to read the Dragon Book ;)
Here is another good book that I have just read. It is practical and easier to understand than the Dragon Book: Language Implementation Patterns:
http://www.amazon.co.uk/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=language+implementation+patterns&x=0&y=0
Mike --
If you're interested in an efficient native-code-generating compiler for Windows so you can get your bearings -- without wading through all the unnecessary widgets, gadgets, and other nonsense that clutter today's machines -- I recommend the Osmosian Order's Plain English development system. It includes a unique interface, a simplified file manager, a friendly text editor, a handy hexadecimal dumper, the compiler/linker (of course), and a wysiwyg page-layout application for documentation. Written entirely in Plain English, it is a quick download (less than a megabyte), small enough to understand in short order (about 25,000 lines of Plain English code, with just 4,000 in the compiler/linker), yet powerful enough to reproduce itself on a bottom-of-the-line Dell in less than three seconds. Really: three seconds. And it's free to all who write and ask for a copy, including the source code and a rather humorous tongue-in-cheek 100-page manual. See www.osmosian.com for details on how to get a copy, or write to me directly with questions or comments: Gerry.Rzeppa#pobox.com

Abstract (programming) language built to transpile

Introduction
Oftentimes I encounter a situation in which a library has been written in a particular programming language. That's great if I want to use the library in the same language, but if I want to use a different language, that's going to be a problem (which is not to say that there might not be a more or less hacky way).
For some libraries I have the feeling that they have been written in that particular programming language simply because any language will do (and because of the personal preference of the author), meaning that no language-specific high-level external third-party libraries are being used. For these situations I thought that it would be neat if there were some sort of abstract (programming) language in which the library author can specify the algorithms, but which can then be transpiled into a lot of other programming languages. Thus if I want to use that library, I can simply use the transpiler to get that library in my language of choice.
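To make the idea concrete, here is a toy sketch in Python of what such a transpiler could look like (everything in it is invented for illustration): one abstract description of a "library function", emitted as source code in two different target languages:

def emit_python(fn_name, param, body_expr):
    return "def %s(%s):\n    return %s\n" % (fn_name, param, body_expr)

def emit_java(fn_name, param, body_expr):
    return ("static int %s(int %s) {\n    return %s;\n}\n"
            % (fn_name, param, body_expr))

# Abstract algorithm: square(x) = x * x
for emit in (emit_python, emit_java):
    print(emit("square", "x", "x * x"))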
Actual question
So what I am looking for is a language, that is specifically meant to be transpiled into most popular languages (e.g. Java, C/C++, Python). I'm interested in whether someone has gone through the effort of creating such a "universal" transpiler language before.
Note that I'm not looking for a particular transpiler from one language to another. I want to know whether there exists a (programming) language that has been designed for being transpilable into source code of a lot of different actual programming languages. Thus the language I'm looking for would probably not even run by itself (only the transpiled code would be an actual program).
Although I would be interested to hear general pros/cons for the existence of such a language, this is also not what this question is about, due to the rules here on SO. Therefore I'd ask you not to write opinion-based answers in this kind of style.
The answer to this question might very well be that there is no such language, but as my research hasn't brought anything up, I thought that maybe someone here knows of such a language that I might have missed due to it not being widely used.
One language that was designed with the intention of being transpiled to various other languages is Haxe
At the time of writing it supports generating source code for:
JavaScript
ActionScript 3
PHP (including PHP7)
C++
Java
C#
Python
Lua
(reference: https://haxe.org/documentation/introduction/compiler-targets.html)
It also supports compiling directly to bytecode for specific VMs.
