Is there any programming language that lets you redefine its type system? [closed] - programming-languages

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 12 years ago.
I'm looking for programming languages that let you redefine their type system without having to hack into the compiler. Is there anything out there that allows you to do that?
Thanks

In C you can use the preprocessor directive `#define` to textually redefine almost anything, including keywords:
#define int double
Whether it's good or bad you can find out here:
What is the worst real-world macros/pre-processor abuse you've ever come across?

If you're talking about redefining an actual type system, like making a statically typed language dynamic or making a weakly-typed language strongly-typed, then no.
Practically every language lets you define your own types, so I don't think that's what you meant either.
The only thing I can think of that might fit into what you're asking about are macros in Common Lisp, which let you extend the syntax. This might be able to achieve what you are looking for, but until you state what it is exactly you're looking for, I can't really elaborate.
Also OCaml and its related languages allow you to do some pretty cool things with types. You can basically define any kind of type you can think of and then match against it with pattern matching, which makes it especially good to write compilers in.

JavaScript, Ruby, and Smalltalk (just the ones I know of) allow you to do all kinds of stuff, even redefining on the fly what an object can do. Perl allows you to redefine practically the whole language. Basically any decent scripting language, especially one that allows duck typing, should have equal power. But it seems to be really common among functional languages and those with functional abilities.

If I remember correctly, Ada has neat type-creation possibilities, especially for measures (for instance, defining a minimum and a maximum, checking operations between different measures...). I've seen it quoted as an example of how to avoid very stupid bugs.


OOP with JavaScript [closed]

Closed 10 years ago.
I'm coding with JavaScript on Node.js at the moment, and I was asking myself whether there is a convenient way to use classes with private and public properties (and methods). I'm coming from PHP, so are there any similar structures?
I've been doing it so far with modules: I export the public methods and vars and keep the rest private. Is that the right way? And are there any good guides on going OOP with JavaScript?
In JavaScript the concept of "private" doesn't really exist, although emulating it is possible once you get the hang of the language a bit more.
Here's a good introduction to OOP in Javascript: https://developer.mozilla.org/en-US/docs/JavaScript/Introduction_to_Object-Oriented_JavaScript
Javascript can emulate (or rather replace) some/most class behavior using prototyping, but to someone used to classes, the syntax can be a bit confusing. If you're going to work with Javascript professionally, learning prototyping is more or less a must though.
ECMAScript 6, which will probably eventually make its way into JavaScript, is rumored to add support for classes. It is not available yet, though :-/
TypeScript is a much-debated JavaScript extension by Microsoft that adds some class support and strong typing. It compiles down to standard JavaScript, or can be added as a module to Node.js.
You can achieve a "class structure", although what you would really be doing is making an object literal behave like a class.
I can understand why it might be confusing, especially when you read for example Backbone JS's documentation - they define their objects as classes - when really there are no classes! Only object definitions.
Using a framework such as Backbone makes things a lot easier to understand, because you are working with a class-like structure. It might be a good place for you to start.

Is code clone a common practice in C,Java and Python? [closed]

Closed 10 years ago.
Code clones, also known as duplicate code, are often considered harmful to system quality.
I'm wondering whether such duplicate code can be seen in standard APIs or other mature tools.
If that is indeed the case, then in which language (such as C, Java, Python, Common Lisp, etc.) do you think code cloning is most likely to appear?
Code cloning is extremely common no matter what programming language is used, yes, even in C, Python and Java.
People do it because it makes them efficient in the short term; they're doing code reuse. It's arguably bad, because it causes group inefficiencies in the long term: clones reuse code bugs and assumptions, and when those are discovered and need to be fixed, all the code clones need to be fixed, and the programmer doing the fixing doesn't know where the clones are, or even if there are any.
I don't think clones are bad, because of the code reuse effect. I think what is bad is not managing them.
To help with the latter problem, I build clone detectors (see our CloneDR) that automatically find exact and near-miss duplicated code, using the structure of the programming language to guide the search. CloneDR works for a wide variety of programming languages (including OP's set).
In any software system of 100K SLOC or more, at least 10% of the code is cloned. (OK, OK, Sun's JDK is built by an exceptionally good team; they have only about 9.5%.) It tends to be worse in older conventional applications; I suspect because the programmers clone more code out of self-defense.
(I have seen applications in which the clones comprise 50%+ of code, yes, those programs tend be awful for many reasons, not just cloning).
You can see clone reports at the link for applications in several languages, look at the statistics, and see what the clones look like.
All code is the same, regardless of who writes it. Any API that you cite was written by human beings, who made decisions along the way. I haven't seen the perfect API yet - all of them get to redo things in hindsight.
Cloning code flies in the face of DRY, so of course it's recommended that you not do it. It's harmful because more code means more bugs, and duplication means you'll have to remember to fix them in all the clones.
But every rule has its exceptions. I'm sure everyone can think of a circumstance in which they did something that "best practices" and dogma would say they shouldn't, but they did it anyway. Time and other constraints don't always allow you to be perfect.
Suggesting that permission needs to be granted to allow such a thing is ludicrous to me. Be as DRY as you can. If you end up violating it, understand why you did it and remember to go back and fix it if you get a chance.

Higher level language vs lower level language? [closed]

Closed 11 years ago.
Other than the speed, what are the advantages/differences of each? I.e., can assembly do more than C/C++, or what advantages does Java offer that Python does not (excluding the fact that Java is platform independent)?
A higher level programming language usually means that the programmer can be more abstract, and generally do less work, at the cost of fine control.
For example, programming a GUI in assembly would be suicidal. On the other hand, machine code is necessary when you want to take advantage of device-dependent optimizations and features. I guess you can define low-level languages as those that are used for low-level tasks, e.g. drivers, operating systems, and parsers. Of course, the definitions are always rather fuzzy.
Pretty broad question there, and I cannot answer for the specifics between Python and Java, but in general here are my thoughts. Keep in mind, this is nearly a philosophical question (perhaps even best suited for the Programmers Stack Exchange site), so there's no god-like answer. Here goes:
With low-level languages (the closer you get to bit flipping), you tend to be closer to the system hardware and core operating system. Resources are more explicitly manipulable, which makes for efficient programming: you can streamline your logic, skim off the bundled crap you don't need, and develop the perfect piece of code. The consequence is that it's usually harder to think in, and thus code in.
Higher-level languages provide abstraction, moving the developer away from the worries of 1s and 0s to focus on the more complex requirements of a system. They allow us to think about things closer to the semantics of human communication and thought. They also sometimes allow us to share work cross-platform, when programmers work in high-level languages that compile to common runtimes. There are numerous other reasons, but you get the gist.
Ultimately, I look at low-level languages as something to use to "make the best tool", and high-level languages to "make the best use of the best tools".

C## -- with two pound signs? [closed]

Closed 13 years ago.
Visual Basic .NET
C##
etc
C##? With two pound signs?
It's on so many of these programming résumés we're getting -- from random people -- listed as a qualification.
Any ideas what these folks are talking about? Is this convention an accidental holdover from C++, or something?
EDIT/ANSWER: Turns out the corporate résumé management system converts the "C#" that applicants specify to "C##". That is just fantastic.
My guess is you shouldn't hire them.
Looks like a recruiter who doesn't know what he is talking about is trying to impress you.
That résumé speaks for itself - little attention to detail. Not good for a programmer...
If it's on "so many" I'm willing to bet that the candidates don't know what they are talking about. Similarly, I have seen 'C+' listed as a language as well.
It's not uncommon for people to list as many languages on their resume as they can, because the Bad Ones think that even knowing the name of the language gives them a foot ahead of someone who doesn't. This is obviously a flaw in logic.
I can't remember exactly where I heard this story before (someone's blog, maybe someone will remember) but the exact situation is described. A candidate comes in with a resume listing all of these languages. As the interviewer asks the candidate to demonstrate their knowledge of the language by writing some code, the candidate freezes. When the interviewer asks why, the candidate responds with "I didn't say I knew how to write in those languages, just that I know of them!"
I once received a résumé that had this line in its list of experience:
C \ C+ \ C++ \ C#
:)
C Sharp - now even sharper!

Languages that free you from clarifying your ideas [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 11 years ago.
ok, so is there a programming language that frees you from clarifying your ideas?
I couldn't help asking. But if there is one, what would you say comes closest today?
You mean, a programming language that lets you program without explaining what you want the program to do?
No, how would that work? The compiler needs to be told what program to compile.
Digging out an old, somewhat appropriate quote from Charles Babbage:
On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
The compiler can't read your mind. The only way it can create a program to do what you want is to tell it what it is you want.
Of course, there are languages that free you from having to specify things that are irrelevant to your overall problem and are only relevant to the underlying implementation. (An obvious example is that most modern languages free you from having to worry about pointers or many other low-level concerns.) Many languages also give you ways to iterate over sequences without having to write a manual for-loop. But you still have to "clarify your ideas": you still have to specify what your program should do. The best a language can do is free you from clarifying the things that are not relevant to your ideas.
That shouldn't be the role of a language, in my view. Instead, the language should help you to clarify your ideas, and let you express those clarified ideas in as intuitive a way as possible.
You could see it from two angles:
High-level languages like Prolog free you from having to express every messy detail of your algorithm. You just sketch the high-level picture, and Prolog fills in the details (e.g., how to search and deduce the answers to your questions, etc.).
On the other side of the spectrum, low-level languages like C free you from having to express your ideas in an abstract way. You can just give a sequence of very concrete, detailed procedural steps (although you can optionally introduce abstractions if you want to).
So both extremes free you from certain aspects of expressing and clarifying your ideas.
I don't think so, but there are a few that prevent you from clearly expressing those ideas - I nominate BCPL.
For various problem domains, there are languages that free you from having to type a lot of stuff beyond what's necessary to clarify your ideas. But every language fails in some situations, and for some people. Not everybody is comfortable expressing their ideas in an object oriented design (say, C# or Java), as functions and closures (Scheme), as logical derivations (Prolog -- there are some problems for which it fits!), or as declarative statements of the desired result (XSLT, CSS, various DSL's, with varying success) -- yet each of these is the right answer in certain contexts, and most of them overlap to some extent. Indeed, few modern languages are all that purely oriented to single paradigms.
But some languages favour other things over expressiveness: such as having efficient implementations (C), or being easy to learn (say, Python or its scripting kindred).
I hope there isn't a language that frees you from clarifying your ideas. It should be the responsibility of all programmers to do that themselves, not to pass it off to some other person or programming construct.
All good points. I was thinking more along the lines of scripting languages, where you can type away in the debugger until it does what you want (I've done that a time or two for some sysadmin WMI scripts).
Yes, Whitespace.
