Other than speed, what are the advantages of and differences between languages at each level? For example, can assembly do more than C/C++? And what advantages does Java offer that Python does not (excluding the fact that Java is platform independent)?
A higher-level programming language usually means that the programmer can work more abstractly and generally do less work, at the cost of fine-grained control.
For example, programming a GUI in assembly would be suicidal. On the other hand, machine code is necessary when you want to take advantage of device-dependent optimizations and features. I suppose you can define low-level languages as those used for low-level tasks, e.g. drivers, operating systems, and parsers. Of course, the definitions are always rather fuzzy.
Pretty broad question there, and I can't answer for the specifics between Python and Java, but in general here are my thoughts. Keep in mind, this is nearly a philosophical question (perhaps even best suited for the Programmers Stack Exchange site), so there's no definitive answer. Here goes:
With low-level languages (the closer you get to bit flipping), you tend to be closer to the system hardware and the core operating system, and resources are more explicitly manipulable. This makes for efficient programming: you can streamline your logic, skim off the bundled crap you don't need, and develop the perfect piece of code. The consequence is that it's usually harder to think in, and thus harder to code in.
Higher-level languages provide abstraction, moving the developer away from worrying about 1s and 0s to focus on the more complex requirements of a system. They allow us to think in terms closer to human communication and thought. They also sometimes let us share work across platforms, when programmers work in high-level languages that compile to common runtimes. There are numerous other reasons, but you get the gist.
Ultimately, I look at low-level languages as something you use to "make the best tools", and high-level languages as the way to "make the best use of the best tools".
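To make the trade-off concrete, here is a minimal C sketch (the helper name concat is made up for illustration): concatenating two strings means explicitly allocating, copying, and freeing memory, all of which a high-level language like Python hides behind a single a + b.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Join two strings; the caller owns (and must free) the result. */
    char *concat(const char *a, const char *b) {
        size_t la = strlen(a), lb = strlen(b);
        char *out = malloc(la + lb + 1);
        if (out == NULL)
            return NULL;               /* allocation can fail; handle it */
        memcpy(out, a, la);
        memcpy(out + la, b, lb + 1);   /* +1 copies the trailing '\0' */
        return out;
    }

    int main(void) {
        char *s = concat("low ", "level");
        if (s) {
            puts(s);
            free(s);                   /* forget this and you leak memory */
        }
        return 0;
    }

That control is exactly what the answers above mean: you decide every allocation, at the price of having to get every one of them right.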
Code clones, also known as duplicate code, are often considered harmful to system quality.
I'm wondering whether such duplicated code can be found even in standard APIs or other mature tools.
If it can, then in which languages (such as C, Java, Python, Common Lisp, etc.) do you think code cloning is most likely to appear?
Code cloning is extremely common no matter what programming language is used: yes, even in C, Python, and Java.
People do it because it makes them efficient in the short term; they're doing code reuse. It's arguably bad, because it causes group inefficiencies in the long term: clones reuse the code's bugs and assumptions, and when those are discovered and need to be fixed, all the clones need to be fixed too, and the programmer doing the fixing doesn't know where the clones are, or even whether there are any.
I don't think clones are bad in themselves, because of the code-reuse effect. What is bad is not managing them.
To help with the latter problem, I build clone detectors (see our CloneDR) that automatically find exact and near-miss duplicated code, using the structure of the programming language to guide the search. CloneDR works for a wide variety of programming languages (including OP's set).
In any software system of 100K SLOC or more, at least 10% of the code is cloned. (OK, OK, Sun's JDK is built by an exceptionally good team; they have only about 9.5%.) It tends to be worse in older conventional applications; I suspect that's because programmers clone code in self-defense.
(I have seen applications in which clones comprise 50%+ of the code; yes, those programs tend to be awful for many reasons, not just cloning.)
You can see clone reports at the link for applications in several languages, look at the statistics, and see what the clones look like.
All code is the same, regardless of who writes it. Any API you cite was written by human beings who made decisions along the way. I haven't seen the perfect API yet; given hindsight, all of their authors would redo things.
Cloning code flies in the face of DRY, so of course it's recommended that you not do it. It's harmful because more code means more bugs, and duplication means you'll have to remember to fix them in all the clones; see the sketch below.
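As a hypothetical illustration (the function names are invented), here is what a typical clone and its DRY refactoring look like in C; note that a bug fix in the cloned version has to be applied twice:

    /* Before: the same clamping logic is cloned in two places. */
    double set_volume(double v) {
        if (v < 0.0) v = 0.0;
        if (v > 1.0) v = 1.0;
        return v;
    }

    double set_brightness(double b) {
        if (b < 0.0) b = 0.0;
        if (b > 1.0) b = 1.0;
        return b;
    }

    /* After: one shared helper; a fix here fixes every caller at once. */
    double clamp01(double x) {
        return x < 0.0 ? 0.0 : (x > 1.0 ? 1.0 : x);
    }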
But every rule has its exceptions. I'm sure everyone can think of a circumstance in which they did something that "best practices" and dogma would say they shouldn't, but they did it anyway. Time and other constraints don't always allow you to be perfect.
Suggesting that permission needs to be granted to allow such a thing is ludicrous to me. Be as DRY as you can. If you end up violating it, understand why you did it and remember to go back and fix it if you get a chance.
I'm looking for programming languages that let you redefine their type system without having to hack into the compiler. Is there anything out there that allows you to do that?
Thanks
In C you can use #define to redefine almost anything:

    #define int double
Whether that's good or bad, you can find out here:
What is the worst real-world macros/pre-processor abuse you've ever come across?
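To see why this is dangerous, here is a small, intentionally contrived C program (it compiles and runs): the macro silently rewrites every later int token, so declarations and sizeof no longer mean what they say.

    #include <stdio.h>

    int main(void) {
        /* From here on, the preprocessor rewrites every "int" token. */
        #define int double
        int x = 7 / 2;                 /* really: double x = 7 / 2;      */
        printf("%f\n", x);             /* prints 3.000000, not 3.5:      */
                                       /* 7 / 2 is still integer math    */
        printf("%zu\n", sizeof(int));  /* 8 on most platforms, not 4     */
        return 0;
    }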
If you're talking about redefining an actual type system, like making a statically typed language dynamic or making a weakly-typed language strongly-typed, then no.
Practically every language lets you define your own types, so I don't think that's what you meant either.
The only thing I can think of that might fit what you're asking about is macros in Common Lisp, which let you extend the syntax. They might be able to achieve what you are looking for, but until you state exactly what that is, I can't really elaborate.
OCaml and its relatives also let you do some pretty cool things with types. You can define virtually any kind of type you can think of and then take it apart with pattern matching, which makes the family especially well suited to writing compilers.
JavaScript, Ruby, and Smalltalk (just the ones I know of) allow you to do all kinds of things, even redefining on the fly what an object can do. Perl allows you to redefine practically the whole language. Basically any decent scripting language, especially one that allows duck typing, should have comparable power. But this seems to be really common among functional languages and languages with functional features.
If I remember correctly, Ada has neat type-creation possibilities, especially for measurements (for instance, defining a minimum and a maximum, or checking operations between different units). I've seen it cited as an example of how to avoid very stupid bugs.
I was thinking of learning a lower-level language like C, but before that I'd like some opinions on the following:
What second language do you recommend learning?
Will learning a low-level language make me a better programmer?
Is it really necessary to learn a second programming language?
Going backwards:
(3) Absolutely - you'll increase your ability by orders of magnitude by learning multiple languages.
(2) A low-level language will make you a better programmer; alternatively, a functional language will help as well.
(1) Low-level: go with C. Functional: try Scheme or Haskell. C also gives you the ability to write extension modules for Python if you ever have the need.
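On that last point, a Python extension module in C is surprisingly small. A minimal sketch, assuming the Python 3 headers are installed; the module name fastmath and the function square are made up for illustration:

    #include <Python.h>

    /* square(n): compute n*n in C for a Python caller. */
    static PyObject *fastmath_square(PyObject *self, PyObject *args) {
        long n;
        if (!PyArg_ParseTuple(args, "l", &n))  /* expect one Python int */
            return NULL;
        return PyLong_FromLong(n * n);
    }

    static PyMethodDef FastmathMethods[] = {
        {"square", fastmath_square, METH_VARARGS, "Square an integer."},
        {NULL, NULL, 0, NULL}                  /* sentinel */
    };

    static struct PyModuleDef fastmathmodule = {
        PyModuleDef_HEAD_INIT, "fastmath", NULL, -1, FastmathMethods
    };

    PyMODINIT_FUNC PyInit_fastmath(void) {
        return PyModule_Create(&fastmathmodule);
    }

Once compiled as a shared library, Python code can simply do import fastmath; fastmath.square(12).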
What second language do you recommend learning?
Something imperative (i.e. the same paradigm) but different. Python is dynamically typed with significant whitespace, so try something statically typed without significant whitespace: e.g. Java or C#.
These would also make a nice stepping stone towards C. The benefit of C is that you really know what's going on, with the disadvantage that you have to control it all. That level of control is not needed for most business problems.
Is it really necessary to learn a second programming language?
Really subjective, but most good developers know many (consider a web app: Python, Ruby, C#, or Java on the server; SQL in the database; JavaScript on the client; and then the markup...).
You benefit from being able to see other approaches to problems, and thus create better solutions. So once you have covered the imperative languages, move on to other paradigms, such as functional programming.
I agree with your choice of C, which leads on to C++. If nothing else, learning C will teach you why people these days tend to prefer languages with automatic memory management - but it will potentially also give you a feeling of programming "close to the metal" (without the pain of programming in assembly language), and help you to understand how a processor actually works. Not always useful knowledge but it's nice to know.
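For a taste of what that pain looks like, here is a short sketch of the classic use-after-free mistake, the kind of bug garbage-collected languages rule out by design (the dangerous line is left commented out so the program stays well-defined):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int *p = malloc(sizeof *p);
        if (p == NULL)
            return 1;
        *p = 42;
        printf("%d\n", *p);
        free(p);
        /* printf("%d\n", *p);   <-- use-after-free: it compiles, and may
         *                           even appear to work, but the behavior
         *                           is undefined                          */
        p = NULL;                /* defensive: a NULL dereference at least
                                    fails loudly instead of silently      */
        return 0;
    }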
Whatever you choose, I recommend a statically-typed language - C, C++, Java, and some functional programming languages fit this bill. Java might be a good choice if you find C a bit tough at first.
I'd say learning any new language makes you a better programmer. However, will learning C make you a better Python programmer? Probably not; why should it?!
Define "necessary"! By a strict definition, no. But you're missing out on the experience of having to think about things in a different way (even if it's only a slightly different way).
I would stay with the same paradigm, but leave options open for another paradigm (functional programming). Probably C# is a good choice, because
If you decide to learn C/C++ later, it'll become a bit easier.
If you decide you want to learn functional programming later, you can switch to F# and still use existing code written in C#, because you stay within the .NET framework.
Python is not known to be a remarkably fast language, so you should consider learning a language that allows better computational performance. Good old ANSI C is probably too low-level, even though you can write very fast programs in it. C# has decent performance for a just-in-time compiled language, and if you need more performance later, you can extend your knowledge towards F# or C.
Although I don't use Microsoft Windows privately and frequently advocate Linux and open source, it's probably a good idea to have some knowledge of Microsoft technology if you intend to earn money with programming.
I am currently entering my senior year as a dual major in Electrical Engineering and Computer Engineering, and I have touched on a wide variety of languages: C, C++, C#/XAML, Java, bash, Python, VHDL, assembly, etc. I was wondering what you think would be a good language (or few languages) to become more proficient in, or to explore for the first time. Also, what level of programming do you prefer (hardware, local, network, system, design, integration, and so on)? If you could tell me why, I would be grateful, and if you'd like to relate your experiences, I am quite interested. I am hoping to find a job in hardware design, but as I get better with some languages, I am finding just how much I enjoy programming, so I really have an open mind at this juncture. I would love to hear from some people in the 'real world'.
You want to understand:
Different language paradigms (procedural, OOP, functional, parallel, logic [e.g., Prolog], constraint). Do some programming in each.
Different software architectures: OSes, standard applications (MVC, ...).
Software engineering: requirements, specification (especially design-by-contract), design, testing. These ideas hold in hardware engineering too.
I would start not by learning a programming language but with the fundamentals, like these:
1) Computer organisation
2) Operating systems theory
3) Fundamentals of programming (OOP and functional)
4) Data structures
5) Compiler design and principles
6) DBMS concepts
As a budding hardware designer you might want to learn Bluespec. This is a very high-level hardware-description language based on work done at MIT. It's both a language and a company. They have some very impressive results on modularity, predictability, and reuse in hardware design. Check out the page on the Bluespec compiler and find out if you want to pursue it.
I was wondering what you think would be a good language/few languages to become more proficient in, or to explore for a first time?
What do you want to accomplish? You seem to have a good grasp of many popular languages across several type systems and paradigms. If you want to learn something new, I would recommend functional programming, as it's vastly different from anything you will have encountered before (imagine trying to write a program without an assignment operator, e.g. =) and is becoming more and more useful. Haskell, Scala, and F# are all forerunners of the functional programming pack.
Also, what level of programming you prefer?
It all depends on what you want to do and what skills you want to use. Hardware and system programming involve more low-level work (assembly, C, C++). The other levels are less language-specific but involve other skills, like a thorough knowledge of networks and APIs.
Often, I am told that security functions are not available at a level of abstraction that a developer with little security knowledge can use. What changes would developers want in their development environment, say for Java, that would make securing their software much easier than it is today?
I am looking at new approaches, like providing configurability at a level where the programmer just has to declare the security function and the level of protection he desires, so that only real power programmers ever need to go and do something extra.
So, a two-part question: what services would you want as a developer, and how would you like them integrated into your IDE (your development environment) so that you can easily use them?
where the programmer just has to declare the security function he desires
That's like asking "What type of scalpel can I buy so I don't have to learn doctorin'?"
You can't.
The issue of "security" covers a very broad range of concerns. You don't just "turn on security" and it's done. Security issues involve guarding your software from an ever-growing number of malicious behaviors.
At the root of it, computers have to let people do things. To give users that freedom, people will always find ways of getting into things they are not supposed to get into. The people who write operating systems, frameworks, and development environments can patch holes and abstract away some of today's security concerns but new ones will always be developed. Daily. It's a moving target.
Security is complicated because you have to know what types of vulnerabilities your application can be subject to. The only way to learn that is through vigilant study and experience.
You can't just flip a switch.
The more I think about it, the more I realize that what you want actually exists. Programmers tell the computer what they want it to do in a very specific manner, using well-defined languages. The computer is expected to do exactly what the programmer dictates, within the definition of the language. If the programmer wants security, s/he can already tell the computer to act in a secure manner: for example, using string classes in C++ (such as std::string) to avoid buffer overruns with raw char arrays.
Of course, this isn't enough. Programmers still get things wrong, so language designers try to help with things like Java, where buffer overruns are much harder to exploit beyond a simple application crash: char[] c = new char[1]; System.out.println(c[10]); throws an ArrayIndexOutOfBoundsException rather than corrupting memory.
But that's not really enough either: the compiler can have bugs that introduce overruns, or the virtual machine may have such bugs. And other vulnerabilities will still exist, like bad permissions on files or exploitable race conditions (a.k.a. TOCTOU).
As we find vulnerability types, we can take steps to prevent them (developer training, new language features, new compiler features, and new OS features), and we can take steps to identify them (dynamic analysis, source code analysis, binary analysis), but we can't eliminate all bugs. This is especially true as new technologies come into play (once XSS and SQL injection were understood, developers started introducing LDAP injection).
OWASP is trying to do this with its ESAPI project.
The best way for security to work is to have it built into the API, with very context-aware, context-specific programming methods. This is what SqlParameters do in .NET, and what similar mechanisms do in other languages (they capture as much context as possible, such as the types of variables, and perform validation).
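For a concrete (non-.NET) analogue, here is the same idea sketched with SQLite's C API, assuming an already-open database handle and a users table; the untrusted value is bound as data, so it can never change the structure of the query:

    #include <sqlite3.h>

    /* Look up a user id by name; returns the id, 0 if absent, -1 on error. */
    int find_user(sqlite3 *db, const char *untrusted_name) {
        sqlite3_stmt *stmt;
        int id = 0;
        /* The "?" placeholder is compiled before any user data arrives. */
        if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?",
                               -1, &stmt, NULL) != SQLITE_OK)
            return -1;
        sqlite3_bind_text(stmt, 1, untrusted_name, -1, SQLITE_TRANSIENT);
        if (sqlite3_step(stmt) == SQLITE_ROW)
            id = sqlite3_column_int(stmt, 0);
        sqlite3_finalize(stmt);
        return id;
    }

Because the name is attached with sqlite3_bind_text rather than string concatenation, a value like ' OR '1'='1 is just an odd user name, not an injection.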
If you want to get involved, you can probably get in on the OWASP action, as long as you are motivated.
The best way to do security is to let someone else do it, and to follow the API and its instructions to the letter. But you also need significant trust in the people doing the underlying work, and you need to stay up to date with what is happening in the libraries you use.
On that last point, you also need to stay flexible. If a vulnerability is discovered in your underlying system X, you may need to update it or remove it completely (most likely update it). You need the facility to do this ASAP, e.g. to swap out hashing functions or change encryption routines.
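One cheap way to keep that flexibility in C is a level of indirection: route every call through a function pointer so the algorithm can be replaced in one audited place. A sketch, where sha256_stub is a placeholder standing in for a real implementation:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    typedef void (*hash_fn)(const uint8_t *data, size_t len, uint8_t out[32]);

    /* Placeholder body; a real build would link an actual SHA-256 here. */
    static void sha256_stub(const uint8_t *data, size_t len, uint8_t out[32]) {
        (void)data; (void)len;
        memset(out, 0, 32);
    }

    /* All application code hashes through this one pointer... */
    static hash_fn current_hash = sha256_stub;

    void hash_password(const uint8_t *pw, size_t len, uint8_t digest[32]) {
        current_hash(pw, len, digest);
    }

    /* ...so if the algorithm is ever broken, migrating is a single,
     * reviewable change:  current_hash = sha3_stub;  (hypothetical) */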
Obviously, this area is complicated and interesting. I suggest to you that OWASP is a good place to get started.