It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 13 years ago.
What was the first programming language that was accessible to ordinary users? For instance, a language that offered itself to the public for experimentation, personal use, hobby projects, etc.; something that wasn't just 'behind the scenes', used by big companies to put together professional products and services.
BASIC (1964) was at the very least the first popular hobby language.
It may not have been the first available, but one of the most important has to be Integer BASIC, originally known as Apple BASIC. It was included with the Apple II.
I spent a lot of time in Commodore BASIC on my home Commodore 64 (a version of Microsoft 6502 BASIC very similar to Apple BASIC) and the various Apple computers my school had. Part of the reason I'm a programmer today is the joy of seeing the teachers struggle to exit an endless loop, finally giving up and rebooting the computer so the next kid could play Joust.
The Micral is regarded as the first personal computer. I believe you could only write programs for it in 8008 machine or assembly language. As for hobby languages (which were used on hobby computers), machine languages were the first, usually entered via front-panel toggles. The SCELBI and Mark-8 were the first marketed hobby computers; before that, hobby computers were custom made by their users, with instruction sets often cribbed from the PDP-8 instruction set[2]. The first higher level language was the version of BASIC produced by Bill Gates and Paul Allen for the Altair[3].
Further references:
The Micral, Armand Van Dormael
The early days of personal computers, Stephen B. Gray
Short History of the Microcomputer
Wikipedia: Micral
Wikipedia: SCELBI
If you count initial conception and prototyping, I'd give the vote to [Forth](http://en.wikipedia.org/wiki/Forth_(programming_language)).
Closed 10 years ago.
I need a non-deterministic constraint satisfaction problem tool, because I need to get different solutions for the same problem input. Does anyone know of a tool with this characteristic?
I only know of tools like Gecode (C++), Choco (Java), and Curry (Haskell), which I think work deterministically.
If what you want is to get some random solution, most CP tools have some support for randomised heuristics. For example, the standard Gecode branchers have options for this, such as INT_VAR_RND and INT_VAL_RND for integer variables. To get a different search each time, make sure to set the seed uniquely.
Note that using random heuristics will not give you any guarantee about the distribution of solutions. For example, your problem might have only two solutions, but almost all random choices might lead to one of them, giving a very skewed distribution.
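The idea can be sketched outside of any particular solver. Below is a minimal Python illustration (this is not Gecode code; the solver, the toy problem, and all names are made up for the example) of a backtracking search whose variable and value orderings are randomized, so that different seeds can return different solutions to the same problem:

```python
import random

def solve(domains, constraint, rng):
    """Complete backtracking search over finite domains.
    rng randomizes both variable and value order, so different
    seeds can reach different solutions first."""
    assignment = {}

    def backtrack():
        if len(assignment) == len(domains):
            return dict(assignment)
        # random variable ordering (analogous in spirit to INT_VAR_RND)
        var = rng.choice([v for v in domains if v not in assignment])
        values = list(domains[var])
        rng.shuffle(values)  # random value ordering (cf. INT_VAL_RND)
        for val in values:
            assignment[var] = val
            if constraint(assignment):
                result = backtrack()
                if result is not None:
                    return result
            del assignment[var]
        return None

    return backtrack()

# toy problem: x, y in 0..3 with x + y == 3 (four solutions)
domains = {"x": range(4), "y": range(4)}
ok = lambda a: "x" not in a or "y" not in a or a["x"] + a["y"] == 3

print(solve(domains, ok, random.Random(1)))
print(solve(domains, ok, random.Random(2)))
```

With a fixed seed the search is reproducible; varying the seed varies which solution comes out first, but, as noted above, with no guarantee of a uniform distribution over solutions.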
Are you trying to do Pareto optimization (aka multi-objective optimization) and let the user choose one of the Pareto-optimal solutions?
People have done this with Drools Planner (Java, open source) by simply replacing the BestSolutionRecaller class. See this thread and this thread. Planner 6.0 or 6.1 will provide out-of-the-box Pareto support.
Similar to what Zayenz said, you can try Minion with the flag -randomiseorder.
Closed 11 years ago.
Other than speed, what are the advantages of and differences between each? For example, can assembly do more than C/C++? What advantages does Java offer that Python does not (excluding the fact that Java is platform-independent)?
A higher level programming language usually means that the programmer can be more abstract, and generally do less work, at the cost of fine control.
For example, programming a GUI in assembly would be suicidal. On the other hand, machine code is necessary when you want to take advantage of device-dependent optimizations and features. I guess you can define low-level languages as those that are used for low-level tasks, e.g. drivers, operating systems, and parsers. Of course, the definitions are always rather fuzzy.
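To make the trade-off concrete, here is a small Python sketch (purely illustrative, not tied to any particular assembler or compiler): the same task written once in a low-level style, with explicit indexing and manual accumulation, and once in a high-level style that simply states the intent and leaves the details to the runtime.

```python
# Low-level style: explicit index bookkeeping and manual accumulation,
# close to the steps the machine actually performs.
def sum_low(xs):
    total = 0
    i = 0
    while i < len(xs):
        total += xs[i]
        i += 1
    return total

# High-level style: state the intent ("sum these") and delegate
# the mechanics to the runtime, which is free to optimize them.
def sum_high(xs):
    return sum(xs)

data = [3, 1, 4, 1, 5]
assert sum_low(data) == sum_high(data) == 14
```

The low-level version exposes every step (and every opportunity to tune or to make a mistake); the high-level version trades that fine control for brevity and clarity, which is the same trade-off the answers above describe between assembly and higher-level languages.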
Pretty broad question there, and I cannot answer for the specifics between Python and Java, but in general here are my thoughts. Keep in mind, this is nearly a philosophical question (perhaps even best suited for the Programmers Stack Exchange site), so there's no definitive answer. Here goes:
With low-level languages (the closer you get to bit flipping), you tend to be closer to the system hardware and core operating system, and resources are more explicitly manipulable. This makes for efficient programming, in that you can streamline your logic, skim off the bundled crap you don't need, and develop the perfect piece of code. The consequence is that it's usually harder to think in, and thus code in.
Higher-level languages provide abstraction, moving the developer away from the worries of 1s and 0s to focus on the more complex requirements of a system. They allow us to think about things closer to the semantics of human communication and thought. They also sometimes allow us to share work across platforms, when programmers work in high-level languages that compile to common runtimes. There are numerous other reasons, but you get the gist.
Ultimately, I look at low-level languages as something to use to "make the best tool", and high-level languages to "make the best use of the best tools".
Closed 10 years ago.
How do you describe a use case diagram in a formal style?
Does anyone have a template?
If by "describe a use case diagram" you mean describing each use case individually, there is a procedure for this in the RUP methodology.
CASE tools like Enterprise Architect or MagicDraw support it by built-in forms for specifying preconditions (what must be fulfilled before the use case takes place), postconditions (what is fulfilled after taking place) and scenarios (what is particular flow of events or actions, and it alternatives) etc.
But if you are serious about describing your use cases, filling all the details into those tiny forms is quite uncomfortable and does not give an easy overview. You can generate an .rtf document from your use case model with the tool (using a template already present in the tool, usually not very good-looking).
Another way (and my preferred one) is describing use cases by hand in a separate Word document (and pasting the use case diagram into it). Alistair Cockburn wrote an amazing book, Writing Effective Use Cases. I personally recommend it to everyone coping with use cases in their everyday job. Here you can find a "compressed" set of guidelines extracted from the book.
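As a rough illustration of what such a hand-written description might look like (the field names and content below are purely made up, not taken from any particular tool or from the book), it mirrors the preconditions, postconditions, and scenarios that the CASE-tool forms capture:

```
Use Case:       UC-01 Withdraw Cash
Primary Actor:  Account Holder
Preconditions:  Card is valid; account is open
Postconditions: Requested amount is dispensed; account is debited
Main Scenario:
  1. Actor inserts card and enters PIN
  2. System verifies PIN
  3. Actor selects amount
  4. System dispenses cash and debits the account
Extensions:
  2a. PIN invalid: system rejects the card; return to step 1
```

A plain template like this is easy to keep in a Word document next to the diagram, and gives the easy overview that the tiny CASE-tool forms lack.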
I believe most UML drawing packages have templates for use case diagrams. For example, see Dia, or Eclipse with its modeling framework.
Other non-free tools include MS Visio, MagicDraw, and many more.
Closed 13 years ago.
Visual Basic .NET
C##
etc
C#? With two pound signs?
It's listed as a qualification on so many of the programming résumés we're getting from random people.
Any ideas what these folks are talking about? Is this convention an accidental holdover from C++, or something?
EDIT/ANSWER: Turns out the corporate résumé management system converts the "C#" that applicants specify to "C##". That is just fantastic.
My guess is you shouldn't hire them.
Looks like a recruiter who doesn't know what he is talking about is trying to impress you.
That résumé speaks for itself - little attention to detail. Not good for a programmer...
If it's on "so many" I'm willing to bet that the candidates don't know what they are talking about. Similarly, I have seen 'C+' listed as a language as well.
It's not uncommon for people to list as many languages on their résumé as they can, because the Bad Ones think that even knowing the name of a language gives them a leg up on someone who doesn't. This is obviously a flaw in logic.
I can't remember exactly where I heard this story before (someone's blog, maybe someone will remember) but the exact situation is described. A candidate comes in with a resume listing all of these languages. As the interviewer asks the candidate to demonstrate their knowledge of the language by writing some code, the candidate freezes. When the interviewer asks why, the candidate responds with "I didn't say I knew how to write in those languages, just that I know of them!"
I once received a résumé that had this line in its list of experience:
C \ C+ \ C++ \ C#
:)
C Sharp - now even sharper!
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 7 years ago.
Can anyone point me towards any references that attempt to formulate an economics of software development? In my own research, I discovered a book by Barry Boehm on this, but it seems very awkward and theoretical.
Dependency Structure Matrices seem to offer something worthwhile. Carliss Baldwin has used these in some work on modularization, boundaries, and transaction costs. A lot of it comes off as just common sense, though.
Also, economists have developed something called Behavioral Economics. Is there a "Behavioral Software Engineering" that addresses cognitive biases in developers or groups of developers?
Here's an interesting-looking reference:
http://www.amazon.com/Knowledge-Sharing-Software-Development-Comparing/dp/3639100840/ref=sr_1_1?ie=UTF8&s=books&qid=1232979573&sr=1-1
Before Hal Varian became Chief Economist at Google, he worked on the economics of information technology at Berkeley, although he did not focus on software development per se. Nevertheless, I would recommend a look at his 2001 paper on the more general topic. You can find a more complete list of his research on his website. Hope that helps.
Software as Capital wasn't a waste of time, though you won't find any math in it and it reads like a PhD thesis because it started as one.
Another review.
I think that what you're looking for might fall under a sociology of software development... sociologists study all modern subjects, and from there you will no doubt find references to an economics of software development if there is one.
Facts and Fallacies of Software Engineering by Robert Glass has some dollar amounts associated with some activities (or, at least, percentage of total budget). Don't know if that helps at all, but it's something.
Several years ago I taught an "Economics of E-Commerce" course using Varian's book INFORMATION RULES. His idea of lock-in, though, leads the reader almost towards a drug-addict model of purchaser behaviour and exploitation. This book is more of an economics of e-business than an analysis of the software development process.
In terms of actually making software, there are ideas in the Mythical Man Month well worth knowing about.
The "Applied Information Economics" approach of Douglas Hubbard could be part of what you're looking for. If we assume software development is (often|always|sometimes|???) about supporting decision making by providing (better|more accurate|more up to date|whatever) information, then AIE helps as it's a technique for quantifying the value of better information. Read Hubbard's book How to Measure Anything for a good overview of the idea.
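As a sketch of the kind of calculation involved (the scenario, probabilities, and payoffs below are entirely invented for illustration), the expected value of perfect information (EVPI) for a simple invest/don't-invest decision can be computed directly: it is the expected payoff when you act with perfect foreknowledge, minus the best expected payoff you can get acting without it.

```python
# Hypothetical decision: invest in a feature or do nothing.
p_success = 0.6              # assumed probability the feature pays off
payoff_if_success = 500_000  # assumed payoff if it succeeds
payoff_if_failure = -200_000 # assumed loss if it fails
do_nothing = 0

# Best expected payoff when acting WITHOUT further information:
# either invest (expected value below) or do nothing, whichever is larger.
ev_invest = p_success * payoff_if_success + (1 - p_success) * payoff_if_failure
best_without_info = max(ev_invest, do_nothing)

# With PERFECT information we invest only in the success case
# and do nothing in the failure case.
ev_with_info = p_success * payoff_if_success + (1 - p_success) * do_nothing

evpi = ev_with_info - best_without_info
print(evpi)  # upper bound on what better information is worth here
```

The EVPI puts an upper bound on what any measurement or study about the uncertain quantity could be worth, which is the core of Hubbard's argument that "intangibles" can be valued.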
Also, the book Software By Numbers by Mark Denne and Jane Cleland-Huang provides a model for managing software projects by using something they call the "Incremental Funding Methodology". IFM is based on decomposing software projects into features based on the business value created, rather than decomposing them along technical boundaries. They then use a series of calculations based on Discounted Cash Flow (DCF), Net Present Value (NPV), Internal Rate of Return (IRR), etc. to show when in the project lifecycle the project will reach self-funding status, when it will reach "breakeven" and when it will generate a real positive cash return for the organization.
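As an illustrative sketch (the cash-flow numbers are made up, and none of this is taken from the book), the DCF/NPV/IRR arithmetic behind that kind of feature-by-feature analysis is straightforward to compute:

```python
def npv(rate, cashflows):
    """Net present value of per-period cashflows, first flow at t=1."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, 1))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection.
    Assumes npv crosses zero once between lo and hi."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def breakeven_period(cashflows, rate):
    """First period at which cumulative discounted cash flow >= 0,
    or None if the stream never breaks even at this rate."""
    cum = 0.0
    for t, cf in enumerate(cashflows, 1):
        cum += cf / (1 + rate) ** t
        if cum >= 0:
            return t
    return None

# hypothetical feature: invest 100 up front, then earn 40 per period
flows = [-100, 40, 40, 40, 40]
print(npv(0.10, flows))
print(irr(flows))
print(breakeven_period(flows, 0.10))
```

Running these per feature, ordered by business value, is the gist of how IFM locates the point in the project lifecycle where the project becomes self-funding and then cash-positive.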
You might also find the Capability Cases book of interest. It doesn't strictly deal with any economic issues in detail, but it's an approach to software specification which attempts to more clearly map software capabilities to business strategy and business issues.