IBM Cell programming in 2010 - feasible and worth it?

I could use your help. I've come across an interesting book - Programming the Cell Processor: For Games, Graphics, and Computation - which contains mostly C and some assembly for the Cell. The technology is interesting indeed, but I have some doubts.
The book is from 2008, and some things have changed:
There is no Linux support on the current PS3 firmware version.
The last version on IBM's website is from 2008, for Red Hat Enterprise Linux 5.2 and Fedora 9. Does anyone have experience running this IBM SDK on Fedora 13, or at least on any version higher than Fedora 9? And is Linux sufficient for testing?
Would it be useful, for example, for creating a distributable PSN game? And does anyone know anything about the price of actually getting a product there? (I've heard it is waaaaay more expensive than, for example, Xbox indie games.)
So do you think it is worth it or not, be it just for educational purposes or something "more" serious?
Any thoughts are welcome, thank you!

Cell was dumped by IBM for general-purpose computers. It will live on for the next five years in the PlayStation, and I'm pretty sure that the next-generation PlayStation - whenever it is ready - will use Cell again, because establishing something new in CPU land is so unaffordable today.
But as a technology it is indeed no longer interesting. Learning CUDA might be a better choice.

Given that you don't have access to a Cell machine, I'd advise that it's probably not worth it. I absolutely love the Cell architecture - I think it was a fantastic step in the right direction. Unfortunately, having done some Cell development in the past, I was really disappointed with the tool chain, the simulator and the seemingly hostile attitude taken towards developers recently.
So given that you're not going to be able to use a real Cell machine in order to get the speed gains you would get from writing programs within that idiom, you'd probably be much better off looking into general distributed programming techniques (using MPI or something similar). These skills are going to be readily transferable to the Cell or its derivatives, or any similar architectures that might arise in the future.
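For a flavour of what that looks like, here is a minimal MPI sketch in C (my own illustration, nothing Cell-specific): each rank sums its own strided slice of a range and the partial results are combined on rank 0 - the same data-decomposition idiom you would apply across the Cell's SPEs.

    /* Minimal MPI sketch: each rank sums a strided slice of 0..999999,
     * then the partial sums are combined on rank 0.
     * Typical build/run: mpicc sum.c -o sum && mpirun -np 4 ./sum */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        long local = 0, total = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank takes every size-th number, starting at its rank. */
        for (long i = rank; i < 1000000; i += size)
            local += i;

        /* Combine the partial sums on rank 0. */
        MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("total = %ld\n", total);

        MPI_Finalize();
        return 0;
    }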
As far as I'm concerned, and as much as it pains me, I think the Cell is basically a developmental dead end. Unless you have access to a commercial development license, you'll be extremely frustrated in your ability to actually get anything out of the architecture.

Related

Advanced Programming in the Unix Environment 1st Edition

I am taking an undergrad operating systems class next semester, and this is a recommended book. I am wondering if you would still recommend Advanced Programming in the Unix Environment, 1st Edition, as opposed to the second edition. I know you cannot recommend a book for a class you have not taken (not what I am asking for), but I am wondering if anyone has read/owns both versions, and whether they feel the 1st edition is still relevant, or whether, due to its age (written in 1992), I would be better off investing in the 2nd edition. I don't know a ton about Unix, and after taking a look at the 1st edition it seems like it's a wealth of info. Let me know what you think.
From the book's web site:
The second edition of Advanced Programming in the UNIX® Environment has been updated to reflect contemporary operating systems and recent changes in standards. In addition, the example chapters were overhauled. The four platforms used to test the examples in the book include FreeBSD 5.2.1, Linux 2.4.22, Mac OS X 10.3, and Solaris 9. These platforms are a moving target, and most likely there are newer versions available now, so your mileage may vary.
Major changes include the addition of a chapter on sockets, two chapters on threads, and the removal of the chapter discussing modem communication, although this lost chapter is available here. Additionally, the printer communication chapter was rewritten to account for today's network-based printers.
To my mind, the most valuable of these changes is the testing with modern platforms. APUE 1/e barely mentioned Linux and of course didn't cover OS X at all since it hadn't been created yet. 2/e fixes this.
That's not to say that APUE 1/e is useless for Linux and OS X systems programming. I used it successfully with Linux for many years. I can't think of any time a topic it covered didn't implicitly cover at least one way to do it on Linux. The main difficulty is that where there is more than one way to do something, APUE usually gives them all, but with 1/e you had to just try them all to find out which one Linux supported. It's a worse problem with OS X, because its kernel is less ecumenical than Linux's.
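To make the "more than one way" point concrete, take file locking: Unix systems of that era offered flock(), lockf(), and fcntl() record locks, and fcntl() is the portable POSIX choice. A minimal sketch (my example, not the book's):

    /* Sketch: POSIX record locking with fcntl(), the most portable of
     * the several Unix locking interfaces (flock, lockf, fcntl). */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("data.lock", O_RDWR | O_CREAT, 0644);
        struct flock fl = {0};

        fl.l_type = F_WRLCK;    /* exclusive write lock      */
        fl.l_whence = SEEK_SET; /* from the start of file... */
        fl.l_start = 0;
        fl.l_len = 0;           /* ...to EOF (whole file)    */

        if (fd < 0 || fcntl(fd, F_SETLKW, &fl) < 0) { /* wait for the lock */
            perror("lock");
            return 1;
        }
        printf("got the lock\n");

        fl.l_type = F_UNLCK;    /* release it */
        fcntl(fd, F_SETLK, &fl);
        close(fd);
        return 0;
    }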
I don't miss the chapters on threads and sockets in my 1/e copy because I have other books for that. As a new systems programmer, you will find them valuable until you find a reason to get something more comprehensive in those areas. They're both topics worthy of full books. (Full shelves, really.)
Anyway, bottom line, I still have my 1/e copy despite buying 2/e for work. The 1/e copy just went home is all. It's still useful.
It's a good book, and the first edition is not very out of date. Much of the point of Unix is to limit how much the features and interfaces change over time. The older version of the book is still very valid, and the fact that there are only two editions in nineteen years speaks to the stability of the Unix libraries and utilities. Of course, your professor should be able to explain any differences you might encounter.

AutoCAD 2006 vs. 2012

I'm looking to learn AutoCAD. I have found several videos online that relate to AutoCAD 2006 - but is there a difference between any of the versions? I have seen job postings asking for AutoCAD 2008 - what happens if I only know 2011 or even 2010? Can I work with 2008? Is there a difference between any of these versions or years?
AutoCAD is a lot like Windows... They have major releases and minor releases, so the change from 2006 to 2007 was a significant one. They roll out a major release every couple of years or so. Still, it just depends on what you're doing. If you've got to draw a line, it's drawing a line, and that doesn't change a lot from one release to another. Some companies use the "features" of the software, but lots of them don't. My advice: get an account with Autodesk University here and click through the online classes. Look for some basic AutoCAD classes. It will really help you learn about the software and the changes made from one release to another. Also: if you get an interview for a CAD job, they will probably give you a test... usually it's just drawing something in CAD from a piece of paper. I had one of these where I had to use a version of AutoCAD called Architectural Desktop. I had never even seen Architectural Desktop before, so I asked the person interviewing me, "Where do I start?" He showed me how to start, and I actually got the job. That was 7 years ago and I still work for that company today. Use the free tutorials to acquaint yourself with the software, but don't be intimidated by it. If you get as far as testing in an interview, do your best, and don't be afraid to ask questions.
If I recall correctly, 2006 was still a version without the ribbon interface. In any case, the most significant change to the user interface in recent years was exactly that - it caused quite a bit of a stir when it was first introduced, and many drafters still switch back to the "old" toolbars.
As far as changes go, yes, there are quite a few. But as Asheville said, they mostly relate to some advanced features of the software, which at this point you will probably not be using. My advice would be to start with one of the newer versions (the "ribbon" ones) and adjust yourself to it. After you've grasped the fundamentals, found your way around, and wish to expand your knowledge in a more systematic way (although we all know this almost never works :) I would go to either the "AutoCAD xxxx Bible" or the "Mastering AutoCAD xxxx" books, where xxxx signifies the version. They are quite heavy (figuratively and literally) and you can skim through them as you progress. Most of the things in there you probably won't need, unless you find yourself working in a large draft office which has its own way of organizing data, drawing styles, ...
AutoCAD forums are also a good place to ask questions (search first!) ... the community there is quite helpful.

Where can I start looking to better understand how computers work?

I've been trying to figure out what computer field I want to go into later on in life. College is just around the corner for me and I've considered looking into Computer Engineering, Software Engineering, etc.
Lately, I've been looking into computer security systems and exploitations of such (purely for educational purposes, on my own property). Unfortunately, it seems to me like 99% of the people out there have no idea what they're talking about. Oftentimes, it's just "run this" or "run that" or "you can find a program that will do all that for you" - no one knows how these programs work or what exactly they do.
I find no fun or interest in using something that someone else created simply to call myself a "hacker", as most people do. In fact, I'm not even interested in hacking systems as much as in HOW it's done.
My question all comes down to this.
I want to learn the ins, outs, ups, and downs of computers - everything from abstract concepts such as the internet and data transfer, to hardware. I want to know how computers store data (how the bits are organized, etc.) and what processors, etc. actually do. What is WiFi, really? Do computers communicate with light? (Something I picked up from a magazine that I read on a plane.)
I have multiple years of computer/programming experience, but so much of what I know about computers in general is very broad. Computers send packets of information back and forth between one another, each with a header and content. Computers are composed of multiple components, each with their own function (processor, video card, RAM, hard drive(s), etc.), which I have some basic understanding of already. Etc., etc., etc.
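To make "header and content" concrete at the byte level, here is a hypothetical C sketch of a toy packet; real protocol headers such as IPv4's are the same idea with more fields:

    /* Hypothetical sketch: a toy packet, fixed header followed by the
     * payload bytes, packed into one buffer. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    struct toy_header {
        uint16_t src_port;  /* who sent it            */
        uint16_t dst_port;  /* who should receive it  */
        uint16_t length;    /* payload size in bytes  */
        uint16_t checksum;  /* simple integrity check */
    };

    int main(void)
    {
        unsigned char packet[64];
        struct toy_header h = { 1234, 80, 5, 0 };
        const char *payload = "hello";

        /* Header first, then the content. */
        memcpy(packet, &h, sizeof h);
        memcpy(packet + sizeof h, payload, 5);

        printf("packet = %zu header bytes + %u payload bytes\n",
               sizeof h, (unsigned)h.length);
        return 0;
    }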
There is just so much to a computer and I don't know where to start. I'm sure some of my college classes will clear things up for me, but I'm so curious that I want to start learning as much as I can now.
This question is probably all over the place, so please ask me to clarify when necessary. I'm a little jet lagged at the moment, but I tried to write my thoughts in the quickest, most coherent way possible (I could have completely failed in the process, though).
Thanks in advance for any advice!
Justian Meyer
Please, feel free to edit the tags for this question. The current ones are terrible.
EDIT:
All these comments are making me excited :). So much to learn, so much to explore :).
To help you choose which specialization to go into, I would very highly recommend computer engineering (known as CMPE or CE in college course books). Your classes will take you through everything you just listed, and with electives you can delve deeper into whichever aspects you wish (such as security and networking).
In CMPE you will learn both software (C, C++, and some C#) and then hardware (maybe two electrical engineering classes). Once you get to assembly programming, you will start to learn how the two combine to make up everything else in any computer or embedded system. It will take you down to the bit level of memory, the CPU, data buses, I/O, and so many other things. I am just starting to do digital design, and it's ****ing glorious. From what you described, you will enjoy being a CMPE major greatly.
There are computer science majors and software engineers; there are electrical engineers; but there is no cell phone, GPS, or computer designed without computer engineers!
Structured Computer Organization, Tanenbaum
It is a great book and explains everything from a transistor to a Java virtual machine.
These two helped me understand how the OS and memory work in general.
I believe a lot of things are derived from these 'simple mechanics':
1. Anatomy of a program in memory
2. Pushing the limits on Windows memory
Steve Gibson of Security Now has been doing a series of podcasts on computer basics:
http://www.grc.com/securitynow.htm - from Episode 233, "Let's Design a Computer (part 1)", up to the most recent one, "What We'll Do for Speed".
Every other episode he does listener feedback, and those are good to listen to too.
A few times (like right now) they have interrupted the series when an important security news item comes up (like when that big SSL thing broke a few months ago).
It's a really good show, and I recommend starting at 233 and working your way up, then starting over on episode 1. He has also done very good series on how a computer network works and how cryptography works. (Ep 203 will blow your mind when he talks about the Boyer-Moore method of searching.)
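For a taste of why that episode is striking: Boyer-Moore-style search can skip over most of the text instead of examining every character. Here is a sketch of the simplified Horspool variant (my illustration, not from the podcast):

    /* Sketch of Boyer-Moore-Horspool search: precompute how far the
     * pattern may safely skip for each byte, then compare right-to-left. */
    #include <stdio.h>
    #include <string.h>

    long search(const char *text, const char *pat)
    {
        size_t n = strlen(text), m = strlen(pat);
        size_t skip[256];

        if (m == 0 || m > n) return -1;

        for (size_t i = 0; i < 256; i++) skip[i] = m;  /* default: jump whole pattern */
        for (size_t i = 0; i + 1 < m; i++)
            skip[(unsigned char)pat[i]] = m - 1 - i;   /* distance from pattern end   */

        for (size_t pos = 0; pos + m <= n;
             pos += skip[(unsigned char)text[pos + m - 1]]) {
            size_t i = m;
            while (i > 0 && text[pos + i - 1] == pat[i - 1]) i--;
            if (i == 0) return (long)pos;              /* full match */
        }
        return -1;
    }

    int main(void)
    {
        printf("%ld\n", search("the quick brown fox", "brown")); /* prints 10 */
        return 0;
    }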
Since you are deciding where to go exactly - into software development, or to become an expert in hardware and networking - I would like to point out that these are, in my opinion, two different occupations that require two different mindsets. Good hardware experts are usually not good programmers, and good programmers are almost never experts in hardware and networking. So I would say don't try to embrace both; stick to the direction that is most suitable to your mindset. Pursue two rabbits and you will catch neither.
@Justian
I see, sorry, I somewhat misunderstood you. The desire to understand the intricacies of how code gets processed inside the hardware is a very natural one. When in college I was reading the book "How Computers Work" - a fairly simple, even somewhat primitive, book about general hardware functionality. But it can give you a broad look at the topic.
Another analogy comes to mind. Say linguists research the internal mechanics of language, but it is neuroscientists who research how language signals get processed in the brain. Two very different occupations. This is not to discourage you from learning hardware, though; it is just to underline the difference between the two realms.

Development time in various languages

Does anybody know of any research or benchmarks of how long it takes to develop the same application in a variety of languages? Really I'm looking for Java vs. C++ but any comparisons would be useful. I have the feeling there is a section in Code Complete about this but my copy is at work.
Edit:
There are a lot of interesting answers to this question but it seems like there is a lack of really good research. I have made a proposal over at meta about this problem.
Pratt & Whitney, purveyors of jet engines for civilian and military applications, did a study on this many years ago, without actually intending to do the study.
They went on the same metrics kick everyone else went on in the 1990s. They collected a bunch of data about their jet engine controller projects, including timecard data. They crunched it. The poor sap who got to crunch the data noticed something in the results: the military projects uniformly had twice the programmer productivity and one-fourth the defect density of the civilian projects.
This, by itself, is significant. It means you only need half as many programmers, and you aren't going to spend quite as much time fixing bugs. What is even more important is that this was an apples-to-apples comparison. A jet engine controller is a jet engine controller.
He then went looking for candidate explanations. All of the usual candidates: individual experience, team size, toolsets, software processes, requirements stability, everything, were trotted out, and they were ruled out when it was seen that the story on those items was uniformly the same on both sides of the aisle. At the end of the day, only one statistically significant difference showed up.
The civilian projects were written in every language you could think of. The military projects were all written in Ada.
IN EVERY SINGLE CASE, against every other comer, for jet engine controllers at Pratt & Whitney, using Ada gave double the productivity and one-fourth the defect density.
I know what the flying code monkeys are going to say. "You can do good work in any language." In theory, that's true. In practice, however, it appears that, at least at Pratt & Whitney, language made a difference.
Last I heard about this, Pratt & Whitney upper management decreed that ALL jet engine controller projects would be done in Ada.
No, I don't have a citation. No paper was ever written. My source on this story was the poor sap who crunched the numbers. Here's a similar study from 1995:
http://archive.adaic.com/intro/ada-vs-c/cada_art.html
This, incidentally, was BEFORE Boeing did the 777, and BEFORE the 777 brake subcontractor story happened. But that's another story.
One of the few funded scientific studies that I'm aware of on cross-language productivity, from the early 90s, paid for by ARPA and the ONR:
Haskell vs. Ada vs. C++ vs. Awk vs. ... An Experiment in Software Prototyping Productivity, Hudak & Jones, 1994.
We describe the results of an experiment in which several conventional programming languages, together with the functional language Haskell, were used to prototype a Naval Surface Warfare Center (NSWC) requirement for a Geometric Region Server. The resulting programs and development metrics were reviewed by a committee chosen by the Navy. The results indicate that the Haskell prototype took significantly less time to develop and was considerably more concise and easier to understand than the...
This article (a PDF) has some benchmarks (note that it's from 2000) between C, C++, Java, Perl, Python, Rexx, and Tcl.
Some common wisdom I believe holds true (also somewhere within the article):
The number of lines written per hour is independent of the language
Opinion: more important is what is faster for a given developer, for example yourself. What you are used to will usually be faster. If you are used to 20 years of C++ pitfalls and never miss an uninitialized variable, C++ will be faster for you than Java is for anybody.
If you remember all the parameters of CreateWindowEx() by heart, raw Win32 will be faster for you than MFC or WinForms.
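For the curious, here is roughly what that means in practice - a sketch using the ANSI variant CreateWindowExA, assuming a window class "MyClass" has already been registered (registration and error handling omitted):

    /* Sketch: the raw Win32 call alluded to above. All twelve
     * parameters must be supplied by hand on every call. */
    #include <windows.h>

    HWND make_window(HINSTANCE hInstance)
    {
        return CreateWindowExA(
            0,                   /* dwExStyle: extended style      */
            "MyClass",           /* lpClassName: registered class  */
            "Hello",             /* lpWindowName: title bar text   */
            WS_OVERLAPPEDWINDOW, /* dwStyle: standard frame        */
            CW_USEDEFAULT,       /* x                              */
            CW_USEDEFAULT,       /* y                              */
            640,                 /* nWidth                         */
            480,                 /* nHeight                        */
            NULL,                /* hWndParent: none, top-level    */
            NULL,                /* hMenu: no menu                 */
            hInstance,           /* hInstance: owning module       */
            NULL);               /* lpParam: no creation data      */
    }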
A couple of anecdotal data points:
On Project Euler, which invites programming solutions to mathematical problems,
the shortest solutions are almost invariably written in J or K, relatives of APL; there are occasionally MATLAB solutions in the same range. It can be argued, though, that these languages specialize in math.
Runners-up were Ruby solutions. A lot of algorithmic work can be wrapped in very little code, and it's much more legible than J/K.
Python and Haskell solutions also did very well, LOC-wise.
The question asked about "fastest development," not "shortest code." But it's conceivable that shorter solutions are faster to come up with - certainly for slow typists!
There's an annual competition among roboticists. Contestants are given some specs for some hardware, a practical problem to solve in software, and limited time to do so. Again, very domain-specific, of course. Programmers have their choice of tools, including language of course. Every year, the winning team (often a single person) used Forth.
This admittedly limited sample suggests that "development speed" and "effect of language on speed" is often very dependent on the problem domain.
See also
Are there statistical studies that indicates that Python is "more productive"?
for some discussions about this kind of question.
It would make more sense to benchmark the programmers, not the languages. The time to write a program in any mainstream language depends more on the ability of the programmer in that language than on qualities of that specific language.
I think most benchmarks and statements on this topic will mean very little.
Benchmarks can always be gamed; see the history of "Pet Store".
A language that's good at solving one kind of problem might not apply as well to another.
What matters most is the skill of your team, its knowledge of a particular technology, and how well you know the domain you're trying to solve.
UPDATE: Control software for jet engines and helicopters is a very specialized subset of computing problems. It's characterized by very rigorous, complete, detailed specs, and by QA meant to ensure that a multi-million-dollar aircraft cannot crash.
I can second the (very good) citation by John Strohm of Pratt & Whitney control software written in Ada. The control software for Kaman helicopters sold to Australia was also written in Ada.
But this does not lead to the conclusion that if you decided to write your next web site in Ada that you'd have higher productivity and fewer defects than you would if you chose C# or Java or Python or Ruby. All languages are not equally good in all problem domains.
Language/framework comparison for web applications
The Plat_Forms project provides some information of this type for web applications.
There are three studies with different tasks (done in 2007, 2011, and 2012), all of the same format: several teams of three professional developers implemented the same application under controlled conditions within two days.
It covers Java, Perl, PHP, and Ruby and has multiple teams for each language.
The evaluation reports much more than only development time.
Findings of iteration one, for instance, included
that experience with the language and framework appeared to be more relevant than what that framework was, and
that Java tended to induce teams to make laborious constructions, while Perl induced them to make pragmatic (and quite handy) constructions.
Findings of iteration two included
that Ruby on Rails was more productive in this type of project (which due to its duration was more rapid prototyping than full-blown development of a mature application)
and that the one exception to the above rule was the one team using Symfony, a PHP framework that has concepts similar to Ruby on Rails (but still a very different base language underneath).
Look under http://www.plat-forms.org or search the web for "Plat_Forms".
There is plenty more detail in the reports, in particular the thick techreport on iteration 1.
Most programs have to interface with some other framework. It tends to be a good idea to pick the language that has libraries specifically for what you are trying to do. For instance, are you trying to build a distributed, redundant messaging system? If so, I would use Erlang. Are you trying to make a quick and dirty data-driven website? Use Ruby on Rails. You get the idea. Real-time DirectX where performance is key: C++/C/asm.
If you are writing something that is algorithm-based, I would look to a functional language like Haskell, although it has a very steep learning curve.
This question is a little old-fashioned. Focusing on development time based solely on the choice of language is of limited value. There are so many other factors that have equal or greater impact than the language itself:
The libraries or frameworks available / used.
The level of quality required (i.e. defect count).
The type of application (e.g. GUI, server, driver, etc.)
The level of maintainability required.
Developer experience in the language.
The platform or OS the application is built on.
As an example, many would say Java is the better choice over C++ to build enterprise (line of business) applications. This is not normally because of the language itself, but instead it is perceived that Java has better (or more mature) web server and database frameworks available to it. This may or may not be true, but that is beside the point.
You may even find that building an application using the same language on different operating systems or platforms gives greatly differing development times. For example, using C++ on Linux to build a GUI application may take longer than a Windows-based GUI application using C++, because of the less extensive and mature GUI libraries available on Linux (once again, this is debatable).
According to Norvig, Lutz Prechelt published just such an article in the October 1999 CACM: "Comparing Java vs. C/C++ Efficiency Issues to Interpersonal Issues".
Norvig includes a link to that article. Unfortunately, the ACM, despite having a bitmap graphic proclaiming their goal of "Advancing Computing as a Science & Profession", couldn't figure out how to maintain stable links on their webpage, so it's just a 404 now. Perhaps your local library could help you out.
That Ada story might be an embellished version of this: http://www.adaic.com/whyada/ada-vs-c/cada_art.html
Erlang vs. C++/CORBA
"... As the Erlang DCC is less than a quarter of the size of a similar C++/CORBA implementation, the product development in Erlang should be fast, and the code maintainable. We conclude that Erlang and associated libraries are suitable for the rapid development of maintainable and highly reliable distributed products."
Paper here
There's a reason why there are no real comparisons in this area, except for anecdotal evidence (which can be found in favor of almost any language).
Actually writing code takes a relatively small portion of a developer's time. Even if a language lets you cut coding time in half, that will be barely noticeable by the time the project ends. The design, the structure of the program, and the development process are all much more important, and then there are libraries, tools, and experience with them.
Some languages are better suited to certain development processes than others, so if you've settled on a design and process, you can decide which language will be more efficient - but not before.
(didn't notice there's a similar answer already, so feel free to ignore this)

Is there a stable Programming Language for Web Programming?

A renowned PHP user once said: "There will be a relaunch in 2 years, anyway."
Those times are gone. Web applications that are older than 5 years are common, with the original developer(s) gone.
The release cycles of the operating system, programming language, and framework get in the way of doing real work if you don't have a big staff.
Is there any way to develop something that doesn't need constant porting to the next level, without the fear of losing support and backing in a community? For people who want to stay in programming instead of climbing the corporate ladder and leaving the problems to the next "generation"?
My company codes almost exclusively in C#; however, we have ColdFusion 5 apps, written back in 2001 or so, still humming along. There's no need to port them.
If it ain't broke, don't fix it.
Other than security flaws (which are usually handled by an OS/server patch, so they don't need code changes), there's no need to change an app just because a new version of the language has come out.
If I'm not mistaken, ColdFusion has had at least two new releases since we stopped using it for new code, but that hasn't affected our ColdFusion sites one bit.
Write CGI programs in FORTRAN 77. Should be pretty stable.
Firstly, it is possible to overstate the difficulty of maintaining web applications. In many cases, the changes to a language or platform are expansionary in nature rather than destructive. .NET, Python, etc. code from several years ago will still run, but new options are being added to make these tools more powerful for future applications. The cases where massive changes occur tend to be on the first or second iteration of a language, e.g. Rails 1 to Rails 2.
Secondly, the still-active development of web programming is something to be thankful for. It means that this is a part of the industry that will remain productive and exciting for years to come.
Traditional CGI is stable. It's not sexy, but if your OS continues to be able to run the same binaries or scripts, it's still going to work.
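As a concrete illustration of how little traditional CGI depends on: the server passes the request in environment variables and captures stdout, and that contract has not changed since the mid-90s. A minimal sketch in C:

    /* Minimal CGI sketch: read the query string from the environment,
     * write an HTTP header, a blank line, and then the body to stdout. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const char *query = getenv("QUERY_STRING"); /* text after '?' in the URL */

        printf("Content-Type: text/plain\r\n\r\n");
        printf("Hello from CGI.\n");
        printf("Query string: %s\n", query ? query : "(none)");
        return 0;
    }

Compile it, drop the binary into cgi-bin, and a CGI-capable server from 1995 or from today will run it unchanged.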
The only programming frameworks that stay stable are those that have been abandoned. A framework that stood still long enough would have no support for, say, AJAX or JSON or even XML.
You're not going to find what you're asking for. The best you can hope for is a mature framework with good support, like ASP.NET or JSP. And, as @Neil N said, don't keep upgrading unless there's a compelling business need.
The first web programming I ever did was writing Apache modules in C which communicated with a dBase database. I'm fairly sure that code would still run today (if the company I wrote it for still existed).
I do most of my current web-related programming in Perl, which is very stable and has an excellent track record for backwards compatibility. Most, if not all, code written for Perl 4 (released 21 March, 1991) should still run on the latest stable Perl (5.10) - although you might want to update it anyhow to take advantage of the last 18 years of improvements in both software development techniques and language features.
Consider the shearing layers. I've previously worked in large aerospace companies where the same Fortran back-end code and databases have had their front-ends evolve from the paper tape era through mainframe, client server and onto Intranet web sites.
On the outside, you will typically have CSS and XHTML templates, which can be changed to re-skin an application. These change quite rapidly in large organisations, as upper management seems to decide the bike shed should be a different colour every few weeks.
Typically you then have some logic to combine the templates with data from the back-end, and to forward user actions to the back-end. This shouldn't change that rapidly; it just translates the presentation into calls to the back-end. Expect to refresh this every few years, and rewrite it once a decade. We used Java for this, starting in the late 1990s. Some parts get changed faster than others, but it's not a big issue.
The back-end is usually stable (some of the aerodynamics code dated from the 1970s; the laws of physics don't change that often), and will outlast the web UI, as it has outlasted all the other UI paradigms. Fortran is forever.
Write your own web server in C then you don't have to worry about a web programming language.
(No, that's not a serious answer)
Have you seriously looked at what TDD, CI, pair programming, and a solid, rapid development framework (basically Django or Rails) can offer you as a developer vis-a-vis the way you write and design code? There are some really massive benefits that all of those pieces offer to the development process that make it almost a joy to be a programmer again. There are downsides, of course, but the upsides are all in support of the happiness and ease of development for the engineer, which leads to more productivity. In my book, that's a slam dunk win. And the result of my productivity and happiness has been solid products and great engineering.
YMMV, but if you are having the serious thoughts that you are (and I take them very seriously), I think it's worth investigating what those tools can offer. By taking the good and leaving the bad from the agile religion, plus some of the things I listed above, I've returned to find the joy in programming again this last year, after a good 5 years of a downhill slide in my happiness with this career. It's about finding what works for you. I can only help and lead the way by showing you what worked for me. I'd be more than happy to discuss this at length if you want to talk offline; I think this is a really important topic... it led me to consider a career change many times.
Java Servlets and JSPs have been in use for a decade or so, and they still work the same way like they did in '99. But honestly, can you imagine something uglier than a '90s web application without any rework done since?
The Python web framework web2py promises backward compatibility:
Always backward compatible. We have not broken backward compatibility since version 1.0 in 2007, and we pledge not to break it in the future.
And it supports Python versions from 2.4 to 2.7.
EDIT: I updated an important project two times, and every time there was a problem. Well, …
EDIT 2: It needs Python 2.6 to 2.7 now. There is no support for Python 3.
