I am currently taking a year off between high school and college (computer science).
I'm pretty good with Visual Basic (unfortunately, the only language my school offered). I've dabbled in some PHP and have a decent grasp of broad programming principles and concepts.
I'm more interested in web programming than conventional desktop programming, but I'd like to do both.
What are some good languages to pick up over the next eight months, and what are some good (tough but attainable) goals I should set for myself in that time frame?
Thanks!
If it's web programming you're after, there are three top contenders at the moment (in no particular order):
ASP.Net
Ruby on Rails
PHP
If you've been schooled on VB, then ASP.NET might offer the most familiar development environment, but all three are very marketable.
As far as personal development and goal-setting are concerned, and given you only have eight months to work with, I'd say you want to get intimate with the following concepts and how they work in practice:
unit testing (see the sketch after this list)
CSS
JavaScript
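If "unit testing" is new to you, here is a minimal sketch of what it looks like in practice, using Python's built-in unittest module (the slugify function and its test cases are invented for illustration; the same idea carries over to PHPUnit, NUnit, and friends):

    # Minimal unit-testing sketch: a small function plus tests that pin down
    # the behaviour we expect from it. Names here are hypothetical examples.
    import unittest

    def slugify(title):
        """Turn a page title into a URL-friendly slug."""
        return "-".join(title.lower().split())

    class SlugifyTests(unittest.TestCase):
        def test_spaces_become_hyphens(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_already_lowercase_is_unchanged(self):
            self.assertEqual(slugify("about"), "about")

    if __name__ == "__main__":
        unittest.main()

Run the file with Python and the framework reports which expectations pass or fail; that feedback loop is the habit worth building early.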
See if you can write your first practical application. This will set you up incredibly well for future employment if you can say you actually delivered something (grades are good, but delivery and experience are better).
If you really want to aim high, see if you can secure a casual part-time job at a software shop.
Ruby on Rails is pretty cool and easy to learn, especially if you are after web development.
AJAX can help you give your web pages some dynamic features.
I would suggest getting the book Agile Web Development with Rails; it will help you get started.
You may want to focus on the language that the college you will be attending teaches its introductory classes in. That way, you will have a bit of a head start on the class, giving you more time for your other classes.
I think most colleges currently start off in Java. You should be able to find that out with a bit of research.
If it is good grades and ease of programming in college you seek, learn the language of choice of the school you plan on attending. Most schools stick with one primary language for the introductory classes, as many universities teach conceptual programming. The most common languages for universities to teach right now are Java and C++, as both offer good cross-platform introductions to object-oriented concepts such as polymorphism and aggregation.
If you are attending a technical college or community college for an associate's degree, those are normally more applied and teach "how to program a website with PHP" or similar. In that case, you may want to focus on fundamentals such as how web applications work, how compilers work, and so on: things they won't teach you but that are valuable to know in the real world.
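To make "polymorphism" concrete, here is a rough sketch (in Python only for brevity; the shape classes are invented for illustration): two unrelated classes answer the same area() call in their own way, and the calling code doesn't need to care which one it holds.

    # Polymorphism sketch: different classes respond to the same method call.
    class Square:
        def __init__(self, side):
            self.side = side
        def area(self):
            return self.side ** 2

    class Circle:
        def __init__(self, radius):
            self.radius = radius
        def area(self):
            return 3.14159 * self.radius ** 2

    for shape in (Square(2), Circle(1)):
        print(type(shape).__name__, shape.area())
    # Square 4
    # Circle 3.14159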
If you want to parlay this knowledge into a job writing web applications, you must consider where you may work. Different industries have accepted different languages. Many young businesses and industries accept newer languages such as PHP, Ruby, etc. Some shops are purely Windows (there are a lot) and do much of their web apps in .NET. Then there are still a number of middleware-based solutions such as WebSphere, WebLogic, JBoss, etc. There are also some in-between things that are still web focused such as PeopleTools programming. You may also consider learning about web application scaling.
If I were you, I would focus on a primary skill you already possess, and nurture it so that you become highly skilled. You can't master everything, but being an expert in something makes you desirable.
Hope this helps.
Read some books.
The Pragmatic Programmer: From Journeyman to Master, Thomas and Hunt - to make your brain think in a pragmatic way, not in a PHP-or-other-technology way. PHP or Ruby will die; the knowledge from this book won't, as it's universal.
Apprenticeship Patterns, Hoover, Oshineye - to plan your career, get to know what's important, what to avoid and what to do to make yourself better.
Personally, I'd start looking at data structures and algorithms; they are the building blocks of good computer science knowledge. Implementing them exercises most of the features of any given programming language, so as you work through them in your chosen language, you'll get to grips with the language itself.
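As a hypothetical example of the kind of small exercise meant here, implementing even a simple structure such as a stack touches classes, methods, error handling, and iteration in whatever language you pick (this sketch happens to be Python; the names are illustrative, not from the answer):

    # A hand-rolled stack: a tiny data-structure exercise that still forces
    # you through the core features of the language you are learning.
    class Stack:
        def __init__(self):
            self._items = []

        def push(self, item):
            self._items.append(item)

        def pop(self):
            if not self._items:
                raise IndexError("pop from empty stack")
            return self._items.pop()

        def __len__(self):
            return len(self._items)

    s = Stack()
    for ch in "abc":
        s.push(ch)
    print(s.pop(), s.pop(), len(s))  # c b 1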
I heartily agree with Muad'Dib. Look at what language the course is using, and start using that language. If it's C++, you can get utilities like Cygwin for Windows, which give you a Unix-like environment to develop in without having to wipe your computer.
If it's Visual Studio stuff, there are the free Express editions from Microsoft, although they're a bit of a download.
Also, The Pragmatic Programmer is a MUST READ! It's full of great advice, and you're at the very best stage to start picking up good habits. Start doing that now, and you'll go far in the programming world.
Hope that helps.
When I was a kid I wrote hundreds of programs in BASIC but then as I got older I got out of it (when I discovered girls). Now I want to get back into it again and I don't want to let my prior knowledge & experience go to waste - is there a modern language that is at least somewhat similar? Every time I try to search I get pushed toward Visual BASIC but I would rather learn a modern language that's more widely used. Any suggestions? Thank you in advance!
Start from scratch.
Programming in a modern language (object-oriented or functional) is different enough from programming BASIC on a C64 that you will probably carry over more bad habits than good ones.
I would pick a language you like the look and feel of, but mostly think of what you want to do:
Java is probably the "safe" bet, especially if you want to start a career in programming or if you want to work on Android development.
If you want to program for Windows / Microsoft devices then C#
If you want to write for the Mac or iOS devices then Swift.
If you like the idea of functional programming then Clojure is a good bet.
If you want to do web development then JavaScript and maybe Ruby
If you want to work on things like machine learning or statistics then Python to start and then maybe R
If you want to be cutting edge and maybe work on some DevOps kind of things I would suggest Go
With all of these I would suggest also learning some flavor of SQL
Languages I personally would generally avoid either because they are overly complex or tend to teach bad programming practices:
Objective C, C++, Perl, Lisp, Ruby
If you want to explore some other more esoteric languages I recommend two books:
Seven Languages in Seven Weeks
Seven More Languages in Seven Weeks
Keep in mind that just because you might start from scratch doesn't mean your prior experience goes to waste; it just may not be as useful as you would like.
I was in this exact position about eight years ago; whilst I could do some assembly and BASIC, these skills were (and are) generally not required in a modern context. So I went to study a Foundation Degree in Enterprise Computing in the UK (MMU-affiliated) because it had Java. Due to a government change in 2010 that cut funding to higher education, the third year of that course was scrapped for all affiliated establishments, so I spent a year at the University of Derby on its Games Programming degree, which was all programming in C, C++, MIPS assembly, C#, and Java.
I found the following useful:
6502 is good if you want to learn more modern assembly like MIPS; Z80 is probably good preparation for x86/64, though that is an educated guess rather than fact (I use both 65x and Z80 in personal projects today, mixed with C, when I get the chance);
C is the most beautiful language that I've ever used. I did C programming on Windows and for the PSP. I've since made Sinclair ZX81 games with C and done a bit of experimental programming for the Commodore 64 and Sinclair ZX Spectrum. I love C;
Object-oriented programming took me a while to get my head around. At first, I thought an object was simply a container for computer RAM. Maybe that is a good way to start thinking about it, maybe not;
Going to University is a good thing because you will always learn something if you apply yourself;
8-bit BASIC can still teach you a thing or two if you can transpose your logic without the bad practices that 8-bit BASIC encourages;
I had the most difficulty understanding databases, mostly the relational algebra side but also all the other database stuff. I finally got my head around M:M (many-to-many) relationships sometime last year, after years of looking at them; see the sketch just below this list. If you struggle with SQL/database stuff, don't give up;
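For anyone stuck on the same point, here is a minimal, hypothetical sketch of a many-to-many relationship using Python's built-in sqlite3 module: students and courses are linked only through a join table, which is the whole trick behind M:M (all table and column names are invented for illustration).

    # Many-to-many sketch: the enrolment join table lets any student take any
    # number of courses and any course have any number of students.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE course  (id INTEGER PRIMARY KEY, title TEXT);
        CREATE TABLE enrolment (
            student_id INTEGER REFERENCES student(id),
            course_id  INTEGER REFERENCES course(id),
            PRIMARY KEY (student_id, course_id)
        );
    """)
    conn.execute("INSERT INTO student VALUES (1, 'Ada'), (2, 'Grace')")
    conn.execute("INSERT INTO course  VALUES (1, 'Databases'), (2, 'Compilers')")
    conn.execute("INSERT INTO enrolment VALUES (1, 1), (1, 2), (2, 1)")

    rows = conn.execute("""
        SELECT student.name, course.title
        FROM student
        JOIN enrolment ON enrolment.student_id = student.id
        JOIN course    ON course.id = enrolment.course_id
        ORDER BY student.name, course.title
    """).fetchall()
    print(rows)
    # [('Ada', 'Compilers'), ('Ada', 'Databases'), ('Grace', 'Databases')]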
I now work as a PHP web application developer with bespoke OO and procedural frameworks. I have also worked with simpler off-the-shelf solutions such as Magento, CodeIgniter, Joomla! and ExpressionEngine (built on CodeIgniter).
I'm a second-year computer science student, and am currently applying for jobs for the Summer '11 co-op term; this will be my first.
I have found that a lot of the jobs' 'perfect' candidates have many technical skills I do not possess.
For example, one which I have an interview Monday for uses Java Enterprise Edition, but I've only used regular Java. There are also many things I have no experience with whatsoever, like XML, Adobe Flex, and Ruby.
Obviously the employer is not expecting a potential employee to have all of these skills, but I was wondering how difficult it would be to pick them up?
I have a strong knowledge of C and Java, and of many concepts and data structures.
With this in mind, is it difficult to pick up languages like Ruby or less related tech like XML or AJAX for example?
Or, if I have a good background in comp sci, do the concepts apply broadly, so that it's mostly just the syntax and rudimentary details I will need to pick up to get started?
Please if you have any advice, feel free to share.
Thanks for all your help!
PS: I noticed that most people seem to call Java Enterprise 'J2EE'. Isn't it now just 'Java EE'? What's up with that?
Once you've mastered one programming language, you'll easily catch up on a whole family of related languages. The differences are more than just syntax, but there is more or less a fixed set of features that most of the popular languages provide.
A good way to gather programming experience would be to become very proficient in at least one language. I don't know what your experience with C is, but unless it's over 3 years it won't take you much time to get to the same level in any other language, and it won't be too hard anyway. If you ask me, knowing C is a "must" because you learn a little about how the computer executes your programs. Some of the higher-level languages have been (initially) implemented in C themselves.
I'm not a Ruby programmer, but I'm guessing it's not a big deal if you already know some C and Java; it's probably even easier to learn than those two. XML is just a data format, and AJAX is a buzzword for "doing stuff dynamically in the browser-based client side of a web application". So there isn't much "tech" to learn here; it's just a matter of knowing and mixing skills to get that kind of thing done. (Basically, you'll need to learn at least a little bit about JavaScript, HTML, and HTTP, and know how browsers and web servers work.)
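To back up the "XML is just a data format" point, here is a tiny sketch using Python's standard library (the document itself is made up for illustration); there really isn't much more to it than reading structured text:

    # Parsing a small XML document with the standard library: XML is just a
    # nested data format, not a technology you need months to learn.
    import xml.etree.ElementTree as ET

    doc = """
    <courses>
        <course code="CS201"><title>Data Structures</title></course>
        <course code="CS305"><title>Operating Systems</title></course>
    </courses>
    """

    root = ET.fromstring(doc)
    for course in root.findall("course"):
        print(course.get("code"), "-", course.findtext("title"))
    # CS201 - Data Structures
    # CS305 - Operating Systems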
Computer science is a branch of mathematics, and as such it will always be relevant. Learning about specific algorithms, data structures, etc. is important but more important (from a practical point of view) is the knowledge of how algorithms or data structures perform, how they can be analyzed, what affects their performance, etc. If you got the basics, you can always open a book by yourself and learn more algorithms and data structures, and it's probably a good idea.
Finally, a useful set of things to know in today's software world: networking (especially TCP/IP and HTTP), C, Java and/or C#, minimal knowledge of JavaScript, and a little experience with programmatic access to XML.
I'm currently taking a year off between high school and college. I'm working as a junior IT technician, so I'm getting plenty of experience on the hardware side of things. I want to use this year off to also get started on some programming; I have experience in Visual Basic from high school courses, but I want to further my learning before going to school. Now, obviously I will not be able to become overly proficient in all of these, but these are the languages that I plan on learning over the course of the next few years:
PHP
Ruby (on Rails)
Python
(Objective-)C or another C variant (I'll research my college program and see which C they use, and learn that)
Java
Lisp
Will being proficient in these languages give me a good base to work from? I tried to pick a selection of languages that seem to offer good employability, ability to develop on a number of platforms (desktop, web, mobile), and ones that are currently popular and sought-after.
Am I missing anything? Does anyone see anything important that I've missed, things I've picked that are a waste of time, or otherwise?
Thanks a lot guys.
I don't know if you've read The Pragmatic Programmer (a great read), but there's a section in there on expanding your knowledge portfolio where they suggest learning one language a year, and I must confess I agree with them.
So I would work out what you want to write, and then pick the language that fulfills those requirements. As you're going to college, I would also consider what language the course you're attending is going to be teaching; I'd most certainly put my effort into learning that language.
I'd also recommend (assuming they're going to let you use Linux) learning a scripting language such as Bash, and learning to create make or build files in your chosen environment; it's a heck of a lot easier than remembering compiler options.
Python is good to start with, and then do Java. That would be enough for a starter, in my opinion.
If you are thinking of learning PHP instead, then learn MySQL too. Moreover, you have missed databases entirely. :(
At least learn MySQL or MS SQL, whichever you choose.
Are there any good reasons to learn languages such as Ada and COBOL? Is there any future in programming in those languages? I'm interested in them and am currently learning them just for fun.
It's always worthwhile learning new languages. Even if they're never useful to you professionally, chances are they'll teach you something about programming you didn't know before, or at the very least broaden your outlook.
As for prospects, from a quick bit of reading around it seems Ada is still somewhat in favour for critical systems in the aviation industry, and COBOL still has its place in business. I know an engineer in his mid-20s who writes all his code in Fortran 77, as that's what his industry wants!
While the number of employers looking for these languages might be low, because there are a limited number of people who know them, the salary for developers who specialise in them can be quite high. When mission-critical apps developed in them could cost millions to replace, having to pay more than usual for a coder to maintain the existing system is easily accepted.
Ada is used in the aerospace/defense industry. COBOL is used in the financial industry. Fortran is used in engineering. The question "is there any future" is borderline subjective/argumentative since all of those languages are still in active use.
Fortran is old, but is used in scientific programming. Ada is the basis for VHDL, a very important language in electrical engineering. You could also say that C is "old", and it's used pretty much everywhere.
Cobol and Algol are both still in widespread use. You won't find them running at your latest and greatest tech firms, but you can bet your car insurance company processes claims on them. Your health insurance company most certainly uses them. Reports of Cobol's death have been greatly exaggerated.
You will have difficulty finding colleges and places that will actually teach you Cobol or Algol, so finding developers for these so-called dead languages is getting harder and harder. It's very tough to tell a kid coming out of high school, who has been programming in Java, iOS, and Perl for half his life, that Cobol is where the money is.
Cobol/Algol developers are becoming harder and harder to come by, so if you have one of those languages in your back pocket, it is only going to help you out. Algol is, in my opinion, a lot harder of a language to get good at; you can teach anyone with half a brain how to program in Cobol.
These languages are not going away any time soon at all. As long as companies like IBM and Unisys provide compilers for them on the mainframes, they will continue to thrive. So grab a book and an open source compiler and brush up. Plenty of people out there looking for Cobol/Algol developers.
Many of these 'old' languages are actively in use today. Lisp, for instance, is gaining popularity again in the form of Clojure. Smalltalk is becoming popular again with the Seaside MVC framework.
In addition, many of the hottest development languages borrow heavily from Lisp and Smalltalk, both of which pioneered object-oriented methodologies long before C++ came along. JavaScript, Ruby, Perl 6, and Perl 5's Moose (object system) all use mixins, which were first used in Lisp and Smalltalk. Metaclasses, first used in Common Lisp and Smalltalk-80, are making a resurgence in Perl 5 Moose, Objective-C (iPhone development), Python, and Groovy.
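If "mixin" is an unfamiliar term, here is a rough sketch of the idea in Python (the class names are invented for illustration): a small class that exists only to add a capability to whatever class it is mixed into.

    # Mixin sketch: JsonMixin isn't meant to stand alone; it just grafts a
    # to_json() capability onto any class that inherits from it.
    import json

    class JsonMixin:
        def to_json(self):
            return json.dumps(self.__dict__)

    class Point(JsonMixin):
        def __init__(self, x, y):
            self.x = x
            self.y = y

    print(Point(2, 3).to_json())  # {"x": 2, "y": 3}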
Much like learning Latin, it can be intriguing to understand where and how many English and other current languages' words had their roots. Also, if you know Latin and valuable new books/papers/scrolls are found that need translation, you suddenly become valuable too.
Honestly, I'd say learning them is great for a historical perspective, especially if you're a language designer, but not very much else.
There are roles out there for COBOL programmers, but in general they are looking for experienced developers. From what I have seen, you are unlikely to get a first programming role in COBOL; in general, they are looking for people experienced with similar application domains and who are used to building an understanding of legacy systems. Knowing the limitations of the language can be useful for understanding why certain things you ask for when connecting to mainframes are considered difficult or problematic.
Have you ever tried learning a language while on a project? I have, and from my personal experience I can say that it takes courage, effort, time, thinking, lots of caffeine, and no sleep. Sometimes this has to be done out of necessity; other times you choose to do it, if you are working on a personal project, for example.
What I normally do in this kind of situation, and I believe everyone does, is "build" on top of my current knowledge of languages, structures, syntax, and logic. What I find difficult to cope with is how much this differs from language to language. Some languages offer a good background for future learning and "language study"; they serve as a good source of information or a frame of reference and can give a "firm" grasp of what's to come. Other languages introduce a whole new way of thinking and are harder to get used to.
Sometimes you unintentionally think in a specific language, and being introduced to a new way of thinking, a new language, can cause confusion or make you get lost between the "borders" of your new and your existing knowledge of languages.
What is a good solution in this case? What can be used to broaden your knowledge of the new language, and its new way of thinking, while maintaining or incorporating your existing knowledge of other languages within the "borders" of the new one?
I find I need to do a project to properly learn a language, but those can be personal projects. When I learned Python on the job, I initially expected (and found) a significant slowdown in my productivity for a while. I read the standard tutorials and coding standards, and I lurked on the Python mailing list for a while, which gave me a much better idea of the best practices of the language.
Doing things like coding dojos when learning a language can help you get a feel for it. I recently changed jobs and went back to Java, and I spent some time working on toy programs just to get back into the feel of things (I'm also reading Effective Java, 2nd Edition, as my previous major experience had been with Java 1.4).
I think, in some respects, no matter what the impetus for learning the language, you have to start by imitating good patterns in the new language. Whether that means finding a good book with excellent code examples, good online tutorials, or following the lead of a more experienced developer, you have to absorb what it means to write good code in a particular language first. Once you have developed a level of comfort, you can start branching out and experimenting with alternatives to the patterns that you've learned, looking for ways to apply things you've learned from other languages, but keeping within the "rules" of the language. Eventually, you'll get to the point where you know you can "break the rules" that you learned earlier, because you have enough experience to know when they do and don't apply.
My personal preference, even when forced to learn a new language, is to start with some throwaway code. Even starting from good tutorials, you'll undoubtedly write code that you will later look back on and wonder how you could have written something so bad. If possible, I prefer my first foray into a language to be code that will be thrown away and not come back to haunt me later. The alternative is to spend a lot of time refactoring as you learn more and more; eventually, you'll end up doing this too.
I would like to mention ALT.NET here
Self-organizing, ad-hoc community of developers bound by a desire to improve ourselves, challenge assumptions, and help each other pursue excellence in the practice of software development.
So in the spirit of ALT.NET, it is challenging but useful to reach out of your comfort zone to learn new languages. Some things that really helped me are as follows:
Understand the history behind a language or script. Knowing evolution helps a lot.
Pick the right book. Research StackOverflow and Amazon.com to find the right book to help you ease the growing pains.
OOP is fairly common in most of the mature languages, so you can skip many of the chapters related to OOP in many books. Syntax learning will be a gradual process. I commonly bookmark some quick handy guides for that.
Read as many community forums as possible to understand the common pitfalls of the new language.
Attend some local meetups to interact with the community and share your pains.
Take one pitch at a time by building small, not-so-complicated applications, thereby gaining momentum.
Make sure you create a frame of reference for what you need to learn: things like how security, logging, and multithreading are handled.
Be open-minded; you can be critical, but if you hate something, then don't learn that language.
Finally, I think it is worthwhile to learn one strong language like C# or Java, one functional language, and one scripting language like Ruby or Python.
These things helped me tremendously and I think will help all software engineers and architects to really gear for any development environment.
I learned PHP after I was hired to be the project lead on the Zend Framework project.
It helped that I had 20 years of professional programming background, and good knowledge of C, Java, Perl, JavaScript, SQL, etc. I've also gravitated towards dynamic scripting languages for most of my career. I've written applications in awk, frameworks in shell, macro packages in troff, I even wrote a forum using only sed.
Things to help learn a language on the job:
Reading code and documentation.
Listening to mailing lists and blogs of the community.
Talking to experts in the language; fortunately, several of them were my immediate teammates.
Writing practice code and asking for code reviews and coaching (Zend_Console_Getopt was my first significant PHP contribution).
Learning the tools that go along with the language. PHPUnit, Xdebug, phpDoc, phing, etc.
Of course I did apply what I knew from other programming languages. Many computer science concepts are language-universal. The differences of a given language are often simply idiomatic, a way of stating something that can be done another way in another language. This is especially true for languages like Perl or PHP, which both borrow a lot of idioms from earlier languages.
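As a hypothetical illustration of "simply idiomatic": the two loops below compute the same thing, but the second is how a Python programmer would naturally say it, while the first reads like a transliteration from C.

    # Same result, two idioms: the first carries a C accent, the second is
    # the native Python way of stating it.
    words = ["php", "perl", "python"]

    # Transliterated, index-driven version:
    upper = []
    for i in range(len(words)):
        upper.append(words[i].upper())

    # Idiomatic version saying the same thing:
    upper = [w.upper() for w in words]

    print(upper)  # ['PHP', 'PERL', 'PYTHON']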
It also helped that I took courses in Compiler Design in college. Having a good foundation in how languages are constructed makes it easier to pick up new languages. At some level, they're all just ways of abstracting runtime stacks and object references.
If you're a junior member of the team and don't know the language, this is not necessarily an issue at all. As long as there is some code review and supervision, you can be productive.
Language syntax is one issue, but architectural differences are a more important concern. Many languages are also development platforms, and if you don't have experience with the platform, you don't know how to create a viable solution architecture. So if you're the project lead or working solo, you'd better have some experience on the platform before you do your design work.
For example, I would say an experienced C# coder with no VB experience would probably survive a VB.NET project just fine. In fact, it would be more difficult for a developer who only had experience in C#/ASP.NET to complete a C# WPF project than a VB ASP.NET project. An experienced PHP developer might hesitate a bit on a ColdFusion project, but they probably won't make any serious blunders because they are familiar with a script based web development architecture.
Many concepts, such as object modelling and database query strategies, translate just fine between languages. But there is always a learning curve for a new platform, and sometimes it can be quite nasty. The worst case is that the project must be thrown out because the architecture is too wrong to refactor.
I like to learn a new language while working on a project, because a real project will usually force me to learn aspects of the language that I might otherwise skip. One of the first things I like to do is read code in that language, and then jump in. I find resources (such as books and various internet sites) to help as I go along.
Then, after I've been working on it for a while, I like to read (or re-read) books or other resources on the language. By this time I have some knowledge, so this will help solidify some things and also point out areas where I am flat-out wrong in my understanding. For instance, I can see that I was making incorrect assumptions about similarities between languages.
This also applies to tools -- after using a tool for a while and learning the basics, reading (or skimming) the documentation can teach me a lot.
In my opinion, you should try to avoid that. I know, most of the time you can't, but in any case try not to mix the new language with the old one, and never add old habits, practices, and patterns to the mixture.
Always try to find resources that will help you learn the new language the way that language works, not the way other languages do; the latter will never have a happy ending, and even if it does, it will be very hard to rework things the right way later.
Cheers.
Yes I have.
I mean, is there another way? The only language I ever learned that was not on a project was ABC basic, which was what you used on my first computer.
I would recommend that if you start with a certain language, you stick with it. I only say that because many times in the past I tried more and more different ones, and the one I started out with was the best :D
Every time I have to (or want to) learn a new language, I force myself to find something to code.
But to be sure I did it well, I always want to be able to check my code and what it outputs.
To do so, I just try to do the same kind of thing in languages I already know and compare the outputs. For that, I created a little project (hosted on GitHub) with an exercise sheet and the solutions for every language I've learnt. It's a good way to learn, in my opinion, because it gives you a real little project.
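As a hypothetical example of the kind of exercise-sheet entry described above: implement one small, well-defined task (FizzBuzz here, sketched in Python) in every new language, then compare its output against the version written in a language you already know.

    # A small, language-agnostic exercise whose output is easy to diff
    # against the same program written in another language.
    def fizzbuzz(n):
        out = []
        for i in range(1, n + 1):
            if i % 15 == 0:
                out.append("FizzBuzz")
            elif i % 3 == 0:
                out.append("Fizz")
            elif i % 5 == 0:
                out.append("Buzz")
            else:
                out.append(str(i))
        return out

    print(" ".join(fizzbuzz(15)))
    # 1 2 Fizz 4 Buzz Fizz 7 8 Fizz Buzz 11 Fizz 13 14 FizzBuzz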