I'm interested to know the techniques that were used to discover vulnerabilities. I know the theory about buffer overflows, format string exploits, etc., and I have also written some of them. But I still don't understand how to find a vulnerability in an efficient way.
I'm not looking for a magic wand, only for the most common techniques. Reading the whole source is an epic task for most projects, assuming you even have access to the source, and fuzzing the input manually isn't very comfortable either. So I'm wondering about tools that help.
E.g.
I don't understand how the dev teams manage to find the vulnerabilities used to jailbreak iPhones so fast. They don't have source code, they can't execute programs, and since there is only a small number of default programs, I wouldn't expect a large number of security holes. So how do they find this kind of vulnerability so quickly?
Thank you in advance.
On the lower layers, manually examining memory can be very revealing. You can certainly view memory with a tool like Visual Studio, and I would imagine that someone has even written a tool to crudely reconstruct an application based on the instructions it executes and the data structures it places into memory.
On the web, I have found many sequence-related exploits by simply reversing the order in which an operation occurs (for example, an online transaction). Because the server is stateful but the client is stateless, you can rapidly exploit a poorly-designed process by emulating a different sequence.
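To make that concrete, here is a minimal sketch (Python with the requests library; the shop URL, endpoint names and fields are all invented for illustration) of probing whether a multi-step process actually enforces its sequence server-side:

    import requests

    BASE = "https://shop.example.com"          # hypothetical target
    session = requests.Session()

    # Intended order: start -> pay -> confirm. We skip 'pay' and call
    # 'confirm' directly to see whether the server enforces the sequence.
    start = session.post(f"{BASE}/checkout/start", json={"item": 42})
    order_id = start.json().get("order_id")

    confirm = session.post(f"{BASE}/checkout/confirm", json={"order_id": order_id})

    if confirm.status_code == 200:
        print("'confirm' accepted without 'pay': the state machine is not enforced")
    else:
        print("out-of-order step rejected with status", confirm.status_code)

If the out-of-order call succeeds, the process is trusting the client to drive the sequence, which is exactly the class of flaw described above.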
As to the speed of discovery: I think quantity often trumps brilliance...put a piece of software, even a good one, in the hands of a million bored/curious/motivated people, and vulnerabilities are bound to be discovered. There is a tremendous rush to get products out the door.
There is no efficient way to do this, as firms spend a good deal of money to produce and maintain secure software. Ideally, their work in securing software does not start with looking for vulnerabilities in the finished product; many vulnerabilities have already been eradicated by the time the software ships.
Back to your question: it will depend on what you have (working binaries, complete/partial source code, etc.). On the other hand, it is not about finding ANY vulnerability, but the ones that count (e.g., those that matter to the client of the audit, or to the software owner). Right?
This will help you understand the inputs and functions you need to worry about. Once you have localized these, you may already have a feeling for the software's quality: if it isn't very good, then fuzzing will probably find you some bugs. Otherwise, you need to start understanding these functions and how the input is used within the code, to work out whether the code can be subverted in any way.
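On the fuzzing point: even a crude mutation fuzzer can shake bugs out of weak input handling. Below is a rough sketch in Python, assuming a hypothetical local binary ./target that takes an input file as its first argument, and a known-good seed.bin; both names are placeholders.

    import os
    import random
    import subprocess
    import tempfile

    SEED_FILE = "seed.bin"   # a known-good input for the target (placeholder)
    TARGET = "./target"      # the program under test (placeholder)

    seed = open(SEED_FILE, "rb").read()

    for i in range(10000):
        data = bytearray(seed)
        # flip a handful of random bytes in the seed input
        for _ in range(random.randint(1, 8)):
            data[random.randrange(len(data))] = random.randrange(256)

        with tempfile.NamedTemporaryFile(delete=False) as tmp:
            tmp.write(data)
            path = tmp.name

        try:
            result = subprocess.run([TARGET, path], capture_output=True, timeout=5)
        except subprocess.TimeoutExpired:
            os.unlink(path)          # treat hangs as uninteresting in this sketch
            continue

        if result.returncode < 0:    # negative return code: killed by a signal (crash)
            os.rename(path, f"crash_{i}.bin")
            print("crash: signal", -result.returncode, "input saved as", f"crash_{i}.bin")
        else:
            os.unlink(path)

Real fuzzers (AFL, libFuzzer and friends) add coverage feedback and smarter mutations, but the basic loop is the same: mutate, run, watch for crashes.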
Some experience will help you weigh how much effort to put into each task and when to push further. For example, if you see bad practices being used, delve deeper. If you see crypto being implemented from scratch, delve deeper. Etc.
Aside from buffer overflows and format string exploits, you may want to read a bit on code injection (a lot of what you'll come across will be web/DB related, but dig deeper). AFAIK this was a huge force in jailbreaking the iThingies. Saurik's MobileSubstrate allows (allowed?) you to load third-party .dylibs and call any code contained in those.
Hypothetical situation:
Let's say I have built a software product from scratch and it does wonderful things. The only problem is that once someone takes a look at the code, they will understand it very easily and can easily build it themselves.
Now, the thing is that I built the code from scratch, 100% by myself, and it uses a mixture of API calls.
Nobody else is involved in the development of the code.
If I want to sell this product, what guarantee is there that someone much smarter than me won't reverse engineer the whole thing and come up with a better product?
Right now I am thinking of fragmenting the whole code. Adding lots of redundant code and tonnes of comments.
Is there any software that encrypts program code, making debugging, troubleshooting, and understanding how the code works virtually impossible, yet leaves it running as usual, so that the developer can have peace of mind?
Very few things in a program are truly novel. Almost everything that you are likely to put into your code, someone else could invent on their own. Generally more easily than they could learn it by reading your code. Reading code is harder than writing it, and most programmers don't really like doing it anyway.
So it's much more likely that they will look at your app and think "I could do that", than "That's cool, I'm gonna read that code and then copy it!". Even if they understand it, you still own the copyright, and you still get to market first.
I recommend that you just forget about it.
once someone takes a look at the code, they will understand it very easily and they can easily build it up themselves.
So don't give anybody the source code.
If I want to sell this product, what guarantee is there that someone much smarter than me won't reverse engineer the whole thing and come up with a better product?
(a) So start selling it now and capture the market. Reverse engineering takes time, during which you are capturing market and 'mind-share'. (b) Put a provision in your licence agreement that prohibits reverse-engineering. (c) Make sure everybody who gets the product signs the agreement.
Right now I am thinking of fragmenting the whole code. Adding lots of redundant code and tonnes of comments.
That only has a point if you're going to distribute the source code. In which case nobody even needs to reverse-engineer. They have your source code. Don't give it to them.
Is there any software ...
There's lots of software that purports to do this job. However, it is a technical solution to a business problem. All software can be reverse-engineered, because at some point or other it all has to be decrypted and de-obfuscated to the point where the CPU can understand it. At that point it is essentially plaintext. So, formally speaking, no technical solution is possible (short of something like code that executes in a tamper-proof HSM).
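As a small illustration of that point in an interpreted-language setting (Python here; the "license check" function is just an invented stand-in for whatever logic one might want to hide), the bytecode alone already reads like a recipe:

    import dis

    def check_license(key: str) -> bool:
        # the "secret" algorithm we would like to keep hidden
        return sum(ord(c) for c in key) % 97 == 0

    # Anyone holding the compiled .pyc, or attached to the running process,
    # can dump the bytecode and read the logic straight off:
    dis.dis(check_license)

Native code takes more effort with a disassembler, but the principle is the same: whatever the CPU (or VM) can execute, a determined person can read.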
I will add that there is another business mechanism you can use to defend against business loss, which is what this is all about: price. Make the price so high that the licensees will value their copy and not permit it to be inspected, or make it so low that reverse-engineering is cost-infeasible; or make it free and make your money on the support contract.
Once you actually have the knowledge and experience to write such a codebase, it will be clear to you that obfuscation is meant to deter casual IP infringement.
Someone who wants to know your code is going to know your code.
If it becomes an issue of monetary loss, the courts are your protection.
That's how it works.
Someone will always be able to understand and work out your code. Heck, even if there were no way to get at the code, just using the system is enough for someone to be able to replicate the process.
Example: I take a jug of water and pour it into a cup, with my back facing another person. This other person knows that water and gravity are great at making things fall into other containers, so they can work out a process of lifting a jug to let gravity (the API call) work in their favour. They might not know exactly what angle you held your forearm at, or any super-sneaky cup-holding techniques you used, but they can replicate the same process and improve on it over time.
tl;dr: You can't protect code.
The thing to do is invent even more wonderful things while the competition is reverse-engineering your current stuff. It's called competing through innovation.
I am not a lawyer
If you are really worried about it, to the point that you are willing to invest money in it, don't protect your code (beyond something reasonable like obfuscation or encryption) but rather patent your idea and your art. Then if someone does take it, reverse engineer it and make a better process based off yours, you have legal grounds to get your money.
There are tons of things you will have to do, including proving that they took your idea (which isn't easy), but if this is the solution to world hunger and all of humanity's problems, it's the thing to do.
Now for the downside: I will guess, and probably be 90% right, that your method is:
Not patentable, for various reasons (I was amazed at the number of already patented ideas, and how difficult it was to identify original art)
Not new, or unique (i.e. there is already established art for it)
Not worth patenting because the expense far outweighs the benefits
An IP lawyer can tell you for sure, and the expense of a consultation is not that much. Overall it will be cheaper to consult with them than to invest a lot of time in hiding code.
Good luck.
Don't even bother. If your code really "does wonderful things", be assured that it'll get hacked, even if only out of curiosity.
There is no 100% way to protect your code from reverse engineering. What language are we talking about? If it is C/C++, then it is pretty hard to reverse engineer, and you can additionally strip it of debugging information etc. But if it is, for example, Java, then even if you obfuscate the code there are some pretty good tools (like JAD) that will reveal much of your work anyway.
Despite all of this, I think you should try to change your attitude. Big companies pay a lot of money for simple solutions, and it seems that nowadays service is the most important thing, not the software (hence the success of companies built on open-source software). So if you have great software, don't be scared that someone might steal it; rather, think about how to sell it well.
Is there any software that encrypts program code, making debugging, troubleshooting, and understanding how the code works virtually impossible, yet leaves it running as usual, so that the developer can have peace of mind?
This is totally the wrong mindset, IMO. What happens if you get hit by a bus? Your company goes bankrupt? All your data gets destroyed in a fire? For every single one of your customers, the value of their investment in your software will drop, and eventually reach zero, because the software can't be developed, or troubleshot, any further without you. I have seen so much money wasted that way; I think it's a horrible business model.
I earn my bread making software myself, so I know the hardships of making a living with it. Still, obfuscation can't be the way to go nowadays. Impose strict license agreements on your customers, scare the hell out of them so they don't even think about redistributing the software, but leave it open.
This is futile. There is always someone smarter than you and therefore they will be able to reverse engineer your obfuscation.
Usually someone smart enough to hack your code and use it in a meaningful way is smart enough to do it on their own, and probably thinks they can do it better than you did, so they won't bother stealing your stuff.
Don't worry about the people who can hack your code but not make meaningful use of it. If you've done a good job, this can only reinforce the quality of the job you've done (think of all the crappy touchscreen phone imitators).
They are going to reverse-engineer your code. Nothing can stop them. The only thing you can do is make it harder. This ranges from obfuscating code that is inherently "open", such as PHP and JavaScript, all the way down to littering your code with loads of self-modification.
In a lot of ways, I think, the thing that makes a piece of software valuable is not the crazy technological advancement it provides, but rather the things we might think of as tertiary to the software itself. Like the fact that you'll be there to support it. Or that it's provided as a web service and you'll be there to make sure the server is running. Or that it's a community, and you'll be there to moderate and build the community.
While you may actually be selling code, the value that your code has isn't intrinsic to the code itself, but rather derives from the features and ecosystem that surround it.
It's always said that the more you program, the better you become. Sounds good and true.
But I was wondering if there is a proven route to becoming a better programmer.
Something like:
Learn a
Learn b
Learn c > 'Now you are good to burn the engines'
Try stuff out based on your learning.
The answer might be similar to a CS course roadmap, but I want to hear from successful programmers who might want to pitch in with something notable.
Thanks
It's not true that practice makes perfect.
It's perfect practice that makes perfect.
If all you do is keep repeating the same bad practices again and again, you'll only make it possible to create bad code faster.
By all means keep coding. But at the same time be critical of everything you do. Always have a jaundiced eye that looks for ways to do things better. Read widely to get new ideas. Talk to others about how they do things. Look at other people's code, good and bad.
There's no "sure" way to learn anything that I know of. If there was, anyone could master this.
All questions are rhetorical and meant to stimulate thought.
Technical parts:
Design Patterns - There are probably some specific to a domain but generally these are useful ways of starting parts of an application. Do you know MVC or MVP?
Basic algorithm starting points - Divide and conquer, dynamic programming, recursion, creating special data types like a heap, being greedy, etc. (a small sketch follows this list)
Problem solving skills - How easily can you jump in and find where a bug is? Can you think of multiple solutions to the problem?
Abstract modelling - How well can you picture things in your head in terms of code or classes when someone is describing a problem?
High level versus low level - How well do you understand when someone wants something high- or low-level? This is just something I'd toss out there, as these terms get thrown around a lot, like a high-level view of something or a low-level language.
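A tiny sketch of the "algorithm starting points" item above, in Python (illustrative only): dynamic programming is often just recursion plus a cache, and a heap is a ready-made data type you reach for instead of repeatedly sorting.

    import heapq
    from functools import lru_cache

    # Dynamic programming via memoisation: each subproblem is solved once.
    @lru_cache(maxsize=None)
    def fib(n: int) -> int:
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    # A heap keeps the k smallest items without sorting the whole input.
    def k_smallest(items, k):
        return heapq.nsmallest(k, items)

    print(fib(90))
    print(k_smallest([9, 4, 7, 1, 8, 3], 3))   # [1, 3, 4]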
Process parts:
Agile - Do you know Scrum, XP, and other new approaches to managing software projects? How about principles like YAGNI, DRY and KISS? Or principles like SOLID? Ideas like Broken Windows?
Developer Environment - How well do you know the IDE you use? Source control? Continuous integration? Do you know the bottlenecks on your machine in terms of being productive?
xDD - Do you know of TDD, BDD, and the other X-driven development approaches?
Refactoring - Do you go back over your old code and make it better or do you tend to write once and then abandon your code?
Soft skills:
Emotional Intelligence - Can be useful for presentations and working with others mostly.
Passions/Motivation - Do you know what gets your juices flowing and just kick butt in terms of being productive? Do you know what you would like to do for many many years?
My main piece of advice would be: don't be afraid to rewrite your own code. Look at stuff you wrote even a month ago and you will see flaws and want to rewrite stuff.
Make sure that you understand some fundamentals: collections, equality, hashcodes etc. These are useful across pretty much all modern languages.
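For example, here is what that equality/hash contract looks like in Python (the same rule appears as equals/hashCode in Java or Equals/GetHashCode in .NET): if two objects compare equal, they must hash equal, or hashed collections quietly misbehave.

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

        def __eq__(self, other):
            return isinstance(other, Point) and (self.x, self.y) == (other.x, other.y)

        # Equal objects must hash equally, or sets and dict keys misbehave.
        def __hash__(self):
            return hash((self.x, self.y))

    assert Point(1, 2) == Point(1, 2)
    assert len({Point(1, 2), Point(1, 2)}) == 1   # duplicates collapse as expected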
Depending on the language you use, run lint and metric tools over your code. Not all of their suggestions will be applicable, but learning which matter and which don't is important. E.g. FindBugs, PMD etc. for Java.
Above all refine and keep refining your work. Don't treat your work as abandonware!
Learn your 1st programming language, a new programming paradigm, or find a mentor you can learn from
Apply what you've learnt in a real world project
Learn from your mistakes and successes and goto step one
The trick is knowing what to learn first:
Programming languages - this is the place to start, because you cannot write software without knowing at least one of these. After you've mastered one language, try learning another.
Programming paradigm - i.e. object oriented, dynamic/functional programming etc. Try to learn a new one with each new language.
Design concepts - S.O.L.I.D, design patterns as well as architectural concepts.
People skills - learn to communicate your ideas.
Team leadership - learn how to sway others and how to become a team or technological lead.
After that the sky is the limit.
I would look at improving roughly in this order, in iterations with each building on the previous one:
Programming concepts. Understand things like memory management, pointers, stacks, variable scope, etc.
Languages. Work on mastering several modern languages.
Design concepts. Learn about design patterns. Practice using them.
Communication. Often-overlooked. You can only become a highly valued Software Engineer if you can communicate effectively with non-tech people. Learn to listen and understand the needs that people are expressing, translate that into a set of requirements and a technical design, but then explain what you understood (and designed) back to them, in terms they can understand, for validation before you code. This is not an easy one to master, but it is essential.
Architectural concepts. Learn to understand the big picture of large, complex systems.
Learning a programming language is in many ways similar to learning a spoken language. The only way to get good at it is to do it as often as possible. In other words:
Practice, practice, read and then practice more
Take time to learn about all sorts of coding techniques, tools and programming wisdom. This I have found to be crucial to my development. It's too easy to just code away and feel productive. Think about what could be if you just had some more knowledge/weaponry under your belt to bang out that next widget.
Knowledge/know-how is our real currency. The more we know, the better the decisions we make about how something should be done, and the faster we can do it.
For example, learn about:
•Development Practices, Software Design, Estimation, Methodologies, Business Analysis, Database Design (there are a lot of great books out there and online resources)
•Read Code - Open Source Projects are a good place for this. Read programming blogs.
•Try to participate in Open Source Projects.
•Look for programming user groups in your town and/or someone who can mentor you.
And yes, as mentioned: practice. Don't just read; do, and watch how you improve. :)
Practice, practice, practice.
Once you're over the basic hump of being able to program, you can also read useful books (i.e. Code Complete, Effective Java or equivalents, etc.) for ideas on how to improve your code.
First and foremost write code. Write as much as you can. Tackle hard problems. If you want to be a really good programmer you need to get into the guts of what you are doing. Spend a lot of time in debuggers looking at how things work. If you want to be a good programmer who really understands what is going on you need to get down to the metal and write highly async code, learn about how processors work and why SSE is so awesome. Understand threading primitives and be able to write them as well as describe what is actually happening in the processor. I could keep going here but you get the idea.
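On the threading-primitives point, a classic exercise is to build one primitive out of another. Here is a rough Python sketch of a counting semaphore built from a Condition (purely for learning; the standard library already ships threading.Semaphore):

    import threading

    class CountingSemaphore:
        """A toy counting semaphore built from a Condition variable."""

        def __init__(self, permits: int = 1):
            self._permits = permits
            self._cond = threading.Condition()

        def acquire(self):
            with self._cond:
                while self._permits == 0:   # wait until a permit is free
                    self._cond.wait()
                self._permits -= 1

        def release(self):
            with self._cond:
                self._permits += 1
                self._cond.notify()         # wake one waiting thread

    sem = CountingSemaphore(2)              # at most two workers at once

    def worker(i):
        sem.acquire()
        try:
            print(f"worker {i} holds a permit")
        finally:
            sem.release()

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(5)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Writing and stepping through something like this in a debugger teaches you more about wait/notify and lost wake-ups than any amount of reading.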
Second find someone who knows a lot more than you and learn. This relationship will work better if you are already deeply immersed in writing lots of code.
Third, spend some time in a large high quality open source code base. I learned a ton from the Quake I and Quake II code. Helped me be a better programmer.
Fourth take on hard problems. Push your limits. Build things that you thought were impossible. Right now I am writing a specialized compiler. I have learned so much just working on this for the last couple of months.
Sure, strictly speaking, the more you practice programming, the better you become at solving those sorts of problems. But is that what you really want?
Programming is a human activity more than a technological one, at its heart. It's easy to improve your computer skills, not so hard to improve your interpersonal skills.
Read "Journey of the Software Professional" by Hohmann. One of the concepts the concepts Hohmann describes is the "cognitive library," which includes both programming skills and non-programming skills. Expand your cognitive library, and your programming skill will improve too.
Read a lot of non-programming books too, and observe the world around you. Creating useful metaphors is an essential skill for the successful programmer. Why do restaurants do things how they do? What trade-offs is the garbage department making when they pick up the garbage every few days instead of every day? How does scaling affect how a grocery store does business? Be an inquisitive human to be a better programmer.
For me, there has to be a reason to learn something new... that is, unless I have a project in mind or some problem I need to solve, there's no hope. If that prerequisite is met, then I usually try to get "Hello, world" working, and after that the sky's the limit. So much of development these days is just learning new APIs. Occasionally there's some kind of paradigm shift that blows your mind, but that's not as common as people like to think, IMHO.
Find a program that intrigues you, one that solves a problem, or one that would simplify many of your tasks. Try to write something similar. You'll get up to speed very quickly and have fun doing it at the same time.
You can try learning one thing really well and then expanding out to programming areas that are associated with the things that you have learnt, so that you can offer complete solutions to customers.
At the same time, devote part of your time to explore things outside your comfort zone.
Once you have learned something, try to learn something a little harder. Read and practice a lot about things that seem confusing at first (lambda functions, threading, array manipulation, etc.). It will take time, but once you have practiced enough, what seemed confusing at first will be familiar and easy.
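For instance, lambdas and declarative list manipulation look cryptic the first time you meet them; a few lines of Python (illustrative only) show the kind of constructs meant here:

    nums = [5, 3, 8, 1]

    # A lambda is just an unnamed one-off function, handy as a sort key.
    sorted_desc = sorted(nums, key=lambda n: -n)

    # List comprehension: build a new list declaratively instead of an index loop.
    squares_of_even = [n * n for n in nums if n % 2 == 0]

    print(sorted_desc)       # [8, 5, 3, 1]
    print(squares_of_even)   # [64]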
In addition to the rest of the great advice already given here, don't be afraid to read about coding and good practice, but also take everything with a grain of salt and see what works best for you. A lot of advice is opinion.
Good sites to read:
-thedailywtf.com
-joelonsoftware.com
-codinghorror.com
-blogs.msdn.com/oldnewthing
A great place to get practice is programming competition websites. Those will help you learn how to write good algorithms, not necessarily maintainable code, but they're still a good place to start for learning.
The one I used to use (back when I had time) was:
http://uva.onlinejudge.org/
Learn more than one language. One at a time, definitely, but ultimately you should be fluent in a couple. This will give you a better perspective I think, and help you to become an expert at programming, rather than being an expert at a certain language.
Learn the ins and outs of computers at all levels, hardware, os, etc. Ideally you should be able to build your own system, install multiple operating systems on it, and diagnose just about every problem that can arise. I know many programmers who are not "computer tech people" and their failure to understand what is happening at every level becomes a major hindrance in diagnosing and fixing unusual bugs or performance issues.
As well as looking at 'last week's code', talk to users of your work after delivery - be one yourself if possible.
It's not my bag, but some of the best coders I know have spent time supporting applications. The experience improved their product, I'm sure.
Eat, breathe, dream the programming language you're using (no, seriously, it helps).
There are two kinds of learning -
1. Informal (like how you learned how to function in society- through interaction with peers and family)
2. Formal (like your high school training- through planned instruction)
If you want an entry-level programming job, formal training via an undergrad Computer Science/Engineering degree is the way to go. However, if you want to become a rock-star developer, it is best done by informal training: make unintentional mistakes and have senior developers curse at you, learn a design pattern because an app you are updating uses it, almost cry because a bad developer wrote a huge messy program lacking documentation and best practices and now you have to make several updates to it ASAP; things of this nature.
It is hard for anyone to give you a list of all you need to know. It varies per area (e.g. a web developer vs. a desktop developer) and it varies per company (e.g. Microsoft, which sells software, vs. General Motors, which mainly just uses it in their cars). Informal training, and being engaged in trying to learn to do your job better and get promoted, is your best bet in my opinion.
To prove my point, everyone here has great answers but they all differ. Ask a rock-star developer how he learned something or when, why; they may not know- things just happen.
Practice, individually and collectively
Keep an open mind, always learn new things, don't limit yourself to what's familiar. Not solely from a tech perspective, ui design, people skills, ... Don't be afraid of what's new
Peer review, talk to people about your code, let people talk to you about their code, everyone has a unique way of looking at a problem and you will learn a great deal from peers
Love coding. If you love what you're doing, putting in a lot of time seems effortless. Every coder needs the drive!
One small addition to these good answers. When I work on someone else's code, usually I pick up something new. If you have the opportunity to work with someone else that is of equal or greater skill, noticing their programming style can teach you tons.
For example, in C++ and JavaScript I no longer use if() statements without braces. The reason is that it's just too easy to mistakenly write:
while (true) {
    if (a > b)
        print a
        print b   // indented as if it belongs to the if, but it always runs
}
This is an obvious typo, but very easy to introduce, especially if you're editing existing code. I just call it defensive programming in my mind, but little tricks like this are valuable at making you better.
So, find a peer or mentor, and work on their code.
I am not sure if the OP was looking for general advice on how to be a good programmer, but rather something more specific.
I know I am reviving this thread, but I found it because I was trying to see if anyone asked this question already.
What I had in mind was, can we come up with a "knowledge-map" of programming concepts similar to the map that Khan Academy uses.
As a programmer, I want to be able to visualize the dependencies and relationships between different ideas, so that I can understand what skill level I am currently at; what I need to know before tackling a challenging subject; and be able to visualize my progress.
The very belief in the roadmap's existence blocks the road to perfection.
The title may seem slightly self-contradictory, and I accept that you can't really learn a language quickly. However, an experienced programmer who already knows a few languages and different styles (functional, OO, imperative etc.) often wants to get started quickly. I've seen a few websites doing effective "translations" in the form of "just show me syntax equivalence". I can't remember the sites now, but for related languages (e.g. Perl/PHP) it's quite common.
Is there a better resource that covers more languages? Is there a resource that covers idioms as well as syntax? I think this would be incredibly useful for doing small amounts of work on existing code bases where you are not familiar with the language. Looking at the existing code, as we know, is not always a good indicator of quality. Likewise, for "learn by doing" weekend project I always have the urge to write reasonably idiomatic, clean code from the start. Such a resource could also link to known good example projects of varying sizes for those that prefer to learn by reading. Reading a well-written medium sized code base can also be much more practical when access to development environments might be limited.
I think it's possible to find tutorials and summaries for individual languages that provide some of this functionality in disparate web locations but I'm hoping there is a good, centralised, comparative place that the busy programmer can turn to.
You generally have two main things to overcome:
Syntax
Reference
Syntax you can pick up fairly quickly with a language tutorial and a stack of sample code.
Reference (library/API calls) you need to find a proper guide to; perhaps the language reference, or perhaps google...
With those two in place, following a walkthrough (to get you used to using the development environment) will have you pretty much ready - you'll be able to look up what you want to say (reference), and know how to say it (syntax).
This, of course, applies principally to procedural/oop languages; languages that require a paradigm switch (ML/Haskell) you should go to lectures for ;)
(and for the weirder moments, there's SO!)
In the past my favourite approach was "learning by doing". For example, I knew a little bit of C++ and a lot of C#/.NET, but I had to write an FTP tool in Python.
So I sat for an hour or so over the syntax differences with a tutorial, then I developed the form itself and looked at the generated code. Then I searched for an open source Python FTP client and took pieces of its code (not copy and paste: write it yourself to see, feel and remember the code!).
After a few hours I get it.
So: the mix is best. A book, a piece of good code, the willingness to learn, and a free night with lots of coffee.
At the risk of sounding cheesy, I would start with the language's website tutorial and/or FAQ, followed by asking more specific questions here. SO is my centralized location for programming knowledge.
I remember when I learned Perl. I was asked to modify some Perl code at work and I'd never seen the language before. I had experience with several other languages, however, so it wasn't hard to figure out the syntax with the online Perl docs in one window and the code in another, side-by-side. I don't know that solely reading existing code is necessarily the best way to learn. In my case, I didn't know Perl but I could tell that the person who originally wrote the code didn't know Perl either. I'm not sure I could've distinguished between good Perl and really confusing Perl. It would've been nice to be able to ask questions here at the time.
Language isn't important. What is important is learning your way around designing algorithms and the proper application of design patterns. Focus on the technique, not the language that implements a certain technique. Once you understand the proper development techniques, any programming language will become easy, no matter how obscure it is...
When you put a focus on a language, you're restricting your own knowledge.
http://devcheatsheet.com/ seems to be a step in the right direction: it aggregates cheat sheets/quick references and they are (somewhat) manually reviewed. It's also wide-ranging. It still comes up a bit short as an "idiomatic" quick reference: for example, the page on Ruby doesn't mention yield.
Rosetta Code appears to be an excellent resource that includes hints on coding idiomatically and moves from simple (like for-loops) to things like drawing. I haven't checked out how comprehensive it is, but there are a large number of languages and tasks listed. The drawbacks re: original question are:
Some of the linking is not accurate (navigating Python->ForLoop will take you to the top of the ForLoop page, not the Python section). It's a wiki, so this can be improved.
Ideally you could "slice" the wiki however you chose, to see e.g. the top 20 tasks for two languages side-by-side.
http://hyperpolyglot.org/ seems to be an almost perfect match for what I was looking for. The quality is not always there, or idiom can be lacking, but it has the same intention and is pretty comprehensive.
Just want to ask for some opinions here. How do you feel about using a language (and/or framework) that isn't widely used in your location to write software for a company? For instance, I live in an area dominated by .NET, with the occasional PHP job. Let's say that I'm learning Python and decide to use it to write software for my job (I'm a "Team of One" so I can pretty much use anything I want).
Now their software is written in a language that pretty much nobody in the area uses or knows; if I were to leave the company, they'd basically have nobody to maintain or add to it unless they retained me as a consultant. While that's really good for me, it seems a bit "crooked" - granted, that's how the business world works.
What are your thoughts?
I should mention that this is a very small company and I'm the only IT person, so I have free rein to choose our development platform. I'm not specifically using Python, but chose it as an example since my area is almost entirely .NET based; I don't care for .NET anymore though, which is why I don't want to consider using it. Also, the company is... how shall we say... extremely frugal and wouldn't purchase the required resources for .NET (e.g. server licenses, SQL licenses, Visual Studio, components). I personally have an MSDN subscription but I can't use that for them.
Also FWIW there are people in the area who use the language I'm considering using (Ruby on Rails), but nowhere near as many people as .NET developers. It's not like I'm using something that only I know.
You may think that this approach is good for you, but in fact all it does is paint you into a corner. The best way to get promoted within an organisation is to make yourself unnecessary in your current position. That might seem like nonsense, but it is in fact true. Think of it like this: if it is essential to the company that you continue to maintain the Python code you wrote for them, and they can't go to anyone else to get that skill, then they will continue to pay you (maybe a little above market rates) to maintain that code.
If, however, you write that code in .NET, where there is a plentiful supply of developers in your area, then as the company expands and the code you've written proves successful, you will be able to hire people to maintain it and you can move on to designing other systems. Or to managing a team of .NET coders, if that's what you want.
Even if you want to leave, the best thing for your career is going to be to get the best possible reference. To do that, write them some code that is easy to maintain. Help them hire someone to replace you to maintain it. They will be grateful and recommend you as a consultant to their friends.
Code in something esoteric - for which there is little support in your area - and they will be saying to their friends on the golf course "no don't hire that guy, he wrote this system for us which does the job, but no one else can maintain it. We're stuck with him forever and now he's too busy to look after us properly!"
Do what's best for the business, not what might be of most interest to you - or appear that way on the face of it. You'll win out in the long run.
I think that you're responsible to decide on the language that's best suited for the job. That includes an objective evaluation of the merits of the language and framework, it includes your own personal skill with the language (since you're the one doing the work) and it includes maintainability by others. Only you and your company can decide how much importance to place on each of those.
For your own personal development, if your area is dominated by .net, why don't you want to get up to speed in that instead of Python?
From an ethical standpoint, I would not write something that could not easily be maintained by someone else.
A lot of responses seem to be a poor fit for the question. We're not talking about using an unapproved language in an environment with existing standards. We're talking about a situation where the poster is the entire IT and development department for his company.
It's certainly important to keep in mind availability of talent, but Ruby is hardly a fringe language these days. In an environment where there's only one developer, productivity is also a very important consideration. Being able to build and maintain software quickly and easily without a large team requires tools with different characteristics than a large team might require.
I think what's more important than whether to use Ruby or (something else) is to try to pick something as general-purpose as practical and use it for everything unless there's a really good reason to use something else. If you go with Ruby, stick with Ruby for your utility scripts, cron jobs and that little GUI app the boss wanted to automatically SMS the intern when he takes more than five minutes to bring him his coffee.
I think using Python would be the right thing to do if it meets the client's requirements and saves them money over the alternative. Whether or not there is a wide assortment of characters to work on the application down the road is irrelevant, unless they've specified this as a non-functional requirement.
As usual, using the best tool for the job at hand will serve you well.
It is indeed a bit crooked IF you use it only for that purpose.
However, if you use it because it IS the best solution, you're in the clear.
Also, they can just hire someone else who knows Python.
My work ethics don't allow me to do something like this just to keep myself in business.
My personal opinion is you should try where possible to respect the working practices of wherever you are - whether that's indentation style, naming convention, testing procedure or programming language.
If you feel strongly that a different language would be better suited for a certain task, then lobby to have it accepted (with the required re-training of others).
Purposefully leaving an app that no one else can maintain is very bad professional conduct, IMO.
We recently had a bad hire at my shop, and out of the blue he decided he was going to use Perl instead of any version of .NET to do some simple reporting stuff (which could just as easily have been done in .NET). It was atrocious. I would suggest using the platform as specified and clearing any deviation with the people who run the joint...
Plenty of answers have touched on this, but here's my take based on production application support.
My company had a startup phase where code hustlers whipped up solutions in whatever the personal preference or flavor of the week was. Bad for maintainability and supportability.
Making a change is OK, though, as long as it's consistent. If Python is going to pave the way to the future, then go for it. Don't forget that the legacy .NET and PHP code still needs to be supported until end of life. Building yourself a hodgepodge of platforms and frameworks will just create more difficulty for you on the job, and for the company when you're no longer around.
If you feel in your heart you are acting dishonestly, then you probably are.
No one likes a dishonest person. That can't be good for your reputation.
Do your best to choose based on what is actually best, not what satisfies some underhanded motives.
It depends. At one place, I wrote what would normally just be a bash script in Java instead. Why? Because they're all Java programmers and frequently have interns/co-ops coming through who may or may not know anything else (and may or may not even be all that great with Java).
Other places though tend to have more experienced programmers and I expect that they'll be able to figure out another language without too much effort. So, I would go with what's "best" for the project.
I agree with what mquander says above, but you may also have to be prepared to justify why you want to use this other language to your development manager. If he/she then agrees, perhaps the language could become more widely adopted within the company.
Think of it in terms of business benefit you bring to the company, now and in the mid-term.
If you can deliver something much faster using a different technology, and it still achieves the goals, I'd go for it - but I'd still let some other people know and respect the company's final decision. If, however, it's purely for yourself, then I'd probably be a little more careful.
I think it's a really bad idea. For you, it means there's no backup in case you want to take a day (or week) off. For them, there's no one else if you leave or are taking a day off. It's a well-known ploy, and, honestly, might be a reason not to keep you around.
However, this could also be a chance to introduce Python into the environment. You could teach others about it, and explain to management why it's a good third language to have at the group's disposal.
I used to think that you should always pick the right language for the job at work. I'm reversing my opinion though.
The problem arises when some other guy picks a language you don't want to learn. I am concerned that I might be the guy who picks the language no one else wants to learn. Just because I think that Erlang might be the right choice for something doesn't mean that everyone else will want to learn Erlang or respect my decision for using Erlang.
"if I were to leave the company, they'd basically have nobody to maintain/add to it unless they retain on me as a consultant."
Are you saying no one else can learn Python? I find that hard to believe.
New technology is often introduced in small projects by knowledgeable people and diffused through the organization because the small projects were successful.
Use Python. Be successful. Make your case based on your successes.
I had this same problem very often. Coincidentally, it was with those two languages you mention: .NET forced on me, when I preferred to use Python (among others). Could be the opposite, I don't judge.
I refrained from using Python, because of the reasons already mentioned in other answers. I did what I thought was best for the company. Using IronPython won't make your Python code any more maintainable for an inexperienced Python programmer.
However, I left the company and now I work in something more in line with my tastes. I'm much happier. In this economy you may not have this option... but it will pass. Do the right thing.
Cheers.
There is a large difference between 'prototype' or 'one-shot' code and production code. For prototyping I use whatever works fastest, but I'm very clear about its status. Production code is written in one of the approved and supported environments.
The ethical thing is to use the best tool for the job. If there is a tool that takes only 20% of the time to code with compared to the other choices, needs next to no maintenance, and is easy to refactor, you have a duty to pick that tool, assuming it's as extensible as you may need for the business.
If you do a good job, hiring future people and training them in terms of HOW your workplace does business should be the practice of any growing business. They will be able to learn the code if they're the right person for the business.
In your case I'm not sure if you want to use Python, unless it has native .NET support to allow your .NET world to interact with it.
Other posters have made some good points, but here's one I've not seen: Communicate the situation to management and let them decide. In other words, talk with your boss and tell him or her that there currently are more .NET developers in your area, so that if you're hit by a bus tomorrow it would easier to find someone else to maintain your code; however, there are tools you need to do your job more efficiently and they cost money (and tell them how much). Alternatively, you could do this in Python or RoR (or whatever) and use free tools, but from what you know, there aren't currently that many people in the area who know those languages. I've used "currently" a couple times here because this may change over time.
Before having this conversation, it might be good to see if you can find user groups for the alternative technology in your area, and how large they are. You could also ask on listserves if there are people who know the alternatives in your area.
Of course, the boss may tell you to keep using .NET without any tools, but in that case it's their decision to shoot themselves in the foot. (And yours to decide if you want to find a new job.)
Regarding the question as asked, I see nothing unethical about it, provided that:
It is a freely-available language. Although I am something of a FOSS partisan, that's not the point of this criterion. It needs to be freely-available (not necessarily FOSS) so that it doesn't impose costs on the company and so that others will have the opportunity to learn it if you ever need to be replaced (or if they want to compete with you for your job).
You are changing languages for solid reasons and not for the sake of creating vendor lock-in (or, if you prefer to think of it as such, "job security"). Ethics aside, you really don't want to have a job where they hate you, but are stuck with you because you're the only one who can maintain the mess you've created anyhow.
In the particular case you've described, I would suggest that switching to RoR may be the more ethical choice, as it would be decidedly unethical (not to mention illegal) to use .NET if there are required resources that are for-pay only and your employer is too "frugal" to purchase proper licenses for them.
When in Rome... do as the Romans.
You might not be the one who has to maintain this code in the long term, and not everyone wants to learn a "fringe language" to make bug fixes or enhancements.
I migrated some VBA stuff over to Perl for processing at a previous job and increased the efficiency by several orders of magnitude, but ultimately no-one else there was willing to learn Perl so I got stuck with that task longer than I wanted it.
I did that; it was Delphi in my case. I thought Delphi was widely used, but when I was looking for a job I saw 3 Delphi job offers in my whole life, and more Java/J2EE/PHP offers than I can remember. I think it's a bad idea: with the time I wasted learning advanced Delphi programming I could have got better at J2EE, started at a better company, and maybe be making more money now.
If they can't find somebody to maintain the app, you will always be the one doing it, and when you quit they will have to rewrite it. I don't think the consultant arrangement is used very often.
I used to be in the "use the best tool for the job" school, but I've changed my mind. It's not enough to just ask "how can I do this job the fastest." If you think you're the only one who will ever need to look at some code, there's a good chance you're mistaken. The total cost of introducing a new language into an environment is higher than you might imagine at first.
If you just need to produce a result, not a program, then you can use whatever you want. Say you need a report or you need to munge some files. If the output is really all that matters, say it's something you could have chosen to do by hand, you can practice using any language you want.
With the release of the MVC Framework I too was in a similar ethical dilemma: use WebForms, or switch over to the MVC Framework for everything? The answer really is that you have to do the right thing and use whatever the company standard is. If you deviate from the standard, it creates a lot of problems for people.
Think how you would feel if you were handed a VB6 project when all you have been doing for years is .NET. So these are the two solutions I have come up with.
Use your fun languages for consulting contracts you do on the side. Make sure the client knows what you are doing and if they agree go for it.
Try and convince your current company to migrate over to this great new language you are working with.
If you follow these routes you will learn your language and not piss anyone off in the process.
Ruby on Rails is certainly not a fringe language. If the company is too cheap to pony up for the appropriate licensing for Microsoft's tools, then you would have no choice but to find an alternative. RoR certainly would be a reasonable choice and if helps move your career along as well, then it's win-win for both of you!
You can develop .NET adequately with free tools; cost is not a good reason to avoid that platform. Ruby on Rails is becoming reasonably mainstream for building data-driven internet websites. You haven't even told us if that's the sort of software you are building, though.
There is really no way with the information that you have provided that anyone can give you a single correct answer.
If you are asking is it ethical to do your work in such a way that the company is dependent upon you, of course the answer is no. If you are asking is it ethical to develop in RoR then the answer is "we don't know" - but my opinion is that probably it would be fine if its the right tool for the job.
Don't underestimate the ability of someone else to support your work or replace you though - if you do your work reasonably well once the solution is in place any programmer worth their pay should be able to learn the platform well enough to maintain it. I've debugged, migrated and supported a few PHP applications for example without ever hardly learning the first thing about PHP. I'd be lost building a new PHP app from scratch and would never even try but its no problem to support one. I think the same would be true of the languages you mention as well - they've got the critical mass that means there is plenty of books and forums etc. Of course if its written badly enough in any language then it may be difficult to support regardless of anyone's skill in the language...
So much discussion for such a clear-cut situation...
It's not up to you, it's up to them. If they're not technical enough to make the call, as it seems, then you have to make it for them in good faith. Anything less is dishonest, and I'm fairly sure that's not in your job description ;)
You've muddied the waters with all the wandering about in the thickets of personal motivations. The answer to that one is that your personal motivations are irrelevant unless and until you've formulated the business case for the possible decisions. If you've done that and the answer still isn't clear-cut, then sure, choosing the answer you like the best is one of the nice things about being in a position to make technical decisions in the first place.
As far as the actual question goes, to my mind if the most technically apt choice is also one that very few people work with, one of two things is happening: a) It's a good choice, and the number of people working with it is going to be exploding over the next 18-24 months (e.g. Django), or b) There's something wrong with my analysis. Technologies may be on the fringe because people are slow to adopt them, but that's generally not why they stay on the fringe.
If you find yourself thinking "I can't choose technology X, that'll make it easier for them to replace me!" you're in the wrong line of work. In almost any enterprise that's not actually failing, the IT guy who makes himself easy to replace tends to move up to harder and more interesting and more lucrative work.
I would not bring a new language/framework/whatever into the place unless they understood that's what I was doing, and that if I left/was fired/was hit by a bus, they'd have to find/train someone to work with it.
I have some experience in a contractor pulling in things just because he felt like it. In some cases they were the best tool for the job (in other cases they were not), but in all cases they were not the best tool for the team that had to maintain the code. In my case the contractor was a serious jerk who didn't really give a darn about anyone else and I believe WAS trying to make himself harder to replace.
In your case, talk to your bosses. If they really don't want to spend the needed money on .NET framework tools/libs, then switching to something else may well BE the right thing to do for them, long term.
And, as someone who has spent his career walking into the middle projects that others have already started - thank you for thinking before you add a new tool to the mix.