I am developing a Windows Forms application in C#. I have heard that one should not use the built-in methods and functions in code, because hackers have a deep understanding of those built-in methods and know how to defeat them; instead, one should always write one's own functions and methods, and if not, at least call the built-in functions intelligently from those newly written functions. How much of that is true?
A supporting example in favour of my argument is that I have seen developers write their own encryption algorithms and hash functions rather than use built-in ones such as AES, DES or RC4, because they believe that the built-in encryption algorithms often have backdoors in them.
What?! No, no, no! Whoever told you this is just wrong.
There is a common fallacy that published source code is more vulnerable to "h4ckerz" because it is available for anyone to spot the flaws in. However, I'm glad you mentioned crypto, because this is an area where this line of reasoning really stands out as the fallacy it is.
One of the most popular questions of all time on https://security.stackexchange.com/ is about a developer (in the OP he was given the pseudonym "Dave") who shared this fear of published code. Dave, like the developer you saw, was trying to homebrew his own encryption algorithm. Here's one of the most popular comments in that thread:
Dave has a fundamentally false premise, that the security of an algorithm relies on (even partially) its obscurity - that's not the case. The security of a hashing algorithm relies on the limits of our understanding of mathematics, and, to a lesser extent, the hardware ability to brute-force it. Once Dave accepts this reality (and it really is reality, read the Wikipedia article on hashing), it's a question of who is smarter - Dave by himself, or a large group of specialists devoted to this very particular problem. (emphasis added)
As a matter of fact, as it stands now, the top two memes on Security.SE are "Don't roll your own" and "Don't be a Dave".
While this has all been about crypto, this applies in general to most open-source software. The chance that a backdoor will get found and fixed goes up with each new set of eyes laid on the code. This should be a simple and uncontroversial premise: the more people are looking for something, the higher the chance it will be found. Yes, this applies to malicious users looking for exploits. However, it also applies to power users, white hat hackers, security researchers, cryptographers, professional developers, and others working for "good", who generally (hopefully) outnumber those working for "evil".
The fear also implicitly relies on the false premise that hackers need to see the source code to do bad things. This should be obviously false based on the sheer number of proprietary systems whose source code has never been published (various Microsoft and Adobe programs come to mind) which have been inundated with vulnerabilities for years. Maybe having source code to read makes the hacker's job easier, but maybe not -- is it easier to pore over source code looking for an attack vector or to just use scanning tools and scripts against a compiled binary?
tl;dr Don't be a Dave. Rolling your own means you have to be the best at everything to succeed, instead of taking a sampling of the best the community has to offer.
Heartbleed
In your comment, you rebut:
Then why was the Heartbleed bug in openSSL not found and corrected [earlier]?
Because no one was looking at it. That's the sad truth. Here's the difference -- what happened once someone did find it? Now tens of thousands of security researchers, crypto experts, and others are looking at it. Suppose the same kind of vulnerability existed in one of the proprietary products I mentioned earlier, which it very well could. Once it's caught (if it's caught), ask yourself:
Could the team of programmers at the company responsible benefit from the help of the entire worldwide community of security experts, cryptographers, and other analysts right now?
If a bug this critical were discovered (and that's a big if!) in your software, would you be prepared to deal with the fallout caused by your custom implementation?
Unless you know of specific failure modes or weaknesses of the built-in methods your application would use and know how to minimize or eliminate them, it is probably better to use the methods provided by the language or library designers, which will often be both more efficient and more secure than what an average programmer would come up with on the fly for a particular project.
Your example absolutely does not support your view: developing your own encryption algorithm without some serious background in the domain and review by cryptanalysts, and then employing it in security-critical code, is a recipe for disaster. Even developing your own custom implementation of an industry standard encryption algorithm can present problems, and almost certainly will if you are inexperienced at it.
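To make the contrast concrete, here is a minimal, hedged sketch of what "calling the built-in functions intelligently" might look like in C#, the language the question mentions. It only shows encryption with the framework's AES implementation; key management and ciphertext authentication are deliberately out of scope, and the class and method names are invented for illustration.

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

// Sketch only: lean on the framework's vetted AES implementation instead of a
// homemade cipher. Where the key comes from, and how the ciphertext would be
// authenticated, are real design questions this example does not answer.
static class BuiltInCryptoSketch
{
    public static byte[] Encrypt(byte[] plaintext, byte[] key)
    {
        using (var aes = Aes.Create())              // built-in, widely reviewed
        {
            aes.Key = key;                          // 16, 24, or 32 bytes
            aes.GenerateIV();                       // fresh random IV per message

            using (var output = new MemoryStream())
            {
                output.Write(aes.IV, 0, aes.IV.Length);   // prepend IV for the decryptor
                using (var crypto = new CryptoStream(output, aes.CreateEncryptor(),
                                                     CryptoStreamMode.Write))
                {
                    crypto.Write(plaintext, 0, plaintext.Length);
                }
                return output.ToArray();            // IV + ciphertext
            }
        }
    }
}
```

The point is not that this snippet is secure as written, but that every primitive in it comes from the platform rather than from the application's author.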
Code clones, also known as duplicate code, are often considered harmful to system quality.
I'm wondering whether such duplicate code can be found in standard APIs or other mature tools.
If that is indeed the case, then in which language (C, Java, Python, Common Lisp, etc.) do you think code cloning is most likely to appear?
Code cloning is extremely common no matter what programming language is used, yes, even in C, Python and Java.
People do it because it makes them efficient in the short term; they're doing code reuse. It's arguably bad, because it causes group inefficiencies in the long term: clones reuse code bugs and assumptions, and when those are discovered and need to be fixed, all the code clones need to be fixed, and the programmer doing the fixing doesn't know where the clones are, or even if there are any.
I don't think clones are bad, because of the code reuse effect. I think what is bad is not managing them.
To help with the latter problem, I build clone detectors (see our CloneDR) that automatically find exact and near-miss duplicated code, using the structure of the programming language to guide the search. CloneDR works for a wide variety of programming languages (including OP's set).
In any software system of 100K SLOC or more, at least 10% of the code is cloned. (OK, OK, Sun's JDK is built by an exceptionally good team, they have only about 9.5%). It tends to be worse in older conventional applications; I suspect because the programmers clone more code out of self defense.
(I have seen applications in which the clones comprise 50%+ of the code; yes, those programs tend to be awful for many reasons, not just cloning.)
You can see clone reports at the link for applications in several languages, look at the statistics, and see what the clones look like.
All code is the same, regardless of who writes it. Any API that you cite was written by human beings, who made decisions along the way. I haven't seen the perfect API yet - all of them get to redo things in hindsight.
Cloning code flies in the face of DRY, so of course it's recommended that you not do it. It's harmful because more code means more bugs, and duplication means you'll have to remember to fix them in all the clones.
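A toy illustration of that last point, with invented types and rules: in the cloned version there are two copies of the same predicate to patch when the rule changes (or when a bug in it is found); in the DRY version there is one.

```csharp
using System.Collections.Generic;

// Hypothetical types and rules, purely to illustrate the clone/DRY trade-off.
class Order
{
    public decimal Total;
    public List<string> Items = new List<string>();
    public bool IsCancelled;
}

static class OrderRules
{
    // The cloned version: the same rule pasted into two methods.
    // A fix to the rule has to be remembered in both places.
    public static bool CanShip_Cloned(Order o)
        => o.Total > 0 && o.Items.Count > 0 && !o.IsCancelled;
    public static bool CanInvoice_Cloned(Order o)
        => o.Total > 0 && o.Items.Count > 0 && !o.IsCancelled;

    // The DRY version: one definition, two callers, one place to fix.
    static bool IsActionable(Order o)
        => o.Total > 0 && o.Items.Count > 0 && !o.IsCancelled;
    public static bool CanShip(Order o) => IsActionable(o);
    public static bool CanInvoice(Order o) => IsActionable(o);
}
```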
But every rule has its exceptions. I'm sure everyone can think of a circumstance in which they did something that "best practices" and dogma would say they shouldn't, but they did it anyway. Time and other constraints don't always allow you to be perfect.
Suggesting that permission needs to be granted to allow such a thing is ludicrous to me. Be as DRY as you can. If you end up violating it, understand why you did it and remember to go back and fix it if you get a chance.
I'm currently faced with a very unusual design problem, and hope that a developer wiser than myself might be able to offer some insight.
Background
Without being too specific, I've been hired by a non-profit organisation to assist with the redevelopment of their legacy, but very valuable (in terms of social value), software. The development team is unlike any I've encountered in my time as a software developer, and comprises a small number of developers and a larger group of non-programming domain experts. What makes the arrangement unusual is that the domain experts (let's call them content creators) use custom tooling, some of which is based around a Prolog expert system engine, to develop web-based software components/forms.
The Problem
The system uses a very awkward postback model to perform logical operations server side and return new forms/results. It is slow, and prone to failure. Simple things, like creating HTML forms with the existing tooling, are much more arduous than they should be. As the demand for a more interactive and performant experience grows, the software developers are finding increasingly that they have to circumvent the expert system/visual tooling used by the content creators, and write new components by hand in JavaScript. The content creators feel increasingly that their hands are tied, as they are now unable to contribute new components.
Design approach: Traditional/Typical
I have been advocating for the complete abandonment of the previous model and the adoption of a typical software development process. As mentioned earlier, the project has naturally evolved towards this as the non-programmatic development tooling has become incapable of meeting the needs of the business.
The content creators have a very valuable contribution to make however, and I would like to see them focusing on formally specifying the expected behaviour of the software with tools like Cucumber, instead of being involved in implementation.
Design approach: Non-programmatic
My co-worker, whom I respect a great deal and suspect is far more knowledgeable than me, feels that the existing process is fine and that we just need to build better tooling. I can't help but feel, however, that there is something fundamentally flawed with this approach. I have yet to find one instance, either historical or contemporary, where this model of software development has been successful. COBOL was developed with the philosophy of allowing business people/domain experts to write applications without the need for a programmer, and to my mind all this did was create a new kind of programmer - the COBOL programmer. If it were possible to develop effective systems allowing non-programmers to create non-trivial applications, surely the demand for programmers would be much lower? The only frameworks that I am aware of that roughly fit this model are SAP's Smart Forms and Microsoft's Dynamics AX - both of which are very domain-specific ERP systems.
DSLs, Templating Languages
Something of a compromise between the two concepts would be to implement some kind of DSL as a templating language. I'm not even sure that this would be successful however, as all of the content creators, with one exception, are completely non technical.
I've also considered building a custom IDE based on Visual Studio or NetBeans with graphical/toolbox-style tooling.
Thoughts?
Is non-programmatic development a fool's errand? Will this always result in something unsatisfactory, requiring hands-on development from a programmer?
Many thanks if you've taken the time to read this, and I'd certainly appreciate any feedback.
You say:
Something of a compromise between the two concepts would be to implement some kind of DSL as a templating language. I'm not even sure that this would be successful however, as all of the content creators, with one exception, are completely non technical.
Honestly, this sounds like exactly the approach I would use. Even "non-technical" users can become proficient (enough) in a simple DSL or templating language to get useful work done.
For example, I do a lot of work with scientific modeling software. Many modelers, while being much more at home with the science than with any form of engineering, have been forced to learn one or more programming languages in order to express their ideas in a way they can use. Heck, as far as I know, Fortran is still a required course in order to get a Meteorology degree, since all the major weather models currently in use are written in Fortran.
As a result, there is a certain community of "scientific programming" which is mostly filled with domain experts with relatively little formal software engineering training, expertise, or even interest. These people are more at home with languages/platforms like Matlab, R, and even Visual Basic (since they can use it to script applications like Excel and ESRI ArcMap). Recently, I've seen Python gaining ground in this space as well, mainly I think because it's relatively easy to learn.
I guess my point is that I see strong parallels between this field and your example. If your domain experts are capable of thinking rigorously about their problems (and this may not be the case, but your question is open-ended enough that it might be) then they are definitely capable of expressing their ideas in an appropriate domain-specific language.
I would start by discussing with the content creators some ideas about how they would like to express their decisions and choices. My guess would be that they would be happy to write "code" (even if you don't have to call it code) to describe what they want. Give them a "debugger" (a tool to interactively explore the consequences of their "code" changes) and some nice "IDE" support application, and I think you'll have a very workable solution.
Think of spreadsheets.
Spreadsheets are a simple system that allows non-technical users to make use of a computer's calculation abilities. In doing so, they have opened up computers to solve a great number of tasks which normally would have required custom software developed to solve them. So, yes non-programmatic software development is possible.
On the other hand, look at spreadsheets. Despite their calculational abilities you really would not want as a programmer to have to develop software with them. In the end, many of the techniques that make programming languages better for programmers make them worse for the general population. The ability to define a function, for example, makes a programmer's life much easier, but I think would confuse most others.
Additionally, past a certain point of complexity, trying to use a spreadsheet would be a real pain. The spreadsheet works well within the realm for which it was designed. Once you stray too far outside that, it's just not workable. And again, it's the very tools programmers use to deal with complexity which will prevent a system from being both widely usable and sufficiently powerful.
I think that for any given problem area, you could develop a system that allows the experts to specify a solution. It will be much harder to develop that system than to solve the problem in the first place. However, if you repeatedly have similar problems which the experts can develop solutions for, then it might be worthwhile.
I think development by non-developers is doomed to failure. It's difficult enough when developers try it. What's the going failure rate? 50% or higher?
My advice would be to either buy the closest commercial product you can find or hire somebody to help you develop a custom solution with your non-developer maintenance characteristics in mind.
Being a developer means keeping a million details in mind at once and caring about details like version control, deployment, testing, etc. Most people who don't care about those things quickly tire of the complexity.
By all means involve the domain experts. But don't saddle them with development and maintenance as well.
You could be putting your organization at risk with a poorly done solution. If it's important, do it right.
I don't believe any extensive non-programmer solution is going to work. Programming is more than language, it's knowing how to do things reasonably. Something designed to be non-programmer friendly will still almost certainly contain all the pitfalls a programmer knows to avoid even if it's expressed in English or a GUI.
I think what's needed in a case like this is to have the content creators worry about making content and an actual programmer translate that into reasonable computer code.
I have worked with two ERP systems that were meant for non-programmers and in both cases you could make just about every mistake in the book with them.
... Simple things, like creating html forms using the existing tooling is much more arduous than it should be...
More arduous for whom? You're taking a development model that works (however badly) for the non-programming content creators, and because something is arduous for someone you propose to replace that with a model where the content creators are left out in the cold entirely? Sounds crazy to me.
If your content creators can learn custom tooling built around a Prolog rules engine, then they have shown they can learn enough formalism to contribute to the project. If you think other aspects of the development need to be changed, I see only two honorable choices:
Implement the existing formalism ("custom tooling") using the new technology that you think will make things better in other ways. The content creators contribute exactly as they do now.
Design and implement a domain-specific language that handles the impedance mismatch between what your content creators know and can do and the way you and other developers think the work should be done.
Your scenario is a classic case where a domain-specific language is appropriate. But language design is not easy, especially when combined with serious usability questions. If you are lucky you will be able to hire someone to help you who is expert in both language design and usability. But if you are nonprofit, you probably don't have the budget. In this case one possibility is to look for help from another nonprofit—a nearby university, if you have one.
I'd advise you to read this article before attempting to scrap the whole system. I look at it this way. What changed to prompt the redevelopment? Your domain experts haven't forgotten how to use the original system, so you already have some competent "COBOL programmers" for your domain. From your description, it sounds like mostly the performance requirements have changed, and possibly a greater need for web forms.
Therefore, the desired solution isn't to change the responsibilities of the domain experts, it's to increase the performance and make it easier to create web forms. You have the advantage of an existing code base showing exactly what your domain experts are capable of. It would be a real shame not to use it.
I realize Prolog isn't exactly the hottest language around, but there are faster and slower implementations. Some implementations are designed mostly for programmer interactivity and are dynamically interpreted. Some implementations create optimized compiled native code. There are also complex logical programming techniques like memoization that can be used to enhance performance, but probably no one learns them in school anymore. A flow where content creators focus on creating new content and developers focus on optimization could be very workable. Also, Prolog is ideally suited for the model layer, but not so much for the view layer. Moving more of your view layer to a different technology could really pay off.
In general though, 2 thoughts:
You cannot reduce life to algorithms. Everything we know (philosophically, scientifically) and experience demonstrates this. (Sorry, Dr. Minsky).
That said, a domain-specific tool that allows non-programmers to express a finite language is definitely doable, as several people have given examples. Another example of this type of system is Mathematica, and especially Simulink, which are used very successfully over a range of applications. However, the failure of expert systems, fuzzy logic, and Japan's Fifth Generation computer project of the '80s to take off demonstrates the difficulty of doing this.
LabVIEW is a very successful non-programmatic programming environment.
What an interesting problem.
I would have to ultimately agree with you, and disagree with your colleague.
The philosophy and approach of Domain Driven Development/Design exists exactly for your purposes, in that it puts paramount importance on the specific knowledge of the experienced domain experts, and on communicating that knowledge to talented software developers.
See, in your issue, there are two distinct things. The domain, and the software. The domain should be understood and specified first and foremost without software development in mind.
And then the transformation to software happens through communication between the domain experts and the programmers.
I think trying to build "programming" tools for domain experts is a waste of time.
In Domain Driven Development your domain experts will continue to be important, and you'll end up with better software.
In your colleague's approach you're trying to replace programmers with tools... maybe in the future, like a Star Trek future, that will be possible, but today I don't think so.
I am currently struggling with a similar problem in trying to enable healthcare providers to write rules for workflows, which isn't easy because they aren't programmers. You're a programmer not because you went to programming school -- you're a programmer because you think like a programmer. Fortunately, most hospitals have some anesthesiologist or biomedical engineer who thinks like a programmer and can manage to program. The key is to give the non-programmers-who-think-like-programmers a language that they can learn and master.
In my case, I want doctors to be able to formulate simple rules, such as: "If a patient's temperature gets too low, send their doctor a text message". Of course it's more complicated than that because the definition of "too low" depends on the age of the patient, a patient may have many doctors, and so on, but a smart doctor will be able to figure out those rules. The real issue is that the temperature sensor will often fall off the patient and read ambient temperature, meaning that the rule is useless unless you can figure out how to determine that the thermometer is actually reading the patient's temperature.
The big problem I have is that, while it's relatively easy to create a DSL so that doctors can express IF [temperature] [less-than] [lookup-table [age]] THEN [send-text-message], it's much harder to create one that can express all the different heuristics you might need to try before coming up with the right way to make sure the reading is valid.
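For what it's worth, the "easy half" of such a rule really is small to model. All the names and threshold values below are invented, and the genuinely hard part described above (deciding whether the reading is valid at all) is exactly what this sketch leaves out.

```csharp
using System;
using System.Collections.Generic;

// A minimal, hypothetical object model for rules of the form
// IF [temperature] [less-than] [lookup-table [age]] THEN [send-text-message].
// The validity heuristics discussed above are deliberately absent.
class LowTemperatureRule
{
    // Threshold by age bracket (years -> degrees C); the values are invented.
    static readonly SortedDictionary<int, double> ThresholdByAge =
        new SortedDictionary<int, double> { { 0, 36.0 }, { 1, 35.5 }, { 12, 35.0 }, { 65, 35.5 } };

    public static double ThresholdFor(int ageYears)
    {
        double threshold = 35.0;
        foreach (var entry in ThresholdByAge)
            if (ageYears >= entry.Key) threshold = entry.Value;
        return threshold;
    }

    public static void Evaluate(int ageYears, double temperatureC, Action<string> sendTextMessage)
    {
        if (temperatureC < ThresholdFor(ageYears))
            sendTextMessage($"Low temperature alert: {temperatureC:F1} C");
    }
}
```

A rule author would only ever touch the table and the comparison; the plumbing around it, and the validity checking, stay with the developers.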
In your case, you may want to consider how VB became popular: It has a form drawing tool that anybody can easily use to draw forms and set properties on form items. Since not everything can be specified by form properties and data binding, there's a code-behind mode that lets you do complex logic. But to make the tool accessible to beginning programmers, the language is BASIC, so users didn't have to learn about pointers, memory allocation, or data structures.
While you probably wouldn't want to give your users VB, you might consider a hybrid approach. You would have one "language" (it could be graphical, like VB's form designer or Labview) where inexperienced users can easily do the simple stuff, and another language to enable expert users to do the complicated stuff.
I had this as a comment previously but I figured it deserved more merit.
There are definitely a number of successful 'non-programmatic' tools around; offhand I can think of LabVIEW, VPL and graph-based shaders (edit: I just noticed this link has far more than just shaders on it), which are prevalent in 3D applications.
Having said that, I don't know of any which are suited to web-based dev (which appears to be your case).
I doubt very much that the investment in developing such tools would be worth it (unless maybe you could move to sell it as a product as well).
I agree with you - non-technical people will not be able to program anything non-trivial.
Some products try to create what's basically a really simple programming language. The problem is that programming is an aptitude as well as a skill. It takes a certain kind of mindset to think in the sort of logic used by computers, which just can't be abstracted away by any programming language (at least not without making assumptions that it can't safely make).
I've seen this in action with business people trying to construct workflows in MS Dynamics CRM. Even though the product was clearly intended to allow them to do it without a programmer they just couldn't figure out how to make a loop or an if-else condition work, no matter how "friendly" the UI tried to make it. I watched in amazement as they struggled with something that seemed completely self-obvious to me. After a full day (!) of this they managed to produce a couple of very basic workflows that worked in some cases, but didn't handle edge cases like missing values or invalid data. It was basically a complete waste of time.
Granted, Dynamics CRM isn't exactly the epitome of user-friendliness, but I saw enough to convince me that this is, indeed, a fool's errand.
Now, if your users are not programmers, but still technical people they might be able to learn programming, but that's another story - they've really become "new programmers" rather than "non-programmers" then.
This is a pretty philosophical topic and difficult to answer for every case, but in general...
Is non-programmatic development a fools errand?
Outside of a very narrow scope, yes. Major software vendors have invested billions over the years in creating various packages to try to let non-technical users create & define workflows and processes with limited success. Your best bet is to take advantage of what has been done in that space rather than trying to re-invent it.
Edit:
Sharepoint, InfoPath, some SAP stuff are the examples I'm talking about. As I said above, "a very narrow scope". It's possible to let non-programmers create workflows, complex forms, some domain processes, but that's it. Anything more general-purpose is simply trying to make non-programmers into programmers by giving them very crude tools.
Non-programmatic software development IS feasible, as long as you are realistic about what non-developers can reasonably achieve - it's all about a compromise between capability and ease of use.
The key is to break the requirements down cleanly into things that the domain experts need to be able to do, and spend time implementing those features in a foolproof way - the classic mistake is to try to let the system do too much.
For example suppose you want domain experts to be able to create a form with a masked text input:
Most developers will look at that requirement and create a fancy control which accepts some sort of regular expression and lets the domain expert do anything.
This is the classic developer way of looking at things; however, it's likely that your domain expert does not understand regular expressions, and the developer has missed the point of the requirement, which was for the domain expert to be able to create this form.
A better solution might have been to create a control that can be configured to mask either email addresses or telephone numbers.
Yes, this control is far less capable than the first control, and yes, the domain experts have to ask developers to extend it when they want to be able to mask car registration numbers; however, the domain experts are able to use it.
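A small sketch of that difference, with invented names and patterns, just to make the two designs concrete: the "powerful" version exposes a regular expression, the domain-expert version exposes a short fixed menu.

```csharp
using System.Text.RegularExpressions;

// Hypothetical illustration of the two designs discussed above.

// The "powerful" version: the content creator is expected to supply a regex.
class RegexMaskedInput
{
    public string Pattern { get; set; }          // e.g. @"^\+?[0-9 ]{7,15}$"
    public bool IsValid(string text) => Regex.IsMatch(text ?? "", Pattern);
}

// The "foolproof" version: the content creator picks from a short list,
// and developers extend the list when a new mask is genuinely needed.
enum MaskKind { Email, Telephone }

class ConfiguredMaskedInput
{
    public MaskKind Kind { get; set; }

    public bool IsValid(string text)
    {
        text = text ?? "";
        switch (Kind)
        {
            case MaskKind.Email:     return Regex.IsMatch(text, @"^[^@\s]+@[^@\s]+\.[^@\s]+$");
            case MaskKind.Telephone: return Regex.IsMatch(text, @"^\+?[0-9 ()-]{7,15}$");
            default:                 return false;
        }
    }
}
```

The second control is strictly less capable, which is the point: its entire configuration surface is something a non-technical content creator can reason about.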
It seems that the problem is of an organizational nature and cannot be solved by technical means alone.
The root is that the content creators are completely non-technical, yet have to perform the inherently technical tasks of designing forms and writing Prolog rules. Various designers and DSLs can alleviate their problems, but never solve them.
Either reorganize the system and processes so the knowledge carriers actually enter knowledge - nothing else; or train the content creators to perform the necessary development with existing tools or maybe DSLs.
Non-programmatic development can save them from low-level chores, but striving to set up the system once and let users expand it indefinitely and without restriction is certainly a fool's errand.
Computer games companies operate like this all the time, so far as I can tell: a few programmers and a lot of content creators who need to be able to control logic as well (like level designers).
It's also probably a healthy discipline to be able to separate your code from the data and rules driving it if you can.
I'm therefore with your colleague, but of course the specifics may not make this general solution appropriate!
I am currently working on a project where I need to create an architecture, framework, or some standard by which I can "at least" increase the difficulty of cracking a piece of software, i.e., add to its security. There are already different ways to activate software, including online activation, keys, etc. I am currently studying a few research papers as well, but there are still a lot of things that I want to discuss.
Could someone guide me to some decent forum, mailing list or something like that? or any other help would be appreciated.
I'll tell you the closest thing to "crackproof": a web application.
Desktop applications are doomed, for many other reasons, but making your application run "in the cloud", in a browser, gives you a lot more control over security.
Desktop software runs on the client's computer, so the client has full access to it. A web app runs on your server, so the client only sees a tiny bit of it.
You need to begin by infiltrating the local hacking gang, posing as an 11 year old who wants to "hack it up". Once you've earned their trust you can learn what features they find hardest to crack. As you secretly release "uncrackable" software to the local message boards, you can see what they do with it. Build upon your inner knowledge until they can no longer crack your software. When that is done, let your identity be known. Ideally, this will be seen as a sign of betrayal, that you're working against them. Hopefully this will lead them to contact other hackers outside the local community to attack your software.
Continue until you've reached the top of the hacker mafia. Write your thesis as a book, sell to HBO.
Isn't it a sign of success when your product gets cracked? :)
Seriously though - one approach is to use License objects that are serialized to XML and then encrypted using public/private key pairs. They are then read back in at runtime, de-serialized and processed to ensure they are valid.
But there is still the ubiquitous "IsValid()" method which can be cracked to always return true.
You could even put that method into a signed assembly to prevent tampering, but all you've done then is create another layer of "IsValid()" which too can be cracked.
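As a hedged sketch of that idea (the sign-and-verify variant rather than encrypt-and-decrypt, with invented type names): the license is serialized to XML, signed with the vendor's private RSA key, and checked at runtime against the embedded public key. The verification method is exactly the kind of "IsValid()" that, as noted above, can still be patched out.

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Xml.Serialization;

// Stripped-down sketch only; names and fields are invented for illustration.
public class License
{
    public string Customer;
    public DateTime SupportExpires;
}

static class LicenseCheck
{
    public static string Serialize(License lic)
    {
        var serializer = new XmlSerializer(typeof(License));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, lic);
            return writer.ToString();
        }
    }

    // Done once, on the vendor's side, with the private key.
    public static byte[] Sign(string licenseXml, RSA privateKey)
        => privateKey.SignData(Encoding.UTF8.GetBytes(licenseXml),
                               HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);

    // Done at runtime, on the customer's machine, with the embedded public key.
    // This is still just another "IsValid()" a determined cracker can force to true.
    public static bool IsValid(string licenseXml, byte[] signature, RSA publicKey)
        => publicKey.VerifyData(Encoding.UTF8.GetBytes(licenseXml), signature,
                                HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
}
```

In practice the private key would live only in the vendor's ordering/build system, while the public key ships inside the application.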
We use licenses to turn on or off various features in our software, and to validate support/upgrade periods. But this is only for our legitimate customers. Anyone who wants to bypass it probably could.
We trust our legitimate customers to not try to bypass the licensing, and we accept that our illegitimate customers will find a way.
We would waste more money attempting to improve the 'tamper-proof' nature of our solution than we lose to people who pirate software.
Plus you've got to consider the pain to our legitimate customers, and asking them to paste a license string from their online account page is as much pain as I'd want to put them through. Why create additional barriers to entry for potential customers?
Anyway, depending on which solution you've got in place already, my description above might give you some ideas that could decrease the likelihood that someone will crack your product.
As nute said, any code you release to a customer's machine is crackable.
Don't try for "uncrackable." Try for "there's enough deterrent to reasonably protect my assets."
There are a lot of ways you can try and increase the cost of cracking. Most of them cost you but there is one thing you can do that actually reduces your costs while increasing the cost of cracking: deliver often.
There is a finite cost to cracking any given binary. That cost is increased by the number of binaries being cracked. If you release new functionality every week, you essentially bifurcate your users into two groups:
Those who don't need the latest features and can wait for a crack.
Those who do need the latest features and will pay for your software.
By engaging in the traditional anti-cracking techniques, you can multiply the cost of cracking one binary and, consequently, widen the gap between when a new feature is released and when it is available on the black market. To top it all off, your costs will go down and the amount of value you deliver in a period of time will go up - that's what makes it free.
The more often you release, the more you will find that quality and value go up, cost goes down, and the less likely people will be to steal your software.
As others have mentioned, once you release the bits to users you have given up control of them. A dedicated hacker can change the code to do whatever they want. If you want something that is closer to crack-proof, don't release the bits to users. Keep it on the server. Provide access to the application through the Internet or, if the user needs a desktop client, keep critical bits on the server and provide access to them via web services.
Like others have said, there is no way of creating completely crack-proof software, but there are ways to make cracking the software more difficult; most of these techniques are actually used by bad guys to hide malware inside binaries, and by game companies to make cracking and copying their games more difficult.
If you are really serious about doing this, you could check, for example, what executable packers like UPX do. But then you need to implement the unpacker as well. I do not actually recommend doing this, but studying game protectors and binary obfuscation might help you in your quest.
First of all, in what language are you writing this?
It's true that a crack-proof program is impossible to achieve, but you can always make it harder. A naive approach to application security means that a program can be cracked in minutes. Some tips:
If you're deploying to a virtual machine, that's too bad. There aren't many alternatives there. All popular VMs (Java, the CLR, etc.) are very simple to decompile, and no obfuscator or signature is enough.
Try to decouple the UI programming from the underlying program as much as possible. This is also a great design principle, and it will make it much harder for the cracker to trace from the GUI (e.g. the "enter your serial" window) to the code where you actually perform the check.
If you're compiling to actual native machine code, you can always set the build as a release (not including any debug information is crucial), with optimization as high as possible. Also, in the critical parts of your application (e.g. where you validate the software), be sure to make the check an inline function call, so you don't end up with a single point of failure, and call it from many different places in your app.
As was said before, packers always add another layer of protection. And while there are many reliable choices now, you can end up being flagged as a false positive by some anti-virus programs, and all the famous choices (e.g. UPX) already have pretty straightforward unpackers.
There are also some anti-debugging tricks you can look into. But this is a hassle for you, because at some point you might also need to debug the release application!
Keep in mind that your priority is to make the critical part of your code as untraceable as possible. Clear-text strings, library calls, GUI elements, etc. are all footholds an attacker can use to trace the critical parts of your code.
Just want to ask for some opinions here. How do you feel about using a language (and/or framework) that isn't widely used in your location to write software for a company? For instance, I live in an area dominated by .NET, with the occasional PHP job. Let's say that I'm learning Python and decide to use it to write software for my job (I'm a "Team of One" so I can pretty much use anything I want).
Now their software is written in a language that pretty much nobody in the area uses or knows; if I were to leave the company, they'd basically have nobody to maintain/add to it unless they retained me as a consultant. While that's really good for me, it seems a bit "crooked" - granted, that's how the business world works.
What are your thoughts?
I should mention that this is a very small company and I'm the only IT person, so I have full rein to choose our development platform. I'm not specifically using Python, but chose it as an example since my area is almost entirely .NET based; I don't care for .NET anymore though, which is why I don't want to consider using it. Also, the company is... how shall we say... extremely frugal and wouldn't purchase the required resources for .NET (e.g. server licenses, SQL licenses, Visual Studio, components). I personally have an MSDN subscription but I can't use that for them.
Also FWIW there are people in the area who use the language I'm considering using (Ruby on Rails), but nowhere near as many people as .NET developers. It's not like I'm using something that only I know.
You may think that this approach is good for you. But in fact all this does is paint you into a corner. The best way to get a promotion within an organisation is to make yourself unnecessary in your current position. That might seem like nonsense, but it is in fact true. Think of it like this: if it is essential to the company that you continue to maintain the Python code you wrote for them, and they can't go to anyone else to get that skill, then they will continue to pay you (maybe a little above market rates) to maintain that code.
If, however, you write that code in .NET, where there is a plentiful supply of developers in your area, then as the company expands and the code you've written proves successful, you will be able to hire people to maintain that code and you can move on to designing other systems. Or move into managing a team of .NET coders - if that's what you want.
Even if you want to leave, the best thing for your career is going to be to get the best possible reference. To do that, write them some code that is easy to maintain. Help them hire someone to replace you to maintain it. They will be grateful and recommend you as a consultant to their friends.
Code in something esoteric - for which there is little support in your area - and they will be saying to their friends on the golf course "no don't hire that guy, he wrote this system for us which does the job, but no one else can maintain it. We're stuck with him forever and now he's too busy to look after us properly!"
Do what's best for the business, not what might be of most interest to you - or appear that way on the face of it. You'll win out in the long run.
I think that you're responsible to decide on the language that's best suited for the job. That includes an objective evaluation of the merits of the language and framework, it includes your own personal skill with the language (since you're the one doing the work) and it includes maintainability by others. Only you and your company can decide how much importance to place on each of those.
For your own personal development, if your area is dominated by .net, why don't you want to get up to speed in that instead of Python?
From an ethical standpoint, I would not write something that could not easily be maintained by someone else.
A lot of responses seem to be a poor fit for the question. We're not talking about using an unapproved language in an environment with existing standards. We're talking about a situation where the poster is the entire IT and development department for his company.
It's certainly important to keep in mind availability of talent, but Ruby is hardly a fringe language these days. In an environment where there's only one developer, productivity is also a very important consideration. Being able to build and maintain software quickly and easily without a large team requires tools with different characteristics than a large team might require.
I think what's more important than whether to use Ruby or (something else) is to try to pick something as general-purpose as practical and use it for everything unless there's a really good reason to use something else. If you go with Ruby, stick with Ruby for your utility scripts, cron jobs and that little GUI app the boss wanted to automatically SMS the intern when he takes more than five minutes to bring him his coffee.
I think using Python would be the right thing to do if it would meet the client's requirements and save them money over the alternative. Whether or not there is a wide assortment of characters to work on the application down the road is irrelevant, unless they've specified this as a non-functional requirement.
As usual, using the best tool for the job at hand will serve you well.
It indeed is a bit crooked IF you use it only for that purpose.
However, if you use it because it IS the best solution, you're in the clear.
Also, they can just hire someone else who knows python.
My work ethic doesn't allow me to do something like this just to keep myself in business.
My personal opinion is you should try where possible to respect the working practices of wherever you are - whether that's indentation style, naming convention, testing procedure or programming language.
If you feel strongly that a different language would be better suited for a certain task, then lobby to have it accepted (with the required re-training of others).
Purposefully leaving an app that no one else can maintain is very bad professional conduct, IMO.
We recently had a bad hire at my shop and he decided out of the blue he was going to use Perl instead of any version of .NET to do some simple reporting stuff (That could have just as easily been done in .NET). It was atrocious. I would suggest using the platform as specified and clearing any deviation with the people who run the joint...
Plenty of answers have touched on this, but here's my take based on production application support.
My company had a startup phase where code hustlers whipped up solutions in whatever the personal preference or flavor of the week was. Bad for maintainability and supportability.
Making a change is OK, though, as long as it's consistent. If Python is going to pave the way to the future, then go for it. Don't forget that the legacy .NET and PHP code still needs to be supported until end of life. Building yourself a hodgepodge of platforms and frameworks will just create more difficulty for you on the job, and for the company when you're no longer around.
If you feel in your heart you are acting dishonestly, then you probably are.
No one likes a dishonest person. That can't be good for your reputation.
Do your best to choose based on what is actually best, not what satisfies some underhanded motives.
It depends. At one place, I did in Java some of what would normally just be a bash script. Why? Because they're all Java programmers and frequently have interns/co-ops coming through that may or may not know anything else (and may or may not even be all that great with Java).
Other places though tend to have more experienced programmers and I expect that they'll be able to figure out another language without too much effort. So, I would go with what's "best" for the project.
I agree with what mquander says above, but you may also have to be prepared to justify why you want to use this other language to your development manager. If he/she then agrees, perhaps the language could become more widely adopted within the company.
Think of it in terms of business benefit you bring to the company, now and in the mid-term.
If you can deliver something much faster using a different technology, and it still achieves the goals, I'd go for it - but I'd still let some other people know and respect the company's final decision. If, however, it's purely for yourself, then I'd probably be a little more careful.
I think it's a really bad idea. For you, it means there's no backup in case you want to have a day (or week) off. For them, there's no one else if you leave or are taking a day off. It's a well-known ploy, and, honestly, it might be a reason not to keep you around.
However, this could also be a chance to introduce Python into the environment. You could teach others about it, and explain to management why it's a good third language to have at the group's disposal.
I used to think that you should always pick the right language for the job at work. I'm reversing my opinion though.
The problem arises when some other guy picks a language you don't want to learn. I am concerned that I might be the guy who picks the language no one else wants to learn. Just because I think that Erlang might be the right choice for something doesn't mean that everyone else will want to learn Erlang or respect my decision for using Erlang.
"if I were to leave the company, they'd basically have nobody to maintain/add to it unless they retain on me as a consultant."
Are you saying no one else can learn Python? I find that hard to believe.
New technology is often introduced in small projects by knowledgeable people and diffused through the organization because the small projects were successful.
Use Python. Be successful. Make your case based on your successes.
I had this same problem very often. Coincidentally, it was with those two languages you mention: .NET forced on me, when I preferred to use Python (among others). Could be the opposite, I don't judge.
I refrained from using Python, because of the reasons already mentioned in other answers. I did what I thought was best for the company. Using IronPython won't make your Python code any more maintainable for an inexperienced Python programmer.
However, I left the company and now I work in something more in line with my tastes. I'm much happier. In this economy you may not have this option... but it will pass. Do the right thing.
Cheers.
There is a large difference between 'prototype' or 'one-shot' code and production code. For prototyping I use whatever works fastest, but I'm very clear about its status. Production code is written in one of the approved and supported environments.
The ethical thing is to use the best tool for the job. If there is a tool that takes only 20% of the time to code in versus other choices, needs next to no maintenance, and is easy to refactor, you have a duty to pick that tool, assuming it's as extensible as the business may need.
If you do a good job, hiring future people and training them in terms of HOW your workplace does business should be the practice of any growing business. They will be able to learn the code if they're the right person for the business.
In your case I'm not sure if you want to use Python, unless it has native .NET support to allow your .NET world to interact with it.
Other posters have made some good points, but here's one I've not seen: communicate the situation to management and let them decide. In other words, talk with your boss and tell him or her that there currently are more .NET developers in your area, so that if you're hit by a bus tomorrow it would be easier to find someone else to maintain your code; however, there are tools you need to do your job more efficiently and they cost money (and tell them how much). Alternatively, you could do this in Python or RoR (or whatever) and use free tools, but from what you know, there aren't currently that many people in the area who know those languages. I've used "currently" a couple of times here because this may change over time.
Before having this conversation, it might be good to see if you can find user groups for the alternative technology in your area, and how large they are. You could also ask on listservs whether there are people who know the alternatives in your area.
Of course, the boss may tell you to keep using .NET without any tools, but in that case it's their decision to shoot themselves in the foot. (And yours to decide if you want to find a new job.)
Regarding the question as asked, I see nothing unethical about it, provided that:
It is a freely-available language. Although I am something of a FOSS partisan, that's not the point of this criterion. It needs to be freely-available (not necessarily FOSS) so that it doesn't impose costs on the company and so that others will have the opportunity to learn it if you ever need to be replaced (or if they want to compete with you for your job).
You are changing languages for solid reasons and not for the sake of creating vendor lock-in (or, if you prefer to think of it as such, "job security"). Ethics aside, you really don't want to have a job where they hate you, but are stuck with you because you're the only one who can maintain the mess you've created anyhow.
In the particular case you've described, I would suggest that switching to RoR may be the more ethical choice, as it would be decidedly unethical (not to mention illegal) to use .NET if there are required resources which are for-pay only and your employer is too frugal to purchase proper licenses for them.
When in Rome... do as the Romans.
You might not be the one who has to maintain this code in the long term, and not everyone wants to learn a "fringe language" to make bug fixes or enhancements.
I migrated some VBA stuff over to Perl for processing at a previous job and increased the efficiency by several orders of magnitude, but ultimately no-one else there was willing to learn Perl so I got stuck with that task longer than I wanted it.
I did that; it was Delphi in my case. I thought Delphi was used often; however, when I was looking for a job I saw three Delphi job offers in my whole life. I saw more Java/J2EE/PHP offers than I can remember. I think it's a bad idea: with the time I wasted learning advanced Delphi programming, I could have gotten better at J2EE, started at a better company, and maybe be making more money now.
If they can't find somebody to maintain the app, you will always be the one doing it, and when you quit they will have to rewrite it. I think the consultant arrangement is not used very often.
I used to be in the "use the best tool for the job" school, but I've changed my mind. It's not enough to just ask "how can I do this job the fastest." If you think you're the only one who will ever need to look at some code, there's a good chance you're mistaken. The total cost of introducing a new language into an environment is higher than you might imagine at first.
If you just need to produce a result, not a program, then you can use whatever you want. Say you need a report or you need to munge some files. If the output is really all that matters, say it's something you could have chosen to do by hand, you can practice using any language you want.
With the release of the MVC Framework, I too have been in a similar ethical dilemma: use WebForms, or switch over to the MVC Framework for everything? The answer really is that you have to do the right thing and use whatever the standard of the company is. If you deviate from the standard, it creates a lot of problems for people.
Think how you would feel if a VB6 project were dumped on you when all you have been doing for years is .NET. So these are the two solutions I have come up with:
Use your fun languages for consulting contracts you do on the side. Make sure the client knows what you are doing and if they agree go for it.
Try and convince your current company to migrate over to this great new language you are working with.
If you follow these routes you will learn your language and not piss anyone off in the process.
Ruby on Rails is certainly not a fringe language. If the company is too cheap to pony up for the appropriate licensing for Microsoft's tools, then you would have no choice but to find an alternative. RoR certainly would be a reasonable choice, and if it helps move your career along as well, then it's a win-win for both of you!
You can develop .NET adequately with free tools; cost is not a good reason to avoid that platform. Ruby on Rails is becoming reasonably mainstream for building data-driven internet websites. You haven't even told us if that's the sort of software you are building, though.
There is really no way with the information that you have provided that anyone can give you a single correct answer.
If you are asking is it ethical to do your work in such a way that the company is dependent upon you, of course the answer is no. If you are asking is it ethical to develop in RoR then the answer is "we don't know" - but my opinion is that probably it would be fine if its the right tool for the job.
Don't underestimate the ability of someone else to support your work or replace you, though - if you do your work reasonably well, then once the solution is in place any programmer worth their pay should be able to learn the platform well enough to maintain it. I've debugged, migrated and supported a few PHP applications, for example, without ever really learning the first thing about PHP. I'd be lost building a new PHP app from scratch and would never even try, but it's no problem to support one. I think the same would be true of the languages you mention as well - they've got the critical mass that means there are plenty of books and forums, etc. Of course, if it's written badly enough in any language then it may be difficult to support regardless of anyone's skill in the language...
So much discussion for such a clear-cut situation...
It's not up to you, it's up to them. If they're not technical enough to make the call, as it seems, then you have to make it for them in good faith. Anything less is dishonest, and I'm fairly sure that's not in your job description ;)
You've muddied the waters with all the wandering about in the thickets of personal motivations. The answer to that one is that your personal motivations are irrelevant unless and until you've formulated the business case for the possible decisions. If you've done that and the answer still isn't clear-cut, then sure, choosing the answer you like the best is one of the nice things about being in a position to make technical decisions in the first place.
As far as the actual question goes, to my mind if the most technically apt choice is also one that very few people work with, one of two things is happening: a) It's a good choice, and the number of people working with it is going to be exploding over the next 18-24 months (e.g. Django), or b) There's something wrong with my analysis. Technologies may be on the fringe because people are slow to adopt them, but that's generally not why they stay on the fringe.
If you find yourself thinking "I can't choose technology X, that'll make it easier for them to replace me!" you're in the wrong line of work. In almost any enterprise that's not actually failing, the IT guy who makes himself easy to replace tends to move up to harder and more interesting and more lucrative work.
I would not bring a new language/framework/whatever into the place unless they understood that's what I was doing, and that if I left/was fired/was hit by a bus, they'd have to find/train someone to work with it.
I have some experience in a contractor pulling in things just because he felt like it. In some cases they were the best tool for the job (in other cases they were not), but in all cases they were not the best tool for the team that had to maintain the code. In my case the contractor was a serious jerk who didn't really give a darn about anyone else and I believe WAS trying to make himself harder to replace.
In your case, talk to your bosses. If they really don't want to spend the needed money on .NET framework tools/libs, then switching to something else may well BE the right thing to do for them, long term.
And, as someone who has spent his career walking into the middle projects that others have already started - thank you for thinking before you add a new tool to the mix.