What is a hack? [closed] - security

I use the term all the time... but I was just thinking that I don't really have a solid denotational sense behind it (or at least behind the sense I want to discuss here). I'm interested in the sense of the word related to code, not the anthropomorphic idea. I'm also not interested here in the sense of the word related to intentional malicious computing (e.g. a hack to unlock secret powers in a game). What I want to explore is what it means to 'hack' in terms of writing software to solve a problem.
Wikipedia's definition of 'hack' is, to me, a bit vague, but it's a decent starting point. It considers that a hack:
can refer to a solution or method which functions correctly but which is "ugly" in its conception
works outside the accepted structures and norms of the environment
is not easily extendable or maintainable
can be slang for "copy", "imitation" or "rip-off."
These traits of a hack conform to my usage of the word -- when applied to code, it is always a term of derision. To my mind, a hack:
Is likely to be difficult to maintain and hard to understand in the context of the rest of the code.
Is likely to cause failure of the app.
Tends to indicate a poor understanding by the coder of the problem space, of the language, or of both.
Tends to be the byproduct of aggressive schedules.
Suggests potential changes in requirements that have not been fully incorporated into the architecture of the solution (requiring an 'inorganic' workaround).
Smells.
All bad, bad, bad. To me, a hack in this sense is always negative, indicating either lack of time, incompetence, or sloth on the part of the developer, though a decent percentage of hacks must be written to compensate for ill-conceived designs or for systems that have gained requirements which their original design cannot handle 'organically'.
I don't think I've really captured it totally though--it's like pornography a bit: I can't really define it, but I know it when I see it. So I ask you: what does it mean to 'hack' when you are trying to solve a problem in software?

I've always preferred Paul Graham's definition:
To add to the confusion, the noun "hack" also has two senses. It can be either a compliment or an insult. It's called a hack when you do something in an ugly way. But when you do something so clever that you somehow beat the system, that's also called a hack. The word is used more often in the former than the latter sense, probably because ugly solutions are more common than brilliant ones.

From the Jargon File, the glossary of hacker slang:
The Meaning of ‘Hack’
“The word hack doesn't really have 69 different meanings”, according to MIT hacker Phil Agre. “In fact, hack has only one meaning, an extremely subtle and profound one which defies articulation. Which connotation is implied by a given use of the word depends in similarly profound ways on the context. Similar remarks apply to a couple of other hacker words, most notably random.”
Hacking might be characterized as ‘an appropriate application of ingenuity’. Whether the result is a quick-and-dirty patchwork job or a carefully crafted work of art, you have to admire the cleverness that went into it.
An important secondary meaning of hack is ‘a creative practical joke’. This kind of hack is easier to explain to non-hackers than the programming kind.

When I think of "hack", I think of it as being a non-expected workaround to solve a problem, not necessarily a bad thing. Creative, innovative, and well-placed. "Hack" can apply to more than just computers, though I seldom hear it used that way.

Too often "hack" simply means: "Not the way I would do it."

This topic will turn into something like a question about love. Everyone's gonna have their own definition. The best way to know the proper (default) definition is to look in the dictionary.

It's when you've stepped out of the idiomatic, natural, sensible and (sometimes) supported ways of doing something in a given language/framework/etc.
Sometimes that's a stroke of genius, usually it's an act of idiocy, occasionally it's one disguised as the other, and on rare occasions it's both.
(Incidentally, the judge who coined that statement about pornography you quote later retracted it in another ruling.)

When I use the term 'hack', it usually refers to a solution that was put together in response to a pressing issue, so not a lot of thought went into it with regard to the overall design of the application. Sometimes it works out, sometimes not so much, and sometimes it turns out to be a work of genius. But mainly, it's an admitted temporary solution that (hopefully) gets refactored and refined when possible.

Here's a great sentence I saw about the difference between hacking and scamming: "Hacking attacks are successful when the criminal knows how a particular computer system works. Scams are successful when the perpetrator knows how the human brain works." It brings out the idea that to hack into something, you need a deep understanding of how it works.

Related

What's the difference between hacks and messy/bad code [closed]

When is code a hack?
People seem to define a hack as ugly coding to solve a problem, but how is that different from writing messy code?
Also, is the only difference between a badly coded solution and a hacked one the mindset of the programmer?
When I say hack I mean in the programming/development sense not the illegal sense.
A hack is a section of code you write to overcome a technology deficiency such as your programming language, communications protocol, hardware, or some other programmer's bug. You usually tag your code as a hack to let other people know you could have done it the "right" way if you just didn't have this limitation.
That being said, the term is often misused and simply refers to a section of code where the programmer was too lazy to do it the "right" way, or where the code seems to work for what they designed but they aren't sure of the unintended consequences. For example: one may "hack" code if it was poorly designed and they don't understand what the change is really going to do to the entire system. That isn't really a hack, it is just a lack of understanding.
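A minimal sketch of what such tagging can look like in practice (in Haskell, and the upstream parser limitation here is purely hypothetical, invented for illustration):

    import Data.Char (isSpace)

    -- HACK: the (hypothetical) upstream parser chokes on trailing whitespace,
    -- so strip it ourselves until that bug is fixed. The "right" fix belongs
    -- in the library, not here.
    trimEnd :: String -> String
    trimEnd = reverse . dropWhile isSpace . reverse

    parseId :: String -> Int
    parseId = read . trimEnd

    main :: IO ()
    main = print (parseId "42   \n")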
A hack is an ugly solution which may be implemented in well documented, perfectly formatted code with exquisitely named variables and all that. As they say, you can put lipstick on a pig, but it's still a pig.
On the other hand, you can also have a messy, hard to read implementation of a beautiful algorithm. Poorly chosen names, bad formatting, and poor documentation make code harder to understand, but the underlying idea may still be sound. This sort of thing isn't lipstick on a pig, it's a diamond in the rough.
A hack implies that the programmer is using a system in a way it was not designed for. For a hack to take place, the system must have a clear way it was designed. This design is then violated by the hack.
Ugly coding usually implies there is no clear design to hack. A hack is no longer a hack if it is just as ugly as the surrounding code.
Normally hacks are quick and dirty solutions which the programmer intends to rectify some day. Ugly programming has a way of never getting fixed.

How to counter the "one true language" perspective? [closed]

How do you work with someone when they haven't been able to see that there is a range of other languages out there beyond "The One True Path"?
I mean someone who hasn't realised that the modern software professional has a range of tools in his toolbox. The person whose knee-jerk reaction is, for example, "We must do this in C++!" "Everything must be done in C++!"
What's the best approach to open people up to the fact that "not everything is a nail"? How may I introduce them to having a well-equipped toolbox, selecting the best tool for the job at hand?
As long as there are valid reasons for it to be done in C++, I don't see anything wrong with this monolithic approach.
Of course a good programmer must have many different tools in his/her toolbox, but those tools don't need to be new languages; it can simply be about learning new programming paradigms.
In my experience, learning many different languages doesn't by itself make you a much better programmer at all.
This is also true with finding the right language for the job. Yeah ok, if you're doing concurrency you might want a functional language rather than an Object Oriented language, but what are the gains of using another programming language?
At the end of the day; "Maintenance".
If it can be maintained without undue problems then the debate may well be moot and comes down to preference or at least company policy/adopted technology.
If that is satisfied then the debate becomes "Can it be built efficiently to be cost effective and not cause integration problems?"
Beyond that it's simply the screwdriver/build a house argument.
Give them a task which can be done much more easily in some other language/technology, and which is hard to do in the language/technology that they are suggesting for everything.
This way they will eventually search for alternatives as it gets harder and harder for them to accomplish the task using the language/technology that they know.
Lead by example, give them projects that play to their strengths, and encourage them to learn.
If they are given a task that is obviously better suited to some other technology and they choose to use a less effective language, don't accept the work. Tell them it's not an appropriate solution to the problem. Think of it as no different than them choosing COBOL to replace a shell script -- maybe it works, but it will be hard to maintain over time, take too long to develop, require expensive tools, etc.
You also need to take a hard look at the work they do and decide if it's really a big deal or not if it's done in C++. For example, if you have plenty of staff that knows that language and they finished the task in a decent amount of time, what's the harm? On the other hand, if the language they choose slows them down or will lead to long term maintenance problems they need to be aware of that.
There are plenty of good programmers who only know one language well. That fact in and of itself can't be used to determine whether they are a valuable member of a team. I've known one-language guys who were out of this world, and some that I wouldn't have on a team if they worked for free.
Don't hire them.
Put them in charge of a team of COBOL programmers.
Ask them to produce a binary that outputs an infinite Fibonacci sequence.
Then show them the few lines (or 1 line, depending on the implementation) it takes in Haskell, and that it too can be compiled into a binary so there are better ways forward.
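For reference, here is the kind of program that answer has in mind; thanks to lazy evaluation, the infinite Fibonacci sequence really is a one-liner in Haskell, and it compiles to an ordinary binary with GHC (the names below are just illustrative):

    -- The entire infinite Fibonacci sequence, defined lazily in one line.
    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

    -- Printing it streams numbers forever; compile with `ghc Fibs.hs` to get a binary.
    main :: IO ()
    main = mapM_ print fibs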
How may I introduce them to having a well-equipped toolbox, selecting the best tool for the job at hand?
I believe that the opposite of "one true language" is "polyglot programming", and I will then refer to another answer of mine:
Is polyglot programming important?
I actually doubt that anybody nowadays can complete a project in one and only one language (even though there might be exceptions). The easiest way to show them the usefulness of specific tools and languages is to show them that they are already using several, e.g. SQL, build files, various XML dialects, etc.
Though I embrace the polyglot perspective, I also believe that in many areas "less is more". There is a balance to find between the number of languages/tools, the learning curve, and the overall productivity.
The challenge is to decide which small set of languages/tools fit nicely together in your domain and will push productivity and creativity to new limits.
Give them a screwdriver and tell them to build a house?

sentimental code [closed]

I've come across an article that discusses the issue of "code admiration". Basically, the author talks about how developers should be more skeptical about the code they write: how we may "admire" our code too much, attach ourselves to it, making us more vulnerable to bugs and other mishaps that may be lying in front of us.
How do you feel about this issue? And do you have more tips on how to avoid/be more aware of this issue?
Some years ago, I was working with someone else on a small "hobby" project, and I realised that we had to re-assess things. We had written lots of code, but it wasn't all good code.
We didn't really want to "throw away" all the work we had put in. But I realised something: what mattered most was the amount of work we would need to put in from now on.
We couldn't change the fact that we had already put so much work into the project, so the only way to minimise the total amount of work the project would need, would be to minimise the amount of work we hadn't done yet.
Since that day, I have stopped being attached to my code. If I'm confident that throwing it away and starting from scratch means less work than keeping it and adapting it to my needs, then I'll throw it away.
My high school art teacher used to encourage us to take what we considered to be our best drawings and tear them up; he called this "cleansing the soul". His reasoning was that, as artists, we were driven to create works of art, and any time we produced something that we liked and that gave us satisfaction, our motivation to continue creating would be lessened.
So I followed his advice and tore up my best stuff, and it worked. Instead of spending my time admiring my old work, I created new stuff and continually got better. I've tried to follow the same principle with my code, but it doesn't really work: my computer has a tough plastic shell that is nearly impossible to tear through.
I'll post a fragment from Jeff Atwood's blog post, Sucking Less Every Year, with which I agree 100%:
I've often thought that sucking less every year is how humble programmers improve. You should be unhappy with code you wrote a year ago. If you aren't, that means either A) you haven't learned anything in a year, B) your code can't be improved, or C) you never revisit old code. All of these are the kiss of death for software developers.
We sure like to admire our nice code, but it's not always easy to know what to admire. Complicated and elaborate code is sometimes mistaken for admirable code, whereas elegance and simplicity are what we should really strive for.
Two quotes come to mind:
"Debugging is twice as hard as writing
the code in the first place.
Therefore, if you write the code as
cleverly as possible, you are, by
definition, not smart enough to debug
it.”
-- Brian Kernighan
and
"Make everything as simple as
possible, but not simpler."
-- Albert Einstein
Jonathan Edwards wrote an impressively beautiful essay on this subject, prompted by the work on the O'Reilly book Beautiful Code. Here's the final paragraph, but the rest of the essay is also worth reading.
Another lesson I have learned is to distrust beauty. It seems that infatuation with a design inevitably leads to heartbreak, as overlooked ugly realities intrude. Love is blind, but computers aren’t. A long term relationship – maintaining a system for years – teaches one to appreciate more domestic virtues, such as straightforwardness and conventionality. Beauty is an idealistic fantasy: what really matters is the quality of the never ending conversation between programmer and code, as each learns from and adapts to the other. Beauty is not a sufficient basis for a happy marriage.
Other versions of this same wisdom exist in other fields. Samuel Johnson, about writing:
Read over your compositions, and wherever you meet with a passage which you think is particularly fine, strike it out.
William Faulkner's version of this was much more succinct: “Kill your darlings.”
My father-in-law works as a film editor, and he studiously avoids the set where the film is being shot. When he does have to visit, he shields his eyes as much as he can. This is because when he decides whether or not to include a scene in the final film, he doesn't want to be influenced by how much effort it took to shoot the scene. What matters is how well the scene works in the final film.
My essay, "My evolution as a programmer" (which I would link to if I weren't a new user), is largely about learning skepticism about the code I'd written: whether it works, whether it's useful, whether it's comprehensible (pair programming was a real wake-up call here). It's hard!
I never admire my own code. I admire other people's code that I "borrow", and I try to emulate or better it, and I find that the more I know, especially about coding, the more I find I don't know. The only thing of value would be for peer programmers to admire my code and borrow it.
I think he has a good point. It's frustrating to work with people that have too much of this, as it really hinders teamwork and getting to the best solution to the problem.
As I can be a bit delusional, I try to put practices in place that will keep me grounded in reality. For code,
unit tests: These keep me more focused on what the code is supposed to do, as opposed to any abstract "beauty".
shared code ownership: There are two camps here: give people more ownership of their code and hope pride takes over, or give them less and let peer pressure come into play. I believe that giving people more ownership can lead to this code admiration. We practice shared code ownership, so I am constantly forced to see someone rewrite my perfect code to make it better (in their mind). I quickly realized admiring it too much was a waste of time and emotionally difficult.
pair programming: working side-by-side with someone will keep you realistic.
other feedback: These are all feedback loops, but there are others. There's no better way to see if something works than by watching someone (try to) use it. Put your work in front of as many people as possible. Have code reviews. Read other people's code. Run static code analysis tools.
I'm with PurplePilot - I don't admire my own code, and as such I'm constantly searching for new, more efficient (hell, easier) ways of doing the same thing. I like the Effective C# book; I picked up lots of useful code from there that I admire.
I would have no hesitation about throwing code away and starting again, but not necessarily from scratch, i.e. by writing some code for a specific scenario and then throwing it away, you'll probably have a better grasp of the scenario. In other words, it's a "wicked problem", or you've found another way that doesn't work a la Edison.
It raises a wider question: if code isn't thrown away, or at least revisited, is developing on libraries that are becoming stagnant a good thing?
There is nothing wrong with admiring your code ... this is part of the positive reinforcement process that will motivate you to write more and better code in the future.
However, misplaced or misused admiration can be a problem. If the code is really not good, or has bugs that haven't been exposed by unit or other testing, or needs refactoring/redesign/replacement, then this misplaced admiration is a problem. And using admiration as an excuse to skip part of the process - such as code reviews - or not to have a skeptical attitude towards code is a misuse of admiration.
Like anything else that is good, admiration of code can be misplaced or misused - it doesn't mean that it in itself is bad. That would be like saying "religion is a bad thing, because it causes conflicts and wars between people".
Two words: code review.
Gather two or more fellow developers and invite them to review/criticize/comment on your code. 'twill shed some (admittedly harsh) light on your code.
It's perhaps better to have a healthier perspective - we aren't rocket scientists, and we aren't curing cancer - it's just work.
(Yes, it's reasonable to be proud of an entire building you helped build if you're an architect, but do they really have a lot of their self-esteem wrapped up in an individual blueprint, or a closet on floor 3 they designed by themselves?).

How to implement code in a manner that lessens the possibility of complete re-works [closed]

I had a piece of work thrown out due to a single minor spec change that turned out not to have been spec'ed correctly. If it had been done right at the start of the project, most of that work would never have been needed in the first place.
What are some good tips/design principles that keep these things from happening?
Or that lessen the amount of reworking of code that is needed in order to implement feature requests or design changes mid-implementation?
Modularize. Make small blocks of code that do their job well. However, that's only the beginning. It's usually a large combination of factors that contributes to code so bad it needs a complete rework: everything from highly unstable requirements, to poor design, to lack of code ownership; the list goes on and on.
Adding on to what others have brought up: COMMUNICATION.
Communication between you and the customer, you and management, you and the other developers, you and your QA department - communication between everyone is key. Make sure management understands reasonable timeframes, and make sure both you and the customer understand exactly what it is that you're building.
Take the time to keep communication open with the customer that you're building the product for. Make milestones and set up a time to show the project to the customer at each milestone. Even if the customer is completely disappointed with a milestone when you show it, you can scratch what you have and start over from the last milestone. This also requires that your work be built in blocks that work independently of one another, as Csunwold stated.
Points...
Keep open communication
Be open and honest with progress of product
Be willing to change daily as the needs of the customer's business and the specifications for the product change.
Software requirements change, and there's not much one can do about that except for more frequent interaction with clients.
One can, however, build code that is more robust in face of change. It won't save you from throwing out code that meets a requirement that nobody needs anymore, but it can reduce the impact of such changes.
For example, whenever this applies, use interfaces rather than classes (or the equivalent in your language), and avoid adding operations to the interface unless you are absolutely sure you need them. By building your programs that way you are less likely to rely on knowledge of a specific implementation, and you're less likely to implement things that you would not need.
Another advantage of this approach is that you can easily swap one implementation for another. For example, it sometimes pays off to write the dumbest (in efficiency) but the fastest to write and test implementation for your prototype, and only replace it with something smarter in the end when the prototype is the basis of the product and the performance actually matters. I find that this is a very effective way to avoid premature optimizations, and thus throwing away stuff.
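As a rough sketch of that idea (in Haskell, using a type class as the "interface"; the Store class and both implementations are invented for the example), callers depend only on the small interface, so the quick-and-dirty implementation can later be swapped for a smarter one without touching them:

    import qualified Data.Map as Map

    -- A deliberately small interface: callers depend only on these two operations.
    class Store s where
      put :: String -> String -> s -> s
      get :: String -> s -> Maybe String

    -- The dumbest-but-fastest implementation: an association list.
    newtype ListStore = ListStore [(String, String)]

    instance Store ListStore where
      put k v (ListStore xs) = ListStore ((k, v) : xs)
      get k (ListStore xs)   = lookup k xs

    -- A smarter implementation, swapped in later when performance starts to matter.
    newtype MapStore = MapStore (Map.Map String String)

    instance Store MapStore where
      put k v (MapStore m) = MapStore (Map.insert k v m)
      get k (MapStore m)   = Map.lookup k m

    -- Code written against the interface works with either implementation unchanged.
    demo :: Store s => s -> Maybe String
    demo s = get "answer" (put "answer" "42" s)

    main :: IO ()
    main = print (demo (ListStore [])) >> print (demo (MapStore Map.empty))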
Modularity is the answer, as has been said. But it can be a hard answer to use in practice.
I suggest focusing on:
small libraries which do predefined things well
minimal dependencies between modules
Writing interfaces first is a good way to achieve both of these (with interfaces used for the dependencies). Writing tests next, against the interfaces, before the code is written, often highlights design choices which are un-modular.
I don't know whether your app is UI-intensive; that can make it more difficult to be modular. It's still usually worth the effort, but if not, then assume that it will be thrown away before long and follow the iceberg principle: 90% of the work is not tied to the UI and so is easier to keep modular.
Finally, I recommend "The Pragmatic Programmer" by Andrew Hunt and Dave Thomas, which is full of tips. My personal favourite is DRY -- "don't repeat yourself": any code which says the same thing twice smells.
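A tiny, hypothetical illustration of DRY (the prices and function names are made up): the smell is the repeated expression, and the fix is to name it once.

    -- Before: the same discounting formula is written out twice,
    -- so a change to the discount has to be made in two places.
    priceWithTax :: Double -> Double
    priceWithTax p = p * 0.9 + p * 0.9 * 0.2

    priceWithTaxAndShipping :: Double -> Double
    priceWithTaxAndShipping p = p * 0.9 + p * 0.9 * 0.2 + 4.95

    -- After: the repeated idea is named once and reused.
    discounted :: Double -> Double
    discounted p = p * 0.9

    priceWithTax' :: Double -> Double
    priceWithTax' p = discounted p * 1.2

    priceWithTaxAndShipping' :: Double -> Double
    priceWithTaxAndShipping' p = priceWithTax' p + 4.95

    main :: IO ()
    main = print (priceWithTax 100, priceWithTax' 100)  -- both print 108.0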
iterate small
iterate often
test between iterations
get a simple working product out asap so the client can give input.
Basically assume stuff WILL get thrown out, so code appropriately, and don't get far enough into something that having it be thrown out costs a lot of time.
G'day,
Looking through the other answers here I notice that everyone is mentioning what to do for your next project.
One thing that seems to be missing, though, is having a wash-up to find out why the spec was out of sync with the actual requirements needed by the customer.
I'm just worried that if you don't do this then, no matter what approach you take to implementing your next project, if you've still got that mismatch between the actual requirements and the spec, you'll once again end up in the same situation.
It might be something as simple as bad communication or maybe customer requirement creep.
But at least if you know the cause, you can try to minimise the chances of it happening again.
Not knocking what other answers are saying and there's some great stuff there, but please learn from what happened so that you're not condemned to repeat it.
HTH
cheers,
Sometimes a rewrite is the best solution!
If you are writing software for a camera, you could assume that the next version will also do video, or stereo video or 3d laser scanning and include all hooks for all this functionality, or you could write such a versatile extensible astronaut architecture that it could cope with the next camera including jet engines - but it will cost so much in money, resources and performance that you might have been better off not doing it.
A complete rewrite for new functionality in a new role isn't always a bad idea.
Like csunwold said, modularizing your code is very important. Write it so that if one piece falls prone to errors, it doesn't muck up the rest of the system. This way, you can debug a single buggy section while being able to safely rely on the rest.
Beyond this, documentation is key. If your code is neatly and clearly annotated, reworking it in the future will be infinitely easier for you or whoever happens to be debugging.
Using source control can be helpful too. If you find a piece of code doesn't work properly, there's always the opportunity to revert back to a past robust iteration.
Although it doesn't directly apply to your example, when writing code I try to keep an eye out for ways in which I can see the software evolving in the future.
Basically I try to anticipate where the software will go, but critically, I resist the temptation to implement any of the things I can imagine happening. All I am after is trying to make the APIs and interfaces support possible futures without implementing those features yet, in the hope that these 'possible scenarios' help me come up with a better and more future-proof interface.
Doesn't always work, of course.

How much code can a programmer be intimately familiar with? [closed]

Are there any statistics for this? I realize it must vary from person to person, but it seems like there should be a general average.
The reason I ask is that the company I contract for has multiple software products, totaling ~75,000 lines of code - and they seem disappointed and shocked when they ask me a question about a specific portion that I don't immediately know the answer to (I am the only programmer they have, and did not author the majority of the systems). They think I should just know it all from memory. So I wanted something like a statistic to show them that an average programmer couldn't possibly have all that in his head at one time. Or should I?
You should remember where to find the needed stuff, not remember the stuff itself.
You should also be familiar with code structure and architecture enough to make an educated guess where a problem might originate and where you could possibly find the stuff you know exist but not sure where exactly.
Your brain works like a cache. The stuff you used recently is kept there; older entries are erased. There will never be enough memory to remember the code all at once, because then you would also want to remember all the API functions, then all the specs, then something else. That just isn't feasible.
And their being surprised that you don't remember all the code is probably one more instance of those perverse notions of how programmers do things. Ignore them.
It depends not only on your memorization skills, but also a lot on the code. Obviously, clean, idiomatic code is much easier to memorize than a badly written inconsistent mess.
Probably because clean code can be broken down into much larger "abstract tokens".
An interesting question indeed, but I doubt there is an adequate answer at all. Here are just the obvious factors I see right from the start:
Overall design quality. Even if you are new to a well-designed codebase, you can very quickly identify where you should look to get answers.
Project documentation quality. In poorly documented projects, even developers who have been on the project from the start can't say anything about some parts.
Implementation quality. OK, you have a good general architecture and good documentation for the interfaces, but even one really bad programmer could break all of this. This is why many companies are very strict about code reviews; I think it is the only technique that prevents such a situation.
Programmer experience. As you move ahead, you recognise a number of already-known code "bricks" in software that is new to you, and experience is a great help in this. Contractors are often very experienced specialists familiar with various approaches, which gives the average contractor the ability to move much faster than a full-time programmer who is brilliant but has worked for 10 years in only one project's context.
General smartness. In my opinion this is not as important as most of the other factors, but it still matters.
... but the common problem is that companies often hire contractors to improve existing software and simply think it's like asking someone to hang a picture on the wall. You should negotiate to make them understand that part of the work is understanding what really needs to be done to meet their requirements at all, and that such "learning" requires resources and is part of the work itself. But I think this is slightly off-topic for Stack Overflow (even though I voted the question up ;) ). Is it more of a discussion for startups?
Even if you have written all that code you might forget portions of it. But you'll be able to recall it once you review it.
I think it's natural for a programmer to forget some portions of his/her code after a long time.
Ask them how they want you to spend your time: surveying vast amounts of code you didn't write and perhaps writing up internal documentation, or whatever currently keeps you occupied. It's not a facetious question. If they want a quicker response to new issues, they need to invest in research.
I don't think there's a meaningful answer to this measured in LOC. As a manager, what I want to know is that someone in your situation can answer a question in a reasonable amount of time -- and unless I know you're in the middle of something, I wouldn't expect that 'reasonable amount of time' to be 'instantaneously'.
You should be able to understand all the components within the system and how they interact so that when there is a problem you can isolate one or two likely components and drill down.
I find it helpful to draw a few diagrams and keep them handy so I can use them to communicate with my boss/customer as well as jog my memory.

Resources