What is the best-argued proof of the statement "IE6 is not standards compliant"? - internet-explorer-6

I've met a new friend. She's a designer, and she has a strange attitude towards IE 6 (and older): she just LOVES it. Her strong argument is: "when I started building websites, there were no 'correct' browsers", so she believes that IE is the most correct browser there is. I'm a programmer and have always been wary of doing design work, but I'm not new to CSS, HTML and JavaScript. Professionally, though, I lack serious proof that IE never really tried to follow any standards.
And now I have trouble defending my position. Much of the professional literature I've read says that IE seriously violated standards in HTML, CSS and JavaScript alike, but I can't find solid evidence of this. Can you help in this quest? :)

With all due respect, you're both probably wrong.
First off, it's probably not subjective or controversial to say that IE6 is horrible.
But at the time it was released, almost ten years ago, IE6 was AFAIK one of the most standards-compliant browsers out there in terms of CSS support. The problem is that the standards have evolved significantly since those draft revisions, and there has never been a push to auto-update the browser the way Firefox and Chrome do. That has left a significant installed base of IE6 around.
She probably loves the browser because she has let her CSS skills lapse.

Saying "it was good when i started" is not a strong argument. She reminds me of a point that Crockford made in his talks, that the main opponents to technology, the ones that were the most prone to oppose change was us, the power users... "I never use this feature, but still get things done therefore this feature is crap" is a common bias, even when you know about it and look out for it...
IE6 was the very best at a time. Not anymore. Things change... As pointed, the ACID test, the user experience and most of all the non-implementation of all the standards that could help us advance are all valid reasons for axing IE6
But anyway, is it important that your lady friend likes IE6? Do you feel strongly about it? Perhaps you should let her like what she wants and not think too much about it :)

Well, Acid2 is a test page that checks whether a browser follows certain web standards.
For further reading have a look at http://en.wikipedia.org/wiki/Acid2 .
Internet Explorer has always displayed web pages differently from other browsers, and probably always will ( ;-) ).

Suggest that she try CSS Zen Garden's "Gemination" design in both IE6 and Firefox.
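If a side-by-side design comparison feels too abstract, a small self-contained test page can make the same point. Below is a minimal sketch (the markup, text and class names are mine, not from any official test suite) that exercises a few CSS 2.1 features IE6 is widely documented not to support: the child and attribute selectors, :hover on non-anchor elements, min-height, and position: fixed. Open it in IE6 and in Firefox and compare.

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <title>IE6 CSS 2.1 spot checks</title>
  <style type="text/css">
    /* Child combinator: IE6 ignores this rule entirely. */
    ul > li { color: green; }

    /* Attribute selector: also unsupported in IE6. */
    input[type="text"] { border: 2px solid green; }

    /* :hover on a non-anchor element: no effect in IE6. */
    div.card:hover { background: #cfc; }

    /* min-height and position: fixed: both unsupported in IE6. */
    div.card { min-height: 100px; border: 1px solid #999; }
    div.banner { position: fixed; top: 0; right: 0; background: #fc9; }
  </style>
</head>
<body>
  <div class="banner">I should stay put while you scroll.</div>
  <ul><li>This text should be green.</li></ul>
  <input type="text" value="I should have a green border." />
  <div class="card">Hover me: I should turn light green and be at least 100px tall.</div>
</body>
</html>
```

In Firefox (or any current browser) every rule applies; in IE6 none of them do, which is about as direct a demonstration of the CSS 2.1 gap as you can get.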

Run the Acid test; it is the best proof. Here is a full browser comparison test: http://www.pcgameshardware.com/aid,687738/Big-browser-comparison-test-Internet-Explorer-vs-Firefox-Opera-Safari-and-Chrome-Update-Firefox-35-Final/Practice/

The problem is that IE6 is still the standard browser for a lot of users. If a design is created on IE6 without pushing the limits much, then it is likely to look OK in all browsers.
So as a tool for a web designer, IE6 may be the best browser. (Just don't ask me to use it on my personal PC!)

Related

How many websites comply with web standards

Does anyone know where to find statistics on how many websites use web standards?
I've read multiple books on web standards, and all of them give very unspecific figures on how many sites comply with web standards.
Books like "Designing with Web Standards", "CSS Mastery" and "HTML and Web Standards Solutions" all contain qualifiers like "some", "more and more" and "not nearly enough" without a source for this data.
Of course I realise there won't be an exact number, but an indication would be great. I feel like those writers must have gotten their data from somewhere.
It would be great if any of you could point me in the right direction.
There's a very good reason that those books you read only provided indeterminate quantities like "some", "more", and "not nearly enough". It's because this type of data is not easily obtainable, not by the authors nor by us on Stack Overflow. Anything you might find would be idle speculation at best, and more likely tainted by some strong personal biases.
Those writers did get their data from somewhere, but I can't provide any more detail because this is a family-friendly site. Scott Adams used a similar theme to great success in one of his famous comics.
Fortunately, even if you were able to find this data, it wouldn't be particularly relevant. The decision to comply or not to comply with web standards should not be made based on what everyone else is doing.
As has been mentioned, there won't be any such list available. However, I believe most web developers and designers have a good 'feel' for how many of the sites they've looked at conform to the standards, and my hunch is that the number is pretty low (based on my personal experience).
If you are really keen on figures, maybe take the top 100 websites on the web and validate them all. It shouldn't take too long, and you'd get a feel for how many conform to at least basic valid (X)HTML.
Since browsers don't all follow the standards in the same way, websites are forced to use non-standard code to ensure a consistent feature set and look and feel. The situation has improved massively in recent years but is still far from ideal.
You also have the problem that even agreeing on what the standards are can take a very long time, let alone waiting for everyone to comply. The current pace of development of modern browsers means the standards will always be playing catch-up somewhat.
Sites such as validator.w3.org, quirksmode.org, acid3.acidtests.org and html5test.com are great for keeping up with the latest cross-browser techniques and browser support, but attempting to code a website while staying strictly inside a set of outdated, rigid standards looks like a losing game.

Bug tracking for legacy physics models

I am the lone software engineer on a team that develops physics models (approx 30,000 lines of code). The rest of the team consists of scientists who have been developing their codebases for about 20 years. My workflow goes something like this:
1. Scientist requests a new feature
2. I implement it
3. Via testing & validation, I find a serious problem somewhere deep within the numerics
4. Scientist requests a new feature (without fixing the problems identified in #3)
Our problem seems to be that bug tracking is done via e-mail and post-it notes. Busy work schedules let bugs slip under the radar for months and months. I think some formalized bug tracker (e.g. Trac, Redmine, Jira, FogBugz) could help us. The following features are essential:
Incredibly easy to use
Integrate with version control software (we use Subversion)
There are plenty of posts that suggest which bugtracker is "best"... but I suppose that I am more interested in:
In your experience, is the overhead of a bug tracker worth it?
How do you convince a physicist (who follows poor software engineering "best practices" from the 70s) that a bug tracker is worth the extra effort?
I get the feeling that if I install a bug tracker, I'll be the sole user. Has anyone else experienced this? Is it still useful? It seems like the team would need a certain amount of "buy-in" to make a bug tracker worth the additional overhead.
Bug trackers are definitely worth it, in part because they formalize the workflow required to implement new features and fix bugs. You always have a central place for your workload ("My bugs", "My tasks", etc.). Pretty much every environment I've worked in over the last few years has had a bug tracker of some sort, so I'm not sure what to recommend in terms of buy-in. Do you have more than one scientist coming to you for feature requests/bug fixes? If so, then perhaps you could use the bug tracker as a conflict resolution system of sorts. Do you have a boss/manager? Then a bug tracking system would also provide a lot of insight for your boss.
In general, as a software developer, I have found bug trackers very useful. My suggestion would be to think of ways a bug tracker would make your and your coworkers' lives easier. Maybe do a quick demo.
HTH.
In my experience, the overhead of a bugtracker is noticeable but definitely worth it! The catch is that if you decide to use a bug tracker, it can only succeed if everyone uses it. Being the sole user of such a system is not quite as useful.
Having said that, even if I am the sole user (which tends to happen a lot), I still install the bug tracker (typically Trac). If you use it religiously (enter everything that comes in through other channels as a bug, and ALWAYS refer to the bug # in your replies), the team generally tends to pick it up over time.
Enter milestones (or whatever your tracker of choice calls them) and link bugs to them. Whenever someone asks what the progress of something is, call up the milestone report or equivalent and SHOW THEM. This helps convert people from thinking of the bug tracker as a nuisance to realizing that it can be a source of invaluable information.
I suggest taking a look at Strategy 2 in this Joel On Software article. He basically argues that if your company doesn't use bug tracking software, you should just start using it for yourself, and demonstrate how it helps get things done. Also ask other people to use it to submit bugs so they see how easy it is to use.
Even if you're the sole user (it happened to me once), it's worth it. You can start saying things like, "Bug 1002 is blocking. Who can help me with that so we can move on to this and that feature."
We found Redmine to be better than Trac simply because it is easier to use. It does lack some of the features found in other systems, but that also means there is less for non-programmers to have a problem with. It's also very nice because it allows someone other than the programmers to get a feel for the current state of the system. If there is a large number of critical unclosed bugs, it is easier to make people understand that their requested feature will have to wait a little.
This is a similar question.
What's the Most Effective Workflow Between People Who Develop Algorithms and Developers?
It does NOT speak to which bugtracker is best, but it does speak to how to convince the physicists to buy-in.
Using Subversion? Here's a Slashdot post that is helpful:
Best Integrated Issue-Tracker For Subversion?
And in general, here's a Comparison of Issue Tracking Systems.

Ethics of using a "fringe" language for your job? [closed]

Just want to ask for some opinions here. How do you feel about using a language (and/or framework) that isn't widely used in your location to write software for a company? For instance, I live in an area dominated by .NET, with the occasional PHP job. Let's say that I'm learning Python and decide to use it to write software for my job (I'm a "Team of One" so I can pretty much use anything I want).
Now their software is written in a language that pretty much nobody in the area uses or knows; if I were to leave the company, they'd basically have nobody to maintain or add to it unless they retain me as a consultant. While that's really good for me, it seems a bit "crooked" - granted, that's how the business world works.
What are your thoughts?
I should mention that this is a very small company and I'm the only IT person, so I have free rein to choose our development platform. I'm not specifically using Python, but chose it as an example since my area is almost entirely .NET based; I don't care for .NET anymore, though, which is why I don't want to consider using it. Also, the company is... how shall we say... extremely frugal and wouldn't purchase the required resources for .NET (e.g. server licenses, SQL licenses, Visual Studio, components). I personally have an MSDN subscription but I can't use that for them.
Also FWIW there are people in the area who use the language I'm considering using (Ruby on Rails), but nowhere near as many people as .NET developers. It's not like I'm using something that only I know.
You may think that this approach is good for you. But in fact all this does is paint you into a corner. The best way to get promoted within an organisation is to make yourself unnecessary in your current position. That might seem like nonsense, but it is in fact true. Think of it like this: if it is essential to the company that you continue to maintain the Python code you wrote for them, and they can't go to anyone else to get that skill, then they will continue to pay you (maybe a little above market rates) to maintain that code.
If, however, you write that code in .NET, where there is a plentiful supply of developers in your area, then as the company expands and the code you've written proves successful, you will be able to hire people to maintain it and you can move on to designing other systems. Or move into managing a team of .NET coders, if that's what you want.
Even if you want to leave, the best thing for your career is going to be to get the best possible reference. To do that, write them some code that is easy to maintain. Help them hire someone to replace you to maintain it. They will be grateful and recommend you as a consultant to their friends.
Code in something esoteric - for which there is little support in your area - and they will be saying to their friends on the golf course "no don't hire that guy, he wrote this system for us which does the job, but no one else can maintain it. We're stuck with him forever and now he's too busy to look after us properly!"
Do what's best for the business, not what might be of most interest to you - or appear that way on the face of it. You'll win out in the long run.
I think that you're responsible to decide on the language that's best suited for the job. That includes an objective evaluation of the merits of the language and framework, it includes your own personal skill with the language (since you're the one doing the work) and it includes maintainability by others. Only you and your company can decide how much importance to place on each of those.
For your own personal development, if your area is dominated by .net, why don't you want to get up to speed in that instead of Python?
From an ethical standpoint, I would not write something that could not easily be maintained by someone else.
A lot of responses seem to be a poor fit for the question. We're not talking about using an unapproved language in an environment with existing standards. We're talking about a situation where the poster is the entire IT and development department for his company.
It's certainly important to keep in mind availability of talent, but Ruby is hardly a fringe language these days. In an environment where there's only one developer, productivity is also a very important consideration. Being able to build and maintain software quickly and easily without a large team requires tools with different characteristics than a large team might require.
I think what's more important than whether to use Ruby or (something else) is to try to pick something as general-purpose as practical and use it for everything unless there's a really good reason to use something else. If you go with Ruby, stick with Ruby for your utility scripts, cron jobs and that little GUI app the boss wanted to automatically SMS the intern when he takes more than five minutes to bring him his coffee.
I think using Python would be the right thing to do if it meets the client's requirements and saves them money over the alternative. Whether or not there is a wide assortment of characters available to work on the application down the road is irrelevant, unless they've specified that as a non-functional requirement.
As usual, using the best tool for the job at hand will serve you well.
It is indeed a bit crooked IF you use it only for that purpose.
However, if you use it because it IS the best solution, you're in the clear.
Also, they can just hire someone else who knows Python.
My work ethics don't allow me to do something like this just to keep myself in business.
My personal opinion is you should try where possible to respect the working practices of wherever you are - whether that's indentation style, naming convention, testing procedure or programming language.
If you feel strongly that a different language would be better suited for a certain task, then lobby to have it accepted (with the required re-training of others).
Purposefully leaving an app that no one else can maintain is very bad professional conduct, IMO.
We recently had a bad hire at my shop and he decided out of the blue he was going to use Perl instead of any version of .NET to do some simple reporting stuff (That could have just as easily been done in .NET). It was atrocious. I would suggest using the platform as specified and clearing any deviation with the people who run the joint...
Plenty of answers have touched on this, but here's my take based on production application support.
My company had a startup phase where code hustlers whipped up solutions in whatever the personal preference or flavor of the week was. Bad for maintainability and supportability.
Making a change is ok, though, as long as it's consistent. If Python is going to pave the way to the future, then go for it. Don't forget that the legacy .NET and PHP code still needs to be supported until end of life. Building yourself a hodge podge of platforms and frameworks will just create more difficulty for you on the job and the company when you're no longer around.
If you feel in your heart you are acting dishonestly, then you probably are.
No one likes a dishonest person. That can't be good for your reputation.
Do your best to choose based on what is actually best, not what satisfies some underhanded motives.
It depends. At one place, I wrote in Java something that would normally just have been a bash script. Why? Because they're all Java programmers and frequently have interns/co-ops coming through who may or may not know anything else (and may or may not even be all that great with Java).
Other places though tend to have more experienced programmers and I expect that they'll be able to figure out another language without too much effort. So, I would go with what's "best" for the project.
I agree with what mquander says above, but you may also have to be prepared to justify why you want to use this other language to your development manager. If he/she then agrees, perhaps the language could become more widely adopted within the company.
Think of it in terms of business benefit you bring to the company, now and in the mid-term.
If you can deliver something much faster using a different technology, and it still achieves the goals, I'd go for it - but I'd still let some other people know and respect the company's final decision. If, however, it's purely for yourself, then I'd probably be a little more careful.
I think it's a really bad idea. For you, it means there's no back up in case you want to have a day (or week) off. For them, there's no one else if you leave or are taking a day off. It's a well known ploy, and, honestly, might be reason to not keep you around.
However, this could also be a chance to introduce Python into the environment. You could teach others about it, and explain to management why it's a good third language to have at the group's disposal.
I used to think that you should always pick the right language for the job at work. I'm reversing my opinion though.
The problem arises when some other guy picks a language you don't want to learn. I am concerned that I might be the guy who picks the language no one else wants to learn. Just because I think that Erlang might be the right choice for something doesn't mean that everyone else will want to learn Erlang or respect my decision for using Erlang.
"if I were to leave the company, they'd basically have nobody to maintain/add to it unless they retain on me as a consultant."
Are you saying no one else can learn Python? I find that hard to believe.
New technology is often introduced in small projects by knowledgeable people and diffused through the organization because the small projects were successful.
Use Python. Be successful. Make your case based on your successes.
I had this same problem very often. Coincidentally, it was with the two languages you mention: .NET was forced on me when I preferred to use Python (among others). It could be the opposite; I don't judge.
I refrained from using Python, for the reasons already mentioned in other answers. I did what I thought was best for the company. Using IronPython won't make your Python code any more maintainable for an inexperienced Python programmer.
However, I left the company and now I work in something more in line with my tastes. I'm much happier. In this economy you may not have this option... but it will pass. Do the right thing.
Cheers.
There is a large difference between 'prototype' or 'one-shot' code and production code. For prototyping I use whatever works fastest, but I'm very clear about its status. Production code is written in one of the approved and supported environments.
The ethical thing is to use the best tool for the job. If there is a tool that takes only 20% of the coding time of the other choices, needs next to no maintenance, and is easy to refactor, you have a duty to pick that tool, assuming it's as extensible as the business may need.
If you do a good job, hiring future people and training them in terms of HOW your workplace does business should be the practice of any growing business. They will be able to learn the code if they're the right person for the business.
In your case I'm not sure if you want to use Python, unless it has native .NET support to allow your .NET world to interact with it.
Other posters have made some good points, but here's one I've not seen: Communicate the situation to management and let them decide. In other words, talk with your boss and tell him or her that there currently are more .NET developers in your area, so that if you're hit by a bus tomorrow it would easier to find someone else to maintain your code; however, there are tools you need to do your job more efficiently and they cost money (and tell them how much). Alternatively, you could do this in Python or RoR (or whatever) and use free tools, but from what you know, there aren't currently that many people in the area who know those languages. I've used "currently" a couple times here because this may change over time.
Before having this conversation, it might be good to see if you can find user groups for the alternative technology in your area, and how large they are. You could also ask on listserves if there are people who know the alternatives in your area.
Of course, the boss may tell you to keep using .NET without any tools, but in that case it's their decision to shoot themselves in the foot. (And yours to decide if you want to find a new job.)
Regarding the question as asked, I see nothing unethical about it, provided that:
It is a freely-available language. Although I am something of a FOSS partisan, that's not the point of this criterion. It needs to be freely-available (not necessarily FOSS) so that it doesn't impose costs on the company and so that others will have the opportunity to learn it if you ever need to be replaced (or if they want to compete with you for your job).
You are changing languages for solid reasons and not for the sake of creating vendor lock-in (or, if you prefer to think of it as such, "job security"). Ethics aside, you really don't want to have a job where they hate you, but are stuck with you because you're the only one who can maintain the mess you've created anyhow.
In the particular case you've described, I would suggest that switching to RoR may be the more ethical choice, as it would be decidedly unethical (not to mention illegal) to use .NET if there are required resources which are for-pay only and your employer is too cheap (sorry, frugal) to purchase proper licenses for them.
When in Rome... do as the Romans.
You might not be the one who has to maintain this code in the long term, and not everyone wants to learn a "fringe language" to make bug fixes or enhancements.
I migrated some VBA stuff over to Perl for processing at a previous job and increased the efficiency by several orders of magnitude, but ultimately no-one else there was willing to learn Perl so I got stuck with that task longer than I wanted it.
I did that; it was Delphi in my case. I think Delphi used to be widely used, but when I was looking for a job... I have seen three Delphi job offers in my whole life, and more Java/J2EE/PHP offers than I can remember. I think it's a bad idea: with the time I wasted learning advanced Delphi programming, I could have gotten better at J2EE, started at a better company, and maybe be making more money now.
If they can't find somebody to maintain the app, you will always be the one doing it, and when you quit they will have to rewrite it. I don't think the consultant arrangement is used very often.
I used to be in the "use the best tool for the job" school, but I've changed my mind. It's not enough to just ask "how can I do this job the fastest." If you think you're the only one who will ever need to look at some code, there's a good chance you're mistaken. The total cost of introducing a new language into an environment is higher than you might imagine at first.
If you just need to produce a result, not a program, then you can use whatever you want. Say you need a report or you need to munge some files. If the output is really all that matters, say it's something you could have chosen to do by hand, you can practice using any language you want.
With the release of the MVC Framework, I too have faced a similar ethical dilemma: use WebForms, or switch over to the MVC Framework for everything? The answer really is that you have to do the right thing and use whatever the standard of the company is. If you deviate from the standard, it creates a lot of problems for people.
Think how you would feel if a VB6 project were dumped on you when all you have been doing for years is .NET. So these are the two solutions I have come up with:
Use your fun languages for consulting contracts you do on the side. Make sure the client knows what you are doing and if they agree go for it.
Try and convince your current company to migrate over to this great new language you are working with.
If you follow these routes you will learn your language and not piss anyone off in the process.
Ruby on Rails is certainly not a fringe language. If the company is too cheap to pony up for the appropriate licensing for Microsoft's tools, then you would have no choice but to find an alternative. RoR certainly would be a reasonable choice and if helps move your career along as well, then it's win-win for both of you!
You can develop .NET adequately with free tools; cost is not a good reason to avoid that platform. Ruby on Rails is becoming reasonably mainstream for building data-driven internet websites. You haven't even told us if that's the sort of software you are building, though.
There is really no way with the information that you have provided that anyone can give you a single correct answer.
If you are asking is it ethical to do your work in such a way that the company is dependent upon you, of course the answer is no. If you are asking is it ethical to develop in RoR then the answer is "we don't know" - but my opinion is that probably it would be fine if its the right tool for the job.
Don't underestimate the ability of someone else to support your work or replace you, though - if you do your work reasonably well, once the solution is in place any programmer worth their pay should be able to learn the platform well enough to maintain it. I've debugged, migrated and supported a few PHP applications, for example, without ever really learning the first thing about PHP. I'd be lost building a new PHP app from scratch and would never even try, but it's no problem to support one. I think the same would be true of the languages you mention as well - they've got the critical mass that means there are plenty of books, forums, etc. Of course, if it's written badly enough in any language, then it may be difficult to support regardless of anyone's skill in the language...
So much discussion for such a clear-cut situation...
It's not up to you, it's up to them. If they're not technical enough to make the call, as it seems, then you have to make it for them in good faith. Anything less is dishonest, and I'm fairly sure that's not in your job description ;)
You've muddied the waters with all the wandering about in the thickets of personal motivations. The answer to that one is that your personal motivations are irrelevant unless and until you've formulated the business case for the possible decisions. If you've done that and the answer still isn't clear-cut, then sure, choosing the answer you like the best is one of the nice things about being in a position to make technical decisions in the first place.
As far as the actual question goes, to my mind if the most technically apt choice is also one that very few people work with, one of two things is happening: a) It's a good choice, and the number of people working with it is going to be exploding over the next 18-24 months (e.g. Django), or b) There's something wrong with my analysis. Technologies may be on the fringe because people are slow to adopt them, but that's generally not why they stay on the fringe.
If you find yourself thinking "I can't choose technology X, that'll make it easier for them to replace me!" you're in the wrong line of work. In almost any enterprise that's not actually failing, the IT guy who makes himself easy to replace tends to move up to harder and more interesting and more lucrative work.
I would not bring a new language/framework/whatever into the place unless they understood that's what I was doing, and that if I left/was fired/was hit by a bus, they'd have to find/train someone to work with it.
I have some experience in a contractor pulling in things just because he felt like it. In some cases they were the best tool for the job (in other cases they were not), but in all cases they were not the best tool for the team that had to maintain the code. In my case the contractor was a serious jerk who didn't really give a darn about anyone else and I believe WAS trying to make himself harder to replace.
In your case, talk to your bosses. If they really don't want to spend the needed money on .NET framework tools/libs, then switching to something else may well BE the right thing to do for them, long term.
And, as someone who has spent his career walking into the middle projects that others have already started - thank you for thinking before you add a new tool to the mix.

How much time and effort do you spend on IE6? [closed]

I've often read that corporations will save millions once Internet Explorer 6 is finally laid to rest, but reading quotes like that makes me wonder just how much time people are spending on IE6 development. I'll be happy once IE6 goes, but I've found that if I write valid HTML and CSS, use a JavaScript framework, keep the transparent-images issue in mind, and don't try to over-complicate my design, I only need to write a few lines of IE6-specific CSS, which usually takes about 10-15 minutes. So I'm curious: how much time, effort, and money do you or your corporation spend on preparing your sites for IE6? Also, at what point will you drop IE6 support? If you've already dropped support, what has been your experience in terms of saved time and money, and what has the switch done to your conversion rates and customer satisfaction?
According to some browser statistics, IE6's market share is still bigger than that of Chrome, Safari and Opera together, and nearly as big as IE7's.
Unless you target a very specific market (indeed, check your stats to know for sure), neglecting to make your site look at least decent in IE6 seems a bit foolish today...
I certainly won't go down the road of telling visitors which browser to use!
I'm already phasing it out. Every second spent on debugging for an outdated (7+ years old!!) browser is a second wasted in my books. What I've started doing is when an IE6 user first comes to the site (determined by cookies and some dodgy browser sniffing), I pop up an alert informing them that they are using an old browser which does not support much of the functionality required by many of today's web sites. I inform them that their experience might be slightly downgraded by continuing, but that can be easily alleviated by upgrading to a modern web browser (even if it sucks).
Don't go out of your way to make it crappy for them (though they might deserve it), but don't go out of your way (with non-standard CSS hacks etc) for these users either. There's only one way they'll learn.
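For what it's worth, the detection doesn't have to rely on cookies and dodgy sniffing: IE's own conditional comments let you target IE6 and older directly, with no JavaScript at all. A rough sketch (the notice text and the ie6-fixes.css filename are placeholders of mine):

```html
<!-- Non-IE browsers and IE7+ skip everything inside this block;
     only IE6 and the older IE versions that understand conditional comments parse it. -->
<!--[if lte IE 6]>
<link rel="stylesheet" type="text/css" href="ie6-fixes.css" />
<div style="background: #ffc; border: 1px solid #c90; padding: 8px;">
  You are using a very old browser. This site will still work, but it will
  look and behave better in a current browser.
</div>
<![endif]-->
```

The same mechanism is a tidy way to load IE6-only stylesheets, so the hacks stay out of your main CSS.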
I don't like it but I still support it. For small sites it isn't a problem, just make stuff work in Firefox first, then IE7, and IE6 last. I've used IE6-only css a number of times, and those only had a few rules in them.
For a larger project with complex layouts, I have wasted a lot of time on IE6. I'd be very happy to drop it entirely if it was impossible to provide one of my major features on it. So far, it's close enough that I'm still supporting it.
According to what I read online, about 1 in 4 people still use it, so it's probably not wise to drop support.
http://www.w3schools.com/browsers/browsers_stats.asp
Use your own judgement, based on your application and what you think you can expect from your users. I do not believe that a typical web user will upgrade/switch their browser just for one site. I think those people who have not upgraded from IE6 by now will never be motivated to do so. The number of IE6 users is dropping, but I think we'll be waiting for them to replace their computers rather than upgrade their browsers.
At my job all of our projects are for large corporations that aren't willing to drop support for a browser with such a large market share. Also, the designs we have are dictated to us by a third party design company so, even conforming to standards, there are still issues with complex designs in IE 6.
I would say that for any given page about 5%-50% of the CSS development time is devoted to IE 6, depending on who the developer is and how complex the design is. The more experienced the developer and the simpler the design, the better your odds are of hitting that 5% mark :) But even I, with a good amount of IE 6 web-dev experience, have spent 3 hours on the CSS for a page only to spend another 3 hours ironing out small quirks in IE 6.
Another thing that comes up is that certain markup + CSS approaches that seem so intuitive and simple in more modern browsers don't work at all in IE 6. If you go down one of these paths, you generally have to start from scratch once you realize that your code that works beautifully in FF and IE 7 doesn't have a chance in IE 6. More lost time...
I agree with the rest of you that if you can control your project and don't care about the IE 6 market, by all means forget about it. Unfortunately, some of us don't have that luxury quite yet.
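To make that "lost time" a bit more concrete, here is a hedged sketch of the kind of IE6-only patching that typically eats those extra hours. The selectors and file name are illustrative; the techniques themselves (the star-HTML hack, the underscore hack, and the AlphaImageLoader filter for PNG transparency) are the commonly cited workarounds.

```css
/* Modern browsers: min-height does what we want. */
.sidebar { min-height: 400px; }

/* IE6 ignores min-height, but it (incorrectly) expands a box past its
   declared height, so height acts like min-height there. The "* html"
   selector matches nothing in standards-compliant browsers but does
   match in IE6 and earlier. */
* html .sidebar { height: 400px; }

/* Alternative: the underscore hack; only IE6 and earlier parse a
   property name prefixed with an underscore. */
.sidebar { _height: 400px; }

/* PNG alpha transparency: IE6 needs Microsoft's proprietary
   AlphaImageLoader filter instead of a plain background-image. */
* html .logo {
  background-image: none;
  filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='logo.png', sizingMethod='crop');
}
```

None of this is standard CSS, which is rather the point: supporting IE6 means maintaining a parallel layer of proprietary workarounds on top of the code that already works everywhere else.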
We've already dropped it, but it depends on who you're marketing to. Only other companies will ever see our product, so we can be fairly certain they're at least using an operating system that can support IE7. If you're marketing to the entire internet then you may want to make certain that nothing breaks for a while yet.
Depends on the project. If I write the code conforming to web standards usually I don't have many issues.
If I'm using a template downloaded from the web, it often spells it out very clearly, in bold letters: "manifest destiny is a bitch. don't trade blankets with anyone."
Unfortunately, I have a bunch of friends in other businesses that are sticking with IE6, and don't have a plan to upgrade.
They don't like the tabs in IE7, they don't want to go with another browser, etc, etc, etc.
There is enough of this that filters back to me, that I continue to test against IE6, and will do so for the indefinite future. Doesn't make me happy...just do it.
The vast majority of our internal corporate users are still on IE6. Until the powers-that-be decide to push out an update with IE7 or IE8 we will continue to support IE6 as our primary browser.
As far as I know, there are no immediate plans to upgrade.
Zero. Dead and gone as far as I'm concerned.
Really, you should be answering this question for yourself.
If you don't have a decent web log statistics package such as AWStats, then there's the first thing you need to do.
Otherwise, work out how much time you spend supporting IE6, and see what percentage of your users that represents. If the time-to-customers ratio doesn't balance out, then you can decide to ditch IE6. Another factor to consider is how important your product is to your customers. If you're working on Salesforce.com, you can probably assume that they'll be willing to upgrade if you prompt them to. If you're talking about a server that injects ads into webpages, then you'll probably be at the mercy of their browser choice.

Which is better: shipping a buggy feature or not shipping the feature at all?

This is a bit of a philosophical question. I am adding a small feature to my software which I assume will be used by most users, but only on maybe 10% of the occasions they use the software. In other words, the software has been fine without it for 3 months, but 4 or 5 users have asked for it, and I agree that it should be there.
The problem is that, due to limitations of the platform I'm working with (and possibly limitations of my brain), "the best I can do" still has some non-critical but noticeable bugs - let's say the feature as coded is usable but "a bit wonky" in some cases.
What to do? Is a feature that's 90% there really "better than nothing"? I know I'll get some bug reports which I won't be able to fix: what do I tell customers about those? Should I live with unanswered feature requests or unanswered bug reports?
Make sure people know that you know that there are problems. That there are bugs. And give them an easy way to provide feedback.
What about having a "closed beta" with the "4 or 5 users" who suggested the feature in the first place?
There will always be unanswered feature requests and bug reports. Ship it, but include a readme with "known issues" and workarounds when possible.
You need to think of this from your user's perspective - which will cause less frustration? Buggy code is usually more frustrating than missing features.
Perfectionists may answer "don't do it".
Business people may answer "do it".
I guess where the balance lies is up to you. I would sway towards putting the feature in there if the bugs are non-critical. Most users don't see your software the same way you do. You're a craftsman/artist, which means you're more critical than regular people.
Is there any way that you can get a beta version to the 4-5 people who requested the feature? Then, once you get their feedback, it may be clear which decision to make.
Precisely document the wonkiness and ship it.
Make sure a user is likely to see and understand your documentation of the wonkiness.
You could even discuss the decision with users who have requested the feature: do some market research.
Just because you can't fix it now, doesn't mean you won't be able to in the future. Things change.
Label what you have now as a 'beta version' and send it out to those people who have asked for it. Get their feedback on how well it works, fix whatever they complain about, and you should then be ready to roll it out to larger groups of users.
Ship early, ship often, constant refactoring.
What I mean is, don't let it stop you from shipping, but don't give up on fixing the problems either.
An inability to resolve wonkiness is a sign of problems in your code base. Spend more time refactoring than adding features.
I guess it depends on your standards. For me, buggy code is not production-ready and so shouldn't be shipped. Could you release a beta version with a known-issues list, so users know what to expect under certain conditions? They get the benefit of using the new features but also know that it's not perfect (use at their own risk). This may keep those 4 or 5 customers who requested the feature happy for a while, which gives you more time to fix the bugs (if possible) and release to production later for the masses.
Just some thoughts depending on your situation.
Depends. On the bugs, their severity and how much effort you think it will take to fix them. On the deadline and how much you think you can stretch it. On the rest of the code and how much the client can do with it.
I would not expect coders to deliver known problems into test, let alone release them to a customer.
Mind you, I believe in zero tolerance of bugs. Interestingly, I find that it is usually the developers/testers who are keenest to remove all bugs - it is often the project manager and/or the customer who are willing to accept bugs.
If you must release the code, then document every feature/bug that you are aware of, and commit to fixing each one.
Why don't you post more information about the limitations of the platform you are working on, and perhaps some of the clever folk here can help get your bug list down.
If the demand is for a feature NOW, rather than a feature that works, you may have to ship.
In this situation, though:
Make sure you document the bug(s) and their consequences (both for the user and for other developers).
Be sure to add the bug(s) to your bug tracking database.
If you write unit tests (I hope so), make sure that tests which highlight the bugs are written before you ship. This will mean that when you come to fix the bugs in the future, you know where and what they are, without having to remember.
Schedule the work to fix the bugs ASAP. You do fix bugs before writing new code, don't you?
If bugs can cause death or can lose users' files then don't ship it.
If bugs can cause the application to crash itself, then ship it with a warning (a readme or whatever). If crashes might cause the application to corrupt the files that users were in the middle of editing with this exact application, then display a warning each time they start up the application, and remind them to back up their files first.
If bugs can cause BSODs then be very careful about who you ship it to.
If it doesn't break anything else, why not ship it? It sounds like you have a good relationship with your customers, so those who want the feature will be happy to get it even if it's not all the way there, and those who don't want it won't care. Plus you'll get lots of feedback to improve it in the next release!
The important question you need to answer is whether your feature will solve a real business need given the design you've come up with. Then it's only a matter of making the implementation match the design, making the "bugs" non-bugs by defining them as not part of the intended behaviour of the feature (which should be covered by the design).
This boils down to a very real choice between two views: is a bug anything that doesn't work properly from the user's point of view, even if it was never part of the intended behaviour and design? Or is something a bug only if it doesn't work in accordance with the intended behaviour?
I am a firm believer in the latter; bugs are the things that do not work the way they were intended to work. The implementation should capture the design, that should capture the business need. If the implementation is used to address a different business need that wasn't covered by the design, it is the design that is at fault, not the implementation; thus it is not a bug.
The former attitude is by far the most common amongst programmers in my experience. It is also the way the user views software issues. From a software development perspective, however, it is not a good idea to adopt this view, because it leads you to fix bugs that are not bugs, but design flaws, instead of redesigning the solution to the business need.
Coming from someone who has to install buggy software for their users - don't ship it with that feature enabled.
It doesn't matter if you document it; the end users will forget about that bug the first time they hit it, and that bug will become the critical thing preventing them from doing their job.
