The grand irony as we debate the importance of validation and what web standards are is this little bugaboo:
Web "standards" aren't.
The W3C provides specifications and recommendations which have been coined by practitioners as "standards" when they are not precisely standards, but de facto standards. ISO, for example, is a standards organization with a full compliance set that if not met - well, products don't ship, period. With a true standard, compliance is mandatory. With de facto standards, what we have are browser compliance problems up our collective wazoo.
What we also appear to have is confusion as to what qualifies as a "web standard."
Taking a closer look
In HTML and XHTML there is an implication in the specs that working in a strict environment is the ideal. That using meaningful markup is ideal. But neither of these is a real or even a de facto standard. So semantic markup is an implied goal, not a measure of compliance, and something we are still trying to understand. Semantic markup is a best practice, not an explicit recommendation.
Separation of presentation and content? An implied ideal, not a measure of compliance, and something we are still working toward perfecting despite the user agent concerns. Documents using table-based layouts can be completely conforming, and even Strict DTDs contain what could be interpreted as presentational elements and attributes (cellspacing). Separation of presentation and content is a best practice, not an explicit requirement across the board.
Media types? The W3C uses very specific language in its recommendations such as SHOULD, MAY, STRONGLY RECOMMENDED and MUST NOT. This language is always written in upper case and presented in bold. Media type purists need to read this and weep:
“'application/xhtml+xml' SHOULD be used for XHTML Family documents, and the use of 'text/html' SHOULD be limited to HTML-compatible XHTML 1.0 documents. 'application/xml' and 'text/xml' MAY also be used, but whenever appropriate, 'application/xhtml+xml' SHOULD be used rather than those generic XML media types.”
This means you should serve XHTML 1.0 as application/xhtml+xml, but that you also may use text/html as a media delivery type for XHTML. XHTML 1.1 differs in that it SHOULD NOT be served as text/html. The only MUST NOT issue we see in the specs is that HTML 4 MUST NOT be served as any form of XML. Managing media types is explicit, and therefore a "web standard," although there is an implication of best practices in the case of XHTML 1.0 here, too.
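The negotiation those SHOULDs imply can be sketched in a few lines of server-side logic. This is a hypothetical helper (the name choose_media_type and the logic are mine, not anything from the W3C recommendation), and a real implementation would parse q-values properly rather than substring-match the Accept header:

```python
def choose_media_type(accept_header):
    """Prefer application/xhtml+xml when the client claims support for
    it, and fall back to text/html for HTML-compatible XHTML 1.0.

    Naive sketch: a production version must parse q-values and handle
    wildcards like */* instead of doing a substring match.
    """
    if "application/xhtml+xml" in accept_header:
        return "application/xhtml+xml"
    return "text/html"

# A standards-aware browser advertising XHTML support:
print(choose_media_type("application/xhtml+xml,text/html;q=0.9"))
# A client that only claims text/html:
print(choose_media_type("text/html;q=0.9"))
```

The point is only that the choice is explicit and mechanical; which SHOULDs you honor is still a judgment call the spec leaves to you.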
Validation? Here's a surprise! Validation is not, in and of itself, a requirement. What is required is conformance. So you theoretically don't have to validate a darn thing, but if you want to test conformance, which IS an explicit requirement, you have to validate.
Add it up
So, here's what we have:
- Separation of presentation and structure: implied as ideal: a best practice.
- Semantic markup: implied: a best practice.
- Content delivery via media types: explicit, a "standard."
- Well-formed markup: explicit, a "standard."
- Conformance: explicit, a "standard."
Clearly there's a difference between what is an explicit part of a specification or recommendation and what is an ideal goal implied by the specification, by practitioners, or by both.
Separation of practice and science
There is also a separation within the industry of practice and science. This is an idea that applies to a lot of professions, but at the recent UI9 Conference, Jared Spool presented an interesting keynote in which he separated usability into practice and science, and I immediately noticed the relationship this has to web standards.
In our case, the science does not always provide everything required for practice, but it sure does provide lots of help. The opposite is also true: our practice doesn't always follow the science (and we have plenty of evidence of that).
Ideally, the practice should follow the science wherever it can, and spurning the science as unimportant is like telling a doctor not to treat a patient with antibiotics where there is a clear case of bacterial infection. The possible result of not bridging that gap is death.
I am deeply concerned about practitioners and advocates who blow off conformance when they can and should conform and claim it's okay to do so.
When we cannot conform - for reasons beyond our control - there is a great need for us to discuss why and try to help solve some of those problems (sucky CMSs, ad servers, time factors), but that is no reason to say that validation and conformance are unnecessary, and certainly no reason to point a finger at the science as having failed. In those cases, the practitioners are failing the science, I'm sorry to say. Of course, the science - and our practice of it - can and should be improved as time goes on, and I believe that it will.
As an educator, and at this point in my own understanding of things (once upon a time I wrote all about web design practice without understanding any of the science, and I'm plenty guilty of not applying the science even when I know I should), what concerns me most is that anyone who takes the stand that conformance is an unnecessary part of practice is essentially acting as a doctor who withholds the proper medication when there's a clear scientific solution.
Such a message to other practitioners is a dangerous one. With it, we degrade our goals, and ultimately end up with poorly educated practitioners, poorly educated software developers, and severely compromised user agents and development tools as a result.
What's more, there seems to be a belief out there that best practices are wrapped up with de facto standards when they are implied but not explicit. This is a problem with semantics (sorry to put it that way, but it is). The term "web standards" is and always has been a misnomer, and we are suffering many of these problems because of that fact.
So what can we do?
Obviously, the terminology itself has caused problems, but trying to name what we all refer to as "web standards" something else at this point is as impossible as trying to tell people it's not an "alt tag" when they've been using that terminology for 8 years or more.
What we can do is work together more effectively to home in on what should explicitly fit into a standard and what is a best practice, and come up with some useful terms that we as professionals understand. These terms should also be more friendly to marketing departments, non-technical support people, and the lay public, and they should adequately describe the marriage of science and practice within our industry.
What we absolutely must do is take care to spread a message that encourages rather than discourages using the best science and best practice. It is holding that goal high that makes us professionals, after all.
Last week was great for chatter — not just about history or U.S. politics, though there was plenty of chatter about those things as well.
Yes, folks, it’s official: validation is a must-have, as explained with a few friendly caveats in the latest Web Standards Project Opinion.
In summary, it appears that you don’t like it, and we don’t like it, but there are still many obstacles to creating a standards-friendly web. Valid markup, and tools that produce valid markup, are still the exception and not the rule. Standards-friendly design and development techniques are gaining currency, but haven’t yet come into the mainstream. Most importantly, complaining about the obstacles, or making excuses around them, is a sorry substitute for overcoming them.
For six years, the Web Standards Project has been available to those interested in applying standards toward making site implementation a straightforward affair, and leaving the hard parts to intangible points of interest, such as user experience, return on investment, and aesthetics.
That availability is not something we see changing anytime soon.
Another consequence of last week’s dialogues is that they exposed a misconception held by many of those who are aware of our efforts.
Put simply, the only positions endorsed by the Web Standards Project as a body are expressed in our press releases and Opinions, which go live with the approval (sometimes reluctant) of every Project participant with an interest.
Everything else you see on this site is published on the initiative of a single author, or perhaps a small group, possibly over the objections of other Project participants (a rare outcome, but not unheard of, either).
Does it seem like anarchy? On occasions few and far between, it is.
The only lines we toe are the ones we draw together, always with an eye on the practical.
That may seem to contradict the stance taken in the previously linked Opinion, but the simple fact of the matter is that valid markup is not a challenge for developers who have the time, patience, and skill to coax it out of their tools — coaxing that becomes easier with practice.
With too many tools, it’s the coaxing that’s the hard part.
That, however, is another battle for another day...
Netscape is 10 years old, and C|net is celebrating with a special section on the once-mighty brand. Their retrospective on the browser itself is especially good.
While you're there, you might also want to read what nice things Netscape founder Marc Andreessen had to say about Firefox's potential to challenge IE.
Also worth a look is an interview with lead Mozilla engineer Ben Goodger. In addition to his thoughts on the future of the web ("The browser is back — Longhorn is irrelevant in a Web application world."), Ben discusses the origins of Firefox, including a mention of the WaSP's contribution thereto.
Mike Davidson, art director behind the now-legendary ESPN.com CSS redesign, announces the redesign of ABC News, touting it as a success of “real-world web standards.” Whatever that means, it apparently doesn’t mean valid: the validator spews hundreds of errors at the time of this writing. Granted, Davidson admits that “the overwhelming majority of errors…are ampersand-related” and [they’re] fine with that, perhaps since valid code isn’t really required to meet their users’ needs.
And surely unencoded ampersands couldn’t louse up a web application.
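To see why that sarcasm is earned, here is a small demonstration, using Python's XML parser as a stand-in for an XML-aware browser, of what an unencoded ampersand does to XHTML served as XML (the markup snippets are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A raw "&" in an attribute value is a fatal well-formedness error in
# XML, not a shrug-and-render like tag-soup HTML parsing.
bad = '<p><a href="story?id=7&page=2">next</a></p>'
good = '<p><a href="story?id=7&amp;page=2">next</a></p>'

def is_well_formed(markup):
    """True if an XML parser accepts the fragment without error."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

print(is_well_formed(bad))   # the unencoded ampersand kills the parse
print(is_well_formed(good))  # &amp; parses fine
```

An XML-mode browser shows the user a parse error page instead of the document, which is exactly how an ampersand can louse up a web application.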
But if Davidson and Disney are “fine” with the current state of their code, that’s of course their prerogative. After all, for a brand that size to launch an online property that validates right out of the gate? Surely that’s expecting too much.
I’m of the mindset that validation isn’t an ideal, it’s a necessary baseline. The true costs of a software project are incurred in maintenance; ensuring that your code is standards-compliant — and yes, valid — will keep support issues to a minimum, and your clients happier. That’s not to say that it’s easy, not by a long shot; Davidson and his team have done a stellar job of applying (nearly valid) CSS to a media-rich web site, and I’m sure site owners and users alike are reaping the benefits of lighter pages that are easier to maintain.
However, to say that validation is unimportant and then call your work “web standards”? Real world or otherwise, it smells like semantic snake oil to me.
To address a challenge, “How to help others understand and use web standards,” individuals at Pennsylvania State University have formed the Penn State Web Standards Users Group. Group membership includes people from various colleges within the university who meet monthly on campus to discuss topics and issues, share knowledge, and to promote the use of web standards at their workplace or university.
In a recent article by Christian Vinten-Johansen and the group, “Why Adopt Web Standards?”, several key points and benefits are identified and explained.
Group membership is open to all staff, faculty, and members of the university web development community. Member Rob Dickerson is hoping to connect with other university groups having similar goals.
In just a few minutes from the time this post is made, the users group will be meeting. Great work, everyone; it is good to see web standards topics addressed and promoted in the workplace and in the schools.
Wow. My post on ASP.Net and standards seems to have touched a nerve. I received a pile of feedback via email on that one, and with a currently crazed work schedule it's taken me until now to sort through all the good info provided.
The upshot: yep, ASP.Net's built-in functions are tone-deaf when it comes to standards. And yep, there are irritating restrictions on <form>s in ASP.Net. But the latter aren't so bad as they seem, and the former are slated to be addressed in ASP.Net 2.0, which should be along Real Soon Now.
The Form Thang
Phil Baines notes that while there is no requirement to put the entire page in a <form> tag, there is an irksome ASP.Net limitation regarding <form>s: only one per page can use ASP.Net's custom validation and the like.
According to Rick Mason, Web Technical Officer for East Essex County, this is due to the way Microsoft has chosen to maintain state across form submissions. When a user submits a <form> with an error, ASP.Net automagically returns the <form> to the user with the user's input still in place.
That's a very good thing for <form> usability. What maybe isn't so good is that Microsoft implemented this functionality by adding a hidden _VIEWSTATE field. The way that is implemented precludes using more than one <form> per page, thereby making it necessary to wrap every element whose state is to be maintained in a single <form>.
While this requirement doesn't do anything good for document semantics, it's really not the end of the world, and in any event is to be addressed in ASP.Net 2.0.
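For the curious, the idea behind that hidden field can be sketched as follows. This is emphatically not Microsoft's actual _VIEWSTATE format (which is proprietary and opaque); it's a toy illustration, in Python, of the serialize-state-into-one-hidden-input approach described above, with invented control names:

```python
import base64
import json

def encode_viewstate(state):
    """Pack a dict of control state into one form-safe string."""
    return base64.b64encode(json.dumps(state).encode("utf-8")).decode("ascii")

def decode_viewstate(value):
    """Recover the control state when the form is posted back."""
    return json.loads(base64.b64decode(value).decode("utf-8"))

state = {"txtName": "Alice", "chkSubscribe": True}
hidden = '<input type="hidden" name="_VIEWSTATE" value="%s" />' % encode_viewstate(state)

# Because all state rides in this single field, everything whose state
# must survive a round trip ends up inside the one <form> carrying it.
assert decode_viewstate(encode_viewstate(state)) == state
```

Seen this way, the one-<form>-per-page restriction is a consequence of the serialization design rather than an arbitrary rule.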
Milan Negovan of ASP.Net Resources, a site dedicated to using ASP.Net and web standards (bookmark!), does a great job demonstrating the problems ASP.Net's built-in controls create when trying to create valid XHTML (real XHTML, with an application/xhtml+xml MIME type and everything) or even just HTML (BTW, be sure to read to the end for his thoughts on the suitability of XHTML for web applications).
It gets worse, though. According to Milan, ASP.Net has a notion called 'adaptive rendering.' The idea is that ASP.Net will automagically write markup tailored to the specific user agent requesting the page. Setting aside for a moment the fundamental stupidity of UA-based markup switching, there's still a problem with the implementation: it considers every browser but IE/Win to be 'downlevel' ('old and crippled', in the parlance of our time) and serves them up crufty ol' HTML 3.2. Ironic, no?
To be completely fair, ASP.Net was under development right around the turn of the century. Back then, the main 'alternative' browser was Netscape 4.x, which was most definitely 'downlevel' in anyone's book. But that's cold comfort to those of us developing web pages today.
So what to do? Well, in the article cited above Scott Mitchell of 4 Guys from Rolla.com suggests editing some config files so ASP.Net writes HTML 4 for other modern browsers.
Yay? Not yay. When writing 'modern' markup, ASP.Net insists on using Microsoft's proprietary
For this reason, Rick echoes Nick Vrvilo's sentiments that creating custom controls is the way to go, and notes that ASP.Net's HtmlGenericControl is mighty helpful in this respect.
What HtmlGenericControl does is allow you to manipulate the contents of an element via ASP.Net. All you do is insert an id and the ASP.Net attribute runat="server" (don't get your panties in a wad — it's removed before the page is sent to the browser) and away you go.
Useful, but it seems as though it'd be more useful if it didn't rely on an id and could therefore use more than one such control per page.
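As a rough illustration of the pattern (in Python rather than ASP.Net, with made-up element names), here is the essence of "find an element by its id on the server and rewrite its contents before the page goes out":

```python
import xml.etree.ElementTree as ET

def fill_element(fragment, element_id, new_text):
    """Replace the text of whichever element carries the given id.

    A stand-in for what HtmlGenericControl offers: server-side access
    to an element's contents, keyed by its id attribute.
    """
    root = ET.fromstring(fragment)
    for el in root.iter():
        if el.get("id") == element_id:
            el.text = new_text
    return ET.tostring(root, encoding="unicode")

page = '<body><div id="greeting">placeholder</div></body>'
rendered = fill_element(page, "greeting", "Hello from the server")
print(rendered)
```

The id is doing double duty here: it identifies the element for the server-side code and remains available for CSS styling on the client.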
Other Odds & Sods
Phil has also posted a mini-rant about the difficulties of developing to web standards in Visual Studio 2003. Phil is particularly frustrated by Visual Studio's insistence on changing the formatting of his source code, a glitch whose source is explained by Mikhail Arkhipov in a blog post.
Peter Ankelein also wrote in to express his frustration with ASP.Net's postback. Postback is Microsoft's term for when an ASP.Net <form> assigns its action attribute to itself, so the <form> gets submitted to itself. This is common practice in PHP as well, and in most instances it's a pretty useful trick. However, ASP.Net apparently forces this behavior unless you jump through hoops to avoid it. That makes things like posting <form> input both to a local database and to a remote handler more difficult than it ought to be.
What does that have to do with standards? Not much, except that it's a great example of why it's a Bad Idea to subvert standard attributes and elements. By appropriating the action attribute of the <form> tag, Microsoft has altered its functionality and made things that should be easy less so.
While you're in a reading mood, be sure to read to the bottom of Milan Negovan's post for some interesting thoughts on the suitability of XHTML for web applications.
While ASP.Net is a powerful, and in many ways convenient, server-side scripting language, its fundamental philosophy can make life unnecessarily difficult for standards-aware web developers.
For starters, Microsoft has 'integrated' markup with the language by appropriating attributes like id and action. While it's certainly a far cry from the mishmash of server-side code and markup that characterized old-skool ASP or much of the PHP out there, there are times when this integration turns around and bites the very developers it's supposed to help.
More fundamentally, ASP.Net still views markup as primarily a presentation tool. Even when generating markup for 'uplevel' browsers, it just substitutes <table>. That might have made sense back in the bad old tag soup days, but today there's a better way. Keeping document structure separate from style pays big dividends in page weight, flexibility and maintainability. By working against this separation, ASP.Net again complicates the lives of the very developers at whom it is aimed.
Likewise, ASP.Net attempts to work around browser incompatibilities via the discredited technique of browser-sniffing. Browser sniffing is attractive in some cases, and in fact most CSS hacks are just a sophisticated form of browser sniffing. However, the fact remains that when you go down that road you're practically begging for your code to break when a new browser arrives. More, you dramatically increase the complexity of your code as you try to keep straight all the browsers, versions and micro-versions around.
A more flexible and robust approach is to sniff for browser capabilities, and then write to those. That's easier to do on the client side than on the server, of course. But good defensive coding practices like WaSP project leader Steven Champeon's progressive enhancement can minimize the frustration.
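The difference between the two approaches can be caricatured in a few lines. The browser strings and markup labels here are invented for illustration only:

```python
def markup_by_sniffing(user_agent):
    """UA sniffing: anything unrecognized -- including a brand-new,
    fully standards-capable browser -- falls through to the crippled
    default, and the pattern list grows with every release."""
    if "MSIE" in user_agent or "Gecko" in user_agent:
        return "html4+css"
    return "html3.2"

def markup_by_capability(supports_css):
    """Capability-based: key the decision on what the client can do,
    not on what it calls itself."""
    return "html4+css" if supports_css else "html3.2"

# A hypothetical future browser gets punished by the sniffer...
print(markup_by_sniffing("ShinyNewBrowser/1.0"))
# ...but is served properly when judged on its capabilities.
print(markup_by_capability(True))
```

The sketch overstates the case for brevity, but the failure mode is real: every new browser starts life misclassified until someone updates the pattern list.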
Virtually every person responding to my post made a point to say Microsoft has seen the error of their ways — at least as far as ASP.Net's standards-unfriendly behavior is concerned. They've promised to rectify all of the frustrations listed above, and then some.
While we won't know what actually gets addressed and how until Microsoft actually releases the final ASP.Net 2.0 engine, I'm optimistic that ASP.Net will soon be as strong a platform as there is for developing standards-compliant web sites.
Update: Paul Watson of the Bluegrass Group has written in to tell me my newbie is showing.
The id attribute on an ASP.Net custom tag (AKA 'custom control') is just an instance name. That is, it's a handle to identify a unique (as in one and only one) occurrence of the control in an ASP.Net page — rather like an id attribute on an HTML tag identifies a unique occurrence, or 'instance', of that tag on a page. So Microsoft looks to be doing the Right Thing there (some folks may wish to debate whether server-side code should pay any attention at all to attributes that are meaningful on the client side; I'm not one of 'em).
What identifies which code defines the behavior of a custom control, then? Well…thereby hangs the tail of a cat, it would seem.
The short answer is that custom controls are defined by the tag name + prefix, which looks something like <chris:MySuperControl>, which looks an awful lot like an XML namespace. But it isn't. At all. Let's put a stop to that rumor, right now.
What it is, is a tag prefix (in the example above, chris) and a tag name (in the example above, MySuperControl). From there, things get a mite hairy. I'll let Paul explain:
When you want to include a custom control in your ASPX page you have to tell ASP.NET to expect custom control tags in your HTML. You do this using <%@ Register TagPrefix="chris" TagName="MySuperControl" Src="SuperControl.ascx" %> right at the top of your ASPX file.
May be blindingly obvious, but: SuperControl.ascx is your custom control code file. It is the only bit which has to map through to something, in this case a physical file. The TagPrefix is the bit before the : and the TagName is the bit after the :. The latter two don't have to have any correlation to what is in SuperControl.ascx, though it makes sense to use the same name.
If you have two different custom controls then you would have that declaration twice. The TagPrefix could stay the same but the TagName and the src will have to be unique. If you have three custom controls then you need three declarations, and so on.
It gets even more confusing when you consider there are two levels of custom control: the ASCX type I mentioned above, and then a more abstracted type which comes straight from a C#/VB.NET class file. E.g. SuperControl.ascx would normally have an associated SuperControl.ascx.cs file; that is the code-behind file which contains the logic of the control. But with the second type of custom control you wouldn't have the .ascx file at all, you just have a SuperControl.cs file. With the latter type you use a different declaration to tell ASP.NET to expect custom control tags, and in this case the TagName will map to the class name so it can't be anything you like. The TagPrefix can still be whatever you like though.
asp:TextBox and other built-in ASP.NET controls are treated slightly differently in that you don't need to tell ASP.NET what tags/prefixes to expect in the ASPX file. Strangely, asp: is not reserved to built-in controls. I just did a custom control and used asp: as the prefix and it worked fine. I wouldn't normally do that though so as to distinguish custom controls from built-in.
So there you have it.
Now, if you'll excuse me I'm going to go grab some pepto to quell that queasy feeling I got from reading that Microsoft is employing a syntax that looks just like a significant part of the XML standard (albeit a controversial one), but is actually completely different. I'm sure there's a reasonable explanation, and I'm not at all sure what they've done actually hurts anything, but it makes me uneasy nonetheless.
C|net is reporting that Google board member and VC extraordinaire John Doerr said Google won't offer its own browser, contrary to recent speculation. Of course, Doerr also remarked that "just because he was on the board of Google didn't necessarily mean he knew what they were doing."
Doerr does, however, believe the browser market — and the web in general — is set to hot up again:
Doerr said that he believed that many services of the first iteration of the Web were being reinvented now, much in the way Google reinvented search while many other Internet portals abandoned search R&D in the late 1990s. "Most of the old Web-based services are in the process of being systematically reinvented, even the browser," he said.
Andy Clarke just announced the standards-based redesign of Disney Store UK — and yes, folks, the new site even validates right out of the gate. The site's yet another compelling argument for how easy it is to build a high level of standards compliance and accessibility into a well-established brand. Congratulations to Andy and the entire team for a job well done.
WaSP Molly Holzschlag has posted an announcement for her new book, co-authored by fellow WaSP Dave Shea.
Just what I need — one more reason to blow my milk money on books. Thanks a lot, Molly. ;-)
Update: Dave Shea has added a very complete preview of the book.
Eric Meyer has released a second beta of his web standards-based slideshow system.
I haven't mucked about with it yet, but having developed a couple of slideshows myself I'm cognizant of the issues involved. This promises to be a valuable, lightweight alternative to the omnipresent PowerPoint slideshows we all know and (cough) love.
As I mentioned in my previous post, my employer has adopted Microsoft's ASP.Net as their server-side technology of choice.
For the most part, this decision has only tangential impact on me: I spend most of my time on client-side development, project management, IA and the odd incursion into the worlds of graphics and Flash/ActionScript. The server-side tech just stays out of my way, which is how it should be.
Once in a while, I do end up doing some lightweight server-side stuff, usually when a client is hosting their own site on an Apache server and using PHP or Perl. But last week I was pressed into service hacking together an email form in ASP.Net.
It was an extremely simple task, really — especially as I was able to crib most of what little code was needed from forms developed by our regular programmer. Even so, I have to admit being a little excited: it was a chance to develop some (very) basic chops with a widespread technology about which I've heard many good things.
The good feeling didn't last long, though.
First, I discovered a very curious practice among ASP.Net developers: wrapping the entire page — everything within the <body> tags — in a <form> tag. Apparently, if you want ASP.Net to process the dynamic bits (for example, to insert text from a database where you've stuck a variable name), said bits must live within a <form> tag with a runat="server" attribute.
I don't come close to knowing enough about ASP.Net to know if I've been sold a bill of goods, but if the foregoing is accurate it really rankles. Maybe the whole-page-is-a-<form> thing makes sense to Microsoft developers accustomed to the WinForms metaphor, wherein each and every UI is a form. But to this webhead, it makes no sense whatever. In fact, it seems downright hostile to good document structure.
Here's hoping the whole thing is just a misunderstanding caused by my extreme ignorance regarding ASP.Net best practices.
The second issue I encountered went beyond curious to being downright frustrating. One of the oft-touted benefits of ASP.Net is its labor-saving built-in functions. While slapping together my primitive form handler, I employed one that promised to be seriously nifty: automated form validation controls.
I know from the dozen or so PHP form handlers I've written that validation is easily the most time-consuming bit in a simple email form. So the prospect of inserting a few tag-like bits of ASP.Net code and being done had me giddily cackling away. Until I saw the error output.
You see, the error messages generated by ASP.Net are all wrapped in — eeyew — <table> tags. So much for that nice semantic markup.
To be fair, the <table> markup was valid, and it did include an id attribute for easy styling with CSS. But still. Why throw in an extraneous <table> at all? The validation control offers the option of producing error messages that are enclosed in a <p> tag or marked up as an unordered list. It would be easy enough to mark up the optional error header with an <hn> + an ID. What on earth do I need the extra <table> for?
Maybe there's an option to change the <table> to a <div>, or even to suppress it altogether. But after 30 minutes of furious googling, I hadn't found it and the deadline was looming. So <table> it was.
Like I said at the outset, I've heard lots of good things about ASP.Net: the separation of server-side code from markup, the ease of developing reusable components, its OO architecture and, of course, the built-in functions.
Those last, in particular, could be real time savers. But if using them means sacrificing semantic integrity, then they look like a Faustian bargain to me.
Ah, well. I suppose in the worst case, I'll just have to roll my own replacements — something I'd be doing anyway if I were using PHP or Perl (depending on which modules/libraries were available on the server, of course). And I still hold out hope that it's just my own ignorance that's making ASP.Net look bad, rather than any flaw in Microsoft's implementation itself.
Update: After I posted the above, something was nagging me about the list of ASP.Net benefits. It just seemed I was missing something. Turns out, I was: XML support. While I've not done a whole lot of XML wrangling (apart from XHTML, of course), from what little I know ASP.Net is at least as good as any other server-side language when it comes to handling XML, and probably better than most. PHP, for example, is pathetic without the installation of additional modules, and my (quite limited) experience with those indicates they're rather awkward — assuming you can even get them installed, which isn't a given if you're on a shared server.
So whatever standards-related kvetches there may be about ASP.Net, it's most definitely got its good points, too.
Update the 2nd: Another thing about the post was bothering me, too: my assertion that the markup for ASP.Net's form validation error messages was, if not semantically kosher, at least valid.
Guess what? I screwed up. It isn't. I'd forgotten that to color the error messages red, ASP.Net inserts a <font> tag inside the <table> tag. Apart from just being a semantic no-no, the <font> tag is also an inline element. But as noted, ASP.Net further wraps error messages in either a <ul> (the default) or a <p> — both block elements, and therefore invalid when placed inside the inline <font>.
So the markup ASP.Net generates by default is not only a semantic mess, it isn't even valid (X)HTML.
Update the 3rd: Via email, Nick Vrvilo confirmed my worst fears regarding ASP.Net's HTML generation functions. Nick says they do largely generate semantic nonsense, and invalid semantic nonsense at that.
Worse, Nick says there's no easy way to override the default markup. To do so, one has to write one's own libraries. Nick helpfully points to a tutorial on creating custom ASP.Net libraries but also observes that it's an awful lot of work just to create a simple valid HTML form. Just so.
On balance, Nick says that eschewing the built-in goodies in ASP.Net is the easiest way to get valid pages, and that it's really not that bad since you end up in the same place you'd be with other languages. True enough, and ASP.Net still has other advantages.
On a brighter note, Nick confirms my suspicion that the entire-page-inside-the-<form>-tag number is nothing more than a carryover from developers used to working on Windows apps, where each screen of an application is a form. He says developers used to that metaphor just slap the <form> element around everything out of habit; ASP.Net only requires it around the actual form elements — which is just as it should be.
So a mixed review, then. ASP.Net has much to recommend it, but Microsoft still gets raspberries for encouraging presentational, invalid markup with their default ASP.Net functions.
Update the 4th: Due to the volume of responses I've received, I've just rolled all the rest of my updates into a new post.
As part of a series on IE, C|net has an article on the problem of browser incompatibilities.
According to the article, users of 'minority' browsers like Firefox and Opera are unable to escape IE entirely due to IE-only sites:
For many people, using a non-Microsoft browser such as Firefox is now a must for secure Web surfing — but most still keep a copy of Internet Explorer around just in case.
The problem is that many Web developers create their sites so they work best with Internet Explorer (IE), but not to work as well with browser software used by relatively tiny groups of potential visitors.
C|net's not entirely wrong: there are indeed sites that don't work in browsers other than IE. However, I almost never open IE for anything but testing.
I've been using a Gecko-based browser for most of my surfing since 2001 and Mozilla 0.9.4; returning to IE now is just painful. Besides which, my primary personal computer is a Mac, so IE/Win isn't a realistic option for me at home. If a site requires IE/Win I take my business elsewhere and, if I can find a contact addy, let the losing vendor know why.
I can think of only two exceptions to the rule. First, my electric and gas supplier, Powergen. Their offline incompetence (speaking organisationally; no offence to any particular employee intended) has me looking to better-deal them as soon as they get my account straightened out anyway. The second is MaximumASP, the web host used by my employer. Their account management app simply falls down in Firefox — no surprise given their focus, I suppose. But if I were in charge of selecting hosting vendors at work, I'd have us at another host faster than you can say 'best practices.'
What's particularly frustrating about sites like Powergen and MaximumASP is that it really doesn't have to be that way — C|net's assertions notwithstanding:
These incompatibilities between browsers are as big a headache for developers as they are for Web surfers, some professionals say.
"It's definitely a problem," said Noel Briggs, a developer at Web design company NetTensity. "The time we waste on addressing browser incompatibility problems easily amounts to a significant percentage of our payroll."
Pardon me if I call 'baloney.'
When the WaSP was founded in the dark days of 1998, we estimated that browser incompatibilities added 20% to the cost of developing a web site. Back then, Netscape Communications was by far the worst offender in terms of standards support. The flaws in their Navigator browser and Communicator suite are the stuff of legends. IE was no picnic either, but Microsoft was trying and it showed: IE was neck-and-neck with Opera in terms of standards support, even a bit ahead where the DOM was concerned.
Today, both Navigator and Communicator have given way to a new Netscape browser based on The Mozilla Foundation's outstanding Gecko rendering engine. Meanwhile, web developers have developed standards-based methodologies like WaSP project leader Steven Champeon's 'progressive enhancement.'
Together, better browsers (including IE 6/Win, warts and all) and better practices have slashed the time I spend hacking around browser incompatibilities to something like 10-15% of total development time. And that's just development time; it doesn't account for design, IA, project management, photography, copywriting and so on. Those aspects of site development are only minimally impacted by browser incompatibilities.
So the percentage of total site development cost spent on browser incompatibilities these days is probably under 5%, at least in my experience. On scripting-intensive projects the total may be a bit more — as much as 10-15% of the total, even. But that's still a significant reduction from the bad old days.
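The back-of-the-envelope math behind that figure can be sketched as follows (the one-third share of total cost attributed to development is my assumption for illustration, not a number from any survey):

```python
# Rough estimate: what share of *total* site cost goes to
# browser-incompatibility hacks?
dev_share_of_total = 1 / 3      # assumed: design, IA, copy etc. make up the rest
incompat_share_of_dev = 0.15    # upper end of the 10-15% of development time

incompat_share_of_total = dev_share_of_total * incompat_share_of_dev
print(f"{incompat_share_of_total:.0%} of total cost")  # 5% of total cost
```

Even taking the high end of the development-time estimate, the incompatibility tax on the whole project lands around 5%.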
This reduction is almost entirely due to the adoption of web standards by browser makers and site developers. How do I know? Because testing and debugging all but the most complex sites in the most standards-compliant browsers — those based on Mozilla's Gecko engine, KDE's KHTML and the Opera browser — can be done in an hour or six. On a large project, the team probably spends more time than that making coffee or ordering pizza. The lion's share of the debugging time and effort is spent on the less standards-compliant IE/Win and, to a (much) lesser extent, the officially-moribund IE/Mac.
Bottom line, if IE/Win were brought up to snuff on standards, browser incompatibilities would largely be a non-issue. And even as it stands, devoting 5% of costs to support a group of browsers that comprises more than 5% of the audience (according to the most recent web statistics I've seen) looks like a bargain to me.
In my last post, I observed that the action on the web in the next few years would be its development as a platform for developing and deploying applications. In fact, the fun has already started.
Web mail clients like Gmail, Hotmail and Yahoo! mail were among the earliest, but the list has grown to include photo management apps, news feed management software and, of course, publishing software. Joel Spolsky offers perhaps the best history and explication of the move in his outstanding article "How Microsoft Lost the API War":
Web Applications are easier to deploy because there's no installation involved. Installing a web application means typing a URL in the address bar. Today I installed Google's new email application by typing Alt+D, gmail, Ctrl+Enter. There are far fewer compatibility problems and problems coexisting with other software. Every user of your product is using the same version so you never have to support a mix of old versions. You can use any programming environment you want because you only have to get it up and running on your own server. Your application is automatically available at virtually every reasonable computer on the planet. Your customers' data, too, is automatically available at virtually every reasonable computer on the planet.
The usual suspects have been rushing in to stake out their turf in the emerging 'rich application' market: Sun with Java Web Start, Macromedia with their Central product for Flash, Mozilla with XUL, Microsoft with Avalon/XAML and the W3C with SVG + XBL. As well, a group of developers including representatives from Mozilla, Opera and Apple have been working on an evolution of HTML under the auspices of a group called the WHAT Working Group.
While the immediate impact of most of these technologies on the working web developer is minimal, over the long haul one or more of them will define the way we work for the next decade. In the near term, they will have tremendous impact on developers working on purpose-built applications inside corporations, and to a lesser degree on developers working on consumer applications as well.
The competition for defining the next generation application platform will, to a large extent, determine whether we have an open computing environment with robust competition or are chained to a proprietary platform where innovation is determined by a single entity. As is usual in computing, Microsoft has the early advantage. From the aforementioned C|Net article:
Microsoft needs only a small percentage of Internet companies to offer Windows-specific tools to have succeeded in giving the platform a leg up in a world in which all operating systems with standards-compliant Web browsers are equal.
Even so, WHATWG includes a veritable who's who of working web and browser developers: Ian Hickson, Anne Van Kesteren, Dave Hyatt, Håkon Wium Lie, Dean Edwards, Fantasai. It's hard to bet against that group.
There's also a wildcard in this game: a little company called Google that has been moving more and more into web applications and has the brand recognition and user base to make waves in this space that would rock even Microsoft's boat.
Obviously, none of these technologies is a formal standard just now — even SVG + XBL is just at the draft stage. But things are moving quickly. WHATWG has promised working implementations by the end of the year, Sun and Macromedia's solutions are available now, and Avalon/XAML should be with us sometime in 2006.
I haven't really sorted out who I think the winners and losers will be as yet, but my gut tells me WHATWG has as good a chance as anyone. The most significant factor for me is that they are designing their specification to degrade gracefully in older browsers, and intend to implement much of it in IE as behaviors. That avoids the 'boil-the-sea' problem Joel cites with regard to XAML in his 'API War' article, which to me is a very big deal. Even Microsoft, rightfully famous for the lengths to which they go to ensure backwards-compatibility for Windows, doesn't have that edge in this battle. I must admit, too, that WHATWG is partly a sentimental favorite: not only do I respect and admire the drivers of the project, but they're committed to handing off their specification to a formal standards body such as the W3C or the IETF.
Whichever technology comes out on top, the important thing to me is that it be developed openly, with input from many disparate sources, and be available for use royalty-free. Without those attributes, the web will never achieve its full potential as an application platform.