Don’t get me wrong. If you know me, you know that I’m the first one to push for valid markup. I’m a perfectionist at heart, and nothing pleases me more than seeing that beautiful message on the W3C’s Validator: “This page is Valid XHTML 1.0 Strict!” But the more I think about it and the more I talk to people about it, the less I see a need for it. Yes, I know, it ensures accessibility, it ensures compatibility across multiple platforms, it’s machine readable and it follows all the rules, but so long as the mainstream browsers don’t stop accommodating bad markup, what’s the point?
I wish there were a more solid reason for conformity. Say, for example, the browser behaved like a compiler: rather than doing its best to patch up faulty markup, what if it spat out a list of errors that need correcting? Then maybe people would make more of an effort. Or if the server simply refused to serve up documents that didn’t validate. But as things are right now, there’s no real incentive to make sure all of your tags are closed and all of your ampersands are escaped. The worst of it is when a document is declared to be XHTML, no matter the flavour, and doesn’t validate, but the browser* still parses it! I mean, isn’t XHTML a reformulation of HTML 4 in XML 1.0? And isn’t XML unforgiving to the point that an XML parser must stop when it reaches a fatal error?
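To see just how unforgiving a real XML parser is, here’s a minimal sketch using Python’s standard-library `xml.etree.ElementTree` (the markup snippets are my own invented examples): one unclosed tag or one bare ampersand is a fatal well-formedness error, and the parser refuses to produce a document at all, exactly the behaviour a true XHTML browser would have to show.

```python
# XML's "draconian" error handling: a well-formedness violation is a
# fatal error, so the parser stops instead of patching up the markup.
import xml.etree.ElementTree as ET

valid = "<p>All tags closed &amp; all ampersands escaped.</p>"
invalid = "<p>An unclosed tag & a bare ampersand."

def is_well_formed(markup):
    """Return True if the markup parses as well-formed XML."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

print(is_well_formed(valid))    # True
print(is_well_formed(invalid))  # False: the parser halts, nothing is rendered
```

Mainstream browsers of the day do the opposite for documents served as HTML: they silently repair both errors and render the page anyway.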
So I ask you, what’s the deal? We fight for standards, we expend all this energy, and yet the major browsers in use today couldn’t care less whether the document they’re parsing is well formed. How are we supposed to be taken seriously as professionals if, as far as the browser’s concerned, tag soup is as good as a valid document?
I say let’s start a movement, “a zero tolerance for invalid markup” campaign of sorts. Let’s really take back the web, let’s browse happier. Any takers?
* I don’t specify browsers by name because all of the popular ones are equally guilty.