The Folly of Depending on CSS Parsing Bugs

I meant to write something about Zeldman’s post earlier, but never got around to it (being very tied up putting off all kinds of other, more important things). However, Hyatt’s CSS Parser update reminded me about it.

My two cents: Zeldman is completely and totally wrong on this issue. I would not compensate for CSS rendering bugs by exploiting CSS parsing bugs except as a last resort. Think about it from a standardized-test perspective: what strong relation do CSS rendering bugs have with CSS parsing bugs? There’s no reason (nor right!) to assume that all future browsers with the same rendering bugs will have the same parsing bugs (and vice versa). In fact, if you look at recent releases (Safari, Opera, IE), even within browser families you’ll see that this is absolutely not true!

By Zeldman’s own argument, using CSS parsing/rendering relation-hacks is a bigger money pit than per-browser rules. Instead of if-statements for fixes, you’re dealing with a Cartesian product of pain. This is an order of magnitude more complex than plain ol’ browser sniffing!

At the end of the day, everyone has to weigh for themselves ease of maintenance against the need for user-agent-specific rendering code, but if you need to implement the latter (and chances are, if you’re doing this web stuff for a living, you will), then your two choices are really either client-side or server-side sniffing. If done properly*, it’ll save you much headache.
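As a sketch of the client-side flavor: a minimal (and deliberately incomplete) user-agent parser that reduces the string to a browser family and major version. The patterns and family names here are my own assumptions for illustration, not a production-grade detector:

```javascript
// Hypothetical sketch: reduce a userAgent string to { family, major }.
// The patterns and family names are assumptions, not an exhaustive list.
function parseBrowser(ua) {
  var m;
  if ((m = ua.match(/MSIE (\d+)/))) {
    return { family: "ie", major: parseInt(m[1], 10) };
  }
  if ((m = ua.match(/Opera[\/ ](\d+)/))) {
    return { family: "opera", major: parseInt(m[1], 10) };
  }
  if (/Safari/.test(ua) && (m = ua.match(/AppleWebKit\/(\d+)/))) {
    // Safari versions its engine, not the browser, in early UA strings
    return { family: "safari", major: parseInt(m[1], 10) };
  }
  if (/Gecko\/\d+/.test(ua)) {
    return { family: "gecko", major: 1 };
  }
  return { family: "unknown", major: 0 };
}
```

In a real page you’d call `parseBrowser(navigator.userAgent)` and branch on the result (or do the same match server-side against the User-Agent header).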

* This, of course, is the central issue. While for DOM support I’m much more of a browsercaps type of guy, for CSS rendering fixes it’s really about specific versions. Organize the conditions into families and then create per-major-version branches that contain the fixes (I do agree with Zeldman here that you probably don’t want to be sending completely different files). It’s then up to you whether to restrict these conditions to the specific version (v5) or to future versions as well (v5+); both have their pros and cons (MSN caught flak for doing the latter). If dealing with a bug, I generally pick the former, holding the optimistic view that the next version will fix the rendering bug.
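To make the v5 vs. v5+ distinction concrete, here’s a rough sketch of how those per-family, per-major-version branches might be organized. The file names and table contents are made up; only the shape matters:

```javascript
// Hypothetical fix table: browser family -> major version -> fix file.
// File names are invented for illustration.
var cssFixes = {
  ie:    { 5: "fixes-ie5.css", 6: "fixes-ie6.css" },
  opera: { 6: "fixes-opera6.css" }
};

// Strict mode ("v5"): only the exact major version gets the fix.
// Open-ended mode ("v5+"): an unknown future version inherits the
// newest fix at or below it -- the MSN-style gamble.
function pickFixes(family, major, openEnded) {
  var versions = cssFixes[family];
  if (!versions) return null;
  if (versions[major]) return versions[major];
  if (openEnded) {
    var best = null, bestVer = -1;
    for (var v in versions) {
      var n = parseInt(v, 10);
      if (n <= major && n > bestVer) { bestVer = n; best = versions[v]; }
    }
    return best;
  }
  return null; // strict: assume the next version fixed the bug
}
```

The strict branch encodes the optimistic view from above: a version you haven’t tested gets clean, fix-free CSS.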

In summary, if your designer tells you to use CSS parsing hacks to do filtering, please give him a good smack.

If you’re in the market for a USB -> TOSLink (digital audio output) adapter, I’d highly recommend skipping overpriced crap like the Extigy; instead, save yourself a chunk of cash and pick up this USB Sound Box for $25. This thing has worked flawlessly over the past few months on both Windows and OS X machines. It also has AC97 analog output built in, although if you’re hooking it up to anything worthwhile, you really want to use the TOSLink. (Made by Shin Kin Enterprises Co., Ltd., although their website seems to be offline.)

You know, I still find it surprising that after all these years, Netscape still doesn’t have an answer to MSDN’s DHTML Reference. Trying to find stuff in the Mozilla DOM document (TOC) is pretty trying, and even when you get to it, the information you’re likely to want won’t be there (lists of properties or interface names, much less any working examples). Sure, Mozilla is a group effort, but you’d think that over the past, say, 4 or 5 years, AOL/Netscape could have spared the budget for at least one intern to create a shell of a document, or install a simple search engine, or, well, anything…*

(I’ve been tackling some event code again tonight. event.keyCode values can of course be gotten via introspection (and by now are fairly well documented elsewhere), but how about, for example, finding out which Event interface types to use for something like dispatchEvent()? Google doesn’t turn up anything, and beyond the one example type (“click”), the Mozilla docs sure as heck don’t.)
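For what it’s worth, here’s the kind of table I’ve had to reconstruct by hand: which DOM 2 Events interface string to hand to document.createEvent() for a given event type before calling dispatchEvent(). Treat it as my best reconstruction from the spec, not official documentation:

```javascript
// Best-guess mapping from event type to the DOM 2 Events interface
// string expected by document.createEvent(). Reconstructed by hand;
// not from any official reference.
var eventInterfaces = {
  click:     "MouseEvents",
  mousedown: "MouseEvents",
  mouseup:   "MouseEvents",
  mousemove: "MouseEvents",
  keydown:   "KeyEvents",   // Mozilla-specific interface name
  keypress:  "KeyEvents",
  keyup:     "KeyEvents",
  load:      "HTMLEvents",
  change:    "HTMLEvents",
  submit:    "HTMLEvents",
  resize:    "HTMLEvents",
  scroll:    "HTMLEvents"
};

function interfaceFor(type) {
  return eventInterfaces[type] || "Events"; // generic fallback
}
```

In a browser, a synthetic click then looks roughly like: `var e = document.createEvent(interfaceFor("click")); e.initEvent("click", true, true); target.dispatchEvent(e);` (initEvent being the generic initializer; the MouseEvents interface also offers the fuller initMouseEvent).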

* No promises, but if I can ever get this event muck worked out, I might be launching something that could eventually help the situation.

I want my Dual DVI! Petition – this page also has a pretty complete list of the Dual DVI cards out there. Matrox also has the Millennium G550 Dual-DVI at a reasonable price, although not without limitations (both the primary and secondary desktops must be set to the same resolution and color depth, and the maximum available resolution is 1280×1024). Still, if you’re running W2K, this or the Parhelia is pretty much your only decent multi-monitor option.

I picked up the recently discontinued Gainward Ti 4600 card a few weeks ago. Originally I had intended to pick up a pair of LCD screens by now, but it turns out the screens I was looking at (NEC-Mitsubishi LCD1760NX) use the same 18-bit AU Optronics panels as the Hitachi CML174 series [Tom’s Hardware has some info: Hitachi CML174SXW, LCD 17: Samsung 171N & NEC LCD1760V vs. Everyone Else], which, well, just isn’t acceptable.

LG.Philips has recently announced some new screens with impressive brightness, color saturation, and contrast ratios, as well as 12–16 ms response times; according to Tom’s Hardware they’ll be out by this summer, so I’ll probably be waiting a while before I drop a thousand dollars on displays. Who knows, maybe Samsung will even have something new out by then.

List of consumer Dual DVI cards: