Shallow Thoughts

because deep thoughts smack of effort

Return to the Web of the 1990s?

Posted in Web Stuff by Bridget on December 17th, 2007

The web development community continues to comment on Opera v. Microsoft, the W3C, and what is needed to propel this industry forward in the next decade. The latest installments are by Jeff Croft and Alex Russell and I simply can’t agree with their take on what would make things better.

As Jeff Croft said in his post titled: Do we need a return to browser wars (emphasis mine):

This is one of my biggest pet-peeves within the standards movement: this idea that if something isn’t compliant, it must suck. We’ve completely lost the innovative, experimental, lets-try-something-crazy attitude of web designers in the 90s, because we’re too damn concerned about making things that are compliant.

Reliving bits of glory from the days of the “Wild, Wild Web” isn’t helpful in any meaningful way. The fact is, the ’90s are over. This decade is already winding down, too. It’s time to move on.

The past is good for studying the effects of history and for learning from earlier developments. Reviewing what has been can help inform what may yet come. However, the browser wars of the ’90s were not good for web designers and developers. If they had been, the standards movement would never have been born.

If you’re going to review the past, it helps to look at the whole picture, not just a slice.

Jeff also said (emphasis mine):

Once in a while, we should be saying fuck standards and trying something out of the box. Obviously, that site you’re working on for a major client in the education sector probably isn’t the time to try this, but we do need to find the time. It’s the only way to move our industry forward.

Where should we do all of this experimentation? On our personal sites and blogs? I mean, I don’t really understand where the appropriate place is for displaying and/or parading this kind of alleged web-coolness.

In an era when Target is being sued because people are hindered from making purchases online, and when Opera is filing claims against Microsoft over lack of standards support and monopolization of the market, how is moving away from standards compliance supposed to make things better?

Bemoaning standards as boring overlooks the much larger issue that those very same standards complement: accessibility! Just when the standards “movement” begins to gain some traction advocating that the web should be open for everyone, Jeff makes comments about standards in the context of browsers and validation. It’s incredibly short-sighted.

To focus on what shiny new toys designers/developers can play with to dazzle and delight themselves and like-minded geeks is very 1990s! Haven’t we moved beyond that as an industry? Haven’t we grown up at all? I think we should be focusing on the people who will ultimately be using websites. That is to say, everyone else.

To be fair, Jeff’s comments are based on what he read in Alex Russell’s latest blog entry, The W3C Cannot Save Us. So, I went and read what Alex said (edited to pull out my point — and emphasis mine):

…there are huge tracts of the HTML, CSS, and DOM spec’s that you simply can’t use. IE’s bugs combined with its market share conspire to ensure that’s true…Mozilla, Opera, and Safari all have their own warts as we get to the edges of what’s even theoretically possible w/ current specs. And that’s not even taking into account how utterly wrong, broken, and silent the specs are in several key areas. Even if the specs were great, we’d still be gated by the adoption of new renderers.

So, it is still like the ’90s, but to a lesser degree, right? When it comes to building something, anything for the web, designers and developers have to take the different browsers and how they render things into consideration.

The difference between the ’90s and now is that the industry began to care about making websites that didn’t carry badges stating “Best viewed in [insert browser name.version].” Web designers and developers didn’t want to write markup and/or code aimed at particular browsers or, God forbid, build multiple versions of the same site/application to target all of them.

The more this discussion/debate continues, the more reasonable the idea becomes for there to be one core browser engine that renders "web code", allowing companies to bolt onto that core whatever features and frills they would like. The question is, should it be the responsibility of the W3C to create that engine? Is that what will solve the woes of the web as it grows and matures?

12 Responses to 'Return to the Web of the 1990s?'

Subscribe to comments with RSS or TrackBack to 'Return to the Web of the 1990s?'.

  1. Jeff Croft said,

    on December 17th, 2007 at 2:36 am

    Thanks for the response!

    I think the “one core browser engine” idea is great, but entirely unrealistic. Microsoft, Apple, Mozilla, Opera, and the others are in this game for one reason: world domination. They would never settle for a situation where they weren’t in charge. It’s a great idea, but it’s just a bit too “hippie” to be practical. Peace, love, and happiness sounds great, but rarely actually happens.

    Hopefully you understand that neither myself nor Alex are suggesting we abandon standards and let all browser makers do their own thing. Rather, we’re suggesting that browser makers should be encouraged to innovate *alongside* their implementations of standards (which have finally gotten to a pretty good place, overall).

  2. Alex Russell said,

    on December 17th, 2007 at 6:42 am

    Hey Bridget:

    So one of the things I was tempted to cover in my original post was some mention of how standards bodies can effectively shorten the time between deploy and standardize. WHATWG has gone one route (BDFL, short iterations, deep implementer involvement) and the EcmaScript 4 working group has gone another (baseline reference implementation maintained by the WG). The ES4 WG example is perhaps most similar to what you’re proposing and I do have some hope that it’ll work.

    The idea of a single core browsing engine is also interesting from the perspective of economics. Browser vendors make what money they get from chrome and search engine integration, leaving renderer development as a “loss leader” to hopefully entice users and organizations to switch (or at least not run screaming for the hills). One can imagine the browser vendors aren’t keen to keep re-running this race for every new set of features either. Your idea might have legs.

    One thing I’m not clear on yet is how such an arrangement (particularly if it happens at W3C) would end up in a state any different than Amaya (http://www.w3.org/Amaya/). Since Amaya’s first and only loyalty is to W3C specs and not to the real web, it fails miserably at enough real-world tasks that it wouldn’t ever form the basis of a real product. Without competition, it’s unclear to me what force would provide the strong impetus to improve the renderer over the long haul. Giving maintenance responsibility to the W3C seems like a fast path to a tragedy of the commons.

    That said, if a way can be found to encourage healthy competition and improvement, a single renderer could indeed get us out of the sticky spot we’re in (at least in the short term).

    Regards

  3. Bridget said,

    on December 17th, 2007 at 9:34 am

    Gents, thanks for responding on my silly little blog.

    I can’t take the credit for the core browser engine idea. I read a comment on Andy Clarke’s blog where Mike Loizides brought up the idea. I’m merely putting out feelers to see if it is feasible or reasonable as a solution.

    I originally thought the same as Jeff — that it might not be practical. However, the more I read from the vocal web community, the more I question that it may be a solution for us all.

    Jeff, you said:
    >Hopefully you understand that neither myself nor Alex are suggesting we abandon standards and let all browser makers do their own thing.

    I would certainly hope not, but having the browsers competing to provide the new and shiny you want, tends to cause that kind of drift, don’t you think? Isn’t that why sites were being built to accommodate Netscape and/or IE? Their versions of the shiny didn’t play nice together. I’m not so certain it would be any different this time.

    To Alex’s point about Amaya failing miserably, I’m not at all familiar with Amaya but will read up on it to be better informed. A product that abandons real world usability is certainly not something I would favor. I hope that much remains clear in any commentary I might make.

    I would like to understand one comment that Alex made in his article. Exactly how does Zeldman hurt me? I just don’t see it.

  4. beth said,

    on December 17th, 2007 at 12:02 pm

    Totally in agreement that adoption time needs to be shrunk in a big way!

  5. Jeff Croft said,

    on December 17th, 2007 at 12:23 pm

    > I would certainly hope not, but having the browsers competing to provide the new and shiny you want, tends to cause that kind of drift, don’t you think? Isn’t that why sites were being built to accommodate Netscape and/or IE? Their versions of the shiny didn’t play nice together. I’m not so certain it would be any different this time.

    Bridget: To me, the namespacing we have now solves this problem. The WebKit team has been adding new features left and right, and it’s not hurting anyone, because they’re doing it in their own namespace, and not moving it into the public namespace until it’s a standard (or at least so widely implemented that it’s a de facto standard).
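
    [Editor's note: a small sketch of the namespacing Jeff describes. The properties are real vendor-prefixed CSS of the era; the class name is made up for illustration.]

    ```html
    <style>
      /* Experimental properties live behind an engine-specific prefix;
         a browser that doesn't recognize a prefixed property simply
         ignores that one declaration and moves on. */
      .feature-box {
        -webkit-border-radius: 8px; /* WebKit's experimental implementation */
        -moz-border-radius: 8px;    /* Gecko's experimental implementation */
        border-radius: 8px;         /* the standard property, once finalized */
      }
    </style>
    ```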

  6. Bridget said,

    on December 17th, 2007 at 12:52 pm

    Jeff,

    Agreed: the namespacing isn’t hurting anyone.

    Yet, people balk at conditional comments for implementing CSS now, so how long do you think it would take before people complain about what would eventually be called “namespacing hacks” in the CSS files?

    Just thinking out loud on that aspect, really.
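
    [Editor's note: for reference, the conditional comments mentioned above are an IE-only HTML mechanism. The syntax below is the standard form; the stylesheet filename is hypothetical.]

    ```html
    <!-- Only Internet Explorer parses conditional comments; every other
         browser treats the whole block as an ordinary HTML comment. -->
    <!--[if lte IE 7]>
      <link rel="stylesheet" href="ie-fixes.css">
    <![endif]-->
    ```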


  7. on December 17th, 2007 at 12:59 pm

    “I would like to understand one comment that Alex made in his article. Exactly how does Zeldman hurt me? I just don’t see it.”

    I think it’s more an allusion to Standardistas in general, eschewing some of the new shiny hotness in CSS3 and/or what the Webkit and Gecko teams are building into their builds because they might cause a validation error.

    Why reward users using a 5 year old browser with an “identical cross-platform” degraded experience and punish those using a modern one that can support the shiny new hotness?

    For how long do we let Internet Explorer hold us back? Does anyone honestly think IE8 is going to magically offer full support for current standards?

  8. Bridget said,

    on December 17th, 2007 at 1:10 pm

    Brendan said:
    >I think it’s more an allusion to Standardistas in general, eschewing some of the new shiny hotness in CSS3 and/or what the Webkit and Gecko teams are building into their builds because they might cause a validation error.

    Yeah, but CSS3 isn’t even a nailed-down draft yet, is it? I sort of understand eschewing some of that pure awesome when it is picked up too soon. Then again, validation isn’t the end-all, be-all of a working site. Plenty of sites function just fine without being valid.

    That’s not to say that validation should be ignored. We know the nightmare that ensues when people don’t use some sort of touchstone for comparison.

    As a side note, I’ve seen Zeldman (the man, not the “image” of Standardistas) be less of a zealot in some areas than the stereotypical standards nazi. So, I think the guy might appreciate a little, be it ever so slight, disassociation with that image.

  9. Jeff Croft said,

    on December 17th, 2007 at 1:55 pm

    > Yet, people balk at conditional comments for implementing CSS now, so how long do you think it would take before people complain about what would eventually be called “namespacing hacks” in the CSS files?

    People would certainly do namespacing hacks, but that doesn’t bother me much. If you’re using “namespacing hacks,” then you’re well aware of the fact that you’re doing so at your own risk. To me, anything that is prefixed by -renderingengine is a “use me at your own peril” sort of thing. It’s a red flag to the developer that these things may still be “beta,” so to speak, and if you use them for a production site, you do so at your own risk. The important thing is that they’re there for designers and developers to play with, provide feedback on, and learn about.

  10. Jeff Croft said,

    on December 17th, 2007 at 3:58 pm

    FWIW, I would absolutely agree that Zeldman is not the type of zealot standardista we have to be worried about. He’s a pragmatist, for sure.


  11. on April 28th, 2008 at 5:01 pm

    [...] crowd has responded with fire and passion, sending up a rallying cry against what they see as a return to the browser wars of the ’90s. All of this begs the question – what are we really trying to do with web [...]


  12. on October 9th, 2013 at 2:29 pm

    [...] crowd has responded with fire and passion, sending up a rallying cry against what they see as a return to the browser wars of the ’90s. All of this begs the question – what are we really trying to do with web [...]

Leave a Reply