Killed By Complexity

“If this is the death of Wall Street as we know it, the tombstone will read: killed by complexity”, it was suggested on the front page of the Guardian today (Tuesday 16 September 2008).  A similar question might be asked about the roadmap for a number of Web developments.  Is Tim Berners-Lee’s vision for the Semantic Web over-complex?  Are the metadata standards which are being developed too complex to be used by many software developers? The abstract for a panel session at WWW 2005 suggested that “It has been estimated that all of the Web Services specifications and proposals (“WS-*”) weigh in at several thousand pages by now”. And one of the many objections to ISO’s decision to standardise the OOXML file format was that, at 6,000 pages, it was too complex for developers in small organisations to implement.

So now’s the time for more lightweight approaches, it could be argued.

Not so, comes the counter-argument. We will need to have comprehensive, well-grounded and unambiguous standards and specifications in order to build robust services.

The current uncertainties in the financial markets of course provide more than just an analogy – they are also giving rise to uncertainties in the IT sector.  This is often used as an argument to point out the dangers of dependencies on externally-hosted Web 2.0 services, as my colleague Paul Walk pointed out recently. But as I mentioned last year in a post entitled “Universities, Not Facebook, May Be Facing Collapse“, universities themselves are not immune to the financial difficulties which the banking and airline sectors are currently facing.

But into such discussions we should also add the financial stability of the standards-making organisations. Organisations which have government backing may be able to weather the storm, but what about those member consortiums whose sustainability is dependent on the financial backing of the commercial sector? And as the W3C is one such organisation, can we be confident that the development and maintenance of complex standards will be sustainable in the long run?  In light of the suggestion in a recent interview with Ian Hickson, editor of the HTML 5 standard, that the standard is unlikely to be a “Proposed Recommendation in 2022”, should we not now be asking the difficult questions regarding the sustainability of such standards which seem to have a long gestation period before they can be regarded as stable?

Or am I being unduly pessimistic?  Might not the current financial uncertainties be a mere blip, one which will not affect standardisation development processes along the lines I’ve hinted at? Or will a legacy of George W Bush’s economic mis-management (or Tony Blair’s, if you are of a different political hue) be the failure of the HTML 5 standard to achieve its Proposed Recommendation status by 2022?


  1. Universities, Not Facebook, May Be Facing Collapse
    I missed this post the first time around (actually, I wasn’t reading your blog, so I have an excuse).

    Interesting, given the strongly-held belief (in some circles, and referenced again at Repository Fringe ’08) that Google/Flickr/etc should not be used in preference to in-house solutions as they are not going to be around as long as your University.

  2. The collapse of any major external resource would be wide-ranging and, therefore, understandable in the short term. If the problem wasn’t addressed quickly, however, it’s more difficult to forgive, regardless of where the problem originated.

    The level of uncertainty that you mention means there is no clear answer whether or not to go external or adopt an in-house approach to services. Different – yet relatively equal – risk is attached either way you look at it.

    I don’t think you’re being unduly pessimistic. However, do you not believe that anything more dramatic than a cautious and informed approach would involve moving from the unknown to just another unknown?

  3. I tend to agree that complexity does kill the cat. Think back 15 years ago to the point when there were arguments on whether JANET should stay on a Coloured Books -> ISO-OSI track, or move to TCP/IP. The simpler standard was adopted, and it was absolutely the right choice!

    But I’m interested in your comment about HTML5 by 2022. Why HTML5 and not XHTML? What’s currently broke that absolutely needs fixing? Is extra complexity needed, or is HTML5 about refining back towards simplicity?

  4. The comment about HTML5’s timetable seems to be based on a misunderstanding — the spec itself is likely to be stable by late 2009, and mostly “done” by 2012. The ten years after that is the time to get two complete and bug-free implementations in browsers, something that hasn’t been tried for any major Web spec before.

  5. It is not the companies whose collapse we need to fear. It doesn’t matter if they do collapse. If Facebook the company disappears, or MySpace the company, or even the mighty Google, new companies will arise to fill the void, because social networking and the internet will not collapse. The communication networks, functions and needs have changed forever.




  1. Web 2.0 In Troubled Economic Times « UK Web Focus - [...] in their uses of Web 2.0 services at a time of a global recession?  In response to a recent …
