I now feel compelled to write on this subject after holding my breath and counting to 10 several times. I recently read YABA* about building a "Web 2.0 Meter". Not a bad idea *if* you had some sort of defined criteria that those being judged could adhere to. Another site claims to be a Web 2.0 Validator.
Enough! I can even hear Homer Simpson saying "D'oh!!" when he thinks about this. Time to rant a bit from Logic 101: you cannot measure something with two independent "meters" without some distinct set of metrics around the subject. Sorry folks, it is that simple. I would like to point out that the Web 2.0 Validator site had the sense to state the rules it uses, and that the gist of the article at O'Reilly was not about the Web 2.0 meter – it was a realization that the concepts we have come to "associate" with the Web 2.0 were really the Web 1.0's original goals. GAH!!! I just had an unpleasant realization that now I am trying to quantify the Web 2.0. To solve this problem, I think we have to look to the past as well as the future (yeah yeah – so what does that rule out? Thinking about the exact present moment?). The O'Reilly folks are pretty smart IMO, so please don't take this as some petty stab – more of a friendly prod :-)
If "Web 2.0" is to be used as a catch-all term for where we are going, let's put some substance behind it. If not, it will suffer the same symptoms as SOA and Web Services – both very meaningful to most people, just with differing semantics. In fact, the lack of clarity around SOA led a group of almost 200 people to get together and write a formal Reference Model for SOA under the auspices of OASIS. Similarly, a group got together within the W3C and worked on a Reference Architecture for Web Services. I am proud to state that I worked on both projects.
So what can be done to put substance in the Web 2.0?
1. Write an abstract reference model to show the components of the Web 2.0; and
2. Create sets of high-level abstract patterns, mid-level patterns and low-level idioms to illustrate what is really meant by the Web 2.0; and
3. Create reference architectures (plural, and somewhat generic) for all components of the Web 2.0, describing their externally visible properties and relationships with other components.
An example would be to illustrate the syndication-subscription pattern using some architectural patterns template. The abstract notion is that subscribers notify a syndication component of their wish to receive content. When the syndication component has content ready to push out, it configures a list of recipients based on some criteria, then pushes the content out. A lower-level idiom could show this implemented using Apache components, perhaps even with options for content formatting based on device, reliable messaging protocols, security and end-user authentication with a persistent security model for the content itself.
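To make that abstract pattern a touch more concrete, here is a minimal sketch of the syndication-subscription idea in Python. All of the names (Subscriber, SyndicationComponent, format_for_device) and the topic-based recipient criteria are my own illustrative choices – not part of any formal pattern template.

```python
# Minimal sketch of the syndication-subscription pattern.
# Names and criteria are illustrative only.

class Subscriber:
    def __init__(self, name, topics, device="browser"):
        self.name = name
        self.topics = set(topics)   # the criteria used to select recipients
        self.device = device

    def receive(self, content):
        print(f"{self.name} ({self.device}) received: {content}")


class SyndicationComponent:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, subscriber):
        # Subscribers notify the syndication component of their wish to receive content.
        self.subscribers.append(subscriber)

    def publish(self, topic, content):
        # Configure a list of recipients based on some criteria (here: topic),
        # then push the content out, formatting per device along the way.
        recipients = [s for s in self.subscribers if topic in s.topics]
        for r in recipients:
            r.receive(format_for_device(content, r.device))


def format_for_device(content, device):
    # Placeholder for device-specific formatting (e.g. mobile vs. desktop markup).
    return f"[{device}] {content}"


if __name__ == "__main__":
    feed = SyndicationComponent()
    feed.subscribe(Subscriber("Alice", ["web2.0"], device="mobile"))
    feed.subscribe(Subscriber("Bob", ["soa"]))
    feed.publish("web2.0", "New post: putting substance behind the term")
```

The point is not the code itself, but that the same abstract roles (subscriber, syndication component, recipient criteria) could be re-described at a lower level with Apache components, messaging protocols and security bolted on, without changing the pattern.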
The cool thing about this approach is that it still gives each and every implementer the freedom to make their own black-box components whilst preserving a common layer of understanding. It also provides documentation about what is really meant; granted, those who cannot distinguish abstract from concrete may still be confused.
Of course, to do this you would require an architectural patterns meta-model and template that allows you to go from the very abstract to the very concrete, but I think I know where one is that can be donated to some organization.
Why should this be done? Simple – without this, "Web 2.0" is nothing more than a marketing term. Sure – several people will say "no – it means X and exactly X", but the chances of Boolean Y = eval(personA.X == personB.X) evaluating to "1" in every instance are very low IMHO.
The Web 1.0, aka the "internet", has achieved a common definition though. Even though it is not concisely written down, there is general consensus on what a web server does, what the layered wire protocols do, how security works and how people interact with websites (via browsers). If someone says "this server is internet enabled", people take it to mean that it can accept HTTP requests and return text in compliance with the requester's requirements.
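As a trivial illustration of that implied contract, here is a minimal sketch using Python's standard http.client: the client states its requirements in an Accept header and gets back a status line, headers and a text body. The host name is just a placeholder.

```python
# Sketch of the implied Web 1.0 contract: send an HTTP request,
# get back text that (ideally) honours the requester's stated requirements.
import http.client

conn = http.client.HTTPConnection("example.com")
conn.request("GET", "/", headers={"Accept": "text/html"})
response = conn.getresponse()
print(response.status, response.reason)        # e.g. 200 OK
print(response.getheader("Content-Type"))      # e.g. text/html; charset=UTF-8
body = response.read()                         # the text the server chose to return
conn.close()
```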
Sorry, folks – I just don't believe that the Web 2.0 will inherit an implied reference model the way the Web 1.0 did. The culprit for this is the Web 1.0 as it exists – it allows anyone, almost anywhere, to write what they think the Web 2.0 is and share it with others. Also, unlike the Web 1.0, the Web 2.0 is not mandatory. The basic components of the Web 1.0 such as HTTP, TCP/IP, SMTP, MIME, HTML etc. were all mandatory, therefore it was fairly easy to draw a box and state: this is the Web 1.0. One could even throw in some common non-mandatory components and still make a solid statement (for example, scripting languages such as ASP, VBScript, JavaScript and ActionScript, plus CSS, XML et al).
A Reference Model for the Web 2.0 might want to declare some form of compliance and conformance statements. These might take the shape of a weighted test (sketched below) or a bar that you must pass. Regardless, until this exists, what is the point of building "validators" and "meters"? Harrumph – end of rant. Take it all with a grain of salt – the O'Reilly folks are smart and I'm sure we'll see something soon ;-)
If not – does anyone feel compelled to take a stab at a formal definition? I will gladly jump in the fray and donate my time to help.
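And to give a flavour of the conformance side mentioned above, here is a rough sketch of what a weighted conformance test might look like once a real reference model exists. The criteria, weights and passing bar below are all invented for illustration – none of them are actual Web 2.0 rules.

```python
# Hypothetical weighted conformance check against a (not yet existing)
# Web 2.0 reference model. Criteria, weights and the bar are invented.

CRITERIA = {
    "supports_syndication": 0.3,
    "exposes_service_api": 0.3,
    "user_generated_content": 0.2,
    "rich_client_interface": 0.2,
}

PASSING_BAR = 0.6  # arbitrary threshold for this illustration

def conformance_score(site_features):
    """Sum the weights of the criteria a site actually satisfies."""
    return sum(weight for name, weight in CRITERIA.items()
               if site_features.get(name, False))

def is_conformant(site_features):
    return conformance_score(site_features) >= PASSING_BAR

if __name__ == "__main__":
    site = {"supports_syndication": True, "user_generated_content": True}
    print(conformance_score(site), is_conformant(site))  # 0.5 False
```

A "meter" would then simply report the score, and a "validator" would report pass/fail against the bar – but both only make sense once the criteria and weights are agreed upon somewhere.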
*Yet Another Blog Article – in case you didn’t figure it out ;-)
Tuesday, April 18, 2006
3 comments:
From their website:
ReplyDelete"Who makes the rules?
All the rules of web 2.0 are provided by users of this site. The definition of web 2.0 changes on a daily basis. Now you can keep up with your web 2.0-ness since this site checks randomly against the most recent rules decreed by it's users."
Changes on a daily basis??
How can anyone take the idea of Web 2.0 seriously when seeing this?
I agree with you, but some are happy without a functional definition of the term:
http://radar.oreilly.com/archives/2005/08/not_20.html
http://radar.oreilly.com/archives/2005/09/what_is_web_20.html
Duane, check this out ;-)
http://www.tomrafteryit.net/oreilly-trademarks-web-20-and-sets-lawyers-on-itcork/