Putting a Face on Web 2.0

Montreal, QC (PRWEB) April 21, 2006

Sponsored by the Association for Computing Machinery, CHI 2006, the premier international conference for human-computer interaction, is being held here so that researchers and practitioners from all segments of the CHI community (design, education, engineering, management, research, and usability) can interact, inform and inspire each other.

If you haven’t heard the phrase “Web 2.0” yet, you will soon enough. It has been adopted by countless website marketers hoping to sound cutting-edge and “forward thinking”. The problem with this moniker is that it doesn’t really exist. Unlike proprietary software such as Adobe Photoshop or Microsoft Windows, which have versioned releases and updates, or even open-source languages like PHP and Perl, Web 2.0 has no governing body that determines what is and, more importantly, what is not “Web 2.0” compliant.

Originally conceived by O’Reilly Media and MediaLive International, Web 2.0 was introduced to the world at a conference in 2004. So what exactly was Web 2.0? The answer, at the time, was unclear. The primary message most conference-goers took away from the event was “the web as a platform”. Since then, O’Reilly has endeavoured to expand this concept into a set of ideals that more clearly define Web 2.0.

If you visit the Web 2.0 conference website (http://www.web2con.com), you will find an impressive list of speakers, moderators, seminars and workshops worthy of any international web-based conference. What you won’t find, however, is a definition of Web 2.0. For this we must search the O’Reilly website (http://www.oreilly.com) to find an article written by founder and CEO Tim O’Reilly from September 2005 (http://tim.oreilly.com/news/2005/09/30/what-is-web-20.html).

In the article O’Reilly states, “You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core.”

Essentially, Web 2.0 is more about guiding principles than rules or even technology. The seven guiding principles O’Reilly cites are:

1. The web as a platform

2. Harnessing collective intelligence

3. Data is the next Intel Inside

4. End of the software release cycle

5. Lightweight programming models

6. Software above the level of a single device

7. Rich user experience

While all of these principles are very user-oriented, notice that only the last one actually contains the words “user” and “experience”.

The entire Web 2.0 philosophy centres on the idea that data streams will provide the next model of what the web should be. One of the biggest omissions in that model, however, is the end-user experience of how this data is presented. If we examine the principles, we see a gap between the “back end” and the “front end”, and it is this gap that will mean the difference between novelty applications and the usable, robust applications and services that will rise to the top of the Web 2.0 crop.

The quintessential example given for almost every guiding principle in any Web 2.0 discussion is search engine giant Google. During the days of Web 1.0, Google was already using the web to essentially run its “application”, which, to most users, was simply a website that was a search engine; a really good search engine. Not many users stopped to consider why or how Google was doing this. It didn’t matter that there was an enormous back-end database, or that Google wasn’t even hosting the data but rather acting as a middle-man between users and their online experience. Users were unaware that Google was acting like a software application with no download, installation, updates or fees. They just used it because it worked really well. While this example is often used as the epitome of “the web as a platform” in Web 2.0 discussions, the focus is exclusively on how Google technologically harnessed this data and delivered it to its users. It is cited for using PageRank to provide better search results. Consider what would have happened, however, if Google had ignored basic user experience principles and made it difficult for users to see and act on their search results. Users would have been just as frustrated as if they hadn’t received the results in the first place and, in most cases, even more frustrated knowing that the results were there but not accessible because of poor design.
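
Worth a brief aside: the idea behind PageRank is that a page’s importance is the probability that a “random surfer” following links ends up on it, computed by repeatedly redistributing rank along those links. A minimal sketch follows; the toy graph, damping factor and iteration count are illustrative assumptions, not Google’s production system.

```typescript
// A toy power-iteration sketch of PageRank; the graph, damping factor and
// iteration count are illustrative, not Google's production system.
type Graph = Record<string, string[]>; // page -> pages it links to

function pageRank(graph: Graph, damping = 0.85, iterations = 50): Record<string, number> {
  const pages = Object.keys(graph);
  const n = pages.length;

  // Start every page with an equal share of rank.
  let rank: Record<string, number> = {};
  for (const p of pages) rank[p] = 1 / n;

  for (let i = 0; i < iterations; i++) {
    // The (1 - damping) term models a surfer jumping to a random page.
    const next: Record<string, number> = {};
    for (const p of pages) next[p] = (1 - damping) / n;

    // Each page passes a damped share of its rank along its outgoing links.
    for (const p of pages) {
      const outLinks = graph[p].length > 0 ? graph[p] : pages; // dangling pages share with everyone
      const share = (damping * rank[p]) / outLinks.length;
      for (const q of outLinks) next[q] += share;
    }
    rank = next;
  }
  return rank;
}

// A page that is linked to by more (and better-ranked) pages ends up with a higher score.
console.log(pageRank({ home: ["about", "blog"], about: ["home"], blog: ["home", "about"] }));
```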

By adding services besides its search engine, Google has been able to “end the software release cycle”. You don’t download or install new versions of Google; it simply gains new features from time to time. Virtually all of these extra services rely on some type of data-driven delivery system. It was the integration of NAVTEQ and DigitalGlobe data into a mapping system that allowed Google to effectively enter a mapping arena pioneered (arguably) by MapQuest. Yet when this example is noted it is usually in reference to the fact that the data providers will hold the keys in Web 2.0: “data is the next Intel Inside”. It is used as an example of a company’s ability to license the same data as another to offer a competing product or service. It is never mentioned that Google’s delivery system is also a much better user experience. Try mapping a specific location or general area in both systems. Google’s maps are draggable, and the intuitive continuous-zoom slider, along with the Map, Satellite and Hybrid views, puts them head and shoulders above MapQuest’s results. Then there is the method for marking your location: MapQuest uses a star, while Google Maps uses a virtual pushpin with a collapsible information bubble telling you about your location and giving you several options for handling the data. It is the user experience above all that makes Google’s mapping system superior. If MapQuest had offered a better user experience, it would have been much more difficult for Google (or Yahoo! or Microsoft) to bring in a successful competing service.
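
To make the pushpin-and-bubble pattern concrete, here is a minimal browser-side sketch using the present-day Google Maps JavaScript API (which did not exist in this form in 2006); the element id, coordinates and copy are placeholder assumptions.

```typescript
// Minimal sketch: a draggable map with a pushpin and a collapsible info bubble.
// Assumes the Maps JavaScript API <script> tag has already loaded `google.maps`
// and the page contains <div id="map"></div>. Coordinates and text are placeholders.
declare const google: any; // or install @types/google.maps for real typings

const montreal = { lat: 45.5019, lng: -73.5674 };

const map = new google.maps.Map(document.getElementById("map") as HTMLElement, {
  center: montreal,
  zoom: 13, // users can drag the map and zoom continuously
});

const marker = new google.maps.Marker({ position: montreal, map, title: "CHI 2006" });

const bubble = new google.maps.InfoWindow({
  content: "<strong>CHI 2006</strong><br>Montréal, QC",
});

// Clicking the pushpin opens the collapsible bubble with details about the location.
marker.addListener("click", () => bubble.open(map, marker));
```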

Another widely touted Web 2.0 model is Apple’s iTunes. While admittedly not really a web application, it uses the power of a web-based, data-driven back end and embodies many of the Web 2.0 principles, including “software above the level of a single device”. In the Web 2.0 paradigm (yes, the word actually applies appropriately in this case), users can access software on a plethora of devices running myriad operating systems. iTunes certainly fits this bill by seamlessly integrating PC and Mac desktops and laptops with the many different versions of the iPod, using essentially the same software (there are, of course, Windows and Mac versions of the software, as well as the “invisible” software that gets downloaded to the device when it is connected to your computer). iTunes is proprietary and does have release versions, but it is free and virtually platform-independent.

Apple has also been able to gain access to data that was previously released solely through the data providers’ controllable channels. Record companies want the power to tell you where you can buy songs and how much you’re going to pay for them. Now, with the iTunes store, you can buy a single song for ninety-nine cents: any song in the library, with no separate price points based on availability or popularity. These songs would not be available without the agreement of the record companies, and they are, of course, making money from each sale. However, they have lost the power to make you buy an entire album at their fixed price point because you happen to like one song. While Web 2.0 evangelists would have you believe this is a result of the ability to harness and exploit a huge repository of data, they seem to forget the actual reason iTunes is so popular: the iPod.

It is the iPod alone that accounts for the enormous success of iTunes. No single computer-related device has ever exploded onto the market, or grown so relentlessly from quarter to quarter, the way the iPod has. iPods sell because people like them, and people like them because of their design. The integration of industrial and software design has made the iPod the item to have. The success of iTunes is a natural extension of the success of the original device created to run the software. Thus, good user experience design, and not Web 2.0 principles, is responsible for iTunes’ high adoption rate.

Another hot Web 2.0 topic is RSS and similar technologies, which allow users to be notified when information that is important to them changes. This could include stock market quotes, blog posts, annotations to images a user has uploaded to an image-sharing service such as Flickr, or the top news stories as reported by the user’s favourite news source. While this information is becoming more easily accessible, there is still something to be said for the experience of receiving the information from the data provider in a user-oriented fashion. Having streams of news items pour in through a text-only, license-free web application is certainly convenient, but when I go out to get my newspaper in the morning I get (some) exercise, chat with the magazine stand owner, and I can always take my newspaper to the beach. Of course, one could argue that through chatrooms, blogs and message boards I could easily chat with any number of people while exercising at home, and then take my iPod to the beach while listening to a podcast of the latest news, but for me it’s just not the same. Reading lyrics and liner notes on my computer screen will never replace opening a CD for the first time and reading the insert as the artist intended me to see it. The overall user experience will always determine whether users continue to access content from a particular source. Record sales certainly haven’t faltered as a result of Napster or iTunes (regardless of what the record companies would like you to think) because people still like the experience of buying records.
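
For readers who haven’t looked under the hood, an RSS feed is simply an XML document that a client fetches and checks for new items. A browser-side sketch of reading one might look like the following; the feed URL is a placeholder, and a real reader would poll on a schedule, remember what it has already shown, and handle Atom feeds as well.

```typescript
// Browser-side sketch of pulling headlines from an RSS 2.0 feed.
// The feed URL is a placeholder; a real reader would poll periodically,
// track which items it has already shown, and handle Atom feeds too.
interface FeedItem {
  title: string;
  link: string;
  pubDate: string;
}

async function fetchHeadlines(feedUrl: string): Promise<FeedItem[]> {
  const response = await fetch(feedUrl);
  const xml = new DOMParser().parseFromString(await response.text(), "application/xml");

  // Each <item> element carries one story: its title, link and publication date.
  return Array.from(xml.querySelectorAll("item")).map((item) => ({
    title: item.querySelector("title")?.textContent ?? "",
    link: item.querySelector("link")?.textContent ?? "",
    pubDate: item.querySelector("pubDate")?.textContent ?? "",
  }));
}

// Usage: log each headline; a reader UI would surface only unseen items.
fetchHeadlines("https://example.com/news/rss.xml").then((items) =>
  items.forEach((i) => console.log(`${i.pubDate}: ${i.title} (${i.link})`))
);
```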

Web 2.0 sees the user as a co-developer and “harnesses the collective intelligence” of the users to build larger data models, as in the widely accessed Wikipedia (http://www.wikipedia.org). The theory is that every user is able to edit any entry, and that the more users view and change an entry, the more robust and, hopefully, accurate it becomes. There is an intriguing grassroots quality to this (indeed, I have more independent news writers bookmarked now than “legitimate” sources such as CNN), but it ventures dangerously close to the realm of “design by committee”. Once Hollywood started prescreening features for “test audiences” and re-cutting movies based on the audience’s reactions and input, we started getting some very mediocre (at best) movies that delivered predictable box-office results. The idea is that the many know what is good for the many. When it comes to politics and lunch, I’m on board. When it comes to design, I have reservations. If artists began creating demos and sending them out to their fans for input before making the final recordings, every record would start to sound the same. People know what they like, but sometimes they don’t know that they like it until they hear it. When an artist I like changes gears in their career, it is sometimes a richly enjoyable experience (there is scarcely a David Bowie album I don’t like). When they go in a direction I don’t like, I simply stop listening. While this can lead to disappointment in some cases, I certainly have no desire to tell the artist what or how to play just to be confident that I’ll like it. It is out of the surprise of hearing something different that we get the most enjoyment art and design can offer: discovery.

“Rich user experiences” can only be accomplished if the user experience is kept in mind when designing these converging technologies. By implementing “mashups” (taking two or more distinct technologies and merging them into one experience), web service providers have the ability to create engaging content. Take the often-used example of housingmaps (http://www.housingmaps.com). By harnessing the realty listings of Craigslist (http://www.craigslist.com) and the mapping technology offered by Google, housingmaps has the potential to crush many classified listing services. While this is an ingenious use of two separate systems, it will take a much more elaborate and intricate system to create a truly usable design solution that will last. When a DJ creates a “mashup” using Kylie Minogue and New Order, it’s a novelty; but it’s certainly not going to appear on my iTunes playlist.
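
Mechanically, a mashup of this kind is just one data source feeding another presentation layer. Here is a sketch under the same assumptions as the map example above; the listings endpoint and its JSON shape are invented for illustration (the real housingmaps actually scraped Craigslist’s listing pages).

```typescript
// Sketch of a housingmaps-style mashup: pull listings from one source and
// present them on a Google map. The listings URL and JSON shape are invented
// for illustration; the real housingmaps scraped Craigslist's HTML pages.
declare const google: any;

interface Listing {
  title: string;
  price: number;
  lat: number;
  lng: number;
  url: string;
}

async function showListings(map: any, listingsUrl: string): Promise<void> {
  const listings: Listing[] = await (await fetch(listingsUrl)).json();

  for (const listing of listings) {
    const marker = new google.maps.Marker({
      position: { lat: listing.lat, lng: listing.lng },
      map,
      title: `${listing.title} - $${listing.price}`,
    });

    // Re-use the pushpin-plus-bubble pattern so the data arrives inside a usable interface.
    const bubble = new google.maps.InfoWindow({
      content: `<a href="${listing.url}">${listing.title}</a><br>$${listing.price}/month`,
    });
    marker.addListener("click", () => bubble.open(map, marker));
  }
}

// Usage, assuming `map` was created as in the earlier sketch:
// showListings(map, "https://example.com/api/apartments.json");
```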

At the eminent Conference on Human Factors in Computing Systems (CHI 2006) in Montréal this April (http://www.chi2006.org), it will be interesting to see the impact of Web 2.0 on user experience design principles. Here is the premier international conference for computer-human interaction taking place in the midst of Web 2.0, which basically implies that front-end design is going to give way to back-end technology. Web 2.0 theorizes that it’s the content, not the container, that will drive the evolution of the web. But even Don Norman admits that “attractive things work better”. I can’t wait to see how researchers and practitioners from all segments of the CHI community address the issue of Web 2.0. Of particular interest will be the session entitled “Add a Dash of Interface: Taking Mash-Ups to the Next Level”, in which panelists Ben Metcalfe, project lead of the BBC’s developer network (http://backstage.bbc.co.uk), and Bret Taylor, product manager for Google Maps (http://maps.google.ca/), talk about how the remix community is going to take mash-ups to the next level by combining data sources with innovative interfaces to produce consumer-friendly mash-ups.

There will always be innovators who see two unrelated, often seemingly diametrically opposed ideas and find a way to harmonize them into a whole that is greater than the sum of its parts. While there is no doubt that the guiding principles of Web 2.0 have definite merit and are certainly a good step towards what the web should (and probably will) be, I believe it will take such visionaries to combine the principles of Web 2.0 with the principles of good user experience design, and just the right amount of art in both back-end and front-end design, to create the true future of the web.

ACM, the Association for Computing Machinery (http://www.acm.org), is an educational and scientific society uniting the world’s computing educators, researchers and professionals to inspire dialogue, share resources and address the field’s challenges. ACM strengthens the profession’s collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for life-long learning, career development, and professional networking.

###
