The Pattern


In the technology world, innovation is the first step in creating new technology, but that's only the start. Mature technology is a different beast, and it can be hard to pin down when a technology has matured. There is, however, a pattern that technology tends to follow during its growing-up phase.

Phase 1: Innovation

The first step in any new technology is its creation. This is the stage where it is innovative and early adopters start jumping on the bandwagon. For my example I will use the personal computer: when personal computers were first created they were cool and new. A good idea. At this stage most people are still reluctant and uptake is slow.

Phase 2: Competition

The competition phase is when the technology becomes more viable and more companies or entities start developing their own spin-offs. Now people are buying PCs from vendors X, Y, Z and so on. This is a messy growth stage, because each entity makes a unique product, and most of the products have unique form factors, unique underlying systems and so forth. They don't work together nicely across the different spin-offs. The aim in this stage is to establish a market leader. Whoever makes the best product wins (in a perfect world). In the real world it is the product with the best marketing and the best shady tactics for kicking the competition in the balls.

Phase 3: Chaos

At this stage many different systems are running, but they are unable to talk to each other or work together. Frustration grows and people want systems that are the same, but different. In the context of a PC you could compare it to having one system running DOS and another running Unix. You can't run Unix applications in DOS, and you can't run DOS applications in Unix. At this stage standardisation could, and should, take place next, but often it doesn't.

Phase 4: A winner emerges

In the case of operating systems for PCs, Windows won the battle and most people started using Windows. Suddenly everyone could communicate and there was a de facto standard. Competition is crushed and everyone seems pretty happy.

Phase 5: More competition

At this stage alternatives to the mainstream are developed, and they make an effort to interoperate with the de facto standard. Suddenly there is choice, but it is a hard choice to make because the alternative has to play nicely with the de facto standard. Wine and Samba are examples of Linux trying to play nice with Windows. Wine fights the good fight but suffers tremendously. In this phase the competitors try to topple the de facto standard in order to move to the next phase.

Phase 6: Standardisation

Because the de facto standard tries to make the rules, the alternatives gain enough ground to make rules of their own. They make an effort to work together on standards of their own. The leader of the pack (like Windows) now has to choose either to play along or to try and make its own standard win (OOXML vs ODF). This transformation is slow, but the real standard will usually win, because people eventually realise the value of the open standard and want to be able to switch between alternatives based on their needs, not on some scary vendor lock-in.

Phase 7: Commoditisation

At this stage it no longer matters which alternative you use, because they all play nicely together and conform to open standards. This is what is happening with browsers now. These days you can (almost) choose between browsers based on what they offer rather than what they support, because they all support the same technology. This is when a technology is really mature and real competition flourishes.

Rinse and repeat

In open source the fifth phase falls away; it is followed directly by standardisation and commoditisation. In both open and closed development the cycle then starts again from the beginning with innovation. The cycle is quicker when open technology is involved, because there is no need to keep a standard around once it is obsolete. When I was dreaming up these phases I did realise that it doesn't always happen exactly like this, although the model seems quite solid to me. The browser wars bear it out, and so do the operating system wars: Linux and Mac are gaining traction, and other operating systems are popping up seemingly out of nowhere. This phase is long and painful and the winner is still undecided. Some application vendors are already making multi-platform applications because they want to cater to the chaos.

Spamming yourself

The internet brings us social networking and information at our fingertips. Or does it? The truth is that you can search your ass off for something one day and not find it, then try again two days later with different search terms and get it. The web is just that: a web of information, and I don't mean a beautiful orb web. The web is currently more like a web woven by a spider on caffeine. Observe:

[Image: a spider's web before and after caffeine]

What this means

This means that we search in multiple places for the same things, we sign up to multiple websites for different things, and we end up with many different accounts at different places spamming us with updates in our inboxes. Luckily e-mail clients have evolved to a point where we can filter and categorise our mail, but the web itself fails to keep itself organised.


The internet is about data exchange. The web is about data presentation. Often we are interested in the data, so we go to the web to have it displayed. The displays, however, are inconsistent and spread all over the place. Now if we just had access to the data in a uniform display... if only. This is where aggregators and data APIs step in. The problem with these is that they are few and far between, and their implementations differ. So we can't just plug one application into multiple internet sites and suck their information into our own user interfaces. A desktop application that pulls in the forum sites you frequent would, for instance, be much faster than visiting each website in turn: it just pulls the data from the various sites, and the interface and functions stay the same. It's easy to work with.
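The aggregator idea above can be sketched in a few lines of Python. Everything here is hypothetical: the two "fetchers" stand in for real per-site API calls and return canned data in each site's own shape, and the adapter functions do the one job that makes aggregation possible, translating each shape into one common record.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Item:
    source: str
    title: str
    posted: datetime

# Hypothetical fetchers: a real one would call that site's API.
def fetch_forum_a():
    return [{"subject": "New release", "ts": "2009-03-01T10:00:00"}]

def fetch_forum_b():
    return [{"title": "Howto", "date": "2009-03-02"}]

# One adapter per site maps its own field names onto the common Item.
ADAPTERS = {
    "forum-a": (fetch_forum_a,
                lambda d: Item("forum-a", d["subject"],
                               datetime.fromisoformat(d["ts"]))),
    "forum-b": (fetch_forum_b,
                lambda d: Item("forum-b", d["title"],
                               datetime.fromisoformat(d["date"]))),
}

def aggregate():
    """Pull from every site and present one uniform, newest-first list."""
    items = []
    for fetch, adapt in ADAPTERS.values():
        items.extend(adapt(d) for d in fetch())
    return sorted(items, key=lambda i: i.posted, reverse=True)
```

The pain point the post describes lives in that ADAPTERS table: with no shared standard, every new site means writing another adapter by hand.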

I know I bitch a lot

I know I bitch a lot about the web vs the desktop. The reason is that standards do matter. One standard like RSS has made a big difference in the usability of websites: now you can see updates to your favourite sites very easily, without loading all the pages. Now imagine if Facebook and other sites worked similarly. Why are the web and the desktop separate in the first place? Why are we constantly booting up our computers just to launch a browser to do what is important, when the software we have should be capable of that? How can we achieve that?
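To show why RSS makes such a difference, here is a minimal reader using only Python's standard library. The feed XML is inlined for the sketch; a real reader would download it from the site's feed URL, but the parsing is the same because RSS 2.0 fixes the structure (a channel containing items with title and link elements).

```python
import xml.etree.ElementTree as ET

# A tiny inlined RSS 2.0 feed standing in for a downloaded one.
RSS = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Post one</title><link>http://example.com/1</link></item>
  <item><title>Post two</title><link>http://example.com/2</link></item>
</channel></rss>"""

def headlines(rss_text):
    """Return (title, link) pairs without rendering a single page."""
    channel = ET.fromstring(rss_text).find("channel")
    return [(item.findtext("title"), item.findtext("link"))
            for item in channel.findall("item")]
```

Because every feed uses the same element names, this one function works against any site that publishes RSS, which is exactly the property the rest of the web's data lacks.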

Dynamic UIs and Scripted applications

What makes the web the web is two things:
- Dynamic user interfaces that are stored in a single place and can be updated centrally
- Dynamic code that runs on a central server

So why can't desktop applications do the same? There have been attempts at this. Java was one, but it failed to an extent because Java applications were big downloads if they had to happen every time. Ad revenue is also a big driver for internet sites: with APIs and locally running applications the ads might disappear, leaving no revenue stream for websites.

If I was tasked with solving the problem

I would make an application framework that allowed developers to write applications stored on a central server, downloaded once and used many times. The application's script would be written in Python and/or JavaScript and would be cached locally until an update to the application is issued. There would be an option to run some script server-side (for security reasons, perhaps, or to do something like upload files to the server).
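The core of that framework is the "download once, cache until updated" rule, which can be sketched as a version check. This is a toy: the SERVER dict stands in for a real central server, the cache dict for on-disk storage, and the app name and scripts are made up.

```python
# Hypothetical central server: app name -> (version, script source).
SERVER = {"notes": (2, "print('notes app v2')")}

# Local cache, seeded with a stale copy: app name -> (version, source).
cache = {"notes": (1, "print('notes app v1')")}

def get_app(name):
    """Return the app's script, re-downloading only when the server
    reports a newer version than the one cached locally."""
    server_version, server_src = SERVER[name]
    cached = cache.get(name)
    if cached is None or cached[0] < server_version:
        # Only here does a (simulated) download happen.
        cache[name] = (server_version, server_src)
    return cache[name][1]
```

Every later call for the same version is served from the cache, which is what would let such an app avoid Java's repeated big downloads while still being updated centrally like a website.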

I believe there are frameworks and ideas trying to solve this problem currently, like Adobe AIR and others... I don't know if these will take off, or what the implications will be for web applications.

And there goes more of my rambling