We’re coming to a season of major change in computing. People use computers in different ways and for different reasons, yet they’re forced to interact with them uniformly. Since geeks like me design software, we often assume that our users interact with it the same way we do: that they understand our terminology, our processes, our methods.
A software developer is not a regular user, and Apple knows this.
iOS has been heralded since its inception, way back with the introduction of the iPod Touch. People switch on a Touch or an iPhone and instinctively know how to use it. I’m a web developer and I can’t even say that for Windows’ Control Panel.
Techies are power users and probably make up about 5% of the computer-user landscape, yet the other 95% are crammed into the same idioms. Computers in a few years’ time will be no more about point-and-click than floppy disks were 20 years ago. In fact, these computers won’t even be known as computers (since computers are inherently geeky); they’ll be known as tablets or pads or pods.
Computers will be truly ubiquitous when we don’t realise they’re there.
I think there’s a good lesson for web developers here. The vast majority of users will avail of your application if it gets out of their way and lets them do what they want. A small percentage of people will want more functionality, but don’t let their desires come at the expense of the majority (see: multitasking on the iPhone). If you do that simple job with a degree of elegance, then you’ve got a good application.