For most of the personal computer era, the success of computers and their operating systems has been ruled by people's ability to do what they needed to do on or with the computer. We're now entering an era where the success of a computer and/or its operating system will hinge on its ability to reduce usage friction compared to the alternatives.
We’re entering an era where software has effectively become a commodity—where the majority of people don’t differentiate between the major operating systems or the most frequently used software.
To a certain degree, computers and computer technology are becoming a bit of a fashion, but that's due in large part to the fact that their software has become a commodity. Someone can surf the web, write a document, or analyse data in a spreadsheet on any computer. Most major OSes and other common software are largely the same: they have all the same features, roughly the same non-functional characteristics, and so on. Anyone can sit down at a computer, pick one of several OSes, and begin doing effectively anything they need. The success of an OS is no longer driven by the fact that the software you need or want to use is only available on one platform.
For the most part, the computer generation has been ruled by people willing to put in the time to "build" computers in order to use software. They started out physically building computers: getting the hardware parts and putting them together to form a workable computer. That evolved into off-the-shelf computers, where people still had to get the OS and software they wanted up and running before they could do anything. That, in turn, evolved into turn-key systems you could take home, turn on, and be using in no time; every-day software was already installed and ready to use.
Now we enter a new era. Anyone and their sister seems to be supplying hardware and software combinations that let people do almost 99% of software-based tasks. We've got Windows-based computers, various Linux-based computers (PCs, phones, tablets, etc.), Mac-based computers, iOS-based phones and tablets, and so on. This era is truly about usability: whatever OS or software does what you need with the least amount of friction is king. We're even seeing hybrid systems. I can run virtual machines on a Mac to get Mac software and Windows software on the same computer; I'm no longer limited to software for any one particular platform.
So, what does this mean? It means software developers, manufacturers, and OS designers need to pay renewed attention to usability. We had a quantum leap in usability back when Xerox essentially created the GUI, but things have largely stagnated in the past 20-30 years. That's not to say the basic GUI needs much improvement; rather, there needs to be a *huge* leap in basic software's usability. We take for granted that our OS provides all the usability we need, piggy-backing, for the most part, on what it has to offer. But that time is well on its way to becoming the past.
We have put up with software that is unresponsive, uses huge amounts of power, is flaky, requires weird, unintuitive actions, performs poorly, or lacks reliability and robustness.
No longer is it acceptable for things to eventually work through some weird series of steps and incantations. Software needs to work, and it needs to work well and in the way we expect it to. No longer can software simply stop responding to our requests; no longer can software require a complex series of actions to do what we want; no longer can one suite of software drive the success of an entire operating system. No longer is it acceptable to expect people to reinstall, reboot, or reformat. People can do what they want to do with a multitude of options, so the differentiating factor is becoming usability. Software, computers, and technology that make it easier for people to do what they want (from running their operating system, to surfing the web, to using a "word processor") are quickly rising to the top of the market.
A case in point has been the Mac. It doesn't really offer anything new or innovative; it's just another computer with the same typical software selection. The differentiating feature of the Mac has been usability. The Mac makes things easier to use. Multi-touch on the Mac, for example, works. You don't get a jerky, half-assed implementation of multi-touch scrolling; you get something smooth and intuitive. I can surf the web, write documents, send email, watch videos, listen to music, instant message, etc. on the Mac, and I don't have to deal with drivers from x number of companies that don't give two licks about usability.
In the next little while, software and hardware companies are either going to realize that usability is the feature people are spending their money on, and continue to be (or become) differentiating players in the market, or they're simply going to fall by the wayside. It's not going to matter that they've been in the industry for almost 40 years. There is no loyalty in business: if there's something better, easier to use, faster, cheaper, whatever, people are going to use it. It's time for some of these companies to wake up and see the writing on the wall.
It's also time for us, the users of software, to take a stand and be vocal about this stuff. We don't have to sit in silence and accept the whims of some software designer who hasn't seen the light of day in six months. If I purchase a piece of hardware that advertises "multi-touch," I expect to be able to use it; software whose appearance implies a common metaphor should function like that metaphor; and so on.
Got any gripes about software or computers? Post them here; let's start, and continue, a dialogue on what usability should be…