Thursday, April 1, 2010

Harvard Magazine: What Every Business Can Learn from Apple's iPad

by Peter Merholz

It is likely that the computer you're using to read this is actually not very well suited to the task. The personal computer and graphical user interface (GUI), with keyboard and mouse for input and a separate display as output, were developed in the early 1980s (and popularized by Apple beginning in 1984). The predominant non-gaming use at the time was the creation of documents, such as with a word processor, spreadsheet, or desktop publishing program. Unlike text-based systems such as MS-DOS, the "What You See Is What You Get" (WYSIWYG) GUI allowed you to see on screen something that closely resembled what you got in print.

Over the past 15 years, due to the increased penetration of the Internet, the percentage of computing time spent creating documents has plummeted. Instead, most time is spent either communicating (originally just email, later joined by IM, Skype audio and video, and social networking) or consuming media (text, images, and video). Yet our computing tools haven't appreciably changed.

Which leads to the question, "What would be the best computing interface for communication and consumption?" If you were willing to forgo legacy, and design a device specifically for these uses, you could very well arrive at something like Apple's iPad. This has been a recurring theme for Apple. Whether it was the original Macintosh, or iMac, or iPod, or now iPad, Apple is surprisingly cavalier about supplanting an existing cash cow with a next generation product that responds to how the market is moving.

The one thing to learn from iPad is to ask yourself the question, "What assumptions are we, and the rest of our industry, making about customer behavior that might simply no longer be true?"

Read the rest here.

1 comment:

  1. This guy is writing about things he hardly knows or understands. Just a few things:
    1. MS-DOS could print exactly what was on the screen. It simply looked bad anywhere: a fixed-width font on a dot-matrix printer was as "reader-friendly" as it was on a low-resolution display. In addition, in the early days the latter had a migraine-inducing 60Hz refresh rate and an eye-burning CRT. All of that was fixed no later than 10 years ago: laser printers with 1200dpi resolution could produce press-quality documents, and CRTs were improved to flat screens with 100Hz+ eye-safe refresh rates, high resolution, and a stack of RF-emissions standards. We were HAPPY with all that, long before the iWhatever cult was conceived. So now, years after LCD displays became commonplace, when $100 laser printers produce press-quality documents, and people carry several comm gadgets at any given moment, even texting while they drive, asking "What would be the best computing interface for communication and consumption?" is a little late, about 10-20 years late to be precise. It is FINE as it is.

    And one to the head: if the industry wants to know what to do next, here is a tip. Make things of high QUALITY, as they used to be made. I don't want a crappy-by-design product that will be obsolete in 6 months. I will pay several times more for a good one that will last 5+ years. The quality is gone, period. Made in China is not that fun any more. I want it made in Japan (yes, pay more and use it longer), the way Toyota makes things, with kaizen and all that. At the moment things are normally designed (gasp!) by the marketing department and made by trained monkeys who don't really know what they are doing, then sold for peanuts. I want that "legacy" quality back, and I would pay more any time to have it back.

    (30+ years as an old-school-educated PhD electronics engineer, so I know what I'm talking about here)

    Robert, sorry if this is misplaced.