Jobs called Fadell, Rubinstein, and Schiller to a secret meeting in the design studio conference room, where Ive gave a demonstration of multi-touch. “Wow!” said Fadell. Everyone liked it, but they were not sure that they would be able to make it work on a mobile phone. They decided to proceed on two paths: P1 was the code name for the phone being developed using an iPod trackwheel, and P2 was the new alternative using a multi-touch screen.
A small company in Delaware called FingerWorks was already making a line of multi-touch trackpads. Founded by two academics at the University of Delaware, John Elias and Wayne Westerman, FingerWorks had developed some tablets with multi-touch sensing capabilities and taken out patents on ways to translate various finger gestures, such as pinches and swipes, into useful functions. In early 2005 Apple quietly acquired the company, all of its patents, and the services of its two founders. FingerWorks quit selling its products to others, and it began filing its new patents in Apple’s name.
After six months of work on the trackwheel P1 and the multi-touch P2 phone options, Jobs called his inner circle into his conference room to make a decision. Fadell had been trying hard to develop the trackwheel model, but he admitted they had not figured out a simple way to dial calls. The multi-touch approach was riskier, because they were unsure whether they could execute the engineering, but it was also more exciting and promising. “We all know this is the one we want to do,” said Jobs, pointing to the touchscreen. “So let’s make it work.” It was what he liked to call a bet-the-company moment, high risk and high reward if it succeeded.
A couple of members of the team argued for having a keyboard as well, given the popularity of the BlackBerry, but Jobs vetoed the idea. A physical keyboard would take away space from the screen, and it would not be as flexible and adaptable as a touchscreen keyboard. “A hardware keyboard seems like an easy solution, but it’s constraining,” he said. “Think of all the innovations we’d be able to adapt if we did the keyboard onscreen with software. Let’s bet on it, and then we’ll find a way to make it work.” The result was a device that displays a numerical pad when you want to dial a phone number, a typewriter keyboard when you want to write, and whatever buttons you might need for each particular activity. And then they all disappear when you’re watching a video. By having software replace hardware, the interface became fluid and flexible.
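To make the idea concrete in present-day terms, here is a minimal Swift sketch using UIKit’s public UITextField and UIKeyboardType APIs. It is an assumed, illustrative example of the “software replaces hardware” principle described above, not the original iPhone code, which was never published.

```swift
import UIKit

// Illustrative sketch (assumed example, not Apple's original implementation):
// because the keyboard is software, each input context can summon the layout
// it needs, and no keyboard at all when nothing is being typed.
func configureInputs(phoneField: UITextField, messageField: UITextField) {
    phoneField.keyboardType = .phonePad        // numeric pad appears for dialing
    phoneField.placeholder = "Enter a phone number"

    messageField.keyboardType = .default       // full typewriter layout for writing
    messageField.autocorrectionType = .yes
    messageField.returnKeyType = .send         // even the Return key adapts to the activity
}

// While a video plays full-screen, resigning first responder removes every key
// from the display; no physical keyboard is left taking up space.
func hideKeyboard(for field: UITextField) {
    field.resignFirstResponder()
}
```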
Jobs spent part of every day for six months helping to refine the display. “It was the most complex fun I’ve ever had,” he recalled. “It was like being the one evolving the variations on ‘Sgt. Pepper.’” A lot of features that seem simple now were the result of creative brainstorms. For example, the team worried about how to prevent the device from playing music or making a call accidentally when it was jangling in your pocket. Jobs was congenitally averse to having on-off switches, which he deemed “inelegant.” The solution was “Slide to Unlock,” the simple and fun on-screen slider that activated the device when it had gone dormant. Another breakthrough was the sensor that figured out when you put the phone to your ear, so that your lobes didn’t accidentally activate some function. And of course the icons came in his favorite shape, the primitive he made Bill Atkinson design into the software of the first Macintosh: rounded rectangles. In session after session, with Jobs immersed in every detail, the team members figured out ways to simplify what other phones made complicated. They added a big bar to guide you in putting calls on hold or making conference calls, found easy ways to navigate through email, and created icons you could scroll through horizontally to get to different apps—all of which were easier because they could be used visually on the screen rather than with a keyboard built into the hardware.
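The ear sensor survives in today’s iPhones as the proximity sensor, and as a rough, hedged illustration of the idea, the modern public UIDevice API exposes it as follows. This is only a sketch of the concept, not anything from the original 2007 project.

```swift
import UIKit

// Hedged sketch using today's public UIDevice proximity API; it merely
// illustrates the "phone at the ear" idea described above.
UIDevice.current.isProximityMonitoringEnabled = true

let observer = NotificationCenter.default.addObserver(
    forName: UIDevice.proximityStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    // proximityState is true when something (an ear or cheek) covers the sensor;
    // the system blanks the screen so stray contact cannot trigger on-screen controls.
    print("Phone at ear:", UIDevice.current.proximityState)
}
```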