Innovation in Tech Evolves in New Ways
By Christopher Mims for WSJ
Early reviews of Apple Inc.’s new iPhone 7 were, in a word, “meh.” Pundits praised the many improvements in the device, but a consensus emerged that Apple had not given existing iPhone owners a compelling reason to upgrade.
Why is that? Why are the iPhone and other computing devices, like PCs and tablets, not changing as quickly as they once did? There are many reasons, but the central issue is this: It is harder than ever—more technically difficult, more expensive and more time-consuming—to advance the state of the art. Our devices are so complicated that, at their most fundamental level, advancing them further pushes against the boundaries of physics.
That is no reason for despair. The technology inside these devices is evolving in a way that opens opportunities for new types of devices and new experiences.
Intel Corp., which makes the brains inside most personal computers, is in a similar boat to Apple. Both have traditionally made major upgrades to their core products every two years, but are slowing that pace.
Intel now says it will deliver new technology allowing it to cram more transistors on a chip every three years. It calls its latest processor, Kaby Lake, an “optimization” of prior technology. That means smaller improvements in PCs and related devices, contributing to slower sales in those arenas.
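To make the change of pace concrete, here is a rough back-of-the-envelope sketch, not anything from Intel’s roadmap: it assumes, as a simplification of Moore’s Law, that transistor density doubles with each new process generation, so the only difference between the two scenarios is whether a generation takes two years or three.

```python
# Back-of-the-envelope sketch: how transistor density compounds under a
# two-year versus a three-year process cadence. Assumes, as a rough
# simplification of Moore's Law, that density doubles each generation.

def density_multiplier(years: float, cadence_years: float) -> float:
    """Density growth after `years`, doubling once per `cadence_years`."""
    return 2 ** (years / cadence_years)

for horizon in (6, 12):
    fast = density_multiplier(horizon, 2)  # the historical two-year pace
    slow = density_multiplier(horizon, 3)  # Intel's stated three-year pace
    print(f"After {horizon} years: {fast:.0f}x at a 2-year cadence, "
          f"{slow:.0f}x at a 3-year cadence")

# After 6 years: 8x at a 2-year cadence, 4x at a 3-year cadence
# After 12 years: 64x at a 2-year cadence, 16x at a 3-year cadence
```

The gap compounds: over a dozen years, the slower cadence yields only a quarter of the transistor density the old pace would have delivered.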
The iPhone, too, was on schedule for a major redesign this year, two years after the iPhone 6 and 6 Plus delivered Apple’s first large-screen phones. Instead, Apple offered improved battery life, a faster processor, water resistance and a better camera—two rear-facing cameras on the iPhone 7 Plus. In the words of my colleague Geoffrey Fowler, “practical, but not jaw-dropping.”
On closer inspection, though, both companies’ technology will enable an ever-growing list of new features and applications.
Take Intel’s latest microprocessor. It may be only marginally faster at traditional computing tasks, but it comes with a graphics processor that is estimated to be twice as fast as its predecessor. The chip can also handle 4K video, with four times as many pixels as traditional HD video. On its own, that may not mean much. But in a world in which virtual reality could be the next killer app for PCs, it’s big.
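The pixel claim is simple arithmetic. The sketch below assumes “4K” means the consumer UHD resolution of 3840 by 2160 and “traditional HD” means 1080p, the common readings of those terms.

```python
# Quick check of the pixel math: 4K UHD (3840x2160) versus 1080p HD
# (1920x1080). Assumes "4K" here means consumer UHD, not cinema 4K.

hd_pixels = 1920 * 1080    # 2,073,600 pixels per frame
uhd_pixels = 3840 * 2160   # 8,294,400 pixels per frame

print(f"HD:    {hd_pixels:,} pixels")
print(f"4K:    {uhd_pixels:,} pixels")
print(f"Ratio: {uhd_pixels / hd_pixels:.0f}x")  # prints 4x
```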
Apple, too, built an improved graphics processor into the A10 chip at the heart of the new phone. That supports the new camera, which Apple marketing head Phil Schiller calls a “supercomputer for photos.” Apple touted the iPhone 7’s significantly improved graphics as a boon for gamers, but it also could lay the groundwork for an iPhone supporting virtual or augmented reality, which the company is rumored to be working on.
These features are just the beginning. Hardware makers’ primary response to the slowdown in improvements in chip performance has been to create customized chips for specific tasks.
These chips can accomplish things much faster than the general-purpose chips that have long been the “brains” of PCs and smartphones. They’re finding applications in artificial intelligence, machine vision, audio processing and other tasks, says Mark Hung, a vice president of research at Gartner Inc.
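The idea is easiest to see as a dispatch pattern. The toy sketch below, with entirely hypothetical device and task names, shows the logic: send each task to the chip built for it, and fall back to the general-purpose processor for everything else.

```python
# Toy illustration of heterogeneous computing: route each task to a
# specialized chip when one exists, else to the general-purpose CPU.
# The device names and task labels here are hypothetical.

ACCELERATORS = {
    "photo_processing": "image signal processor",
    "speech_recognition": "digital signal processor",
    "neural_inference": "neural accelerator",
}

def dispatch(task: str) -> str:
    """Return the chip that should handle `task`."""
    return ACCELERATORS.get(task, "general-purpose CPU")

for task in ("photo_processing", "neural_inference", "spreadsheet_math"):
    print(f"{task} -> {dispatch(task)}")
```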
With improved imaging and display capabilities, for example, Apple and its competitors are moving us toward a 3-D future—3-D video and 3-D playback in the form of virtual reality. Other specialized chips enable the machine vision behind self-driving cars and the deep learning, a form of artificial intelligence, that powers virtual assistants like Apple’s Siri, Microsoft Corp.’s Cortana and Amazon.com Inc.’s Alexa.
In the words of Mr. Hung, the Gartner analyst, these specialized chips are part of a trend called “heterogeneous computing,” where devices serving different purposes are powered by different chips. At Apple’s event last week, for example, the company highlighted three new chips: the A10 at the heart of the iPhone 7; an improved S2 chip in the Apple Watch Series 2; and the W1 chip in its AirPods wireless headphones.
In other words, Apple laid out an ecosystem of devices with custom chips, each intended for different tasks. It is a form of heterogeneous computing that transcends the chip, and is visible at the device level.
Apple is hardly alone on this path. Intel has been on an acquisition spree, buying chip companies that specialize in vision, artificial intelligence, servers and the Internet of Things.
With these advances, Apple and Intel are putting technology in the hands of developers who will create the next must-have capabilities, from 3-D interfaces and indoor mapping to the kinds of artificial intelligence that could give rise to new methods of interacting with computers.
It is worth remembering that “incremental” advances from Apple and its rivals have yielded some of the most valuable startups in history, from Uber Technologies Inc. to Airbnb Inc. My favorite example of how a seemingly trivial innovation can lead to a huge shift is an addition that appeared on some mobile phones as early as 2003, but had its biggest impact when Apple introduced it on the iPhone 4 in 2010. Who would have predicted that adding a front-facing camera would give rise to Snapchat Inc.?