3) Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards
When AMD purchased graphics card maker ATI, most industry observers assumed that the combined company would start working on a CPU-GPU fusion. That work is further along than you may think.
What is it? While GPUs get tons of attention, discrete graphics boards are a comparative rarity among PC owners, as 75 percent of laptop users stick with good old integrated graphics, according to Mercury Research. Among the reasons: the extra cost of a discrete graphics card, the hassle of installing one, and its drain on the battery. Putting graphics functions right on the CPU eliminates all three issues.
Chip makers expect the performance of such on-die GPUs to fall somewhere between that of today's integrated graphics and stand-alone graphics boards--but eventually, experts believe, their performance could catch up and make discrete graphics obsolete. One potential idea is to devote, say, 4 cores in a 16-core CPU to graphics processing, which could make for blistering gaming experiences.
When is it coming? Intel's soon-to-come Nehalem chip includes graphics processing within the chip package, but off of the actual CPU die. AMD's Swift (aka the Shrike platform), the first product in its Fusion line, reportedly takes the same design approach, and is also currently on tap for 2009.
Putting the GPU directly on the same die as the CPU presents challenges--heat being a major one--but that doesn't mean those issues won't be worked out. Intel's two Nehalem follow-ups, Auburndale and Havendale, both slated for late 2009, may be the first chips to put a GPU and a CPU on one die, but the company isn't saying yet.
4) USB 3.0 Speeds Up Performance on External Devices
The USB connector has been one of the greatest success stories in the history of computing, with more than 2 billion USB-connected devices sold to date. But in an age of terabyte hard drives, USB 2.0's once-cool top speed of 480 megabits per second--a theoretical ceiling that real devices never actually reach--just doesn't cut it any longer.
What is it? USB 3.0 (aka "SuperSpeed USB") promises to increase performance by a factor of 10, pushing the theoretical maximum throughput of the connector all the way up to 4.8 gigabits per second--enough to move the contents of an entire CD-R disc roughly every second. USB 3.0 devices will use a slightly different connector, but USB 3.0 ports are expected to be backward-compatible with current USB plugs, and vice versa. USB 3.0 should also greatly enhance the power efficiency of USB devices, while increasing the juice available to them (nearly one full amp, up from the half amp that USB 2.0 supplies). That means faster charging times for your iPod--and probably even more bizarre USB-connected gear like the toy rocket launchers and beverage coolers that have been festooning people's desks.
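If you're wondering where the CD-R comparison comes from, the back-of-the-envelope math is easy to check. The short Python sketch below assumes a 700MB disc and zero protocol overhead, which no real device will ever manage, so treat the numbers as a ceiling rather than a promise.

```python
# Back-of-the-envelope check of the "CD-R per second" claim.
# Assumes the full 4.8-gigabit theoretical ceiling and no protocol
# overhead, which real hardware will never achieve.

USB2_MBPS = 480            # USB 2.0 theoretical max, megabits per second
USB3_GBPS = 4.8            # USB 3.0 theoretical max, gigabits per second
CDR_MB = 700               # capacity of a typical CD-R, megabytes (assumed)

usb3_mbytes_per_sec = USB3_GBPS * 1000 / 8   # about 600 megabytes per second
speedup = (USB3_GBPS * 1000) / USB2_MBPS     # the factor-of-10 claim

print(f"USB 3.0 peak: {usb3_mbytes_per_sec:.0f} MB/s ({speedup:.0f}x USB 2.0)")
print(f"Time to move one CD-R: {CDR_MB / usb3_mbytes_per_sec:.1f} s")
```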
When is it coming? The USB 3.0 spec is nearly finished, with consumer gear now predicted to come in 2010. Meanwhile, a host of competing high-speed plugs--DisplayPort, eSATA, and HDMI--will soon become commonplace on PCs, driven largely by the onset of high-def video. Even FireWire is looking at an imminent upgrade to speeds of up to 3.2 gigabits per second. The port proliferation may make for a baffling landscape on the back of a new PC, but you will at least have plenty of high-performance options for hooking up peripherals.
5) Wireless Power Transmission
Wireless power transmission has been a dream since the days when Nikola Tesla imagined a world studded with enormous Tesla coils. But aside from advances in recharging electric toothbrushes, wireless power has so far failed to make significant inroads into consumer-level gear.
What is it? This summer, Intel researchers demonstrated a method--based on MIT research--for throwing electricity a distance of a few feet, without wires and without any dangers to bystanders (well, none that they know about yet). Intel calls the technology a "wireless resonant energy link," and it works by sending a specific, 10-MHz signal through a coil of wire; a similar, nearby coil of wire resonates in tune with the frequency, causing electrons to flow through that coil too. Though the design is primitive, it can light up a 60-watt bulb with 70 percent efficiency.
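For a sense of the numbers at play, here's a small Python sketch. The transmit-power figure follows directly from the reported 60-watt, 70 percent demo; the coil inductance, on the other hand, is a value we made up purely to show how a 10-MHz resonance falls out of the standard LC formula.

```python
import math

# Figures reported from the Intel demo
BULB_WATTS = 60          # power delivered at the receiving coil
EFFICIENCY = 0.70        # reported link efficiency

# Power the transmitting coil must put out to light the bulb
transmit_watts = BULB_WATTS / EFFICIENCY
print(f"Transmitter must supply roughly {transmit_watts:.0f} W")

# The coupled coils behave like a tuned LC circuit: f = 1 / (2*pi*sqrt(L*C)).
# The 5-microhenry inductance below is an invented, illustrative value;
# the calculation shows what capacitance would pair with it to hit 10 MHz.
FREQ_HZ = 10e6
L_HENRY = 5e-6
c_farad = 1 / ((2 * math.pi * FREQ_HZ) ** 2 * L_HENRY)
print(f"Matching capacitance: {c_farad * 1e12:.1f} pF")
```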
When is it coming? Numerous obstacles remain, the first of which is that the Intel project uses alternating current. To charge gadgets, we'd have to see a direct-current version, and the size of the apparatus would have to be considerably smaller. Plenty of regulatory hurdles would likely have to be cleared before such a system could be commercialized, and it would have to be thoroughly vetted for safety concerns.
Assuming those all go reasonably well, such receiving circuitry could be integrated into the back of your laptop screen in roughly the next six to eight years. It would then be a simple matter for your local airport or even Starbucks to embed the companion power transmitters right into the walls so you can get a quick charge without ever opening up your laptop bag.
6) 64-Bit Computing Allows for More RAM
In 1985, Intel introduced the 80386, its first 32-bit x86 CPU. It wasn't until 1993 that the first fully 32-bit Windows OS--Windows NT 3.1--followed, officially ending the 16-bit era. Now 64-bit processors have become the norm in desktops and notebooks, though Microsoft still won't commit to an all-64-bit Windows. But it can't live in the 32-bit world forever.
What is it? 64-bit versions of Windows have been around since Windows XP, and 64-bit CPUs have been with us even longer. In fact, virtually every computer sold today has a 64-bit processor under the hood. At some point Microsoft will have to jettison 32-bit altogether, as it did with 16-bit when it launched Windows NT, if it wants to induce consumers (and third-party hardware and software developers) to upgrade. That isn't likely with Windows 7: The upcoming OS is already being demoed in 32-bit and 64-bit versions. But limitations in 32-bit's addressing structure will eventually force everyone's hand; it's already a problem for 32-bit Vista users, who have found that the OS won't use more than about 3GB of RAM. A 32-bit address simply doesn't have the bits to reach any further: it can specify at most 2^32 bytes (4GB), and once video memory and other devices claim their slice of that space, only about 3GB is left over for system memory.
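The arithmetic behind that 3GB ceiling is simple enough to spell out. In the sketch below, the 1GB set aside for devices is a typical figure we've assumed for illustration; the exact amount varies from PC to PC.

```python
# Why 32-bit Windows tops out around 3GB of usable RAM.

GB = 1024 ** 3

addressable_32 = 2 ** 32            # every address a 32-bit pointer can name
addressable_64 = 2 ** 64

# Part of the 4GB space is mapped to video memory, PCI devices, and
# firmware rather than RAM; roughly 1GB is an assumed, typical figure.
device_reserved = 1 * GB
usable_ram_32 = addressable_32 - device_reserved

print(f"32-bit address space: {addressable_32 / GB:.0f} GB")
print(f"Usable as RAM:        ~{usable_ram_32 / GB:.0f} GB")
print(f"64-bit address space: {addressable_64 / 2**60:.0f} EiB (exbibytes)")
```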
When is it coming? Expect to see the shift toward 64-bit accelerate with Windows 7; Microsoft will likely switch over to 64-bit exclusively with Windows 8. That'll be 2013 at the earliest. Meanwhile, Mac OS X Leopard is already 64-bit, and some hardware manufacturers are currently trying to transition customers to 64-bit versions of Windows (Samsung says it will push its entire PC line to 64-bit in early 2009). And what about 128-bit computing, which would represent the next big jump? Let's tackle one sea change at a time--and prepare for that move around 2025.
7) Google's Desktop OS
In case you haven't noticed, Google now has its well-funded mitts on just about every aspect of computing. From Web browsers to cell phones, soon you'll be able to spend all day in the Googleverse and never have to leave. Will Google make the jump to building its own PC operating system next?
What is it? It's everything, or so it seems. Google Checkout provides an alternative to PayPal. Street View is well on its way to taking a picture of every house on every street in the United States. And the fun is just starting: Google's early-beta Chrome browser earned a 1 percent market share in the first 24 hours of its existence. Android, Google's cell phone operating system, is hitting handsets as you read this, becoming the first credible challenger to the iPhone among sophisticated customers.
When is it coming? Though Google seems to have covered everything, many observers believe that logically it will next attempt to attack one very big part of the software market: the operating system.
The Chrome browser is the first toe Google has dipped into these waters. While a browser is how users interact with most of Google's products, making the underlying operating system somewhat irrelevant, Chrome nevertheless needs an OS to operate.
To make Microsoft irrelevant, though, Google would have to work its way through a minefield of device drivers, and even then the result wouldn't be a good solution for people who have specialized application needs, particularly most business users. But a simple Google OS--perhaps one that's basically a customized Linux distribution--combined with cheap hardware could change the PC landscape in ways that the smaller players who have toyed with open-source OSs so far haven't quite managed.
8) Gesture-Based Remote Control
We love our mice, really we do. Sometimes, however, such as when we're sitting on the couch watching a DVD on a laptop, or when we're working across the room from an MP3-playing PC, it just isn't convenient to drag a hockey puck and click on what we want. Attempts to replace the venerable mouse--whether with voice recognition or brain-wave scanners--have invariably failed. But an alternative is emerging.
What is it? Compared with the intricacies of voice recognition, gesture recognition is a fairly simple idea that is only now making its way into consumer electronics. The idea is to employ a camera (such as a laptop's Webcam) to watch the user and react to the person's hand signals. Holding your palm out flat would indicate "stop," for example, if you're playing a movie or a song. And waving a fist around in the air could double as a pointing system: You would just move your fist to the right to move the pointer right, and so on.
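To see roughly what the plumbing for such a system looks like, here's a bare-bones webcam loop written with the open-source OpenCV library. The classify_gesture function is a stand-in we invented; the actual hand detection and classification is the hard (and, in Toshiba's case, proprietary) part.

```python
import cv2  # OpenCV, assumed installed (pip install opencv-python)

def classify_gesture(gray_frame):
    """Placeholder: a real system would run hand detection and pose
    classification here. This stub simply reports that nothing was seen."""
    return None  # could return "stop", "point_left", "point_right", ...

# Map recognized gestures to playback commands, per the article's examples:
# an open palm pauses, a moving fist steers the pointer.
ACTIONS = {
    "stop": lambda: print("Pausing playback"),
    "point_left": lambda: print("Moving pointer left"),
    "point_right": lambda: print("Moving pointer right"),
}

cap = cv2.VideoCapture(0)                        # the laptop's webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gesture = classify_gesture(gray)
    if gesture in ACTIONS:
        ACTIONS[gesture]()
    cv2.imshow("camera", frame)
    if cv2.waitKey(30) & 0xFF == ord("q"):       # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```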
When is it coming? Gesture recognition systems are creeping onto the market now. Toshiba, a pioneer in this market, has at least one product out that supports an early version of the technology: the Qosmio G55 laptop, which can recognize gestures to control multimedia playback. The company is also experimenting with a TV version of the technology, which would watch for hand signals via a small camera atop the set. Based on my tests, though, the accuracy of these systems still needs a lot of work.
Gesture recognition is a neat way to pause the DVD on your laptop, but it probably remains a way off from being sophisticated enough for broad adoption. All the same, its successful development would excite tons of interest from the "can't find the remote" crowd.
9) Radical Simplification Hits the TV Business
The back of most audiovisual centers looks like a tangle of snakes that even Medusa would turn away from. Similarly, the bowl of remote controls on your coffee table appeals to no one. The Tru2way platform may simplify things once and for all.
What is it? Who can forget CableCard, a technology that was supposed to streamline home A/V installations but that ultimately went nowhere despite immense coverage and hype? CableCard just didn't do enough--and what it managed to do, it didn't do very well. Enter Tru2way.
Tru2way is a set of services and standards designed to pick up the pieces of CableCard's failure by upgrading what that earlier standard could do (including support for two-way communications features like programming guides and pay-per-view, which CableCard TVs couldn't handle), and by offering better compatibility, improved stability, and support for dual-tuner applications right out of the box. So if you have a Tru2way-capable TV, you should need only to plug in a wire to be up and running with a full suite of interactive cable services (including local search features, news feeds, online shopping, and games)--all sans additional boxes, extra remotes, or even a visit from cable-company technicians.
When is it coming? Tru2way sets have been demonstrated all year, and Chicago and Denver will be the first markets with the live technology. Does Tru2way have a real shot? Most of the major cable companies have signed up to implement it, as have numerous TV makers, including LG, Panasonic, Samsung, and Sony.
10) Curtains for DRM
What is it? It's not what it is, it's what it isn't--axing DRM means no more schemes to prevent you from moving audio or video from one form of media to another. The most ardent DRM critics dream of a day when you'll be able to take a DVD, pop it in a computer, and end up with a compressed video file that will play on any device in your arsenal. Better yet, you won't need that DVD at all: You'll be able to pay a few bucks for an unprotected, downloadable version of the movie that you can redownload any time you wish.
Petrified of piracy, Hollywood has long relied on technical means to keep copies of its output from making the rounds on peer-to-peer networks. It hasn't worked: Tools to bypass DRM on just about any kind of media are readily available, and feature films often hit BitTorrent even before they appear in theaters. Unfortunately for law-abiding citizens, DRM is less a deterrent to piracy than a nuisance that gets in the way of enjoying legally obtained content on more than one device.
When is it coming? Technologically speaking, nothing is stopping companies from scrapping DRM tomorrow. But legally and politically, resistance persists. Music has largely made the transition already--Amazon and iTunes both sell DRM-free MP3s that you can play on as many devices as you want.
Video is taking baby steps in the same direction. One recent example: RealNetworks' RealDVD software (which is now embroiled in litigation) lets you rip DVDs to your computer with one click, though the resulting copies are still wrapped in a DRM layer. Meanwhile, studios are experimenting with bundling legally rippable digital copies of their films with packaged DVDs, and online services are tiptoeing into letting downloaders burn a copy of a digital movie to disc.
That's progress, but ending all DRM as we know it is still years off. Keep your fingers crossed--for 2020.
11) Use Any Phone on Any Wireless Network
The reason most cell phones are so cheap is that wireless carriers subsidize them so you'll sign a long-term contract. Open access could change the economics of the mobile phone (and mobile data) business dramatically as the walls preventing certain devices from working on certain networks come down. We could also see a rapid proliferation of cell phone models, with smaller companies becoming better able to make headway into formerly closed phone markets.
What is it? Two years is an eternity in the cellular world. The original iPhone was announced, introduced, and discontinued in less than that time, yet carriers routinely ask you to sign up for two-year contracts if you want access to their discounted phones. (It could be worse--in other countries, three years is normal.) Verizon launched the first volley late last year when it promised that "any device, any application" would soon be allowed on its famously closed network. Meanwhile, AT&T and T-Mobile like to note that their GSM networks have long been "open."
When is it coming? Open access is partially here: You can use almost any unlocked GSM handset on AT&T or T-Mobile today, and Verizon Wireless began certifying third-party devices for its network in July (though to date the company has approved only two products). But the future isn't quite so rosy, as Verizon is dragging its feet a bit on the legal requirement that it keep its newly acquired 700-MHz network open to other devices, a mandate that the FCC agreed to after substantial lobbying by Google. Some experts have argued that the FCC provisions aren't wholly enforceable.
12) Your Fingers Do Even More Walking
Last year Microsoft introduced Surface, a table with a built-in monitor and touch screen; many industry watchers have seen it as a bellwether for touch-sensitive computing embedded into every device imaginable. Surface is a neat trick, but the reality of touch devices may be driven by something entirely different and more accessible: the Apple iPhone.
What is it? With the iPhone, "multitouch" technology (which lets you use more than one finger to perform specific actions) reinvented what we knew about the humble touchpad. Tracing a single finger on most touchpads looks positively simian next to some of the tricks you can do with two or more digits. Since the iPhone's launch, multitouch has found its way into numerous mainstream devices, including the Asus Eee PC 900 and a Dell Latitude tablet PC. Now all eyes are turned back to Apple, to see how it will further adapt multitouch (which it has already brought to its laptops' touchpads). Patents that Apple has filed for a multitouch tablet PC have many people expecting the company to dive into this neglected market, finally bringing tablets into the mainstream and possibly sparking explosive growth in the category.
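For a taste of why that second finger matters, the sketch below walks through the arithmetic behind a pinch-to-zoom gesture. The touch coordinates are invented sample values, not output from any real driver.

```python
import math

def distance(p, q):
    """Straight-line distance between two touch points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Two fingers at the start of the gesture and a moment later
# (made-up screen coordinates, in pixels).
start = [(300, 400), (340, 420)]
now = [(260, 380), (380, 440)]

# The ratio of finger spreads gives the zoom factor: greater than 1 means
# the fingers moved apart (zoom in), less than 1 means a pinch (zoom out).
zoom = distance(*now) / distance(*start)
print(f"Zoom factor: {zoom:.2f}")
```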
When is it coming? It's not a question of when multitouch will arrive, but of how quickly the trend will grow. Fewer than 200,000 touch-screen devices were shipped in 2006; iSuppli analysts have estimated that a whopping 833 million will be sold in 2013. The real guessing game is figuring out when the old "single-touch" pads become obsolete, possibly taking physical keyboards along with them in many devices.
13) Cell Phones Are the New Paper
Log in to your airline's Web site. Check in. Print out your boarding pass. Hope you don't lose it. Hand the crumpled pass to a TSA security agent and pray you don't get pulled aside for a pat-down search. When you're ready to fly home, wait in line at the airport because you lacked access to a printer in your hotel room. Can't we come up with a better way?
What is it? The idea of the paperless office has been with us since Bill Gates was in short pants, but no matter how sophisticated your OS or your use of digital files in lieu of printouts might be, they're of no help once you leave your desk. People need printouts of maps, receipts, and instructions when a computer just isn't convenient. PDAs failed to fill that need, so coming to the rescue are their replacements: cell phones.
Applications to eliminate the need for a printout in nearly any situation are flooding the market. Cellfire offers mobile coupons you can pull up on your phone and show to a clerk; Tickets.com now makes digital concert passes available via cell phone through its Tickets@Phone service. The final frontier, though, remains the airline boarding pass, which has resisted this next paperless step since the advent of Web-based check-in.
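To make the idea concrete, here's a minimal sketch of how a confirmation number might become a scannable code on a phone's screen. It relies on the third-party Python qrcode library, and every detail in the payload string is invented for illustration; real airlines use their own industry-standard barcode formats.

```python
import qrcode  # third-party library: pip install qrcode[pil]

# Invented booking details -- a real airline would encode an
# industry-standard barcode payload instead of this ad hoc string.
payload = "CONF:ABC123;NAME:DOE/JANE;FLIGHT:PW1500;SEAT:14C"

img = qrcode.make(payload)      # returns a PIL image of the QR code
img.save("boarding_pass.png")   # a phone would display this on screen
print("Wrote boarding_pass.png")
```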
14) Where You At? Ask Your Phone, Not Your Friend