Ten years ago, we would have been blown away by a cellphone with far more computing power and memory than the average PC had in 1999, along with a built-in camera and programs to manage every aspect of our lives. Ten years from now, the iPhone and its ilk will be antiques.
Over the next decade, the evolution of computing and the Internet will produce faster, increasingly intelligent devices. More of our possessions will contain sensors and computers that log our activities, building digital dossiers that augment our memories, help us make decisions and tame information overload.
Such ideas may sound futuristic and excessive today, and technological predictions are notoriously off-base. Short-term forecasts tend to assume too much change, and long-term forecasts underestimate the possibility of sudden, major shifts.
Even so, this vision of interconnected devices that produce and filter massive amounts of data in the 2010s is a logical progression of the Web, computers and gadgetry that emerged in the 2000s. Moore's Law, the observation that the number of transistors on a chip, and with it computing power, doubles roughly every two years at no extra cost, still rules.
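To see how quickly that doubling compounds over a decade, here is a minimal back-of-the-envelope sketch (the function name and the fixed two-year doubling period are illustrative simplifications, not part of the original article):

```python
def moores_law_factor(years, doubling_period=2):
    """Growth multiple after `years`, assuming one doubling
    every `doubling_period` years (a rough simplification)."""
    return 2 ** (years / doubling_period)

# Ten years at a two-year doubling period means five doublings:
print(moores_law_factor(10))  # 32.0
```

In other words, if the trend holds, a chip a decade from now would pack roughly 32 times the transistors of today's, which is why today's cutting-edge gadgets so reliably become tomorrow's antiques.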