
Forward Into Past at Meeting

TIMES STAFF WRITER

The theme was “The Next 50 Years of Computing.” So why did the more than 1,800 computer engineers, physicists, business people and other students of high technology gathered here for a three-day conference last week spend so much time reexamining the last 50?

The answer came from Gordon Bell, the opening speaker, who was one of the developers of early minicomputers and now is a senior researcher at Microsoft.

Recalling then-IBM Chairman Thomas Watson Sr.’s permanently embarrassing 1947 forecast that there would be a world market “for maybe five computers,” Bell observed, “Predictions require some history.”


Indeed, computers were still so novel at the time of Watson’s proclamation that neither he nor anyone else had enough experience with them to understand their range of capabilities.

Today, however, their power and potential--tapped and untapped--are so familiar that in many ways the technological forecasts were the least interesting aspect of the conference. Far more diverting was watching the past and future front guard of technology grapple with two other issues: the cultural impact of their scientific achievements and the very methodology of predicting (in other words, not what to predict, but how).

The occasion was the 50th-anniversary conference of the nation’s oldest organization of computer professionals: the Assn. for Computing Machinery, founded in 1947 at the very dawn of the Digital Age. The organizers, led by networking pioneer Robert Metcalfe--who invented Ethernet, co-founded 3Com Corp. and is today a sharp-tongued columnist for the trade newspaper InfoWorld--invited a dozen farsighted technologists to take a crack at the future.


Many hewed closely to the official theme.

“Artificial intelligence will happen,” said Nathan Myhrvold, group vice president for applications and content at Microsoft. He forecast that within 20 to 30 years we will have computers as powerful as the human brain.

Some predicted that 3-D imaging will allow doctors to operate on patients from thousands of miles away by remote control, and that computers will become even more integrated into household appliances, cars and other machines, even clothing and the human body itself (thus allowing us to upload and download information to and from not only our digital databases but our own brains).

But the deeper questions behind such Jetson-esque images also attracted the panelists’ attention.


Will the Internet create a society of insular, housebound homunculi communicating with one another only by wire and infrared linkup? In the face of the predicted dominance of virtual, augmented or alternative realities, a few speakers saw renewed hope for the good old-fashioned physical variety.

“We’ll be doing more things in real time, with the real world, than on a network,” said semiconductor pioneer Carver Mead of Caltech. Echoed Vinton G. Cerf, one of the inventors of the Internet and now an executive at MCI Communications: “There will still be a great deal of need for people to get out and do things. There’s too much desire for human interaction.”

Will interactivity destroy the profession of storytelling? Bran Ferren, the technology chief at Walt Disney Imagineering, argued that the do-it-yourself cult of the Internet--in which the user becomes his or her own travel agent, clothing or book salesperson, librarian, screenwriter, author or even composer--underrates the value of professional intermediaries.

Underlying the effort to predict were the equally important questions of how and why we look ahead. Bell proposed some basic principles, including: “For short-term predictions, bet against the optimists; they’re likely to be wrong,” and the Dilbertian “organizations always behave poorer than anyone can predict.”

As for why, the annals of computer science include several papers that forecast with remarkable acuity the developments that lay around the next few corners. “As We May Think,” a seminal 1945 article by the distinguished scientist Vannevar Bush, foresaw the power of computers as well as the vast computerized databases and networks of today.

Papers like Bush’s weren’t just exercises. For decades they remained guideposts for engineers and physicists trying to find their way among the myriad possibilities of digital computing. Today’s forecasters are hoping to accomplish the same thing: By peering audaciously but rationally into the future, they may help steer their juniors toward the Grail, whatever it may be.


Yet if the attendees were convinced of anything, it was that developments over the next half-century are sure to outrace anything even the most visionary among them can anticipate.

As Cerf put it: “I will not live to see what happens in 2047. But I will live to regret these trivial predictions.”
