IBM's centennial video features 100 people, each presenting an achievement from the year they were born.
A hundred years from now, will we be assimilated by the machines? Or will we assimilate them? These are the kinds of issues facing International Business Machines as the company begins its second 100 years.
Right now, most folks are thinking about the past 100 years at IBM, which is celebrating the centennial of its founding on Thursday. But for Bernard Meyerson, the company's vice president of innovation, it's all about the next century.
"That's pretty much what we think about," Meyerson told me today.
Meyerson has plenty to look back on, including his own not-so-small part in IBM's past innovations. When his cell phone dropped the connection during our telephone conversation, he called back and casually mentioned that he had a hand in creating the transistors built into that cell phone. And when I asked him to explain, he said, "I actually invented the technology known as silicon-germanium."
It turns out that IBM has played a behind-the-scenes role in all sorts of technologies, ranging from semiconductor development to barcodes to Wi-Fi. "IBM is a funny company," Meyerson said. "We don't force you to put a little sticker on anything that says, 'We're the smart guys.'"
But enough about the past: What about the future? "Going forward, you have tremendous opportunities," particularly when it comes to making sense of the huge databases that are being built up in all sorts of fields, Meyerson said. For example, imagine a system that can take medical records from the 285 million people around the world with diabetes, anonymize those records and analyze them, looking for potential new treatments or preventive measures.
"The fact is, there is no mechanism today that could do that, and the reason is that medical data is unstructured," he said. There's little consistency in how the records are kept, and medical conditions might be described in different ways by different doctors.
When you put together the volumes of data and the numbers of people that have to be covered in these massive, unstructured data sets, the figures mount up to quintillions of bytes. That's the challenge facing new types of computing tools — for example, the Watson supercomputer, which won a highly publicized "Jeopardy!" quiz-show match earlier this year. Now Watson is being put to work on a tougher task: making sense of medical records, which is just the kind of job Meyerson has in mind.
Still other challenges await. Watson-style computers could digest the millions of data points involved in tracking the flow of highway traffic, then develop models to predict where tie-ups will arise before they happen. The computers of the next century will have to handle a wide range of "big data" challenges, ranging from climate modeling to natural-language search engines for multimedia.
Meyerson doesn't expect Watson to answer that challenge completely. A hundred years from now, Watson will almost certainly be considered a quaint antique, much like the tabulating machines that were made back in 1911.
"Watson specifically is not the issue, as much as the combination of Watson's ability to interpret natural language, the capacity to store 'big data' and apply data analytics to come up with solutions for society," he said. "In the absence of natural language, you're going to have a short, unhappy life attempting this work. Without that key ingredient, how are you going to take the interaction of humans and machines to the next level and make it easy?"
"Star Trek" captain Jean-Luc Picard (Patrick Stewart) is fitted with gizmos for a fictional Borg transformation. The blending of humans and hardware will probably be more artful in real life by 2111.
What will the next level be in the year 2111? "Honestly, at 100 years I'm genuinely unsure," Meyerson said. The past century has shown that the pace of technological advancement can be highly variable, depending on what kinds of breakthroughs come to the fore. But if Meyerson had to bet on one particular game-changing technology, it would be the development of a direct interface between computing circuits and the human brain.
"If it turns out that there is a very natural way to communicate data back and forth without being obtrusive, then the whole world changes," he told me. This wouldn't be a Borg-like assimilation, in which humans look increasingly like machines. Rather, the machines would blend into the human body.
Does that sound like a grand dream for the next century? Or a nightmare? Feel free to chime in with your comments below.
More about the future:
- Hits and misses in the five-tech forecast
- Physics turns from fission to the future
- Take a test drive through the next 100 years
- Deep thinkers speculate about the next 50 years
- Futurist Ray Kurzweil reaches for immortality
You can connect with the Cosmic Log community by "liking" the log's Facebook page or following @b0yle on Twitter. Also, give a look to "The Case for Pluto," my book about the controversial dwarf planet and the search for new worlds.