What kind of cookie do you want today? Almond, chocolate? Maybe a strawberry or tea-flavored biscuit? Researchers in Japan have created a virtual reality system that turns a plain old cookie into a cornucopia of looks and tastes.
The system works by fooling the eyes and nose with visions and scents that turn a bland, flat-baked treat into one of more than a half dozen styles of cookie.
To try it out, users don a high-tech skullcap outfitted with a camera, screen, and seven pump-driven tubes filled with scented air.
The system, called Meta Cookie, overlays images of several types of cookies on plain cookies piled on a plate. Once the user selects one, the system pumps air near the user's nose with the corresponding scent. The power of the scent is adjusted as the user brings the cookie closer to their mouth.
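The distance-based scent control described above can be sketched in a few lines. This is an illustrative guess at the logic, not the Meta Cookie team's actual code; the function name, thresholds, and the seven-level pump scale are assumptions based only on the article's description.

```python
# Hypothetical sketch of Meta Cookie-style scent scaling: the closer the
# cookie gets to the nose, the harder the scent pump runs. The near/far
# thresholds and seven pump levels are illustrative assumptions.

def scent_pump_level(distance_cm, max_level=7, near_cm=2.0, far_cm=30.0):
    """Map cookie-to-nose distance to a pump intensity level (0..max_level)."""
    if distance_cm >= far_cm:
        return 0                      # too far away: pumps off
    if distance_cm <= near_cm:
        return max_level              # at the mouth: full strength
    # Linear falloff between the near and far thresholds.
    fraction = (far_cm - distance_cm) / (far_cm - near_cm)
    return round(max_level * fraction)
```

In a real system this level would drive one of the seven scented-air tubes, refreshed every time the head-mounted camera re-measures the cookie's position.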
Upon taking a bite, the user perceives the cookie not as a bland treat, but as whatever the system turned it into. The research team, led by Takuji Narumi at the University of Tokyo, reports that users are regularly fooled by the system.
While the team doesn't yet have a commercial market for their system, one can imagine it as a way to appease a group of picky kids at snack time.
Using a game controller to interact with real-world objects is definitely spooky. You push around a glorified pencil to "feel" the contours of a hand resting on a faraway table. And if that faraway hand moves, you'll feel an unseen force push back. It's as if an occult hand were taking control of the magic pencil from yards or miles away.
The push of a ghostly hand, vs. the virtual sense of touch ... it's not easy for me to say which aspect of the University of Washington's Kinect-based robo-control system is spookier. But it's easy for Fredrik Ryden to say which aspect is more useful.
"We want to give robotic surgeons a sense of touch," the visiting graduate student from Sweden told me.
The point of Ryden's contraption is not merely to manipulate objects over far distances. Heck, even a monkey can use a thought-controlled robotic arm to pick up distant objects, and surgeons have been operating remote-controlled robotic scalpels for years. But it takes a more sophisticated kind of robot to give those surgeons tactile feedback about how deep they're cutting, and create a virtual force field to keep their scalpels from straying.
The fact that Ryden's robo-touch system could demonstrate that capability after just a weekend's worth of work, using a $150 motion-sensing game device, adds to the experiment's geek appeal.
"I realized what I was doing was really cool, but it was easy — so I was surprised that nobody else had done it," Ryden said.
Now that the feat has been publicized on YouTube, in the blogosphere and beyond, it seems as if everyone is trying to do it, said Howard Jay Chizeck, an engineering professor who's co-director of the University of Washington's Biorobotics Laboratory. "The sense I have is that we're just a little bit ahead of whoever is right behind us," he joked.
How it works

Microsoft (which is a partner in the msnbc.com joint venture) sells the Kinect system as a "controller-less" controller for its Xbox video game console. Players can interact with their games by gesturing, punching, jumping or even dancing in front of an infrared laser projector and a set of infrared depth sensors. Kinect's software analyzes the patterns of scattered infrared light to create a 3-D "cloud" of data points that reflect the players' changing positions in real time.
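The depth-to-point-cloud step can be sketched with a standard pinhole-camera back-projection. This is a generic illustration of the idea, not Kinect's proprietary code, and the camera parameters below are made-up values rather than Kinect's real calibration.

```python
import numpy as np

# Minimal sketch of turning a depth image into a 3-D point cloud with a
# pinhole-camera model -- the same general idea behind Kinect's "cloud"
# of data points. Focal lengths and image size here are illustrative.

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert an (H, W) depth map in meters to an (N, 3) array of points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx            # back-project pixel columns
    y = (v - cy) * z / fy            # back-project pixel rows
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Tiny example: a 4x4 depth image with one valid reading, 2 m away.
depth = np.zeros((4, 4))
depth[1, 2] = 2.0
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

A real sensor would refresh this cloud roughly 30 times per second, which is what makes the surface feel "live" at the other end.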
Under the direction of UW's Blake Hannaford, the lab has been working for years to develop better robotic surgeons for military as well as civilian use. Surgical robots are already widely used for delicate operations such as prostate removal, but medical experts in the military (and at NASA) would love to have robots that can do a wide range of surgical operations by remote control, from hundreds of miles away.
So the Biorobotics Lab was challenged to come up with a system that could provide real-time feedback to the surgeons at the robot's controls — including a way to warn the surgeons if they were getting too close to a vital artery or some other danger zone.
Think of it as a 21st-century, virtual-reality "Operation" game with real-world consequences. Bzzzzt!
"Essentially, you're projecting a sense of touch through an image," Chizeck explained. "We'd like to have images of things generate 'force fields' around things you don't want to touch."
When the Kinect system came out in November, Ryden saw it as the perfect platform for such a device. His software translates the cloud of data points into a virtual 3-D surface. When the magic pencil (actually, a software-controlled stylus at the end of a robotic arm) "hits" the virtual surface, it moves no farther — just as if it were hitting the real surface of a faraway hand. The same thing can happen if your stylus strays up to the edge of the force field. (Though actually, if you press hard enough, you can push the stylus through the force field. It feels as if you're poking a pin through a piece of virtual cardboard.)
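The stylus-stops-at-the-surface behavior described above is a classic haptic "proxy" constraint, and a one-dimensional toy version shows the idea. This is a sketch under assumptions, not Ryden's software: the surface height, stiffness value, and function names are all invented for illustration.

```python
# Toy 1-D version of the virtual-surface constraint: the haptic proxy
# follows the stylus freely in open space, but clamps to the surface when
# the stylus tries to pass through, pushing back with a spring force.
# SURFACE_Z and STIFFNESS are illustrative, not real system parameters.

SURFACE_Z = 0.0     # the virtual surface sits at z = 0 (meters)
STIFFNESS = 400.0   # spring constant (N/m), made up for this sketch

def proxy_and_force(stylus_z):
    """Return (proxy position, feedback force) for a stylus height."""
    if stylus_z >= SURFACE_Z:
        return stylus_z, 0.0          # free space: no resistance
    # Penetration: proxy stays on the surface and the force grows with
    # depth -- which is also why a hard enough push can "pop through"
    # a thin force field, like a pin through virtual cardboard.
    penetration = SURFACE_Z - stylus_z
    return SURFACE_Z, STIFFNESS * penetration
```

The same clamp, applied against the live point-cloud surface instead of a fixed plane, is what lets the operator "feel" a faraway hand through the robotic arm.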
What it's for

The robo-touch system is currently being fine-tuned as part of the Biorobotics Lab's long-running project on surgical robotics. The beauty part is that buying Kinect systems doesn't strain the lab's hardware budget, Chizeck said. "It's 150 bucks for a system that would cost maybe $100,000 or $150,000 to reproduce," he said.
That doesn't mean low-cost Kinects will be showing up in operating rooms. The devices don't have anywhere near the resolution that a clinical robo-touch system would require. Eventually, super-sensitive touch feedback systems will be built from the ground up and put through clinical trials, as part of a long and potentially expensive development process.
"You're talking at least a decade," Chizeck said. "I think in Fredrik's lifetime, it's a sure bet, but it's really hard to predict." (Fredrik Ryden is 22 years old.)
Hannaford told me that surgical robotics may turn out to be the "killer app" for the field of haptics, which focuses on methods for translating virtual-reality shapes into a real-world sense of touch. (Maybe "killer app" isn't the best phrase to use when talking about medical procedures, but you get the point.)
Chizeck had a slightly different take: "I'll make a bet with Blake," he said. "I think there'll be a game application using haptics before there's a patient operated on."
He said robo-touch technology could also be used to create more dexterous bomb-disposal robots and deep-sea autonomous vehicles. But there's one obvious application that no one in the lab was willing to discuss: the use of haptics for long-distance, virtual-reality sex.
"I'll let you think of your own apps," Chizeck said.
Mars500 crewmembers test Russian Orlan suits before their mission started in early June 2010.
By John Roach, Contributing Writer, NBC News
A six-man crew on a virtual 500-day trip to Mars reached "orbit" around the Red Planet yesterday after 244 days of fake interplanetary travel. The crew is scheduled to "land" on Feb. 12, and will make three trips out onto simulated Martian terrain.
The all-male crew — three from Russia, one from France, one from China and an Italian-Colombian — is actually sealed inside a mock spaceship parked at the Moscow-based Institute for Medical and Biological Problems. They're part of an experiment to find out how a crew would handle the stress, claustrophobia and fatigue real astronauts would face in long-term interplanetary travel.
Yesterday, the craft "entered a circular orbit around Mars" — according to the mission scenario, which the virtual astronauts follow like a movie script.
Three of the crew members — Alexandr Smoleevskiy, Diego Urbina and Wang Yue — will enter the "lander" scheduled to reach the mock Mars on Feb. 12. The three trips onto the simulated terrain take place between Feb. 14 and 22. The crew will then head back to the mother ship on Feb. 23. The long journey home begins on Feb. 28, and the crew emerges from their mock world in early November.