The Robots Are Coming!

UT researchers are building intelligent robots that can play soccer, handle weapons, help the disabled, and more. Let's meet five of them.

By Rose Cahalan and Andrew Roush
Photos by Adam Voorhes | Styling by Robin Finlay
Videos by John Fitch | Site by Anna Donlan


Robots That Play Soccer

The power chords of “Eye of the Tiger” surge in the background as a 2-foot-tall robot walks toward a small orange ball in front of a soccer net.

Its movements are slow and halting, and after each step the robot sways slightly from side to side as it regains balance. Excited cries from the audience turn nervous as it becomes clear that the robot’s trajectory is off—if it continues straight, it’ll walk right past the ball. “Oh no!” someone shouts. “He’s gotta go left!” For a moment, the robot is still. Then it veers left, approaches the ball, and taps it into the net. The crowd goes wild.

Welcome to robot soccer. It’s an intensely competitive game that tests advanced programming skills, and UT is really, really good at it. Led by computer science professor Peter Stone, the Austin Villa Robot Soccer Team—the name is a play on the British soccer club Aston Villa—won the RoboCup World Championships in two divisions in 2012 in Mexico City, defending its title from the previous year. The Tower was lit orange in the team’s honor.

Entertainment value aside, robot soccer is a serious academic pursuit. The seemingly simple tasks required in a soccer game—turning, kicking, running, keeping an eye on the ball—prove incredibly complex to teach robots. And they’re skills with broad applications beyond soccer. “The general challenge with robotics is to create agents that can do three high-level things,” Stone says. “They have to sense the world robustly, make decisions, and execute actions. All of those are embodied in robot soccer.”
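Stone’s three capabilities map onto what roboticists often call a sense-decide-act loop. The sketch below is only a toy illustration of that loop; every name and number in it is invented for this article and has nothing to do with Austin Villa’s actual code.

```python
import math
import random

# Toy sense-decide-act loop in a 2-D point "world." Purely illustrative;
# all names and values are invented, not drawn from Austin Villa's code.

def sense(robot_xy, ball_xy, noise=0.02):
    """Sensing: return a noisy estimate of where the ball is."""
    return (ball_xy[0] + random.gauss(0, noise),
            ball_xy[1] + random.gauss(0, noise))

def decide(robot_xy, ball_estimate, kick_range=0.3):
    """Decision: kick if the ball is close enough, otherwise keep walking."""
    return "kick" if math.dist(robot_xy, ball_estimate) < kick_range else "walk"

def act(robot_xy, ball_estimate, action, step=0.1):
    """Action: take one walking step toward the ball, or take the shot."""
    if action == "kick":
        print("Kick toward the net!")
        return robot_xy
    dx, dy = ball_estimate[0] - robot_xy[0], ball_estimate[1] - robot_xy[1]
    norm = math.hypot(dx, dy)
    return (robot_xy[0] + step * dx / norm, robot_xy[1] + step * dy / norm)

robot, ball = (0.0, 0.0), (1.0, 0.5)
for _ in range(20):
    estimate = sense(robot, ball)
    robot = act(robot, estimate, decide(robot, estimate))
```

In the real system each of those three steps is a research problem in its own right: the sensing has to survive noisy cameras, the decisions have to account for teammates and opponents, and every action has to keep a two-legged robot upright.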

The students on the UT team are all doing advanced artificial intelligence work. In 2013 alone, Stone and his colleagues published seven scientific papers related to robot soccer research in some way. He marvels at how quickly the field has advanced. For one thing, the robots rarely catch on fire anymore. “At the first RoboCup in 1997, there was a robot that went up in smoke partway through the game,” he remembers.

This spring, Stone’s team will be busy preparing for the 2014 RoboCup in Brazil. “There’s always lots of things to fix, very little sleep, and a lot of adrenaline,” he says. He’ll take it over an academic conference any day. “When you go to a conference and present your research, people might ask questions,” Stone says. “But at RoboCup, when your research works out, everybody cheers!” —R.C.

"They have to sense the world robustly, make decisions, and execute actions. All of those are embodied in robot soccer.”

Name: Aldebaran Nao

Height: 23 inches

Uniform Colors: Magenta or blue

Tactical Advantage: Body has two cameras, nine tactile sensors, and eight pressure sensors

Nicknames: Michael Mozzarella, Gene Parmesan, Arnold Schwarzencheddar

Robots That Handle Nukes

At the height of the Cold War, the United States had 30,000 nuclear warheads.

Today, there are roughly 5,000, and the U.S. promised in a 2009 treaty to eventually cut that number to 1,700. Dismantling and safeguarding all those highly radioactive weapons is a daunting task—one the Nuclear Robotics Group at UT is hoping to make a little easier.

Because nuclear work is dangerous for humans, robots are perfect for the job, says robotics expert Mitchell Pryor, MS ’99, PhD ’02, who shares leadership of the lab with nuclear engineer Sheldon Landsberger. “Robots are getting more sophisticated and less expensive,” Pryor says. “Meanwhile, the safety requirements dictating the dosage of radiation a worker can be exposed to are getting more stringent. So we’ve hit a crossroads where robots really become deployable.”

Pryor and Landsberger’s team is building highly advanced robotic arms that can retrieve nuclear materials from a vault, open and close canisters, and perform other tasks. In one demonstration, two large robotic arms gently hold a raw egg, applying just the right amount of pressure to maneuver the egg without cracking its shell.
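Holding an egg without cracking it is, at its core, a force-control problem: the gripper closes until a sensed contact force reaches a gentle target, rather than moving to a fixed position. The snippet below is a bare-bones sketch of that idea with made-up gains and a toy contact model; it is not the Nuclear Robotics Group’s controller.

```python
# Minimal proportional force control for a gripper, illustrative only.
# Real dual-arm systems run far more sophisticated controllers; every
# gain and limit below is a made-up placeholder.

TARGET_FORCE_N = 2.0   # enough force to hold the egg, not enough to crack it
GAIN = 0.0005          # meters of closure per newton of force error
MAX_STEP_M = 0.0002    # cap each adjustment to keep the motion gentle

def grip_step(measured_force_n, finger_gap_m):
    """Nudge the finger gap so the measured contact force approaches the target."""
    error = TARGET_FORCE_N - measured_force_n
    step = max(-MAX_STEP_M, min(MAX_STEP_M, GAIN * error))
    return finger_gap_m - step   # positive error -> close the gripper slightly

# Example: simulate a stiff contact where force grows as the gap shrinks
gap = 0.045
for _ in range(200):
    force = max(0.0, (0.044 - gap) * 2000.0)   # toy contact model, 2000 N/m
    gap = grip_step(force, gap)
print(f"settled force about {max(0.0, (0.044 - gap) * 2000.0):.2f} N")
```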

Of course, the materials these robots will eventually be handling are a lot more sensitive than eggs. Security is one of the biggest concerns at nuclear sites, and that’s another issue the team is addressing. “We’re also working on vision,” Landsberger says. “Imagine a warden patrolling a prison, checking to make sure every inmate is in their cell, and raising the alarm if something goes wrong. There’s a great deal of potential for robotic technologies to do those kinds of tasks.”

These robots are meant to assist—not replace—human workers. In fact, the lab has become a much-needed pipeline of highly skilled (human) graduates. In a partnership between UT and Los Alamos National Laboratory in New Mexico, most doctoral students in the lab move from Austin to Los Alamos to finish their dissertations while working full-time at the facility, where they are encouraged to stay. “They desperately need new workers at Los Alamos,” Pryor says. “Many of the employees there are close to retirement. We have four of our people working there full-time right now.”

Pryor speaks about his work with a sense of national pride. He believes his robots won’t just minimize risk to humans—they’ll be one small part of the global effort to handle nuclear material safely. “We have this huge responsibility to the rest of the world,” he says, “to be good stewards.” —R.C.

Name: Yaskawa Motoman SIA5 with Custom AX Controller

Estimated Cost: $175,000 per dual-arm system

Claim to Fame: Can collect data and send commands 1,000 times per second

A Robot That Wants to Be Friends

“Go on, shake her hand,” urges Luis Sentis.

Tentatively, I take the robot’s hand and give it a shake. The robot, whose name is Dreamer, responds to my grasp and shakes heartily up and down.

To my surprise, it feels more natural than half of the human handshakes I’ve experienced—no awkward limp wrist or intimidating iron grip here. After I release its hand, the robot looks at me with puppy-dog eyes, then waves a cheerful hello.

Dreamer’s home is a lab on the third floor of UT’s Engineering Teaching Center, and Sentis, a mechanical engineering professor, is her creator. The art of the handshake is only one of many social skills she—everyone calls her “she”—has mastered. That’s because Dreamer was designed to interact comfortably and safely with humans. “We get people coming into the lab and before we even say anything, they want to high-five her,” Sentis says. “That doesn’t happen with most robots.”

From her broad forehead to her button nose, everything about Dreamer’s appearance was engineered to be cute. Sentis’ fondness for Japanese cartoons is visible in her wide-eyed face, as is his 10-year-old son’s favorite anime character, a sharp-witted heroine named Nausicaa. Add in some burnt-orange hair, a pair of emerald eyes, and color-changing ears that droop or perk up like a puppy’s, and the result is an adorable creature straight out of the movie Wall-E.

In fact, Dreamer is no stranger to Hollywood. Last summer, Sentis got a call from a producer working on Transformers 4 in Austin. “We heard about your robot,” the producer said mysteriously, “and we want to put her in our movie.” Dreamer obliged by shooting a scene alongside actor Mark Wahlberg for the film, scheduled to hit theaters in June.

What makes Dreamer special, Sentis says, is that she’s both approachable and functional. Industrial robots that can do work, such as the machines on factory assembly lines, are nothing new, and neither are robots built to look like us (just ask any Star Wars fan). But successfully marrying the two is one of the next big frontiers in robotics. “What is unique about us,” Sentis explains, “is that we are addressing safety together with performance.” In other words, Dreamer is a robot that can carry out tasks and move around—plus pat your back without crushing it.

Someday, Sentis says, robots built with lessons learned from Dreamer will do real good in the world. He reels off possibilities: delivering medication in refugee camps, preventing a nuclear meltdown by going where humans can’t go inside a compromised power plant, and even working alongside astronauts on a NASA mission to another planet. “I really believe,” Sentis says, “that robots will save money and save lives.” —R.C.

Name: Dreamer

Special Power: Color-changing ears can be pink, blue, white, or green

Sharp Vision: Eyes are high-resolution FireWire cameras

Claim to Fame: Appeared in Transformers 4 with Mark Wahlberg

"We get people coming into the lab and before we even say anything, they want to high-five her. That doesn't happen with most robots."

Robots That Lend a Hand

When Ashish Deshpande broke his finger playing Frisbee, he was shocked that a plastic toy could do that much damage to his body.

A PhD candidate at the time, he went to the hospital, thinking doctors might reset what then seemed to be a dislocation. It turned out to be much more complicated.

“I had a small tumor in my hand,” Deshpande says. He was in disbelief. “A Frisbee shouldn’t break fingers,” he notes, a laugh simmering under his voice. It was a tiny fleck of abnormal tissue that, while benign, had sapped the strength from the bones in his right pinky. Surgeons removed the tumor, but he needed physical therapy in order to move his finger again.

That’s where things got interesting. The rehabilitation process meant he had to carefully flex his finger, breaking through the scar tissue that had developed under the skin. Over and over, he tore the fibrous knot of collagen, gradually regaining movement. It was painful, but Deshpande was fascinated by what he was learning about his own hand. Six muscles reaching up into the arm pull on tendons that can finely manipulate the fingers to do everything from cradling a crying newborn to throwing the perfect knuckleball. It’s the most nuanced, most articulated marionette the world has ever known. Way better than Pinocchio. A mechanical engineer, Deshpande is drawn to complicated devices—including his own fingers.

Not long after, Deshpande came to UT’s Cockrell School of Engineering and set up a lab where he and a team of students are developing technologies to help elderly and disabled people with mobility, including studying the intricate mechanics of the hand in hopes of eventually developing a human-like prosthetic. Their prototype is called ACT. It’s essentially a model skeleton hand with six separate motors mimicking arm muscles. Those motors pull on a carefully woven web of string that moves the fingers the way tendons would.
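The string-and-motor arrangement is a form of tendon-driven actuation: reeling in a tendon that wraps a joint pulley of radius r by a length Δl turns that joint by roughly Δl / r. The toy sketch below shows that mapping with invented pulley radii and an arbitrary split of tendon travel across joints; it is not ACT’s actual geometry or control code.

```python
import math

# Toy tendon-driven finger: each motor reels in a "string" tendon, and each
# joint it crosses rotates by (tendon travel) / (pulley radius). The radii
# and the fixed travel split are invented placeholders, not the ACT hand.

JOINT_RADII_M = [0.010, 0.008, 0.006]   # pulley radius at knuckle, middle, tip joints

def joint_angles(tendon_travel_m, share=(0.5, 0.3, 0.2)):
    """Split one tendon's travel across three joints and convert to angles (radians)."""
    return [(tendon_travel_m * s) / r for s, r in zip(share, JOINT_RADII_M)]

angles = joint_angles(0.015)   # reel in 15 mm of string
print([f"{math.degrees(a):.1f} deg" for a in angles])
```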

The challenge for Deshpande’s team is to understand the subtle calculations that make a hand sweep across a piano’s keyboard, or twist and stretch across the keyboard on your laptop. Deshpande holds a half-empty coffee mug in front of me. “If it were a really light cup,” he explains, “I wouldn’t apply as much force.” He lifts it again, his rehabilitated pinky curled around the handle. If it were full of coffee, he says, he’d need a “completely different strategy” to deal with the weight and movement of the liquid.

If current robotic hands are the loud, technical equivalent of a Wagner opera, the human hand is an expressive, dynamic Chopin concerto. Most robotic graspers are meant for precise, singular tasks. Deshpande hopes to create prosthetics that are more fluid and dynamic.

But most people who lose a hand have to adapt, to learn to live life without that expression. There are many robot hands that look like human hands, but none that truly behave like the real thing. “I want to change that,” Deshpande says. —A.R.


Robots for Everyone

Scientific research is often slow, tedious, and unpredictable. But still.

At this point, you’d think we’d be able to do better than the Roomba, an appliance mostly used to produce cat videos on YouTube. You’d think we could cobble together something like Rosie from The Jetsons, a robot that carries your bag for you, brings you coffee, or just gives you directions. Computer science professor Peter Stone’s team thinks you may not have to wait much longer.

“We just submitted a paper related to this task,” Stone says, sitting in his neat, new office in the Bill & Melinda Gates Computer Science Complex, just past the indoor robot soccer field, “to have a robot greet you at the front door,” he continues, “and ask you where you’re trying to get to, and to give you some guidance.” Now that’s more like it.

Stone’s “robots for everyone” project, part of the College of Natural Sciences’ Freshman Research Initiative, has big ambitions. From freshmen to PhD candidates, the project gives researchers a sandbox to try out new ideas in robotics, from gesture recognition to navigating in three dimensions. The plan is to have the robots roam freely around the Gates Complex, helping students, staff, and visitors with simple tasks. It’s an open-ended project: Stone’s group is looking to see what it can do with these helper bots—and how people naturally interact with them.

The project is still looking for funding, but the team is already digging into their research. In a glass-walled room across from his office sits the physical embodiment of Stone’s electronic testing ground. In the brightly lit lab, a student is fiddling with one of the early models. It looks like an old boom box with wheels.

Clearly, they’re not building Terminators yet, but they are hoping to answer a lot of questions. What does it take for a robot to respond to voice commands? To recognize a familiar face? Or find its way home?

When I ask him how useful robots can be in everyday life, Stone reminds me of the (possibly apocryphal) quote from former IBM CEO Thomas Watson. “I think there is a world market for maybe five computers,” the future supercomputer namesake allegedly said. As Stone spoke, I counted at least three computers in the room, including the one in my pocket. Could robots become household fixtures, too?

“We still haven’t figured out what the great uses are going to be for them,” he says. But that’s exactly the point. Give the robots some basic skills, turn them loose, and most importantly, see how people use them.

“No one’s going to be inspired by a robot that’s just out there doing nothing,” Stone says. —A.R.

