
Solving for human-robot communication deficits in autonomous driving

Self-driving cars are already doing amazing things, but gesturing to other drivers isn't one of them. That's a problem that needs to be solved, according to experts at the MIT Disruption Timeline Conference.

In the fast-moving quest to make vehicles self-driving, certain features of human driving, it turns out, have proved challenging to capture. Among them: automating the interactions that happen on the road between drivers, said John Leonard, professor of mechanical and ocean engineering and a member of the Computer Science and Artificial Intelligence Laboratory at MIT.

To wit: A couple of years ago, Leonard was taking his son to school and needed to make an unprotected left turn into traffic. "The way that I did it was to wave at another driver, and she waved back," he said.

Easy enough, but autonomous vehicles can't gesture or make eye contact -- at least not in a way that's understood as instantaneously as a wave between drivers. "Despite all of the recent advances in machine learning and visual perception and object recognition, interaction with humans is really hard," he said.

Leonard, who has taken an academic leave of absence to work with the Toyota Research Institute (TRI), relayed the anecdote at the recent MIT Disruption Timeline Conference in Cambridge, Mass. He was a panelist among an elite group of academics and industry experts who specialize in autonomy and who gathered to discuss the future of their industry. Together, they shined a light on the challenges human-robot communication presents and on the delicate business of creating vehicle-to-vehicle (V2V) communication standards for the self-driving car industry.

The power of nonverbal cues in safe driving is an argument, Leonard said, for augmentation rather than automation -- at least for right now. He described augmentation, where robots help humans to complete tasks, as a blend of "the best of machine perception and human abilities." But pinning down what interactions between robots and humans produce successful -- and safe -- results isn't as easy as it might seem.

In what the Society of Automotive Engineers refers to as Level 2 and Level 3 autonomous systems, the driver must be ready to play a role when necessary. In Level 2 systems, the driver always has to be ready to take control of the vehicle. In more advanced, Level 3 systems, the driver has to intervene less frequently and is given notice when to do so. The problem is that humans suffer from "the vigilance decrement problem," Leonard said -- a decline in effortful attention over time.
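
To make that division of labor concrete, the sketch below models the Level 2 versus Level 3 distinction in a few lines of Python. The level descriptions paraphrase SAE J3016; the class and function names are invented for illustration and were not shown at the conference.

    from enum import Enum

    class SAELevel(Enum):
        """Simplified view of two SAE J3016 driving-automation levels."""
        L2_PARTIAL = 2      # system steers and accelerates; driver supervises at all times
        L3_CONDITIONAL = 3  # system drives within its domain; driver responds when notified

    def who_must_monitor(level):
        # At Level 2 the human is the fallback every moment; at Level 3
        # the system issues a takeover request before handing back control.
        if level is SAELevel.L2_PARTIAL:
            return "driver, continuously"
        return "system, until it requests driver takeover"

    for level in SAELevel:
        print(level.name, "->", who_must_monitor(level))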

"Humans are actually not very good at monitoring autonomous systems," he said. Rather than rely on the driver to stay vigilant, TRI is testing a "guardian angel" system that does the surveilling and the intervening.

"Instead of doing the handoff from [autonomous system] to human, our philosophy is that the autonomy can be vigilant," Leonard said. In TRI's model, the handoff is from human to autonomous system -- and in that direction only -- "to try to develop a highly advanced safety system," he said.

'Symbiotic autonomy'

Manuela Veloso, the Herbert A. Simon Professor in the School of Computer Science at Carnegie Mellon University and the head of the school's machine learning department, wants robots to ask for help when they need it. She called this type of human-robot interaction "symbiotic autonomy."

"I've been working a lot on the problem of autonomy, which means that the AI [artificial intelligence] systems would have the ability to perceive the environments, make decisions and actuate them," she said. But the sensing technology to identify, for example, that a floor is wet, a plant has fallen over, or what a person's hand signals might mean "was kind of out of reach for our technology," she said.


[Video: Carnegie Mellon University's Manuela Veloso on autonomous robots.]

Her research group devised a robot that could operate in an office setting. CoBots, short for collaborative robots, are autonomous robots that can navigate the building and perform tasks, such as escorting guests, but they are also limited in their capabilities. For example, they have no arms, so they cannot grasp objects or press the buttons for the elevator.

"When they are actually facing their own limitations like not having arms, they ask for help," Veloso said. "This was a major breakthrough for us -- the case of symbiotic autonomy in which they proactively ask for help."

Having the self-awareness to ask for help isn't enough, Veloso said. One of her objectives is to make AI decision-making more transparent to the humans who come in contact with robots. "With the CoBot moving around, people started wondering ... why the CoBot was going there, why it was late, what happened when it escorted someone back to the elevator?" she said. That puzzlement led to new research focused on getting the machines to explain how they arrive at the decisions they make.
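
That explainability work is ongoing research, but the underlying idea -- record the reason for each decision so the robot can answer "why" questions after the fact -- can be sketched in a few lines. Everything below is a hypothetical illustration, not Veloso's implementation.

    from datetime import datetime

    class ExplainablePlanner:
        """Toy planner that remembers why it chose each action."""

        def __init__(self):
            self.log = []   # (timestamp, action, reason) triples

        def decide(self, action, reason):
            self.log.append((datetime.now(), action, reason))
            return action

        def explain_last(self):
            _, action, reason = self.log[-1]
            return f"I chose '{action}' because {reason}."

    planner = ExplainablePlanner()
    planner.decide("take corridor B", "corridor A was blocked")
    print(planner.explain_last())   # I chose 'take corridor B' because corridor A was blocked.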

V2V communication

Communication challenges aren't limited to human-robot interaction. Car manufacturers, for example, are also experimenting with vehicle-to-vehicle and vehicle-to-infrastructure (V2I) communication.

Indeed, Toyota test sites in Japan have outfitted buildings at intersections with cameras "to allow cars to see around the corner," said Gill Pratt, Toyota executive technical advisor and CEO at TRI. Data is transmitted via the Dedicated Short-Range Communications, or DSRC, channel.

The standards for V2I communication have been established, and any car from any autonomous vehicle manufacturer can receive data packets through DSRC, according to Pratt. But standards for V2V communication are still being hammered out.
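
Real DSRC payloads are defined by the SAE J2735 standard and encoded in ASN.1, not JSON; the sketch below is only a schematic stand-in for the kind of hazard report an intersection camera might broadcast to approaching cars. All field names are invented.

    import json
    from dataclasses import asdict, dataclass

    @dataclass
    class IntersectionHazardMsg:
        """Schematic V2I packet; real DSRC messages follow SAE J2735/ASN.1."""
        intersection_id: str   # which camera-equipped corner sent this
        hazard_type: str       # e.g., "pedestrian" or "cross_traffic"
        bearing_deg: float     # direction of the hazard from the intersection
        range_m: float         # distance to the hazard in meters
        timestamp_ms: int      # when the camera observed it

    msg = IntersectionHazardMsg("tokyo_test_04", "cross_traffic", 270.0, 42.5, 1500000000000)
    payload = json.dumps(asdict(msg))   # any receiving vehicle can parse the same schema
    print(payload)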

"The challenge for us is how ... we communicate with another robot that was not developed by us," Veloso said. She called it an "economic problem," because competing manufacturers "have to share information among them," so robots developed by different manufacturers can communicate with each other.

Regardless of the competition, Pratt said V2V standards in the automotive industry are being developed. "It's not that we share everything, but we try as much as we can to share things that are related to safety and are related to traffic," he said.
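
Pratt's point suggests a filtering step at the message boundary: a vehicle's full internal state stays proprietary, and only safety- and traffic-related fields cross over to other manufacturers' cars. A hypothetical sketch, with field names invented for this example:

    # Only safety- and traffic-related fields cross the V2V boundary.
    SHAREABLE_FIELDS = {"position", "speed_mps", "heading_deg", "brake_status", "hazard_flags"}

    def to_v2v_message(internal_state):
        """Strip proprietary data before broadcasting to other makers' cars."""
        return {k: v for k, v in internal_state.items() if k in SHAREABLE_FIELDS}

    state = {
        "position": (35.68, 139.76), "speed_mps": 12.4, "heading_deg": 88.0,
        "brake_status": "engaged", "hazard_flags": [],
        "planner_internals": "...",   # proprietary -- never shared
    }
    print(to_v2v_message(state))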
