A robotics software company that integrates human behavior with leading robotics hardware has been awarded a $100,000 research and development contract under the Small Business Innovation Research (SBIR) program.
5D Robotics, in collaboration with Charles River Analytics, aims to create an autonomous robot that can interact with humans as a team member, physically following its teammates and responding to visual and gesture-based commands.
The project is called "Multimodal Interface for Natural Operator Teaming with Autonomous Robots (Minotaur)."
Under the Minotaur project, 5D Robotics will integrate its 5D Behavior Engine, software that provides "Follow Me" and "Guarded Motion" capabilities, with Charles River's vision-based tracking and gesture-recognition technology to process specific commands.
The software enables any robot to autonomously follow its teammates through complex environments while avoiding collisions with people and objects.
Integrating Charles River's visual recognition technology means the robot can now take cues from its human teammates and follow directions autonomously. Similar to how soldiers, police, and firefighters might get visual cues from their teammates via hand signals, the Minotaur project will enable those same recognition and response behaviors in robots.
If successful, the project will first change how warfighters interact with robots on the battlefield. Future commercialization could bring human-robot teams to a variety of sectors, including law enforcement and emergency response, with later applications in senior care and hospitality.
"The Minotaur project is advancing how robots and humans work together," said David Rowe, 5D Robotics CEO.
"As U.S. Defense and commercial budgets shrink, robots will be called upon to take on more complex tasks and work in close cooperation with humans. This project is immediately appropriate for our war fighters and future-ready for the commercial sector. The team at Charles River has created software that integrates perfectly with our own, and we believe we'll have a robust, amazingly responsive robotic software within six months."