Imagine a robot that looks, well, human. Imagine that robot knows when you’re sad. Imagine it can help you reach things, join you in a game of cards, even remind you to call a friend.
Researchers in the field of human-robot interaction can imagine it, and they are. Researchers like Anthony Whitehead, an associate professor and director of Carleton’s School of Information Technology, cross-appointed to the School of Computer Science (SCS) and current chair of the Human Computer Interaction (HCI) program. Whitehead, along with two graduate students, Colin Kilby and Ashleigh Fratesi, is building and fine-tuning the applications and human-like elements for a humanoid “assistive and companion” robot.
Kilby, an HCI master’s student, is studying human-robot interaction through head-mounted devices such as Google Glass and other wearable motion-tracking systems, using them to control an android that mimics the wearer’s movements and acts as a physical avatar. He has spent the last year building the basic machine and is beginning a study to determine the best control mechanisms.
Fratesi, who is pursuing a master’s in computer science, has been focusing on emotion recognition.
“You can categorize emotions into broad categories,” explains Whitehead. “A companion robot can use its vision system to determine if a person appears angry or sad and, for example, alleviate a loneliness problem by engaging in dialogue. It might ask, ‘How are you feeling?’ And it’s not unfeasible that it could play a card game or chess, or it could facilitate communication with other people by suggesting calling a family member.”
Whitehead’s long-term goal is to build a robotic assistive system the elderly can use and relate to. An older person who has strength or co-ordination issues could operate the robot by simply making a reaching motion, and the robot would act as a physical avatar by extending its arms towards an object. Whitehead also believes robotic control sets need to be simpler for people who may have movement or cognitive challenges that come with age.
“A lot of people find themselves in solitary environments, as their partners die, for instance. What can we do from an interface point of view to purpose the robot as a companion?”
Elements such as unified motion tracking and other “assistive” operations could be perfected within the next decade, or sooner, says Whitehead, and in 10 to 20 years, the “companion” elements could be fully developed.
“Twenty years from now, I would expect that assistive robots will be fairly common,” says Whitehead, who has built a predominantly plastic robot body for about $3,000 with a 3D printer.
He also expects that helper robots, combined with anthropomorphized companion androids, will be prevalent.
In general, Whitehead’s research focuses on practical applications of machine learning, pattern matching and computer vision. His unified motion tracking and gesture recognition systems have focused on gaming, but are applicable to other fields where body-area sensor networks can be useful.
Whitehead also teaches undergraduate courses in interactive multimedia and design, and graduate courses in systems and computer engineering, as well as human-computer interaction and design. His students are involved in complex research into interactive applications and computer vision systems.
His current research draws on multiple emerging technologies, including humanoid robotics and 3D printing, and could have implications for psychology and medicine, particularly geriatric care. His work on the interaction with and control of robotic devices could improve quality of life for those in need, he says.
The assistive-companion robot project, which has been in the workshop for about 18 months, will also attract large numbers of master’s and doctoral students interested in working on artificial intelligence and basic recognition systems for years to come, Whitehead says.
“With more capability in place, I expect a broader interest, especially from engineering, psychology and cognitive science, which will bring in new ideas.”
Many studies have explored the contribution of companion robots to patients’ well-being, as well as robots intended to physically assist patients. One of the most famous examples is PARO, a robotic seal, which has been shown to help elderly patients cope with dementia and Alzheimer’s disease. These robots, however, tend either to focus on the emotional connection without physically assisting the patient, or to be purely assistive with no emotional connection.
“Usually,” notes Whitehead, “these robots address only one specific need when it is possible they could be addressing more. We are merging these two ideas.”