Despite the summer vacation, Liu Xiaofeng, a professor at Hohai University in east China's Jiangsu Province, and his research team have remained immersed in the laboratory, focusing on developing humanoid robots with highly expressive facial features.
Aiming to optimize human-robot emotional interaction technology, the research team has developed a new algorithm for generating facial expressions on humanoid robots.
At its 26th annual meeting on July 2, the China Association for Science and Technology listed research on emotionally intelligent digital humans and robots at the top of the 10 major cutting-edge scientific issues of 2024.
On the same day, Liu's team published their findings, a new approach for action unit (AU)-driven disentangled facial expression synthesis, in the international journal IEEE Transactions on Robotics.
Humanoid robots often struggle to convey the intricate and authentic facial expressions characteristic of humans, potentially hampering user engagement, Liu said.
"To address this challenge, we introduced a comprehensive two-stage methodology to empower our autonomous affective robot with the capacity to exhibit rich and natural facial expressions," he added.
Liu explained that in the first stage, their method generates nuanced robot facial expression images guided by AUs. In the subsequent phase, they actualize an affective robot with multifaceted degrees of freedom for facial movements, enabling it to embody the synthesized fine-grained facial expressions.
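The paper's actual models are not reproduced here, but the two-stage idea can be illustrated with a minimal numerical sketch. In the snippet below, the generator is a made-up linear stand-in for a learned expression-image model, and the AU-to-motor coupling matrix is hypothetical; none of the names, weights, or dimensions come from the paper.

```python
import numpy as np

# Illustrative sketch only: both stages use hypothetical stand-ins, since the
# team's learned models are not described in detail in this article.

NUM_AUS = 17     # action units (matching the 17 AUs mentioned below)
NUM_MOTORS = 9   # micro motors beneath the robot's facial surface

def synthesize_expression_image(au_intensities: np.ndarray) -> np.ndarray:
    """Stage 1 (stand-in): map AU intensities to a synthetic 64x64 'expression image'.

    A real system would use a generative model conditioned on AUs; here a
    fixed random basis stands in for learned weights, for illustration only.
    """
    rng = np.random.default_rng(0)
    basis = rng.standard_normal((NUM_AUS, 64 * 64))
    return (au_intensities @ basis).reshape(64, 64)

def retarget_to_motors(au_intensities: np.ndarray, au_to_motor: np.ndarray) -> np.ndarray:
    """Stage 2 (stand-in): convert AU intensities into motor positions.

    au_to_motor is a (NUM_AUS, NUM_MOTORS) coupling matrix: each AU drives a
    weighted combination of motors. Outputs are clipped to the motor range.
    """
    return np.clip(au_intensities @ au_to_motor, 0.0, 1.0)

# Example: activate one AU (index 11, standing in for AU12, the lip corner
# puller associated with smiling) at 80 percent intensity.
aus = np.zeros(NUM_AUS)
aus[11] = 0.8
coupling = np.abs(np.random.default_rng(1).standard_normal((NUM_AUS, NUM_MOTORS)))
coupling /= coupling.sum(axis=1, keepdims=True)  # normalize each AU's motor weights

image = synthesize_expression_image(aus)        # stage 1: nuanced expression image
motor_cmds = retarget_to_motors(aus, coupling)  # stage 2: embody it on the hardware
print(motor_cmds.round(3))
```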
Ni Rongrong of Changzhou University, a co-author of the paper, said that people may be more familiar with various "digital humans" and "virtual anchors," which can generate a variety of expressions in real time.
However, humanoid robots face specific constraints, such as the size and number of motors, which make this more challenging, Ni added. "For example, the humanoid robot we previously used had only nine micro motors beneath its facial surface, far fewer than the number of muscles in a human face."
Therefore, according to Ni, the team mapped the nine motors on the humanoid robot's face onto 17 AUs, so that coordinated movements could produce richer expressions and smoother transitions. A hedged sketch of this coordination idea follows below.
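As a rough illustration of how fewer motors can realize more AUs, the sketch below assumes a hypothetical weight matrix coupling the 17 AUs to the nine motors, with transitions smoothed by interpolating in AU space. The indices, weights, and easing curve are all illustrative assumptions, not the team's implementation.

```python
import numpy as np

NUM_AUS, NUM_MOTORS = 17, 9

# au_weights[i, j] = contribution of motor j to action unit i (made-up values).
rng = np.random.default_rng(42)
au_weights = np.abs(rng.standard_normal((NUM_AUS, NUM_MOTORS)))
au_weights /= au_weights.sum(axis=1, keepdims=True)

def motors_for(au_vector: np.ndarray) -> np.ndarray:
    """Map a 17-dim AU intensity vector onto 9 motor positions in [0, 1]."""
    return np.clip(au_vector @ au_weights, 0.0, 1.0)

def transition(au_start: np.ndarray, au_end: np.ndarray, steps: int = 50):
    """Yield motor commands that move smoothly from one expression to another."""
    for t in np.linspace(0.0, 1.0, steps):
        s = (1 - np.cos(np.pi * t)) / 2  # cosine easing avoids abrupt motor jumps
        yield motors_for((1 - s) * au_start + s * au_end)

neutral = np.zeros(NUM_AUS)
smile = np.zeros(NUM_AUS)
smile[[5, 11]] = 0.9  # indices standing in for AU6 + AU12, a Duchenne smile
for cmd in transition(neutral, smile, steps=5):
    print(cmd.round(2))
```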
Liu said that the team plans to expand the number of facial AUs and to endow the robot with the ability to produce delicate expressions autonomously.
Liu believes that as the emotional interaction capabilities of humanoid robots continue to advance, these robots, equipped with both high emotional and intellectual quotients, will become widely used in nursing homes, kindergartens, special education schools, and other settings.
"The humanoid robots will not only assist or replace humans in completing some tasks but also bring more emotional value," he said.