Involuntary Human Motion Acknowledgement to Assist Future Robotic Skills

Authors

  • Rohit Kumar, Dept. of Computer Science, Krishna Engineering College, U.P.

Keywords

Robots, Artificial Intelligence, Healthcare, Facial Expressions, Robotics

Abstract

In the ever-evolving landscape of technology, robotics and artificial intelligence have emerged as transformative forces that continue to reshape our world. Rapid progress in these fields has produced increasingly sophisticated and capable robots. No longer confined to controlled environments such as factories, robots now coexist with humans in domains ranging from healthcare and retail to our own homes. This shift from isolation to integration has ushered in a new era of human-robot interaction, defined by the challenges of nuanced, close-quarters interplay between humans and machines. At the heart of these interactions lies a fundamental concern: the need to comprehend, recognize, and respond appropriately to human behavior, including what we refer to as "Involuntary Human Motion Acknowledgment." As technology continues to evolve, it is essential to explore how advances in robotics can improve the way humans and machines interact in shared spaces. Involuntary human motions are, by definition, the natural, subconscious movements and behaviors that humans exhibit in daily life. They encompass a wide array of actions, from subtle gestures and shifts in posture to fleeting facial expressions. Although these actions may seem inconsequential, they play a pivotal role in effective human communication and collaboration, reflecting our intentions, emotions, and states of mind. Acknowledging and interpreting these involuntary motions is emerging as a critical facet of human-robot interaction: doing so enables robots to grasp human intentions and emotions more deeply, facilitating more effective communication, cooperation, and mutual understanding. The significance of integrating Involuntary Human Motion Acknowledgment into robotics can hardly be overstated, as it has the potential to change how robots operate and coexist with humans. It can enhance the safety of shared environments, enable more fluid collaboration, offer personalized assistance, and even elicit more empathetic responses from robots, making them better suited to human needs and preferences. Like any transformative technological development, however, this progression is not without challenges, including ethical considerations and privacy concerns, which must be addressed carefully to strike the right balance between improving robotic skills and respecting individual privacy and autonomy.
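To make the notion of acknowledging involuntary motion concrete, the minimal sketch below (not part of the paper) illustrates one way a robot's perception loop might flag subtle frame-to-frame movement and face regions as cues for downstream expression or posture analysis. It assumes Python with OpenCV, an attached webcam, and an illustrative motion threshold; all names and values here are assumptions for demonstration only.

# Minimal sketch: flagging involuntary-motion cues (face presence and small
# frame-to-frame movement) from a webcam stream. Assumes OpenCV is installed;
# the threshold is illustrative, not taken from the paper.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)      # default camera
prev_gray = None
MOTION_THRESHOLD = 2.0         # mean pixel difference treated as "subtle motion" (assumed value)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Detect face regions, where facial-expression analysis could later be applied.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Frame differencing as a crude proxy for small, subconscious movements.
    if prev_gray is not None:
        diff = cv2.absdiff(gray, prev_gray)
        if diff.mean() > MOTION_THRESHOLD:
            print(f"subtle motion detected; {len(faces)} face(s) in view")
    prev_gray = gray

    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

In a real system, the print statement would be replaced by whatever behavior-adaptation logic the robot uses, for example slowing its motion or re-planning when subtle human movement is detected nearby.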


Published

2023-12-21