Animatronics Lab 

In this lab (at IMI-NTU) a mobile avatar will be developed. The avatar consists of a movable robotic platform on which a realistic, average-sized, human-shaped mannequin is mounted. The mannequin has an actuated head and, in later versions, actuated limbs (i.e. arms and hands) as well. The actual form and texture of a remote person (the expert) are projected onto the mannequin. The result is a life-like appearance of that person, in a seated position, at the remote location. In this lab, a skeleton and basic animation will also be provided for the virtual human (VH) counterpart of any real person.

Key research topics:

  • Construction of a holonomic motion platform that achieves a level of local autonomy while remaining under high-level control and guidance from the remote user. The local autonomy encompasses obstacle avoidance and goal-oriented movement (e.g. "go to the hospital bed on the right").
  • Design of an average-sized, physically animated mannequin: actuators for head and limb movements, and the ability to track the remote user and reproduce his or her body movements credibly.
  • Projection of a real human onto this avatar, in particular onto the actuated limbs.
  • Evaluation of the range of conditions in which a faithful illusion is achievable, and of the quality of that illusion.
  • Individual skeleton detection for VH animation
  • Basic body and face deformations for VH animation
  • Tracking of head and limb movements of the remote user, and actuation of the mannequin's head and limbs according to the remote tracking data.
  • Sensing and recording of the avatar's environment and displaying it in the remote "control room".
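The goal-oriented movement with obstacle avoidance described above is often sketched with artificial potential fields: the goal attracts the platform, nearby obstacles repel it, and a holonomic base can follow the resulting velocity vector directly. The following is a minimal illustrative sketch, not the lab's actual controller; all gains, names, and limits are assumptions.

```python
import math

def velocity_command(pos, goal, obstacles,
                     k_att=1.0, k_rep=0.5, influence=1.0, v_max=0.6):
    """Compute a 2-D velocity command (vx, vy) for a holonomic base.

    Classic potential-field scheme: attraction toward the goal plus
    repulsion from obstacles inside the influence radius.
    Gains and the speed limit are illustrative placeholders.
    """
    # Attractive component: proportional to the vector toward the goal.
    vx = k_att * (goal[0] - pos[0])
    vy = k_att * (goal[1] - pos[1])
    # Repulsive component from each obstacle closer than `influence`.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-6 < d < influence:
            gain = k_rep * (1.0 / d - 1.0 / influence) / d ** 2
            vx += gain * dx
            vy += gain * dy
    # Saturate so the platform never exceeds its speed limit.
    speed = math.hypot(vx, vy)
    if speed > v_max:
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy
```

With an obstacle slightly off the straight path to the goal, the command still points toward the goal but is deflected away from the obstacle, which is exactly the "local autonomy under high-level guidance" behavior: the remote user supplies the goal, the platform handles the avoidance.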
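The skeleton-driven "basic body and face deformations" topic is commonly realized with linear blend skinning (LBS): each vertex of the VH mesh is deformed by a weighted blend of its bones' rigid transforms. The sketch below shows the per-vertex computation only; the bone transforms and weights are illustrative placeholders, not the lab's actual rig.

```python
def skin_vertex(rest, bones, weights):
    """Deform a rest-pose vertex with linear blend skinning.

    Each bone is a pair (R, t) of a 3x3 rotation matrix and a
    translation vector; `weights` gives this vertex's skinning
    weight per bone (assumed to sum to 1). Illustrative sketch.
    """
    out = [0.0, 0.0, 0.0]
    for (R, t), w in zip(bones, weights):
        for i in range(3):
            # Rotate the rest position, translate, then blend by weight.
            out[i] += w * (R[i][0] * rest[0] +
                           R[i][1] * rest[1] +
                           R[i][2] * rest[2] + t[i])
    return tuple(out)

# Identity rotation, used below for a simple two-bone example.
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

For example, a vertex weighted 0.5/0.5 between a stationary bone and a bone translated by one unit along x ends up halfway between the two, which is the characteristic (and sometimes artifact-prone) averaging behavior of LBS.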
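Mapping the remote user's tracked head pose onto the mannequin's actuated head typically needs at least two safeguards: clamping to the mechanism's joint limits and low-pass filtering to suppress tracker jitter. The sketch below illustrates both with exponential smoothing; the joint limit and filter constant are assumed values, not measurements from the lab's hardware.

```python
def head_servo_targets(yaw_deg, pitch_deg, prev=(0.0, 0.0),
                       limit=60.0, alpha=0.3):
    """Turn tracked head orientation (degrees) into servo targets.

    Clamps each axis to an assumed mechanical limit, then applies
    exponential smoothing toward the previous command so tracker
    noise does not shake the mannequin's head. Illustrative sketch.
    """
    def clamp(v):
        return max(-limit, min(limit, v))
    # alpha in (0, 1]: higher follows the tracker faster, lower is smoother.
    yaw = prev[0] + alpha * (clamp(yaw_deg) - prev[0])
    pitch = prev[1] + alpha * (clamp(pitch_deg) - prev[1])
    return yaw, pitch
```

Called once per tracking frame with the previous output fed back as `prev`, this converges smoothly to the clamped target, trading a small lag for stable, credible head motion on the physical avatar.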