Abstract
Games and 3D films rely heavily on realistic character animation. The behaviour of a humanoid character depends on the underlying motion data, so the complexity of the character's movement determines how realistic that behaviour appears. Typically, a motion capture device provides raw data recorded beforehand from a human actor's movement. However, several challenges remain, such as character control, physical effects, and motion combination. Our proposed approach reads data from a motion capture device and transforms it into realistic behaviour in a virtual environment. Realizing this idea presents difficulties of its own, in particular capturing the user's objective and producing appropriate behaviour for the virtual human. We address these issues with a biped controller that manages the complexity of motion synthesis data when it is applied to character animation. The controller performs motion blending with inverse and forward kinematics, and is therefore able to generate realistic behaviour that follows the user's intention. It supports three main behaviours, walking, steady, and jogging, controlled by a value from 0 to 100. Our experiments show that the biped interface control can read data from motion capture, then load and control the virtual human by manipulating the joint forces in every movement of the character. As future work, external physical forces can be added to the humanoid model to produce effects such as falling down, jumping, kicking, and punching, yielding more realistic motion synthesis.
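The abstract describes a controller that blends three behaviours (walking, steady, jogging) according to a 0-100 value. The following minimal Python sketch illustrates one way such a value could map onto normalised mixing weights for pre-recorded motion clips; the piecewise mapping, the function names (`behaviour_weights`, `blend_pose`), and the per-joint data layout are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: maps a 0-100 behaviour value onto blend weights
# for three motion clips (steady, walking, jogging) and mixes per-joint
# angles by linear interpolation. Real systems would typically blend
# quaternions or use IK/FK constraints; this is a simplification.

from typing import Dict, List


def behaviour_weights(value: float) -> Dict[str, float]:
    """Piecewise-linear mapping of the 0-100 control value to clip weights.

    Assumed convention: 0 -> pure steady, 50 -> pure walking, 100 -> pure jogging.
    """
    value = max(0.0, min(100.0, value))
    if value <= 50.0:
        t = value / 50.0
        return {"steady": 1.0 - t, "walking": t, "jogging": 0.0}
    t = (value - 50.0) / 50.0
    return {"steady": 0.0, "walking": 1.0 - t, "jogging": t}


def blend_pose(poses: Dict[str, List[float]], value: float) -> List[float]:
    """Blend per-joint angles (one list per clip) using the behaviour weights."""
    weights = behaviour_weights(value)
    n_joints = len(next(iter(poses.values())))
    blended = [0.0] * n_joints
    for clip, w in weights.items():
        for j in range(n_joints):
            blended[j] += w * poses[clip][j]
    return blended


if __name__ == "__main__":
    # Toy per-joint angles (e.g. hip, knee, ankle) for each captured clip.
    clip_poses = {
        "steady":  [0.0, 0.0, 0.0],
        "walking": [0.3, 0.6, 0.1],
        "jogging": [0.5, 1.0, 0.2],
    }
    # A control value of 75 lands halfway between walking and jogging.
    print(blend_pose(clip_poses, 75.0))
```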