Movies

Collecting robot training data from human movement & Autonomous robot control by a neural network

How we set up and operate the Torobo humanoid robot with the Xsens MVN motion capture system to collect human movement data for training neural networks, followed by a demonstration of Torobo controlled by a neural network (PV-RNN): the robot reacts to the position of a red block placed in front of it and touches it with either its left or right hand. The network generates the position of each joint in real time, producing smooth, human-like trajectories.
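For readers curious about what "generating the position of each joint in real time" could look like in code, here is a minimal, hypothetical sketch of such a control loop. It is not the lab's actual software: PVRNNPolicy, send_joint_targets, the 50 Hz rate, and all numeric values are placeholder assumptions standing in for the trained PV-RNN and the Torobo interface.

```python
# Minimal sketch (not the lab's code) of a real-time loop in which a trained network
# streams joint targets each tick, producing a smooth reaching trajectory.
import time
import numpy as np

class PVRNNPolicy:
    """Placeholder for a trained PV-RNN mapping the observed block position
    and its own internal state to the next joint configuration."""
    def __init__(self, n_joints: int):
        self.hidden = np.zeros(n_joints)
        self.n_joints = n_joints

    def step(self, block_xy: np.ndarray) -> np.ndarray:
        # Toy dynamics standing in for the network's one-step joint-angle prediction:
        # drift the state toward a posture on the side where the block lies.
        side = -1.0 if block_xy[0] < 0 else 1.0          # left- or right-hand reach
        target = side * np.linspace(0.1, 0.5, self.n_joints)
        self.hidden += 0.1 * (target - self.hidden)
        return self.hidden

def send_joint_targets(q: np.ndarray) -> None:
    # Hypothetical robot interface; the real system would command Torobo here.
    print("joint targets:", np.round(q, 3))

policy = PVRNNPolicy(n_joints=7)
block_position = np.array([-0.15, 0.40])   # block to the robot's left (made-up units)

for t in range(50):                        # ~50 control ticks
    q_next = policy.step(block_position)   # network output for this timestep
    send_joint_targets(q_next)             # stream to the robot in real time
    time.sleep(0.02)                       # assumed 50 Hz control rate
```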

Controlling the Sense of Agency in Dyadic Robot Interaction: An Active Inference Approach

A simulation study by Nadine Wirkuttis and Jun Tani on dyadic imitative interactions between humanoid robots, using a variational recurrent neural network model.
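As a rough illustration of the kind of objective behind such variational recurrent network models, the sketch below computes a free energy made of a prediction-error term plus a weighted KL divergence between posterior and prior; loosely, a larger weight keeps the robot closer to its own intention, while a smaller weight lets sensory evidence from the partner dominate. All names, shapes, and values are invented for illustration and are not taken from the paper.

```python
# Hedged sketch: free energy = prediction error + w * KL(posterior || prior).
import numpy as np

def gaussian_kl(mu_q, logvar_q, mu_p, logvar_p):
    """KL divergence between two diagonal Gaussians, summed over dimensions."""
    return 0.5 * np.sum(
        logvar_p - logvar_q
        + (np.exp(logvar_q) + (mu_q - mu_p) ** 2) / np.exp(logvar_p)
        - 1.0
    )

def free_energy(pred, target, mu_q, logvar_q, mu_p, logvar_p, w):
    reconstruction = 0.5 * np.sum((pred - target) ** 2)   # prediction-error term
    complexity = gaussian_kl(mu_q, logvar_q, mu_p, logvar_p)
    return reconstruction + w * complexity

# Example with made-up numbers: compare a small and a large complexity weight.
rng = np.random.default_rng(0)
pred, target = rng.normal(size=10), rng.normal(size=10)
mu_q, logvar_q = rng.normal(size=4), np.zeros(4)
mu_p, logvar_p = np.zeros(4), np.zeros(4)
for w in (0.001, 1.0):
    print("w =", w, "free energy =", free_energy(pred, target, mu_q, logvar_q, mu_p, logvar_p, w))
```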

Vision-based goal-directed planning of a robot with development of visual attention and visual working memory

Vision-based goal-directed planning of a robot is studied using predictive coding and active inference. The video shows how a visual plan for achieving a specific goal state is generated using mechanisms developed for visual attention and visual working memory.

Paper
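To make the two named mechanisms concrete, here is a toy sketch, under assumed shapes and names, of soft spatial attention (a softmax over a feature map selects where to look) and a slot-based visual working memory that stores the attended features; the actual model in the paper is more elaborate.

```python
# Illustrative only: soft spatial attention plus a fixed-size visual working memory.
import numpy as np

def soft_attention(feature_map: np.ndarray, query: np.ndarray) -> np.ndarray:
    """feature_map: (H, W, C); query: (C,). Returns a (C,) attended feature vector."""
    scores = feature_map @ query                     # (H, W) relevance of each location
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # softmax over all spatial positions
    return np.tensordot(weights, feature_map, axes=([0, 1], [0, 1]))  # weighted sum

class VisualWorkingMemory:
    """Slot memory; oldest entries are overwritten first (assumed policy)."""
    def __init__(self, n_slots: int, dim: int):
        self.slots = np.zeros((n_slots, dim))
        self.ptr = 0

    def write(self, feature: np.ndarray) -> None:
        self.slots[self.ptr % len(self.slots)] = feature
        self.ptr += 1

    def read_all(self) -> np.ndarray:
        return self.slots.copy()

rng = np.random.default_rng(1)
frame_features = rng.normal(size=(8, 8, 16))         # stand-in for CNN features of a frame
goal_query = rng.normal(size=16)                      # stand-in for a goal-related query
memory = VisualWorkingMemory(n_slots=4, dim=16)
memory.write(soft_attention(frame_features, goal_query))
print(memory.read_all().shape)                        # (4, 16) contents available to a planner
```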

On-line direct physical interaction with Torobo

A study on the relation between cognitive and motor compliance, behavior emergence, and intentionality from the perspective of the free energy principle. A PV-RNN model represents proprioceptive information, while motor behavior is based on joint-space control with torque feedback. This experiment was conducted by Hendry Ferreira Chame and Jun Tani.
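As a hedged illustration of joint-space control with torque feedback, the sketch below implements a simple PD torque law in which lower gains make the arm more compliant and higher gains make it stiffer; the gains, limits, and joint values are made up and do not come from the study.

```python
# Minimal sketch, not the study's controller: joint-space PD torque control.
import numpy as np

def pd_torque(q, dq, q_target, kp, kd, tau_max=20.0):
    """Compute joint torques from position/velocity error; saturate for safety."""
    tau = kp * (q_target - q) - kd * dq
    return np.clip(tau, -tau_max, tau_max)

# Compare a compliant and a stiff setting for the same posture error.
q = np.array([0.2, -0.1, 0.05])        # current joint angles (rad)
dq = np.zeros(3)                        # current joint velocities
q_target = np.array([0.0, 0.0, 0.0])    # posture predicted by the model

for label, kp, kd in [("compliant", 5.0, 0.5), ("stiff", 80.0, 4.0)]:
    print(label, pd_torque(q, dq, q_target, kp=kp, kd=kd))
```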

Predicting future and reflecting past in terms of visuo-proprioceptive patterns

A simulated humanoid robot experiment using a predictive coding and active inference model for hierarchical and associative learning of visuo-proprioceptive sequential patterns. This experiment was conducted by Jungsik Hwang.
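The sketch below illustrates, with invented names and shapes, the associative flavor of such a model: a shared latent state is decoded into both visual and proprioceptive predictions, and the latent is updated by gradient descent on the prediction error (error regression), so a pattern experienced in both modalities can later be recalled from one of them.

```python
# Toy sketch of associative recall through a shared latent state (all values made up).
import numpy as np

rng = np.random.default_rng(2)
W_vision = rng.normal(scale=0.3, size=(12, 4))   # latent -> predicted visual features
W_proprio = rng.normal(scale=0.3, size=(7, 4))   # latent -> predicted joint angles

def predict(z):
    """Decode a shared latent state into visual and proprioceptive predictions."""
    return W_vision @ z, W_proprio @ z

def infer_latent_from_proprio(proprio_obs, lr=0.2, steps=3000):
    """Error regression: adjust the latent by gradient descent on proprioceptive error."""
    z = np.zeros(4)
    for _ in range(steps):
        error = W_proprio @ z - proprio_obs
        z -= lr * (W_proprio.T @ error)      # gradient of 0.5*||error||^2 w.r.t. z
    return z

z_true = rng.normal(size=4)
vision_obs, proprio_obs = predict(z_true)        # a jointly experienced pattern
z_est = infer_latent_from_proprio(proprio_obs)   # recall the latent from one modality
vision_recalled, _ = predict(z_est)              # ...and reconstruct the other
print("max visual recall error:", np.abs(vision_recalled - vision_obs).max())
```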

 

Old Movies

LINK