Abstract:
This thesis proposes a new framework, Conditional Neural Movement Primitives (CNMPs): a learning from demonstration framework for robotic movement learning and generation, built on top of a recent deep neural architecture, Conditional Neural Processes (CNPs) [1]. CNMPs extract prior knowledge directly from the training data by sampling observations from it, and use this knowledge to predict a conditional distribution over any other target points. Specifically, CNMPs learn complex temporal, multi-modal sensorimotor relations in connection with external parameters and goals; produce movement trajectories in joint or task space; and execute these trajectories through a high-level feedback control loop. Through simulations and real robot experiments, we showed that CNMPs can learn the non-linear relations between low-dimensional parameter spaces and complex movement trajectories from a few demonstrations, and that they can also model the associations between high-dimensional sensorimotor spaces and complex motions using a large number of demonstrations. The experiments further showed that even when the task parameters were not explicitly provided, the robot could learn their influence by associating the learned sensorimotor representations with the movement trajectories. The robot, for example, learned the influence of object weight and shape by exploiting its sensorimotor space, which includes proprioception and force measurements, and was able to change the movement trajectory on the fly when one of these factors was changed through external intervention.
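As a rough illustration of the conditioning mechanism summarized above, the sketch below shows a CNP-style model that encodes observed (time, value) pairs into an averaged latent representation and decodes it, together with a query time, into a Gaussian over the target value. This is a minimal sketch under simple MLP assumptions; the class and variable names (`CNPSketch`, `t_ctx`, etc.) are illustrative and not taken from the thesis implementation.

```python
# Minimal CNP-style conditioning sketch (hypothetical names; not the
# thesis implementation). Context (t, y) pairs are encoded, averaged
# into one latent vector, and combined with a query time t* to predict
# a Gaussian distribution over y*.
import torch
import torch.nn as nn

class CNPSketch(nn.Module):
    def __init__(self, y_dim=2, hidden=128, latent=128):
        super().__init__()
        # Encoder maps each (t, y) context pair to a latent vector.
        self.encoder = nn.Sequential(
            nn.Linear(1 + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, latent))
        # Decoder maps (latent, t*) to a mean and log-std per output dim.
        self.decoder = nn.Sequential(
            nn.Linear(latent + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * y_dim))

    def forward(self, t_ctx, y_ctx, t_query):
        # t_ctx: (n_ctx, 1), y_ctx: (n_ctx, y_dim), t_query: (n_q, 1)
        r = self.encoder(torch.cat([t_ctx, y_ctx], dim=-1)).mean(dim=0)
        r = r.expand(t_query.shape[0], -1)          # share latent across queries
        out = self.decoder(torch.cat([r, t_query], dim=-1))
        mean, log_std = out.chunk(2, dim=-1)
        return mean, log_std.exp()

# Example: condition on 3 observed points, query 100 timesteps.
model = CNPSketch(y_dim=2)
t_ctx, y_ctx = torch.rand(3, 1), torch.rand(3, 2)
t_query = torch.linspace(0, 1, 100).unsqueeze(-1)
mean, std = model(t_ctx, y_ctx, t_query)
```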