[2021-12-10] Dr. Chun-Yi Lee, National Tsing Hua University, "Virtual-to-Real Transfer for Intelligent Robotics"

Post date: 2021-12-01
Title: Virtual-to-Real Transfer for Intelligent Robotics
Date: 2021-12-10 2:20pm-3:30pm
Location: R103, CSIE
Speaker: Dr. Chun-Yi Lee, National Tsing Hua University
Hosted by: Prof. Lung-Pan Cheng


To train artificial intelligence agents to act in the real world, a popular approach is reinforcement learning, which encourages agents to interact with an environment to gain experience and reach certain goals, such as obstacle avoidance or target following. Training such agents typically requires extensive interaction with the environment to accumulate sufficient experience, and may demand a prohibitive amount of time in the real world, preventing them from being deployed efficiently. In addition, training in the real world may damage the robot agents themselves as well as other objects in the environment. As a result, in recent years a number of researchers have leveraged computer simulation to synthesize photo-realistic and diverse samples as training data, and have investigated methods to transfer the knowledge learned in such simulated worlds to the real world. This branch of research, commonly called "sim-to-real transfer", has been actively studied in the robotics community, owing to the availability of well-maintained robotics simulators and the practical need to keep real robots from wearing out through the repeated use required to collect large-scale data. Sim-to-real transfer allows us to effectively learn to control robots in a variety of unseen environments, which is also crucial for our projects, in which data are collected by operating a mobile robot.
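The interaction-driven training loop described above can be sketched with a minimal, self-contained example. The toy corridor environment, reward values, and hyperparameters below are illustrative assumptions, not part of the speaker's work; a real sim-to-real pipeline would replace the toy environment with a robotics simulator and the tabular agent with a deep policy.

```python
import random

# Toy 1-D corridor: the agent starts at position 0 and must reach the goal.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # move left / move right

def step(state, action):
    """Environment transition: returns (next_state, reward, done)."""
    next_state = max(0, min(N_STATES - 1, state + action))
    reward = 1.0 if next_state == GOAL else -0.01  # small per-step cost
    return next_state, reward, next_state == GOAL

# Tabular Q-learning: the agent gains experience over repeated episodes.
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration
random.seed(0)
for episode in range(500):
    s, done = 0, False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < epsilon:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2, r, done = step(s, ACTIONS[a])
        # Temporal-difference update toward the bootstrapped target.
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# The learned greedy policy moves right (action index 1) toward the goal.
policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES)]
```

The point of the sketch is the cost the abstract highlights: even this trivial task needs hundreds of episodes of interaction, which is cheap in simulation but expensive and risky on physical hardware.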
Despite these advantages, a key challenge in sim-to-real transfer is the domain gap: if it is not properly addressed, control policies learned in simulated environments cannot be effectively migrated to the real environment. While meta-state representations are a promising way to bridge the gap between simulated and real worlds, they require a visual perception model that can transform raw images into such representations. In particular, when semantic segmentation is chosen as the meta-state representation, we need a model that segments both simulated and real-world images in a consistent fashion. A further challenge arises here, as there is no guarantee that a semantic segmentation model trained in a simulated world will also perform satisfactorily in the real environment. Moreover, ground-truth annotations are not necessarily available for real-world data, making it even more difficult to migrate such a model to the real world. This necessitates unsupervised domain adaptation (UDA), which allows a model trained on labeled data in a source domain to be migrated to a target domain without any labeled data. In this talk, we intend to discuss these issues.
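To make the UDA idea concrete, here is a minimal sketch of one classic baseline, CORAL-style correlation alignment, which matches the second-order statistics of source (simulated) and target (real) features without any target labels. This particular method and the synthetic feature matrices are assumptions for illustration; the talk itself may cover different UDA techniques.

```python
import numpy as np

# Synthetic stand-ins for, e.g., segmentation-network feature activations:
# Xs comes from the labeled source (simulation) domain,
# Xt from the unlabeled target (real-world) domain.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(500, 8)) @ rng.normal(size=(8, 8))
Xt = rng.normal(size=(500, 8)) @ rng.normal(size=(8, 8))

def coral(Xs, Xt, eps=1e-5):
    """Whiten source features, then re-color them with the target covariance."""
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(Xt.shape[1])

    def mat_pow(C, p):
        # Matrix power of a symmetric PSD matrix via eigendecomposition.
        w, V = np.linalg.eigh(C)
        return V @ np.diag(w ** p) @ V.T

    return Xs @ mat_pow(Cs, -0.5) @ mat_pow(Ct, 0.5)

Xs_aligned = coral(Xs, Xt)
# After alignment, the source covariance matches the target covariance,
# so a classifier trained on (Xs_aligned, source labels) transfers better.
```

The design point: because the alignment only needs unlabeled target features, it fits exactly the setting in the abstract where real-world ground-truth annotations are unavailable.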

Chun-Yi Lee is an Associate Professor of Computer Science at National Tsing Hua University (NTHU), Hsinchu, Taiwan, and the supervisor of Elsa Lab. He received the B.S. and M.S. degrees from National Taiwan University, Taipei, Taiwan, in 2003 and 2005, respectively, and the M.A. and Ph.D. degrees from Princeton University, Princeton, NJ, USA, in 2009 and 2013, respectively, all in Electrical Engineering. He joined the Department of Computer Science at NTHU as an Assistant Professor in 2015. Before joining NTHU, he was a senior engineer at Oracle America, Inc., Santa Clara, CA, USA, from 2012 to 2015.
Last modification time: 2021-12-01 10:22 AM
