Are CMDPs Fundamentally Harder than MDPs?

Lei Ying
Professor
Department of Electrical Engineering and Computer Science
University of Michigan

Abstract: Data-driven learning and decision-making in complex systems are often subject to a variety of operational constraints, such as safety, fairness, and budget constraints. These problems can often be formulated as Constrained Markov Decision Processes (CMDPs). The talk discusses the hardness of solving CMDPs, especially when the transition kernels, rewards, and constraints are unknown and must be learned online, and presents efficient online algorithms with regret and constraint-violation guarantees.
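
For readers unfamiliar with the setup, the following is a minimal sketch of a standard finite-horizon CMDP and of the online-learning metrics mentioned above (regret and constraint violation); the notation is generic and assumed here, not taken from the talk itself.

\[
\max_{\pi}\; V_r^{\pi} \;=\; \mathbb{E}_{\pi}\!\left[\sum_{h=1}^{H} r_h(s_h, a_h)\right]
\quad \text{subject to} \quad
V_c^{\pi} \;=\; \mathbb{E}_{\pi}\!\left[\sum_{h=1}^{H} c_h(s_h, a_h)\right] \;\ge\; \tau,
\]

where the transition kernel, reward r, and constraint cost c may all be unknown. Over K episodes of online interaction with policies \pi_1, \dots, \pi_K, performance is commonly measured by

\[
\mathrm{Regret}(K) \;=\; \sum_{k=1}^{K}\left(V_r^{\pi^*} - V_r^{\pi_k}\right),
\qquad
\mathrm{Violation}(K) \;=\; \sum_{k=1}^{K}\left(\tau - V_c^{\pi_k}\right)^{+},
\]

where \pi^* denotes an optimal feasible policy.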

Bio: Lei Ying is currently a Professor at the Electrical Engineering and Computer Science Department of the University of Michigan, Ann Arbor, an IEEE Fellow, and an Editor-at-Large for the IEEE/ACM Transactions on Networking. His research is broadly in the interplay of complex stochastic systems and big data, including reinforcement learning, large-scale communication/computing systems, private data marketplaces, and large-scale graph mining. He won the Young Investigator Award from the Defense Threat Reduction Agency (DTRA) in 2009 and the NSF CAREER Award in 2010. He was the Northrop Grumman Assistant Professor in the Department of Electrical and Computer Engineering at Iowa State University from 2010 to 2012. His papers have received the Best Paper Award at IEEE INFOCOM 2015, the Kenneth C. Sevcik Outstanding Student Paper Award at ACM SIGMETRICS/IFIP Performance 2016, and the WiOpt'18 Best Student Paper Award; his papers have also been selected for the ACM TKDD Special Issue "Best Papers of KDD 2016", Fast-Track Review for TNSE at IEEE INFOCOM 2018 (7 out of 312 accepted papers were invited), and as a Best Paper Finalist at MobiHoc 2019.
