EVENT DETAILS
Bio
Harsha Honnappa is an Associate Professor in the Edwardson School of Industrial Engineering at Purdue University, where he runs the Stochastic Systems Lab. He is an applied probabilist with strong interests in the analysis of stochastic models, theoretical statistics, stochastic optimization, and control. He is the recipient of the Lajos Takács Award for his PhD thesis on transitory queueing theory. His research is supported by a number of grants from the National Science Foundation (including an NSF CAREER award), the Office of Naval Research, and the Purdue Research Foundation, as well as by the Edwardson School of Industrial Engineering's Frontiers awards.
Talk Abstract
Doubly Stochastic Point Processes: From Representation to Nonparametric Maximum Likelihood Estimation
Doubly stochastic point process models are invaluable for modeling discrete-event systems that exhibit nonstationarity and high variability. The sample path measure of a doubly stochastic model can be viewed as an 'infinite mixture' model. The doubly stochastic Poisson process (DSPP), or Cox process, serves as the exemplar in this talk. Statistical inference for these models is challenging because the mixing probability measure must be estimated from point process observations, a problem known in the statistical literature as nonparametric maximum likelihood estimation (NPMLE). While much of the existing literature focuses on mixing measures supported on finite-dimensional subspaces, doubly stochastic processes often involve mixing measures supported on infinite-dimensional or path spaces, further complicating statistical estimation and inference.
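To make the doubly stochastic construction concrete, here is a minimal simulation sketch of a Cox process: first draw a random intensity path, then sample events conditionally on it via Lewis-Shedler thinning. The log-Gaussian random-walk intensity and all parameters below are hypothetical choices for illustration, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cox(T=10.0, grid=1000):
    """Simulate a doubly stochastic Poisson (Cox) process on [0, T].

    The random intensity is a log-Gaussian random walk -- a stand-in
    choice; any nonnegative stochastic process would do.
    """
    t = np.linspace(0.0, T, grid)
    # Step 1: draw a random intensity path lambda(t) >= 0.
    log_lam = np.cumsum(rng.normal(0.0, 0.1, size=grid))
    lam = np.exp(log_lam)
    # Step 2: conditional on lambda, thin a dominating homogeneous
    # Poisson process with rate max(lambda) (Lewis-Shedler thinning).
    lam_max = lam.max()
    n = rng.poisson(lam_max * T)
    cand = np.sort(rng.uniform(0.0, T, size=n))
    keep = rng.uniform(0.0, lam_max, size=n) <= np.interp(cand, t, lam)
    return cand[keep], (t, lam)

events, (t, lam) = simulate_cox()
print(len(events), "events on [0, 10]")
```

Averaging over many draws of the intensity path is what produces the 'infinite mixture' view: each realization of lambda indexes one Poisson law, and the Cox process mixes over them.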
This talk addresses two key questions:
1. The "representation" power of DSPP or Cox models: What classes of point processes can be exactly represented by a Cox model? We will revisit J. F. C. Kingman's classic result on representing renewal processes as DSPPs and present a new, simplified proof of his main finding.
2. Estimating the mixing measure from sample paths: After reviewing classical work on estimating finitely parameterized mixing models using the expectation-maximization (EM) algorithm, we will explore tight lower-bounding variational inference (VI) approximations to the NPMLE objective in more complicated settings that use neural networks to parametrize the mixing measure. We will provide approximation guarantees and insights into the accuracy of these approximate solutions.
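As a reminder of the classical finitely parameterized setting that the second question starts from, here is a minimal EM sketch for a two-component Poisson mixture on synthetic count data. The rates, weights, and sample size are hypothetical; the talk's infinite-dimensional mixing measures are far beyond this toy case.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic counts from a two-component Poisson mixture
# (hypothetical ground truth: weight 0.3 on rate 2, 0.7 on rate 8).
true_w, true_rates = 0.3, (2.0, 8.0)
z = rng.random(2000) < true_w
x = rng.poisson(np.where(z, true_rates[0], true_rates[1]))

# Precompute log(x!) for the Poisson log-pmf (pure NumPy, no SciPy).
logfact = np.concatenate(([0.0], np.cumsum(np.log(np.arange(1, x.max() + 1)))))

def pois_pmf(x, rate):
    return np.exp(x * np.log(rate) - rate - logfact[x])

w, lam = 0.5, np.array([1.0, 10.0])  # initial guesses
for _ in range(200):
    # E-step: posterior responsibility of component 0 for each count.
    p0 = w * pois_pmf(x, lam[0])
    p1 = (1.0 - w) * pois_pmf(x, lam[1])
    r = p0 / (p0 + p1)
    # M-step: update the mixing weight and component rates.
    w = r.mean()
    lam = np.array([np.sum(r * x) / np.sum(r),
                    np.sum((1.0 - r) * x) / np.sum(1.0 - r)])

print("estimated weight:", round(float(w), 2), "rates:", np.round(lam, 1))
```

The EM iteration maximizes the mixture likelihood over a two-atom mixing measure; the VI approach discussed in the talk replaces this finite parameterization with a neural-network-parametrized mixing measure and a variational lower bound on the NPMLE objective.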
TIME Tuesday, October 29, 2024, 11:00 AM - 12:00 PM
LOCATION A230, Technological Institute
CONTACT Kendall Minta kendall.minta@gmail.com
CALENDAR Department of Industrial Engineering and Management Sciences (IEMS)