MASEEH MATHEMATICS + STATISTICS COLLOQUIUM SERIES 2021-2022 ARCHIVE

Location: Fariborz Maseeh Hall (FMH) room 462
1855 SW Broadway
Time: 3:15pm

Friday, April 8, 2022

Speaker: Dr. Lisa Madsen, Oregon State University

Faculty Host: Dr. Daniel Taylor-Rodriguez

Title: N-Mixture Models with Application to Disease Surveillance

Abstract:
N-mixture models were originally developed to estimate animal population abundance from spatially and temporally replicated counts, where the probability of detecting an individual animal is an unknown number less than one. We will trace the evolution of N-mixture models from this original problem to the problem of estimating disease prevalence using a spatially explicit model. We apply the spatial model to estimate the number of chlamydia cases in Oregon using annual data from 2010 through 2018. This is joint work with Claudio Fuentes of Oregon State University and Ben Brintz of the University of Utah. 
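As a point of reference for the abstract above, the basic non-spatial N-mixture likelihood can be sketched in a few lines. Everything here beyond the abstract's description (Poisson abundance, binomial detection, the truncation cutoff K) is an illustrative assumption, not Dr. Madsen's spatial model:

```python
import math

def site_log_likelihood(counts, lam, p, K=100):
    """Log-likelihood of one site's replicated counts under a basic
    N-mixture model: latent abundance N ~ Poisson(lam), and each
    replicated count y ~ Binomial(N, p) with detection probability
    p < 1.  The latent N is marginalized out up to a cutoff K,
    working in logs for numerical stability."""
    log_terms = []
    for N in range(max(counts), K + 1):
        lp = -lam + N * math.log(lam) - math.lgamma(N + 1)   # log Poisson(N; lam)
        for y in counts:
            lp += (math.log(math.comb(N, y))
                   + y * math.log(p) + (N - y) * math.log(1 - p))  # log Binomial(y; N, p)
        log_terms.append(lp)
    m = max(log_terms)                                       # log-sum-exp trick
    return m + math.log(sum(math.exp(t - m) for t in log_terms))
```

In practice one would maximize this over (lam, p) across sites; the spatial model discussed in the talk adds dependence between sites on top of this core likelihood.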

Friday, April 22, 2022

Speaker: Dr. Stefan Steinerberger, University of Washington

Faculty Host: Dr. Jeffrey Ovall

Title: The Mathematical Theory of Localization

Abstract: I will discuss a problem of great importance in physics which can be phrased in rather elementary mathematical terms: sometimes eigenvectors of matrices have only very few large entries (and most other entries are either 0 or at least much, much smaller). It would be nice if one could just look at a matrix and quickly say whether this happens without actually having to compute any of the eigenvectors -- and it would be even better if one could also predict roughly where the nonzero entries are. If the matrix happens to describe a physical system (in our case this will be a Schrödinger operator), then this question has great physical significance (for example for the construction of semiconductors, but also in the construction of noise-cancelling walls). I will try to survey some of the things we know about these problems as well as some of the (many) things we do not know. The talk will be purely mathematical; no physics knowledge is required (and none will be provided) -- there will be many pictures!
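The phenomenon in this abstract can be illustrated numerically with a toy model (my own illustration, not material from the talk): a 1D discrete Schrödinger operator with a strong random potential, whose eigenvectors concentrate on very few sites. Localization can be quantified with the participation ratio PR(v) = 1 / Σᵢ vᵢ⁴, which roughly counts how many sites carry the mass of a normalized eigenvector (small PR means localized):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
V = 5.0 * rng.uniform(-1, 1, n)                     # strong random potential
H = np.diag(V) - np.eye(n, k=1) - np.eye(n, k=-1)   # discrete Schrodinger operator -Delta + V

eigvals, eigvecs = np.linalg.eigh(H)                # columns of eigvecs are eigenvectors
pr = 1.0 / np.sum(eigvecs**4, axis=0)               # participation ratio per eigenvector

print(f"median participation ratio: {np.median(pr):.1f} of {n} sites")
```

With the potential turned off the eigenvectors spread over the whole domain; with strong disorder the participation ratios collapse to a handful of sites, which is exactly the localization behavior the talk addresses.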

Bio: Stefan Steinerberger is an Associate Professor in the Department of Mathematics at the University of Washington, Seattle, with an interest in Mathematical Analysis and Applications (somewhat broadly interpreted). His research has been supported by the NSF, the Sloan Foundation, the Austrian Science Fund, and the Institute of New Economics Studies.

Friday, April 29, 2022

Speaker: Dr. Benjamin Erichson, University of Pittsburgh 

Title: Continuous Networks for Sequential Predictions

Abstract: Deep learning is playing a growing role in many areas of science and engineering for modeling time series. However, deep neural networks are known to be sensitive to various adversarial environments, and thus out of the box models are often not suitable for mission critical applications. Hence, robustness and trustworthiness are increasingly important aspects in the process of engineering new neural network architectures and models. In this talk, I am going to view neural networks for time series prediction through the lens of dynamical systems. First, I will discuss novel continuous-time recurrent neural networks that are more robust and accurate than other traditional recurrent units. I will show that leveraging classical numerical methods, such as the higher-order explicit midpoint time integrator, improves the predictive accuracy of continuous-time recurrent units as compared to using the simpler one-step forward Euler scheme. Then, I will discuss a connection between recurrent neural networks and stochastic differential equations, and extensions such as multiscale ordinary differential equations for learning long-term sequential dependencies.
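The integrator comparison mentioned in the abstract can be sketched generically. The dynamics below (a tanh unit) and the weight shapes are illustrative assumptions, not the speaker's architecture; the point is only the difference between a first-order forward Euler update and the second-order explicit midpoint update:

```python
import numpy as np

def f(h, x, W, U):
    """Continuous-time hidden-state dynamics h'(t) = tanh(W h + U x)."""
    return np.tanh(W @ h + U @ x)

def euler_step(h, x, W, U, dt):
    """Forward Euler: one evaluation of f, first-order accurate."""
    return h + dt * f(h, x, W, U)

def midpoint_step(h, x, W, U, dt):
    """Explicit midpoint: evaluate f at a half step, second-order accurate."""
    h_mid = h + 0.5 * dt * f(h, x, W, U)
    return h + dt * f(h_mid, x, W, U)
```

For the same step size, the midpoint update pays one extra evaluation of f per step in exchange for a higher-order local error, which is the accuracy gain described in the abstract.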

Bio: Ben Erichson is an Assistant Professor for Data-driven Modeling and Artificial Intelligence in the Department of Mechanical Engineering and Materials Science at the University of Pittsburgh. Before joining Pitt, he was a postdoctoral researcher in the Department of Statistics at UC Berkeley, where he worked with Michael Mahoney. He was also a postdoc in the Department of Applied Mathematics at the University of Washington (UW), working with Nathan Kutz and Steven Brunton. He earned his PhD in Statistics at the University of St Andrews. Ben's research lies broadly at the intersection of deep learning, dynamical systems, and robustness. He is also interested in leveraging tools from randomized numerical linear algebra to build modern algorithms for data-intensive applications such as fluid flows and climate science.

Faculty Host: Dr. Jeffrey Ovall

Friday, May 20, 2022

Speaker: Dr. Maria Fox, University of Oregon

Title: Parameter Spaces: Algebraic & Analytic, Past & Present

Abstract: 
Sometimes it’s easy to describe the solutions to a mathematical problem. If we need to find the roots of a polynomial, say f(x) = x^2 − 3x + 2, it’s no problem to describe the solutions: 1 and 2. But what if a mathematical problem has infinitely many solutions? We’ll see how certain constants, called parameters, can be used to organize solutions to these types of problems, and how they can also reveal important structure underlying our mathematical problems.
We’ll start by exploring examples of parameters used to classify simple objects, like circles and rectangles. Then, we'll examine several methods used in the 1800s to create parameters for elliptic curves, from both an algebraic and an analytic perspective. These ideas naturally generalize to the study of certain parameter spaces, called Shimura varieties. We'll end by discussing a surprising recent development in the study of Shimura varieties which has important applications to active areas of research in number theory and representation theory.
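The opening example of the abstract can be checked mechanically (purely illustrative):

```python
import numpy as np

# roots of f(x) = x^2 - 3x + 2, from its coefficient list [1, -3, 2]
roots = sorted(float(r) for r in np.roots([1, -3, 2]))
print(roots)  # approximately 1 and 2
```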

Faculty Host: Dr. Liubomir Chiriac

Friday, November 19, 2021

Speaker: Dr. Yanyuan Ma, Penn State University

Title: Network Functional Varying Coefficient Model

Abstract:
We consider functional responses with network dependence observed for each individual at irregular time points. To model both the inter-individual dependence as well as within-individual dynamic correlation, we propose a network functional varying coefficient (NFVC) model. The response of each individual is characterized by a linear combination of responses from its connected nodes and its own exogenous covariates. All the model coefficients are allowed to be time dependent. The NFVC model adds to the richness of both the classical network autoregression model and the functional regression models. To overcome the complexity caused by the network inter-dependence, we devise a special nonparametric least squares type estimator, which is feasible when the responses are observed at irregular time points for different individuals. The estimator takes advantage of the sparsity of the network structure to reduce the computational burden. To further conduct the functional principal component analysis, a novel within-individual covariance function estimation method is proposed and studied. Theoretical properties of our estimators are analyzed, which involve techniques related to empirical processes, nonparametrics, functional data analysis and various concentration inequalities. We analyze social network data to illustrate the power of the proposed procedure.
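A schematic simulation of the kind of model the abstract describes: each node's response combines the average of its neighbors' responses, weighted by a time-varying network effect, with its own time-varying covariate effect. All structural details here (row-normalized adjacency, the particular rho(t) and beta(t), the noise scale) are my assumptions, not the authors' specification or estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
A = (rng.uniform(size=(n, n)) < 0.1).astype(float)   # sparse random adjacency matrix
np.fill_diagonal(A, 0)                               # no self-loops
row_sums = np.maximum(A.sum(axis=1, keepdims=True), 1)
W = A / row_sums                                     # row-normalized: W @ y averages neighbors

x = rng.standard_normal(n)                           # exogenous covariate per node

def rho(t):
    return 0.3 * np.sin(2 * np.pi * t)               # time-varying network effect

def beta(t):
    return 1.0 + t                                   # time-varying covariate effect

def response(t, y_prev):
    """One evaluation of the varying-coefficient network response at time t."""
    return rho(t) * (W @ y_prev) + beta(t) * x + 0.1 * rng.standard_normal(n)

y = response(0.25, rng.standard_normal(n))
```

The sparsity of A is what the abstract's estimator exploits computationally: W @ y_prev only mixes each node with its few connected neighbors.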

Bio:
Yanyuan Ma is a Professor of Statistics at Penn State. Ma received her Ph.D. in Applied Mathematics from MIT in 1999. She received her B.S. in Mathematics from Beijing University in 1994. Her research interest is in measurement error models, dimension reduction, mixed sample problems, latent variable models, selection bias and skew-elliptical distributions, missing not at random problems and more generally semiparametrics. She is currently a fellow of the Institute of Mathematical Statistics and the American Statistical Association.

Faculty Host: Dr. Ge Zhao