MASEEH MATHEMATICS + STATISTICS COLLOQUIUM SERIES 2022-2023 ARCHIVE

November 10, 2022 (Thursday)
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. Kristen Hendricks, Rutgers University
Faculty Host: Dr. Steven A. Bleiler

Title: Homology cobordism and Heegaard Floer homology

Abstract: The homology cobordism group consists of integer homology spheres under connected sum, modulo an equivalence relation called homology cobordism. We review some history of this group and discuss applications of Heegaard Floer homology to its structure. In particular, we show that the homology cobordism group is not generated by Seifert fibered spaces. This is joint work with J. Hom, M. Stoffregen, and I. Zemke.

January 20, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. J.J.P. Veerman
Faculty Host: Dr. Steven A. Bleiler

Title: The Bak-Sneppen Model of Evolution
Abstract: We investigate a class of models related to the Bak-Sneppen (BS) model, initially proposed to study evolution. In this model, random fitnesses in [0, 1] are associated with N agents located at the vertices of a graph G, in our case a cycle. The fitnesses are ranked from worst (0) to best (1). At every time-step the agent with the lowest fitness and its neighbors on the graph G are replaced by new agents with random fitnesses. After more than 30 years this simple model still defies exact solution, yet it captures some forms of complex behavior observed in physical and biological systems.

We use order statistics to define a dynamical system on the set of cumulative distribution functions R : [0, 1] → [0, 1] that mimics the evolution of the distribution of the fitnesses in these models. We then show that this dynamical system reduces to a 1-dimensional polynomial map. Using an additional conjecture we can then find the limiting distribution as a function of the initial conditions. Roughly speaking, this ansatz says that the bulk of the replacements in the Bak-Sneppen model occur in a decreasing fraction of the population as the number N of agents tends to infinity. Agreement with experimental results of the BS model is excellent.
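
For readers who want to experiment, the following minimal Python sketch simulates the Bak-Sneppen update rule on a cycle and tabulates the resulting fitness distribution. The population size, number of steps, and histogram bins are illustrative choices, not parameters from the talk.

    import numpy as np

    def bak_sneppen(n_agents=200, n_steps=100_000, seed=0):
        """Simulate the Bak-Sneppen model on a cycle of n_agents vertices."""
        rng = np.random.default_rng(seed)
        fitness = rng.random(n_agents)              # initial fitnesses in [0, 1]
        for _ in range(n_steps):
            worst = np.argmin(fitness)              # agent with the lowest fitness
            # replace the worst agent and its two neighbors on the cycle
            for i in (worst - 1, worst, (worst + 1) % n_agents):
                fitness[i] = rng.random()
        return fitness

    final = bak_sneppen()
    # After a long run most fitnesses sit above a critical threshold
    # (reported in the literature to be roughly 0.667 for the cycle).
    hist, edges = np.histogram(final, bins=10, range=(0.0, 1.0))
    for lo, hi, count in zip(edges[:-1], edges[1:], hist):
        print(f"[{lo:.1f}, {hi:.1f}): {count}")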

January 27, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. J.J.P. Veerman
Faculty Host: Dr. Steven A. Bleiler

Title: Primes!
Abstract: A non-technical review of some classical results in number theory. We identify some of the currents in number theory (analytic, algebraic, and ergodic), and informally discuss some results that had great impact on mathematical (and physical) thought.

On the analytic side, we look at the prime number theorem (or PNT), which tells us how dense the primes are in the natural numbers. While its proof starts with some combinatorial estimates, it turns out, very surprisingly, that the full proof makes essential use of complex analysis (the Cauchy integral formula). For that reason, this branch is now called analytic number theory. On the algebraic side, we will touch on Dirichlet's theorem on primes in arithmetic progressions, which gives the density of primes in sequences of the form {a + ib}, where a and b are fixed integers. Time permitting, we will very briefly mention ergodic theory.
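
The density statement in the prime number theorem can be checked numerically: pi(x), the number of primes up to x, is asymptotic to x / ln x. The short Python sketch below compares the two quantities for a few values of x using a simple sieve; the cutoffs chosen are arbitrary.

    import math

    def prime_count(x):
        """Count the primes <= x with a simple sieve of Eratosthenes."""
        sieve = bytearray([1]) * (x + 1)
        sieve[0:2] = b"\x00\x00"
        for p in range(2, int(x ** 0.5) + 1):
            if sieve[p]:
                sieve[p * p::p] = bytearray(len(range(p * p, x + 1, p)))
        return sum(sieve)

    for x in (10**3, 10**4, 10**5, 10**6):
        pi_x = prime_count(x)
        approx = x / math.log(x)
        print(f"x = {x:>8}  pi(x) = {pi_x:>6}  x/ln x = {approx:>9.1f}  ratio = {pi_x / approx:.3f}")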

February 3, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. Stephen Portnoy
Faculty Host: Dr. Dorcas Ofori-Boateng

Title: When Are Unbiased Estimators Linear?
Abstract: The problem of fitting a linear model to data originated in the mid 18th century. The earliest approach, developed for simple linear models by Boscovich, was based on minimizing the sum of absolute errors. Unfortunately, Boscovich's geometric computational method was rather complicated and did not apply when there were multiple x's. Thus, the least squares approach of Legendre and Gauss, developed nearly 50 years later, became the standard method. In 1822, Gauss showed that the least squares estimator was a "best linear unbiased estimator" (BLUE) in terms of variance minimization. Following the development of statistical theory in the mid-twentieth century, BLUEs are known to be optimal in many respects under normal assumptions. However, since variance minimization does not depend on normality and unbiasedness is often considered reasonable, many statisticians have felt that BLUEs ought to perform relatively well in some generality.

The result here considers the general linear model and shows that any measurable estimator that is unbiased over a moderately large (but finite dimensional) family of distributions must be linear. Thus, imposing unbiasedness cannot offer improvement over imposing linearity. The problem was suggested by Hansen, who showed that any estimator unbiased for nearly all error distributions (with finite covariance) must have a variance no smaller than that of the best linear estimator in some parametric subfamily. Specifically, the hypothesis of linearity can be dropped from the classical Gauss–Markov Theorem. This might suggest that the best unbiased estimator should provide superior performance, but the result here shows that the best unbiased regression estimator can be no better than the best linear estimator, and non-linear estimators are often substantially superior.

The result appeared in The American Statistician in late 2022, but a technical error in part of the proof was discovered. A new and much simpler proof will be presented and implications and generalizations will be discussed.

Gauss probably used least squares around 1800, but didn't communicate it to other mathematicians until publishing it in 1809, after publications by Legendre (1805) and the American Robert Adrain (1808).

Note: the idea of squaring errors does seem rather pointless and even unreasonable. It really makes sense only under normal assumptions. In fact, the idea of using absolute error to compare estimators was introduced by Galileo, and the use of absolute errors continued to be supported by Laplace and others. Nonetheless, methods to minimize the sum of absolute errors were not developed until the mid-twentieth century when they could be based on linear programming.
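
As a concrete illustration of the two fitting criteria discussed above, the Python sketch below fits a simple linear model both by least squares and by minimizing the sum of absolute errors, with the latter posed as a linear program in the standard way (each residual split into nonnegative positive and negative parts). The simulated data, heavy-tailed errors, and parameter values are illustrative assumptions, not material from the talk.

    import numpy as np
    from scipy.optimize import linprog

    # Simulated data with heavy-tailed errors (illustrative only)
    rng = np.random.default_rng(1)
    n = 100
    x = rng.uniform(0.0, 10.0, n)
    X = np.column_stack([np.ones(n), x])                 # intercept + slope design
    y = 2.0 + 0.5 * x + rng.standard_t(df=2, size=n)

    # Least squares: the BLUE under the Gauss-Markov assumptions
    beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Least absolute deviations as a linear program:
    #   minimize sum(u + v)  subject to  X b + u - v = y,  u, v >= 0
    p = X.shape[1]
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    beta_lad = res.x[:p]

    print("least squares (intercept, slope):            ", beta_ls)
    print("least absolute deviations (intercept, slope):", beta_lad)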

February 10, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. J.J.P. Veerman
Faculty Host: Dr. Steven A. Bleiler

Title: Chemical Reaction Networks
Abstract: The study of the dynamics of chemical reactions, and in particular phenomena such as oscillating reactions, has led to the recognition that many dynamical properties of a chemical reaction can be predicted from graph theoretical properties of a certain directed graph, called a Chemical Reaction Network (CRN). In this graph, the edges represent the reactions and the vertices the reacting combinations of chemical substances.

In contrast with the classical treatment, in this work, we heavily rely on a recently developed theory of directed graph Laplacians to simplify the traditional treatment. We show that much of the dynamics of these polynomial systems of differential equations can be understood by analyzing the directed graph Laplacian associated with the system.

Our new theory allows a more concise mathematical treatment and leads to considerably stronger results. In particular, (i) we show that our Laplacian deficiency zero theorem is markedly stronger than the traditional one and (ii) we derive simple equations for the locus of the equilibria in all (Laplacian) deficiency zero cases.
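
For readers unfamiliar with the formalism, one standard way to express mass-action kinetics in graph terms writes the dynamics as dx/dt = Y A_k Psi(x), where Y records the composition of each complex, Psi(x) is the vector of mass-action monomials, and A_k is a Laplacian-type matrix of the directed reaction graph. The Python sketch below builds this factorization for the toy network A + B <-> 2B and integrates it; it is a minimal illustration of the general setup, not the speaker's construction or results.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy network A + B <-> 2B with complexes C1 = A + B and C2 = 2B.
    # Rows of Y are the species [A, B]; columns are the complexes [C1, C2].
    Y = np.array([[1.0, 0.0],
                  [1.0, 2.0]])

    k_fwd, k_back = 2.0, 1.0            # illustrative rate constants
    # Laplacian-type kinetic matrix of the reaction graph (columns sum to zero)
    A_k = np.array([[-k_fwd,  k_back],
                    [ k_fwd, -k_back]])

    def psi(x):
        """Mass-action monomials x^y, one per complex."""
        return np.array([np.prod(x ** Y[:, j]) for j in range(Y.shape[1])])

    def rhs(t, x):
        # dx/dt = Y A_k psi(x): the reaction graph carries the dynamics
        return Y @ A_k @ psi(x)

    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.5])
    print("final concentrations [A, B]:", sol.y[:, -1])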

March 10, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. Mark Jackson
Faculty Host: Dr. Steven A. Bleiler

Title: Introduction to Quantum Computing
Abstract: Quantum computing is fast approaching commercial applicability. I will introduce the basics of this field, including theory, technology, applications, and commercial status. Applications include drug discovery, materials science, finance, and cybersecurity. I will describe how one programs a quantum computer and methods used to optimize its efficiency.
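
To give a flavor of what programming a quantum computer means at the gate level, the short sketch below simulates a two-qubit circuit with plain numpy: a Hadamard gate followed by a CNOT produces a Bell state, and the measurement probabilities are printed. This is a generic textbook example, not material from the talk.

    import numpy as np

    # Single-qubit gates and the two-qubit CNOT in the computational basis
    H = np.array([[1.0,  1.0],
                  [1.0, -1.0]]) / np.sqrt(2.0)     # Hadamard
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)   # control = first qubit

    # Start in |00>, apply H to the first qubit, then CNOT
    state = np.zeros(4)
    state[0] = 1.0
    state = CNOT @ np.kron(H, I2) @ state

    # A Bell state: probability 1/2 each on |00> and |11>
    for label, amp in zip(["00", "01", "10", "11"], state):
        print(f"P(|{label}>) = {abs(amp) ** 2:.2f}")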

April 7, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. Nathan Kutz
Faculty Host: Dr. Dacian Daescu

Title: The Future of Governing Equations
Abstract: A major challenge in the study of dynamical systems is that of model discovery: turning data into reduced-order models that are not just predictive, but provide insight into the nature of the underlying dynamical system that generated the data. We introduce a number of data-driven strategies for discovering nonlinear multiscale dynamical systems and their embeddings from data. We consider two canonical cases: (i) systems for which we have full measurements of the governing variables, and (ii) systems for which we have incomplete measurements. For systems with full state measurements, we show that the recent sparse identification of nonlinear dynamics (SINDy) method can discover governing equations with relatively little data and introduce a sampling method that allows SINDy to scale efficiently to problems with multiple time scales, noise, and parametric dependencies. For systems with incomplete observations, we show that the Hankel alternative view of Koopman (HAVOK) method, based on time-delay embedding coordinates and the dynamic mode decomposition, can be used to obtain linear models and Koopman-invariant measurement systems that nearly perfectly capture the dynamics of nonlinear quasiperiodic systems. Neural networks are used in targeted ways to aid in the model reduction process. Together, these approaches provide a suite of mathematical strategies for reducing the data required to discover and model nonlinear multiscale systems.
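
To make the SINDy idea concrete, the sketch below recovers a damped linear oscillator from simulated data by regressing derivatives onto a library of candidate terms with sequentially thresholded least squares. The system, library, and threshold are illustrative assumptions, and exact derivatives are used for simplicity; real applications must also estimate derivatives from noisy data.

    import numpy as np
    from scipy.integrate import solve_ivp

    # True system: a damped linear oscillator dx/dt = A_true x
    A_true = np.array([[-0.1,  2.0],
                       [-2.0, -0.1]])
    sol = solve_ivp(lambda t, x: A_true @ x, (0.0, 20.0), [2.0, 0.0],
                    t_eval=np.linspace(0.0, 20.0, 2000))
    X = sol.y.T                                   # snapshots, shape (time, state)
    dX = X @ A_true.T                             # exact derivatives, for simplicity

    # Candidate library: constant, linear, and quadratic monomials
    x1, x2 = X[:, 0], X[:, 1]
    Theta = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])
    names = ["1", "x1", "x2", "x1^2", "x1*x2", "x2^2"]

    def stlsq(Theta, dX, threshold=0.05, n_iter=10):
        """Sequentially thresholded least squares, the core SINDy regression."""
        Xi, *_ = np.linalg.lstsq(Theta, dX, rcond=None)
        for _ in range(n_iter):
            Xi[np.abs(Xi) < threshold] = 0.0                  # prune small terms
            for k in range(dX.shape[1]):                      # refit the survivors
                big = np.abs(Xi[:, k]) >= threshold
                if big.any():
                    Xi[big, k], *_ = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)
        return Xi

    Xi = stlsq(Theta, dX)
    for k in range(2):
        terms = [f"{Xi[j, k]:+.2f}*{names[j]}" for j in range(len(names)) if Xi[j, k] != 0]
        print(f"dx{k + 1}/dt =", " ".join(terms))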

Biography: Nathan Kutz is the Yasuko Endo and Robert Bolles Professor of Applied Mathematics and Electrical and Computer Engineering and Director of the AI Institute in Dynamic Systems at the University of Washington, having served as chair of applied mathematics from 2007 to 2015. He received his BS in physics and mathematics from the University of Washington in 1990 and his PhD in applied mathematics from Northwestern University in 1994. He was a postdoc in the applied and computational mathematics program at Princeton University before taking his faculty position. His interests range from neuroscience to fluid dynamics, where he integrates machine learning with dynamical systems and control.

June 9, 2023
Location: Fariborz Maseeh Hall (FMH), room 462
1855 SW Broadway
Time: 3:15pm - 4:15pm

Speaker: Dr. Joel Rosenfeld
Faculty Host: Dr. Bruno Jedynak

Title: The Kernel Perspective on Dynamic Mode Decompositions
Abstract: The system identification (or approximation) problem is the determination of an accurate approximation of a dynamical system from observations of the state or output. In this talk we discuss the role that operators and reproducing kernel Hilbert spaces play in nonlinear system identification and modeling problems. This is directly connected to the Koopman operator framework that has recently become popular. I will present the advantages that come with the kernel perspective, as well as a pointwise convergence result that comes from the use of compact operators.
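
The operator viewpoint is easiest to see in the plain dynamic mode decomposition, sketched below for a known linear map: a least-squares operator is estimated from snapshot pairs via the SVD, and its eigenvalues are compared with the true ones. Kernel variants of DMD replace these explicit computations with kernel evaluations in a reproducing kernel Hilbert space; the example here is a finite-dimensional illustration, not the speaker's construction.

    import numpy as np

    # Snapshots of a known linear map x_{k+1} = A x_k (eigenvalues 0.9 +/- 0.2i)
    rng = np.random.default_rng(0)
    A = np.array([[0.9, -0.2],
                  [0.2,  0.9]])
    x = rng.standard_normal(2)
    snapshots = [x]
    for _ in range(50):
        x = A @ x
        snapshots.append(x)
    data = np.array(snapshots).T                # states as columns

    X, Y = data[:, :-1], data[:, 1:]            # snapshot pairs (x_k, x_{k+1})

    # Exact DMD: least-squares operator Y X^+ via the SVD of X
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    A_dmd = Y @ Vt.T @ np.diag(1.0 / s) @ U.T

    print("true eigenvalues:", np.sort_complex(np.linalg.eigvals(A)))
    print("DMD  eigenvalues:", np.sort_complex(np.linalg.eigvals(A_dmd)))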

Biography: Dr. Joel A. Rosenfeld is an Assistant Professor in the Department of Mathematics and Statistics at the University of South Florida and the Principal Investigator of the Learning DOCK Group. He received his Ph.D. from the mathematics department at the University of Florida in 2013 under the advisement of Dr. Michael T. Jury, studying operator theory and functional analysis. Following graduate school, he was a postdoc in Mechanical Engineering at the University of Florida under Dr. Warren E. Dixon, studying numerical methods in optimal control theory and fractional calculus. Subsequently, he joined the Department of Electrical Engineering and Computer Science at Vanderbilt University under Dr. Taylor T. Johnson, studying numerical methods in formal methods for computing, which ultimately led to a position as a Senior Research Scientist Engineer within the Institute for Software Integrated Systems (ISIS). Dr. Rosenfeld received a Young Investigator Research Program (YIP) award from the Air Force Office of Scientific Research (AFOSR) in 2020.