Maseeh Mathematics + Statistics Colloquium Series 2016-2017 Archive

September 30, 2016
Malgorzata Peszynska, Oregon State University
Adsorption: new mathematics and computations for multiple components  

Adsorption is a well-known process during which gas particles (the adsorbate) adhere to a surface (the adsorbent). Adsorption is present in many natural systems and is widely used in the biotechnology, pharmaceutical, and chemical engineering industries. Mathematical models of adsorption relate the amount adsorbed to the amount present in the fluid that transports the adsorbate, and range from simple nonlinear parametric algebraic relationships called isotherms to complex statistical mechanics algorithms that take into account the surface energy and bonding energy of the particles. The mathematical and computational treatment of transport with adsorption is challenging due to its coupled nonlinear hyperbolic system structure.
In the talk we present our recent results on hybrid models of multicomponent adsorption in which advection is the prevalent transport mechanism. In particular, we discuss the well-posedness of adsorption-desorption hysteresis and computational stability of non-equilibrium models, and the hyperbolicity of a system formulated with hybrid models in which the isotherms are not given explicitly.
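For orientation, a minimal example of the kind of isotherm the abstract refers to is the Langmuir isotherm, coupled here to one-dimensional advective transport (a generic textbook form, not necessarily the hybrid multicomponent model of the talk):
    ∂_t ( c + a(c) ) + u ∂_x c = 0,        a(c) = a_max K c / (1 + K c),
where c is the concentration carried by the fluid, a(c) the amount adsorbed at equilibrium, u the advection velocity, and a_max, K positive parameters. The nonlinearity of a(c), and its coupling across components in the multicomponent case, is the source of the analytical and computational difficulties mentioned above.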

October 7, 2016 
Martin Flashman, Humboldt State University
Making sense of calculus with mapping diagrams: a visual alternative to graphs  

Mapping diagrams are an important and underutilized alternative to graphs for visualizing functions. Starting from the basics, Professor Flashman will demonstrate some of his approaches to the challenges of visualizing differential and integral calculus using mapping diagrams. Knowledge of at least one semester of calculus will be presumed.
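As a rough illustration of the idea (a minimal Python sketch using a hypothetical example function f(x) = x^2; it is not Professor Flashman's own software), the snippet below draws a mapping diagram: two parallel number lines with an arrow from each sampled input x to its output f(x).

    # Minimal mapping-diagram sketch: two vertical number lines,
    # with an arrow from each sampled input x to its output f(x).
    import numpy as np
    import matplotlib.pyplot as plt

    def f(x):
        return x**2  # hypothetical example function

    xs = np.linspace(-2, 2, 9)       # sampled inputs
    fig, ax = plt.subplots(figsize=(4, 6))
    ax.axvline(0, color="black")     # input number line
    ax.axvline(1, color="black")     # output number line
    for x in xs:
        ax.annotate("", xy=(1, f(x)), xytext=(0, x),
                    arrowprops=dict(arrowstyle="->", color="tab:blue"))
        ax.text(-0.08, x, f"{x:.1f}", ha="right", va="center")
        ax.text(1.08, f(x), f"{f(x):.1f}", ha="left", va="center")
    ax.set_xlim(-0.6, 1.6)
    ax.set_ylim(-2.5, 4.5)
    ax.set_xticks([])
    ax.set_title("Mapping diagram for f(x) = x^2")
    plt.show()

Unlike a graph, which plots the locus of points (x, f(x)), the mapping diagram emphasizes how each input is sent to its output.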

October 14, 2016 
Jane-Jane Lo, Western Michigan University
Integrating computer technology in a geometry course for prospective elementary teachers  

In this talk, I reflect on the journey my colleagues and I went through when attempting to integrate computer technology into a geometry course for prospective elementary teachers. Specifically, I examine our rationales for starting this journey, the challenges we faced, the approaches we took to meet those challenges, the lessons we learned, where we are in our journey, and where we are heading next. Samples of technology use and student work will be explored and examined. Implications for teaching other mathematics courses and for future research will also be discussed.

October 21, 2016 
Dacian Daescu, Portland State University
The value of observations in BIG data assimilation: significance, challenges, and research opportunities  

Data assimilation systems (DAS) for numerical weather prediction (NWP) combine information from a numerical model, observational data, and error statistics to analyze and predict the state of the atmosphere. Variational methods (3D-Var, 4D-Var) produce an estimate (analysis) of the true state by solving a large-scale nonlinear optimization problem. The rapid growth in the data volume provided by satellite-based instruments has prompted research to assess and improve the forecast impact ("value") of high-resolution observations.
We discuss mathematical and computational aspects of hyperparameter sensitivity and impact estimation in the context of model-constrained optimization. The evaluation of the sensitivity of a forecast error measure ("quantity of interest") to observations and error covariance parameters is considered in a 4D-Var DAS. Special emphasis is given to the analysis of correlated errors in observations assimilated from hyperspectral remote sensing instruments. The practical significance, challenges, and research opportunities are presented together with illustrative numerical results and a summary of the current status of implementation at operational NWP centers.
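For background, the analysis in a variational DAS is the minimizer of a cost function; in the 3D-Var case this takes the standard form (generic textbook notation, not the operational system discussed in the talk)
    J(x) = (1/2) (x - x_b)^T B^{-1} (x - x_b) + (1/2) (y - H(x))^T R^{-1} (y - H(x)),
where x_b is the background (prior) state, y the vector of observations, H the observation operator, and B and R the background- and observation-error covariance matrices; 4D-Var sums the observation term over a time window with the forecast model acting as a constraint. The covariance parameters inside B and R are the hyperparameters whose sensitivity and forecast impact the talk addresses.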

October 28, 2016 
Ignacio Muga, Pontificia Universidad Católica de Valparaíso
The Petrov-Galerkin method and the historical evolution of its quasi-optimality constant  

In this talk, the Petrov-Galerkin method is presented in the general context of linear problems in Banach spaces. The historical evolution of the quasi-optimality constant associated with this method will be traced, from the pioneering work of Babuska in 1971 up to advances in recent years.
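To fix notation (a generic statement; the precise constants and their history are the subject of the talk), a quasi-optimality estimate for a Petrov-Galerkin approximation u_n of the exact solution u reads
    ||u - u_n|| ≤ C inf_{w ∈ U_n} ||u - w||,
where U_n is the discrete trial space and the quasi-optimality constant C depends on the continuity constant M of the underlying bilinear form and on its discrete inf-sup constant γ_n; Babuska's 1971 theory gives a bound of the form C ≤ 1 + M/γ_n, and later work has sharpened this constant in various settings.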

December 2, 2016 
Peter Veerman, Portland State University
Mediatrices and minimal separating sets  

For distinct points p and q in a two-dimensional connected Riemannian manifold M, we define their mediatrix L_{pq} as the set of points equidistant from p and q. It is known that mediatrices have a cell decomposition consisting of a finite number of branch points connected by Lipschitz curves. We show additional geometric regularity properties of mediatrices: at each point they have the radial linearizability property, which means that they are tangent to a finite collection of lines meeting at a single point. Simply put, they are Lipschitz images of (multi)graphs. In the case of mediatrices on the sphere, where mediatrices are simple closed Lipschitz curves, we show that these curves have at most countably many singularities, and that the total angular deficiency has a finite upper bound related to the total curvature of the metric on the sphere.
On the other hand, mediatrices have the minimal separating property: they separate the manifold M into two parts, while no proper subset of them does. This fact allows for their topological classification. In principle we can determine which (multi)graphs can minimally separate a surface of genus g. This classification is in some sense a generalization of the Jordan-Brouwer theorem. We will briefly discuss the classification of minimal separating sets in the orientable surfaces of genus 0, 1, 2, and 3. Mediatrices have found an application in a long-standing territorial conflict between Peru and Chile: in 2014, the International Court of Justice in The Hague weighed in on the issue and used the concept of a mediatrix in its decision. We will show what they did.
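In symbols, the mediatrix above is the set
    L_{pq} = { x ∈ M : d(x, p) = d(x, q) };
in the Euclidean plane this is just the perpendicular bisector of the segment pq, and the results above describe how much more complicated this set can become for a general metric on a surface.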

January 20, 2017 
Michelle Stephan, University of North Carolina at Charlotte
Is mathematical inquiry for all students?  

In this talk, I explore the differences between direct and indirect mathematics instruction and argue that, despite many researchers’ attempts to blend the two, such a blend is intractable, if not impossible. I explore this issue using data from a classroom teaching experiment conducted in an inclusive setting in which an inquiry approach was used to teach integers for understanding. The students in this project were members of a co-taught classroom in which students with disabilities were mainstreamed into the regular education setting. There were two teachers present each day: a special educator and a mathematics teacher (myself). I explore the roles that both teachers played in facilitating learning through an inquiry approach, as well as the ways in which students with disabilities participated in classroom instruction, both individually and in small groups. In the end, I will answer the question of whether an inquiry approach to mathematics is feasible for all students, in particular those with disabilities.

January 27, 2017 
Mckenzie West, Reed College
Rational thoughts on surfaces  

Polynomial equations and their solutions form a cornerstone of mathematics. Solutions with rational coordinates are particularly intriguing; a fantastic surprise is the great difficulty of determining the mere existence of a rational solution to a given equation (let alone the complete set). We will discuss this problem in two cases, diagonal cubic surfaces:
ax³ + by³ + cz³ + d = 0,
and degree 2 del Pezzo surfaces,
ax⁴ + by⁴ + cx²y² + d = z².
A surprising and successful modern approach, the Brauer-Manin obstruction, employs tools from linear algebra, geometry, and non-commutative algebra. I will discuss a collection of interesting and motivating examples with simultaneous historical and modern interest, and also explain some of the tools and techniques that form the backbone of my research program.
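To make the difficulty concrete, here is a naive brute-force search for rational points of small height on a diagonal cubic surface (a sketch with arbitrary placeholder coefficients, not one of the techniques of the talk); the point is that an empty search proves nothing, which is why obstructions such as Brauer-Manin are needed to certify non-existence.

    # Naive search for rational points of bounded height on the diagonal
    # cubic surface a*x^3 + b*y^3 + c*z^3 + d = 0.  Writing x = X/W, etc.,
    # and clearing denominators, we look for integers with
    #     a*X^3 + b*Y^3 + c*Z^3 + d*W^3 = 0  and  W != 0.
    from fractions import Fraction
    from itertools import product

    def small_points(a, b, c, d, bound=10):
        sols = set()
        rng = range(-bound, bound + 1)
        for X, Y, Z, W in product(rng, rng, rng, rng):
            if W != 0 and a*X**3 + b*Y**3 + c*Z**3 + d*W**3 == 0:
                sols.add((Fraction(X, W), Fraction(Y, W), Fraction(Z, W)))
        return sols

    # Placeholder coefficients; an empty result only rules out points of
    # height <= bound, it does not prove that no rational point exists.
    print(small_points(1, 2, 3, 4, bound=6))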

March 10, 2017 
Ander Erickson, Western Oregon University
The mathematics of information problem-solving tasks: Surface or substance?  

I present a cross-case analysis that explores the demands and opportunities that arise when information problem-solving tasks are introduced into college mathematics classes. Educators at three universities collaborated with me to develop statistics-related activities that required students to engage in research outside the classroom. This presentation focuses on one aspect of the study: a comparison of how the teachers balanced mathematical content with information problem-solving in the tasks they created. These tasks incorporated mathematics in a variety of ways, ranging from tasks in which the mathematical component was crucial to others in which mathematics served solely as a marker of credibility. This research has the potential to provide tools for understanding how to productively incorporate information-literacy instruction into the mathematics classroom without losing sight of mathematical goals.

April 14, 2017 
Ping-Shou Zhong, Michigan State University
Unified empirical likelihood ratio tests for functional concurrent linear models and the phase transition from sparse to dense functional data  

We consider the problem of testing functional constraints in a class of functional concurrent linear models where both the predictors and the response are functional data measured at discrete time points. We propose test procedures based on the empirical likelihood with bias-corrected estimating equations to conduct both pointwise and simultaneous inferences. The asymptotic distributions of the test statistics are derived under the null and local alternative hypotheses, where sparse and dense functional data are considered in a unified framework. We find a phase transition in the asymptotic null distributions and the orders of detectable alternatives from sparse to dense functional data. Specifically, the proposed tests can detect alternatives of root-n order when the number of repeated measurements per curve is of an order larger than n^{η₀}, with n being the number of curves. The transition points η₀ for pointwise and simultaneous tests are different and both are smaller than the transition point in the estimation problem. Simulation studies and real data analyses are conducted to demonstrate the proposed methods.
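For readers unfamiliar with the terminology, a functional concurrent linear model relates response and predictor curves observed at the same times (a generic formulation; details may differ from the paper's):
    Y_i(t) = X_i(t)^T β(t) + ε_i(t),    i = 1, ..., n,
with each curve observed only at discrete time points t_{i1}, ..., t_{im_i}. The hypotheses concern the coefficient function β(·), e.g. H_0: β(t) = β_0(t) at a fixed t (pointwise) or for all t (simultaneous), and "sparse" versus "dense" refers to whether the number of measurements per curve m_i stays bounded or grows with the number of curves n.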

April 21, 2017
Jeffrey Thunder, Northern Illinois University
An application of number theory to coding  

Most codes in use today are linear codes (linear subspaces of vector spaces over a finite field), the theory of which is essentially the domain of algebra and geometry. Here we will consider an application of a particular area of number theory, Diophantine geometry, to the subject of non-linear codes. We will discuss the non-linear codes under consideration and the techniques borrowed from number theory that apply. We will see how these techniques may be used to prove that, in a fairly concrete sense, these codes have excellent transmission and error-detection rates in comparison to linear codes.

May 5, 2017 
Xiaozhe Hu, Tufts University
Algebraic multigrid methods for computing diffusion state distance on graphs  

Recently, the diffusion state distance was introduced for protein-protein interaction networks and used for protein function prediction. In this talk, we focus on the challenges in computing the diffusion state distance, especially for large-scale networks. By exploring the algebraic properties of the diffusion state distance, we reformulate the computation of the distance as the solution of a series of graph Laplacian systems and apply algebraic multigrid methods to solve them efficiently. Applications to protein-protein interaction networks will be presented and possible generalizations will be discussed.
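As a toy illustration of the computational kernel (a sketch only; the actual diffusion state distance formulation and the algebraic multigrid solver are the subject of the talk), the snippet below assembles the Laplacian of a small graph and solves a linear system L x = b, the step that an AMG method would accelerate for large networks.

    # Toy version of the computational kernel: assemble a graph Laplacian
    # and solve L x = b.  For large networks this direct/least-squares
    # solve would be replaced by an algebraic multigrid iteration.
    import numpy as np

    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # small example graph
    n = 4

    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    L = np.diag(A.sum(axis=1)) - A                     # combinatorial Laplacian

    # L is singular (constants are in its kernel), so take b with zero mean
    # and solve in the least-squares sense, then fix the free constant.
    b = np.array([1.0, -1.0, 0.5, -0.5])
    x, *_ = np.linalg.lstsq(L, b, rcond=None)
    x -= x.mean()
    print(x)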

May 12, 2017 
Daniel Naiman, Johns Hopkins University
To replace or not to replace in finite population sampling  

A classical result in finite population sampling states that in equally likely ("simple") random sampling the sample mean is more reliable when we do not replace after each draw. This talk focuses on the case of weighted sampling, where it is natural to compare the Horvitz-Thompson inverse-probability-weighted estimator to the estimator based on sampling with replacement for a sampling design with the same marginals. A sufficient condition for superiority of the without-replacement scheme is presented, based on sampling designs in which the bivariate selection probabilities take a special form. This form leads to interesting geometric considerations, and questions arise as to the existence of joint distributions and effective sampling algorithms. Instances in which it is better to sample without replacement are described. Similar results from the literature are reviewed, including results due to Bondesson & Traat (2013), and the connection to our work is explained. This is joint work with Fred Torcaso.
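For reference, the two estimators of the population total being compared are the standard ones (textbook definitions, not specific to this talk's designs):
    Ŷ_HT = Σ_{i ∈ S} y_i / π_i                    (without replacement, inclusion probabilities π_i),
    Ŷ_HH = (1/n) Σ_{k=1}^{n} y_{i_k} / p_{i_k}    (with replacement, n draws with selection probabilities p_i),
and matching the marginals means choosing the without-replacement design so that π_i = n p_i for every unit i.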

May 19, 2017 
Laurel Beckett, University of California, Davis
Looking for heterogeneity in markers for brain aging  

Cognitive decline is common in aging, with Alzheimer's Disease (AD) the most common and one of the most feared diseases. AD brain damage begins long before clinical symptoms. Neuroimaging allows us to visualize such features of brain damage as amyloid or tau protein accumulation, decreased metabolism, and cortical atrophy.   
Longitudinal measurement can help to characterize the earliest signs and their timing relative to each other and to cognitive impairment. The classic hypothesis for AD has been that the first stage is amyloid plaques, followed by cell death and atrophy. Postmortem data and neuroimaging increasingly suggest, however, that amyloid is not the whole story. I will present statistical methods for analyzing longitudinal data from multiple neuroimaging modalities that address two questions:  
1. What are the earliest features of brain damage?  
2. Are there distinct subgroups that suggest different underlying syndromes?  
The answers will help us to determine the timing and targets for interventions to prevent or delay the onset of cognitive impairment or dementia. I will illustrate with data from the Alzheimer’s Disease Neuroimaging Initiative (ADNI).

May 26, 2017 
George Hart, Stony Brook University
Applying math to sculpture  

George Hart will discuss examples of his mathematically informed sculptures, which are generally inspired by mathematical ideas and then apply computer technology in their design and/or fabrication. These include works made of metal, wood, plastic, or found objects, and often use laser-cutting, plasma-cutting, or 3D-printing technologies in their realization. Some of the underlying ideas will be discussed, ranging from four-dimensional polytopes to inversive transformations to zonotopal dissections to uniform tessellations in the hyperbolic plane. Along the way, Hart will show a variety of related work from his creative output, including mathematical puzzles, insightful videos, hands-on workshop activities, and the Museum of Mathematics in NYC, all of them ways to get the public thinking and talking about mathematical ideas and to demonstrate that math is a living, creative, joyful subject. Some small physical examples will be on hand for people to see and enjoy. See www.georgehart.com for examples and more information.
Bio: George Hart is a sculptor and applied mathematician who finds original ways to share the beauty of mathematical thinking. An interdepartmental research professor at Stony Brook University, he holds a BS in Mathematics and a PhD in Electrical Engineering and Computer Science from MIT. Hart is an organizer of the annual Bridges Conference on Mathematics and Art, and the editor for sculpture for the Journal of Mathematics and the Arts. His research explores innovative ways to use computer technology in the design and fabrication of his artwork, which can be found at universities around the world, including MIT, Princeton, Brown, Duke, and U.C. Berkeley. He is also passionate about designing educational activities that focus on making math visible via hands-on construction in classrooms from elementary level through college. Hart co-founded the Museum of Mathematics in New York City and developed its initial set of interactive exhibits, which have been enjoyed by over a half million visitors. He also makes videos that show the fun and creative sides of mathematics. See www.georgehart.com for examples of his work.

June 9, 2017 
Bianca Viray, University of Washington
The local to global principle for rational points  

Let X be a connected smooth projective variety over ℚ. If X has a ℚ-point, then X must have local points, i.e. points over the reals and over the p-adic completions ℚₚ. However, local solubility is often not sufficient. Manin showed that quadratic reciprocity, together with higher reciprocity laws, can obstruct the existence of a ℚ-point (a global point) even when there exist local points. We will give an overview of this obstruction (in the case of quadratic reciprocity) and then show that, for certain surfaces, this reciprocity obstruction can be viewed in a geometric manner. More precisely, we will show that for degree 4 del Pezzo surfaces, Manin's obstruction to the existence of a rational point is equivalent to the surface being fibered into genus 1 curves, each of which fails to be locally solvable. This talk will be suitable for a general audience. In particular, all mathematical terms in this abstract will be defined in the talk.
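A classical example of the phenomenon described above (standard background, not taken from the abstract) is Selmer's cubic
    3x³ + 4y³ + 5z³ = 0,
which has nontrivial solutions over ℝ and over every ℚₚ but no nontrivial rational solution; obstructions of Brauer-Manin type were introduced precisely to account for such failures of the local-to-global principle.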