Archive of MSOR School Colloquia

Erupting Dusts

Mark McGuinness

We present a new model for the initiation of high-speed eruptive two-phase dust flows in the laboratory. Shock-tube experiments have been conducted on beds of solid particles in nitrogen under high pressure, which are suddenly decompressed. Our model explains the slab-like structures that are often observed during initiation of bed movement by considering the interaction between the compressible flow of gas through the bed and the stress field in the particle bed: the bed ruptures when its cohesion is overcome by the effective stress generated by the gas flow. Our model includes the effects of overburden and wall friction, and predicts that all layered configurations will initially rupture in this fashion, consistent with experimental observation. We also find that the modelled dependence of layer size on particle size is a good match to experiment. This work arose out of discussions with Colin Wilson in SGEES following my Colloquium talk here in 2010 on exploding rocks.
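
To give a rough sense of the rupture criterion (an illustrative sketch in notation of our own choosing here, not the precise model of the paper): measure depth z downward from the bed surface, let p(z, t) be the gas pore pressure, \rho_b the bulk density of the bed, g gravity, and c the cohesion, and neglect wall friction. After decompression the pore-pressure gradient drags the particles upward, and the tensile stress across a horizontal plane at depth z is roughly

    \sigma(z,t) = \int_0^z \left( \frac{\partial p}{\partial z'} - \rho_b g \right) dz' = p(z,t) - p(0,t) - \rho_b g\, z,

with the bed first rupturing, slab by slab, wherever \sigma(z,t) reaches the cohesion c.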

The Mathematics of the Internet

Dillon Mayhew

Our world has been completely transformed by the instantaneous movement of information around the globe. The amount of information being transmitted is massive: if we filled A4 pages with the information sent over the internet during an average hour in 2013, those pages would comfortably cover the surface of the moon. How can we transmit this torrent of information accurately and securely? The answer lies in the mathematics of codes and ciphers. This talk will be a non-technical explanation of some of these mathematical ideas.
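
To hint at how mathematics makes transmission accurate (a toy example of our own choosing, not drawn from the talk itself), the classical Hamming(7,4) code protects every four data bits with three parity bits, so that any single bit flipped in transit can be located and corrected:

```python
# Hamming(7,4): encode 4 data bits into 7 transmitted bits; any single
# bit error can be located (via the syndrome) and corrected.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix (systematic form):
              [0, 1, 0, 0, 1, 0, 1],   # codeword = data @ G (mod 2)
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix:
              [1, 0, 1, 1, 0, 1, 0],   # H @ codeword = 0 (mod 2) iff valid
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data4):
    return (np.array(data4) @ G) % 2

def correct(received7):
    syndrome = (H @ received7) % 2
    if syndrome.any():                  # non-zero syndrome: some bit flipped,
        bad = next(i for i in range(7)  # and the syndrome equals the column
                   if (H[:, i] == syndrome).all())  # of H at the error position
        received7 = received7.copy()
        received7[bad] ^= 1             # flip the corrupted bit back
    return received7

word = encode([1, 0, 1, 1])
word[2] ^= 1                            # simulate one bit damaged in transit
assert (correct(word) == encode([1, 0, 1, 1])).all()
```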

Known Unknowables - Logic and the Limits to Mathematics

Adam Day

There are many unsolved questions in mathematics. Most of these questions are unsolved because we simply have not been smart enough to figure out how to answer them. We can hope that these will be solved by future generations of mathematicians building on work done today. However, there are some questions in mathematics that cannot be answered (at least by mathematics as we currently understand it). These include questions such as: are the foundations of mathematics logically consistent? This talk gave a non-technical introduction to these questions, focusing on Gödel's incompleteness theorem and why we know that we cannot answer these questions.

On the Generation of Tsunami Waves

Dimitrios Mitsotakis

(4pm Friday 24 August 2014, LBLT118, Laby Building 118)

A tsunami is a water wave caused by the displacement of a large volume of a body of water in an ocean or a large lake. Earthquakes, volcanic eruptions, landslides and other disturbances (such as detonation of underwater nuclear devices, meteorite impacts, etc.) can generate a tsunami. A tsunami warning system is used to detect tsunamis in advance and issue warnings to prevent loss of life and damage. Tsunami warning systems require an efficient and reliable model to describe the early stages of a tsunami and issue accurate tsunami warnings. The present lecture is devoted to the problem of the generation of tsunami waves due to an underwater earthquake. The developments of the present study will be illustrated using the 17 July 2006 Java event, in which an underwater earthquake of magnitude 7.7 generated a tsunami that inundated the southern coast of Java.
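
For background (a hedged sketch of a standard starting point, not necessarily the model of this study), tsunami generation is often described by the linear shallow water equations forced by the moving sea floor. With free-surface elevation \eta(x, y, t), depth-averaged horizontal velocity \mathbf{u}, still-water depth h(x, y), gravitational acceleration g, and sea-floor displacement \zeta(x, y, t) prescribed by a seismic source model,

    \frac{\partial \eta}{\partial t} + \nabla \cdot (h\, \mathbf{u}) = \frac{\partial \zeta}{\partial t}, \qquad \frac{\partial \mathbf{u}}{\partial t} + g\, \nabla \eta = 0,

so the sea-floor motion enters the mass balance as a source term; the simpler "passive" approach instead transfers the final bottom displacement directly to the initial free surface and lets the waves evolve from there.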

Inductive Reasoning and (one of) the Foundations of Machine Learning

David Balduzzi

(4pm Friday 26 September 2014, LBLT118, Laby Building 118)

In mathematics, theorems are deduced from axioms that are asserted to be self-evidently true. The natural sciences are quite different. There are no axioms. Instead, physical laws are inferred from patterns observed in empirical data. Leibniz famously imagined taking a quill pen and randomly splashing ink on a sheet of paper. Is it not always possible to find some mathematical expression that explains these data? And if so, when are we justified in calling that expression a law?

Scientists have confronted these problems for centuries, but it is only in the last 50 years that a formal theory of inference has been developed. This talk will give a non-technical overview of learning theory, and sketch how we can understand and control the performance of learning algorithms on new data to which they were not previously exposed.

Sister Mary Celine Fasenmyer and recurrence relations for polynomial sequences

Petros Hadjicostas

(4pm Thursday 23 October 2014, LBLT118, Laby Building 118)

Sister Mary Celine Fasenmyer earned her PhD in Mathematics at the University of Michigan in 1946 at the age of 40. She published two papers related to her PhD thesis topic, and went on to teach at Mercyhurst College in Pennsylvania, USA. She was the first to provide a quite general algorithmic method for finding pure recurrence relations satisfied by various hypergeometric polynomial sequences. Her method was mentioned in the book Special Functions by her PhD supervisor Earl Rainville in 1960, but otherwise it received little attention until the 1970s, when various mathematicians (such as Doron Zeilberger and Herbert Wilf) realised its importance and generalised it. This has led to the establishment of the WZ theory that allows one to "provide extremely succinct proofs of known identities," and "discover new identities whenever it succeeds in finding a proof certificate for a known identity."
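
To give the flavour of the method (a standard textbook example, not necessarily one used in the talk): to find a recurrence for f(n) = \sum_k F(n,k), Sister Celine's technique posits a k-free recurrence

    \sum_{i=0}^{I} \sum_{j=0}^{J} a_{ij}(n)\, F(n-j,\, k-i) = 0,

divides through by F(n,k) so that each ratio F(n-j, k-i)/F(n,k) is a rational function of n and k (this is where the hypergeometric hypothesis enters), clears denominators, and equates the coefficient of each power of k to zero, leaving a system of linear equations for the unknown coefficients a_{ij}(n). For F(n,k) = \binom{n}{k} the method recovers

    \binom{n}{k} - \binom{n-1}{k} - \binom{n-1}{k-1} = 0,

and summing over k gives f(n) = 2 f(n-1), that is, \sum_k \binom{n}{k} = 2^n.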

Almost fifty years after she finished her PhD thesis, Sister Fasenmyer was brought out of retirement (at the age of 87) and was formally recognised at a conference held in her honour in Boca Raton, Florida. This talk will review Sister Fasenmyer's method and its impact today. Some examples from my research that used her method will be mentioned.

2010

  • 8 October 2010; Nokuthaba Sibanda: "How Good is Your GP?"
  • 10 September 2010; Peter Donelan: "A Robot's Walk in the Garden of Invariant Theory"
  • 6 August 2010; Rob Goldblatt: "What is a Co-Algebra?"
  • 28 May 2010; Mark McGuinness: "Exploding Rock"
  • 23 April 2010; Noam Greenberg: "What is Model Theory?"
  • 19 March 2010; Shirley Pledger: "Fuzzy Ecological Communities"

How Good is Your GP?

Nokuthaba Sibanda (4pm Friday 8 October 2010, Easterfield LT206)

The talk focuses on how methods for measuring and comparing performance among healthcare providers have evolved over the last few decades.

Quality control techniques that were designed to monitor product quality in manufacturing have been adapted for: 1) continuous monitoring of patient care outcomes to enable timely detection of deterioration in performance, and 2) comparing hospitals to identify those that give the best and worst quality of care. I will discuss the adjustments required to make quality control methods suitable for use with healthcare data. I will also review the latest developments in this area.
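
To illustrate the first kind of adaptation (a generic monitoring scheme sketched for illustration, not necessarily the methods of the talk; the rates and threshold below are invented), a Bernoulli CUSUM adds a log-likelihood-ratio weight for each patient outcome and raises an alarm when the cumulative sum crosses a threshold:

```python
# A minimal Bernoulli CUSUM for detecting a rise in an adverse-outcome
# rate from p0 to p1. Real healthcare schemes are risk-adjusted per
# patient and carefully calibrated; these parameters are illustrative.
import math

def bernoulli_cusum(outcomes, p0=0.02, p1=0.04, threshold=4.5):
    """Return the CUSUM path and the index of the first alarm (or None)."""
    w1 = math.log(p1 / p0)                   # weight for an adverse outcome
    w0 = math.log((1 - p1) / (1 - p0))       # (negative) weight otherwise
    s, path, alarm = 0.0, [], None
    for i, y in enumerate(outcomes):
        s = max(0.0, s + (w1 if y else w0))  # floor at zero: track deterioration only
        path.append(s)
        if alarm is None and s > threshold:
            alarm = i                        # plausible deterioration detected
    return path, alarm

# a run of adverse outcomes (1s) eventually triggers the alarm
path, alarm = bernoulli_cusum([0] * 50 + [1] * 12)
print(alarm)
```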


A Robot's Walk in the Garden of Invariant Theory

Peter Donelan (4pm Friday 10 September 2010, Easterfield LT206)

Classification and recognition are fundamental human activities by which we seek to understand the world we observe. For mathematicians, classifications arise from the action of a group of symmetries and invariants provide recognition principles. Prompted by developments in geometry and algebra, the foundations of Invariant Theory were laid in the 19th century, culminating in groundbreaking work of Hilbert. In the last few decades the subject has been rejuvenated by the development of computational tools and new applications in, for example, computer vision and quantum computing. I will attempt to illuminate some of the underlying ideas via applications in robot kinematics.

What is a Co-Algebra?

Rob Goldblatt (4pm Friday 6 August 2010, Easterfield LT206)

This talk will discuss the historical evolution of the notion of an "algebra", and the principle of duality that leads from there to the notion of a coalgebra.

Coalgebras are structures that have wide application, including to the modelling of state-based dynamical systems and automata, data structures like streams and trees, and interpretations of modal logics.


Exploding Rock

Mark McGuinness (4pm Friday 28 May 2010, Laby LT118)

The mathematical model presented here is motivated by recent experimental work, in which a vertical column of rock under large pressure is suddenly depressurised, so that it explodes, in a sequence of horizontal fractures that forms from the top down. The resulting blocks are lifted off and ejected by the escaping gas. This experiment provides a framework for understanding the way in which catastrophic explosion can occur, and is motivated by the corresponding phenomenon of magmatic explosion during Vulcanian eruptions. I will summarise a theoretical model built to describe these results, and show that it is capable of describing both the primary sequence of fracturing, and the secondary intra-block fracturing.


What is Model Theory?

Noam Greenberg (4pm Friday 23 April 2010, Laby LT118)

Model theory is a branch of mathematical logic which uses logical and topological tools to unify algebra, geometry and analysis. In this talk I plan to survey the landscape of model theory, from its modest beginnings in Tarski's theorem on projections of semi-algebraic sets, to Shelah's grand cathedral, the main gap theorem.


Fuzzy Ecological Communities

Shirley Pledger (4pm Friday 19 March 2010, Laby LT118)

In studies of ecological communities, a typical data set is a matrix with one row per species and one column per sample. The matrix may contain incidence data (presence/absence, binary data 1/0), or abundance data (count data, the number of individuals of each species at each site).

Traditional analyses use multivariate methods such as multidimensional scaling, principal component analysis, cluster analysis, ordination, correspondence analysis, association analysis, and direct and indirect gradient analysis. These methods are essentially mathematical rather than statistical, and they provide dimension reduction and plots in low dimensions in order to illustrate the main features of the data.

By introducing statistical mixture models, we may switch to fuzzy clustering, in which species and/or samples are allocated to groups probabilistically. Exploring these models, we obtain not only low-dimensional graphical results similar to those from traditional analyses, but also the benefits of available methods for comparing models, testing hypotheses and estimating parameters. For example, (i) if species are to be clustered into "functional groups" (occurring in similar habitats), how many clusters are indicated by the data? (ii) can the samples be ordered (using their species compositions) along a single axis, or are more dimensions needed to represent the patterns?
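
To illustrate the idea (a bare-bones sketch with invented data, not the full model class of the talk), a finite mixture of Bernoulli distributions fitted by the EM algorithm clusters species probabilistically from an incidence matrix; comparing fits with different numbers of groups, for example by AIC or BIC, then bears on question (i):

```python
# Fuzzy clustering of species (rows) from an incidence matrix via a
# Bernoulli mixture model fitted by EM. A bare-bones illustration.
import numpy as np

rng = np.random.default_rng(0)

def fit_bernoulli_mixture(X, n_groups, n_iter=200):
    """Return (pi, theta, resp): pi[g] = group weight, theta[g, j] =
    P(presence in sample j | group g), resp[i, g] = posterior
    probability that species i belongs to group g."""
    n, m = X.shape
    pi = np.full(n_groups, 1.0 / n_groups)
    theta = rng.uniform(0.25, 0.75, size=(n_groups, m))
    for _ in range(n_iter):
        # E-step: soft group memberships from each row's log-likelihood
        log_lik = (X[:, None, :] * np.log(theta)
                   + (1 - X[:, None, :]) * np.log(1 - theta)).sum(axis=2)
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        resp = np.exp(log_post)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and presence probabilities
        pi = np.clip(resp.mean(axis=0), 1e-9, None)
        theta = (resp.T @ X + 1e-6) / (resp.sum(axis=0)[:, None] + 2e-6)
    return pi, theta, resp

# two planted groups of species with different presence profiles
X = np.vstack([rng.random((10, 8)) < 0.8,
               rng.random((10, 8)) < 0.2]).astype(float)
pi, theta, resp = fit_bernoulli_mixture(X, n_groups=2)
print(resp.round(2))   # soft (fuzzy) group memberships per species
```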

Although the examples will be from community ecology, these models have applications in a wide range of disciplines.

No detailed knowledge of mathematics, statistics or, indeed, ecology is required of the audience.


2009

  • 20 March 2009; Richard Arnold: "Measuring tectonic stresses from earthquakes and Forecasting the Election night result on television"
  • 8 May 2009; Rod Downey: "When does a problem have a solution: A logician and computability theorist's view"
  • 29 May 2009; Mark Johnston: "Insight from Visualisation in Combinatorial Optimisation"
  • 17 July 2009; Dillon Mayhew: "What is a matroid?"
  • 11 September 2009; Estate Khmaladze: "How to detect small changes in statistics, and what happens when we try to catch elusive objects?"
  • 2 October 2009; Matt Visser: "The interface between quantum physics and gravity"

Measuring tectonic stresses from earthquakes and Forecasting the Election night result on television

Richard Arnold (4pm Friday 20 March 2009, CO339)

Earthquakes are evidence of the tectonic stresses in the earth's crust. These stresses are anisotropic, and their orientations and magnitudes can vary strongly from place to place. In this talk I will present work on using statistical techniques to estimate the properties of the tectonic stress from earthquake data. This statistical problem is strongly constrained by both geometry and physics, and I will discuss some of the approaches to this problem that I have worked on, together with geophysicists in SGEES.

I will also spend a short while discussing what it was like to be the Election Night Statistician for Television New Zealand last year. I'll briefly discuss the statistical model that we used, and how it was implemented.


When does a problem have a solution: A logician and computability theorist's view

Rod Downey (4pm Friday 8 May 2009, CO339)

Much of mathematics is devoted to giving solutions to equations, calculating solutions to problems, classifying structures according to invariants, and the like. Natural questions arise as to when this is not possible. This talk looks at such questions, tracing in an idiosyncratic way a historical line leading to modern incarnations, wherein logic allows us to show that no invariants are possible for (e.g.) certain problems in group theory. This is done by showing that ordinary mathematical structures can be made to emulate computation in faithful ways.

This talk is aimed at a very general audience, and will be accessible to beginning graduate or even advanced undergraduate students.


Insight from Visualisation in Combinatorial Optimisation

Mark Johnston (4pm Friday 29 May 2009, CO339)

Combinatorial optimisation problems are used extensively as mathematical models of large-scale scheduling, timetabling and routing applications, but are extremely difficult to solve. Local-search-based metaheuristics, however, are expected to find good solutions in reasonable computation time by balancing brute-force neighbourhood search against insightful rules of thumb. The idea that visualisation leads to insight is championed by Edward Tufte, with his goal to "maximize the ratio of data to ink". In this talk we give several examples where novel visual representations of solutions to combinatorial optimisation problems, often requiring significant computational effort to produce, lead to valuable insights from which better heuristics can be developed.

This talk is aimed at a very general audience.


What is a Matroid?

Dillon Mayhew (4pm Friday 17 July 2009, CO339)

Matroids are abstract objects that lie just beneath the surface of many naturally-occurring mathematical entities. In this talk I will explain what matroids are, why you should be interested in them, and more particularly, why I am interested in them. The talk will be introductory, and no prior knowledge will be assumed.


How to detect small changes in statistics, and what happens when we try to catch elusive objects?

Estate Khmaladze (4pm Friday 11 September 2009, CO339)

Given a sample, how small can the changes in its distribution be for us still to detect them? In the first part of the talk we will show examples where this becomes a central and most essential question of research. In the second part we will discuss a seemingly straightforward extension of this question to geometric objects such as shapes: given a sample containing imprecise measurements intended to describe a set, or an image, how small a change in this image can be detected and distinguished?

All of a sudden, a theoretical answer to this question opens up a large new area and reveals connections between various fields that were invisible and hidden until now.

Thus, unlike the talk of Professor Downey, who showed us a magnificent building of a fascinating part of modern mathematics, I will take you to one of the construction sites of modern statistics, where the building has only just started. However, the site is beautiful and full of promise. Although I still do not know it very well, let us try.


The interface between quantum physics and gravity

Matt Visser (4pm Friday 2 October 2009, EALT206)

The search for a quantum theory of gravity, some sort of theory merging the quantum realm with Einstein's general relativity, has frustrated mathematicians and theoretical physicists for over fifty years. Why do we care? What should we be looking for in such a theory? Is there any realistic hope of progress in the near future?