
Operational Research Group

Our research contributes to fundamental operational research theory and creates impact. We have particular strengths and leading expertise in the areas of optimisation and planning, queueing systems, healthcare modelling, environmental modelling, and finance and risk.

The group has an impressive track-record of contributing to both the theoretical underpinnings of the subject and to novel applications, including working on complex problems arising in healthcare, public health and epidemiology, finance, transportation, timetabling, environmental issues, manufacturing, information security, and green logistics.

We have over 20 academic staff and research students and host an active seminar series jointly with the Statistics research group, Data Innovation Research Institute (DIRI) seminars, and meetings of the South Wales Operational Research Discussion Society (SWORDS), which regularly attracts a large audience of OR researchers and practitioners.

Our group members are also prominent in interdisciplinary collaboration, with particularly strong links to researchers in: computer science and informatics (research on data science, machine learning and their interface with OR methods); medicine (through our extensive programme of modelling work on health and care services); social sciences (modelling for societal benefit, social network analysis, public health and educational interventions); earth sciences and geography (food and water insecurity, rainfall modelling); and business and economics (financial and risk modelling).

Group members are highly active in international collaborations, for example including joint research and staff/student mobility with the Federal University of Rio de Janeiro and the University of São Paulo (Brazil), Twente University (Netherlands), ETH Zürich and EPFL (Switzerland), Carnegie Mellon University, University of California Davis, Cornell University and New York University (USA), the University of Toronto and the University of Manitoba (Canada), the University of Namibia, the University of Vienna (Austria), Technische Universität Berlin, Karlsruhe Institute of Technology and Technische Universität München (Germany), Monash University and the University of Melbourne (Australia), and the University of Lisbon (Portugal).

Our staff and students have partnered with major external research users who help us deliver societal and economic benefit, including: Admiral, BA, Ernst & Young, Lloyds, McLaren, Nationwide, ONS, Tata Steel, Welsh Government and Welsh Water. Over the last REF period alone, the OR group has secured income directly from industry and charities to fund research amounting to 9 PhD studentships, 18 funded years of research associate time, and 4 years towards a match-funded lectureship.

Members of the Operational Research group are also contributing to the response to the COVID-19 pandemic and have been extensively consulted by the Welsh Government (WG) and its Technical Advisory Group (TAG). For example, OR methods and thinking have been key to scheduling the roll-out of vaccinations across NHS Wales, workforce planning for the WG Test, Trace and Protect strategy, facilitating a safe university campus, and optimising the design and deployment of our in-house asymptomatic testing service. The OR group is part of the Welsh wastewater screening project WeWASH, and we participate in the UK-wide OR response (N-CORN) through the Royal Society’s RAMP initiative.

The main areas of research within the current group are:


Planning and optimisation

Our planning and optimisation research covers fundamental work on mathematical optimisation techniques as well as their applications to real-life problems, particularly in the areas of transport, scheduling and packing. These techniques can be used to introduce efficiency and reduce waste in the logistical operations of companies and government agencies.

Fundamental research focuses on the theory of mixed-integer programming and combinatorial optimisation. In mixed-integer programming, we are interested in sparsity-proximity type optimisation problems and algorithms. Our research combines standard optimisation tools with results from convex discrete geometry and the algorithmic geometry of numbers. In combinatorial optimisation, our focus is mostly on graph-theoretical problems such as graph colouring, dominating sets and packings in graphs, vertex and arc routing, shortest path problems, and fixed-length cycle problems. We are particularly interested in dynamic versions of these, whereby problem structures and requirements evolve over time. Typical methods used in these areas include heuristics and metaheuristics, integer programming, and hybrid techniques.
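As a small illustration of the heuristic side of this work, a greedy graph colouring sketch: each vertex, taken in a given order, receives the smallest colour not used by any of its neighbours. The example graph below is hypothetical, not taken from any group project.

```python
def greedy_colouring(adjacency, order):
    """Assign each vertex the smallest colour unused by its neighbours."""
    colour = {}
    for v in order:
        used = {colour[u] for u in adjacency[v] if u in colour}
        c = 0
        while c in used:
            c += 1
        colour[v] = c
    return colour

# A 5-cycle is not 2-colourable; the greedy rule finds a proper 3-colouring.
graph = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
colours = greedy_colouring(graph, order=[0, 1, 2, 3, 4])
```

The quality of the colouring depends heavily on the vertex ordering, which is exactly where metaheuristics come in: they search over orderings (or directly over colourings) to reduce the number of colours used.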

The group has carried out research in many real-world application areas. Previous research has looked at sports timetabling, where the aim is to produce schedules that are fair to competitors and that also satisfy constraints regarding venue availability, television rights, and so on. The group has previously worked with the International Rugby Board and the Welsh Rugby Union and has used metaheuristic search techniques to produce schedules for world cups, Welsh domestic rugby leagues, and international rugby fixtures.

Another vibrant area of research concerns the problem of timetabling. Universities, for example, periodically face the burden of scheduling exams and lectures so that a variety of complex, and often conflicting constraints are met. Members of the group have previously designed methods for such problems and have been involved in the organisation of the International Timetabling Competition which permits researchers from across the globe to design and test their algorithms on real-life problems in a competitive environment.

The group has also published widely in the field of partitioning problems. Such problems arise regularly in industry, transportation and logistics, and include multi-dimensional packing and balancing problems, stock cutting problems, rostering problems and various graph theoretical problems, including graph colouring. Stock cutting problems, for example, arise in areas such as the clothing and building industries, where the aim is to cut a set of predefined and possibly multi-dimensioned items from a set of equi-dimensioned “stocks” such that the wastage is minimised (thus encouraging economic savings).
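A minimal sketch of the heuristic flavour of such packing problems, using the classical first-fit decreasing rule for one-dimensional stock cutting (the item sizes and capacity here are illustrative):

```python
def first_fit_decreasing(items, capacity):
    """Pack items into bins of fixed capacity: sort items largest first,
    place each into the first bin with enough room, opening a new bin
    only when none fits."""
    remaining, packing = [], []
    for size in sorted(items, reverse=True):
        for i, room in enumerate(remaining):
            if size <= room:
                remaining[i] -= size
                packing[i].append(size)
                break
        else:
            remaining.append(capacity - size)
            packing.append([size])
    return packing

packing = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
# Two stocks of length 10 suffice for these items.
```

First-fit decreasing is only a baseline; research-grade approaches layer metaheuristics or integer programming on top of such constructive rules.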

As mentioned, the group is also interested in dynamic routing problems – that is, routing problems where requirements change over time. An example is where a company receives new orders during the day and has to reroute its delivery vans to the new customers while still minimising the distance travelled.

We work on combinatorial modelling for facility location problems, e.g. to optimise placement of refuelling stations for alternative fuel vehicles in road networks. Research in information security and access control deals with location obfuscation methods for online route planning and the workflow satisfiability problem. We have experience in designing heuristic and fixed-parameter tractable algorithms for the corresponding computationally challenging problems.
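A minimal sketch of the greedy-coverage flavour of such facility location problems, choosing station sites to cover the most demand nodes; the coverage sets below are hypothetical, not taken from any real road network:

```python
def greedy_cover(candidate_sites, k):
    """Greedily pick k sites, each time choosing the site that covers
    the most demand nodes not yet covered."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(candidate_sites,
                   key=lambda s: len(candidate_sites[s] - covered))
        chosen.append(best)
        covered |= candidate_sites[best]
    return chosen, covered

# Hypothetical data: which demand nodes each candidate site can serve.
sites = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}}
chosen, covered = greedy_cover(sites, k=2)
```

The greedy rule carries a classical approximation guarantee for maximum coverage, but the problems studied by the group (e.g. with network distances or workflow constraints) typically require exact integer programming or fixed-parameter algorithms.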

Queueing systems

There is a strong Cardiff OR tradition in the study of queueing systems, with fundamental research and applications of queueing theory, simulation and game theory. A typical research project involves both analytical insights from queueing theory and game theory, and the use of computer simulation with particular applications to healthcare and transportation problems.
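For readers unfamiliar with the basic objects involved, the standard M/M/1 formulas illustrate the kind of analytical insight queueing theory provides (the rates below are illustrative):

```python
def mm1_metrics(lam, mu):
    """Steady-state performance measures for an M/M/1 queue.
    Requires lam < mu, otherwise the queue is unstable."""
    if lam >= mu:
        raise ValueError("queue is unstable: need lam < mu")
    rho = lam / mu                   # server utilisation
    return {
        "rho": rho,
        "L": rho / (1 - rho),        # mean number in system
        "W": 1 / (mu - lam),         # mean time in system
        "Wq": rho / (mu - lam),      # mean wait before service starts
    }

m = mm1_metrics(lam=2.0, mu=3.0)
assert abs(m["L"] - 2.0 * m["W"]) < 1e-12  # Little's law: L = lam * W
```

Much of the group's research concerns settings where such closed forms break down (time-dependent arrivals, priority classes, behavioural effects), which is where simulation takes over.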

Recent theoretical work has made significant progress on the transient solution of queueing systems with a variety of service mechanisms, on time-dependent queueing systems with multiple priority classes, and on novel behavioural queueing theory considering switching thresholds and service times for servers (e.g. A&E staff) that depend on system congestion (e.g. the number of patients waiting for service).

Research and application in simulation has involved discrete-event, system dynamics, agent-based, and hybrid methods. Novel research has focussed on the use of simulation models incorporating social network structures for modelling disease propagation, modelling consumer choice, and incorporating human behaviours. Applications include NHS patient choice, ambulance services, breast cancer, public health, A&E departments and critical care. Novel work on hybrid methods is exploring the feasibility and benefits of combined methodologies (such as ABS and SD), in joint work with social scientists and Public Health/NHS colleagues.
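A minimal discrete-event sketch of the simplest model of this kind, estimating mean waiting times in a single-server Markovian queue via the Lindley recursion (rates and run length are illustrative; full studies use richer tools such as Ciw):

```python
import random

def mean_wait_mm1(lam, mu, n_customers, seed=0):
    """Estimate mean queueing wait in a single-server FIFO queue by
    simulating the Lindley recursion W[n+1] = max(0, W[n] + S[n] - A[n+1])."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(mu)        # S[n]
        interarrival = rng.expovariate(lam)  # A[n+1]
        wait = max(0.0, wait + service - interarrival)
    return total / n_customers

est = mean_wait_mm1(lam=2.0, mu=3.0, n_customers=200_000)
# Theory gives Wq = rho / (mu - lam) = 2/3 for these rates.
```

The value of simulation is that the same event-driven skeleton extends to features with no closed form: priorities, time-dependent rates, balking, or agent behaviours.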

Another focus of our research in this field is on developing and supporting open source and sustainable decision support tools, such as the discrete event simulation shell Ciw. The OR group has had three Fellows of the Software Sustainability Institute: Dr Vince Knight, Dr Nikoleta Glynatsi, and Dr Geraint Palmer.

Healthcare modelling

Cardiff is renowned for its long and successful tradition of research in this field. We have a large and active group of staff and postgraduate research students working on numerous health-related topics, including planning and management of healthcare services, epidemiology, and prevention, early detection and treatment of disease.

Professor Harper is Director of the Health Modelling Centre Cymru (hmc2) and Director of the Data Innovation Research Institute. Dr Daniel Gartner is a senior lecturer match-funded by the Aneurin Bevan University Health Board, and a number of PhD students and research associates (post-doctoral researchers) are funded directly by NHS Wales and Public Health Wales. We have established an innovative researchers-in-residence programme with the Aneurin Bevan University Health Board, with joint working between the OR group and the Aneurin Bevan Continuous Improvement team.

Particular contributions include stochastic models for integrated healthcare resource systems (hospital bed capacities, theatre scheduling and workforce planning), stochastic facility location problems, conditional phase-type modelling, patient choice, combined data mining and simulation methodologies, modelling the cost-effectiveness of various strategies for preventing and screening for disease including breast cancer, colorectal cancer, HIV/AIDS and diabetic retinopathy, targeted screening programmes for Chlamydia, small world models for the dynamics of HIV infection, and novel research on healthcare behavioural modelling.
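As a small illustration of the stochastic capacity models mentioned above, the classical Erlang loss formula gives the probability that a unit with a fixed number of beds turns an arrival away; the bed count and offered load below are hypothetical:

```python
def erlang_b(servers, offered_load):
    """Blocking probability in an M/M/c/c loss system (e.g. a ward with
    c beds and offered load lam/mu), via the stable Erlang B recursion:
    B(0) = 1,  B(c) = a*B(c-1) / (c + a*B(c-1))."""
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# A hypothetical 20-bed ward facing demand worth 15 beds of offered load:
p_turned_away = erlang_b(servers=20, offered_load=15.0)
```

This kind of calculation underlies bed-capacity planning: sweeping the bed count shows how many beds are needed to keep the refusal probability below a target level.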

Our work is also contributing to global development, such as through an active EPSRC GCRF funded project to develop spatiotemporal forecasts of demand, with stochastic optimisation and simulation decision support tools, for emergency and disaster relief planning across Indonesia. It builds on novel underpinning research that uniquely captures the behaviour of queueing systems with multiple priority classes and time-dependent arrivals.

Several staff within the group are members of the European Working Group on Operational Research Applied to Health Services (ORAHS), and members of the Steering Group of the EPSRC funded Network in Healthcare Modelling and Simulation (MASHnet).

The healthcare modelling team are recipients of a Times Higher Education Award in the category of “Outstanding Contribution to Innovation and Technology” for healthcare modelling research and impact (“Maths Saves Lives!”). Several of our research students have won distinguished prizes for their work in this field, including The OR Society’s 2018 doctoral award to Dr Geraint Palmer. In 2019, Professor Paul Harper became the youngest ever recipient of the Companion of Operational Research Award, for sustained support for the development of OR and outstanding service to The OR Society and the wider OR community, particularly for his work in healthcare modelling.

Environmental modelling

Professor Owen Jones is a Board Member of the Water Research Institute and leads on its theme “Digital Solutions for Water Risk Management”. He is an investigator on the Cardiff-led H2020 DOWN2EARTH project on food and water insecurity in the Horn of Africa Drylands. Professor Jones is applying his expertise in rainfall modelling to construct a flexible simulation model capable of delivering spatial-temporal fields of rainfall at high resolution (relevant to drylands), to drive a hydrological model incorporating runoff, recharge of groundwater, soil moisture, and evaporation. The models will be sufficiently parsimonious to be fitted using reliably available data, and will be implemented efficiently so that simulations can be used to account for climatic stochasticity. Model outputs will allow predictions of soil moisture and groundwater, yielding concise information that can be used by farmers and pastoralists, NGOs and governments to mitigate the impacts of climate change on rural livelihoods.

Owen Jones (together with Kirstin Strokorb and Marie Ekström, Earth Sciences) won the 2018 Royal Statistical Society Mardia Workshop Prize, supporting interdisciplinary work on extreme weather by funding the international workshop series “Workshops on Extreme Trends in Weather”.

Finance and risk

Extending classic financial mathematics/OR research, our work on finance and risk has broadened to consider state-of-the-art technology and model frameworks that contribute both to the academic literature and theory and to practical problems coming directly from the financial industry, markets and regulators. We cover a large spectrum of finance research, including time series and stochastic modelling for finance, financial pricing theory and modelling, portfolio management, asset management, fintech, behavioural finance, financial markets and regulation, fraud detection, financial stability and reliability, financial networks, market microstructure, and trading and hedging.

In the realm of traditional mathematical finance, our main focus is on Fractal Activity Time Geometric Brownian Motion (FATGBM) models, option pricing, and stochastic pricing modelling. Developing hedging strategies for such models (e.g. FATGBM pricing) helps advance their application in industrial practice. Recent progress has been made on delta hedging for this type of pricing model.
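For context on the hedging work described above, a sketch of the classical Black–Scholes call delta under standard geometric Brownian motion; this is only the textbook baseline that FATGBM-type models generalise, not the group's FATGBM hedging results:

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_delta(spot, strike, r, sigma, tau):
    """Delta of a European call under classical Black-Scholes (GBM):
    delta = N(d1), d1 = (ln(S/K) + (r + sigma^2/2) tau) / (sigma sqrt(tau))."""
    d1 = (log(spot / strike) + (r + 0.5 * sigma ** 2) * tau) / (sigma * sqrt(tau))
    return norm_cdf(d1)

delta = bs_call_delta(spot=100.0, strike=100.0, r=0.02, sigma=0.2, tau=1.0)
```

Delta hedging then holds `delta` units of the underlying against each written call and rebalances as the inputs move; under FATGBM the analogous quantity requires the fractal activity time machinery.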

The group aims to contribute to cutting-edge research fields in finance and economics that have high importance and wide application. We also actively promote and encourage interdisciplinary research and academic-business collaborations worldwide. One highlight of the group’s research is the establishment of financial modelling with Hawkes processes, which have grown in popularity in recent years. This has enabled us to advance in the area of behavioural finance, studying problems such as herding effects, contagion of extremal returns, trading behaviour recognition and classification, financial market stability and reliability, and market microstructure. We hosted the first workshop on Hawkes processes in finance in Cardiff in 2017, and co-hosted two similar workshops in Swansea in 2018 and New York City in 2019. These have also enabled us to lead the editing of two special issues on ‘Hawkes processes in Finance’ with Quantitative Finance and The European Journal of Finance.
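A minimal sketch of how a univariate Hawkes process with exponential kernel can be simulated by Ogata's thinning algorithm; the parameters are illustrative, and real calibrations in this research area are far more involved:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate event times of a Hawkes process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
    via Ogata's thinning. Stationary when alpha/beta < 1."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < horizon:
        # Intensity just after t bounds the (decaying) intensity until
        # the next event, so it is a valid thinning envelope.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)
    return events

ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=200.0)
# Long-run mean rate is mu / (1 - alpha/beta) = 1.5 events per unit time.
```

The self-exciting term is what makes Hawkes processes natural models of herding and contagion: each trade or extreme return temporarily raises the probability of further events.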

Most recently, the group has dedicated itself to developing research in FinTech, with a strong focus on problems such as blockchain and cryptocurrency trading, AI and machine learning in finance, and social trading networks. We also examine these areas from the perspective of behavioural finance and decision theories. One example is modelling investor sentiment and financial market complexity through a new entropy-based modelling framework. All of this research emphasises scientific foundations and processes as well as economic, social and policy impact for a wide range of user groups, from the general public to finance professionals.

In the next few years, the group aims to establish itself further in the area of FinTech and to explore broader topics such as green finance, definancing networks, financial inclusion, knowledge management in complex financial systems and more. We will be leading a Special Interest Group (SIG) on FinTech under DIRI. Our hope is to continue growing strong interdisciplinary research projects, with particularly close collaborations with computer science, business and management. We will also aim to lead various innovative collaborative initiatives with world-renowned research institutions and organisations such as the NSF (National Science Foundation, USA), Innovate UK and the CFTC (Commodity Futures Trading Commission).

Head of group

Professor Paul Harper

Deputy Head of School, Professor of Operational Research

+44 (0)29 2087 6841

Academic staff

Dr Iskander Aliev

Senior Lecturer

+44 (0)29 2087 5547
Professor Maggie Chen

Senior Lecturer in Financial Mathematics

+44 (0)29 2087 5523
Dr Dafydd Evans

Lecturer in Operational Research

Welsh speaker
+44 (0)29 2087 0621
Dr Andrei Gagarin


+44 (0)29 2068 8850
Professor Daniel Gartner

Professor of Operational Research

+44 (0)29 2087 0850
Professor Owen Jones

Chair in Operational Research

+44 (0)29 2251 0253
Dr Vincent Knight


+44 (0)29 2087 5548
Dr Rhyd Lewis


+44 (0)29 2087 4856
Dr Anqi Liu

Lecturer in Financial Mathematics

+44 (0)29 2087 0908
Dr Jonathan Thompson

Admissions Tutor

+44 (0)29 2087 5524
Dr Mark Tuson


+44 (0)29 2087 4827
Dr Sarie Brice

Research Associate

Dr Henry Wilde

Research Associate


Postgraduate students

All seminars will be held virtually via Zoom and commence at 14:10 on Thursdays (unless otherwise stated).

View the seminar calendar of the Statistics and OR group.

The calendar is maintained independently by members of the research groups.

Please contact Dr Mark Tuson for more details regarding Operational Research/WIMCS lectures, and Bertrand Gauthier and Kirstin Strokorb for more details regarding Statistics lectures.



9 March 2020

Room M/0.40

Almut Veraart (Imperial College London)

Volatility estimation in time and space

The concept of (stochastic) volatility/intermittency is of central importance in many fields of science. In this talk I am going to discuss how stochastic volatility can be introduced in a stochastic model and which properties of the stochastic model have an influence on the methods available for volatility estimation. I will showcase some recent results on how stochastic volatility can be estimated in multivariate non-semimartingale settings and show some first results in extending the classical stochastic volatility concept to spatial/spatio-temporal settings.

The results presented in this talk are based on collaborations with Ole E. Barndorff-Nielsen, Fred Espen Benth, Andrea Granelli, Michele Nguyen, Riccardo Passaggeri.

2 March 2020

Ioannis Kosmidis (Warwick University)

Improved estimation of models for ordinal responses

For the estimation of cumulative link models and adjacent category models for ordinal data, we derive adjustments to the likelihood score functions, whose solution ensures an estimator with smaller asymptotic bias than the maximum likelihood estimator typically has. The form of the adjustments suggests a parameter-dependent adjustment of the multinomial counts, which in turn suggests the solution of the adjusted score equations through iterated maximum likelihood fits on adjusted counts, greatly facilitating implementation.

Like the maximum likelihood estimator, the reduced-bias estimator is found to respect the key invariance properties that make cumulative link models a good choice for the analysis of categorical data. Its additional finiteness and optimal frequentist properties, along with the adequate behaviour of related asymptotic inferential procedures, make the reduced-bias estimator attractive as a default choice for practical applications.

We will also discuss the improved estimation of the adjacent category model, which is another popular model for ordinal data, and how this can be achieved using a modification of the so-called "Poisson trick".

13 February 2020

Time: 14:10 to 15:10

Room M/2.20

Tatiana Benaglia (University of Campinas)

Bayesian Mixture Models for longitudinal data on cognition loss in elderly people

A regression mixture model to handle elderly people’s cognitive ability up to their death is presented. Cognition is measured across time with standard questionnaires from geriatrics which involve, amongst others, memory, language and reasoning issues. The output of such questionnaires is recorded as a countable and finite score. Models for Binomial response variables are discussed here. The mixture specification serves to discriminate two prevalent behaviours in the data: one group of elderly people presents cognition decline at a constant rate, whilst the other experiences a spontaneous accelerated decline at some time. The latter aspect is dealt with via random change-point nonlinear predictors. In addition, logit and complementary log-log link functions were used to model the mixture allocation with predictor variables. The study’s goal is to quantify associations between cognition loss and the diagnostics of dementias like Alzheimer’s disease, besides sociodemographic factors. The proposed model is evaluated on the database provided by the Rush University - Chicago, United States, through the Rush Memory and Aging Project from 1997 to 2016.

The talk is based on joint work with Eric Krishna, Hildete P. Pinheiro (Campinas) and Graciela Muniz-Terrera (Edinburgh).

10 February 2020

Xin Liu (University of Bath)

Diversification in Lottery-Like Features and Portfolio Pricing Discounts

I study the asset pricing implications of cumulative prospect theory on portfolio discounts. I extend Barberis and Huang (2008) and show that a portfolio consisting of lottery-like stocks should trade at a discount due to diversification. This discount can be partially mitigated if lottery-like stocks tend to produce extreme payoffs at the same time. I utilize three empirical settings to support this theoretical prediction: the closed-end fund puzzle, the announcement returns of mergers and acquisitions, and conglomerate discounts. My findings support cumulative prospect theory from an alternative perspective and provide a novel and unifying explanation for three seemingly unrelated phenomena.
27 January 2020

Dino Sejdinovic (University of Oxford)

Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings

Current meta-learning approaches focus on learning functional representations of relationships between variables, i.e. estimating conditional expectations in regression. In many applications, however, the conditional distributions cannot be meaningfully summarized solely by expectation (due to, e.g., multimodality). We introduce a novel technique for meta-learning conditional densities, which combines neural representation and noise contrastive estimation together with well-established literature in conditional mean embeddings into reproducing kernel Hilbert spaces. The method shows significant improvements over standard density estimation methods on synthetic and real-world data, by leveraging shared representations across multiple conditional density estimation tasks.
17 December 2019

Eliana Christou (UNC Charlotte, North Carolina)

Central Quantile Subspace

Quantile regression (QR) is becoming increasingly popular due to its relevance in many scientific investigations. There is a great amount of work about linear and nonlinear QR models. Specifically, nonparametric estimation of the conditional quantiles received particular attention, due to its model flexibility. However, nonparametric QR techniques are limited in the number of covariates.  Dimension reduction offers a solution to this problem by considering low-dimensional smoothing without specifying any parametric or nonparametric regression relation. Existing dimension reduction techniques focus on the entire conditional distribution. We, on the other hand, turn our attention to dimension reduction techniques for conditional quantiles and introduce a new method for reducing the dimension of the predictor X. The novelty of this paper is threefold. We start by considering a single index quantile regression model, which assumes that the conditional quantile depends on X through a single linear combination of the predictors, then extend to a multi index quantile regression model, and finally, generalize the proposed methodology to any statistical functional of the conditional distribution. The performance of the methodology is demonstrated through simulation examples and real data applications. Our results suggest that this method has a good finite sample performance and often outperforms existing methods.

Please note this talk takes place at 11:10 in room M/2.06.
9 December 2019

Ruth Misener (Imperial College)

Scoring positive semidefinite cutting planes for quadratic optimization via trained neural networks

Semidefinite programming relaxations complement polyhedral relaxations for quadratic optimization, but global optimization solvers built on polyhedral relaxations cannot fully exploit this advantage. We develop linear outer-approximations of semidefinite constraints that can be effectively integrated into global solvers for nonconvex quadratic optimization. The difference from previous work is that our proposed cuts are (i) sparser with respect to the number of nonzeros in the row and (ii) explicitly selected to improve the objective. A neural network estimator is key to our cut selection strategy: ranking each cut based on objective improvement involves solving a semidefinite optimization problem, but this is an expensive proposition at each Branch&Cut node. The neural network estimator, trained before any particular instance is solved, takes the most time-consuming computation offline by predicting the objective improvement for any cut.

This is joint work with Radu Baltean-Lugojan, Pierre Bonami, and Andrea Tramontani.

2 December 2019

Edilson Fernandes De Arruda (Rio de Janeiro / Cardiff)

Solving Markov Processes by Time Aggregation: Theory and Applications

Markov decision processes are a natural way to model sequential decisions under uncertainty and find applications in many fields, such as healthcare, renewable energy and supply chains. Unfortunately, complex problems give rise to very large state spaces (curse of dimensionality), rendering classical algorithms intractable. In this talk, I will discuss some algorithms that make use of time aggregation (embedding) to tackle the curse of dimensionality and seek optimal or sub-optimal solutions to complex systems in reasonable computational time. I will present some real-world applications to illustrate both the approach and the flexibility of Markov decision processes as a modelling tool.
25 November 2019

Theo Economou (University of Exeter)

An Advanced Hidden Markov Model for Hourly Rainfall Time Series

For hydrological applications, such as urban flood modelling, it is often important to be able to simulate sub-daily rainfall time series from stochastic models. However, modelling rainfall at this resolution poses several challenges, including a complex temporal structure including long dry periods, seasonal variation in both the occurrence and intensity of rainfall, and extreme values. We illustrate how the hidden Markov framework can be adapted to construct a compelling model for sub-daily rainfall, which is capable of capturing all of these important characteristics well. These adaptations include clone states and non-stationarity in both the transition matrix and conditional models. Set in the Bayesian framework, a rich quantification of both parametric and predictive uncertainty is available, and thorough model checking is made possible through posterior predictive analyses. Results from the model are interpretable, allowing for meaningful examination of seasonal variation and medium to long term trends in rainfall occurrence and intensity. To demonstrate the effectiveness of our approach, both in terms of model fit and interpretability, we apply the model to an 8-year long time series of hourly observations.
18 November 2019

Jack Noonan (Cardiff University)

First passage time for Slepian process with linear barrier

In 1971, L.A. Shepp found explicit formulas for the first passage probability Pr(S(t) < a for all t in [0,T] | S(0) = x), for all T > 0, where S(t) is a Gaussian process with mean 0 and covariance E S(t)S(t') = max{0, 1 - |t - t'|}. In a recent paper, we extended Shepp’s results to the more general case of piecewise-linear barriers; previously, explicit formulas for even Pr(S(t) < a + bt for all t in [0,T]) were known only for the cases b = 0 (constant barrier) or T <= 1 (short interval). In this talk, we outline applications to a change point detection problem - detecting a temporary drift change of Brownian motion. After discussing Average Run Length (ARL) approximations, we formulate very accurate approximations for the power of the test. We also investigate the performance of the test when the change in drift is permanent and compare performance to the known optimal CUSUM and Shiryaev-Roberts procedures.

11 November 2019

Enrica Pirozzi (University of Naples)

On a Fractional Ornstein-Uhlenbeck Process and its applications

The seminar is centred on a fractional Ornstein-Uhlenbeck process that is solution of a linear stochastic differential equation, driven by a fractional Brownian motion; it is also characterised by a stochastic forcing term in the drift. For such a process, mean and covariance functions will be specified, concentrating on their asymptotic behaviour. A sort of short- or long-range dependence, under specified hypotheses on the covariance of the forcing process, will be shown. Applications of this process in neuronal modelling are discussed, providing an example of a stochastic forcing term as a linear combination of Heaviside functions with random center. Simulation algorithms for the sample path of this process are also given.
4 November 2019Emma Aspland (Cardiff University)

Lung Cancer Clinical Pathway Modelling

Clinical pathways are an effective and efficient approach in standardising the progression of treatment to support patient care and facilitate clinical decision making. Our review of the related literature highlighted a need to better integrate data engineering and OR techniques with expert/domain knowledge to assist with clinical pathway discovery and formation.  Consequently, we have produced a decision support tool that facilitates expert interaction with data mining, through the application of clustering. This has involved the development of a new distance metric, modified from the Needleman-Wunsch algorithm, that considers weightings and groupings of activities as specified by an expert user.  The resulting set of pathways are then automatically translated into the basis of a discrete event simulation to model patient flows through the captured clinical pathways.

4 November 2019

Clement Twumasi (Cardiff University)

Comparative modelling of parasite population dynamics of two Gyrodactylus species  

Fully understanding host-parasite systems is challenging using experimental approaches alone, whereas mathematical models can help uncover in-depth knowledge of the infection dynamics. The current study compares the infection dynamics of two parasite species (Gyrodactylus turnbulli and Gyrodactylus bullatarudis) across three host populations (ornamental, Lower Aripo and Upper Aripo fish) by developing a continuous-time Markov chain (CTMC) model. The model simulates the movement of parasites for two age groups over the external surfaces (eight body regions) of a fish over a 17-day infection period, with population carrying capacity dependent on host size and the area of the body regions. The model was parameterised by the birth, death and movement rates of young and older parasites, in the presence or absence of the host’s immune response. Host death was assumed to occur at a rate proportional to the total number of parasites on the fish. The CTMC simulation model was fitted using a novel weighted-iterative Approximate Bayesian Computation (ABC). The findings from this study should help policy makers and biologists to better understand the Gyrodactylus-fish system using mathematical models and inform management decisions for the control of gyrodactylid infections.

21 October 2019

Tri-Dung Nguyen (University of Southampton)

Game of Banks – Keeping Free ATMs Alive?

The LINK ATM network is a fundamental part of the UK's payments infrastructure, with nearly 62,000 ATMs, and cash machines are by far the most popular channel for cash withdrawal in the UK, used by millions of consumers every week. Daily records in 2019 include 10.7 million ATM transactions (29 March) and over half a billion pounds paid out by ATMs (28 June).

The UK's cash machine network is special in that most machines are currently free of charge. Underlying this key feature is the arrangement among the banks and cash machine operators to settle the fees among themselves instead of putting the burden on consumers' shoulders. The ATM network in the UK has recently, however, been experiencing many issues, as some members are unhappy with the mechanism for interchange fee sharing. In this talk, we show how game theory, in particular a combination of the mathematical models developed by John Nash and Lloyd Shapley, two Nobel laureates in Economics, can help resolve the current ATM crisis.

We present a novel `coopetition' game theoretic model for banks to optimally invest in the ATM network and to share the cost. This coopetition game includes both a cooperative game theory framework as the mechanism for interchange fee sharing and a non-cooperative counterpart to model the fact that banks also wish to maximize their utilities. We show that the current mechanism for sharing is unstable, which explains why some members are threatening to leave. We also show that, under some settings, the Shapley allocation belongs to the core and hence it is not only fair to all members but also leads to a stable ATM network. We prove the existence of a pure Nash equilibrium, which can be computed efficiently. In addition, we show that the Shapley value allocation dominates the current mechanism in terms of social welfare. Finally, we provide numerical analysis and managerial insights through a case study using real data on the complete UK ATM network.
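The Shapley allocation at the heart of the sharing mechanism can be computed directly for a small game by averaging marginal contributions over all orderings. The sketch below uses a hypothetical three-bank cost function, not the real UK ATM data.

```python
from itertools import permutations

def shapley(players, value):
    """Shapley value of a cooperative game: average each player's
    marginal contribution over all join orders (fine for a handful
    of banks; exponential in general)."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += value(with_p) - value(coalition)
            coalition = with_p
    return {p: phi[p] / len(orders) for p in phi}

# Toy 3-bank cost-sharing game (illustrative numbers): running the
# shared ATM network jointly is cheaper than running it separately.
costs = {frozenset(): 0, frozenset("A"): 6, frozenset("B"): 6,
         frozenset("C"): 6, frozenset("AB"): 10, frozenset("AC"): 10,
         frozenset("BC"): 10, frozenset("ABC"): 12}
```

Because this toy game is symmetric, each bank is allocated a third of the grand-coalition cost; in an asymmetric game the allocation reflects each member's marginal contribution, which is the fairness property the talk exploits.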

14 October 2019

Ruth King (University of Edinburgh)

Challenges of quantity versus complexity for ecological data

Capture-recapture data are often collected on animal populations to obtain insight into the given species and/or ecosystem. Long-term datasets combined with new technology for observing individuals are producing larger capture-recapture datasets – for example, repeated observations on >10,000 individuals are becoming increasingly common. Simultaneously, increasingly complex models are being developed to more accurately represent the underlying biological processes, permitting a more intricate understanding of the system. However, fitting these more complex models to large datasets can become computationally very expensive. We propose a two-step Bayesian approach: (i) fit the given capture-recapture model to a smaller subsample of the data; and then (ii) “correct” the posterior obtained so that it is (approximately) from the posterior distribution of the complete sample. As a feasibility study, we apply this two-step approach to data from a colony of guillemots, in which approximately 30,000 individuals are observed within the capture-recapture study, and investigate the performance of the algorithm.

7 October 2019

George Loho (LSE)

To be confirmed

30 September 2019

Rajen Shah (University of Cambridge)

RSVP-graphs: Fast High-dimensional Covariance Matrix Estimation Under Latent Confounding

We consider the problem of estimating a high-dimensional p × p covariance matrix S, given n observations of confounded data with covariance S + GG^T, where G is an unknown p × q matrix of latent factor loadings. We propose a simple and scalable estimator based on the projection onto the right singular vectors of the observed data matrix, which we call RSVP. Our theoretical analysis of this method reveals that in contrast to approaches based on removal of principal components, RSVP is able to cope well with settings where the smallest eigenvalue of G^T G is relatively close to the largest eigenvalue of S, as well as when the eigenvalues of G^T G are diverging fast. RSVP does not require knowledge or estimation of the number of latent factors q, but only recovers S up to an unknown positive scale factor. We argue this suffices in many applications, for example if an estimate of the correlation matrix is desired. We also show that by using subsampling, we can further improve the performance of the method. We demonstrate the favourable performance of RSVP through simulation experiments and an analysis of gene expression datasets collated by the GTEx consortium.

22 August 2019

Time: 11:10 to 12:00

Room M/2.06

Dr. Mofei Jia (Xi'an Jiaotong-Liverpool University, China)

Curbing the Consumption of Positional Goods: Behavioural Interventions versus Taxation

Little is known about whether behavioural techniques, such as nudges, can serve as effective policy tools to reduce the consumption of positional goods. We study a game in which individuals are embedded in a social network and compete for a positional advantage with their direct neighbours by purchasing a positional good. In a series of experiments, we test four policy interventions to curb the consumption of the positional good. We manipulate the type of intervention (either a nudge or a tax) and the number of individuals exposed to the intervention (either the most central network node or the entire network). We illustrate that both the nudge and the tax can serve as effective policy instruments to combat positional consumption if the entire network is exposed to the intervention. Nevertheless, taxing or nudging the most central network node does not seem to be equally effective, owing to the absence of spillover effects from the centre to the other nodes. As for the mechanism through which the nudge operates, our findings are consistent with an explanation whereby nudging increases the psychological cost of positional consumption.

18 July 2019

Time: 11:10 to 12:00

Room M/2.06

Nina Golyandina (St. Petersburg State University)

Detecting signals by Monte Carlo singular spectrum analysis: the problem of multiple testing

The statistical approach to detecting a signal in a noisy series is considered in the framework of Monte Carlo singular spectrum analysis (MC-SSA). This approach includes techniques to control both type I and type II errors and to compare criteria. For simultaneous testing of multiple frequencies, a multiple version of MC-SSA is suggested to control the family-wise error rate.
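As a hedged illustration of family-wise error control in simultaneous testing (the multiple MC-SSA of the talk is considerably more refined), the classical Šidák correction picks a per-test significance level so that m independent tests jointly achieve the desired family-wise error rate:

```python
def sidak_threshold(alpha, m):
    """Per-test significance level for m independent tests so that
    the family-wise error rate is exactly alpha (Sidak correction):
    solves 1 - (1 - a)**m == alpha for a."""
    return 1 - (1 - alpha) ** (1 / m)
```

For instance, with m = 16 tested frequencies and alpha = 0.05 this gives roughly 0.0032 per test, marginally less conservative than Bonferroni's 0.05/16 = 0.003125.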

1 July 2019

Room M/0.40

Dr. Joni Virta (Aalto University)

Statistical properties of second-order tensor decompositions

Two classical tensor decompositions are considered from a statistical viewpoint: the Tucker decomposition and the higher order singular value decomposition (HOSVD). Both decompositions are shown to be consistent estimators of the parameters of a certain noisy latent variable model. The decompositions' asymptotic properties allow comparisons between them. Inference for the true latent dimension is also discussed. The theory is illustrated with examples.

8 April 2019

Dr. Andreas Anastasiou (LSE)

Detecting multiple generalized change-points by isolating single ones

In this talk, we introduce a new approach, called Isolate-Detect (ID), for the consistent estimation of the number and location of multiple generalized change-points in noisy data sequences. Examples of signal changes that ID can deal with are changes in the mean of a piecewise-constant signal and changes in the trend, accompanied by discontinuities or not, in the piecewise-linear model. The method is based on an isolation technique, which prevents the consideration of intervals that contain more than one change-point. This isolation enhances ID's accuracy, as it allows for detection in the presence of frequent changes of possibly small magnitudes. Thresholding and model selection through an information criterion are the two stopping rules described in the talk. A hybrid of both criteria leads to a general method with very good practical performance and minimal parameter choice. Applications of our method on simulated and real-life data sets show its very good performance in both accuracy and speed. The R package IDetect implementing the Isolate-Detect method is available from CRAN.
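ID itself is available via the R package IDetect; as a generic illustration of the shared ingredients (a CUSUM contrast statistic plus a threshold stopping rule), here is a plain binary-segmentation detector for changes in the mean. ID's isolation technique is precisely what this simple scheme lacks.

```python
import math

def cusum(x, s, e):
    """Max absolute CUSUM contrast on x[s:e] and its split point."""
    n = e - s
    total = sum(x[s:e])
    best, best_b = 0.0, None
    left = 0.0
    for b in range(s + 1, e):        # candidate change after index b-1
        left += x[b - 1]
        k = b - s
        stat = abs(math.sqrt((e - b) / (n * k)) * left
                   - math.sqrt(k / (n * (e - b))) * (total - left))
        if stat > best:
            best, best_b = stat, b
    return best, best_b

def binseg(x, threshold, s=0, e=None):
    """Binary segmentation: split recursively while the CUSUM
    contrast exceeds the threshold; returns change-point locations."""
    if e is None:
        e = len(x)
    if e - s < 2:
        return []
    stat, b = cusum(x, s, e)
    if b is None or stat <= threshold:
        return []
    return sorted(binseg(x, threshold, s, b) + [b]
                  + binseg(x, threshold, b, e))
```

On a noiseless step signal this recovers the single break exactly; the threshold plays the role of the first of the two stopping rules mentioned above.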

1 April 2019

Stephen Disney (Cardiff University)

When the Bullwhip Effect is an Increasing Function of the Lead Time

We study the relationship between lead times and the bullwhip effect produced by the order-up-to policy. The usual conclusion in the literature is that longer lead times increase the bullwhip effect; we show that this is not always the case. Indeed, it seems to be rather rare. We achieve this by first showing that a positive demand impulse response leads to a bullwhip effect that is always increasing in the lead time when the order-up-to policy is used to make supply chain inventory replenishment decisions. By using the zeros and poles of the z-transform of the demand process, we reveal when this demand impulse response is positive. To make our approach concrete in a nontrivial example, we study the ARMA(2,2) demand process.
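The basic phenomenon is easy to reproduce by simulation. Assuming AR(1) demand with MMSE forecasting (a standard textbook setting, simpler than the ARMA(2,2) case treated in the talk), the order-up-to policy places orders o_t = d_t + c (d_t - d_{t-1}) with c = rho (1 - rho^{L+1}) / (1 - rho), and the bullwhip ratio Var(orders)/Var(demand) can exceed or fall below one depending on the demand autocorrelation:

```python
import random

def bullwhip_ratio(rho, lead_time=2, n=100_000, seed=0):
    """Simulated bullwhip ratio Var(orders)/Var(demand) for an
    order-up-to policy with MMSE forecasting of AR(1) demand
    d_t = rho*d_{t-1} + eps_t (textbook setting, illustrative)."""
    rng = random.Random(seed)
    c = rho * (1 - rho ** (lead_time + 1)) / (1 - rho)
    d = 0.0
    demands, orders = [], []
    for _ in range(n):
        d_new = rho * d + rng.gauss(0.0, 1.0)
        orders.append(d_new + c * (d_new - d))  # order-up-to order
        demands.append(d_new)
        d = d_new
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return var(orders) / var(demands)
```

Positively correlated demand amplifies order variance (bullwhip), while negatively correlated demand can smooth it below one, consistent with the talk's point that the textbook conclusion is not universal.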

22 March 2019

Martina Testori (University of Southampton)

How group composition affects cooperation in fixed networks: can psychopathic traits influence group dynamics?

Static networks have been shown to foster cooperation for specific cost-benefit ratios and numbers of connections across a series of interactions. At the same time, psychopathic traits have been found to predict defecting behaviour in game theory scenarios. This experiment combines these two aspects to investigate how group cooperation can emerge when changing group compositions based on psychopathic traits. We implemented a modified version of the Prisoner's Dilemma game which has been demonstrated theoretically and empirically to sustain a constant level of cooperation over rounds. A sample of 190 undergraduate students played in small groups where the percentage of psychopathic traits in each group was manipulated. Groups entirely composed of low psychopathic individuals were compared to communities with 50% high and 50% low psychopathic players, to observe the behavioural differences at the group level. Results showed a significant divergence of the mean cooperation of the two conditions, regardless of the small range of participants' psychopathy scores. Groups with a large density of high psychopathic subjects cooperated significantly less than groups entirely composed of low psychopathic players, confirming our hypothesis that psychopathic traits affect not only individuals' decisions but also the group behaviour. This experiment highlights how differences in group composition with respect to psychopathic traits can have a significant impact on group dynamics, and it emphasizes the importance of individual characteristics when investigating group behaviours.


Joe Paat (ETH Zurich)

The proximity function for IPs

Proximity between an integer program (IP) and a linear program (LP) measures the distance between an optimal IP solution and the closest optimal LP solution. In this talk, we consider proximity as a function that depends on the right-hand-side vector of the IP and LP. We analyse how this proximity function is distributed and establish a spectrum of probabilistic results regarding its value. This work uses ideas from group theory and Ehrhart theory, and it improves upon a recent result of Eisenbrand and Weismantel in the average case. This is joint work with Timm Oertel and Robert Weismantel.



Prof Philip Broadbridge (La Trobe University)

Shannon entropy as a diagnostic tool for PDEs in conservation form

After normalization, an evolving real non-negative function may be viewed as a probability density. From this we may derive the corresponding evolution law for Shannon entropy. Parabolic equations, hyperbolic equations and fourth-order “diffusion” equations evolve information in quite different ways. Entropy and irreversibility can be introduced in a self-consistent manner and at an elementary level by reference to some simple evolution equations such as the linear heat equation. It is easily seen that the 2nd law of thermodynamics is equivalent to loss of Shannon information when temperature obeys a general nonlinear 2nd order diffusion equation. With the constraint of prescribed variance, this leads to the central limit theorem.

With fourth order diffusion terms, new problems arise. We know from applications such as thin film flow and surface diffusion that fourth order diffusion terms may generate ripples and need not satisfy the Second Law. Despite this, we can identify the class of fourth order quasilinear diffusion equations that increase the Shannon entropy.
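The entropy growth for the linear heat equation is easy to check numerically. The sketch below (an illustrative discretisation, not from the talk) evolves a normalised density under u_t = u_xx with zero-flux boundaries and evaluates its discretised Shannon entropy, which should only increase:

```python
import math

def entropy(p, dx):
    """Discretised Shannon (differential) entropy of a density p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0) * dx

def heat_step(u, dx, dt):
    """One explicit Euler step of u_t = u_xx with zero-flux ends
    (stable for dt < dx**2 / 2); mass is conserved."""
    n = len(u)
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else u[i]
        right = u[i + 1] if i < n - 1 else u[i]
        new[i] = u[i] + dt * (left - 2 * u[i] + right) / dx ** 2
    return new
```

Starting from a narrow normalised bump and stepping forward, the entropy rises monotonically, mirroring the statement that the 2nd law corresponds to loss of Shannon information under diffusion.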

4 March 2019

Dr. Emrah Demir (Cardiff Business School)

Creating Green Logistics Value through Operational Research

Green logistics is concerned with producing and dispatching goods in a sustainable way, paying attention to environmental factors. In a green context, the objectives are not only based on economic considerations, but also aim at minimising other detrimental effects on society and on the environment. A conventional focus in planning the associated activities, particularly for freight transportation, is to reduce expenses and, consequently, increase profitability by considering internal transportation costs. With ever-growing concern about the environment from governments, markets, and other private entities worldwide, organisations have started to realise the importance of the environmental and social impacts of transportation on other parties and on society.

Efficient planning of freight transportation activities requires a comprehensive look at a wide range of factors in the operation and management of transportation to achieve safe, fast, and environmentally suitable movement of goods. Over the years, the minimisation of the total travelled distance has been accepted as the most important objective in the field of vehicle routing and intermodal transportation. However, the interaction of operational research with mechanical and traffic engineering shows that there exist other factors which are critical to explaining fuel consumption. This triggered the birth of green vehicle routing and green intermodal studies in operational research. In recent years, the number, quality and flexibility of the models have increased considerably. This talk will discuss green vehicle routing and green intermodal transportation problems, along with models and algorithms which truly represent the characteristics of green logistics.


Oded Lachish (Birkbeck, University of London)

Smart queries versus property-independent queries

In the area of property testing, a central goal is to design algorithms, called tests, that decide, with high probability, whether a word over a finite alphabet is in a given property or far from the property. A property is a subset of all the possible words over the alphabet. For instance, the word can be a book, and the property can be the set of all the books that are written in English: a book is 0.1-far from being written in English if at least 0.1 of its words are not in English. The value 0.1 is called the distance parameter, and it can be any value in [0,1]. The input of a test is the distance parameter, the length of the input word, and access to an oracle that answers queries of the sort: please give me the i-th letter of the word.

The quality of a test is measured by its query complexity, which is the maximum number of queries it uses as a function of the input word length and the distance parameter; ideally this number does not depend on the input length. Tests that achieve this ideal have been discovered for numerous properties. In general, tests that achieve the ideal for different properties differ in the manner in which they select their queries. That is, the choice of queries depends on the property.

In this talk, we will see that for the price of a significant increase in the number of queries it is possible to get rid of this dependency. We will also give scenarios in which this trade-off is beneficial.

18 February 2019 (Time 13:10 - 14:00)

Prof. Giles Stupfler (University of Nottingham)

Asymmetric least squares techniques for extreme risk estimation

Financial and actuarial risk assessment is typically based on the computation of a single quantile (or Value-at-Risk). One drawback of quantiles is that they only take into account the frequency of an extreme event, and in particular do not give an idea of what the typical magnitude of such an event would be. Another issue is that they do not induce a coherent risk measure, which is a serious concern in actuarial and financial applications. In this talk, I will explain how, starting from the formulation of a quantile as the solution of an optimisation problem, one may come up with two alternative families of risk measures, called expectiles and extremiles. I will give a broad overview of their properties, as well as of their estimation at extreme levels in heavy-tailed models, and explain why they constitute sensible alternatives for risk assessment using some real data applications. This is based on joint work with Abdelaati Daouia, Irène Gijbels and Stéphane Girard.
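Expectiles are straightforward to compute on a sample. The sketch below (illustrative, not the speaker's code) finds the tau-expectile as the fixed point of an asymmetrically weighted mean, which is the first-order condition of the asymmetric least squares problem:

```python
def expectile(xs, tau, tol=1e-10, max_iter=1000):
    """tau-expectile of a sample by asymmetric least squares:
    minimises tau*(x - e)**2 over x > e plus (1 - tau)*(e - x)**2
    over x <= e, via the weighted-mean fixed-point iteration."""
    e = sum(xs) / len(xs)
    for _ in range(max_iter):
        w = [tau if x > e else 1 - tau for x in xs]
        e_new = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
        if abs(e_new - e) < tol:
            return e_new
        e = e_new
    return e
```

At tau = 0.5 the expectile is just the sample mean; as tau approaches 1 it moves toward the upper tail, which is why extreme expectiles serve as quantile-like risk measures that also reflect event magnitude.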

21 January 2019

Stefano Coniglio (University of Southampton)

Bilevel programming and the computation of pessimistic single-leader-multi-follower equilibria in Stackelberg games

We give a very broad overview of bilevel programming problems and their relationship with Stackelberg games, with a focus on two classical limitations of this paradigm: the presence of a single follower and the assumption of optimism.
We then investigate the problem of computing an equilibrium in Stackelberg games with two or more noncooperating followers who react to the strategy chosen by the leader by playing a Nash equilibrium, focusing in particular on the pessimistic case where, if the followers' game (parameterised by the leader's strategy) admits more than one Nash equilibrium, the followers choose one which minimises the leader's utility.

We then address the case where the followers are restricted to pure strategies, illustrate some hardness and inapproximability results, and then concentrate on exact solution algorithms.

After proposing a single-level (but undecidable) reformulation of the problem, we propose an exact implicit enumeration algorithm capable of computing the supremum of the problem as well as an alpha-approximate strategy, for any nonnegative alpha.

Experimental results are presented, illustrating the viability of our approach.

11 December 2018

Anatoly Zhigljavsky (Cardiff University)

Multivariate dispersion

3 December 2018

Dr Ilaria Prosdocimi  (University of Bath)

Detecting coherent changes in flood risk in Great Britain

Flooding is a natural hazard which has affected the UK throughout history, with significant costs both for the development and maintenance of flood protection schemes and for the recovery of the areas affected by flooding. The recent large repeated floods in Northern England and other parts of the country raise the question of whether the risk of flooding is changing, possibly as a result of climate change, so that different strategies would be needed for the effective management of flood risk. To assess whether any change in flood risk can be identified, one would typically investigate the presence of changing patterns in peak flow records for each station across the country. Nevertheless, the coherent detection of any clear pattern in the data is hindered by the limited sample size of the peak flow records, which typically cover about 45 years. We investigate the use of multi-level hierarchical models to better use the information available at all stations in a single model which can detect the presence of any sizeable change in peak flow behaviour at a larger scale. Further, we also investigate the possibility of attributing any detected change to naturally varying climatological variables.


Prof Benjamin Gess (Max Planck Institute Leipzig)

Random dynamical systems for stochastic PDE with nonlinear noise

In this talk we will revisit the problem of generation of random dynamical systems by solutions to stochastic PDE. Despite being at the heart of a dynamical systems approach to stochastic dynamics in infinite dimensions, most known results are restricted to stochastic PDE driven by affine linear noise, which can be treated via transformation arguments. In contrast, in this talk we will address instances of stochastic PDE with nonlinear noise, with particular emphasis on porous media equations driven by conservative noise. This class of stochastic PDE arises, in particular, in the analysis of stochastic mean curvature motion and mean field games with common noise, and is linked to fluctuations in non-equilibrium statistical mechanics.

Past events

Past Seminars 2017-18

Past Seminars 2016-17

Past Seminars 2015-16