Operational Research Group

Our group has an impressive track record of contributing both to the theoretical aspects of the subject and to its applications, including work on complex problems arising in healthcare planning, epidemiology, transportation, timetabling, manufacturing, green logistics and the scheduling of sporting fixtures.

We now have over 20 academic staff and research students and host a thriving seminar series jointly with the Statistics research group.

Networks

  • We are a partner group within the LANCS initiative, a collaboration of four universities with a total investment of £13m (£5.4m from EPSRC) to support the development of research capacity in OR.
  • We contribute to the EPSRC-funded National Taught-course Centre in Operational Research (NATCOR), which provides in-depth training for PhD students in OR.
  • Within the Wales Institute of Mathematical and Computational Sciences (WIMCS), the Cardiff Operational Research group leads the OR and Statistics cluster and organises pan-Wales and UK-wide events (Professor Harper is the OR and Statistics cluster co-ordinator).
  • We help to organise meetings of the South Wales Operational Research Discussion Society (SWORDS).
  • We are a partner in the Centre for Transport Network Optimisation, which develops new, highly efficient approaches for public transport network design (Dr Rhyd Lewis is our link member).
  • All of the OR staff are highly active in international collaborations, with strong links, for example, to the Federal University of Rio de Janeiro (Brazil), Twente University (Netherlands), the University of California Davis (US), the University of Toronto (Canada), the University of Vienna (Austria), the Karlsruhe Institute of Technology (Germany) and Monash University (Australia).

Health modelling

Professor Harper is Director of Health Modelling Centre Cymru (hmc2), which fosters collaboration across different research areas of the mathematical and computational sciences to create a more vibrant and effective interface between the mathematical research community, the medical research community, NHS Wales, the Welsh Government and the health industry.

The main areas of research within the group are:

Planning and optimisation

The planning and optimisation group is involved in the design and application of mathematical optimisation techniques to real-life problems, particularly in the areas of scheduling and packing. These techniques can be used to improve efficiency and reduce waste in the logistical operations of companies and government agencies.

One vibrant area of research in this group concerns the problem of timetabling. Universities, for example, periodically face the burden of scheduling exams and lectures so that a variety of complex, and often conflicting, constraints are met. Members of the group have previously designed methods for such problems and were also involved in the organisation of the Second International Timetabling Competition in 2006-7, which allowed researchers from across the globe to design and test their algorithms on real-life problems in a competitive environment. The resources generated from this competition continue to stimulate new work by providing a useful access point into the field.
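As a purely illustrative sketch (a toy model, not the group's own methods): exam timetabling can be viewed as graph colouring, with exams as vertices, an edge between any two exams sharing a student, and timeslots as colours. The data and the greedy rule below are assumptions chosen only to show the idea.

    # Minimal illustration: exam timetabling as graph colouring.
    # Exams sharing students must get different timeslots (colours).
    # Hypothetical data; real instances add room capacities, spread constraints, etc.

    conflicts = {              # exam -> set of exams sharing at least one student
        "Maths":     {"Physics", "Stats"},
        "Physics":   {"Maths", "Chemistry"},
        "Stats":     {"Maths"},
        "Chemistry": {"Physics"},
    }

    def greedy_timetable(conflicts):
        """Assign each exam the lowest-numbered timeslot not used by a conflicting exam."""
        slot = {}
        # Order exams by decreasing number of conflicts (largest-degree-first heuristic).
        for exam in sorted(conflicts, key=lambda e: -len(conflicts[e])):
            used = {slot[other] for other in conflicts[exam] if other in slot}
            slot[exam] = next(t for t in range(len(conflicts)) if t not in used)
        return slot

    print(greedy_timetable(conflicts))
    # e.g. {'Maths': 0, 'Physics': 1, 'Stats': 1, 'Chemistry': 0}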

Members of the group are also active in the area of sports timetabling, where the aim is to produce schedules that are fair to all teams and that also satisfy constraints regarding pitch availability, television requirements etc. The group has previously worked with the International Rugby Board and the Welsh Rugby Union and has used metaheuristic search techniques to produce schedules for the 1999 Rugby World Cup, Welsh domestic rugby leagues, and all international rugby fixtures over a 12-year period.

The group has also published widely in the field of partitioning problems. Such problems arise regularly in industry, transportation and logistics, and include multi-dimensional packing and balancing problems, stock cutting problems, rostering problems and problems from graph theory. Stock cutting problems, for example, arise in areas such as the clothing and building industries, where the aim is to cut a set of predefined, possibly multi-dimensional items from a set of equi-dimensioned “stocks” such that wastage is minimised (thus reducing costs). Previous research by the group has produced methods achieving state-of-the-art results on popular benchmark problems (some of which originate from real-world industrial processes), as well as the analysis and solution of new cutting problems provided to us by industrial partners.
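For readers unfamiliar with stock cutting, the sketch below shows a classic first-fit decreasing heuristic for the one-dimensional case; the item lengths and stock length are invented for illustration, and the group's published methods are considerably more sophisticated.

    # Toy first-fit decreasing heuristic for one-dimensional stock cutting:
    # cut the requested item lengths from as few fixed-length stocks as possible.

    def first_fit_decreasing(items, stock_length):
        """Return a list of stocks, each a list of item lengths cut from it."""
        stocks = []
        for item in sorted(items, reverse=True):
            for stock in stocks:
                if sum(stock) + item <= stock_length:
                    stock.append(item)        # item fits in an already-opened stock
                    break
            else:                             # no existing stock has room: open a new one
                stocks.append([item])
        return stocks

    cuts = first_fit_decreasing([7, 5, 4, 3, 3, 2, 2], stock_length=10)
    print(len(cuts), cuts)                    # 3 stocks: [[7, 3], [5, 4], [3, 2, 2]]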

Finally, the group is also investigating dynamic routing problems – that is, routing problems where requirements change over time. An example is where a company receives new orders during the day and has to re-route delivery vans to the new customers while still minimising the distance travelled. High-quality solutions have been achieved using ant colony optimisation, and our methods have also been applied to large-scale static problems in order to divide them into more manageable parts.
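A minimal sketch of the re-routing step described above (using a simple cheapest-insertion rule rather than ant colony optimisation; the coordinates are invented for illustration):

    # Rough sketch of re-routing when a new order arrives mid-route:
    # insert the new customer where it adds the least extra travel distance.

    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def cheapest_insertion(route, new_stop):
        """Insert new_stop between the consecutive pair of stops that minimises added distance."""
        best_pos, best_extra = None, float("inf")
        for i in range(len(route) - 1):
            extra = (dist(route[i], new_stop) + dist(new_stop, route[i + 1])
                     - dist(route[i], route[i + 1]))
            if extra < best_extra:
                best_pos, best_extra = i + 1, extra
        return route[:best_pos] + [new_stop] + route[best_pos:]

    depot = (0, 0)
    route = [depot, (2, 1), (5, 2), depot]     # current plan for one delivery van
    print(cheapest_insertion(route, (3, 3)))   # re-planned route including the new order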

Queueing systems

There is a strong Cardiff OR tradition in the study of queueing systems, with applications of queueing theory, simulation and probability theory to practical problems. A typical research project involves both analytical insights from queueing theory and the use of computer simulation, and a number of PhD students are working in this area with particular applications to healthcare, transportation and telecommunications problems.

Queueing studies have focussed on bulk service queues and time-dependent queueing models, including research projects at Gatwick Airport, the Severn Bridge, the Channel Tunnel, and healthcare services (including intensive care units, operating theatres and ambulance services). Recent theoretical work has made significant progress with the transient solution of queueing systems with a variety of service mechanisms (Prof. Jeff Griffiths and Dr Janet Williams), and a number of research contracts and consultancies, dating back as far as 1975, have been awarded by the Transport Research Laboratory, the Suez Canal Authority, BP Oil Ltd, the Department of Transport, the Research Councils and others. Projects have been undertaken relating to delays to pedestrians and vehicles at pedestrian crossings, accident analyses within computer-controlled signal networks, facilities provided at motorway roadworks, toll systems, and the advantages of flared junctions at traffic signals.

Research and applications in simulation have involved discrete-event, system dynamics, agent-based, Monte Carlo and hybrid methods. Novel research has focussed on the use of simulation models incorporating small-world theory for modelling disease propagation (Dr Israel Vieira), modelling consumer choice (Dr Vince Knight) and incorporating human behaviour (Prof. Paul Harper). Applications include NHS patient choice, HIV/AIDS, ambulance services, breast cancer, A&E departments and critical care. Novel work on hybrid methods is exploring the feasibility and benefits of combined methodologies (such as DES and SD) and of working with social scientists.
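For orientation, a bare-bones discrete-event simulation of a single-server queue looks like the sketch below; real healthcare models add multiple resources, patient classes and time-dependent demand. The arrival and service rates are illustrative assumptions.

    # Minimal discrete-event simulation of a single-server (M/M/1) queue.

    import random

    def simulate_mm1(arrival_rate, service_rate, n_customers, seed=1):
        """Return the mean waiting time (time spent in the queue) over n_customers."""
        random.seed(seed)
        clock = 0.0             # arrival time of the current customer
        server_free_at = 0.0    # time at which the server next becomes idle
        total_wait = 0.0
        for _ in range(n_customers):
            clock += random.expovariate(arrival_rate)        # next arrival
            start_service = max(clock, server_free_at)
            total_wait += start_service - clock              # time spent queueing
            server_free_at = start_service + random.expovariate(service_rate)
        return total_wait / n_customers

    print(simulate_mm1(arrival_rate=0.8, service_rate=1.0, n_customers=100_000))
    # Should be close to the theoretical M/M/1 mean wait rho/(mu - lambda) = 0.8/0.2 = 4.0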

Probabilistic methods are being applied to the modelling of telecommunication systems and opportunistic networks (Dr Dafydd Evans), which consist of mobile nodes equipped with short-range wireless communication devices. Information is dispersed both by wireless transmission between the participating nodes and by the movement of the nodes themselves. For example, a source node located at a railway station transmits a message to people passing nearby, who then disseminate the message across the local area. Fixed nodes are strategically placed throughout the area to act as message repositories. Dr Evans is developing probabilistic models of opportunistic networks, and using these to derive theoretical performance bounds for this type of network. Network performance statistics can involve concepts at the network level (e.g. the number of connected components), at the neighbourhood level (e.g. the number of nodes within transmission range) or at the node level (e.g. the number of messages waiting to be relayed).
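As a toy illustration of the network-level statistics mentioned above (not Dr Evans's models), one can scatter nodes at random, link any pair within transmission range, and count connected components; the node count and ranges below are arbitrary assumptions.

    # Toy calculation of one network-level statistic for an opportunistic network:
    # drop nodes uniformly at random in a unit square, link any pair within
    # transmission range, and count connected components.

    import random

    def connected_components(n_nodes, tx_range, seed=0):
        random.seed(seed)
        pts = [(random.random(), random.random()) for _ in range(n_nodes)]
        parent = list(range(n_nodes))          # union-find over directly connected nodes
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        for i in range(n_nodes):
            for j in range(i + 1, n_nodes):
                dx, dy = pts[i][0] - pts[j][0], pts[i][1] - pts[j][1]
                if dx * dx + dy * dy <= tx_range ** 2:
                    parent[find(i)] = find(j)
        return len({find(i) for i in range(n_nodes)})

    print(connected_components(n_nodes=200, tx_range=0.05))   # many small components
    print(connected_components(n_nodes=200, tx_range=0.20))   # likely one giant component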

Healthcare modelling

Cardiff is renowned for its long and successful tradition of research in this field. We have a large and active group of staff and postgraduate research students working on numerous health-related topics, including the planning and management of healthcare services, epidemiology, and the prevention, early detection and treatment of disease. Professor Harper is also Director of Health Modelling Centre Cymru (hmc2). A number of PhD students and Research Associates (post-doctoral researchers) are funded directly by Local Health Boards. An exciting recent initiative is the creation of a Mathematical Modelling Unit, funded by the Aneurin Bevan University Health Board, with a joint lectureship and three research associates working between the OR group and the Health Board within the Aneurin Bevan Continuous Improvement team.

Research projects typically comprise a mixture of theoretical and practical investigations, and many projects have been funded by external organisations, including various funding councils, the Department of Health, the NHS Information Centre, NHS Trusts and Primary Care Trusts.

Particular contributions include stochastic models for integrated healthcare resource systems (hospital bed capacities, theatre scheduling and workforce planning), stochastic facility location problems, conditional phase-type modelling, patient choice, combined data mining and simulation methodologies, modelling the cost-effectiveness of various strategies for preventing and screening for disease including breast cancer, colorectal cancer, HIV/AIDS and diabetic retinopathy, targeted screening programmes for Chlamydia, small world models for the dynamics of HIV infection, and novel research on healthcare behavioural modelling.

Several staff within the group are members of the European Working Group on Operational Research Applied to Health Services (ORAHS) and of the Steering Group of the EPSRC-funded Network in Healthcare Modelling and Simulation (MASHnet). Prof. Harper's work on screening for Chlamydia was awarded the OR Society's 2006 Goodeve Medal for the best paper published in the Journal of the Operational Research Society. The 2011 ORAHS international conference was held in Cardiff (organising team: Paul Harper, Janet Williams, Vince Knight and Israel Vieira). Recent PhD graduate Richard Wood (modelling of rehabilitation services using queueing theory and scheduling techniques) won the UK OR Society's best PhD prize.

Head of group


Professor Paul Harper

Deputy Head of School, Professor of Operational Research

Email: harper@cardiff.ac.uk
Telephone: +44 (0)29 2087 6841

Academic staff


Dr Iskander Aliev

Senior Lecturer

Email: alievi@cardiff.ac.uk
Telephone: +44 (0)29 2087 5547

Dr Maggie Chen

Senior Lecturer in Financial Mathematics

Email: chenj60@cardiff.ac.uk
Telephone: +44 (0)29 2087 5523

Dr Tracey England

Research Associate

Email: englandtj@cardiff.ac.uk
Telephone: +44 (0)29 2087 0986

Dr Dafydd Evans

Lecturer in Operational Research

Email: evansd8@cardiff.ac.uk
Telephone: +44 (0)29 2087 0621

Dr Andrei Gagarin

Lecturer in Mathematics

Email: gagarina@cardiff.ac.uk
Telephone: +44 (0)29 2068 8850

Dr Daniel Gartner

Lecturer

Email: gartnerd@cardiff.ac.uk
Telephone: +44 (0)29 2087 0850

Dr Ahmed Kheiri

Research Associate

Email: kheiria@cardiff.ac.uk
Telephone: +44 (0)29 2087 0936

Dr Vincent Knight

Senior Lecturer

Email: knightva@cardiff.ac.uk
Telephone: +44 (0)29 2087 5548

Dr Rhyd Lewis

Senior Lecturer

Email: lewisr9@cardiff.ac.uk
Telephone: +44 (0)29 2087 4856

Dr Timm Oertel

Lecturer

Email: oertelt@cardiff.ac.uk
Telephone: +44 (0)29 2087 0849

Dr Jonathan Thompson

Director of Learning and Teaching

Email: thompsonjm1@cardiff.ac.uk
Telephone: +44 (0)29 2087 5524

All seminars will commence at 12:10pm in room M/0.34, The Mathematics Building, Cardiff University, Senghennydd Road (unless otherwise stated).

Please contact Dr Iskander Aliev for more details regarding Operational Research/WIMCS lectures and Dr Andrey Pepelyshev for more details regarding Statistics lectures.

Seminars


10 July 2017

Dr Kirstin Strokorb (Cardiff School of Mathematics)

Stability and Dependence in Extreme Value Theory

Extreme value theory is a branch of probability and statistics that aims to provide theoretically sound procedures for extrapolation beyond the range of available data (and also to understand the limits of such procedures). In the first part of this talk I will:

(i) explain why stability properties such as max-stability or threshold-stability are naturally considered as modelling assumptions to justify two non-parametric approaches to such problems;
(ii) address some of the difficulties in understanding dependence between extreme observations.

In the second part I will present joint work with Ilya Molchanov, in which we identify some connections between extreme value theory, random sets and properties of risk measures, enabling us to recover and extend several results from the literature from a unifying perspective.
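For reference, and as standard background rather than part of the abstract above: a distribution function G is called max-stable if the maxima of independent copies, suitably rescaled, follow the same distribution, i.e.

    \[
      G^{\,n}(a_n x + b_n) = G(x) \qquad \text{for every } n, \text{ for some } a_n > 0,\ b_n \in \mathbb{R}, \text{ and all } x .
    \]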

5 July 2017

Dr Tom Beach (Cardiff School of Engineering)

TBA

14 June 2017

Dmitrii Silvestrov (Stockholm)

Asymptotic Expansions for Stationary and Quasi-Stationary Distributions of Nonlinearly Perturbed Markov Chains and semi-Markov Processes

New algorithms for computing asymptotic expansions for stationary and quasi-stationary distributions of nonlinearly perturbed Markov chains and semi-Markov processes are presented. The algorithms are based on special techniques of sequential phase space reduction and some kind of “operational calculus” for Laurent asymptotic expansions applied to moments of hitting times for perturbed semi-Markov processes. These algorithms have a universal character. They can be applied to nonlinearly perturbed semi-Markov processes with an arbitrary asymptotic communicative structure of the phase space. Asymptotic expansions are given in two forms, without and with explicit bounds for remainders. The algorithms are computationally effective, due to a recurrent character of the corresponding computational procedures. The related references are [1-2].

References

[1] Gyllenberg, M., Silvestrov, D.S. (2008). Quasi-Stationary Phenomena in Nonlinearly Perturbed Stochastic Systems. De Gruyter Expositions in Mathematics, 44, Walter de Gruyter, Berlin, ix+579 pp.

[2] Silvestrov, D., Silvestrov, S. (2016). Asymptotic expansions for stationary distributions of perturbed semi-Markov processes. In: Silvestrov, S., Rancic, M. (Eds). Engineering Mathematics II. Algebraic, Stochastic and Analysis Structures for Networks, Data Classification and Optimization, Chapter 10. Springer Proceedings in Mathematics & Statistics 179, Springer, Cham, 151 -- 222.

12 June 2017

Nikita Zvonarev (St. Petersburg State University)

Fisher Scoring in Low-Rank Signal Estimation

The problem of signal estimation is considered. We propose an algorithm for constructing MLE/LS signal estimates. The algorithm is based on low-rank approximation and provides a stable solution of the corresponding optimization problem.

31 May 2017

Prof. Anatoly Zhigljavsky (Cardiff University)

Simplicial variances, distances and correlations with a view towards big data

I will describe a brand-new approach for defining variances, distances and correlations for high-dimensional data and argue that the new concepts may have serious advantages over the classical ones, especially if points of a multivariate sample lie close to a linear subspace.

10 May 2017

Martin Kunc (Warwick)

TBA

19 April 2017

Martin Lotz (University of Manchester)

Randomized dimension reduction in optimization and topological data analysis

This talk reviews some recent and current work on extracting information from random projections of data. We first discuss a phase transition phenomenon in convex optimization: the probability that convex regularization succeeds in recovering a signal from few random measurements jumps sharply from zero to one as the number of measurements exceeds a certain threshold. After explaining the geometric and probabilistic ideas behind this phenomenon, we show how they can also be applied to problems in other fields such as topological data analysis.

22 March 2017

James Woodcock (Cambridge)

Modelling active transport and health: state of the science and future directions

The last decade has seen a growth of studies modelling how a shift to active transport can affect population health. Results have generally shown the potential for substantial benefits, but uncertainties remain around impacts for different subgroups and the generalisability of the results to lower- and middle-income settings. Correspondingly, as the science matures, there is a move to more rigorous methods and the use of new data. In this talk I will consider how research is addressing these issues, with a focus on how we model population physical activity exposures and risks, and how we estimate changes in injury risk with changes in active travel volume.


Past seminars


15 February 2017

Alexander Stewart (UCL)

Evolution and the dynamics of social behavior

Abstract: Understanding the dynamics of social behavior is very much a puzzle for our time. On the one hand, this puzzle is critical to understanding how human behavior is changing as we update and expand our social networks in unprecedented ways.

On the other hand, these complex dynamics can (perhaps) be understood, due to recent expansion in empirical data and computational power. An evolutionary perspective is vital to this effort, providing a framework for understanding how and when behavioral change will spread through a population.

However, in order to provide this perspective we must develop mechanistic models of behavioral changes at the individual level that explain the emergence of social phenomena, such as social norms or collective identities, at the level of groups and populations. I will discuss ongoing attempts to model some of the complex social behaviors found in humans such as cooperation, using evolutionary game theory, and some insights this approach yields about the emergence and stability of social norms.

I will then briefly discuss how these game theoretic models can be connected to experiments to test qualitative predictions about human social behaviour.

30 November 2016

Ivan Papić (Osijek University)

Heavy-tailed fractional Pearson diffusions

Heavy-tailed fractional Pearson diffusions are a class of sub-diffusions with heavy-tailed marginal Pearson distributions: reciprocal gamma, Fisher-Snedecor and Student distributions. They are governed by time-fractional diffusion equations with polynomial coefficients depending on the parameters of the corresponding Pearson distribution. We present the spectral representation of transition densities of fractional Fisher-Snedecor and reciprocal gamma diffusions, which depends heavily on the structure of the spectrum of the infinitesimal generator of the corresponding non-fractional Pearson diffusion. We also present the strong solutions of the Cauchy problems associated with heavy-tailed fractional Pearson diffusions and the correlation structure of these diffusions.

This is joint work with N. Leonenko (Cardiff University, UK), N. Suvak (University of Osijek, Croatia) and Alla Sikorskii (Michigan State University and University of Arizona, USA).

16 November 2016

Joerg Kalcsics (University of Edinburgh)

Structural Properties of Voronoi Diagrams in Facility Location Problems with Continuous Demand

Authors: Igor Averbakh, Oded Berman, Jörg Kalcsics, Dmitry Krass

In most facility location models customer demand is assumed to be discrete and aggregated to a relatively small number of points. However, in many urban applications the number of potential customers can be in the millions and representing every customer residence as a separate demand point is usually infeasible. Thus, it may be more accurate to represent customer demand as spatially distributed over some region.

We consider the conditional market share problem where locations of n facilities are fixed and we seek to find the optimal location for the (n+1)st facility with the objective of maximizing its market share. We assume that demand is uniformly distributed over a convex polygon, facilities can be located anywhere in the polygon, and customers obtain service from the closest open facility under rectilinear distances. Once the locations of all facilities are specified, the market area of a facility is given by its Voronoi cell in the Voronoi diagram generated by the facilities. The main difficulty when optimizing the location of the new facility is that it is generally impossible to represent the objective function in closed form; in fact, the representation depends on the structure of the Voronoi diagram, i.e., the position and the geometry of the cell boundaries. Unfortunately, this structure can change drastically with the location of the new facility.

In this talk we derive structural properties of Voronoi diagrams for rectilinear distances and show how to use them to identify regions where the resulting Voronoi diagram is "structurally identical" for every point in the region. Given such regions, we can derive a parametric representation of the objective function which is valid for any location in the region. This, in turn, allows us to optimize the location of the new facility over the region using classical non-linear programming techniques. While the optimization techniques are specific to the particular model being considered, the structural results we derive, as well as our general approach, are quite universal and can be applied to many other location models as well.
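As a crude numerical illustration of the objective in question (estimating, rather than exploiting, the Voronoi structure the talk analyses), market share under rectilinear distances and uniform demand can be approximated by Monte Carlo sampling; the facility locations and demand rectangle below are invented.

    # Crude Monte Carlo estimate of the market share captured by a new facility
    # when demand is uniform over a rectangle and customers patronise the closest
    # facility under rectilinear (L1) distance.

    import random

    def market_share(existing, candidate, width, height, n_samples=100_000, seed=0):
        random.seed(seed)
        captured = 0
        for _ in range(n_samples):
            x, y = random.uniform(0, width), random.uniform(0, height)
            d_new = abs(x - candidate[0]) + abs(y - candidate[1])
            d_old = min(abs(x - fx) + abs(y - fy) for fx, fy in existing)
            captured += d_new < d_old        # customer switches to the new facility
        return captured / n_samples

    existing_facilities = [(2, 2), (8, 6)]   # hypothetical fixed competitors
    print(market_share(existing_facilities, candidate=(5, 4), width=10, height=8))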

2 November 2016

Laszlo Vegh (LSE)

Rescaled coordinate descent methods for Linear Programming

Simple coordinate descent methods such as von Neumann's algorithm or the Perceptron, both developed in the 1950s, can be used to solve linear programming feasibility problems. Their convergence rate depends on the condition measure of the problem at hand, and is typically not polynomial. Recent work of Chubanov (2012, 2014), related to prior work of Betke (2004), has gathered renewed interest in the application of these methods in order to obtain polynomial-time algorithms for linear programming. We present two algorithms that fit into this line of research. Both our algorithms alternate between coordinate descent steps and rescaling steps, so that either the descent step leads to a substantial improvement in terms of convergence, or we can infer that the problem is ill-conditioned and rescale in order to improve the condition measure. In particular, both algorithms are based on the analysis of a geometrical invariant of the LP problem, used as a proxy for the condition measure, that appears to be novel in the literature.

This is joint work with Daniel Dadush (CWI) and Giacomo Zambelli (LSE).
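To make the classical starting point concrete, the sketch below implements the textbook perceptron iteration for the homogeneous feasibility problem (find x with Ax > 0). It is the simple, generally non-polynomial building block referred to above, not the rescaled algorithms of the talk, and the example matrix is invented.

    # Textbook perceptron iteration for the homogeneous LP feasibility problem:
    # find x with a_i . x > 0 for every row a_i of A.

    import random

    def perceptron_feasibility(A, max_iters=100_000, seed=0):
        random.seed(seed)
        n = len(A[0])
        x = [random.gauss(0, 1) for _ in range(n)]
        for _ in range(max_iters):
            violated = next((a for a in A
                             if sum(ai * xi for ai, xi in zip(a, x)) <= 0), None)
            if violated is None:
                return x                                    # strictly feasible point found
            x = [xi + ai for xi, ai in zip(x, violated)]    # step towards the violated constraint
        return None                                         # no answer within the iteration budget

    A = [(1.0, 0.2), (0.3, 1.0), (0.5, 0.5)]   # hypothetical rows; feasible, e.g. x = (1, 1)
    print(perceptron_feasibility(A))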

19 October 2016

Oleg Klesov (National University of Ukraine)

Generalized renewal processes and pseudo-regularly varying functions


13 July 2016

Lenny Fukshansky (Claremont McKenna)

Lattices from Abelian groups, spherical designs, and packing density maxima

Abstract: Lattices are fundamentally important in discrete geometry, cryptography, discrete optimization and computer science. The lattice sphere packing problem asks for a lattice that maximizes the packing density function among all lattices in a given dimension. A systematic approach to this problem involves understanding what properties a lattice should have to be a strong candidate for such a maximum. I will discuss such properties, including a connection to spherical designs, which are certain highly symmetric point configurations on spheres. I will then exhibit a construction of a family of lattices from finite Abelian groups, which have interesting geometric properties and produce local maxima for the packing density function in infinitely many dimensions. This is joint work with A. Böttcher, S. R. Garcia and H. Maharaj.

1 June 2016

Chenlei Leng (University of Warwick)

Distributed sparse regression by decorrelating features

An attractive approach for down-scaling a Big Data problem is to first partition the dataset into subsets and then fit using distributed algorithms. Feature space partitioning can be effective for analysing datasets with a large number of features, but, if done naively, it suffers from failing to take correlations into account. In this paper, we propose a new embarrassingly parallel framework named DECO for distributed variable selection and parameter estimation. The trick is to apply a simple decorrelation step before performing sparse regression on each subset. The theoretical and computational attractiveness of DECO will be illustrated. This is joint work with Xiangyu Wang and David Dunson.

25 May 2016

Murad Taqqu (Boston University, USA)

A unified approach to self-normalized block sampling

The inference procedure for the mean of a stationary time series is usually quite different under various model assumptions because the partial sum process behaves differently depending on whether the time series is short or long-range dependent, or whether it has a light or heavy-tailed marginal distribution. We develop an asymptotic theory for the self-normalized block sampling, and prove that the corresponding block sampling method can provide a unified inference approach for the aforementioned different situations in the sense that it does not require the a priori estimation of auxiliary parameters.

This is joint work with Shuyang Bai and Ting Zhang.

4 May 2016

Yuzhi Cai (Swansea)

Density forecasting with TGARCH models for financial returns

This talk presents a novel density forecasting method based on a threshold GARCH (TGARCH) model and an associated value at risk (VaR) model for financial returns. Without estimating the TGARCH model directly, we propose a Bayesian approach based on a working likelihood for the estimation of the VaR model. This estimation method allows a model that has multiple thresholds to be dealt with easily. As a result, the forecasting method allows us to obtain multiple-step-ahead density forecasts for financial returns without using any information on the distribution of the innovation term of a TGARCH model. Since our method produces density forecasts, it also enables us to study the effect of past financial returns on the entire distribution of the current return. Hence, any predictive quantity of interest can be obtained as a by-product of our method. Compared with other models, much improved density forecasts have been obtained from our method.
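For orientation only: one common single-threshold specification (the GJR form, which may differ in detail from the multiple-threshold models considered in the talk) lets the volatility response depend on the sign of the previous return,

    \[
      y_t = \sigma_t \varepsilon_t, \qquad
      \sigma_t^2 = \omega + \big(\alpha + \gamma\, \mathbf{1}\{y_{t-1} < 0\}\big)\, y_{t-1}^2 + \beta\, \sigma_{t-1}^2 .
    \]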

13 April 2016

Matthias Ehrgott (Lancaster)

Generating deliverable treatment plans and beam angle optimisation in the presence of multiple objectives in radiotherapy treatment planning

External radiation therapy is one of the major forms of treatment of cancer.

Planning a radiation treatment for a patient involves making trade-offs between the main goals of radiotherapy, namely to irradiate the tumour according to some prescription to effect tumour control, and to avoid damage to surrounding healthy tissue. This conflict permeates all aspects of treatment planning from the selection of beam angles to the optimisation of fluence maps to the sequencing of the gantry for treatment delivery. In this talk I will first describe a matheuristic approach to incorporate multi-objective fluence optimisation in the beam angle optimisation problem. The second part of the talk will cover a column generation approach to the multi-objective fluence map optimisation problem, which achieves a reduction of the number of apertures and total fluence required to deliver a Pareto-optimal treatment.

16 March 2016

Mathias Henze (FU Berlin)

Tight asymptotic estimates for the quantitative discrete Helly-number of the integral lattice

Given a discrete subset $S$ of $\mathbb{R}^n$, let $c(S,k)$ be the smallest number $t$ such that for every finite system of linear inequalities that has exactly $k$ solutions in $S$, there exists a subsystem of at most $t$ inequalities that still has exactly $k$ solutions in $S$.

This number was introduced for $S=\mathbb{Z}^n$ by Aliev et al. (2014) in order to study a quantitative version of Doignon's integer variant of the classical Helly theorem. K. Clarkson was the first to notice that using Doignon's theorem one can obtain a probabilistic algorithm for integer linear programming. Following this approach, the quantitative version has been applied to finding the $\ell$ best integer feasible points of an integer program.

The initial estimate $c(\mathbb{Z}^n,k)\in O(k)$ by Aliev et al. was recently improved to a sublinear bound by Chestnut et al., who also determined the asymptotic growth of $c(\mathbb{Z}^2,k)$ to be $k^{1/3}$. Via the theory of Helly-numbers of set families we show that $c(S,k)$ admits a variety of different interpretations, for example, by the maximum number of vertices of a polytope containing $k$ non-vertex points of $S$, or by the maximum number of facets of an inclusion-maximal polyhedron containing $k$ points of $S$ in its interior. Based on these interpretations, we show that for the integral lattice the asymptotic behavior of $c(\mathbb{Z}^n,k)$ is given by $\Theta(k^{(n-1)/(n+1)})$, and we determine $c(\mathbb{Z}^n,k)$ for small values of $k$ exactly. For general discrete sets $S$ we derive an upper bound of $c(S,k)\in O(k\cdot h(S))$, where $h(S)$ is the Helly-number of $S$.

This is joint work with G. Averkov, B. González Merino, I. Paschke, and S. Weltge.

2 March 2016

Paul Smith (Southampton)

Calibration estimators in official statistics

Model-assisted estimation has been in use in surveys for a long time under different names. I will trace some examples showing its evolution, and give a summary of modern calibration estimation as used by National Statistical Institutes in the production of official statistics. I will consider the reasons for calibration and the properties of the resulting estimates from both theoretical and user points of view, and give a range of examples demonstrating particular challenges and issues, and some developments where calibration estimation may continue to improve official outputs.

24 February 2016

Professor Indrajit Ray (Cardiff Business School)

Information-Revelation and Coordination using Cheap Talk: Theory and Experiment

17 February 2016

Professor Theodore Turocy (East Anglia)

Two-bidder all-pay auctions with interdependent valuations: Equilibrium, complexity, competitiveness, and behaviour

We present results from two related papers. In the first paper, we analyze symmetric, two-bidder all-pay auctions with interdependent valuations and discrete type spaces. Relaxing previous restrictions on the distribution of types and the valuation structure, we present a construction that characterizes all symmetric equilibria. We show how the search problem this construction faces can be complex. In equilibrium, randomization can take place over disjoint intervals of bids, equilibrium supports can have a rich structure, and non-monotonicity of the equilibrium may result in a positive probability of allocative inefficiency when the value of the prize is not common. Particular attention is paid to the case in which an increase in a bidder’s posterior expected value of winning the auction is likely to be accompanied by a corresponding increase for the other bidder. Such environments are "highly competitive" in the sense that the bidder’s higher valuation also signals that the other bidder has an incentive to bid aggressively.

In the second paper, we focus on the relationship between monotonic equilibrium and those "highly competitive" cases. Having a high assessment of the value of the prize is good news, but only if the other participants in the contest are not too likely to believe the same. In a laboratory experiment, we study behaviour in both private-values and common-values settings. We vary the degree of correlation between types. While bidding is consistently aggressive across treatments, we find general support for the comparative statics of Bayes-Nash equilibrium for private values. In contrast, behaviour in common-values settings in which bidders have very noisy information about the value of the prize differs greatly from the equilibrium predictions.

10 February 2016

Evangelos Evangelou (Bath)

The Value of Information for Correlated GLM

In portfolio optimisation, one could potentially invest in several projects with an uncertain amount of revenue depending on the outcome of each project. Initially, the investor has a preliminary estimate about the outcome of each project and a prior value for the whole investment but is given the option to purchase some information (i.e. data) from some projects. When the projects are correlated, these data can be used to update the expected outcome of all projects and derive a posterior value for the investment. The information provided by the data has some value to the investor who must decide whether it is worth purchasing them.

In this talk we will consider the case where the outcome of each project is modelled by an exponential family. When the distribution is non-Gaussian, the value of information does not have a closed-form expression. Using Laplace's approximation for integrals, we will derive an approximation to the value of information and examine the sensitivity of the approximation under different parameter settings and distributions. The method will be illustrated using a spatial decision problem.
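For reference (standard background, not specific to this talk), Laplace's approximation replaces an integral by a Gaussian integral around the mode of the integrand: for a smooth, unimodal $h$ with mode $\hat\theta$ in $d$ dimensions,

    \[
      \int e^{h(\theta)}\, d\theta \;\approx\; e^{h(\hat\theta)}\, (2\pi)^{d/2}\, \big| -\nabla^2 h(\hat\theta) \big|^{-1/2} .
    \]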

Joint work with Jo Eidsvik (NTNU)