Ebook: Risk and Decision Analysis in Maintenance Optimization and Flood Management
The late professor J.M. van Noortwijk (1961-2008) worked to bridge the gap between mathematical modeling and the solution of complex problems in the field of civil engineering. He developed advanced probabilistic and statistical models and made these accessible to engineers working in areas such as structural reliability, hydraulic engineering and maintenance optimization. This book contains an overview of his work and a collection of twelve papers presented at the symposium held in his honor on November 24, 2009, in Delft, the Netherlands. The topics covered by these contributions include the elicitation of expert opinion, condition-based maintenance optimization using the gamma process, and the assessment and management of flood risks. They present the latest developments by his peers in their respective fields.
In his position as professor at the faculty of Electrical Engineering, Mathematics and Computer Science at the Delft University of Technology, Jan van Noortwijk had a simple goal: to apply mathematical modeling techniques to problems in civil engineering. In particular, he aimed to make advanced decision-theoretic models accessible to engineers in other fields such as civil and mechanical engineering. Most of his work involved the application of probability theory to problems in maintenance optimization and the management of risks due to flooding. The inherent uncertainty about the current and future state of structures and systems requires a sound methodology for its quantification.
This book presents some of the latest developments in these areas by leading researchers at academic institutions and practitioners in various lines of work. The contributions will be presented during a one-day symposium on November 24, 2009 in Delft, the Netherlands. Both this book and the symposium are a tribute to the legacy of professor Jan van Noortwijk.
First and foremost we are indebted to the authors for their enthusiastic response to the call for papers and the significant effort they have put into finishing their contributions within a very short period of time. We extend our appreciation to the members of the scientific committee, namely Tim Bedford, Christophe Bérenguer, Rommert Dekker, Pieter van Gelder, Antoine Grall, Matthijs Kok, Tom Mazzuchi, Robin Nicolai, Martin Newby, and Hans van der Weide, for their swift reviews. We would also like to thank Ton Botterhuis and Karolina Wojciechowska for additional reviewing and editing of a number of contributions.
At the time of writing, the symposium has been made possible by the organizing institutions, HKV Consultants and the Delft University of Technology, as well as the Nederlandse Vereniging voor Risicoanalyse en Bedrijfszekerheid (NVRB), Universiteitsfonds Delft, the Netherlands Organization for Applied Scientific Research (TNO), and the organizing committee of the 7th International Probabilistic Workshop (November 25-26, 2009 in Delft).
The editors,
Maarten-Jan Kallen and Sebastian Kuniewski
Delft, September 23, 2009.
We give an overview of the research and publications by professor Jan van Noortwijk, starting from his graduation from the Delft University of Technology in 1989 up to his death on September 16, 2008. The goal of this overview is to list all of his scientific publications and to put these in a historical perspective. We show how his Ph.D. thesis was a stepping stone to the two primary fields in which he did most of his later work: maintenance optimization and the management of risks due to flooding.
The introduction of the Project Evaluation and Review Technique (PERT) dates back to the 1960s, and it has found wide application since then in the planning of construction projects. Difficulties with the interpretation of the parameters of the beta distribution led Malcolm et al. [1] to suggest the classical expressions for the PERT mean and variance for activity completion that follow from lower and upper bound estimates a and b and a most likely estimate θ thereof. The parameters of the beta distribution are next estimated via the method-of-moments technique. Despite more recent papers questioning the PERT mean and variance approach, their use is still prevalent in operations research and industrial engineering textbooks that discuss these methods. In this paper an overview is presented of some alternative approaches that have been suggested, including a recent approach that allows for a direct modal range estimation combined with an indirect elicitation of bound and tail parameters of generalized trapezoidal uniform distributions describing activity uncertainty. Utilizing an illustrative Monte Carlo analysis for the completion time of an 18-node activity network, we demonstrate the difference in project completion times that can result when requiring experts to specify a single most likely estimate rather than allowing for a modal range specification.
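As a concrete illustration of the classical PERT recipe mentioned above, the following sketch computes the PERT mean and variance from a lower bound, a most likely value and an upper bound, fits a beta distribution by the method of moments, and runs a small Monte Carlo analysis for a hypothetical three-activity network (the 18-node network of the paper is not reproduced here); all numbers are illustrative assumptions.

```python
import numpy as np

def pert_moments(a, m, b):
    """Classical PERT mean and variance from lower bound a, mode m, upper bound b."""
    mean = (a + 4.0 * m + b) / 6.0
    var = ((b - a) / 6.0) ** 2
    return mean, var

def beta_params_from_moments(a, b, mean, var):
    """Method-of-moments estimates of the beta shape parameters on [a, b]."""
    mu = (mean - a) / (b - a)          # standardized mean
    v = var / (b - a) ** 2             # standardized variance
    common = mu * (1.0 - mu) / v - 1.0
    return mu * common, (1.0 - mu) * common  # (alpha, beta)

rng = np.random.default_rng(1)
n_sim = 20_000

def sample_activity(a, m, b, size):
    """Sample activity durations from the moment-matched beta distribution on [a, b]."""
    mean, var = pert_moments(a, m, b)
    alpha, beta = beta_params_from_moments(a, b, mean, var)
    return a + (b - a) * rng.beta(alpha, beta, size)

# Hypothetical three-activity network: A precedes C, and B runs in parallel with A->C.
dur_a = sample_activity(2.0, 4.0, 9.0, n_sim)
dur_b = sample_activity(3.0, 5.0, 12.0, n_sim)
dur_c = sample_activity(1.0, 2.0, 6.0, n_sim)
completion = np.maximum(dur_a + dur_c, dur_b)

print("mean completion time:", completion.mean())
print("95th percentile:", np.quantile(completion, 0.95))
```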
Using informational consistency requirements, Jaynes (1968) derives that the form of maximally non-informative priors for regression coefficients is uniform. However, this result does not tell us what the limits of this uniform distribution should be. If we are faced with a problem of model selection, this information is an integral part of the evidence, which is used to rank the various competing models. In this paper, we give some guidelines for choosing a parsimonious proper uniform prior. It turns out that in order to construct such a parsimonious prior, one only needs to assign a maximal length to the dependent variable and minimal lengths to the independent variables, together with their maximal correlations.
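The role of the prior bounds in model selection can be made concrete with a small numerical sketch: under a proper uniform prior on a single regression coefficient, the evidence (marginal likelihood) scales inversely with the prior width once the likelihood is contained in the support, which is why a parsimonious choice of the bounds matters. The data, the known noise level and the simple grid integration below are illustrative assumptions, not the construction proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y = 0.8 * x + noise, with the noise level assumed known.
x = rng.normal(size=50)
sigma = 1.0
y = 0.8 * x + rng.normal(scale=sigma, size=50)

def log_likelihood(beta):
    resid = y - beta * x
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 - len(y) * np.log(sigma * np.sqrt(2 * np.pi))

def evidence(prior_halfwidth, n_grid=4001):
    """Marginal likelihood under a proper uniform prior on [-c, c] (simple grid sum)."""
    grid = np.linspace(-prior_halfwidth, prior_halfwidth, n_grid)
    like = np.exp([log_likelihood(b) for b in grid])
    prior_density = 1.0 / (2.0 * prior_halfwidth)
    return np.sum(like) * prior_density * (grid[1] - grid[0])

for c in (1.0, 10.0, 100.0):
    print(f"prior half-width {c:6.1f} -> evidence {evidence(c):.3e}")
```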
The late Jan van Noortwijk (JvN) made valuable contributions in many areas, such as reliability, risk management, maintenance modelling and applications of decision theory. His contributions to the modelling of river discharges for flood prevention (van Noortwijk et al. [1, 2], among others) are of interest for forecasting river streamflow. The posterior predictive densities for several distributions that can be considered as candidates to model river discharges were derived using the Jeffreys prior. The Jeffreys prior was derived for these distributions by careful algebraic derivation of the Fisher information matrix. We believe the posterior predictive density is the route to follow for predicting future values once the best model has been selected. Van Noortwijk et al. [1, 2] proposed Bayes weights for selecting the best model. The advantage of posterior prediction over substituting the parameter estimates into the quantile function is discussed for a special case. A further application, under regression in the lognormal model with the Southern Oscillation Index (SOI) as independent variable, is shown for the annual discharge of the Orange River in South Africa. This implies predicting the SOI at least one year ahead with an autoregressive time series model.
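A minimal sketch of model selection by Bayes weights is given below. Instead of the Jeffreys priors and exact marginal likelihoods used by van Noortwijk et al., it uses a BIC-style approximation to the marginal likelihood for three candidate distributions fitted to a synthetic discharge series; the data and the candidate set are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical annual discharges (m^3/s); real studies use an observed series.
data = rng.gamma(shape=4.0, scale=500.0, size=60)

candidates = {
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
}

def bic(dist, x):
    """Bayesian information criterion from a maximum likelihood fit (location fixed at zero)."""
    params = dist.fit(x, floc=0.0)
    loglik = np.sum(dist.logpdf(x, *params))
    k = len(params) - 1                     # free parameters (location excluded)
    return k * np.log(len(x)) - 2.0 * loglik

bics = {name: bic(dist, data) for name, dist in candidates.items()}
weights = {name: np.exp(-0.5 * (b - min(bics.values()))) for name, b in bics.items()}
total = sum(weights.values())
for name in candidates:
    print(f"{name:10s} approximate Bayes weight: {weights[name] / total:.3f}")
```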
At the end of August 2005, the flood defences of New Orleans were hit by hurricane Katrina. It quickly became apparent that they could not withstand this force of nature. The three bowls of the city were flooded. Over a thousand people lost their lives and the total damage exceeded US$20 billion. What can we learn from this disaster? Can the process of understanding be supported by mathematics? Is it possible to draw conclusions with the help of mathematics that can help to avoid a repeat of this tragedy?
Two years after the disaster, no decision has been taken about the required level of protection. This is a mathematical decision problem in which the increasing cost of protection is weighed against the reduced risk (probability × consequence) of flooding. Where the sum of the cost of protection and the present value of the risk reaches a minimum, the optimal level of protection is found. Along this line of reasoning, the level of flood protection of the Netherlands was decided in 1960. However, today some think that insurance against the consequences of flooding is to be preferred over spending money on a flood defence system that will never be absolutely safe. Others judge it necessary to prepare for evacuation in case of a flood, because perfect safety through flood protection is unattainable. Mathematics shows that neither option is likely to be an alternative to optimal prevention.
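The economic decision problem sketched above can be written down in a few lines. The sketch below follows the classical reasoning (investment cost plus present value of the flooding risk, minimized over the protection level), with an exponential exceedance probability for the dike level; all parameter values are hypothetical and not those of the 1960 analysis or of New Orleans.

```python
import numpy as np

# Illustrative (hypothetical) parameters for the economic optimization of a dike height.
I0, I1 = 40e6, 30e6        # fixed and per-metre cost of raising the dike (euro, euro/m)
A, B = 3.0, 0.35           # exceedance of dike level h: P_f(h) = exp(-(h - A) / B) per year
V = 10e9                   # damage in case of flooding (euro)
r = 0.03                   # real discount rate per year

def total_cost(h):
    """Investment plus present value of the flooding risk for dike height h (metres)."""
    investment = I0 + I1 * (h - A)
    annual_risk = V * np.exp(-(h - A) / B)
    present_value_risk = annual_risk / r     # risk as a perpetuity at discount rate r
    return investment + present_value_risk

heights = np.linspace(A, A + 5.0, 1001)
costs = total_cost(heights)
h_opt = heights[np.argmin(costs)]
print(f"optimal dike height: {h_opt:.2f} m, "
      f"flooding probability {np.exp(-(h_opt - A) / B):.2e} per year")
```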
This paper describes a new maintenance inspection methodology, called relative material loss (RML), used for approximating the material loss contribution on each side of a plate separating two or more dissimilar marine environments. The new methodology leverages actual “at sea” environmental and operational conditions by defining relationships between the dissimilar environments and solving for the material loss on each plate side. The RML theory and a case study of a sixty-five-year-old in-service structure, a dry dock caisson gate, are presented.
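The core idea of solving for per-side losses from relations between the environments can be illustrated with a deliberately simplified sketch: one measured total thickness loss and one assumed severity ratio between the two sides yield a small linear system. This is not the RML formulation itself; the numbers and the single-ratio relation are assumptions for illustration only.

```python
import numpy as np

# Hypothetical illustration: the total thickness loss of a plate is measured, and an
# assumed severity ratio between the two environments (e.g. seawater side vs. dry-dock
# side) relates the per-side losses. Solving the two relations gives each side's loss.
total_loss_mm = 3.2          # measured total thickness loss over the service life
severity_ratio = 2.5         # assumed: side 1 corrodes 2.5 times faster than side 2

# Unknowns: [loss_side1, loss_side2]
# Equations: loss_side1 + loss_side2 = total_loss_mm
#            loss_side1 - severity_ratio * loss_side2 = 0
A = np.array([[1.0, 1.0],
              [1.0, -severity_ratio]])
b = np.array([total_loss_mm, 0.0])
loss_side1, loss_side2 = np.linalg.solve(A, b)
print(f"estimated loss: side 1 = {loss_side1:.2f} mm, side 2 = {loss_side2:.2f} mm")
```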
Recently we have presented nonparametric predictive inference (NPI) for system reliability [1, 2], with specific attention to redundancy allocation. Series systems were considered in which each subsystem i is a k_i-out-of-m_i system. The different subsystems were assumed to consist of different types of components, each type having undergone prior success-failure testing. This work uses NPI for Bernoulli variables [3], which enables prediction for m future variables based on n observations, without the need for a prior distribution. In this paper, we present a generalization of these results by considering multiple subsystems which all consist of one type of component, which provides an important step towards wider applicability of this approach.
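For orientation, the sketch below shows the underlying series system of k-out-of-m subsystems with precise component success probabilities taken as point estimates from test data. NPI itself replaces these point probabilities by lower and upper probabilities for the m future components given the n test results; that step is not reproduced here, and the test counts are hypothetical.

```python
from math import comb

def k_out_of_m(k, m, p):
    """Probability that at least k of m independent components (success prob. p) function."""
    return sum(comb(m, j) * p**j * (1 - p)**(m - j) for j in range(k, m + 1))

def series_system(subsystems):
    """Series system of independent k-out-of-m subsystems: all subsystems must function."""
    rel = 1.0
    for k, m, p in subsystems:
        rel *= k_out_of_m(k, m, p)
    return rel

# Hypothetical example: a 2-out-of-3 and a 1-out-of-2 subsystem, with point estimates of
# the component success probabilities taken from test data (s successes in n tests).
subsystems = [(2, 3, 18 / 20), (1, 2, 9 / 10)]
print(f"system reliability: {series_system(subsystems):.4f}")
```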
Prediction of the life-cycle performance of structural systems must be accompanied by an efficient intervention planning procedure that assures the safe upkeep of structures. Multi-criteria optimization is an effective approach for conducting this procedure. Life-cycle performance of structural systems is typically quantified by means of performance indicators. These performance measures and their predictive models must be able to accurately interpret and quantify the effects of applying maintenance interventions. The objective of this paper is to review recent advances in methods of multi-criteria optimization of the life-cycle performance of structural systems under uncertainty. Two approaches for finding optimum maintenance strategies for deteriorating structural systems through multi-criteria optimization using genetic algorithms are presented with applications. These approaches use different problem formulations and types of performance indicators.
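As a minimal illustration of the multi-criteria setting, the sketch below scores a set of hypothetical maintenance strategies on two conflicting criteria (life-cycle cost and a condition index) and extracts the non-dominated, Pareto-optimal strategies. The approaches reviewed in the paper use genetic algorithms and richer performance indicators; this sketch only shows the dominance filtering that underlies such optimizations.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical candidate maintenance strategies, each scored on two criteria:
# life-cycle cost (to be minimized) and a condition index (to be maximized).
n = 200
cost = rng.uniform(1.0, 10.0, n)
condition = 10.0 - cost + rng.normal(scale=1.5, size=n)   # correlated trade-off

def pareto_front(cost, condition):
    """Indices of non-dominated strategies (lower cost and higher condition are better)."""
    front = []
    for i in range(len(cost)):
        dominated = np.any((cost <= cost[i]) & (condition >= condition[i]) &
                           ((cost < cost[i]) | (condition > condition[i])))
        if not dominated:
            front.append(i)
    return front

front = pareto_front(cost, condition)
print(f"{len(front)} Pareto-optimal strategies out of {n}")
```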
The aeration of the activated sludge tank of wastewater treatment plant (WWTP) Westpoort in Amsterdam (the Netherlands) has been optimised using model-based control. Discharge limits for the effluent of the treatment plant require total nitrogen (Ntot) concentrations below 10 mg/l. Ntot levels are reduced using biological nitrification-denitrification. This process is controlled by aeration, which consumes a lot of energy. In order to reduce the energy consumption, the nitrification-denitrification process is optimised using a nonlinear regression model for the ammonium (NH4) concentration. Simulation results show that the total nitrogen concentration in the effluent can be decreased with a lower oxygen concentration, thus consuming less energy. Both the nitrogen concentration in the effluent and the energy consumption were reduced by ten percent. Currently, the model-based control (MBC) is implemented in the actual process control.
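A sketch of the kind of nonlinear regression involved is given below: a hypothetical exponential relation between the oxygen setpoint and the ammonium concentration is fitted to synthetic data, and the lowest setpoint that keeps the predicted NH4 below a target value is derived from the fit. The model form, the data and the target are assumptions, not the actual Westpoort model.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# Hypothetical relation: the ammonium concentration decreases with the oxygen setpoint,
# modelled here as an exponential decay (the regression model in the study may differ).
def nh4_model(oxygen, nh4_max, rate):
    return nh4_max * np.exp(-rate * oxygen)

oxygen = rng.uniform(0.5, 3.0, 80)                                  # mg O2 / l
nh4_obs = nh4_model(oxygen, 6.0, 1.2) + rng.normal(scale=0.3, size=80)

params, _ = curve_fit(nh4_model, oxygen, nh4_obs, p0=[5.0, 1.0])
nh4_max, rate = params

# Lowest oxygen setpoint that still keeps the predicted NH4 below a target value.
target_nh4 = 1.0
o2_required = np.log(nh4_max / target_nh4) / rate
print(f"fitted model: NH4 = {nh4_max:.2f} * exp(-{rate:.2f} * O2)")
print(f"oxygen setpoint keeping predicted NH4 below {target_nh4} mg/l: {o2_required:.2f} mg/l")
```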
This paper discusses the maintenance optimization of a railway track, based on the observation of two dependent, randomly increasing deterioration indicators. These two indicators are modelled through a bivariate gamma process constructed by trivariate reduction. Empirical and maximum likelihood estimators are given for the process parameters and tested on simulated data. The EM algorithm is used to compute the maximum likelihood estimators. A bivariate gamma process is then fitted to real data of railway track deterioration. Preventive maintenance scheduling is studied, ensuring with a high probability that the railway track keeps a good quality. The results are compared to those based on both indicators taken separately, and also on one single indicator (usually used for current track maintenance). The results based on the joint information are shown to be safer than the others, which demonstrates the value of the bivariate model.
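The trivariate-reduction construction can be simulated directly: with three independent gamma processes Y1, Y2, Y3 sharing a common scale parameter, the pair X1 = Y1 + Y3 and X2 = Y2 + Y3 forms a bivariate gamma process with dependent components that each have gamma marginals. The sketch below uses hypothetical parameters and thresholds to estimate the probability that either deterioration indicator exceeds its maintenance limit by a given time.

```python
import numpy as np

rng = np.random.default_rng(5)

# Bivariate gamma process by trivariate reduction: X1 = Y1 + Y3 and X2 = Y2 + Y3, where
# Y1, Y2, Y3 are independent gamma processes with shape rates a1, a2, a3 and a common
# scale b. Parameter values and thresholds are hypothetical.
a1, a2, a3 = 0.8, 0.5, 0.7      # shape rates (per unit time)
b = 1.5                         # common scale parameter
dt, n_steps, n_paths = 0.25, 200, 5_000

def gamma_increments(shape_rate):
    return rng.gamma(shape=shape_rate * dt, scale=b, size=(n_paths, n_steps))

y1, y2, y3 = (gamma_increments(a) for a in (a1, a2, a3))
x1 = np.cumsum(y1 + y3, axis=1)      # first deterioration indicator
x2 = np.cumsum(y2 + y3, axis=1)      # second deterioration indicator

# Probability that either indicator exceeds its maintenance threshold by time t.
t_index = 99                          # t = 25 time units
threshold1, threshold2 = 70.0, 60.0
p_exceed = np.mean((x1[:, t_index] > threshold1) | (x2[:, t_index] > threshold2))
print(f"P(exceedance of either threshold by t = {dt * (t_index + 1)}): {p_exceed:.3f}")
```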
This paper deals with the construction and optimisation of accurate condition-based maintenance policies for systems subject to cumulative deterioration. In this context, the system's condition can be influenced by different environmental factors which contribute to increasing or reducing the degradation rate. The observed condition can deviate from the expected condition if the degradation model does not embrace these environmental factors. Moreover, if more information is available on the environmental variations, the maintenance decision framework should take advantage of this new information and update the decision. The question is how to model the decision framework in this case. A gamma-process degradation model with randomized parameters is proposed to model the influence of the random environment on the system behavior. An adaptive maintenance policy is constructed which takes the environmental changes into account. The mathematical framework is presented here and a numerical experiment is conducted to highlight the benefit of our approach.
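A minimal sketch of a gamma degradation model with a randomized parameter is given below: each simulated path receives its own scale parameter, drawn from a distribution representing the uncertain environment, and the resulting lifetime distribution is compared with that of a fixed-parameter process. The distributions and the failure level are illustrative assumptions; the adaptive updating of the maintenance decision is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)

# Gamma degradation with a randomized scale parameter: each path lives in a different
# (random) environment, drawn here from a gamma distribution. Values are illustrative.
shape_rate = 1.0                 # shape parameter increases linearly in time
dt, n_steps, n_paths = 0.5, 400, 10_000
failure_level = 50.0

def first_passage_times(scales):
    increments = rng.gamma(shape=shape_rate * dt, scale=scales[:, None],
                           size=(len(scales), n_steps))
    paths = np.cumsum(increments, axis=1)
    exceeded = paths > failure_level
    # Paths that never exceed the failure level are censored at the horizon.
    t_idx = np.where(exceeded.any(axis=1), exceeded.argmax(axis=1), n_steps - 1)
    return (t_idx + 1) * dt

fixed_scale = np.full(n_paths, 1.0)
random_scale = rng.gamma(shape=4.0, scale=0.25, size=n_paths)   # mean 1, extra spread

for label, scales in [("fixed environment", fixed_scale),
                      ("random environment", random_scale)]:
    t = first_passage_times(scales)
    print(f"{label}: mean lifetime {t.mean():.1f}, 5% quantile {np.quantile(t, 0.05):.1f}")
```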
The gamma process is a stochastic cumulative process that can be used to model a time-variant uncertain process. Professor van Noortwijk's research played a key role in modeling degradation by the gamma process and in making it popular in the engineering community. Maintenance optimization models mostly use the renewal theorem to evaluate the asymptotic expected cost rate and optimize the maintenance policy. However, many engineering projects have relatively short, finite time horizons, in which case the application of the asymptotic formula becomes questionable. This paper presents a finite-time model for computing the expected maintenance cost and investigates the suitability of the asymptotic cost rate formula.
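The difference between the asymptotic and the finite-horizon viewpoint can be illustrated with a simple age-replacement sketch under gamma deterioration: the renewal-reward cost rate (expected cycle cost over expected cycle length) is compared with the simulated expected cost per unit time over a finite horizon. The policy, costs and parameters below are hypothetical, and the finite-time model of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(13)

# Age replacement: the component fails when a gamma deterioration process crosses a
# failure level; it is replaced preventively at age t_p (cost c_p) or correctively at
# failure (cost c_f). All parameter values are hypothetical.
shape_rate, scale, failure_level = 1.0, 2.0, 60.0
c_p, c_f = 1.0, 5.0
dt, n_steps, n_sim = 0.5, 400, 20_000

increments = rng.gamma(shape_rate * dt, scale, size=(n_sim, n_steps))
paths = np.cumsum(increments, axis=1)
exceeded = paths > failure_level
lifetimes = np.where(exceeded.any(axis=1), (exceeded.argmax(axis=1) + 1) * dt, n_steps * dt)

def asymptotic_cost_rate(t_p):
    """Renewal-reward cost rate: expected cycle cost over expected cycle length."""
    cycle_cost = np.where(lifetimes <= t_p, c_f, c_p)
    cycle_length = np.minimum(lifetimes, t_p)
    return cycle_cost.mean() / cycle_length.mean()

def finite_horizon_cost_rate(t_p, horizon=60.0, n_rep=2_000):
    """Simulated expected cost per unit time over a finite horizon (sequential renewals)."""
    total = 0.0
    for _ in range(n_rep):
        t, cost = 0.0, 0.0
        while True:
            life = rng.choice(lifetimes)          # resample from the empirical lifetimes
            t_next = t + min(life, t_p)
            if t_next > horizon:
                break
            cost += c_f if life <= t_p else c_p
            t = t_next
        total += cost
    return total / (n_rep * horizon)

t_p = 25.0
print(f"asymptotic cost rate:     {asymptotic_cost_rate(t_p):.4f}")
print(f"finite-horizon cost rate: {finite_horizon_cost_rate(t_p):.4f}")
```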
In this paper we study models for the cumulative damage of a component caused by shocks occurring randomly in time, following a historical approach. The damage caused by a shock is also of a random nature. A very well-known model is the compound renewal process, with the compound Poisson process as a special case. These models play an important role in maintenance analysis and cost calculations. In these models the times at which shocks occur and the damage caused by each shock are assumed to be independent. Very often this is not realistic: the damage will depend on the time since the last shock, and in some engineering applications it is even a deterministic function of that time. Also, the available results are often asymptotic. We develop a model which allows dependence between the damage and the time since the last shock. We calculate Laplace transforms of the quantities of interest and show how these can be inverted to obtain probability distributions for finite time horizons.
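A Monte Carlo sketch of such a dependent shock-damage model is given below: shocks arrive according to a Poisson process and the damage of each shock is a function of the time elapsed since the previous shock. The paper derives Laplace transforms and inverts them; the simulation here only illustrates the finite-horizon damage distribution that those transforms describe, with hypothetical parameter values.

```python
import numpy as np

rng = np.random.default_rng(17)

# Shock rate of the Poisson arrival process and the finite time horizon; values are
# illustrative only.
rate = 0.5
horizon = 20.0
n_sim = 20_000

def cumulative_damage():
    """One realization of the total damage accumulated up to the horizon."""
    t, total = 0.0, 0.0
    while True:
        gap = rng.exponential(1.0 / rate)            # time since the previous shock
        t += gap
        if t > horizon:
            return total
        # Damage depends on the gap: proportional to it, plus non-negative noise.
        total += max(0.0, 2.0 * gap + rng.normal(scale=0.5))

damage = np.array([cumulative_damage() for _ in range(n_sim)])
print(f"mean cumulative damage at t = {horizon}: {damage.mean():.2f}")
print(f"P(cumulative damage > 50): {np.mean(damage > 50.0):.3f}")
```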