How historical information can improve extreme coastal water levels probability prediction: application to the Xynthia event at La Rochelle (France)



Introduction
Extreme value theory has been widely used to estimate the highest values of coastal water levels (WL). Within risk analyses, knowledge of extreme WL and their associated annual probabilities of exceedance or return periods is required for dimensioning coastal defences or estimating flooding hazard. A first approach consists of performing a classical extreme value analysis (EVA) directly on tide gauge observations (the so-called direct approach) (Arns et al., 2013). However, such a method provides limited extrapolation time. Indeed, it is generally considered that one should not estimate levels whose return periods exceed four times the data span if uncertainties are to remain manageable (Pugh, 2004), while the analysis is fully constrained by the duration of observations (a few decades at most). In addition, direct methods are sensitive to outliers (Tawn and Vassie, 1989), those particularly extreme values much higher than the other observations, making results even more uncertain. An outlier might be an extreme manifestation of the random variable we want to analyse, or it can be a realisation of a different random process or an error in recording or reporting the measurement (Grubbs, 1969). In the first case, the outlying observation should be kept in the sample, as it provides valuable information on the random variability inherent in the data (Mazas and Hamm, 2011).
An alternative to the direct approach consists of applying an EVA to the random atmospheric surge signal and then combining it with the deterministic tidal probability distribution (Tawn and Vassie, 1989; Batstone et al., 2013), thus allowing extrapolation to larger return periods while being less sensitive to outliers (Haigh et al., 2010). Such an indirect method assumes surges and tides are independent. As this assumption does not hold in some places (Idier et al., 2012), methods have been developed to take this partial dependency into account (Mazas et al., 2014). However, the results are not yet fully satisfactory, with for instance a notable offset between direct and indirect methods within the interpolation domain (i.e. where return periods are shorter than the duration of observation). Moreover, even if this approach allows estimating WL of longer return periods, it is still constrained by the information measured by the tide gauge. Consequently, outliers might not be better described by the final distribution (typically if the associated atmospheric surges are outliers in their own distribution), making the estimation of their return periods problematic. For instance, the hourly WL recorded at La Rochelle (8.01 m Z.H., where Z.H. stands for Zéro Hydrographique, the French navy chart datum) during the storm Xynthia, which hit the French Atlantic coast on 28 February 2010 causing 47 deaths (Bertin et al., 2012), still appears as an outlier with an indirect approach, and the estimation of its return period is not relevant (Duluc et al., 2014). Another possibility is to use regional frequency analysis (RFA) to artificially increase the duration of observation and reduce uncertainties (Duluc et al., 2014; Weiss et al., 2014a, b). Outliers may thus be better described by the distribution as their representativity increases. RFA consists of pooling observations from several sites inside a homogeneous region, assuming the highest observations in that region follow a common regional probability distribution, up to a local scale factor representing the specific characteristics of each site. However, this approach raises the issues of the definition of homogeneous regions and of intersite dependency. Using an RFA of skew surges, Duluc et al. (2014) estimated a return period of Xynthia's WL greater than 1000 years, although they acknowledged that uncertainties were large.
The techniques described above are all based on WL measurements. In the past, before the tide gauge era, extreme events also happened. For those generating marine submersion, testimonies exist which report the inundated places. This information is often partial, in the sense that most of the time it indirectly indicates that the sea level was at least higher than a given mark, but not which water level was actually reached. Recently, Hamdi et al. (2014) proposed a method to integrate historical information in extreme surge frequency estimation, using maximum likelihood estimators for the distribution parameters. However, this method requires knowledge of historical surges, a piece of information rarely found in archives (see e.g. Baart et al., 2011). Among the statistical techniques developed to combine both sources of data (recent observations and historical information), Bayesian methods provide the most flexible and adequate framework (Reis and Stedinger, 2005). The added value of using historical information in EVA has been widely recognised for the last 30 years in the domain of hydrology (see e.g. Benito et al. (2004) for a review). Surprisingly, we found only one reference (Van Gelder, 1996) developing such a method for sea water levels. Van Gelder (1996) set up a Bayesian framework to account for known historical sea floods in the estimation of sea dike design levels in the Netherlands. The method consists of using historical data as prior information to estimate an a priori distribution for the parameters of the probability distribution. However, the method cannot deal with partial information (an estimation of the historical water level is needed), implying that a lot of historical information cannot be integrated in such a framework. In the hydrology field, Reis and Stedinger (2005) developed a Bayesian Markov Chain Monte Carlo (MCMC) approach to tackle the issue of integrating partial historical information within EVA. In the present study, we build on this approach to develop a Bayesian MCMC method adapted to the EVA of coastal water levels (called the BMC2 method hereafter). We notably take into account the influence of mean sea-level rise on tide gauge data and historical information. We also take advantage of the Bayesian framework to derive predictive return levels (Coles and Tawn, 2005). In particular, we investigate whether it is possible to better predict the probability of future extreme coastal WL by considering partial historical information. As a case study, we apply the BMC2 method to the site of La Rochelle and investigate whether: (Q1) integrating historical information significantly reduces statistical uncertainties, (Q2) the WL reached during Xynthia in 2010 is really an outlier, and (Q3) it would have been possible to predict the annual exceedance probability of that level before it happened. Section 2 describes the BMC2 method. The case study at La Rochelle is then presented in Sect. 3. In Sect. 4, results are discussed, and some conclusions and perspectives that such a method opens for extreme statistics are drawn in Sect. 5.

The BMC2 method
The model chosen to represent and extrapolate extreme values of WL is the Generalised Pareto Distribution (GPD), applied to a Peaks-Over-Threshold (POT) sample. The threshold choice is driven by classical visual tools such as mean residual life and parameter stability plots (see Coles, 2001). The GPD is a distribution with two parameters (σ, the scale parameter, and ξ, the shape parameter). For a given threshold u, the cumulative distribution function (CDF) of the GPD is equal to the probability P(X ≤ x | X > u), where the random variable X describes observed sea levels at high water, and it can be written as follows:

$$P(X \le x \mid X > u) = 1 - \left(1 + \xi\,\frac{x - u}{\sigma}\right)_+^{-1/\xi},$$

where σ > 0 and the notation $y_+$ for y ∈ ℝ is defined as $y_+ = \max(y, 0)$. The scale parameter σ controls the width of the distribution, while ξ controls the behaviour of the distribution's tail. If ξ < 0, the distribution is bounded and we are in the Weibull domain. If ξ > 0 (resp. = 0), the distribution is unbounded and we are in the Fréchet (resp. Gumbel) domain. Contrary to the Weibull domain, a small change of ξ in the Fréchet domain involves significant changes of the distribution.
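As an illustration of the POT/GPD setup above, the following minimal Python sketch fits a GPD to threshold exceedances of a synthetic high-water series and evaluates the conditional CDF. The data, threshold choice and variable names are ours, not the study's; a real application would use the tide gauge record itself.

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical high-water series (m); a real application would use
# the tide gauge record (e.g. La Pallice).
rng = np.random.default_rng(0)
wl = 4.0 + rng.gumbel(0.0, 0.4, size=20_000)

u = np.quantile(wl, 0.995)       # POT threshold (99.5th percentile, cf. Sect. 3)
exc = wl[wl > u] - u             # exceedances x - u

# Fit the GPD to the exceedances; location fixed at 0 since we work on x - u.
# scipy's shape parameter c matches xi here (same sign convention).
xi, _, sigma = genpareto.fit(exc, floc=0.0)

# P(X <= x | X > u) for a level 1 m above the threshold.
p = genpareto.cdf(1.0, xi, loc=0.0, scale=sigma)
print(f"u = {u:.2f} m, xi = {xi:.3f}, sigma = {sigma:.3f}, P = {p:.3f}")
```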
In contrast with classical statistical methods used to compute the parameters and derive extreme values (e.g. maximum likelihood, method of moments, probability weighted moments, etc.), Bayesian techniques provide a natural framework to deal with uncertainties. They are designed to obtain the full posterior distribution of the variables of interest and not only point estimates (Coles and Tawn, 2005).
Let us denote by θ the vector of parameters (ξ, σ). Its posterior distribution is related to the likelihood of the data through Bayes' theorem:

$$f(\theta \mid D) = \frac{f(D \mid \theta)\, f(\theta)}{f(D)},$$

where f(D|θ) is the likelihood function of a set of observations D given the parameter vector, f(θ) is the prior distribution of the parameters and f(D) is a normalising constant depending only on the observations. f(θ) translates the prior knowledge one may have about the parameters. In our study, we have no prior information about the GPD parameters for our dataset. Consequently, we use a non-informative prior, namely the uniform distribution (Payrastre et al., 2011). In that case, f(θ|D) is proportional to the likelihood function.
To sample the posterior distribution of interest effectively, we use a Markov Chain Monte Carlo (MCMC) algorithm. MCMC algorithms allow sampling parameter values from the posterior distribution efficiently, without computing the normalising constant. In this study, the Metropolis-Hastings (MH) algorithm (Metropolis et al., 1953; Hastings, 1970) is used to generate a set of 40 000 vectors θ with density f(θ|D). The convergence of the chain is checked numerically with the Geweke test (Geweke, 1992) and visually with trace plots. We can then compute the corresponding quantiles of WL according to the GPD. In particular, the mode of the set of vectors θ can be retrieved, whose associated quantiles $x_p$ correspond to the maximum likelihood estimates for WL. Credibility intervals on WL can also be estimated based on the large set of quantile values. Results can be displayed on a return level plot once the correspondence between quantiles $x_p$ ($x_p > u$) and return periods T has been set up:

$$P(\max_y > x_p) \approx \lambda\, P(X > x_p \mid X > u) = \frac{1}{T},$$

where $\max_y$ is the annual maximum, n is the number of high tides per year and λ = n P(X > u) is the mean number of exceedances of the threshold u per year. The quantile $x_p$ is said to be the standard estimative T-year return level; it is exceeded, on average, once every T years.
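The MH step described above can be sketched as follows: a minimal random-walk Metropolis sampler with a flat (non-informative) prior on (σ, ξ), run on synthetic exceedances. Step sizes, starting point and function names are our own assumptions, not the paper's implementation.

```python
import numpy as np

def log_lik(theta, exc):
    """GPD log-likelihood of exceedances (x - u); theta = (sigma, xi)."""
    sigma, xi = theta
    if sigma <= 0:
        return -np.inf
    z = 1.0 + xi * exc / sigma
    if np.any(z <= 0):
        return -np.inf
    if abs(xi) < 1e-8:                      # Gumbel limit
        return -len(exc) * np.log(sigma) - np.sum(exc) / sigma
    return -len(exc) * np.log(sigma) - (1.0 + 1.0 / xi) * np.sum(np.log(z))

def metropolis(exc, n_iter=40_000, step=(0.05, 0.05), seed=1):
    """Random-walk Metropolis with a flat prior on (sigma, xi)."""
    rng = np.random.default_rng(seed)
    theta = np.array([np.std(exc), 0.1])    # crude starting point
    chain = np.empty((n_iter, 2))
    ll = log_lik(theta, exc)
    for i in range(n_iter):
        prop = theta + rng.normal(0.0, step)
        ll_prop = log_lik(prop, exc)
        if np.log(rng.uniform()) < ll_prop - ll:   # flat prior cancels out
            theta, ll = prop, ll_prop
        chain[i] = theta
    return chain

# Synthetic exceedances by inverse-CDF sampling (true sigma = 0.3, xi = 0.05).
rng = np.random.default_rng(0)
xi_true, sigma_true = 0.05, 0.3
exc = sigma_true / xi_true * (rng.uniform(size=300) ** (-xi_true) - 1.0)

chain = metropolis(exc)
post = chain[10_000:]                       # burn-in discarded
print("posterior mean (sigma, xi):", post.mean(axis=0))
```

In practice, convergence would still need to be checked (e.g. with the Geweke test and trace plots, as in the study).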
Because successive water levels at high tide might not be independent (typically if a storm spans multiple tidal cycles), we derive and apply an extremal index α (Leadbetter, 1983) so as not to overestimate levels for particular return periods. For a given level x, the extremal index is the inverse of the mean cluster size, defined in terms of the number of successive high waters exceeding x. To derive α for any level x, we fit the same function as in the work of Batstone et al. (2013), with parameter a > 0, to the observations. The corrected return period is then equal to $T/\alpha(x_p)$.

One main advantage of the Bayesian analysis is the possibility to integrate all the available information in a unique predictive distribution for extreme WL values (Coles and Tawn, 2005), which is defined as follows:

$$P(X \le x \mid X > u, D) = \int_\theta P(X \le x \mid X > u, \theta)\, f(\theta \mid D)\, \mathrm{d}\theta. \qquad (4)$$

Thus, the predictive distribution can be easily estimated as the mean of the GPD values calculated at x for the entire set of sampled parameters, and it can be represented on a return level plot after solving the equation $p = \alpha(x_p)\, \lambda\, P(X > x_p \mid X > u, D)$, where $x_p$ is the predictive return level associated with p. Since all the uncertainty information has been integrated in the final result, credibility intervals are no longer defined. Instead, the value of p can be interpreted as the probability that, given all the available information, next year's maximum WL will exceed $x_p$.

The formulation of the likelihood function in Eq. (2) depends on the characteristics of the observations D (Payrastre et al., 2011). We can split the likelihood function into two parts, thus separating the systematic and historical periods:

$$f(D \mid \theta) = f(D_s \mid \theta)\, f(D_h \mid \theta),$$

where $D_s$ and $D_h$ denote the systematic and historical data respectively. Let us assume we have a number s of systematic tide gauge observations above u ($x_1, \ldots, x_s$) and a historical period of $n_y$ years with H = h events above a perception threshold $X_0$ ($X_0 > u$). The h events above $X_0$ during the historical period are supposed to be exhaustive; this is a necessary condition. Historical information can be of different types. The number h can thus be broken down into $h_1$ historical events whose water levels are known ($y_1, \ldots, y_{h_1}$), a number $h_2$ of historical events that exceeded the perception threshold $X_0$ but whose exact water levels are not known, and $h_3$ historical events whose water levels are known to be within a given range of values (lower bounds $y^{lb}_1, \ldots, y^{lb}_{h_3}$ larger than $X_0$; upper bounds $y^{ub}_1, \ldots, y^{ub}_{h_3}$). In addition, let us assume the number K of POT data during the historical period is equal to k. The general expression of the likelihood of the systematic data is:

$$f(D_s \mid \theta) = \prod_{i=1}^{s} g_\theta(x_i),$$

where $g_\theta$ is the probability density function of the GPD for parameters θ.
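A numerical sketch of this likelihood, combining the systematic product of GPD densities with the marginal probability of h purely count-based historical exceedances (the h2-only case, as in the La Rochelle application of Sect. 3), might look as follows. All numerical values are illustrative, and the truncation of the infinite Poisson sum at k_max is our own device.

```python
import numpy as np
from scipy.stats import binom, genpareto, poisson

def log_lik_total(theta, exc_sys, h, n_y, lam, d0, k_max=600):
    """Sketch of the combined likelihood: systematic GPD term times the
    marginal probability of h historical exceedances of the perception
    threshold (binomial given K POT events; K Poisson with rate lam per
    year; K summed out, truncated at k_max)."""
    sigma, xi = theta
    # Systematic part: product of GPD densities of the exceedances x - u.
    ll_sys = np.sum(genpareto.logpdf(exc_sys, xi, loc=0.0, scale=sigma))
    # p0 = P(X > X0 | X > u) under the GPD, with d0 = X0 - u.
    p0 = genpareto.sf(d0, xi, loc=0.0, scale=sigma)
    # Historical part: sum over k of P(K = k) * Binom(h | k, p0).
    ks = np.arange(h, k_max)
    lik_hist = np.sum(poisson.pmf(ks, lam * n_y) * binom.pmf(h, ks, p0))
    return ll_sys + np.log(lik_hist)

# Purely illustrative numbers loosely echoing the case study's orders of
# magnitude: 8 historical events in 94.4 years, lambda = 3.3 POT per year.
rng = np.random.default_rng(0)
exc_sys = genpareto.rvs(0.05, scale=0.3, size=90, random_state=rng)
ll = log_lik_total((0.3, 0.05), exc_sys, h=8, n_y=94.4, lam=3.3, d0=1.2)
print("log-likelihood:", ll)
```

This scalar log-likelihood is exactly what the MH sampler of Sect. 2 would evaluate at each proposed (σ, ξ).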
The likelihood of the historical data is a bit more complicated. Let us first consider the likelihood of the historical data conditional on K (Eq. 7): it is the product of two terms, the probability of observing $h = h_1 + h_2 + h_3$ events above $X_0$ among the k POT that occurred during the $n_y$ years, and the likelihood of the individual historical observations. The first term follows a binomial distribution:

$$P(H = h \mid K = k, \theta) = \binom{k}{h}\, p_0^{\,h}\, (1 - p_0)^{\,k - h}, \quad \text{with } p_0 = P(X > X_0 \mid X > u, \theta).$$

Writing out the contributions of the $h_1$ events with known levels (GPD densities), the $h_2$ events with unknown levels (exceedance probabilities of $X_0$) and the $h_3$ events with bounded levels (interval probabilities), Eq. (7) becomes the product of the binomial term with these three groups of factors. To compute the unconditional likelihood of the historical data, we use the total probability theorem:

$$f(D_h \mid \theta) = \sum_{k \ge h} f(D_h \mid K = k, \theta)\, P(K = k). \qquad (10)$$

Considering that the number of exceedances of the threshold u follows a Poisson distribution with rate λ (Coles, 2001), the probability of observing k POT during $n_y$ years is:

$$P(K = k) = \frac{(\lambda n_y)^k\, e^{-\lambda n_y}}{k!}. \qquad (11)$$

Replacing Eq. (11) into Eq. (10) gives:

$$f(D_h \mid \theta) = \sum_{k \ge h} \frac{(\lambda n_y)^k\, e^{-\lambda n_y}}{k!}\, f(D_h \mid K = k, \theta). \qquad (12)$$

So far, we have implicitly considered that the POT sample represents a stationary process. This assumption is systematically made in the hydrology field (Gaume et al., 2010). However, extreme WL exhibit long-term trends that cannot be ignored. Over the 20th century, these trends have been shown to be similar to those of mean sea level (m.s.l.) at most locations worldwide (Woodworth et al., 2011). To account for this behaviour in the systematic dataset, the linear trend is calculated for the entire tide gauge record and removed from the data. The data are then adjusted to have a mean sea level equal to that of the reference year of interest. The historical perception threshold must also be corrected for the m.s.l. rise (hereafter called the adjusted perception threshold). Once this is done, Eq. (12) becomes a product of similar terms over the historical years, where $X_{0,m}$ is the adjusted perception threshold for historical year m, and $h_{1,m}$, $h_{2,m}$, $h_{3,m}$ are respectively the numbers of historical events with known WL, with unknown WL, and with WL within a range of values, that exceeded $X_{0,m}$ during year m. $h_m = h_{1,m} + h_{2,m} + h_{3,m}$ is the total number of historical events that exceeded $X_{0,m}$ during year m.

Application to the Xynthia event at La Rochelle

Study site and data
The study site is La Rochelle (west Atlantic coast of France, Fig. 1), focusing on the tide gauge located at La Pallice harbour (about 30 years of data up to 2013). The highest recorded sea level at high water is 8.01 m Z.H., and it occurred during Xynthia on 28 February 2010 (see Fig. 2). As a comparison, the highest tidal level estimated from tidal component analysis is 6.86 m Z.H. (SHOM, 2013).
As highlighted in the introduction, to illustrate the usefulness of the developed BMC2 method, we investigate whether: (Q1) integrating historical information significantly reduces statistical uncertainties, (Q2) the WL of 8.01 m Z.H. reached during Xynthia is really an outlier, and (Q3) it would have been possible to predict the annual exceedance probability of that level beforehand.
Four cases are considered, applying the BMC2 method respectively to: (case 1) the systematic data including Xynthia's year (2010), (case 2) the systematic data up to year 2009, (case 3) the systematic data up to year 2009 with historical information, and (case 4) the systematic data including Xynthia's year (2010) with historical information. Whereas all cases are useful to answer our first question (Q1), cases 1 and 4 aim more specifically at investigating the outlier nature of Xynthia's WL (Q2), and cases 2, 3 and 4 aim at studying the capability of the BMC2 method to predict the probability of Xynthia's WL beforehand (Q3).
Regarding the systematic data up to 2010, the tide gauge provides about 27 years of data. Figure 2 shows the data after removing the linear trend (1.9 ± 0.1 mm yr−1, in agreement with the study of Gouriou et al. (2013) in the same area) and adjusting them to the mean sea level of 2010 (calculated from the same dataset and equal to 3.93 m Z.H.). Concerning historical events, the dataset is based on two analyses of archives: Garnier and Surville (2010) and Lambert (2014). A convenient perception threshold is the altitude of the old harbour dock of La Rochelle, which has remained unchanged over the studied period (first identified event: 1890). When the dock is mentioned as flooded, the water level is considered to have reached at least the dock altitude. Following the notations of Sect. 2, we are in a case where $h_1 = h_3 = 0$. Based on a Digital Terrain Model (DTM) (Litto3D®, horizontal resolution 1 m, vertical accuracy 0.15 m), the mean altitude of the dock, calculated from 457 points surrounding it, is $X_0$ = 7.1 ± 0.1 m Z.H. A total of 8 flooding events of the old harbour dock has been identified since 1890 (Table 1 and Fig. 2). The original archives can be found in the above-mentioned references. The entire historical period covers 94.4 years (including gaps in the systematic period). As explained in Sect. 2, this historical dataset must be corrected for the mean sea-level rise. Since the systematic data trend is close to the global sea-level rise trend, we use the global linear trend over 1880–1935 (1.1 ± 0.7 mm yr−1) to adjust the perception threshold $X_0$ for each year from 1890 to 1935. For the period 1936–2010, we use the trend calculated previously from local tide gauge measurements. For example, $X_0$ adjusted for year 1890 becomes $X_{0,1} = X_0 + 0.0019 \times (2010 - 1936) + 0.0011 \times (1936 - 1890) \approx 7.29$ m.
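The two-segment mean sea-level adjustment of the perception threshold can be reproduced in a few lines; the helper name and signature are ours, while the trend values are those quoted above.

```python
def adjust_threshold(X0, year, local_trend=0.0019, global_trend=0.0011,
                     switch_year=1936, ref_year=2010):
    """Shift a historical perception threshold (m) to the reference year's
    mean sea level: local tide gauge trend after switch_year, global
    sea-level trend before it. Helper name and signature are ours."""
    rise = local_trend * (ref_year - switch_year)      # 1936-2010 segment
    rise += global_trend * (switch_year - year)        # pre-1936 segment
    return X0 + rise

X0_1890 = adjust_threshold(7.1, 1890)
print(f"adjusted X0 for 1890: {X0_1890:.2f} m Z.H.")   # prints 7.29
```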

Results
To select the threshold u, the classical tools described in Sect. 2 are applied to the case with the smallest dataset, i.e. case 2. This provides a threshold u = 6.74 m (99.5th percentile). For this case, the mean number of high tides exceeding that threshold per year is λ = 3.3. For the sake of intercomparison, the threshold u is kept constant for every case (1 to 4).
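For the extremal-index correction introduced in Sect. 2, a simple empirical estimate of α as the inverse mean cluster size of consecutive exceedances (runs declustering) can be sketched as follows; the toy series and function name are ours, and the study instead fits a parametric function of the level to such estimates.

```python
import numpy as np

def extremal_index(levels, x):
    """Empirical extremal index: number of clusters of consecutive
    exceedances of x divided by the total number of exceedances."""
    above = np.asarray(levels) > x
    # A cluster starts where an exceedance is not preceded by one.
    starts = above & ~np.concatenate(([False], above[:-1]))
    n_exc = int(above.sum())
    return starts.sum() / n_exc if n_exc else 1.0

# Toy series of successive high waters (m): one storm spanning three
# consecutive tides, plus one isolated event.
hw = [5.0, 5.1, 6.9, 7.0, 6.8, 5.2, 6.8, 5.0]
alpha = extremal_index(hw, 6.7)    # 2 clusters / 4 exceedances = 0.5
print(alpha)
```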
Results are presented in Fig. 3 and Table 2. As a general comment, whatever the case, predictive return levels are uniformly above standard estimates (Fig. 3). This is a consequence of the parameter uncertainty they account for (Coles and Tawn, 2005). At low levels, there is little difference between predictive and standard return levels. At higher levels, the difference becomes larger as a consequence of the increasing parameter uncertainty.
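The predictive exceedance probability underlying these predictive return levels can be estimated as the posterior-averaged GPD survival function, i.e. Eq. (4) of Sect. 2 evaluated by Monte Carlo. A sketch, using a synthetic stand-in for the MCMC output (all numbers hypothetical):

```python
import numpy as np
from scipy.stats import genpareto

def predictive_exceedance(x_minus_u, chain):
    """Predictive P(X > x | X > u, D): mean of the GPD survival function
    over posterior samples (sigma, xi)."""
    sig, xi = chain[:, 0], chain[:, 1]
    return np.mean(genpareto.sf(x_minus_u, xi, loc=0.0, scale=sig))

# Synthetic stand-in for the MCMC output (hypothetical posterior spread).
rng = np.random.default_rng(0)
chain = np.column_stack([rng.normal(0.30, 0.02, 5000),   # sigma samples
                         rng.normal(0.05, 0.05, 5000)])  # xi samples
lam = 3.3                                # mean POT exceedances per year
p = lam * predictive_exceedance(1.27, chain)   # hypothetical level above u
print(f"annual exceedance probability: about 1 in {1 / p:.0f} years")
```

Averaging the survival function, rather than plugging in a point estimate, is what lets heavy-tailed posterior samples inflate the predicted probability of very high levels.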
First, we focus on the impact of historical information on the standard estimative return levels WL_T (T = 50, 100 or 500 years) as well as on their associated credibility intervals (Q1) (Table 2). When historical data are taken into account, the values of WL_T increase whatever the considered return period (cases 3 and 4 vs. cases 2 and 1 respectively). When systematic data until the end of 2010 are used (cases 1 and 4), the precision of the estimated return levels also changes. In particular, the integration of historical data divides the relative widths of the credibility intervals by a factor of 2, whatever the return period. When systematic data until the end of 2009 are used (cases 2 and 3), we notice that the relative widths of the credibility intervals are almost the same, with a slight increase for case 3 where historical data are integrated. This can be explained by a shift of the distribution of the GPD parameters towards the Fréchet domain (i.e. positive values of ξ) (Fig. 3a2 and a3). Indeed, as mentioned in Sect. 2, a small change of ξ in the Fréchet domain involves significant changes of the distribution. Consequently, credibility intervals are wider if the distribution of the GPD parameters lies in the Fréchet domain than if it lies in the Weibull domain. If we consider case 4 as the reference, then integrating historical data in case 3 leads to more accurate values of WL_T than in case 2, where no historical information is used, while keeping the relative width of the credibility interval similar to that of case 2. Thus, integrating historical information in the EVA of WL reduces uncertainties, with more accurate and/or more precise estimative return levels.

Next, we investigate the outlier nature of Xynthia's WL (Q2), comparing standard estimative return periods for cases 1 and 4 (Fig. 3). In case 1, the bivariate posterior probability density contours of (ξ, σ) show that the shape parameter of the distribution's mode is positive, indicating a heavy-tailed distribution. There is also a large variability of ξ, resulting in extremely large credibility intervals for the highest return periods. This is due to the value of the highest point (Xynthia), which is about 0.8 m above the second highest and could reasonably be considered an outlier. The standard estimation of the return period of Xynthia's WL is 350 years, which is much larger than four times the observation period (4 × 27 = 108 years) and therefore highly uncertain. In case 4, the bivariate density plot shows that ξ is better constrained: the historical information has greatly reduced the uncertainties on ξ, as can be seen in the credibility intervals (Fig. 3b4). In this case, the water level reached during Xynthia no longer appears as an outlier. The standard estimation of its return period is 210 years.

Finally, we evaluate whether we could have predicted Xynthia's WL before it happened (Q3), by comparing the results of cases 2, 3 and 4 in terms of standard estimation and prediction (Fig. 3). We recall that the prediction can be interpreted as the probability that next year's maximum WL (e.g. in 2010 if we are in 2009) will exceed a given threshold (here Xynthia's WL). In case 2, the shape parameter of the distribution's mode is slightly negative, which indicates a bounded distribution. The standard estimation of the return period for a Xynthia-like WL is unrealistically large (> 10^7 years). The obtained prediction of the annual probability of exceedance of a Xynthia-like WL for 2010 is p = 1:1350 years. In case 3, the bivariate density plot for the GPD parameters shows that the value of ξ for the distribution's mode is now positive compared to case 2, indicating a heavy-tailed distribution. The dispersion of ξ is slightly lower than in case 2, but its distribution is shifted towards the Fréchet domain, resulting in larger credibility intervals as mentioned previously (Fig. 3b3). The standard estimation of the return period of a Xynthia-like WL is about 490 years. Considering the predictive return levels, the annual probability of exceedance of a Xynthia-like WL is p = 1:290 years. Thus, by considering historical data, the predictive probability of having an annual maximum WL in 2010 of at least 8.01 m is about five times the one calculated in case 2, where no historical information is used. Finally, the results of case 4 can be used to estimate the predictive quality of the method for this event on the study site: interestingly, there is not much difference between case 3 and case 4. The bivariate density plot of case 4 shows that ξ is slightly greater with smaller dispersion and that σ is a bit more constrained. Consequently, the return level plots are very similar in both cases. The standard estimation of the annual exceedance probability of Xynthia's WL is 1:210 years, a value close to the 1:290 years predicted back in 2009 (case 3). Thus, at the end of 2009, applying the BMC2 method to the systematic data available at that time, together with historical information, we could have predicted the right order of magnitude of the annual exceedance probability of a Xynthia-like WL.

Discussion
By integrating historical information in the extreme value analysis of WL, the proposed method allows a better assessment of standard estimative return levels while reducing statistical uncertainties. This has been verified on the site of La Rochelle. Furthermore, the BMC2 method allows placing extreme events which can be considered as outliers in classical EVA in a broader context, thus relativising their uniqueness. The standard estimation of the return period of the WL reached during Xynthia in the complete analysis at La Pallice (case 4, T = 210 years) is significantly lower than the previous estimate of Duluc et al. (2014) using the same systematic dataset (T > 1000 years, see Sect. 1). Going one step further, the method, applied to the full dataset (systematic data up to 2013 and historical information, all data adjusted to the mean water level of year 2013), still provides a standard return period of a Xynthia-like WL of about 210 years. It is the smallest return period we found in the literature regarding Xynthia's WL, tending to show that it is probably closer to 100 years than to 1000 years. In terms of prediction, the method provides a probability of about 1:160 years that the maximum WL in 2014 will exceed that of Xynthia. However, like other EVA approaches, the BMC2 method relies on some approximations and assumptions (both on the data and the statistical model).
First, the use of historical data leads to uncertainties at two levels. Within this study, we assume that WL values at the tide gauge of La Pallice and inside the harbour of La Rochelle (about 5 km apart) are comparable. Due to local effects, this might not be exactly the case. This is a first difficulty when using historical data: most of the time, historical observations are not made at the tide gauge location. One solution to deal with this issue, although beyond the scope of this paper, would be hydrodynamic modelling of the last decades' events to statistically quantify the WL offset, called DZ hereafter. A second source of uncertainty is the estimation of the perception threshold. Most archive information reports water flooding a given area without further detail. In the present study, the archives indicate that the old harbour dock was flooded without specifying the water entrance location on the dock. We assume the threshold to be the mean dock altitude, but this is an approximation. It should be noted that since the distribution of ξ lies mostly in the Fréchet domain (especially in cases 1, 3 and 4), the standard and predictive estimation of large WL should be highly sensitive to the DZ and X_0 parameters. Nevertheless, the selected values (no DZ and X_0 = 7.1 m) lead to a standard estimation of the return periods of the 8 historical events ranging from about 10 to 15 years (Fig. 3b4). Such return periods appear consistent with the observed probability of flooding events (8 in 94.4 years) and, as a first approximation, our choice seems reasonable. For other applications, where DZ and X_0 could be more difficult to estimate, the Bayesian framework should make the integration of DZ and X_0 uncertainties within the BMC2 method possible (Reis and Stedinger, 2005).

The statistical model also contains uncertainties. In the POT/GPD model, a main source of uncertainty is the choice of the systematic threshold u. Estimated quantiles are indeed highly dependent on the threshold, the selection of which is sometimes difficult and often subjective (Li et al., 2012). In our case study, even if there are still some uncertainties in the threshold selection (done for case 2, see Sect. 3.2), the resulting estimative distribution passed two statistical adjustment tests at the 0.05 risk level (χ² with 10 classes, Greenwood and Nikulin, 1996; and Kolmogorov-Smirnov, Shorack and Wellner, 2009). A second source of uncertainty comes from the seasonal and interannual variability of WL, which has not been considered in our model. Regarding the seasonal variability, if u is chosen high enough, which is the case here, the selected events occur mainly in the winter period (October to March for the French Atlantic coast) and we can reasonably consider that seasonal variability is negligible in the POT sample. Interannual variability, on the other hand, can lead to significant variations of extreme values in time, as highlighted by the work of Menéndez et al. (2009). Finally, we have fitted the GPD directly on WL measurements, so even with additional historical information, our approach can be classified as direct (see Sect. 1). As such, additional uncertainties may be involved for high return values compared to an indirect approach (Haigh et al., 2010). However, integrating historical information in an indirect approach is challenging. It would require the characterisation of historical events in terms of surges rather than WL, a piece of information rarely found in archives, as mentioned in Sect. 1. It would thus also require the knowledge of historical tides, which might be difficult to estimate as the tide is not a stationary process, as highlighted for instance by studies of sea-level rise influence on tidal harmonics (Pickering et al., 2012).
As described in Sect. 3 and Fig. 3, the bivariate distribution of the GPD parameters for our case study at La Rochelle lies mostly in the Fréchet domain. A consequence is that small changes of ξ can generate significant changes in the return level plots, especially in the tail of the distribution. Therefore, given the approximations and assumptions described above, the estimated values of the return period of Xynthia's WL should be considered with caution, and interpreted as orders of magnitude rather than exact values.

Conclusion
To reduce statistical uncertainties and to address the issue of outliers in extreme value analyses of coastal water levels, we developed a Bayesian method to integrate historical information (even partial) on past events that occurred before the systematic gauging era. The proposed method, inspired by previous works in the hydrology field, is adapted to POT samples of coastal sea levels, taking into account the influence of mean sea-level rise. It provides standard estimative as well as predictive return levels, the latter being particularly useful for decision makers.
The application of the method to the site of La Rochelle in France illustrates the usefulness of historical information in reducing statistical uncertainties in EVA and relativising apparent outliers such as Xynthia's WL. In particular, it shows that, back in 2009 before the storm, we could have predicted the right order of magnitude of the annual exceedance probability of a Xynthia-like WL. These results are particularly important for raising the awareness of decision makers and eventually enhancing preparedness for future flooding events. However, some uncertainties remain in the data and the statistical model, and because of the high variability of the GPD tail in the Fréchet domain, the numerical values presented in this paper should be considered as indicative only.
The method opens a large field of possibilities for engineers wishing to put classical extreme value analyses of water levels into perspective with the richness of historical information on coastal floods. Furthermore, beyond the integration of historical information in the EVA of WL, the proposed method should allow combining data of different natures together with their associated uncertainties. For instance, future research may focus

with a WL x such that x < X_{0,k}, and C_k is the number of events that did not reach the perception thresholds X_{0,1}, ..., X_{0,k} during the periods of definition of the thresholds. For example, if X_{0,2} is defined for 5 years with 2 events above it during these 5 years and X_{0,3} is defined for 10 years with 1 event above it during these 10 years, then C_3 is estimated as follows: C_3 = 15λ − 3, with λ the mean number of exceedances of threshold u per year (see Sect. 2).
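As a minimal sketch of this bookkeeping (function and variable names are ours, not the paper's), the worked example for C_3 can be reproduced as:

```python
def censored_count(lam, period_years, events_above):
    """C_k: expected number of exceedances of threshold u over the
    historical periods (total duration times the rate lam), minus
    the events actually recorded above the perception thresholds
    during those periods."""
    return sum(period_years) * lam - sum(events_above)

# Example from the text: X_{0,2} defined for 5 years (2 events above it),
# X_{0,3} defined for 10 years (1 event above it), so C_3 = 15*lam - 3.
lam = 2.0  # illustrative mean number of exceedances of u per year
c3 = censored_count(lam, [5, 10], [2, 1])
```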
The empirical exceedance probabilities P_i^k (1 ≤ i ≤ A_k), or plotting positions, of the A_k events with WL between X_{0,k} and X_{0,k+1}, ranked in descending order, are finally calculated with the following formula, where a is a constant between 0 and 0.5 characterising the spacing between plotting positions. For the present work, we used the value 0.4 (Cunnane, 1978).
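The paper's exact nested formula for P_i^k is not reproduced in this excerpt. For reference, the classical Cunnane plotting position, from which only the constant a = 0.4 is taken in the text, reads p_i = (i − a) / (n + 1 − 2a) for ranks i = 1, ..., n:

```python
def cunnane_plotting_positions(n, a=0.4):
    """Classical plotting positions p_i = (i - a) / (n + 1 - 2*a)
    for n values ranked i = 1..n; a = 0.4 is Cunnane's recommended
    value. (The paper nests such positions between successive
    perception thresholds; that nesting is not shown here.)"""
    return [(i - a) / (n + 1 - 2 * a) for i in range(1, n + 1)]

# Plotting positions for five ranked events:
positions = cunnane_plotting_positions(5)
```

With a = 0.4 the positions are symmetric about 0.5 and never reach 0 or 1, which keeps the most extreme ranked event off the boundary of the fitted distribution.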

Church and White, 2011) and the vertical land motion at the study site (monitored by a GPS station since 2001) is negligible (Santamaría-Gómez et al., 2012), we can assume that the relative sea-level rise in the La Rochelle area is equal to the absolute global sea-level rise. Making the hypothesis that this result holds over the long range, we use the global sea-level rise rate of Church and White (2011) over the period
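The correction itself amounts to bringing each observed WL to a common reference year with a linear trend. The sketch below assumes a rate of 1.7 mm/yr, which is indicative of the long-term global estimate of Church and White (2011); the exact rate and reference year used in the study are not given in this excerpt:

```python
def to_reference_year(wl_m, year, ref_year=2010, rate_mm_per_yr=1.7):
    """Express an observed WL (in metres) in the datum of ref_year by
    adding the linear relative sea-level rise accumulated between the
    event year and ref_year. Rate and reference year are illustrative
    assumptions, not the paper's exact values."""
    return wl_m + (ref_year - year) * rate_mm_per_yr / 1000.0

# An 1890 event observed at 7.2 m Z.H., expressed in the 2010 datum:
wl_2010 = to_reference_year(7.2, 1890)
```

This keeps historical and recent exceedances of the perception threshold comparable within a single POT sample.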

Table 1 .
Summary of the 8 historical flooding events that have submerged the old harbour dock since 1890. The altitude of the old harbour dock is 7.1 m Z. H. Each event reported here therefore generated a WL higher than 7.1 m Z. H. in the year of the event. Notations for the sources: GS - Garnier and Surville (2010); L - Lambert (2014).

Table 2 .
Standard estimative return values of WL and widths of the associated 95 % credibility intervals (absolute, ∆CI, and relative, ∆CI/WL_T) for several return periods T according to each case.