Extreme Value Analysis of Time Series II


Holger Drees, University of Hamburg

Risk Semester, AMERISKA


Outline

1 Estimation of Marginal Parameters
    Tail Empirical Processes
    Asymptotics of Estimators
    Simulations

2 Empirical Processes of Cluster functionals
    Definitions and Examples
    Sufficient Conditions for Convergence
    Application: Estimation of the Distribution of a Spectral Process
    Bootstrapped empirical processes of cluster functionals


Estimation of marginal parameters

We have seen that the scale and location parameters of blockwise maxima are influenced by serial dependence via the extremal index. In contrast, (point) estimators based on the POT (peaks over threshold) approach, resp. on the largest order statistics, that were constructed for iid data are usually still consistent in the presence of serial dependence. However, the distribution of the estimation error is typically more spread out for serially dependent data. This has to be taken into account in the construction of confidence regions.


Simulation: performance of the Hill estimator

Consider two time series with the same marginal distribution:
- iid unit Fréchet data with cdf $F(x) = \exp(-x^{-1})$, $x > 0$
- moving maxima: $X_t = \max(Z_t, Z_{t-1})/2$ with iid unit Fréchet innovations $Z_t$

We estimate the extreme value index $\gamma = 1$ with the Hill estimator
$$\hat\gamma_n := \frac{1}{k}\sum_{i=1}^{k}\log\big(X_{n-i+1:n}/X_{n-k:n}\big).$$

Histograms of the Hill estimator (n = 1000, k = 99) and qq-plot based on 1000 simulations
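A minimal simulation sketch of this comparison (NumPy only; all names are illustrative, and the unit Fréchet innovations are generated by inverting the cdf):

```python
import numpy as np

def hill(x, k):
    """Hill estimator based on the k+1 largest order statistics."""
    xs = np.sort(x)
    return np.mean(np.log(xs[-k:] / xs[-k - 1]))

rng = np.random.default_rng(0)
n, k, n_sim = 1000, 99, 1000
est_iid, est_mm = [], []
for _ in range(n_sim):
    z = -1.0 / np.log(rng.uniform(size=n + 1))             # unit Frechet: F(z) = exp(-1/z)
    est_iid.append(hill(z[1:], k))                          # iid series
    est_mm.append(hill(np.maximum(z[1:], z[:-1]) / 2, k))   # moving maxima series
print(np.std(est_iid), np.std(est_mm))  # estimation error is more spread out under dependence
```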


Tail empirical quantile function

In the following we focus on
- stationary univariate time series $(X_t)_{t\in\mathbb{Z}}$ with marginal cdf $F \in D(G_\gamma)$
- estimators which can be written as functions of large order statistics

Denote the order statistics by $X_{1:n} \le X_{2:n} \le \ldots \le X_{n-1:n} \le X_{n:n}$.

If estimators (or test statistics) use only the $k+1$ largest observations, then they can be written as functionals of the tail empirical quantile function
$$Q_n(t) := X_{n-\lfloor kt\rfloor:n}, \qquad t \in [0,1].$$

We will analyze the asymptotic behavior of $Q_n$ under suitable mixing conditions. The asymptotics of smooth functionals $T(Q_n)$ then follow by a functional delta method.


Examples

Hill estimator:
$$\hat\gamma_n := \frac{1}{k}\sum_{i=1}^{k}\log\frac{X_{n-i+1:n}}{X_{n-k:n}} = \int_0^1 \log\frac{Q_n(t)}{Q_n(1)}\,dt$$

Estimators of extreme quantiles $x_p := F^{\leftarrow}(1-p)$ for small $p$: choose $t = k/n$, $x = np/k$ in the convergence
$$\lim_{t\downarrow 0} \frac{F^{\leftarrow}(1-tx) - F^{\leftarrow}(1-t)}{a(t)} = \frac{x^{-\gamma}-1}{\gamma}$$
(for a suitable normalizing function $a$) to motivate the estimator
$$\hat x_{n,p} := X_{n-k:n} + \hat a_n \frac{(np/k)^{-\hat\gamma_n}-1}{\hat\gamma_n}$$
with suitable estimators $\hat\gamma_n = T(Q_n)$ and $\hat a_n = S(Q_n)$ of $\gamma$ and $a(k/n)$, respectively.

If $\gamma > 0$, then one can choose $a(t) = \gamma F^{\leftarrow}(1-t)$ and the estimator simplifies to
$$\hat x_{n,p} = X_{n-k:n}\,(np/k)^{-\hat\gamma_n}.$$
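A minimal sketch of the simplified estimator for $\gamma > 0$ (Hill estimator plus the extrapolation above); the function name and the simulated example are illustrative only:

```python
import numpy as np

def extreme_quantile(x, k, p):
    """Estimate x_p = F^{<-}(1-p) by X_{n-k:n} * (n p / k)**(-gamma_hat),
    with gamma_hat the Hill estimator from the k+1 largest order statistics."""
    n = len(x)
    xs = np.sort(x)
    gamma_hat = np.mean(np.log(xs[-k:] / xs[-k - 1]))
    return xs[-k - 1] * (n * p / k) ** (-gamma_hat)

# usage: unit Frechet data, true (1-p)-quantile is -1/log(1-p), roughly 1/p
rng = np.random.default_rng(1)
x = -1.0 / np.log(rng.uniform(size=2000))
print(extreme_quantile(x, k=100, p=0.0005))
```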


Uniform tail empirical process

It is often simpler to first standardize the marginal cdf by considering the rv's $U_t = F(X_t)$. If $F$ is continuous, then $U_t$ is uniformly distributed on $[0,1]$.

An important step in the analysis of tail empirical quantile functions is the analysis of the corresponding uniform tail empirical processes:
$$e_n(x) := \frac{1}{\sqrt{nv_n}}\sum_{t=1}^{n}\big(1\{U_t > 1 - v_n x\} - v_n x\big), \qquad x \ge 0,$$
with $v_n \to 0$, $nv_n \to \infty$.

For iid observations it is well known that, for a standard Brownian motion $W$,
$$e_n \longrightarrow W \quad \text{weakly in } D[0,\infty),$$
i.e. weak convergence holds w.r.t. the supremum norm on bounded intervals.
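A small sketch that evaluates $e_n$ on a grid for simulated iid uniforms (names are illustrative; for iid data the path should look roughly like a Brownian motion):

```python
import numpy as np

def tail_empirical_process(u, v_n, xs):
    """e_n(x) = (n v_n)^{-1/2} * sum_t ( 1{U_t > 1 - v_n x} - v_n x ) on a grid xs."""
    n = len(u)
    exceed = u[:, None] > 1.0 - v_n * xs[None, :]     # n x len(xs) indicator matrix
    return (exceed.sum(axis=0) - n * v_n * xs) / np.sqrt(n * v_n)

rng = np.random.default_rng(2)
u = rng.uniform(size=5000)
print(tail_empirical_process(u, v_n=0.02, xs=np.linspace(0.0, 2.0, 5)))
```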


Uniform tail empirical process (cont.)

For iid data
$$e_n(x) := \frac{1}{\sqrt{nv_n}}\sum_{t=1}^{n}\big(1\{U_t > 1 - v_n x\} - v_n x\big) \longrightarrow W(x) \quad \text{weakly in } D[0,\infty).$$

Questions:
- Under which conditions does this generalize to time series? How does serial dependence influence the limit process?
- For x close to 0, the above convergence is uninformative, because both sides are close to 0. Can one strengthen the convergence result by using other norms, e.g. by dividing both sides by a weight function q(x) converging to 0 as x ↓ 0?


Mixing conditions for time series

Mixing coefficients measure the "strength of dependence" between observations separated in time. Let
$$\mathcal{B}_i^j := \sigma\big((X_t)_{i\le t\le j}\big),$$
$$\alpha_k := \sup\big\{|P(A\cap B) - P(A)\,P(B)| \;:\; A \in \mathcal{B}_1^l,\ B \in \mathcal{B}_{l+k+1}^{\infty},\ l \in \mathbb{N}\big\},$$
$$\beta_k := \sup_{l\in\mathbb{N}} E\Big(\operatorname*{ess\,sup}\big\{|P(B \mid \mathcal{B}_1^l) - P(B)| \;:\; B \in \mathcal{B}_{l+k+1}^{\infty}\big\}\Big).$$

$(X_t)_{t\in\mathbb{N}}$ is strong mixing (α-mixing) $\iff \lim_{k\to\infty}\alpha_k = 0$
$(X_t)_{t\in\mathbb{N}}$ is absolutely regular (β-mixing) $\iff \lim_{k\to\infty}\beta_k = 0$
β-mixing implies α-mixing.


Uniform tail empirical process: β-mixing

Recall
$$e_n(x) := \frac{1}{\sqrt{nv_n}}\sum_{t=1}^{n}\big(1\{U_t > 1 - v_n x\} - v_n x\big), \qquad x \ge 0,$$
for some $v_n \to 0$, $nv_n \to \infty$; let $\alpha_k, \beta_k$ be the mixing coefficients of $(U_t)_{t\in\mathbb{N}}$.

Theorem 2.1 (Rootzén 1995, 2009; D. 2000)
Suppose that, for some $1 \ll \ell_n \ll r_n \ll n$ and some $K > 0$,
(C1) $\dfrac{n}{r_n}\beta_{\ell_n} + r_n (nv_n)^{-1/2}\log^2(nv_n) \to 0$
(C2) $\dfrac{1}{r_n v_n}\,\mathrm{Cov}\Big(\displaystyle\sum_{t=1}^{r_n} 1\{U_t > 1-v_n x\},\ \sum_{t=1}^{r_n} 1\{U_t > 1-v_n y\}\Big) \to c(x,y) \quad \forall\, 0 \le x, y \le x_0$
(C3) $E\Big(\displaystyle\sum_{t=1}^{r_n} 1\{1-v_n y < U_t \le 1-v_n x\}\Big)^2 \le K r_n v_n (y-x) \quad \forall\, 0 \le x < y \le x_0$

Then $\dfrac{e_n}{q} \to \dfrac{e}{q}$ weakly in $D[0,x_0]$ for a centered Gaussian process $e$ with covariance function $c$, provided $x^{1/4}|\log x|^{1/4+\delta} = O(q(x))$ as $x \downarrow 0$ for some $\delta > 0$.


Uniform tail empirical process: α-mixing

Theorem 2.2 (Rootzén, 2009)
Suppose (C2) holds together with the strengthened moment condition
(C3) $E\Big(\displaystyle\sum_{i=1}^{r_n} 1\{1-v_n y < U_i \le 1-v_n x\}\Big)^{2+\varepsilon} \le K r_n v_n (y-x) \quad \forall\, 0 \le x < y \le x_0$
and conditions on the α-mixing coefficients,
(D1) $\alpha_k = o(k^{-\theta})$ as $k \to \infty$ and $\dfrac{n}{r_n}\alpha_{\ell_n} \to 0$
(D2) $\displaystyle\sum_{i,j=1}^{\lfloor n/r_n\rfloor} \mathrm{Cov}\Big(\sum_{t\in I_i} 1\{U_t > 1-v_n x\},\ \sum_{s\in I_j} 1\{U_s > 1-v_n y\}\Big) = o\big((nv_n)^{\nu}\big)$ for blocks $I_i$ of length $r_n$ and some $\nu$.
Then $e_n \to e$ weakly in $D[0,x_0]$ for a centered Gaussian process $e$ with covariance function $c$.

Uniform tail empirical quantile process

Note that
$$e_n(x) = \sqrt{nv_n}\Big(\frac{1}{nv_n}\sum_{t=1}^{n} 1\{U_t > 1-v_n x\} - x\Big) = \sqrt{nv_n}\Big(\frac{\bar G_n(1-v_n x)}{v_n} - x\Big),$$
and define the uniform tail empirical quantile process
$$g_n(t) := \sqrt{nv_n}\Big(\frac{1-G_n^{\leftarrow}(1-v_n t)}{v_n} - t\Big),$$
where $(1-G_n^{\leftarrow}(1-v_n\,\bullet))/v_n$ is the generalized inverse of $\bar G_n(1-v_n\,\bullet)/v_n$.


Uniform tail empirical quantile process (cont.)

Lemma 2.3 (Vervaat)
If $z_n \in D[0,x_0]$ is nondecreasing, $z \in C[0,x_0]$ for some $x_0 > 1$, and $\lambda_n \to \infty$ are such that
$$\lambda_n\big(z_n(x) - x\big)_{0\le x\le x_0} \to z,$$
then
$$\lambda_n\big(z_n^{\leftarrow}(x) - x\big)_{0\le x\le 1} \to -z.$$

Corollary 2.4
If, for some $x_0 > 1$, the conditions of Theorem 2.1 or of Theorem 2.2 hold, then
$$g_n \to e \quad \text{weakly in } D[0,1].$$


Weighted approx. of the uniform tail empirical quantile process

Problem: Many important extreme value statistics are functionals of the tail empirical quantile function $Q_n = F_n^{\leftarrow}(1 - v_n\,\bullet)$, but the functional is not differentiable w.r.t. the supremum norm.
Solution: Consider weighted supremum norms.

Corollary 2.5
If the conditions of Theorem 2.1 hold for some $x_0 > 1$, then
$$\frac{g_n}{q} \to \frac{e}{q} \quad \text{weakly in } D[0,1]$$
for all weight functions $q$ such that $x^{1/4}|\log x|^{1/4+\delta} = O(q(x))$ as $x \downarrow 0$ for some $\delta > 0$.

Under a strengthening of the moment condition (C3), or under extra conditions on $\ell_n$ and $v_n$, the weight function may tend to 0 at a faster rate.


General tail quantile functions

Recall the definition of the tail empirical quantile function
$$Q_n(x) := F_n^{\leftarrow}\Big(1 - \frac{k_n}{n}x\Big) = X_{n-\lfloor k_n x\rfloor:n}, \qquad 0 \le x \le 1,$$
for some intermediate sequence $k_n = o(n)$.

If $F$ belongs to the max-domain of attraction of $G_\gamma(x) = \exp\big(-(1+\gamma x)^{-1/\gamma}\big)$, then
$$R(t,x) := \frac{F^{\leftarrow}(1-tx) - F^{\leftarrow}(1-t)}{a(t)} - \frac{x^{-\gamma}-1}{\gamma} \xrightarrow{t\downarrow 0} 0$$
for suitable $a(t) > 0$. Even, for all $x_0, \eta > 0$,
$$\sup_{x\in[0,x_0]} x^{\gamma+\eta}|R(t,x)| \to 0 \quad \text{as } t \downarrow 0.$$

A quantile transformation technique yields a weighted approximation for $Q_n$.


Weighted approximations of general tail quantile functions

Theorem 2.6 (Drees, 2000)
Suppose, for some $x_0 > 1$, $K > 0$ and all $0 \le x, y \le x_0$,
(C̃1) $\dfrac{n}{r_n}\beta_{\ell_n} + r_n k_n^{-1/2}\log^2 k_n \to 0$
(C̃2) $\dfrac{n}{r_n k_n}\,\mathrm{Cov}\Big(\displaystyle\sum_{i=1}^{r_n} 1\{X_i > F^{\leftarrow}(1-\tfrac{k_n}{n}x)\},\ \sum_{i=1}^{r_n} 1\{X_i > F^{\leftarrow}(1-\tfrac{k_n}{n}y)\}\Big) \to c(x,y)$
(C̃3) $\dfrac{n}{r_n k_n}\,E\Big(\displaystyle\sum_{i=1}^{r_n} 1\{F^{\leftarrow}(1-\tfrac{k_n}{n}y) < X_i \le F^{\leftarrow}(1-\tfrac{k_n}{n}x)\}\Big)^2 \le K(y-x)$
and
$$k_n^{1/2}\sup_{x\in[0,x_0]}\frac{x^{\gamma+1}}{q(x)}\Big|R\Big(\frac{k_n}{n},x\Big)\Big| \to 0.$$
Then
$$\bigg(\frac{x^{\gamma+1}}{q(x)}\,k_n^{1/2}\Big(\frac{Q_n(x)-D_n}{a(k_n/n)} - \frac{x^{-\gamma}-1}{\gamma}\Big)\bigg)_{0\le x\le 1} \longrightarrow \frac{e}{q}$$
weakly in $D[0,1]$ for a suitable random variable $D_n$ ($= F^{\leftarrow}(1-k_n/n)$ if $\gamma \ge -1/4$).

weakly in D[0, 1] for suitable random variable Dn (= F ← (1 − kn /n) if γ ≥ −1/4) Holger Drees

EVA of Time Series II

17/64

Estimation of Marginal Parameters Empirical Processes of Cluster functionals

Tail Empirical Processes Asymptotics of Estimators Simulations

Weighted approximations of the tail quantile function for γ > 0

For $\gamma > 0$ one can choose $a(t) = \gamma F^{\leftarrow}(1-t)$ and Theorem 2.6 simplifies:

Corollary 2.7 (Drees, 2000)
Under the conditions (C̃1)–(C̃3) and
$$k_n^{1/2}\sup_{x\in[0,x_0]}\frac{x^{\gamma+1}}{q(x)}\Big|\frac{F^{\leftarrow}(1-k_n x/n)}{F^{\leftarrow}(1-k_n/n)} - x^{-\gamma}\Big| \to 0,$$
one has
$$\bigg(\frac{x^{\gamma+1}}{q(x)}\,k_n^{1/2}\Big(\frac{X_{n-\lfloor k_n x\rfloor:n}}{F^{\leftarrow}(1-k_n/n)} - x^{-\gamma}\Big)\bigg)_{0\le x\le 1} \longrightarrow \gamma\,\frac{e}{q}$$
weakly in $D[0,1]$.


Application to the Hill estimator

Recall that the Hill estimator $\hat\gamma_n$ for $\gamma > 0$ can be represented as a smooth functional $T(Q_n)$ of the tail empirical quantile function $Q_n$ with
$$T(z) = \int_0^1 \log\frac{z(t)}{z(1)}\,dt.$$

It can be shown that, for decreasing functions, the functional $T$ is differentiable at $z_\gamma$ defined by $z_\gamma(t) := t^{-\gamma}$ in the following sense: if $\sup_{t\in[0,1]} t^{\gamma+1-\delta}|y_n(t) - y(t)| \to 0$ for some $\delta > 0$ and $\lambda_n \to 0$, then
$$\frac{T(z_\gamma + \lambda_n y_n) - T(z_\gamma)}{\lambda_n} \to \int_0^1 \big(t^{\gamma}y(t) - y(1)\big)\,dt.$$


Application to the Hill estimator (cont.)

If $\sup_{t\in[0,1]} t^{\gamma+1-\delta}|y_n(t)-y(t)| \to 0$ for some $\delta > 0$ and $\lambda_n \to 0$, then
$$\frac{T(z_\gamma + \lambda_n y_n) - T(z_\gamma)}{\lambda_n} \to \int_0^1 \big(t^{\gamma}y(t) - y(1)\big)\,dt.$$

Apply this with
$$\lambda_n = k_n^{-1/2}, \qquad y_n = k_n^{1/2}\Big(\frac{Q_n}{F^{\leftarrow}(1-k_n/n)} - z_\gamma\Big)$$
and $y(t) = \gamma t^{-(\gamma+1)}e(t)$ to obtain
$$k_n^{1/2}(\hat\gamma_n - \gamma) = k_n^{1/2}\Big(T\Big(\frac{Q_n}{F^{\leftarrow}(1-k_n/n)}\Big) - T(z_\gamma)\Big) = \frac{T(z_\gamma + k_n^{-1/2}y_n) - T(z_\gamma)}{k_n^{-1/2}} \longrightarrow \gamma\int_0^1\big(t^{-1}e(t) - e(1)\big)\,dt,$$
and the limit is centered Gaussian with variance $\gamma^2 c(1,1)$.


Application to quantile estimation

Recall the estimator $\hat x_{n,p_n} = X_{n-k_n:n}(np_n/k_n)^{-\hat\gamma_n} = Q_n(1)(np_n/k_n)^{-T(Q_n)}$ for the quantile $x_{p_n} = F^{\leftarrow}(1-p_n)$.

Suppose $p_n \to 0$ s.t. $np_n = o(k_n)$, $\log(np_n/k_n) = o(k_n^{1/2})$ and
$$\frac{F^{\leftarrow}(1-p_n)}{F^{\leftarrow}(1-k_n/n)}\Big(\frac{np_n}{k_n}\Big)^{\gamma} - 1 = o\big(k_n^{-1/2}\log(np_n/k_n)\big).$$
Then
$$\frac{k_n^{1/2}}{|\log(np_n/k_n)|}\log\frac{\hat x_{n,p_n}}{x_{p_n}} = \frac{k_n^{1/2}}{|\log(np_n/k_n)|}\log\frac{Q_n(1)}{F^{\leftarrow}(1-k_n/n)} - k_n^{1/2}(\hat\gamma_n-\gamma)\frac{\log(np_n/k_n)}{|\log(np_n/k_n)|} + \frac{k_n^{1/2}}{|\log(np_n/k_n)|}\log\frac{F^{\leftarrow}(1-k_n/n)(np_n/k_n)^{-\gamma}}{x_{p_n}}$$
$$= O_P\big(1/|\log(np_n/k_n)|\big) + k_n^{1/2}(\hat\gamma_n-\gamma) + o(1) \longrightarrow \gamma\int_0^1\big(t^{-1}e(t)-e(1)\big)\,dt.$$


Construction of confidence intervals

The asymptotic variance depends on the unknown extremal serial dependence via the covariance function c. For the construction of confidence intervals, it has to be estimated. Known approaches:
- bootstrap (will be discussed later in greater generality)
- use the fact that the errors of estimators which are based on different numbers k of large order statistics are related; the fluctuations of the process of estimators indexed by k contain important information on the asymptotic variance


Serial estimator of the asymptotic variance

As an example, consider the Hill estimator based on the largest $i+1$ order statistics:
$$\hat\gamma_{n,i} := T\Big(Q_n\Big(\frac{i}{k_n}\,\bullet\Big)\Big), \qquad i \le k_n.$$
One may conclude, similarly as above, that
$$\Big(k_n^{1/2}\big(\hat\gamma_{n,\lfloor\lambda k_n\rfloor} - \gamma\big)\Big)_{\lambda\in(0,1]} \longrightarrow \bigg(\frac{\gamma}{\lambda}\int_0^1\big(t^{-1}e(\lambda t) - e(\lambda)\big)\,dt\bigg)_{\lambda\in(0,1]} =: \big(Z_{T,\gamma}(\lambda)\big)_{\lambda\in(0,1]}$$
weakly in $D(0,1]$.

Under a slight generalization of condition (C̃2) one can show: the limit covariance function of $e$ is homogeneous of order 1, and thus $e$ is self-similar (like Brownian motion), i.e. $e(\lambda\,\bullet) =_d \lambda^{1/2}e(\bullet)$. Hence $Z_{T,\gamma}$ is self-similar, too, and thus $\tilde Z_{T,\gamma}(u) := e^{u/2}Z_{T,\gamma}(e^u)$ is stationary.


Serial estimator of the asymptotic variance (cont.)

$$\Big(k_n^{1/2}\big(\hat\gamma_{n,\lfloor\lambda k_n\rfloor}-\gamma\big)\Big)_{\lambda\in(0,1]} \to \big(Z_{T,\gamma}(\lambda)\big)_{\lambda\in(0,1]}, \qquad \tilde Z_{T,\gamma}(u) := e^{u/2}Z_{T,\gamma}(e^u) \text{ stationary}$$

An ergodic theorem yields
$$\Big(\log\frac{k_n}{j_n}\Big)^{-1}\sum_{i=j_n}^{k_n}\big(\hat\gamma_{n,i}-\hat\gamma_{n,k_n}\big)^2 \approx \Big(\log\frac{k_n}{j_n}\Big)^{-1}\int_{j_n/k_n}^{1}\big(Z_{T,\gamma}(s)-Z_{T,\gamma}(1)\big)^2\,ds$$
$$= \Big(\log\frac{k_n}{j_n}\Big)^{-1}\int_{\log(j_n/k_n)}^{0}\big(\tilde Z_{T,\gamma}(u)-e^{u/2}\tilde Z_{T,\gamma}(0)\big)^2\,du \longrightarrow E\big(\tilde Z_{T,\gamma}^2(0)\big) = \mathrm{as.Var}(\hat\gamma_n),$$
provided $j_n/k_n \to 0$ sufficiently slowly.

Similarly, a variance estimator can be constructed from the sequence of estimators $\hat x_{n,p_n}$ of extreme quantiles for different numbers of largest order statistics.
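A minimal sketch of this serial variance estimator for the Hill estimator (it assumes the display above; the arguments j and k and all names are illustrative):

```python
import numpy as np

def hill_path(x, k_max):
    """Hill estimates gamma_hat_{n,i} for i = 2, ..., k_max."""
    logs = np.log(np.sort(x)[::-1])          # descending log order statistics
    csum = np.cumsum(logs)
    i = np.arange(2, k_max + 1)
    return i, csum[i - 1] / i - logs[i]

def serial_variance(x, k, j):
    """(log(k/j))^{-1} * sum_{i=j}^{k} (gamma_hat_{n,i} - gamma_hat_{n,k})^2,
    an estimate of the asymptotic variance of k^{1/2}(gamma_hat_n - gamma)."""
    i, g = hill_path(x, k)
    return np.sum((g[i >= j] - g[-1]) ** 2) / np.log(k / j)
```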


Simulations: non-coverage probabilities

Consider the simplified extreme quantile estimator in the case $\gamma > 0$:
$$\hat x_{n,p} := X_{n-k_n:n}\Big(\frac{np}{k_n}\Big)^{-\hat\gamma_n}, \qquad \hat\gamma_n \text{ the Hill estimator}.$$

Simulations for 6 different symmetric time series models of length n = 2000 (i.e., up to about 1000 positive observations which can be used by the Hill estimator):

ARMA(1,1) models $X_i - \phi X_{i-1} = Z_i + \theta Z_{i-1}$ with two-sided Fréchet innovations, i.e. $1 - F_Z(x) = F_Z(-x) = \tfrac12 x^{-3}$, $x \ge 1$, and
(i) φ = 0.95, θ = 0.9
(ii) φ = 0.95, θ = −0.6
(iii) φ = 0.95, θ = −0.9
(iv) φ = 0.3, θ = 0.9

(G)ARCH time series $X_i = \sigma_i Z_i$ with iid standard normal innovations $Z_i$ and
(v) $\sigma_i^2 = 0.0001 + 0.9 X_{i-1}^2$
(vi) $\sigma_i^2 = 0.0001 + 0.4 X_{i-1}^2 + 0.5\sigma_{i-1}^2$


Simulations: non-coverage probabilities (cont.)

Estimate $F^{\leftarrow}(1-p_n)$ with $p_n = 1/n = 0.0005$.

Comparison of non-coverage probabilities of the confidence interval based on the above theory (solid lines), i.e.
$$\Big[\hat x_{n,p_n}\exp\Big(-z_{\alpha/2}\,\hat\sigma_{T,\gamma}\,k_n^{-1/2}\log\frac{k_n}{np_n}\Big),\ \hat x_{n,p_n}\exp\Big(z_{\alpha/2}\,\hat\sigma_{T,\gamma}\,k_n^{-1/2}\log\frac{k_n}{np_n}\Big)\Big],$$
with those of confidence intervals for iid data (dashed lines), i.e.
$$\Big[\hat x_{n,p_n}\exp\Big(-z_{\alpha/2}\,\hat\gamma_n\,k_n^{-1/2}\log\frac{k_n}{np_n}\Big),\ \hat x_{n,p_n}\exp\Big(z_{\alpha/2}\,\hat\gamma_n\,k_n^{-1/2}\log\frac{k_n}{np_n}\Big)\Big].$$
The nominal non-coverage probability α = 0.05 is indicated by dotted lines.
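A hedged sketch of the first (dependence-adjusted) interval, assuming SciPy for the normal quantile; the serial standard deviation estimate is passed in explicitly (e.g. the square root of the serial variance from the previous sketch), and passing the Hill estimate instead yields the iid interval:

```python
import numpy as np
from scipy.stats import norm

def quantile_ci(x, k, p, sigma_hat, alpha=0.05):
    """[ x_hat*exp(-z*s), x_hat*exp(z*s) ] with z = z_{alpha/2} and
    s = sigma_hat * k^{-1/2} * log(k / (n*p))."""
    n = len(x)
    xs = np.sort(x)
    gamma_hat = np.mean(np.log(xs[-k:] / xs[-k - 1]))     # Hill estimator
    x_hat = xs[-k - 1] * (n * p / k) ** (-gamma_hat)      # extreme quantile estimate
    half = norm.ppf(1 - alpha / 2) * sigma_hat * np.log(k / (n * p)) / np.sqrt(k)
    return x_hat * np.exp(-half), x_hat * np.exp(half)
```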


Simulations: non-coverage probabilities (cont.)

[Plots of non-coverage probabilities (in %) against k for the ARMA(1,1) models $X_i - \phi X_{i-1} = Z_i + \theta Z_{i-1}$ with (i) φ = 0.95, θ = 0.9 and (ii) φ = 0.95, θ = −0.6.]

Simulations: non-coverage probabilities (cont.)

[Plots of non-coverage probabilities (in %) against k for the ARMA(1,1) models $X_i - \phi X_{i-1} = Z_i + \theta Z_{i-1}$ with (iii) φ = 0.95, θ = −0.9 and (iv) φ = 0.3, θ = 0.9.]

Simulations: non-coverage probabilities (cont.)

[Plots of non-coverage probabilities (in %) against k for the (G)ARCH models $X_i = \sigma_i Z_i$ with (v) $\sigma_i^2 = 0.0001 + 0.9X_{i-1}^2$ and (vi) $\sigma_i^2 = 0.0001 + 0.4X_{i-1}^2 + 0.5\sigma_{i-1}^2$.]

Choice of the sample fraction

The performance of the estimators strongly depends on the number k + 1 of largest order statistics used for estimation. When k is too large and the bias kicks in, the variance estimator misinterprets this as a larger variance. Hence, it seems reasonable to choose k such that the variance estimate is small (but not smaller than in the iid case), provided k is not too small. The following choice turned out to work well in simulations:
$$\hat k := \arg\min\big\{\hat\sigma(k) \mid k \ge 80,\ \hat\sigma(k) \ge \hat\gamma_n(k)\big\}$$

model   pn = 0.0005   pn = 0.0001
(i)     2.5%          2.2%
(ii)    5.3%          6.6%
(iii)   6.1%          6.7%
(iv)    10.1%         14.1%
(v)     7.7%          8.6%
(vi)    5.5%          6.3%
iid     5.4%          6.0%

Empirical non-coverage probabilities for models (i)–(vi) and iid Fréchet data with k chosen as above.
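A sketch of this selection rule; the slide only fixes k ≥ 80, so the lower index j of the variance estimator and the search range k_max are illustrative choices:

```python
import numpy as np

def choose_k(x, j=40, k_min=80, k_max=600):
    """k_hat = argmin{ sigma_hat(k) : k >= k_min, sigma_hat(k) >= gamma_hat(k) },
    where gamma_hat(k) is the Hill estimate and sigma_hat(k) the serial
    standard deviation estimate based on the k+1 largest order statistics."""
    logs = np.log(np.sort(x)[::-1])
    csum = np.cumsum(logs)
    idx = np.arange(2, k_max + 1)
    gammas = csum[idx - 1] / idx - logs[idx]     # Hill path gamma_hat(2), ..., gamma_hat(k_max)
    best_k, best_sigma = None, np.inf
    for k in range(k_min, k_max + 1):
        gk = gammas[k - 2]
        diffs = gammas[j - 2:k - 1] - gk         # gamma_hat(i) - gamma_hat(k), i = j, ..., k
        sigma = np.sqrt(np.sum(diffs ** 2) / np.log(k / j))
        if gk <= sigma < best_sigma:
            best_k, best_sigma = k, sigma
    return best_k
```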


Outline

2 Empirical Processes of Cluster functionals
    Definitions and Examples
    Sufficient Conditions for Convergence
    Application: Estimation of the Distribution of a Spectral Process
    Bootstrapped empirical processes of cluster functionals


Cluster functionals

Problem: Tail empirical processes convey only information about the marginal cdf. More general empirical processes, indexed by functions (or sets), are needed for the analysis of serial dependence between extremes. To this end, the concept of cluster functionals is most useful.

$(X_t)_{t\in\mathbb{Z}}$: $\mathbb{R}^d$-valued stationary time series

We first set 'non-extreme' observations (depending on the sample size n and the problem at hand) to 0, leading to E-valued random vectors $X_{n,t}$, e.g.
- for univariate time series with cdf $F \in D(G_\gamma)$:
$$X_{n,t} := \Big(\frac{X_t - b_n}{a_n}\Big)_+ = \frac{X_t - b_n}{a_n}\,1\{X_t > b_n\}$$
- for regularly varying time series:
$$X_{n,t} := \Big(\frac{X_{t+h}}{u_n}\Big)_{-h_0\le h\le h_0} 1\{\|X_t\| > u_n\}$$

We want to apply functionals to whole 'clusters' of non-vanishing $X_{n,t}$; different concepts of clusters are possible.


Clusters

[Plot of a simulated time series over indices 0–700 together with a zoom into indices 460–550, illustrating that exceedances of a high threshold occur in clusters.]


Clusters

Exceedances over $u_n$ (may) occur in clusters. Possible interpretations of clusters:
- runs: exceedances belong to different clusters if separated by at least l observations below $u_n$
- crossings: exceedances belong to different clusters if separated by an observation below $u_n - \varepsilon$
- blocks: split the sample into blocks; cluster = all exceedances in a block

The blocks approach is perhaps the least natural definition, but mathematically the easiest to handle; it will be used in what follows: $m_n = \lfloor n/r_n\rfloor$ blocks of length $r_n$ (see the sketch below).
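A minimal sketch of the blocks approach (the threshold, the block length and the unit Fréchet example are illustrative):

```python
import numpy as np

def blocks(x_n, r_n):
    """Split X_{n,1}, ..., X_{n,n} into m_n = floor(n/r_n) blocks of length r_n."""
    m_n = len(x_n) // r_n
    return x_n[: m_n * r_n].reshape(m_n, r_n)

rng = np.random.default_rng(3)
x = -1.0 / np.log(rng.uniform(size=2000))         # unit Frechet sample
u_n = np.quantile(x, 0.95)
x_n = np.where(x > u_n, x / u_n, 0.0)             # 'non-extreme' observations set to 0
y = blocks(x_n, r_n=50)
print((y > 0).sum(axis=1))                        # number of exceedances per block (cluster sizes)
```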


Cluster cores and cluster functionals

Consider the blocks $Y_{n,j} = (X_{n,t})_{(j-1)r_n < t \le j r_n}$, $1 \le j \le m_n$, and apply cluster functionals $f$ to them. The resulting empirical process of cluster functionals is
$$Z_n(f) := \frac{1}{\sqrt{nv_n}}\sum_{j=1}^{m_n}\big(f(Y_{n,j}) - Ef(Y_{n,j})\big).$$
For example, with $X_{n,t} = \frac{X_t-b_n}{a_n}1\{X_t > b_n\}$ and $f_x(y_1,\ldots,y_r) := \sum_{t=1}^{r}1\{y_t > x\}$, the process $Z_n(f_x)$ equals the tail empirical process
$$\frac{1}{\sqrt{nv_n}}\sum_{t=1}^{m_n r_n}\big(1\{X_t > a_n x + b_n\} - P\{X_t > a_n x + b_n\}\big).$$


Example: empirical distribution of the spectral process

Recall that the spectral process $(\Theta_t)_{t\in\mathbb{Z}}$ of an $\mathbb{R}^d$-valued regularly varying time series is defined via
$$P\{(\Theta_s,\ldots,\Theta_t)\in A\} = \lim_{u\to\infty} P\Big\{\Big(\frac{X_s}{\|X_0\|},\ldots,\frac{X_t}{\|X_0\|}\Big)\in A \,\Big|\, \|X_0\| > u\Big\}$$
for suitable sets $A$. For simplicity, consider just one fixed lag, i.e.
$$P\{\Theta_h\in A\} = \lim_{u\to\infty} P\Big\{\frac{X_h}{\|X_0\|}\in A \,\Big|\, \|X_0\| > u\Big\}.$$

Empirical counterpart of the probability on the r.h.s.:
$$\hat p_{n,A} := \frac{\sum_{t=1}^{n} 1\{X_{t+h}/\|X_t\|\in A,\ \|X_t\| > u_n\}}{\sum_{t=1}^{n} 1\{\|X_t\| > u_n\}}$$


Example: empirical distribution of the spectral process (cont.)

Estimator of $P\{\Theta_h\in A\}$:
$$\hat p_{n,A} := \frac{\sum_{t=1}^{n} 1\{X_{t+h}/\|X_t\|\in A,\ \|X_t\| > u_n\}}{\sum_{t=1}^{n} 1\{\|X_t\| > u_n\}}$$

Choose
$$X_{n,t} := \frac{(X_t, X_{t+h})}{u_n}\,1\{\|X_t\| > u_n\}, \qquad f_A\big((y_{1,1},y_{1,2}),\ldots,(y_{r,1},y_{r,2})\big) := \sum_{i=1}^{r} 1_A\Big(\frac{y_{i,2}}{\|y_{i,1}\|}\Big), \quad A \in \mathcal{B}^d,$$
so that, with $v_n := P\{\|X_0\| > u_n\}$,
$$Z_n(f_A) = \frac{1}{\sqrt{nv_n}}\sum_{t=1}^{m_n r_n}\Big(1\{X_{t+h}/\|X_t\|\in A,\ \|X_t\| > u_n\} - P\{X_{t+h}/\|X_t\|\in A,\ \|X_t\| > u_n\}\Big).$$

The family $\mathcal{A}$ of sets $A$ must not be too complex, to ensure uniform convergence! A Gaussian limit for $Z_n$ immediately gives asymptotic normality of $\hat p_{n,A}$.
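A sketch of $\hat p_{n,A}$ for a univariate positive time series with $A = [a, b)$ an interval (the norm is then the value itself; the max-moving-average example is illustrative):

```python
import numpy as np

def spectral_prob_hat(x, h, u_n, a, b):
    """p_hat_{n,A}: fraction of exceedance times t (X_t > u_n) with X_{t+h}/X_t in [a, b)."""
    x = np.asarray(x, dtype=float)
    lead, lag = x[h:], x[: len(x) - h]
    ratio = lead[lag > u_n] / lag[lag > u_n]
    return np.mean((ratio >= a) & (ratio < b))

rng = np.random.default_rng(4)
z = -1.0 / np.log(rng.uniform(size=5001))
x = np.maximum(z[1:], 0.5 * z[:-1])               # simple positive series with extremal dependence
print(spectral_prob_hat(x, h=1, u_n=np.quantile(x, 0.98), a=0.0, b=0.6))
```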


Example: order statistics in a cluster of exceedances

So far, only extremal dependence over a fixed range has been considered. Empirical processes of cluster functionals allow for the statistical analysis of more general dependence parameters.

For a univariate stationary time series $(X_t)_{t\in\mathbb{Z}}$ with cdf $F \in D(G_\gamma)$ let
$$X_{n,t} := \frac{X_t - b_n}{a_n}\,1\{X_t > b_n\} \qquad \text{and} \qquad f_{x_1,\ldots,x_k}(y_1,\ldots,y_r) = 1\{y_{r:r} > x_1,\ y_{r-1:r} > x_2,\ \ldots,\ y_{r-k+1:r} > x_k\}.$$
Then $\big(Z_n(f_{x_1,\ldots,x_k})\big)_{(x_1,\ldots,x_k)\in[0,\infty)^k}$ is the tail empirical process of the $k$ largest order statistics in a cluster of exceedances over $b_n$.
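A sketch of this cluster functional and of $Z_n(f)$ over blocks; since $Ef(Y_{n,1})$ is not observable from a single sample, it has to be supplied (e.g. from theory or a long pilot simulation) and is an explicit argument here:

```python
import numpy as np

def f_order_stats(block, thresholds):
    """1{ y_{r:r} > x_1, y_{r-1:r} > x_2, ..., y_{r-k+1:r} > x_k } for one block."""
    k = len(thresholds)
    top = np.sort(block)[::-1][:k]                # k largest values in the block
    return float(np.all(top > np.asarray(thresholds)))

def z_n(blocks_y, v_n, mean_f, f, *args):
    """Z_n(f) = (n v_n)^{-1/2} * sum_j ( f(Y_{n,j}) - E f(Y_{n,1}) )."""
    m_n, r_n = blocks_y.shape
    vals = np.array([f(y, *args) for y in blocks_y])
    return (vals - mean_f).sum() / np.sqrt(m_n * r_n * v_n)
```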


Convergence of the empirical process of cluster functionals

As usual, process convergence is proved in two steps:
- prove convergence of all finite-dimensional marginal distributions (fidis)
- prove either asymptotic tightness or asymptotic equicontinuity

In the following, sufficient conditions will be given for both steps to go through. Due to the large generality, some of the conditions are rather abstract. We will therefore also discuss important special cases where more concrete conditions suffice.

Throughout we will use the 'big blocks, small blocks' approach, i.e. we consider $m_n = \lfloor n/r_n\rfloor$ big blocks of length $r_n = o(n)$ and cut out small blocks of length $\ell_n = o(r_n)$ at the end of each large block.

Moreover, we assume a kind of β-mixing: let
$$\beta_{n,k} := \sup_{1\le l\le n-k-1} E\Big(\sup_{B\in\mathcal{B}_{n,l+k+1}^{n}} |P(B\mid\mathcal{B}_{n,1}^{l}) - P(B)|\Big)$$
with $\mathcal{B}_{n,i}^{j} = \sigma\big((X_{n,t})_{i\le t\le j}\big)$, and assume
(A1) $\dfrac{n}{r_n}\beta_{n,\ell_n} \to 0, \quad r_n v_n \to 0, \quad nv_n \to \infty$.


Convergence of fidis

Suppose
(A1) $\dfrac{n}{r_n}\beta_{n,\ell_n} \to 0$, $r_n v_n \to 0$, $nv_n \to \infty$
(A2) $\dfrac{1}{r_n v_n}\,\mathrm{Cov}\big(f(Y_{n,1}), g(Y_{n,1})\big) \to c(f,g)$ for all $f, g \in \mathcal{F}$
(A3) a Lindeberg condition (a stronger version is used for equicontinuity)
(A4) omitting the small block at the end of each large block, i.e. replacing $Y_{n,i} = (X_{n,t})_{(i-1)r_n < t\le i r_n}$ by the block without its last $\ell_n$ observations, is asymptotically negligible.
Then the fidis of $(Z_n(f))_{f\in\mathcal{F}}$ converge to those of a centered Gaussian process with covariance function $c$.

Asymptotic equicontinuity

Suppose, in addition to (A1),
(B1) the envelope function $F(y) := \sup_{f\in\mathcal{F}}|f(y)|$ is finite
(B2) Lindeberg condition: $E^*\big(F^2(Y_{n,1})\,1_{(\eta(nv_n)^{1/2},\infty)}(F(Y_{n,1}))\big) = o(r_n v_n)$ for all $\eta > 0$
(B3) as. continuity: $\lim_{\varepsilon\downarrow 0}\limsup_{n\to\infty}\ \sup_{f,g\in\mathcal{F},\,\rho(f,g)\le\varepsilon}\dfrac{1}{r_n v_n}E\big(f(Y_{n,1}) - g(Y_{n,1})\big)^2 = 0$
(B4) random entropy condition:
$$\lim_{\varepsilon\downarrow 0}\limsup_{n\to\infty} P^*\Big\{\int_0^{\varepsilon}\sqrt{\log N(u,\mathcal{F},d_n)}\,du > \eta\Big\} = 0 \quad \forall\,\eta > 0$$
with $N(u,\mathcal{F},d_n)$ = number of balls of radius $u$ w.r.t.
$$d_n^2(f,g) := \frac{1}{nv_n}\sum_{j=1}^{m_n}\big(f(Y_{n,j}^*) - g(Y_{n,j}^*)\big)^2, \qquad Y_{n,j}^* \text{ i.i.d. copies of } Y_{n,1},$$
needed to cover $\mathcal{F}$.
Then, under additional measurability conditions, $Z_n$ is asymptotically equicontinuous, and it converges if the fidis converge.


Asymptotic Tightness

One can give an alternative set of conditions using bracketing entropy: instead of $N(u,\mathcal{F},d_n)$ we use the bracketing number $N_{[\,]}(u,\mathcal{F},L_n^2)$ = minimum number $K$ s.t. there exists a partition $(\mathcal{F}_{u,k})_{1\le k\le K}$ of $\mathcal{F}$ with
$$E^*\Big(\sup_{f,g\in\mathcal{F}_{u,k}}\big(f(Y_{n,1}) - g(Y_{n,1})\big)^2\Big) \le u^2 r_n v_n \quad \forall\,k.$$

In addition to the β-mixing condition (A1), finiteness of the envelope $F$ (i.e. (B1)) and asymptotic continuity (B3), assume
(C1) "Lindeberg": $E^*\big(F(Y_{n,1})\,1_{(\eta(nv_n)^{1/2},\infty)}(F(Y_{n,1}))\big) = o\Big(\dfrac{\sqrt{nv_n}}{m_n}\Big) \quad \forall\,\eta > 0$
(C2) bracketing entropy condition:
$$\lim_{\varepsilon\downarrow 0}\limsup_{n\to\infty}\int_0^{\varepsilon}\sqrt{\log N_{[\,]}(u,\mathcal{F},L_n^2)}\,du = 0.$$
Then $Z_n$ is asymptotically tight and it converges if the fidis converge.


Special case: generalized tail array sums

Rootzén, Leadbetter and de Haan (1990) defined tail array sums $\sum_{t=1}^{n}\phi(X_{n,t})$ with $X_{n,t} = \big(\frac{X_t-b_n}{a_n}\big)_+$ and $\phi(0) = 0$.

If $X_{n,t}$ is instead constructed from several consecutive observations, then such sums convey information about the extremal dependence. Hence, for $\phi: (E,\mathcal{B}(E)) \to (\mathbb{R},\mathcal{B})$ s.t. $\phi(0) = 0$, define the generalized (standardized) tail array sum
$$\tilde Z_n(\phi) := \frac{1}{\sqrt{nv_n}}\sum_{t=1}^{n}\big(\phi(X_{n,t}) - E\phi(X_{n,t})\big).$$

If $n = m_n r_n$, then $\tilde Z_n(\phi) = Z_n(g_\phi)$ with $g_\phi(y_1,\ldots,y_r) := \sum_{j=1}^{r}\phi(y_j)$. In general, under weak conditions, the difference is asymptotically negligible.

For processes of generalized tail array sums $(\tilde Z_n(\phi))_{\phi\in\Phi}$ one obtains simpler sufficient conditions for convergence.
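A small sketch of the induced cluster functional $g_\phi$ and of the exact identity $\sum_t \phi(X_{n,t}) = \sum_j g_\phi(Y_{n,j})$ when $n = m_n r_n$ (the choice of $\phi$ and the simulated $X_{n,t}$ are illustrative):

```python
import numpy as np

def g_phi(block, phi):
    """g_phi(y_1, ..., y_r) = sum_j phi(y_j)."""
    return sum(phi(y) for y in block)

rng = np.random.default_rng(5)
x_n = np.where(rng.uniform(size=1200) > 0.95, rng.pareto(3.0, size=1200), 0.0)
phi = lambda y: min(y, 2.0) if y > 0 else 0.0      # some bounded phi with phi(0) = 0
y_blocks = x_n.reshape(20, 60)                     # m_n = 20 blocks of length r_n = 60
print(sum(phi(v) for v in x_n), sum(g_phi(b, phi) for b in y_blocks))   # equal
```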


Convergence of generalized tail array sums

Corollary. Suppose that $\phi_{\max}(x) := \sup_{\phi\in\Phi}|\phi(x)|$ is bounded and measurable, the mixing condition (A1) holds, $r_n = o(\sqrt{nv_n})$, and that the following finite cluster mean condition holds: for all $k\in\mathbb N$ there exists $s_n(k) \ge P(X_{n,k}\ne 0 \mid X_{n,0}\ne 0)$ such that $s_\infty(k) = \lim_{n\to\infty} s_n(k)$ exists and $\sum_{k=1}^{r_n} s_n(k) \to \sum_{k=1}^{\infty} s_\infty(k) < \infty$. Then $\sup_{\phi\in\Phi}|\tilde Z_n(\phi) - Z_n(g_\phi)| \to 0$ in outer probability.

If, in addition, the convergence of covariances (A2) holds and either the asymptotic continuity condition (B3), the random entropy condition (B4) and the measurability conditions hold, or the bracketing entropy condition (C2) holds with a partition independent of $n$, then both processes $(\tilde Z_n(\phi))_{\phi\in\Phi}$ and $(Z_n(g_\phi))_{\phi\in\Phi}$ converge to the same centered Gaussian limit process with covariance function $c$.


Estimating the distribution of a spectral process

Recall the definition of the spectral process $(\Theta_h)_{h\in\mathbb Z}$:
$$\mathcal L\Big(\frac{\|X_0\|}{u},\frac{X_s}{\|X_0\|},\ldots,\frac{X_t}{\|X_0\|}\ \Big|\ \|X_0\|>u\Big) \to \mathcal L\big(\|Y_0\|,\Theta_s,\ldots,\Theta_t\big)\quad\text{for all } s,t\in\mathbb Z,\ s\le t.$$
For simplicity, we only consider univariate positive time series.

Goal: estimation of the survival function $\bar F_h(x) := P\{\Theta_h > x\}$, $x>0$.

Direct or forward estimator:
$$\hat{\bar F}^{(f)}_{n,h}(x) := \frac{\sum_{t=1}^n \mathbf 1\{X_{t+h}/X_t > x,\ X_t > u_n\}}{\sum_{t=1}^n \mathbf 1\{X_t > u_n\}} = \frac{\sum_{t=1}^n \phi^{(f)}_{h,x}(X_{n,t})}{\sum_{t=1}^n \phi^{(0)}(X_{n,t})}$$
with
$$X_{n,t} := \frac{(X_{t-h},X_t,X_{t+h})}{u_n}\,\mathbf 1\{X_t > u_n\},\qquad \phi^{(f)}_{h,x}(x_{-h},x_0,x_h) := \mathbf 1\{x_h/x_0 > x,\ x_0 > 1\},\qquad \phi^{(0)}(x_{-h},x_0,x_h) := \mathbf 1\{x_0 > 1\}.$$


Backward estimator of the distribution of a spectral process

Recall the time change formula:
$$E\big[f(\Theta_{s-i},\ldots,\Theta_{t-i})\big] = E\Big[\|\Theta_i\|^\alpha\, f\Big(\frac{(\Theta_s,\ldots,\Theta_t)}{\|\Theta_i\|}\Big)\Big]$$
provided $f(\theta_s,\ldots,\theta_0,\ldots,\theta_t) = 0$ whenever $\theta_0 = 0$.

Choose $s=i=-h$, $t=0$, $f(\theta_{-h},\ldots,\theta_0) = \mathbf 1\{\theta_0/\theta_{-h}>x,\ \theta_{-h}>0\}$ to obtain
$$\bar F_h(x) = E\big[\mathbf 1\{\Theta_h/\Theta_0 > x,\ \Theta_0>0\}\big] = E\big[\Theta_{-h}^\alpha\,\mathbf 1\{\Theta_0/\Theta_{-h}>x,\ \Theta_{-h}>0\}\big] = \lim_{u\to\infty} E\big[(X_{-h}/X_0)^\alpha\,\mathbf 1\{X_0/X_{-h}>x\}\ \big|\ X_0>u\big].$$

The empirical counterpart of the right-hand side yields the backward estimator
$$\hat{\bar F}^{(b)}_{n,h}(x) := \frac{\sum_{t=1}^n (X_{t-h}/X_t)^\alpha\,\mathbf 1\{X_t/X_{t-h}>x,\ X_t>u_n\}}{\sum_{t=1}^n \mathbf 1\{X_t>u_n\}},$$
where, for the time being, we assume $\alpha$ known.
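As a concrete illustration, a minimal numpy sketch (not the authors' code) of the two estimators for positive lags h; the function names and the simple handling of the boundary indices are choices of this example, and α is treated as known here.

```python
import numpy as np

def forward_estimator(x, u, h, xx):
    """Forward estimator of P(Theta_h > xx) for a positive series x and threshold u."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x) - h)                       # indices for which X_{t+h} exists
    num = np.sum((x[t + h] / x[t] > xx) & (x[t] > u))
    return num / np.sum(x > u)

def backward_estimator(x, u, h, xx, alpha):
    """Backward estimator based on the time change formula, with alpha assumed known."""
    x = np.asarray(x, dtype=float)
    t = np.arange(h, len(x))                        # indices for which X_{t-h} exists
    ind = (x[t] / x[t - h] > xx) & (x[t] > u)
    num = np.sum((x[t - h] / x[t]) ** alpha * ind)
    return num / np.sum(x > u)

# toy usage on an iid Pareto sample (positive, regularly varying with alpha = 3)
rng = np.random.default_rng(1)
x = rng.pareto(3.0, size=10000) + 1.0
u = np.quantile(x, 0.95)
print(forward_estimator(x, u, h=1, xx=1.0),
      backward_estimator(x, u, h=1, xx=1.0, alpha=3.0))
```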


Comparison of estimators

$$\hat{\bar F}^{(f)}_{n,h}(x) := \frac{\sum_{t=1}^n \mathbf 1\{X_{t+h}/X_t>x,\ X_t>u_n\}}{\sum_{t=1}^n \mathbf 1\{X_t>u_n\}},\qquad \hat{\bar F}^{(b)}_{n,h}(x) := \frac{\sum_{t=1}^n (X_{t-h}/X_t)^\alpha\,\mathbf 1\{X_t/X_{t-h}>x,\ X_t>u_n\}}{\sum_{t=1}^n \mathbf 1\{X_t>u_n\}}$$

Which estimator performs better? Heuristics: For large $x$ there will be few observations $X_t > u_n$ such that $X_{t+h} > xX_t > xu_n$; hence the direct estimator $\hat{\bar F}^{(f)}_{n,h}(x)$ will have a large variance. In contrast, there will be more observations $X_t > u_n$ fulfilling the "weaker" condition $X_t > xX_{t-h}$, and the factor $(X_{t-h}/X_t)^\alpha$ is then small, resulting in a smaller variance of the backward estimator $\hat{\bar F}^{(b)}_{n,h}(x)$. Conversely, for small $x$ one expects $\hat{\bar F}^{(f)}_{n,h}(x)$ to perform better.


Asymptotic behavior

Here we only discuss the backward estimator; the forward estimator is simpler to analyze. Because of
$$\hat{\bar F}^{(b)}_{n,h}(x) = \frac{\sum_{t=1}^n (X_{t-h}/X_t)^\alpha\,\mathbf 1\{X_t/X_{t-h}>x,\ X_t>u_n\}}{\sum_{t=1}^n \mathbf 1\{X_t>u_n\}} = \frac{\sum_{t=1}^n \phi^{(b)}_{h,x}(X_{n,t})}{\sum_{t=1}^n \phi^{(0)}(X_{n,t})}$$
with
$$X_{n,t} := \frac{(X_{t-h},X_t,X_{t+h})}{u_n}\,\mathbf 1\{X_t>u_n\},\qquad \phi^{(b)}_{h,x}(x_{-h},x_0,x_h) := \Big(\frac{x_{-h}}{x_0}\Big)^\alpha\,\mathbf 1\{x_0/x_{-h}>x,\ x_0>1\},\qquad \phi^{(0)}(x_{-h},x_0,x_h) := \mathbf 1\{x_0>1\},$$
joint asymptotic normality of the backward estimator for all $x\ge x_0$ and $|h|\in\{1,\ldots,h_0\}$ follows from a limit theorem for the generalized tail array sums
$$\big(\tilde Z_n(\phi^{(0)}),\ \tilde Z_n(\phi^{(b)}_{h,x})\big)_{x\ge x_0,\ |h|\in\{1,\ldots,h_0\}}.$$


Asymptotic behavior (cont.)

Theorem. Suppose that $\frac{n}{r_n}\beta_{n,\ell_n} \to 0$, $r_n = o((nv_n)^{1/2})$, $r_n v_n \to 0$, the finite cluster mean condition holds, and $F_h$ is continuous on $[x_0,\infty)$. Then
$$(nv_n)^{1/2}\begin{pmatrix} \hat{\bar F}^{(f)}_{n,h}(x) - P(X_h/X_0 > x \mid X_0 > u_n)\\[2pt] \hat{\bar F}^{(b)}_{n,h}(x) - E\big[(X_{-h}/X_0)^\alpha\,\mathbf 1\{X_0/X_{-h}>x\}\mid X_0 > u_n\big] \end{pmatrix}_{x\ge x_0,\ |h|\in\{1,\ldots,h_0\}}$$
converges weakly to a centered Gaussian process with covariance function determined by $\alpha$ and the distribution of $(\Theta_t)_{t\in\mathbb Z}$.

If $u_n$ is sufficiently high (or, equivalently, $v_n$ sufficiently small) one may center at $\bar F_h(x)$ in both cases.


If α is unknown

Usually the index of regular variation, $\alpha$, is unknown and has to be replaced with a suitable estimator in the definition of the backward estimator, e.g. the Hill-type estimator
$$\hat\alpha_n := \frac{\sum_{t=1}^n \mathbf 1\{X_t > u_n\}}{\sum_{t=1}^n \log(X_t/u_n)\,\mathbf 1\{X_t > u_n\}}.$$

Under slightly stronger conditions than those used above, one can still prove asymptotic normality, but the covariance function of the limiting Gaussian process is different (it still depends only on $(\Theta_t)_{t\in\mathbb Z}$ and $\alpha$).

Since this covariance function is usually very complex, the above results cannot be used directly to construct confidence regions. Instead we suggest using a multiplier block bootstrap approach.
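A short numpy sketch of this Hill-type estimator (illustrative only; the function name and the convention of passing the threshold explicitly are assumptions of this example):

```python
import numpy as np

def hill_type_alpha(x, u):
    """Hill-type estimator of the index of regular variation alpha:
    (# exceedances of u) / (sum of log(X_t/u) over the exceedances)."""
    x = np.asarray(x, dtype=float)
    exceed = x[x > u]
    return len(exceed) / np.sum(np.log(exceed / u))

# toy usage: iid Pareto sample with tail index 3, so hat{alpha}_n should be close to 3
rng = np.random.default_rng(2)
x = rng.pareto(3.0, size=20000) + 1.0
print(hill_type_alpha(x, np.quantile(x, 0.95)))
```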


Multiplier block bootstrap

To capture the serial dependence, whole blocks (not single observations) are 'resampled'. Alternatively, one may multiply each block with a random factor independent of the data. This leads to the multiplier block bootstrap version of the empirical cluster process:
$$Z_{n,\xi}(f) := \frac{1}{\sqrt{nv_n}}\sum_{j=1}^{m_n}\xi_j\big(f(Y_{n,j}) - Ef(Y_{n,j})\big),\qquad f\in\mathcal F,$$
where the $\xi_j$ are iid random variables with $E(\xi_j)=0$, $Var(\xi_j)=1$, independent of $(X_{n,t})_{1\le t\le n}$. (One may also replace $Ef(Y_{n,j})$ with the average of $f(Y_{n,j})$, $1\le j\le m_n$.)

Under essentially the same conditions as before (with a slight strengthening if the $\xi_j$ are not bounded), $Z_{n,\xi}$ converges to the same limit as $Z_n$. More importantly, conditionally on the data, $Z_{n,\xi}$ has the same asymptotic behavior (in probability) as $Z_n$ (unconditionally).
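A small illustrative sketch of generating multiplier bootstrap replicates of one process value $Z_{n,\xi}(f)$; the choice of cluster functional (number of exceedances in a block), standard normal multipliers, and replacing $Ef(Y_{n,j})$ by the block average (as the slide allows) are assumptions of this example.

```python
import numpy as np

def multiplier_replicate(block_values, n, v_n, rng):
    """One multiplier block bootstrap replicate of Z_{n,xi}(f), with E f(Y_{n,j})
    replaced by the average of the block values."""
    block_values = np.asarray(block_values, dtype=float)
    xi = rng.standard_normal(len(block_values))        # iid multipliers, mean 0, variance 1
    centered = block_values - block_values.mean()
    return np.sum(xi * centered) / np.sqrt(n * v_n)

# toy usage: f(Y_{n,j}) = number of exceedances of u in the j-th block
rng = np.random.default_rng(3)
x = rng.pareto(3.0, size=6000) + 1.0
r_n = 60                                               # block length
u = np.quantile(x, 0.95)
blocks = x.reshape(-1, r_n)                            # m_n = 100 disjoint blocks
block_values = (blocks > u).sum(axis=1)
v_n = np.mean(x > u)
replicates = [multiplier_replicate(block_values, len(x), v_n, rng) for _ in range(999)]
print(np.std(replicates))                              # bootstrap spread for this functional
```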


Multiplier block bootstrap (cont.)

Conditionally on the data, $Z_{n,\xi}$ has the same asymptotic behavior (in probability) as $Z_n$ (unconditionally). To make this statement precise, one has to measure the distance between the conditional distribution of $Z_{n,\xi}$ and the unconditional distribution of $Z_n$. This is usually done via the so-called bounded Lipschitz metric.

Denote by $E_\xi$ the expectation w.r.t. $(\xi_j)_{j\in\mathbb N}$ (i.e. the conditional expectation given the data) and let
$$BL_1(\ell^\infty(\mathcal F)) := \Big\{ g:\ell^\infty(\mathcal F)\to\mathbb R \ \Big|\ \|g\|_\infty := \sup_{z\in\ell^\infty(\mathcal F)}|g(z)| \le 1,\ |g(z_1)-g(z_2)| \le \sup_{f\in\mathcal F}|z_1(f)-z_2(f)| \text{ for all } z_1,z_2\in\ell^\infty(\mathcal F)\Big\}.$$

Then, under the above conditions (if the $\xi_j$ are bounded),
$$\sup_{g\in BL_1(\ell^\infty(\mathcal F))}\big|E_\xi\big(g(Z_{n,\xi})\big) - E\big(g(Z_n)\big)\big| \to 0\quad\text{in probability.}$$


Multiplier block bootstrap version of spectral estimators

Divide the observations into $m_n$ blocks of length $r_n$; denote by $I_j := \{(j-1)r_n+1,\ldots,jr_n\}$ the set of indices belonging to the $j$th block. For simplicity assume $n = m_n r_n$.

Backward estimator:
$$\hat{\hat{\bar F}}^{(b)}_{n,h}(x) := \frac{\sum_{j=1}^{m_n}\sum_{t\in I_j}(X_{t-h}/X_t)^{\hat\alpha_n}\,\mathbf 1\{X_t/X_{t-h}>x,\ X_t>u_n\}}{\sum_{j=1}^{m_n}\sum_{t\in I_j}\mathbf 1\{X_t>u_n\}},$$
where
$$\hat\alpha_n := \frac{\sum_{j=1}^{m_n}\sum_{t\in I_j}\mathbf 1\{X_t>u_n\}}{\sum_{j=1}^{m_n}\sum_{t\in I_j}\log(X_t/u_n)\,\mathbf 1\{X_t>u_n\}}$$
denotes the Hill-type estimator.


Bootstrap analog to the backward estimator:
$$\hat{\hat{\bar F}}^{*(b)}_{n,h}(x) := \frac{\sum_{j=1}^{m_n}(1+\xi_j)\sum_{t\in I_j}(X_{t-h}/X_t)^{\hat\alpha^*_n}\,\mathbf 1\{X_t/X_{t-h}>x,\ X_t>u_n\}}{\sum_{j=1}^{m_n}(1+\xi_j)\sum_{t\in I_j}\mathbf 1\{X_t>u_n\}},$$
where
$$\hat\alpha^*_n := \frac{\sum_{j=1}^{m_n}(1+\xi_j)\sum_{t\in I_j}\mathbf 1\{X_t>u_n\}}{\sum_{j=1}^{m_n}(1+\xi_j)\sum_{t\in I_j}\log(X_t/u_n)\,\mathbf 1\{X_t>u_n\}}$$
denotes a bootstrap version of the Hill-type estimator, and the $\xi_j$ are iid random variables, independent of the data, with $E(\xi_j)=0$ and $Var(\xi_j)=1$. A bootstrap version of the forward estimator is defined analogously.
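To illustrate the construction, a sketch (not the authors' code) of the blockwise backward estimator together with one multiplier-bootstrap replicate of it; the Gaussian choice of the ξ_j, the discarding of a possible incomplete last block, and the function names are assumptions of this example.

```python
import numpy as np

def backward_blockwise(x, u, h, xx, r_n, weights=None):
    """Blockwise backward estimator; with weights = 1 + xi this becomes its
    multiplier block bootstrap analog (alpha estimated by the Hill-type estimator)."""
    x = np.asarray(x, dtype=float)
    m_n = len(x) // r_n
    x = x[:m_n * r_n]                                   # drop an incomplete last block
    if weights is None:
        weights = np.ones(m_n)
    w = np.repeat(weights, r_n)                         # per-observation block weights

    exc = (x > u).astype(float)
    logs = np.where(x > u, np.log(np.maximum(x, u) / u), 0.0)
    alpha = np.sum(w * exc) / np.sum(w * logs)          # (bootstrap) Hill-type estimator

    t = np.arange(h, len(x))
    ind = (x[t] / x[t - h] > xx) & (x[t] > u)
    num = np.sum(w[t] * (x[t - h] / x[t]) ** alpha * ind)
    return num / np.sum(w * exc)

# toy usage on a moving-maximum series X_t = max(Z_t, Z_{t-1})
rng = np.random.default_rng(4)
z = rng.pareto(3.0, size=5001) + 1.0
x = np.maximum(z[1:], z[:-1])
u = np.quantile(x, 0.95)
est = backward_blockwise(x, u, h=1, xx=1.0, r_n=50)
xi = rng.standard_normal(len(x) // 50)                  # multipliers for one replicate
boot = backward_blockwise(x, u, h=1, xx=1.0, r_n=50, weights=1.0 + xi)
print(est, boot)
```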


Consistency of multiplier block bootstrap

Theorem. Under the same conditions as used for asymptotic normality of the estimators with $\bar F_h(x)$ as centering,
$$\sup_{r,s\in\mathbb R^{2h_0}}\Big| P_\xi\Big((nv_n)^{1/2}\big(\hat{\bar F}^{*(f)}_{n,h}(x)-\hat{\bar F}^{(f)}_{n,h}(x)\big)\le r_h,\ (nv_n)^{1/2}\big(\hat{\hat{\bar F}}^{*(b)}_{n,h}(x)-\hat{\hat{\bar F}}^{(b)}_{n,h}(x)\big)\le s_h\ \ \forall\,|h|\in\{1,\ldots,h_0\}\Big)$$
$$\qquad\qquad - P\Big((nv_n)^{1/2}\big(\hat{\bar F}^{(f)}_{n,h}(x)-\bar F_h(x)\big)\le r_h,\ (nv_n)^{1/2}\big(\hat{\hat{\bar F}}^{(b)}_{n,h}(x)-\bar F_h(x)\big)\le s_h\ \ \forall\,|h|\in\{1,\ldots,h_0\}\Big)\Big| \to 0$$
in probability, where $P_\xi$ denotes the probability w.r.t. $(\xi_j)_{j\in\mathbb N}$, i.e. conditionally on the data.

From this result, confidence intervals for $\bar F_h(x)$ can easily be constructed.
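A sketch of how such a confidence interval could be obtained in practice from bootstrap replicates; it assumes an estimator with the interface of the hypothetical backward_blockwise function sketched above, and the percentile construction for the centered differences as well as the number of replicates are choices made for this illustration.

```python
import numpy as np

def bootstrap_ci(x, u, h, xx, r_n, estimator, level=0.95, n_boot=999, seed=0):
    """Bootstrap CI for bar{F}_h(xx): the distribution of estimator* - estimator
    approximates the distribution of estimator - bar{F}_h(xx)."""
    rng = np.random.default_rng(seed)
    point = estimator(x, u, h, xx, r_n)
    m_n = len(x) // r_n
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        xi = rng.standard_normal(m_n)                    # multipliers for this replicate
        diffs[b] = estimator(x, u, h, xx, r_n, weights=1.0 + xi) - point
    lo, hi = np.quantile(diffs, [(1 - level) / 2, (1 + level) / 2])
    # invert: point - (upper quantile of diffs) <= bar{F}_h(xx) <= point - (lower quantile)
    return point - hi, point - lo
```

With the toy series from the previous sketch, `bootstrap_ci(x, u, 1, 1.0, 50, backward_blockwise)` would return an approximate 95% interval.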


Simulation example

GARCH(1,1) time series
$$X_t = \sigma_t\varepsilon_t,\qquad \sigma_t^2 = 0.0001 + 0.14\,X_{t-1}^2 + 0.84\,\sigma_{t-1}^2$$
with iid $t_4$-distributed innovations $\varepsilon_t$ and sample size $n = 2000$. Then the tail process exists with $\alpha = 2.6$.

We compare the estimation errors of the forward and the backward estimator based on the largest 5% of the absolute observations, considered as estimators of the pre-asymptotic cdf
$$P\big(X_h/X_0 \le x \,\big|\, X_0 > F_{|X_0|}^{-1}(0.95)\big)$$
corresponding to the threshold used for estimation.
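A sketch of how such a GARCH(1,1) sample with t_4 innovations could be simulated; the burn-in length, the starting value, and the rescaling of the t innovations to unit variance are assumptions of this illustration, not taken from the slides.

```python
import numpy as np

def simulate_garch11(n, a0=0.0001, a1=0.14, b1=0.84, df=4, burn_in=1000, seed=0):
    """Simulate X_t = sigma_t * eps_t with sigma_t^2 = a0 + a1*X_{t-1}^2 + b1*sigma_{t-1}^2
    and iid t_df innovations (rescaled to unit variance, an assumption of this sketch)."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_t(df, size=n + burn_in) * np.sqrt((df - 2) / df)
    x = np.empty(n + burn_in)
    sigma2 = a0 / (1.0 - a1 - b1)          # start from the stationary variance level
    for t in range(n + burn_in):
        x[t] = np.sqrt(sigma2) * eps[t]
        sigma2 = a0 + a1 * x[t] ** 2 + b1 * sigma2
    return x[burn_in:]                     # discard the burn-in period

x = simulate_garch11(2000)
u = np.quantile(np.abs(x), 0.95)           # threshold = 95% quantile of |X_t|
print(u, np.mean(np.abs(x) > u))
```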


Performance of forward and backward estimators

[Figure: Bias (left) and RMSE (right) of the forward estimator (blue) and the backward estimator (red) for lag h = 1 (solid) and h = 10 (dashed).]


[Figure: Ratio (RMSE of forward estimator)/(RMSE of backward estimator) for lag h = 1 (solid) and h = 10 (dashed).]
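A sketch of how such a bias/RMSE comparison could be reproduced in a small Monte Carlo experiment; it plugs together the hypothetical simulate_garch11, forward_estimator and backward_estimator functions from the sketches above (assumed to be in scope), and the number of replications as well as the long simulated path used to approximate the pre-asymptotic target are choices of this illustration.

```python
import numpy as np

def mc_compare(n_rep=200, n=2000, h=1, xx=1.0, alpha=2.6, seed=10):
    """Monte Carlo bias and RMSE of the forward and backward estimators of the
    pre-asymptotic probability P(|X_h|/|X_0| > xx given |X_0| > u) in the GARCH(1,1) model."""
    # approximate the pre-asymptotic target from one very long simulated path
    big = np.abs(simulate_garch11(500_000, seed=seed))
    u_big = np.quantile(big, 0.95)
    t = np.arange(len(big) - h)
    target = np.mean((big[t + h] / big[t] > xx) & (big[t] > u_big)) / np.mean(big > u_big)

    fwd, bwd = np.empty(n_rep), np.empty(n_rep)
    for i in range(n_rep):
        x = np.abs(simulate_garch11(n, seed=seed + 1 + i))
        u = np.quantile(x, 0.95)
        fwd[i] = forward_estimator(x, u, h, xx)
        bwd[i] = backward_estimator(x, u, h, xx, alpha)
    for name, est in (("forward", fwd), ("backward", bwd)):
        print(name, "bias:", est.mean() - target,
              "RMSE:", np.sqrt(np.mean((est - target) ** 2)))
```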


Performance of bootstrap confidence intervals

Comparison of the true coverage probabilities of confidence intervals for $P(|X_h/X_0| > 1 \mid |X_0| > u)$, $-20 \le h \le 20$, with nominal size 95%, constructed using the multiplier block bootstrap and the stationary bootstrap suggested by Davis, Mikosch and Cribben (2012) in a related context.

[Figure: Coverage probabilities using the multiplier block bootstrap (red) and the stationary bootstrap (blue), based on the largest 5% of the data, for the forward estimator (left) and the backward estimator (right).]


[Figure: Coverage probabilities using the multiplier block bootstrap (red) and the stationary bootstrap (blue), based on the largest 2% of the data, for the forward estimator (left) and the backward estimator (right); dashed lines indicate rescaled bootstrap confidence intervals based on the largest 5% of the data.]


Data example: S&P 500

Data: daily log-returns of the S&P 500, observed 1995–2004.

[Figure: $P(|\Theta_t| > 1 \mid \Theta_0 = \pm 1)$ estimated by the backward approach; 80% confidence intervals constructed with the multiplier block bootstrap shown as a shaded grey region; dashed lines indicate the value in case of independence.]


Fitted models: $X_t = \sigma_t\varepsilon_t$ with
GARCH(1,1): $\sigma_t^2 = a_0 + a_1 X_{t-1}^2 + b_1\sigma_{t-1}^2$
APARCH(1,1): $\sigma_t^\delta = a_0 + a_1\big(|X_{t-1}| - \gamma_1 X_{t-1}\big)^\delta + b_1\sigma_{t-1}^\delta$

[Figure: $P(|\Theta_t| > 1 \mid \Theta_0 = \pm 1)$ estimated by the backward approach, together with approximate probabilities for the fitted GARCH(1,1) and APARCH(1,1) models.]


Selected References

Basrak, B. and Segers, J. (2009). Regularly varying multivariate time series. Stoch. Proc. Appl. 119, 1055–1080.
Drees, H. (2000). Weighted approximations of tail processes for β-mixing random variables. Ann. Appl. Probab. 10, 1274–1301.
Drees, H. (2003). Extreme quantile estimation for dependent data with applications to finance. Bernoulli 9, 617–657.
Drees, H. (2015). Bootstrapping empirical processes of cluster functionals with application to extremograms. arXiv:1511.00420.
Drees, H. and Rootzén, H. (2010). Limit theorems for empirical processes of cluster functionals. Ann. Stat. 38, 2145–2186 (correction note, Ann. Stat., to appear).
Drees, H., Segers, J. and Warchoł, M. (2015). Statistics for tail processes of Markov chains. Extremes 18, 369–402.
Janßen, A. and Segers, J. (2014). Markov tail chains. J. Appl. Probab. 51, 1133–1153.
Rootzén, H. (2009). Weak convergence of the tail empirical function for dependent sequences. Stoch. Proc. Appl. 119, 468–490.

Small Blocks Are Negligible

Let $\bar Y_{n,1} = (X_{n,i})_{1\le i\le r_n-l_n}$ be the shortened version of the first block $Y_{n,1} = (X_{n,i})_{1\le i\le r_n}$, and $\Delta_{n,1}(f) := f(Y_{n,1}) - f(\bar Y_{n,1})$. Moreover, let $\Delta^*_{n,j}(f)$, $1\le j\le m_n$, denote iid copies of $\Delta_{n,1}(f)$.

Condition (A4):
$$\frac{1}{\sqrt{nv_n}}\sum_{j=1}^{m_n}\big(\Delta^*_{n,j}(f) - E(\Delta^*_{n,j}(f))\big) = o_P(1)\quad\text{for all } f\in\mathcal F.$$

(A4) is satisfied iff $\Delta_{n,1}(f)$ satisfies the Lindeberg condition
$$P\big(|\Delta_{n,1}(f) - E\Delta_{n,1}(f)| > \sqrt{nv_n}\big) = o(r_n/n).$$


Measurability Conditions

For all $n\in\mathbb N$, $\varepsilon>0$, $e_i\in\{-1,0,1\}$ the maps $\lfloor m_n/2\rfloor$

sup f ,g ∈F ,ρ(f ,g )