Rates of Convergence and Newton’s Method

Outline

Rates of Convergence

Newton’s Method

Rates of Convergence

We compare the performance of algorithms by their rate of convergence. That is, if $x^k \to \bar{x}$, we are interested in how fast this happens. We consider only quotient rates, or Q-rates of convergence.

Let $\{x^\nu\} \subset \mathbb{R}^n$ and $\bar{x} \in \mathbb{R}^n$ be such that $x^\nu \to \bar{x}$. We say that $x^\nu \to \bar{x}$ at a linear rate if
$$\limsup_{\nu \to \infty} \frac{\|x^{\nu+1} - \bar{x}\|}{\|x^\nu - \bar{x}\|} < 1 .$$
The convergence is superlinear if this limit superior equals $0$, and quadratic if
$$\limsup_{\nu \to \infty} \frac{\|x^{\nu+1} - \bar{x}\|}{\|x^\nu - \bar{x}\|^2} < \infty .$$
Two-step quadratic convergence means that the quadratic bound holds between $x^\nu$ and $x^{\nu+2}$.
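
To make these definitions concrete, the quotients can be estimated from the tail of a computed sequence. The sketch below is an illustration only (not from the original notes); it assumes NumPy and uses two synthetic sequences, one converging linearly and one quadratically.

```python
import numpy as np

def q_ratios(xs, x_bar, power=1):
    """Quotients ||x^{v+1} - x_bar|| / ||x^v - x_bar||**power for a sequence xs."""
    errs = np.array([np.linalg.norm(x - x_bar) for x in xs])
    return errs[1:] / errs[:-1] ** power

# Linearly convergent: the error is halved at every step.
lin = [np.array([0.5 ** v]) for v in range(10)]
# Quadratically convergent: the error is squared at every step.
quad = [np.array([10.0 ** -(2 ** v)]) for v in range(5)]

print(q_ratios(lin, np.zeros(1)))             # ratios stay near 0.5 (< 1): Q-linear
print(q_ratios(quad, np.zeros(1)))            # ratios tend to 0: Q-superlinear
print(q_ratios(quad, np.zeros(1), power=2))   # ratios stay bounded: Q-quadratic
```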

Newton’s Method

Convergence of Newton’s Method

Let $g : \mathbb{R}^n \to \mathbb{R}^n$ be differentiable, $x^0 \in \mathbb{R}^n$, and $J_0 \in \mathbb{R}^{n \times n}$. Suppose that there exist $\bar{x} \in \mathbb{R}^n$ and $\epsilon > 0$ with $\|x^0 - \bar{x}\| < \epsilon$ such that

1. $g(\bar{x}) = 0$,
2. $g'(x)^{-1}$ exists for $x \in B(\bar{x}; \epsilon) := \{x \in \mathbb{R}^n : \|x - \bar{x}\| < \epsilon\}$ with $\sup\{\|g'(x)^{-1}\| : x \in B(\bar{x}; \epsilon)\} \le M_1$,
3. $g'$ is Lipschitz continuous on $\mathrm{cl}\,B(\bar{x}; \epsilon)$ with Lipschitz constant $L$, and
4. $\theta_0 := \frac{L M_1}{2}\|x^0 - \bar{x}\| + M_0 K < 1$, where $K \ge \|(g'(x^0)^{-1} - J_0)y^0\|$, $y^0 := g(x^0)/\|g(x^0)\|$, and $M_0 = \max\{\|g'(x)\| : x \in B(\bar{x}; \epsilon)\}$.

Further suppose that the iteration $x^{k+1} := x^k - J_k g(x^k)$ is initiated at $x^0$, where the $J_k$’s are chosen to satisfy one of the following conditions:

(i) $\|(g'(x^k)^{-1} - J_k)y^k\| \le K$,
(ii) $\|(g'(x^k)^{-1} - J_k)y^k\| \le \theta_1^k K$ for some $\theta_1 \in (0, 1)$,
(iii) $\|(g'(x^k)^{-1} - J_k)y^k\| \le \min\{M_2 \|x^k - x^{k-1}\|, K\}$ for some $M_2 > 0$, or
(iv) $\|(g'(x^k)^{-1} - J_k)y^k\| \le \min\{M_3 \|g(x^k)\|, K\}$ for some $M_3 > 0$,

where for each $k = 1, 2, \dots$, $y^k := g(x^k)/\|g(x^k)\|$.
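
The accuracy measure appearing in (i)-(iv) can be evaluated directly once a candidate $J_k$ has been chosen. The sketch below is an illustration only, assuming NumPy; the two-dimensional system $g$, the point $x^k$, and the choice of $J_k$ as the inverse of a forward-difference Jacobian are all hypothetical, and the printed number is the quantity $\|(g'(x^k)^{-1} - J_k)y^k\|$ that the conditions bound.

```python
import numpy as np

def g(x):
    # Hypothetical system g(x) = 0: a circle intersected with a line.
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])

def g_prime(x):
    # Exact Jacobian of g at x.
    return np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])

def fd_jacobian(g, x, h=1e-6):
    # Forward-difference approximation of g'(x), built column by column.
    n = x.size
    J = np.empty((n, n))
    gx = g(x)
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (g(x + e) - gx) / h
    return J

xk = np.array([0.9, 0.5])
yk = g(xk) / np.linalg.norm(g(xk))        # y^k := g(x^k)/||g(x^k)||
Jk = np.linalg.inv(fd_jacobian(g, xk))    # candidate J_k approximating g'(x^k)^{-1}
print(np.linalg.norm((np.linalg.inv(g_prime(xk)) - Jk) @ yk))
```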

These hypotheses on the accuracy of the approximations $J_k$ yield the following conclusions about the rate of convergence of the iterates $x^k$:

(a) $\|(g'(x^k)^{-1} - J_k)y^k\| \le K \implies x^k \to \bar{x}$ linearly.
(b) $\|(g'(x^k)^{-1} - J_k)y^k\| \le \theta_1^k K \implies x^k \to \bar{x}$ superlinearly.
(c) $\|(g'(x^k)^{-1} - J_k)y^k\| \le \min\{M_2 \|x^k - x^{k-1}\|, K\} \implies x^k \to \bar{x}$ two-step quadratically.
(d) $\|(g'(x^k)^{-1} - J_k)y^k\| \le \min\{M_3 \|g(x^k)\|, K\} \implies x^k \to \bar{x}$ quadratically.
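
As an illustration of case (d) (not from the original notes): when $J_k = g'(x^k)^{-1}$ exactly, the left-hand side of (iv) is zero, so the condition holds trivially and the iteration reduces to the classical Newton's method. The sketch below, assuming NumPy and reusing the hypothetical system $g$ from above, prints the errors $\|x^k - \bar{x}\|$, which decay at roughly the quadratic rate the theorem predicts.

```python
import numpy as np

def g(x):
    # Hypothetical system g(x) = 0 with root x_bar = (1/sqrt(2), 1/sqrt(2)).
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])

def g_prime(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])

x_bar = np.array([1.0, 1.0]) / np.sqrt(2.0)
x = np.array([0.9, 0.5])                   # x^0 close enough to x_bar
for k in range(6):
    Jk = np.linalg.inv(g_prime(x))         # exact inverse Jacobian: (i) and (iv) hold with left side 0
    x = x - Jk @ g(x)                      # x^{k+1} := x^k - J_k g(x^k)
    print(k, np.linalg.norm(x - x_bar))    # error shrinks roughly quadratically
```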

Newton’s Method for Minimization: $\nabla f(x) = 0$

Let $f : \mathbb{R}^n \to \mathbb{R}$ be twice continuously differentiable, $x^0 \in \mathbb{R}^n$, and $H_0 \in \mathbb{R}^{n \times n}$. Suppose that

1. there exist $\bar{x} \in \mathbb{R}^n$ and $\epsilon > \|x^0 - \bar{x}\|$ such that $f(\bar{x}) \le f(x)$ whenever $\|x - \bar{x}\| \le \epsilon$,
2. there is a $\delta > 0$ such that $\delta \|z\|_2^2 \le z^T \nabla^2 f(x) z$ for all $x \in B(\bar{x}; \epsilon)$ and all $z \in \mathbb{R}^n$,
3. $\nabla^2 f$ is Lipschitz continuous on $\mathrm{cl}\,B(\bar{x}; \epsilon)$ with Lipschitz constant $L$, and
4. $\theta_0 := \frac{L}{2\delta}\|x^0 - \bar{x}\| + M_0 K < 1$, where $M_0 > 0$ satisfies $z^T \nabla^2 f(x) z \le M_0 \|z\|_2^2$ for all $x \in B(\bar{x}; \epsilon)$ and all $z \in \mathbb{R}^n$, and $K \ge \|(\nabla^2 f(x^0)^{-1} - H_0)y^0\|$ with $y^0 = \nabla f(x^0)/\|\nabla f(x^0)\|$.

Further suppose that the iteration $x^{k+1} := x^k - H_k \nabla f(x^k)$ is initiated at $x^0$, where the $H_k$’s are chosen to satisfy one of the following conditions:

(i) $\|(\nabla^2 f(x^k)^{-1} - H_k)y^k\| \le K$,
(ii) $\|(\nabla^2 f(x^k)^{-1} - H_k)y^k\| \le \theta_1^k K$ for some $\theta_1 \in (0, 1)$,
(iii) $\|(\nabla^2 f(x^k)^{-1} - H_k)y^k\| \le \min\{M_2 \|x^k - x^{k-1}\|, K\}$ for some $M_2 > 0$, or
(iv) $\|(\nabla^2 f(x^k)^{-1} - H_k)y^k\| \le \min\{M_3 \|\nabla f(x^k)\|, K\}$ for some $M_3 > 0$,

where for each $k = 1, 2, \dots$, $y^k := \nabla f(x^k)/\|\nabla f(x^k)\|$.


These hypotheses on the accuracy of the approximations $H_k$ yield the following conclusions about the rate of convergence of the iterates $x^k$:

(a) If (i) holds, then $x^k \to \bar{x}$ linearly.
(b) If (ii) holds, then $x^k \to \bar{x}$ superlinearly.
(c) If (iii) holds, then $x^k \to \bar{x}$ two-step quadratically.
(d) If (iv) holds, then $x^k \to \bar{x}$ quadratically.
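
As in the root-finding case, taking $H_k = \nabla^2 f(x^k)^{-1}$ exactly makes the left-hand side of (iv) zero, recovering the classical Newton iteration for minimization. The sketch below is an illustration only, assuming NumPy and a hypothetical strongly convex function $f(x) = \cosh(x_1) + x_2^2/2$ whose minimizer is $\bar{x} = 0$; the printed distances to $\bar{x}$ shrink at a quadratic rate.

```python
import numpy as np

def grad_f(x):
    # Gradient of the hypothetical f(x) = cosh(x1) + x2**2 / 2.
    return np.array([np.sinh(x[0]), x[1]])

def hess_f(x):
    # Hessian of f: diag(cosh(x1), 1), positive definite everywhere (delta = 1).
    return np.diag([np.cosh(x[0]), 1.0])

x = np.array([1.0, 0.7])                  # x^0
for k in range(6):
    Hk = np.linalg.inv(hess_f(x))         # exact inverse Hessian: (i) and (iv) hold with left side 0
    x = x - Hk @ grad_f(x)                # x^{k+1} := x^k - H_k grad f(x^k)
    print(k, np.linalg.norm(x))           # distance to x_bar = 0
```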
