By Jonathan M. Borwein and Adrian S. Lewis



Similar mathematics books

Meeting the Needs of Your Most Able Pupils in Maths (The Gifted and Talented Series)

Meeting the Needs of Your Most Able Pupils: Mathematics provides specific guidance on: recognising high ability and potential; planning, differentiation, extension and enrichment in mathematics; teacher questioning skills; support for more able pupils with special educational needs (dyslexia, ADHD, sensory impairment); homework, recording and assessment; and work beyond the classroom: visits, competitions, summer schools, masterclasses, and links with universities, businesses and other organisations.

Additional info for Convex Analysis and Nonlinear Optimization: Theory and Examples

Example text

(9)  ⟨a^i, x⟩ ≤ 0 for i = 1, 2, …, m;  ⟨c, x⟩ > 0;  x ∈ E.

Proof. It is immediate that if system (8) has a solution then system (9) has no solution. Conversely, suppose (9) has no solution; we show (8) has a solution by induction on the number of elements m. The result is clear for m = 0. Suppose then that the result holds in any Euclidean space, for any set of m − 1 elements and any element c. Define a^0 = −c. Applying Gordan's theorem to the unsolvability of (9) shows there are scalars λ_0, λ_1, …, λ_m ≥ 0 in R, not all zero, satisfying λ_0 c = Σ_{i=1}^{m} λ_i a^i. If λ_0 > 0 the proof is complete, so suppose λ_0 = 0 and, without loss of generality, λ_m > 0.
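For orientation (my reconstruction, not part of the excerpt): the two systems referenced here are presumably the standard Farkas pair, of which the sample shows only the second:

\[
\text{(8)}\qquad \sum_{i=1}^{m} \mu_i a^i = c, \qquad \mu_1, \dots, \mu_m \ge 0,
\]
\[
\text{(9)}\qquad \langle a^i, x \rangle \le 0 \ \ (i = 1, \dots, m), \qquad \langle c, x \rangle > 0, \qquad x \in E.
\]

Exactly one of the two systems then has a solution, and the induction argument displayed above produces a solution of the first whenever the second is unsolvable.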

Exercises and commentary

Versions of the Lagrangian necessary conditions above appeared in [161] and [99]; for a survey, see [140]. The approach here is analogous to [72]. The Slater condition first appeared in [152].

1. Prove the Lagrangian sufficient conditions.

2. Use the Lagrangian sufficient conditions to solve the following problems.

(a)  inf  x_1^2 + x_2^2 − 6x_1 − 2x_2 + 10
     subject to  2x_1 + x_2 − 2 ≤ 0,
                 x_2 − 1 ≤ 0,
                 x ∈ R^2.

(b)  inf  −2x_1 + x_2
     subject to  x_1^2 − x_2 ≤ 0,
                 x_2 − 4 ≤ 0,
                 x ∈ R^2.

(c)  inf  x_1 + 2/x_2
     subject to  −x_2 + 1/2 ≤ 0,
                 −x_1 + x_2^2 ≤ 0,
                 x ∈ {(x_1, x_2) | x_2 > 0}.
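As a quick numerical cross-check of problem (a) — not part of the book's exercises, and assuming SciPy is available — here is a minimal sketch; the candidate point and multipliers in the comments are my own hand calculation:

# Numerical cross-check of problem (a):
#   minimize  x1^2 + x2^2 - 6*x1 - 2*x2 + 10
#   subject to  2*x1 + x2 - 2 <= 0  and  x2 - 1 <= 0.
import numpy as np
from scipy.optimize import minimize

objective = lambda x: x[0]**2 + x[1]**2 - 6*x[0] - 2*x[1] + 10

# SciPy's 'ineq' constraints require g(x) >= 0, so negate the book's g(x) <= 0 form.
constraints = [
    {"type": "ineq", "fun": lambda x: -(2*x[0] + x[1] - 2)},
    {"type": "ineq", "fun": lambda x: -(x[1] - 1)},
]

result = minimize(objective, x0=np.zeros(2), method="SLSQP", constraints=constraints)

# Hand-worked Lagrangian/KKT candidate: x = (1, 0) with multipliers (2, 0), value 5;
# the solver should reproduce this up to its tolerance.
print(result.x, result.fun)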

Prove the following functions x ∈ R ↦ f(x) are convex and calculate ∂f:
(a) |x|;
(b) δ_{R_+} (the indicator function of R_+);
(c) −√x if x ≥ 0, and +∞ otherwise;
(d) 0 if x < 0, 1 if x = 0, and +∞ otherwise.

6. … (Subgradients and directional derivatives).

7. …

8. (Subgradients of norm) Calculate ∂‖·‖.

9. (Subgradients of maximum eigenvalue) Prove ∂λ_1(0) = {Y ∈ S^n_+ | tr Y = 1}.

10. … (…2, Exercise 9 (Schur-convexity)).

11. ∗ Define a function f : R^n → R by f(x_1, x_2, …, x_n) = max_j {x_j}, let x̄ = 0 and d = (1, 1, …
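As an illustration of the sort of computation these exercises call for (a worked example of part (a), added here and not taken from the sample):

\[
\partial |\cdot|(x) \;=\;
\begin{cases}
\{\operatorname{sign}(x)\} & \text{if } x \neq 0,\\
[-1, 1] & \text{if } x = 0,
\end{cases}
\]

since g ∈ ∂|·|(0) means |y| ≥ |0| + g(y − 0) for all y ∈ R, which holds exactly when |g| ≤ 1.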

