LeanAide GPT-4 documentation

Module: Mathlib.Analysis.Convex.Cone.Extension



RieszExtension.step

theorem RieszExtension.step : ∀ {E : Type u_2} [inst : AddCommGroup E] [inst_1 : Module ℝ E] (s : ConvexCone ℝ E) (f : E →ₗ.[ℝ] ℝ), (∀ (x : ↥f.domain), ↑x ∈ s → 0 ≤ ↑f x) → (∀ (y : E), ∃ x, ↑x + y ∈ s) → f.domain ≠ ⊤ → ∃ g, f < g ∧ ∀ (x : ↥g.domain), ↑x ∈ s → 0 ≤ ↑g x

This is the induction step in the M. Riesz extension theorem. It states that, given a convex cone `s` in a real vector space `E` and a partially defined linear map `f` from a submodule of `E` (its domain `f.domain`) to the real numbers, if `f` is nonnegative on the intersection of its domain with `s`, and for every `y` in `E` there is some `x` in the domain of `f` with `x + y ∈ s` (informally, `f.domain + s = E`), then, provided the domain of `f` is not all of `E`, we can extend `f` to a strictly larger submodule without violating the nonnegativity condition. That is, there exists a partially defined linear map `g` with `f < g`, meaning `g` is a proper extension of `f`, such that `g` is nonnegative on the intersection of its domain with `s`.

More concisely: Given a convex cone `s` in a real vector space `E` and a partially defined linear map `f` into the real numbers that is nonnegative on `s` intersected with its domain, if every element of `E` can be brought into `s` by adding an element of the domain of `f`, and the domain of `f` is not all of `E`, then there exists a proper extension `g` of `f` to a larger submodule of `E` that maintains the nonnegativity condition on its domain.
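Below is a minimal usage sketch in Lean (not from the Mathlib source): it simply restates the conclusion and applies the theorem, assuming `import Mathlib`; the hypothesis names `hnonneg`, `hdense`, and `hdom` are chosen here for illustration.

import Mathlib

example {E : Type*} [AddCommGroup E] [Module ℝ E] (s : ConvexCone ℝ E)
    (f : E →ₗ.[ℝ] ℝ) (hnonneg : ∀ x : f.domain, (x : E) ∈ s → 0 ≤ f x)
    (hdense : ∀ y : E, ∃ x : f.domain, (x : E) + y ∈ s) (hdom : f.domain ≠ ⊤) :
    ∃ g : E →ₗ.[ℝ] ℝ, f < g ∧ ∀ x : g.domain, (x : E) ∈ s → 0 ≤ g x :=
  -- The step produces a strict extension `g` that is still nonnegative on `s`.
  RieszExtension.step s f hnonneg hdense hdom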

riesz_extension

theorem riesz_extension : ∀ {E : Type u_2} [inst : AddCommGroup E] [inst_1 : Module ℝ E] (s : ConvexCone ℝ E) (f : E →ₗ.[ℝ] ℝ), (∀ (x : ↥f.domain), ↑x ∈ s → 0 ≤ ↑f x) → (∀ (y : E), ∃ x, ↑x + y ∈ s) → ∃ g, (∀ (x : ↥f.domain), g ↑x = ↑f x) ∧ ∀ x ∈ s, 0 ≤ g x

The **M. Riesz extension theorem** states that for any convex cone `s` in a real vector space `E`, a submodule `p`, and a linear function `f : p → ℝ` that is nonnegative on `p ∩ s` and for which `p + s = E`, there exists a globally defined linear function `g : E → ℝ` that agrees with `f` on `p` and is nonnegative on `s`. In the formal statement, `p` is the domain of the partially defined linear map `f`, and the condition `p + s = E` is phrased as: for every `y` in `E` there exists `x` in the domain of `f` with `x + y ∈ s`. In other words, a nonnegative linear function defined on a submodule that, together with the cone, covers the entire space can be extended to a linear function on the whole space that remains nonnegative on the cone.

More concisely: Given a convex cone `s` in a real vector space `E`, a submodule `p`, and a linear function `f : p → ℝ` that is nonnegative on `p ∩ s`, if `p + s = E`, then there exists a linear extension `g : E → ℝ` of `f` that is nonnegative on `s`.
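As a hedged usage sketch (assuming `import Mathlib`; the hypothesis names and the sample point `z` are illustrative), one can extract the global extension and record its nonnegativity at a chosen point of the cone:

import Mathlib

example {E : Type*} [AddCommGroup E] [Module ℝ E] (s : ConvexCone ℝ E)
    (f : E →ₗ.[ℝ] ℝ) (hnonneg : ∀ x : f.domain, (x : E) ∈ s → 0 ≤ f x)
    (hdense : ∀ y : E, ∃ x : f.domain, (x : E) + y ∈ s) (z : E) (hz : z ∈ s) :
    ∃ g : E →ₗ[ℝ] ℝ, (∀ x : f.domain, g (x : E) = f x) ∧ 0 ≤ g z := by
  -- Obtain the globally defined extension together with its two properties.
  obtain ⟨g, hagree, hpos⟩ := riesz_extension s f hnonneg hdense
  exact ⟨g, hagree, hpos z hz⟩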

exists_extension_of_le_sublinear

theorem exists_extension_of_le_sublinear : ∀ {E : Type u_2} [inst : AddCommGroup E] [inst_1 : Module ℝ E] (f : E →ₗ.[ℝ] ℝ) (N : E → ℝ), (∀ (c : ℝ), 0 < c → ∀ (x : E), N (c • x) = c * N x) → (∀ (x y : E), N (x + y) ≤ N x + N y) → (∀ (x : ↥f.domain), ↑f x ≤ N ↑x) → ∃ g, (∀ (x : ↥f.domain), g ↑x = ↑f x) ∧ ∀ (x : E), g x ≤ N x

This is a form of the **Hahn-Banach theorem**: if `N` is a sublinear map from a real vector space `E` to the real numbers and `f` is a linear map defined on a submodule of `E` such that `f(x) ≤ N(x)` for all `x` in the domain of `f`, then there exists an extension `g` of `f` to the whole space `E` that retains the property `g(x) ≤ N(x)` for all `x` in `E`. Here `N` is sublinear in the sense that it is positively homogeneous (`N(c • x) = c * N(x)` for every real `c > 0` and every `x` in `E`) and subadditive (`N(x + y) ≤ N(x) + N(y)` for all `x, y` in `E`).

More concisely: Given a sublinear map `N` from a real vector space `E` to the real numbers and a linear map `f` defined on a submodule of `E` with `f(x) ≤ N(x)` for all `x` in its domain, there exists an extension `g` of `f` to all of `E` such that `g(x) ≤ N(x)` for all `x` in `E`.
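As an illustrative, hedged application (not part of the module; it assumes `import Mathlib` and the standard lemmas `norm_smul`, `Real.norm_of_nonneg`, and `norm_add_le`), the norm on a real normed space is a sublinear map, so a partially defined linear functional dominated by the norm extends to all of `E`:

import Mathlib

example {E : Type*} [NormedAddCommGroup E] [NormedSpace ℝ E]
    (f : E →ₗ.[ℝ] ℝ) (hf : ∀ x : f.domain, f x ≤ ‖(x : E)‖) :
    ∃ g : E →ₗ[ℝ] ℝ, (∀ x : f.domain, g (x : E) = f x) ∧ ∀ x : E, g x ≤ ‖x‖ := by
  -- Take `N x = ‖x‖`; subadditivity is the triangle inequality.
  refine exists_extension_of_le_sublinear f (fun x => ‖x‖) ?_ (fun x y => norm_add_le x y) hf
  -- Positive homogeneity of the norm for positive real scalars.
  intro c hc x
  show ‖c • x‖ = c * ‖x‖
  rw [norm_smul, Real.norm_of_nonneg hc.le]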