LeanAide GPT-4 documentation

Module: Mathlib.Probability.Moments


ProbabilityTheory.measure_le_le_exp_cgf

theorem ProbabilityTheory.measure_le_le_exp_cgf : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : MeasureTheory.Measure Ω} {t : ℝ} [inst : MeasureTheory.IsFiniteMeasure μ] (ε : ℝ), t ≤ 0 → MeasureTheory.Integrable (fun ω => (t * X ω).exp) μ → (↑↑μ {ω | X ω ≤ ε}).toReal ≤ (-t * ε + ProbabilityTheory.cgf X μ t).exp

The theorem is a formal statement of the **Chernoff bound** for the lower tail of a real random variable. In more human-readable terms, it states that for any real random variable `X` on a measurable space `Ω` equipped with a finite measure `μ`, any real number `t ≤ 0`, and any real number `ε`, if the function `(t * X ω).exp` is integrable with respect to `μ`, then the measure of the set of outcomes where `X ω ≤ ε` is at most the exponential of the sum of `-t * ε` and the cumulant generating function of `X` at `t`.

More concisely: For a real random variable `X` on a measurable space `Ω` with a finite measure `μ`, if `t ≤ 0` and `exp(t * X)` is integrable, then `μ({ω | X ω ≤ ε}) ≤ exp(-t * ε + cgf X μ t)`, where `cgf X μ t = log ∫ exp(t * X) dμ`.
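
As a usage sketch, the bound can be invoked directly. This assumes the declaration is available via `import Mathlib.Probability.Moments` as named above; the hypothesis names `ht` and `h_int` are illustrative.

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- Lower-tail Chernoff bound via the cumulant generating function:
-- for t ≤ 0 and integrable ω ↦ exp (t * X ω), bound the mass of {X ≤ ε}.
example {Ω : Type*} [MeasurableSpace Ω] {X : Ω → ℝ} {μ : Measure Ω}
    [IsFiniteMeasure μ] {t ε : ℝ} (ht : t ≤ 0)
    (h_int : Integrable (fun ω => Real.exp (t * X ω)) μ) :
    (μ {ω | X ω ≤ ε}).toReal ≤ Real.exp (-t * ε + cgf X μ t) :=
  measure_le_le_exp_cgf ε ht h_int
```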

Mathlib.Probability.Moments._auxLemma.1

theorem Mathlib.Probability.Moments._auxLemma.1 : (¬False) = True

This auxiliary lemma, `_auxLemma.1` from the `Mathlib.Probability.Moments` module, states that the negation of `False` is equal to `True` in Lean 4. This is a basic logical fact: `False` has no proof, so `¬False` holds, and a proposition that holds is propositionally equal to `True`.

More concisely: `(¬False) = True`, i.e. the negation of `False` is propositionally equal to `True`.
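
For illustration, a minimal check of this fact in Lean (discharged here by `simp`):

```lean
-- ¬False simplifies to True, so the propositional equality holds.
example : (¬False) = True := by simp
```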

ProbabilityTheory.measure_ge_le_exp_cgf

theorem ProbabilityTheory.measure_ge_le_exp_cgf : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : MeasureTheory.Measure Ω} {t : ℝ} [inst : MeasureTheory.IsFiniteMeasure μ] (ε : ℝ), 0 ≤ t → MeasureTheory.Integrable (fun ω => (t * X ω).exp) μ → (↑↑μ {ω | ε ≤ X ω}).toReal ≤ (-t * ε + ProbabilityTheory.cgf X μ t).exp

The theorem, a form of the Chernoff bound, provides an upper bound on the probability that a real random variable `X` defined on a measurable space `Ω` with a finite measure `μ` is at least a given value `ε`. More specifically, under the conditions that `t` is non-negative and the function `(t * X ω).exp` is integrable with respect to `μ`, the measure of the set of outcomes where `X ω` is greater than or equal to `ε` is at most the exponential of `-t * ε + ProbabilityTheory.cgf X μ t`, where `ProbabilityTheory.cgf X μ t` is the cumulant generating function of `X` at `t`.

More concisely: If `0 ≤ t` and `exp(t * X)` is integrable with respect to `μ`, then `μ({ω | ε ≤ X ω}) ≤ exp(-t * ε + cgf X μ t)`.
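
A usage sketch of the upper-tail version, under the same assumptions as the earlier sketch (module import as named; hypothesis names illustrative):

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- Upper-tail Chernoff bound via the cumulant generating function:
-- for 0 ≤ t and integrable ω ↦ exp (t * X ω), bound the mass of {ε ≤ X}.
example {Ω : Type*} [MeasurableSpace Ω] {X : Ω → ℝ} {μ : Measure Ω}
    [IsFiniteMeasure μ] {t ε : ℝ} (ht : 0 ≤ t)
    (h_int : Integrable (fun ω => Real.exp (t * X ω)) μ) :
    (μ {ω | ε ≤ X ω}).toReal ≤ Real.exp (-t * ε + cgf X μ t) :=
  measure_ge_le_exp_cgf ε ht h_int
```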

ProbabilityTheory.measure_le_le_exp_mul_mgf

theorem ProbabilityTheory.measure_le_le_exp_mul_mgf : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : MeasureTheory.Measure Ω} {t : ℝ} [inst : MeasureTheory.IsFiniteMeasure μ] (ε : ℝ), t ≤ 0 → MeasureTheory.Integrable (fun ω => (t * X ω).exp) μ → (↑↑μ {ω | X ω ≤ ε}).toReal ≤ (-t * ε).exp * ProbabilityTheory.mgf X μ t

The theorem states the **Chernoff bound** on the lower tail of a real random variable. For any real random variable `X` defined on a measurable space `Ω` with a finite measure `μ`, and any real numbers `t ≤ 0` and `ε`, if the function `(t * X ω).exp` is integrable with respect to `μ`, then the measure of the set of outcomes `ω` for which `X ω ≤ ε` is at most the product of the exponential of `-t * ε` and the moment generating function of `X` at `t`. In mathematical terms, `μ({ω | X ω ≤ ε}) ≤ exp(-t * ε) * E[exp(t * X)]`, where `E` denotes the expected value (the integral with respect to `μ`). This provides a bound on the probability that `X` takes values at most `ε`.

More concisely: If `t ≤ 0` and `exp(t * X)` is integrable with respect to `μ`, then `μ({ω | X ω ≤ ε}) ≤ exp(-t * ε) * mgf X μ t`, where `mgf X μ t = E[exp(t * X)]`.
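
A usage sketch of the mgf form of the lower-tail bound (same assumptions as in the earlier sketches):

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- Lower-tail bound via the moment generating function:
-- for t ≤ 0, the mass of {X ≤ ε} is at most exp (-t * ε) * mgf X μ t.
example {Ω : Type*} [MeasurableSpace Ω] {X : Ω → ℝ} {μ : Measure Ω}
    [IsFiniteMeasure μ] {t ε : ℝ} (ht : t ≤ 0)
    (h_int : Integrable (fun ω => Real.exp (t * X ω)) μ) :
    (μ {ω | X ω ≤ ε}).toReal ≤ Real.exp (-t * ε) * mgf X μ t :=
  measure_le_le_exp_mul_mgf ε ht h_int
```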

ProbabilityTheory.IndepFun.exp_mul

theorem ProbabilityTheory.IndepFun.exp_mul : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {μ : MeasureTheory.Measure Ω} {X Y : Ω → ℝ}, ProbabilityTheory.IndepFun X Y μ → ∀ (s t : ℝ), ProbabilityTheory.IndepFun (fun ω => (s * X ω).exp) (fun ω => (t * Y ω).exp) μ

This theorem states that if two real-valued random variables `X` and `Y` on a measurable space `Ω` are independent with respect to a measure `μ`, then for any real numbers `s` and `t`, the functions `(s * X ω).exp` and `(t * Y ω).exp`, where `exp` denotes the real exponential function, are also independent with respect to `μ`. This is essentially a consequence of the `IndepFun.comp` theorem: composing independent random variables with measurable functions preserves independence.

More concisely: In probability theory, if functions `X` and `Y` defined on a measure space are independent, then their exponential functions `(s * X ω).exp` and `(t * Y ω).exp` are also independent for any real numbers `s` and `t`.
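
A minimal sketch of applying the lemma via dot notation on an `IndepFun` hypothesis (the hypothesis name `hXY` is illustrative; module import assumed as named):

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- Independence is preserved after composing with ω ↦ exp (s * X ω) and ω ↦ exp (t * Y ω).
example {Ω : Type*} [MeasurableSpace Ω] {μ : Measure Ω} {X Y : Ω → ℝ}
    (hXY : IndepFun X Y μ) (s t : ℝ) :
    IndepFun (fun ω => Real.exp (s * X ω)) (fun ω => Real.exp (t * Y ω)) μ :=
  hXY.exp_mul s t
```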

ProbabilityTheory.mgf_zero'

theorem ProbabilityTheory.mgf_zero' : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : MeasureTheory.Measure Ω}, ProbabilityTheory.mgf X μ 0 = (↑↑μ Set.univ).toReal

The theorem `ProbabilityTheory.mgf_zero'` states that for any type `Ω`, any measurable space `m` on `Ω`, any real-valued random variable `X` on `Ω`, and any measure `μ` on the measurable space, the moment generating function of `X` with respect to the measure `μ` evaluated at zero is equal to the real value of the total measure of the universal set. In mathematical terms, this is saying that if `X` is a real-valued random variable on a measurable space `(Ω, m)` with measure `μ`, then `M_X(0) = μ(Ω)`, where `M_X(t)` is the moment generating function of `X` and `Ω` represents the entire space.

More concisely: For any measurable space (Ω, m) with measure μ, and real-valued random variable X on Ω, the moment generating function M\_X(0) equals the total measure μ(Ω).
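
A usage sketch (module import assumed as named; no finiteness assumption on `μ` is needed here):

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- At t = 0 the moment generating function equals the total mass of μ.
example {Ω : Type*} [MeasurableSpace Ω] {X : Ω → ℝ} {μ : Measure Ω} :
    mgf X μ 0 = (μ Set.univ).toReal :=
  mgf_zero'
```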

ProbabilityTheory.measure_ge_le_exp_mul_mgf

theorem ProbabilityTheory.measure_ge_le_exp_mul_mgf : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {X : Ω → ℝ} {μ : MeasureTheory.Measure Ω} {t : ℝ} [inst : MeasureTheory.IsFiniteMeasure μ] (ε : ℝ), 0 ≤ t → MeasureTheory.Integrable (fun ω => (t * X ω).exp) μ → (↑↑μ {ω | ε ≤ X ω}).toReal ≤ (-t * ε).exp * ProbabilityTheory.mgf X μ t

The Chernoff bound theorem provides an upper bound on the tail of a real random variable. For a measurable space `Ω`, a real random variable `X`, a finite measure `μ` on `Ω`, and a real number `t` with `0 ≤ t`, if the function `(t * X ω).exp` is integrable with respect to `μ`, then the measure of the set of outcomes where `X` is at least `ε` is at most the exponential of `-t * ε` times the moment generating function of `X` at `t` with respect to `μ`. This theorem is useful in probability theory for bounding the probability of large deviations.

More concisely: If `0 ≤ t` and `exp(t * X)` is integrable with respect to `μ`, then `μ({ω ∈ Ω : X(ω) ≥ ε}) ≤ exp(-t * ε) * M(t)`, where `M(t)` is the moment generating function of `X` with respect to `μ`.
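
A usage sketch of the mgf form of the upper-tail bound (same assumptions as in the earlier sketches):

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- Upper-tail bound via the moment generating function:
-- for 0 ≤ t, the mass of {ε ≤ X} is at most exp (-t * ε) * mgf X μ t.
example {Ω : Type*} [MeasurableSpace Ω] {X : Ω → ℝ} {μ : Measure Ω}
    [IsFiniteMeasure μ] {t ε : ℝ} (ht : 0 ≤ t)
    (h_int : Integrable (fun ω => Real.exp (t * X ω)) μ) :
    (μ {ω | ε ≤ X ω}).toReal ≤ Real.exp (-t * ε) * mgf X μ t :=
  measure_ge_le_exp_mul_mgf ε ht h_int
```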

ProbabilityTheory.IndepFun.mgf_add

theorem ProbabilityTheory.IndepFun.mgf_add : ∀ {Ω : Type u_1} {m : MeasurableSpace Ω} {μ : MeasureTheory.Measure Ω} {t : ℝ} {X Y : Ω → ℝ}, ProbabilityTheory.IndepFun X Y μ → MeasureTheory.AEStronglyMeasurable (fun ω => (t * X ω).exp) μ → MeasureTheory.AEStronglyMeasurable (fun ω => (t * Y ω).exp) μ → ProbabilityTheory.mgf (X + Y) μ t = ProbabilityTheory.mgf X μ t * ProbabilityTheory.mgf Y μ t

The theorem `ProbabilityTheory.IndepFun.mgf_add` in Probability Theory states that for any measurable space `Ω` with measure `μ` and real number `t`, and any two real-valued random variables `X` and `Y` on this space, if `X` and `Y` are independent and the functions `(fun ω => Real.exp (t * X ω))` and `(fun ω => Real.exp (t * Y ω))` are almost everywhere strongly measurable, then the moment generating function of the sum of `X` and `Y` at `t` is equal to the product of the moment generating functions of `X` and `Y` at `t`. In mathematical terms, this means if `X` and `Y` are independent, then $M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$, where $M$ denotes the moment generating function.

More concisely: For independent random variables `X` and `Y` such that `ω ↦ exp(t * X ω)` and `ω ↦ exp(t * Y ω)` are almost everywhere strongly measurable with respect to `μ`, the moment generating function of `X + Y` at `t` is the product of the moment generating functions of `X` and `Y` at `t`.
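
A minimal sketch applying the lemma via dot notation (the hypothesis names `hXY`, `hX`, `hY` are illustrative; module import assumed as named):

```lean
import Mathlib.Probability.Moments

open MeasureTheory ProbabilityTheory

-- For independent X and Y, the mgf of X + Y factors as a product, given that
-- ω ↦ exp (t * X ω) and ω ↦ exp (t * Y ω) are a.e. strongly measurable.
example {Ω : Type*} [MeasurableSpace Ω] {μ : Measure Ω} {t : ℝ} {X Y : Ω → ℝ}
    (hXY : IndepFun X Y μ)
    (hX : AEStronglyMeasurable (fun ω => Real.exp (t * X ω)) μ)
    (hY : AEStronglyMeasurable (fun ω => Real.exp (t * Y ω)) μ) :
    mgf (X + Y) μ t = mgf X μ t * mgf Y μ t :=
  hXY.mgf_add hX hY
```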