Bernstein Inequalities (Probability Theory)

From Handwiki

In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let X1, ..., Xn be independent Bernoulli random variables taking values +1 and −1, each with probability 1/2 (this distribution is also known as the Rademacher distribution). Then for every positive [math]\displaystyle{ \varepsilon }[/math],

[math]\displaystyle{ \mathbb{P}\left (\left|\frac{1}{n}\sum_{i=1}^n X_i\right| \gt \varepsilon \right ) \leq 2\exp \left (-\frac{n\varepsilon^2}{2(1+\frac{\varepsilon}{3})} \right). }[/math]
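As an illustration, this bound can be checked by a short Monte Carlo simulation; the sample size, number of trials, and [math]\displaystyle{ \varepsilon }[/math] below are arbitrary choices, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, eps = 200, 100_000, 0.2

# Rademacher samples: +1 or -1, each with probability 1/2.
x = rng.choice([-1.0, 1.0], size=(trials, n))
means = x.mean(axis=1)

# Empirical tail probability vs. the Bernstein bound.
empirical = np.mean(np.abs(means) > eps)
bound = 2 * np.exp(-n * eps**2 / (2 * (1 + eps / 3)))
assert empirical <= bound
```

The empirical tail frequency is well below the bound here, consistent with the inequality being a (conservative) exponential tail estimate.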

Bernstein inequalities were proven and published by Sergei Bernstein in the 1920s and 1930s.[1][2][3][4] Later, these inequalities were rediscovered several times in various forms. Thus, special cases of the Bernstein inequalities are also known as the Chernoff bound, Hoeffding's inequality and Azuma's inequality.

Some of the inequalities

1. Let [math]\displaystyle{ X_1, \ldots, X_n }[/math] be independent zero-mean random variables. Suppose that [math]\displaystyle{ |X_i|\leq M }[/math] almost surely, for all [math]\displaystyle{ i. }[/math] Then, for all positive [math]\displaystyle{ t }[/math],

[math]\displaystyle{ \mathbb{P} \left (\sum_{i=1}^n X_i \geq t \right ) \leq \exp \left ( -\frac{\tfrac{1}{2} t^2}{\sum_{i = 1}^n \mathbb{E} \left[X_i^2 \right ]+\tfrac{1}{3} Mt} \right ). }[/math]
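A quick numerical sanity check of inequality (1), using zero-mean variables uniform on [−1, 1] (so [math]\displaystyle{ M = 1 }[/math] and [math]\displaystyle{ \mathbb{E}[X_i^2] = 1/3 }[/math]); all parameters below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials, t = 100, 100_000, 15.0

# Zero-mean variables bounded by M = 1: uniform on [-1, 1].
x = rng.uniform(-1.0, 1.0, size=(trials, n))
sums = x.sum(axis=1)

var_sum = n * (1.0 / 3.0)  # sum of E[X_i^2] for Uniform(-1, 1)
M = 1.0
bound = np.exp(-0.5 * t**2 / (var_sum + M * t / 3.0))
empirical = np.mean(sums >= t)
assert empirical <= bound
```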

2. Let [math]\displaystyle{ X_1, \ldots, X_n }[/math] be independent zero-mean random variables. Suppose that for some positive real [math]\displaystyle{ L }[/math] and every integer [math]\displaystyle{ k \geq 2 }[/math],

[math]\displaystyle{ \mathbb{E} \left[ \left |X_i^k \right |\right ] \leq \frac{1}{2} \mathbb{E} \left[X_i^2\right] L^{k-2} k! }[/math]

Then

[math]\displaystyle{ \mathbb{P} \left (\sum_{i=1}^n X_i \geq 2t \sqrt{\sum \mathbb{E} \left [X_i^2 \right ]} \right ) \lt \exp(-t^2), \qquad \text{for}\quad 0 \leq t \leq \frac{1}{2L}\sqrt{\sum \mathbb{E} \left[X_j^2\right ]}. }[/math]
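Rademacher variables satisfy the moment condition of inequality (2) with [math]\displaystyle{ L = 1/3 }[/math], since [math]\displaystyle{ \mathbb{E}[X_i^2] = 1 }[/math], [math]\displaystyle{ \mathbb{E}|X_i^k| = 1 }[/math], and [math]\displaystyle{ \tfrac{1}{2} L^{k-2} k! \geq 1 }[/math] for every [math]\displaystyle{ k \geq 2 }[/math]. A sketch of a numerical check with arbitrary illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials, t = 100, 200_000, 1.5

# Rademacher variables: E[X_i^2] = 1 and E|X_i^k| = 1, so the moment
# condition holds with L = 1/3 (since 0.5 * L**(k-2) * k! >= 1 for k >= 2).
x = rng.choice([-1.0, 1.0], size=(trials, n))
sums = x.sum(axis=1)

s2 = float(n)                      # sum of E[X_i^2]
L = 1.0 / 3.0
assert t <= np.sqrt(s2) / (2 * L)  # t lies inside the allowed range

empirical = np.mean(sums >= 2 * t * np.sqrt(s2))
bound = np.exp(-t**2)
assert empirical < bound
```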

3. Let [math]\displaystyle{ X_1, \ldots, X_n }[/math] be independent zero-mean random variables. Suppose that

[math]\displaystyle{ \mathbb{E} \left[ \left |X_i^k \right |\right ] \leq \frac{k!}{4!} \left(\frac{L}{5}\right)^{k-4} }[/math]

for all integers [math]\displaystyle{ k \geq 4. }[/math] Denote

[math]\displaystyle{ A_k = \sum \mathbb{E} \left [ X_i^k\right ]. }[/math]

Then,

[math]\displaystyle{ \mathbb{P} \left( \left| \sum_{j=1}^n X_j - \frac{A_3 t^2}{3A_2} \right|\geq \sqrt{2A_2} \, t \left[ 1 + \frac{A_4 t^2}{6 A_2^2} \right] \right) \lt 2 \exp (- t^2), \qquad \text{for} \quad 0 \lt t \leq \frac{5 \sqrt{2A_2}}{4L}. }[/math]

4. Bernstein also proved generalizations of the inequalities above to weakly dependent random variables. For example, inequality (2) can be extended as follows. Let [math]\displaystyle{ X_1, \ldots, X_n }[/math] be possibly non-independent random variables. Suppose that for some positive real [math]\displaystyle{ L }[/math] and constants [math]\displaystyle{ R_i }[/math], for every positive integer [math]\displaystyle{ i }[/math] and every integer [math]\displaystyle{ k \geq 2 }[/math],

[math]\displaystyle{ \begin{align} \mathbb{E} \left. \left [ X_i \right | X_1, \ldots, X_{i-1} \right ] &= 0, \\ \mathbb{E} \left. \left [ X_i^2 \right | X_1, \ldots, X_{i-1} \right ] &\leq R_i \mathbb{E} \left [ X_i^2 \right ], \\ \mathbb{E} \left. \left [ X_i^k \right | X_1, \ldots, X_{i-1} \right ] &\leq \tfrac{1}{2} \mathbb{E} \left. \left[ X_i^2 \right | X_1, \ldots, X_{i-1} \right ] L^{k-2} k! \end{align} }[/math]

Then

[math]\displaystyle{ \mathbb{P} \left( \sum_{i=1}^n X_i \geq 2t \sqrt{\sum_{i=1}^n R_i \mathbb{E}\left [ X_i^2 \right ]} \right) \lt \exp(-t^2), \qquad \text{for}\quad 0 \lt t \leq \frac{1}{2L} \sqrt{\sum_{i=1}^n R_i \mathbb{E} \left [X_i^2 \right ]}. }[/math]

More general results for martingales can be found in Fan et al. (2015).[5]

Proofs

The proofs are based on an application of Markov's inequality to the random variable

[math]\displaystyle{ \exp \left ( \lambda \sum_{j=1}^n X_j \right ), }[/math]

for a suitable choice of the parameter [math]\displaystyle{ \lambda \gt 0 }[/math].
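For example, for the upper tail, Markov's inequality applied to this exponential gives, for every [math]\displaystyle{ \lambda \gt 0 }[/math],

[math]\displaystyle{ \mathbb{P}\left( \sum_{j=1}^n X_j \geq t \right) = \mathbb{P}\left( e^{\lambda \sum_{j=1}^n X_j} \geq e^{\lambda t} \right) \leq e^{-\lambda t}\, \mathbb{E}\left[ e^{\lambda \sum_{j=1}^n X_j} \right] = e^{-\lambda t} \prod_{j=1}^n \mathbb{E}\left[ e^{\lambda X_j} \right], }[/math]

where the last equality uses independence. The moment or boundedness assumptions are then used to bound each factor [math]\displaystyle{ \mathbb{E}\left[ e^{\lambda X_j} \right] }[/math], and the resulting bound is minimized over [math]\displaystyle{ \lambda }[/math].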

Generalizations

The Bernstein inequality can be generalized to Gaussian random matrices. Let [math]\displaystyle{ G = g^H A g + 2 \operatorname{Re}(g^H a) }[/math] be a scalar, where [math]\displaystyle{ A }[/math] is a complex Hermitian matrix and [math]\displaystyle{ a }[/math] is a complex vector of size [math]\displaystyle{ N }[/math]. The vector [math]\displaystyle{ g \sim \mathcal{CN}(0,I) }[/math] is a circularly symmetric complex Gaussian vector of size [math]\displaystyle{ N }[/math]. Then for any [math]\displaystyle{ \sigma \geq 0 }[/math], we have

[math]\displaystyle{ \mathbb{P} \left( G \leq \operatorname{tr}(A) - \sqrt{2\sigma}\sqrt{\Vert \operatorname{vec}(A) \Vert^2 + 2 \Vert a \Vert^2 } - \sigma s^-(A) \right) \lt \exp(-\sigma), }[/math]

where [math]\displaystyle{ \operatorname{vec} }[/math] is the vectorization operation. Also, [math]\displaystyle{ s^- (A) = \max(\lambda_{\max}(-A),0) }[/math], where [math]\displaystyle{ \lambda_{\max}(-A) }[/math] is the maximum eigenvalue of [math]\displaystyle{ -A }[/math]. The proof is detailed in the reference.[6] A similar inequality for the upper tail is

[math]\displaystyle{ \mathbb{P} \left( G \geq \operatorname{tr}(A) + \sqrt{2\sigma}\sqrt{\Vert \operatorname{vec}(A) \Vert^2 + 2 \Vert a \Vert^2 } + \sigma s^+(A) \right) \lt \exp(-\sigma), }[/math]

where [math]\displaystyle{ s^+(A) = \max(\lambda_{\max}(A),0) }[/math] and [math]\displaystyle{ \lambda_{\max}(A) }[/math] is the maximum eigenvalue of [math]\displaystyle{ A }[/math].
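The upper-tail bound can likewise be sanity-checked by simulation. The matrix [math]\displaystyle{ A }[/math], vector [math]\displaystyle{ a }[/math], and [math]\displaystyle{ \sigma }[/math] below are arbitrary illustrative choices; note that [math]\displaystyle{ \Vert \operatorname{vec}(A) \Vert }[/math] is just the Frobenius norm of [math]\displaystyle{ A }[/math]:

```python
import numpy as np

rng = np.random.default_rng(3)
N, trials, sigma = 4, 200_000, 1.0

# An arbitrary Hermitian matrix A and complex vector a.
B = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
A = (B + B.conj().T) / 2
a = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

# Circularly symmetric complex Gaussian g ~ CN(0, I): real and
# imaginary parts are independent N(0, 1/2), so E[g g^H] = I.
g = (rng.normal(size=(trials, N)) + 1j * rng.normal(size=(trials, N))) / np.sqrt(2)

# G = g^H A g + 2 Re(g^H a), computed per trial.
G = (np.einsum('ti,ij,tj->t', g.conj(), A, g).real
     + 2 * np.einsum('ti,i->t', g.conj(), a).real)

s_plus = max(np.linalg.eigvalsh(A).max(), 0.0)
thresh = (A.trace().real
          + np.sqrt(2 * sigma)
          * np.sqrt(np.linalg.norm(A, 'fro')**2 + 2 * np.linalg.norm(a)**2)
          + sigma * s_plus)

empirical = np.mean(G >= thresh)
assert empirical < np.exp(-sigma)
```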

See also

  • Concentration inequality – a summary of tail bounds on random variables.
  • Hoeffding's inequality

References

  1. S. N. Bernstein, "On a modification of Chebyshev's inequality and of the error formula of Laplace", vol. 4, #5 (original publication: Ann. Sci. Inst. Sav. Ukraine, Sect. Math. 1, 1924)
  2. Bernstein, S. N. (1937). "On certain modifications of Chebyshev's inequality" [Об определенных модификациях неравенства Чебышева]. Doklady Akademii Nauk SSSR 17 (6): 275–277.
  3. S. N. Bernstein, "Theory of Probability" (in Russian), Moscow, 1927
  4. J. V. Uspensky, "Introduction to Mathematical Probability", McGraw-Hill Book Company, 1937
  5. Fan, X.; Grama, I.; Liu, Q. (2015). "Exponential inequalities for martingales with applications". Electronic Journal of Probability 20: 1–22. doi:10.1214/EJP.v20-3496. http://projecteuclid.org/euclid.ejp/1465067107.
  6. Ikhlef, Bechar (2009). "A Bernstein-type inequality for stochastic processes of quadratic forms of Gaussian variables". arXiv:0909.3595 [math.ST].

(according to: S. N. Bernstein, Collected Works, Nauka, 1964)

A modern translation of some of these results can also be found in Hazewinkel, Michiel, ed. (2001), "Bernstein inequality", Encyclopedia of Mathematics, Springer Science+Business Media B.V. / Kluwer Academic Publishers, ISBN 978-1-55608-010-4, https://www.encyclopediaofmath.org/index.php?title=Bernstein_inequality 



Source: https://handwiki.org/wiki/Bernstein_inequalities_(probability_theory) | License: CC BY-SA 3.0
