Cantelli's inequality

From Wikipedia, the free encyclopedia

Revision as of 18:12, 12 August 2012

In probability theory, Cantelli's inequality is a generalization of Chebyshev's inequality in the case of a single "tail".[1][2][3] The inequality states that

Prob(X − μ ≤ λ) ≤ σ² / (σ² + λ²)   for λ < 0
Prob(X − μ ≤ λ) ≥ 1 − σ² / (σ² + λ²)   for λ ≥ 0

where

X is a real-valued random variable,
Prob is the probability measure,
μ is the expected value,
σ² is the variance.
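As a concrete check (not part of the article or its sources), the lower-tail bound can be verified exactly for a fair six-sided die, whose mean is 3.5 and variance is 35/12; the function name below is illustrative.

```python
from fractions import Fraction

def cantelli_bound(sigma2, lam):
    """Cantelli's upper bound on Prob(X - mu <= lam) for lam < 0."""
    return sigma2 / (sigma2 + lam * lam)

# Fair six-sided die: mu = 7/2, sigma^2 = 35/12, computed exactly.
outcomes = [1, 2, 3, 4, 5, 6]
mu = Fraction(sum(outcomes), 6)
sigma2 = sum((Fraction(x) - mu) ** 2 for x in outcomes) / 6

lam = Fraction(-3, 2)          # event: X - mu <= -3/2, i.e. X <= 2
prob = Fraction(sum(1 for x in outcomes if x - mu <= lam), 6)
bound = cantelli_bound(sigma2, lam)

print(prob, "<=", bound)       # 1/3 <= 35/62
```

The exact tail probability 1/3 indeed lies below Cantelli's bound 35/62 ≈ 0.565.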

The inequality is due to Francesco Paolo Cantelli. Chebyshev's inequality bounds the probability that a value of a data sample or probability distribution deviates from the mean by a given amount in absolute value, so it controls both tails at once: "nearly all" values are close to the mean. Cantelli's inequality (sometimes called the "Chebyshev–Cantelli inequality" or the "one-sided Chebyshev inequality") instead bounds a single tail, estimating the probability that a value falls below (or exceeds) the mean by a given amount, without the two-sided absolute-value estimate.
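The one-sided bound follows from Markov's inequality applied to a shifted square; the following derivation sketch (a standard argument, not taken verbatim from the cited references) covers the lower-tail case λ < 0.

```latex
% Put Y = \mu - X, so that E[Y] = 0 and \operatorname{Var}(Y) = \sigma^2;
% then \Pr(X - \mu \le \lambda) = \Pr(Y \ge -\lambda), with -\lambda > 0.
% For any u \ge 0, the event Y \ge -\lambda implies (Y+u)^2 \ge (u-\lambda)^2,
% so Markov's inequality gives
\Pr(Y \ge -\lambda) \;\le\; \frac{E\!\left[(Y+u)^2\right]}{(u-\lambda)^2}
  \;=\; \frac{\sigma^2 + u^2}{(u-\lambda)^2}.
% Minimizing the right-hand side over u \ge 0 gives u = -\sigma^2/\lambda, hence
\Pr(X - \mu \le \lambda) \;\le\; \frac{\sigma^2}{\sigma^2 + \lambda^2}.
```

Taking λ = 0 in the final bound recovers the trivial estimate 1/2 only when σ² > 0 is interpreted as a limit; the displayed statement for λ ≥ 0 is the complementary form of the same bound applied to −X.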

References

  1. Research and practice in multiple criteria decision making: proceedings of the XIVth International Conference on Multiple Criteria Decision Making (MCDM), Charlottesville, Virginia, USA, June 8–12, 1998, edited by Y. Y. Haimes and R. E. Steuer, Springer, 2000, ISBN 3540672664.
  2. "Tail and Concentration Inequalities" by Hung Q. Ngo
  3. "Concentration-of-measure inequalities" by Gábor Lugosi