Talk:Hoeffding's inequality


Mistake

I think the line

 n ≥ log(2/α) / (2t²)

in the "Confidence intervals" section doesn't follow from the previous line. If you want the minimum number of samples n such that you can be at least 100(1 − α)% confident that the sample mean is within t of its expectation, then I think you actually want

 2 exp(−2nt²) ≤ α

which rearranges to give the above inequality for n.

Someone else's comment: No. What is needed is the first (smallest) n that violates the inequality. As n gets bigger, the probability α (of making a Type I error) must get smaller, so the probability of the confidence interval becomes larger. The ambiguity is in the words "solving the above for n gives us", when in fact what is needed is the opposite of the solution to that inequality for the line mentioned. I will try to edit. — Preceding unsigned comment added by 188.254.161.197 (talk) 18:08, 8 August 2023 (UTC)
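For concreteness, the sample-size calculation under discussion can be sketched in code. The function name and the example numbers below are mine; the sketch assumes the standard form of the two-sided bound, 2·exp(−2nt²) ≤ α, for i.i.d. variables bounded in [0, 1]:

```python
import math

def hoeffding_sample_size(t, alpha):
    """Smallest n with 2*exp(-2*n*t**2) <= alpha, i.e. the first n
    guaranteeing P(|sample mean - true mean| >= t) <= alpha
    for i.i.d. variables bounded in [0, 1]."""
    return math.ceil(math.log(2 / alpha) / (2 * t ** 2))

# e.g. to be 95% confident the sample mean is within 0.05 of the true mean:
n = hoeffding_sample_size(t=0.05, alpha=0.05)
print(n)  # 738
```

Note that `ceil` implements the point made above: we want the first (smallest) integer n for which 2·exp(−2nt²) drops to α or below.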

Mistake

The article refers to Theorem 2 of Hoeffding's paper for the two inequalities:

 

I'm pretty sure that the correct versions have   in the exponential. I've edited the article accordingly.
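A quick numerical sanity check of the one-sided Theorem 2 bound as the article now states it for sums, P(S − E[S] ≥ t) ≤ exp(−2t²/Σ(bᵢ − aᵢ)²). The simulation setup and the particular numbers are mine, not from the article:

```python
import math
import random

random.seed(1)
n, t, trials = 100, 10.0, 20_000

# S = sum of n fair Bernoulli variables, each in [a_i, b_i] = [0, 1],
# so the sum of (b_i - a_i)^2 is just n.
bound = math.exp(-2 * t ** 2 / n)

hits = sum(
    (sum(random.randint(0, 1) for _ in range(n)) - n / 2) >= t
    for _ in range(trials)
)
print(hits / trials, "<=", bound)  # empirical tail vs. Hoeffding bound
```

With these numbers the empirical tail frequency comes out well under the bound exp(−2) ≈ 0.135, as it must.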

[no title]

As stated, the inequality is not true (this is easy to see for n=1). There must be some other condition. coco

I noticed that you added the condition t > 0, which is indeed required. Is there another condition we're missing? --MarkSweep 21:17, 21 August 2005 (UTC)
I suggest that   194.25.15.170 (talk) 13:45, 16 March 2015 (UTC)

Hoeffding's theorem 2

I just reverted an edit which (re)introduced a mistake in the presentation of the inequality. As stated here, the inequality involves the probability

 P(S − E[S] ≥ nt)

Note that S is the sum of n independent random variables. This probability could also be written as

 P(S/n − E[S]/n ≥ t)

which is how it appears in Hoeffding's paper (Theorem 2, p. 16, using slightly different notation). In other words, Hoeffding's formulation is in terms of the mean of n independent RVs, whereas the formulation used here is in terms of their sum. A recent edit changed this to

 P(S − E[S] ≥ t)

which is incorrect. --MarkSweep (call me collect) 18:50, 24 November 2005 (UTC)
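The sum and mean formulations give the same bound once the deviation level is scaled accordingly (nt for the sum corresponds to t for the mean). A minimal check, with my own function names:

```python
import math

def sum_form(n, t, widths):
    """Bound on P(S - E[S] >= n*t) for S a sum of n bounded variables,
    widths[i] = b_i - a_i."""
    return math.exp(-2 * (n * t) ** 2 / sum(w * w for w in widths))

def mean_form(n, t, widths):
    """Bound on P(S/n - E[S]/n >= t): same event, mean formulation."""
    return math.exp(-2 * n ** 2 * t ** 2 / sum(w * w for w in widths))

widths = [1.0] * 30  # e.g. 30 variables each in [0, 1]
a = sum_form(30, 0.1, widths)
b = mean_form(30, 0.1, widths)
print(a, b)  # identical up to rounding: the two formulations agree
```

This is exactly why an edit that drops the factor of n from the deviation level (mixing the sum event with the mean-level t) changes the meaning of the statement.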

Why do we need the X_i's to have finite first and second moments? This is not stated in Hoeffding's paper, and in any case I think it follows from the fact that each X_i lies in [a_i, b_i], i.e. in a bounded interval.

The article has no mistakes: the condition t > 0 is not necessary, and the last comment is obvious. Nevertheless, a "Hoeffding-type inequality" means an inequality that uses the transform f(t) = exp(h(t − x)) to derive bounds on tail probabilities for sums of independent random variables and martingales. The inequality written here is therefore only one of the inequalities Hoeffding introduced, and from a statistical point of view it is not the most important result, as it does not control the variance of the variables. I would suggest adding the other inequality and explaining what is meant by "Hoeffding's inequality", since it is not a single thing.
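The transform mentioned here is the usual exponential-moment (Chernoff) step; spelled out in my notation:

```latex
% Markov's inequality applied to the exponential transform, for h > 0:
\Pr(X \ge x) = \Pr\!\left(e^{hX} \ge e^{hx}\right)
             \le e^{-hx}\,\mathbb{E}\!\left[e^{hX}\right].
% Optimizing over h > 0, and bounding E[e^{hX}] for bounded X via
% Hoeffding's lemma, yields the Hoeffding-type tail bounds discussed above.
```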


I added the restriction t > 0 to the statement of the first inequality in the General case section, mostly because that restriction is in the 1963 Hoeffding article. (I think the restriction is necessary as well.) -- Hein Aug 2017 — Preceding unsigned comment added by Irchans (talkcontribs) 14:32, 3 August 2017 (UTC)

Special case of Bernstein's inequality

Can someone point out which Bernstein inequality Hoeffding's inequality is a special case of? --Steve Kroon 14:10, 22 May 2007 (UTC)

The question of which inequality is a special case of the other is inconsistent on Wikipedia: the article on Hoeffding's inequality says it is more general than Bernstein's inequality, while the article on Bernstein's inequality says "special cases of Bernstein's inequality are ... Hoeffding's inequality". Someone, please fix this? -- 194.126.99.204 (talk) 13:11, 14 June 2012 (UTC)

Bernstein's is typically a stronger inequality, but in general the two are incomparable. For sums of (unbiased) Rademacher variables, Hoeffding correctly gives the tail behavior exp(−c·x²), while Bernstein incorrectly gives exp(−c·x) for large x. On the other hand, for a sum of N variables in [−1, 1] with very small variance, Bernstein gives the much tighter exp(−c·x²/Variance) for small x, versus Hoeffding's exp(−c·x²/N). I rephrased it. --TheZuza777 (talk) 00:22, 30 September 2018 (UTC)
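The incomparability described above is easy to see numerically. The constants and the particular form of Bernstein's bound used below, exp(−x²/(2(Nσ² + x/3))) for variables bounded by 1, are my assumptions for illustration:

```python
import math

def hoeffding(N, x):
    # sum of N independent variables in [-1, 1]:
    # each width b_i - a_i = 2, so sum of squared widths is 4N
    return math.exp(-2 * x ** 2 / (4 * N))

def bernstein(N, x, var):
    # one common form for |X_i| <= 1 with per-variable variance `var`
    return math.exp(-x ** 2 / (2 * (N * var + x / 3)))

# small variance: Bernstein is far tighter at moderate x
print(hoeffding(1000, 50), bernstein(1000, 50, 0.01))

# Rademacher summands (var = 1) at large x: Hoeffding is tighter
print(hoeffding(1000, 500), bernstein(1000, 500, 1.0))
```

With these numbers the first line shows Bernstein winning by many orders of magnitude, and the second shows Hoeffding ahead, matching the exp(−c·x²) vs. exp(−c·x) discussion above.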

Hoeffding's Inequality

The existing version is unreadable, and the article ignores the dependence of these results on Bennett's inequality. —Preceding unsigned comment added by 122.106.62.116 (talk) 06:13, 27 February 2009 (UTC)

Variable Bounds

The bounds for variables X_i seem misleading

 

Older revisions of this article (pre-2009), as well as the original paper, state the bounds as

 

or

 

--18.111.112.208 (talk) 18:00, 11 December 2010 (UTC)

It does not make any difference (I mean between the first and the third version above; the second one is marginally stronger). What is misleading about it? —Emil J. 13:52, 13 December 2010 (UTC)
I think that the third version deals with centered versions of the variables? 194.25.15.170 (talk) 13:43, 16 March 2015 (UTC) Sergey


Problem with bounds

To apply Hoeffding's lemma to the zero-mean variables X_i − E[X_i], we must have X_i − E[X_i] ∈ [a_i, b_i] almost surely.

But X_i ∈ [a_i, b_i] almost surely and X_i − E[X_i] ∈ [a_i, b_i] almost surely are not equivalent.
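A tiny numeric illustration of the centering point (the numbers are mine): the centered variable lives in a shifted interval, not the original one, but the interval's width, which is all the bound depends on, is unchanged:

```python
# If X lies in [a, b], then X - E[X] lies in [a - E[X], b - E[X]]:
# an interval of the SAME width, but generally not [a, b] itself.
a, b = 0.0, 1.0
mean = 0.3                     # some illustrative E[X] in [a, b]
ca, cb = a - mean, b - mean    # interval containing X - E[X]
print(ca, cb)                  # -0.3 0.7: not [0, 1]
print(cb - ca == b - a)        # True: width b - a is preserved
```

Since Hoeffding's bound involves only the widths (b_i − a_i)², applying the lemma to the centered variables on their shifted intervals gives the same exponent.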

Central limit theorem

Can someone elaborate on the relationship to the CLT? As I understand it, these two theorems are closely related. Thanks. --138.246.2.177 (talk) 10:04, 3 December 2012 (UTC)

What is the inequality?

I'm having a hard time finding the thing called "Hoeffding's Inequality". Should that be at the top somewhere? 24.10.157.3 (talk) 00:06, 23 December 2021 (UTC)

I've moved things around; it should be clearer now. Fawly (talk) 05:54, 23 December 2021 (UTC)