Wikipedia:Reference desk/Archives/Mathematics/2008 August 14

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


August 14

a and n are positive integers.

Claim: a is coprime to n if and only if a^φ(n) ≡ 1 (mod n).

Prove or disprove. —Preceding unsigned comment added by 93.172.15.144 (talk) 09:56, 14 August 2008 (UTC)

The article you've linked to in the title has two proofs of the non-trivial implication. Algebraist 12:06, 14 August 2008 (UTC)
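If the intended claim is the usual Euler's-theorem statement a^φ(n) ≡ 1 (mod n), a brute-force check over small values is easy to run in Python; euler_phi below is a naive helper written just for this test, not a library call:

    from math import gcd

    def euler_phi(n):
        """Naive Euler totient: the number of 1 <= k <= n with gcd(k, n) == 1."""
        return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

    # Check, for small a and n, that gcd(a, n) == 1  <=>  a**phi(n) == 1 (mod n).
    for n in range(2, 60):
        phi = euler_phi(n)
        for a in range(1, 3 * n):
            coprime = gcd(a, n) == 1
            euler_holds = pow(a, phi, n) == 1
            assert coprime == euler_holds, (a, n)

    print("equivalence holds for all tested (a, n)")

The non-trivial direction is Euler's theorem itself; the converse is immediate, since a^φ(n) ≡ 1 (mod n) makes a^(φ(n)-1) an inverse of a modulo n, forcing gcd(a, n) = 1.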

Deceptively distributed decimal digits

X and Y are independent continuous random variables both uniformly distributed between 0 and some upper limit a. Z is the leading (non-zero) digit in the decimal expansion of Y/X. So, for example, if X=2 and Y=5 then Z=2; if X=5 and Y=2 then Z=4. What is the probability that Z=1?

I expected Benford's law to apply, because I thought this was analogous to measuring an observable of magnitude Y in units of magnitude X. So I expected to find P(Z=1) = log10(2). But I think I have a geometric argument that shows that P(Z=1) is 1/3. Is this correct? And, if so, why doesn't Benford's law apply? Gandalf61 (talk) 10:17, 14 August 2008 (UTC)

The probability is independent of a, so set a=1. This gives some simplification. Within the unit square in the X,Y-plane consider the triangles for which 0.1≤X/Y<0.2, 1≤X/Y<2, 10≤X/Y<20, 100≤X/Y<200, etc., and sum their areas. Is this what you did? Bo Jacoby (talk) 11:00, 14 August 2008 (UTC).
Yes, in essence that was my geometric argument that led to P(Z=1) = 1/3. Gandalf61 (talk) 11:10, 14 August 2008 (UTC)
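A quick Monte Carlo sketch supports this; the Python below is ad hoc (helper functions written for the test, a = 1 without loss of generality), and the estimate lands near 1/3 rather than near 0.301:

    import random

    def leading_digit(x, base=10):
        """First significant (non-zero) digit of a positive real x in the given base."""
        while x >= base:
            x /= base
        while x < 1:
            x *= base
        return int(x)

    def positive_uniform():
        """Uniform sample from (0, 1), rejecting the (measure-zero) exact 0.0."""
        u = random.random()
        while u == 0.0:
            u = random.random()
        return u

    N = 1_000_000
    hits = sum(1 for _ in range(N)
               if leading_digit(positive_uniform() / positive_uniform()) == 1)

    print("estimated P(Z=1):", hits / N)   # typically about 0.333
    print("geometric answer:", 1 / 3)
    print("Benford log10(2):", 0.30103)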
Benford's law is just a rule of thumb, and I don't think there's any realistic situation in which it holds exactly. I'd expect it to hold approximately here, and it does: log10(2) ≈ 1/3. A precisely correct statement of the law would be that P(Z=1) is the integrated area from 0 to log10(2) of the distribution of the log10 of your random variable taken mod 1. Often that distribution is roughly flat, so the integral comes out to about 0.301. -- BenRG (talk) 11:31, 14 August 2008 (UTC)
Hmmm. 1/3 doesn't seem very close to log10(2). And the difference gets worse if you increase the base. In hexadecimal, Benford's law says the P(Z=1) should be log16(2) = 1/4, whereas the geometric method gives P(Z=1) = 3/10. In fact, as the base increases, Benford's law says that P(Z=1) should tend towards 0 because
\lim_{b \to \infty} \log_b 2 = \lim_{b \to \infty} \frac{\ln 2}{\ln b} = 0
whereas with the geometric method, P(Z=1) tends to a limit of 1/4 (because the area of the largest triangle, between the lines Y=X and Y=2X, is always 1/4 and the areas of all the other triangles become negligible). So I still think that either the geometric method is incorrect or Benford's law doesn't apply - but I don't know which. Gandalf61 (talk) 12:58, 14 August 2008 (UTC)
Your geometric method is fine; it's a proof. Benford's law never applies in the sense that you mean—it's not a theorem (except in base 2) and can't be used to prove anything. It's generally the case that it gets less accurate for higher bases. For this problem it's exactly right (for a leading digit of 1) in bases 2 and 4 and off by less than 1% in base 3. -- BenRG (talk) 14:56, 14 August 2008 (UTC)
But Benford's law isn't just theoretical - Benford supported it with statistical evidence in his original paper, and MathWorld mentions other evidence as well. What I am trying to work out is whether Y/X is a good model for measuring an arbitrary observable in arbitrary units. What is the probability that the first significant digit in the base b expansion of an arbitrary observable measured in arbitrary units (for example, the speed of light in furlongs per fortnight) is 1? Is it log_b(2) as per Benford? Or is it (b+2)/(4(b-1)) as per the Y/X model? Gandalf61 (talk) 15:31, 14 August 2008 (UTC)
Oh, I see what you mean. Benford's law is better. What you're seeing in this problem is edge effects arising from the particular cutoff you've imposed on X and Y. If X is a physical quantity and Y is a unit, it's probably more realistic to assume a uniform distribution on log X and log Y rather than X and Y. The leading-1 case of Benford's law says that the fractional part of log X − log Y will lie in the range [0, 0.301) about 30.1% of the time, which will be true if the fractional part of log X − log Y is uniformly distributed, which will be roughly true as long as X and Y are allowed to range over enough orders of magnitude. -- BenRG (talk) 18:35, 14 August 2008 (UTC)
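A rough Python sketch of that last point (six orders of magnitude is an arbitrary choice): with log X and log Y uniform over a wide range, the simulated frequency of a leading 1 in Y/X comes out near Benford's 30.1% instead of 1/3.

    import math
    import random

    def leading_digit(x):
        """First significant digit of a positive real x, base 10."""
        return int(10 ** (math.log10(x) % 1))

    N = 1_000_000
    DECADES = 6   # arbitrary: log10(X) and log10(Y) uniform on [0, 6]

    hits = 0
    for _ in range(N):
        x = 10 ** random.uniform(0, DECADES)
        y = 10 ** random.uniform(0, DECADES)
        hits += leading_digit(y / x) == 1

    print("P(Z=1) with log-uniform X and Y:", hits / N)   # about 0.301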
Gandalf, the answer is P(Z=1) = 1/9. The nine triangles from 0.1 to 1.0 each have the same area. The nine triangles from 0.01 to 0.1 each have the same area. &c. So each of the nine possible leading digits from 1 to 9 has the same probability, 1/9. You do not even need to sum the geometric series. Bo Jacoby (talk) 14:18, 14 August 2008 (UTC).
Here is how I got P(Z=1) = 1/3. We have one sequence of triangles with bases down the right hand side of the square: 0.1X ≤ Y < 0.2X, 0.01X ≤ Y < 0.02X etc., with bases 1/10^n and height 1, so total area of this sequence is 1/18. We have another sequence of triangles with bases along the top of the square: X ≤ Y < 2X, 10X ≤ Y < 20X etc., with bases 5/10^n and height 1, so total area of this sequence is 5/18. And 1/18 + 5/18 = 1/3. Gandalf61 (talk) 14:35, 14 August 2008 (UTC)
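Written out in a general base b (a sketch; the base-10 sums above are the case b = 10), the two geometric series are

    \begin{align*}
    \sum_{n\ge 1} \tfrac{1}{2}\,b^{-n} &= \frac{1}{2(b-1)} && \text{(right-hand edge; } \tfrac{1}{18} \text{ for } b=10\text{)}\\
    \sum_{n\ge 0} \tfrac{1}{4}\,b^{-n} &= \frac{b}{4(b-1)} && \text{(top edge; } \tfrac{5}{18} \text{ for } b=10\text{)}\\
    P(Z=1) &= \frac{1}{2(b-1)} + \frac{b}{4(b-1)} = \frac{b+2}{4(b-1)} && \text{(} \tfrac{1}{3} \text{ for } b=10\text{).}
    \end{align*}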
You're correct, the total area is 1/3 (in general, (b+2)/(4(b-1)) in base b). The nine triangles corresponding to different leading digits have different areas. -- BenRG (talk) 14:56, 14 August 2008 (UTC)
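A few lines of Python tabulate how the two closed forms compare as the base grows (this only evaluates the formulas already quoted above, nothing more):

    import math

    def geometric(b):
        """Exact P(leading digit = 1) for Y/X with X, Y uniform on (0, a), in base b."""
        return (b + 2) / (4 * (b - 1))

    def benford(b):
        """Benford's prediction: log base b of 2."""
        return math.log(2, b)

    for b in (2, 3, 4, 8, 10, 16, 60, 1000):
        exact, pred = geometric(b), benford(b)
        print(f"base {b:>4}: exact {exact:.4f}  Benford {pred:.4f}  rel. diff {exact / pred - 1:+.1%}")
    # Bases 2 and 4 agree exactly, base 3 is off by under 1%, and as b grows
    # the exact value tends to 1/4 while the Benford value tends to 0.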
Sorry, I was too hasty. Bo Jacoby (talk) 17:13, 14 August 2008 (UTC).
When  , Benford can't apply at all; the first digit is uniformly distributed because it's uniform on each of  , etc. When  , on the other hand, you get a full half probability for 1 from  , and then uniform otherwise. The pattern repeats at  ; it only makes sense that the final probabilities resemble Benford's Law (since   benefits 1…4 but not 5…9), but nothing more precise may be assumed because there is a variable amount of uniformity mixed in and because the relative favor given 1 and then 1/2 and then 1/2/3 is not logarithmically distributed. --Tardis (talk) 18:14, 14 August 2008 (UTC)
As BenRG notes above, for Benford's law (in base b) to hold exactly for a random variable R, the variable S = (log R) mod (log b) must be uniformly distributed. If X and Y are both uniformly distributed between 0 and a, then the probability density and cumulative probability functions for X/Y and their logarithmic equivalents are
f_{X/Y}(t) = \begin{cases} \tfrac{1}{2}, & 0 < t \le 1 \\ \tfrac{1}{2t^2}, & t \ge 1 \end{cases}
\qquad
F_{X/Y}(t) = \begin{cases} \tfrac{t}{2}, & 0 \le t \le 1 \\ 1 - \tfrac{1}{2t}, & t \ge 1 \end{cases}
\qquad
f_{\log X/Y}(\lambda) = \tfrac{1}{2}\,e^{-|\lambda|},
\qquad
F_{\log X/Y}(\lambda) = \begin{cases} \tfrac{1}{2}\,e^{\lambda}, & \lambda \le 0 \\ 1 - \tfrac{1}{2}\,e^{-\lambda}, & \lambda \ge 0 \end{cases}
Thus, the logarithmic probability density peaks at 0, reflecting the fact that the construction makes it fairly likely that X and Y are of similar magnitude. Writing β = log b, the probability density function for log X/Y mod β is then
f_{\log X/Y \bmod \beta}(\eta) = \frac{e^{-\eta} + e^{\eta-\beta}}{2\,(1 - e^{-\beta})}, \qquad 0 \le \eta < \beta
This cannot be constant for any b; therefore Benford's law (at least in its extended, multi-digit form) will not hold exactly in any base for this distribution. The probability density function does become approximately constant (over its domain 0 ≤ η < β) as β approaches 0, so the law will more or less hold for small b. For large b, however, the probability density peaks sharply at the ends of the range; in particular, P(log X/Y mod β < η) tends (pointwise) to 1/2 - e^(-η)/2 as β tends to infinity. (The second peak at β moves out to infinity, taking half of the probability mass with it.) Thus, as the geometric reasoning above suggests, P(log X/Y mod β < log 2) indeed approaches 1/2 - e^(-log 2)/2 = 1/4 for large β. —Ilmari Karonen (talk) 22:16, 14 August 2008 (UTC)
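Integrating that wrapped density over [0, η) gives the cumulative probability in closed form, which ties the two calculations together:

    P\left(\log\tfrac{X}{Y} \bmod \beta < \eta\right)
      = \int_0^{\eta} \frac{e^{-u} + e^{u-\beta}}{2\,(1 - e^{-\beta})}\,du
      = \frac{1 - e^{-\eta} + e^{\eta-\beta} - e^{-\beta}}{2\,(1 - e^{-\beta})}.

At β = ln 10 and η = ln 2 this evaluates to (1 - 1/2 + 1/5 - 1/10)/(9/5) = 1/3, matching the geometric answer above, while letting β → ∞ recovers the limit 1/2 - e^(-η)/2, i.e. 1/4 at η = ln 2.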

Thank you for all the responses. Obviously if we start with a uniform distribution for log X and log Y over a wide enough range then a Benford-type distribution for Z follows almost trivially. I was hoping that there might be a simple explanation of why Benford's law is empirically seen to apply in less restricted scenarios. As far as I know, the only explanation of Benford's law that does not start out by assuming an underlying uniform log distribution or something close to that is Hill's 1996 paper on samples from "random" distributions, which is a fairly technical explanation. Gandalf61 (talk) 09:21, 15 August 2008 (UTC)

Well, a slightly better explanation is that, even if the log distribution isn't that close to uniform, "rolling it up" modulo log b will tend to make it more uniform if b is small enough. (Even your sharply peaked example gets pretty close for small b, the relative error P(log X/Y mod log b < log 2) / (log_b 2) - 1 being, coincidentally, about b/100 for b up to 60 or so.) In particular, if the variable is approximately log-normally distributed (which the central limit theorem says is likely for variables that are products of many independent underlying factors) with a standard (log-)deviation of more than a few orders of magnitude, then Benford's law is likely to hold quite well. —Ilmari Karonen (talk) 12:46, 15 August 2008 (UTC)
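As a rough numerical illustration of that last remark, the Python sketch below draws a log-normal variable whose log10 has standard deviation 3 (an arbitrary stand-in for "a few orders of magnitude") and tallies its leading digits against Benford's values:

    import math
    import random
    from collections import Counter

    N = 500_000
    SIGMA = 3   # standard deviation of log10 of the variable, in decades (arbitrary)

    counts = Counter()
    for _ in range(N):
        value = 10 ** random.gauss(0, SIGMA)             # log-normally distributed
        counts[int(10 ** (math.log10(value) % 1))] += 1  # tally its leading digit

    for d in range(1, 10):
        print(d, round(counts[d] / N, 4), round(math.log10(1 + 1 / d), 4))
    # The observed frequencies track log10(1 + 1/d) closely at this spread.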

2012

What day of the week will 18th August 2012 fall on? —Preceding unsigned comment added by 86.135.111.151 (talk) 10:30, 14 August 2008 (UTC)

Saturday. You can also double-click the Windows clock on the taskbar and page its calendar forward. See Calculating the day of the week. —Preceding unsigned comment added by 93.172.15.144 (talk) 10:47, 14 August 2008 (UTC)
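For reference, the sort of method that article describes fits in a few lines of Python; this sketch uses Zeller's congruence and cross-checks it against the standard datetime module:

    from datetime import date

    def zeller_day_of_week(year, month, day):
        """Zeller's congruence for the Gregorian calendar; returns the weekday name."""
        if month < 3:          # January and February are counted as months 13 and 14
            month += 12        # of the previous year
            year -= 1
        k, j = year % 100, year // 100
        h = (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
        return ["Saturday", "Sunday", "Monday", "Tuesday",
                "Wednesday", "Thursday", "Friday"][h]

    print(zeller_day_of_week(2012, 8, 18))       # Saturday
    print(date(2012, 8, 18).strftime("%A"))      # Saturday (cross-check)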
You can also visit 2012 here, and click on the link to leap year starting on Sunday. -- Coneslayer (talk) 16:04, 14 August 2008 (UTC)
86.135.111.151: What do you expect will happen on August 18, 2012? None of these, I hope? --Bowlhover (talk) 19:09, 15 August 2008 (UTC)

A question about terminology in elementary category theory

What is the following property called, or how would you name it, for a map h : x → y in some category C:

(P) for any f : x → x there exists a g : y → y such that gh = hf.

In other words, if h is placed on the two horizontal edges of a square diagram and f on the left edge, we can complete a commutative square by placing g on the right edge.
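For concreteness, here is that square as a LaTeX sketch (this assumes the tikz-cd package):

    % Property (P): for every f there exists some g making the square commute, i.e. gh = hf.
    \begin{tikzcd}
    x \arrow[r, "h"] \arrow[d, "f"'] & y \arrow[d, "g"] \\
    x \arrow[r, "h"']                & y
    \end{tikzcd}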

In fact, an analogous property can be considered in any algebraic context (that is, f, g and h may be elements of a group, say), but does this have a name? —Preceding unsigned comment added by 79.38.22.37 (talk) 13:48, 14 August 2008 (UTC)

How fast?

This is an odd question. I purchased The Flash (TV series) on DVD a while back and one episode had the superhero running 10 miles in 30 seconds to escape a radio signal that would blow up a bomb. I'm sorry to say that I don't remember the formula for calculating speed. How fast is that anyway? --Ghostexorcist (talk) 19:52, 14 August 2008 (UTC)

Distance = (Speed) x (Time) and Speed = (Distance)/(Time). After you do the math, you figure that the superhero was going at 1200 mph.--El aprendelenguas (talk) 20:45, 14 August 2008 (UTC)
Which is still slower than the speed of light, at which radio waves (being very low-frequency light) travel. --antilivedT | C | G 08:52, 15 August 2008 (UTC)
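Spelled out as a unit conversion, with the speed of light in the same units for comparison:

    \frac{10\ \text{miles}}{30\ \text{seconds}} \times \frac{3600\ \text{seconds}}{1\ \text{hour}}
      = 1200\ \text{miles per hour},
    \qquad
    c \approx 6.7 \times 10^{8}\ \text{miles per hour},

so 1200 mph is roughly two millionths of the speed of light.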