Wikipedia:Reference desk/Archives/Mathematics/2012 April 20

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


April 20

Limit question

Here's a question for our able analysts. No differentiation is allowed, so no chain rule, no l'Hôpital's rule, and no Taylor series please.

lim_{h→0} (ln(x + h) − ln(x))/h

We all know that the answer is 1/x, but what are the absolute bare minimum assumptions needed to prove it? Assume that

ln(x) = ∫_1^x dt/t,  for x > 0.

This implies that ln(a) + ln(b) = ln(ab), ln(a) − ln(b) = ln(a/b) and ln(a^k) = k·ln(a). Is it possible to evaluate the limit using these and only these properties? Fly by Night (talk) 01:31, 20 April 2012 (UTC)
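Not a proof under the stated constraints, of course, but the claim itself can be sanity-checked numerically; in the sketch below, `diff_quotient` is a name introduced purely for illustration:

```python
import math

def diff_quotient(x, h):
    # The difference quotient (ln(x + h) - ln(x)) / h from the question.
    return (math.log(x + h) - math.log(x)) / h

# As h shrinks, the quotient should approach 1/x.
x = 2.0
for h in [1e-1, 1e-3, 1e-6]:
    print(h, diff_quotient(x, h), "target:", 1 / x)
```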

Please excuse me for modifying your post by swapping that h for an x. I guess "no differentiation" rules out the fundamental theorem of calculus.
Do you still allow the limit definition of e, namely e = lim_{x→0} (1 + x)^(1/x)? If so, then
(ln(x + h) − ln(x))/h = (1/h)·ln((x + h)/x) = (1/x)·ln((1 + h/x)^(x/h)) → (1/x)·ln(e) = 1/x as h → 0.
Sorry in advance if this is not what you're looking for. I figured that by the time I had asked and you had confirmed you'd like to see it, I would have had to type it anyway. Rschwieb (talk) 01:58, 20 April 2012 (UTC)
I've got a very ugly way to do this: find f and g, functions of x and h, so that both limit to 1/x and, for all x, ln's difference quotient is bounded above and below by them for all h in a neighborhood of 0. I'd use f(x, h) = ln(exp(1/x)·(x + h)/(x − h)) for the lower bound and g(x, h) = (1/h)·[ln(x/x) + (h/x)] = 1/x for the upper bound (this should follow from showing that exp(b)/b limits to infinity as b goes to 0). I worked this out at work on a Post-it note, so I apologize if there is any error; even if so, the general idea is sound. PS: I know you didn't introduce exp; however, there are x and y with ln(x) < 1 < ln(y), and ln is continuous, so there is some z with ln(z) = 1; it follows that ln(z^a) = a, so just define exp(x) as z^x. You don't need any properties specific to e, just that such a z exists and is > 1. Phoenixia1177 (talk) 04:36, 20 April 2012 (UTC)
If you're looking for a non-rigorous, but more intuitive, argument, notice that your difference quotient can be written (1/h)·∫_x^(x+h) dt/t, which is the average of 1/t over the interval [x, x + h]. It makes sense that when h is 0 this average would be 1/x, since the interval, at that point, is just the point x; the average of a single value is just that value. Like I said, not rigorous, but I think it makes more sense than weird limit arguments; not sure if that's useful to you or not, though :-) Phoenixia1177 (talk) 04:50, 20 April 2012 (UTC)
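This averaging picture is easy to check numerically. The sketch below approximates the average of 1/t over [x, x + h] by a plain left Riemann sum (`average_of_reciprocal` is a name invented here) and compares it with the difference quotient:

```python
import math

def average_of_reciprocal(x, h, n=100_000):
    # Approximate (1/h) * integral from x to x+h of dt/t by sampling
    # 1/t at n evenly spaced points (a left Riemann sum average).
    step = h / n
    return sum(1.0 / (x + k * step) for k in range(n)) / n

x, h = 2.0, 0.5
# The average of 1/t over [x, x+h] matches (ln(x+h) - ln(x)) / h ...
print(average_of_reciprocal(x, h))
print((math.log(x + h) - math.log(x)) / h)
# ... and as h shrinks, the average tends to 1/x = 0.5.
print(average_of_reciprocal(x, 1e-4))
```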
Sorry to blather on, but I was thinking that you could make the above a real argument if you really wanted to. You could show that, for a step function whose interval divides up evenly for each value, the integral form gives the actual average. Then, from there, argue that as the interval shrinks to a point, you can bound the average above and below by a sequence of step-function averages that must limit to 1/x; or some such. Really, it'd just be bounding the limit like above, mixed with a really roundabout use of the definition of the integral as a limit of sums over step functions; not pretty, but it could make the above a legitimate proof. Sorry if that is not easy to read, by the way; I don't get time to use wiki outside of work, but am usually doing ten things at once at work... :-) Phoenixia1177 (talk) 04:57, 20 April 2012 (UTC)
To me, this seems to be the right approach given the constraints. For what it's worth, it's the mean value theorem for integrals (which some might say is "more fundamental" than the fundamental theorem—it is at least more intuitive). Sławomir Biały (talk) 12:13, 20 April 2012 (UTC)
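For reference, here is how the mean value theorem for integrals delivers the limit, sketched in LaTeX (the point c_h is a name introduced here):

```latex
% If f is continuous on [a, b], there is some c in [a, b] with
%   \int_a^b f(t)\,dt = f(c)(b - a).
% Applying this to f(t) = 1/t on [x, x + h] (with h > 0):
\frac{\ln(x + h) - \ln(x)}{h}
  = \frac{1}{h} \int_x^{x+h} \frac{dt}{t}
  = \frac{1}{c_h}, \qquad c_h \in [x, x + h].
% Since x \le c_h \le x + h, we have c_h \to x as h \to 0,
% so the difference quotient tends to 1/x.
```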

Proof of order of operations

How can we show, using the field axioms of, say, the real numbers, that operations should be performed in a certain order (i.e. multiplication before addition)? I tried manipulating the axioms, but it didn't really give me any insight...

We can't. It's just a matter of convention whether a + b·c means a + (b·c) or (a + b)·c. It is not 'provable'. --CiaPan (talk) 12:54, 20 April 2012 (UTC)
Gotcha, but I'm just surprised it's not a consequence of the axioms. I mean, what gives multiplication priority over addition, for instance? Suppose we wrote it out like (a + b)·c, instead of a + (b·c)? Why? — Preceding unsigned comment added by 77.100.75.68 (talk) 13:13, 20 April 2012 (UTC)
It's just the way we decided to write things. We could have all decided that all statements must be fully parenthesized to be well formed, but that would be more writing, so mathematicians settled on the conventions we use. They are merely conventions. Consider this: if we had a completely different notation, would we need different axioms, even though we are expressing the same idea? Or consider the statement a b 3 * +, which is a + b*3 in postfix. RJFJR (talk) 13:29, 20 April 2012 (UTC)
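The postfix point can be made concrete with a tiny stack evaluator (a hypothetical sketch; `eval_postfix` and its two-operator grammar are inventions for illustration). Note that no precedence rules appear anywhere: the order of operations is fully determined by the token order.

```python
def eval_postfix(expr, env):
    # Minimal postfix (RPN) evaluator: operands push onto a stack,
    # operators pop their two arguments.  No precedence convention is
    # needed, because the evaluation order is explicit in the notation.
    stack = []
    for tok in expr.split():
        if tok in ("+", "*"):
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if tok == "+" else a * b)
        else:
            stack.append(env[tok] if tok in env else float(tok))
    return stack[0]

env = {"a": 5, "b": 2}
print(eval_postfix("a b 3 * +", env))  # a + b*3 = 5 + 2*3 = 11
print(eval_postfix("a b + 3 *", env))  # (a + b)*3 = 21
```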
You might think of it as being a consequence of "notational" (or syntactic) axioms: the rules that specify (in a context-free grammar, say) how to interpret mathematical notation. Notation is just a convenience; Frege's Begriffsschrift does without notation (in some sense) by simply drawing trees. Similarly, programmers in Lisp-like languages, as RJFJR suggests, fully parenthesize their expressions, so they write (+ a (* b c)). But mathematicians have found it convenient to come up with a notation that is terser in the common case. Paul (Stansifer) 05:31, 22 April 2012 (UTC)

Recursively enumerable theory

"There are theories that are complete, consistent, and include elementary arithmetic, but no such theory can be effective." (quoted) Could you give any examples? --151.75.56.185 (talk) 15:27, 20 April 2012 (UTC)

True arithmetic. Widener (talk) 17:13, 20 April 2012 (UTC)