Talk:Measure-preserving dynamical system

Latest comment: 2 years ago by 2601:200:C000:1A0:65EE:1F3C:5AB:B9C0 in topic This would be helpful

Redirect

Why am I redirected from Kolmogorov entropy to this page? (User:128.227.48.154 on 17 Jan 2006)

Because it's the only page on WP at this time that even partially defines the Kolmogorov entropy (down near the bottom of the article). linas 18:17, 17 January 2006 (UTC)
Then it would be a good idea to put the term Kolmogorov entropy into the article to reduce confusion.
I thought it was Kolmogorov–Sinai entropy or KS entropy, if it had to be named after somebody. What about "topological entropy"? -- 130.94.162.61 23:51, 27 February 2006 (UTC)
That whole section should be split off and greatly expanded. There's so much more that could be said, using all sorts of different notations and terminology.... linas (talk) 04:00, 17 July 2012 (UTC)

problem with definition of generator

There seems to be a problem with the following definition:

"A partition Q is called a generator if μ-almost every point x has a unique symbolic name."

No matter what partition is chosen, every point has a unique symbolic name.



Consider the function f defined on the integers by f(x) = x + 1 for x odd and f(x) = x − 1 for x even. Let Q be the partition of Z into evens and odds. Then each element has the symbolic name EOEOEO... or OEOEOE... (On a side note: this is my first post, and I am not sure if this is the proper way to post on a Talk Page; any help would be appreciated:) Phoenix1177

Well, then f fails to generate!? linas (talk) 04:04, 17 July 2012 (UTC)
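The example above can be checked directly: distinct points share the same symbolic name under the even/odd partition, so the partition is not a generator for this f. A minimal sketch (helper names are made up for illustration):

```python
def f(x):
    """f(x) = x + 1 for x odd, x - 1 for x even, on the integers."""
    return x + 1 if x % 2 else x - 1

def symbolic_name(x, steps=6):
    """Record E/O (even/odd) membership along the forward orbit of x."""
    name = []
    for _ in range(steps):
        name.append('E' if x % 2 == 0 else 'O')
        x = f(x)
    return ''.join(name)

# The distinct points 0 and 2 get the same symbolic name,
# so the even/odd partition cannot separate them.
print(symbolic_name(0), symbolic_name(2))  # EOEOEO EOEOEO
```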

Discussion

What is written there in Discussion is completely misleading; it is definitely not the reason why one defines measure-preserving transformations via the inverse of T. Even if one asked that μ(T(A)) = μ(A), there would still exist many such T, and many of them are even ergodic (or even mixing). For example, have a look at certain interval exchange transformations; as they are all bijections, they preserve the underlying measure (Lebesgue in this case) in both directions. The only true reason why we define measure preservation via inverse images is that any set-to-set mapping which (i) preserves intersections, unions and complements and (ii) preserves the measure μ (observe that these are exactly the properties we need) is of type T⁻¹ for some surjective T, while the construction via forward images of T does not cover all such possible set-to-set maps.--140.78.94.103 (talk) 10:14, 14 March 2008 (UTC)

I've removed this nonsense. Arcfrk (talk) 10:38, 21 March 2008 (UTC)
Perhaps some variant of the above comments should be added to the article. linas (talk) 03:43, 17 July 2012 (UTC)
Done. 67.198.37.16 (talk) 20:15, 18 September 2020 (UTC)

Example of Measure-theoretic Entropy

In the section on measure-theoretic entropy it says that the KS entropy of the Bernoulli process is log 2. Shouldn't this be the Bernoulli map? — Preceding unsigned comment added by 130.216.209.138 (talk) 06:15, 17 April 2012 (UTC)

The map and the process are the same thing, just using different notation, right? Well, almost the same thing: the process is explicitly defined on the Cantor set, whereas the map is defined on the reals. But the reals are just a quotient space of the Cantor set, so they are more or less the same thing; they differ by a set of measure zero... linas (talk) 03:39, 17 July 2012 (UTC)
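One way to see the identification concretely: the doubling map acts as the left shift on binary digits, so reading off the leading digit before each application recovers the binary expansion of the starting point. A quick sketch (illustrative, not from the article; the starting value is chosen to be exactly representable in binary):

```python
def doubling(x):
    """The map T(x) = 2x mod 1 on the unit interval."""
    return (2 * x) % 1.0

x = 0.8125  # 0.1101 in binary, exactly representable as a float
bits = []
for _ in range(4):
    # The leading binary digit of x is 1 exactly when x >= 1/2.
    bits.append(1 if x >= 0.5 else 0)
    x = doubling(x)
print(bits)  # [1, 1, 0, 1] -- the binary expansion of 0.8125
```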

Incorrectness

"since every real number has a unique binary expansion" —Preceding unsigned comment added by 132.72.45.223 (talk) 11:16, 9 November 2010 (UTC)

Performed a lazy man's fix of linking almost every instead. linas (talk) 03:48, 17 July 2012 (UTC)

Example of measure-preserving map T

I believe the example x -> 2x mod 1 not to be an example of a measure-preserving map. Consider for instance the interval [0.1, 0.9] whose preimage under T would be [0.05, 0.45], with Lebesgue measures of 0.8 and 0.4 respectively.
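For what it's worth, the full preimage of [0.1, 0.9] under T(x) = 2x mod 1 has two branches, not one, and including both restores the measure. A quick check (illustrative only):

```python
# T(x) = 2x mod 1 has two inverse branches on [0, 1): x/2 and (x + 1)/2.
a, b = 0.1, 0.9
branch1 = (a / 2, b / 2)              # [0.05, 0.45]
branch2 = ((a + 1) / 2, (b + 1) / 2)  # [0.55, 0.95]
preimage_measure = (branch1[1] - branch1[0]) + (branch2[1] - branch2[0])
print(preimage_measure)  # ~ 0.8 = b - a, so Lebesgue measure is preserved
```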

TODO: Metric entropy

This article defines measure-theoretic entropy, but not metric entropy. 67.198.37.16 (talk) 20:16, 18 September 2020 (UTC)

This would be helpful

The definition of measure-theoretic entropy relies on the function f(x) = -x log(x).

It would be very helpful if someone knowledgeable on the subject would include in the article an explanation of why this particular function is used. 2601:200:C000:1A0:65EE:1F3C:5AB:B9C0 (talk) 01:12, 10 August 2021 (UTC)
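As a point of comparison: the role of f(x) = -x log(x) is that summing it over the measures of the partition elements gives the entropy of the partition, and for the fair-coin partition this yields log 2, matching the Bernoulli value discussed above. A minimal sketch (the function name is made up for illustration):

```python
import math

def partition_entropy(probs):
    """H(Q) = sum over partition elements of -p log p; terms with p = 0 contribute 0."""
    return sum(-p * math.log(p) for p in probs if p > 0)

print(partition_entropy([0.5, 0.5]))         # log 2, about 0.6931
print(partition_entropy([0.25, 0.25, 0.5]))  # 1.5 * log 2, about 1.0397
```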