Talk:Boltzmann constant/Archive 4



Maxwell and Boltzmann and the box with fast atoms all at the same velocity

We seem to be going in circles. Perhaps it would break the cycle if we do this step by step. Damorbel raised an excellent question: "What do you think the Maxwell-Boltzmann distribution is?" Assuming it was intended rhetorically, it leads naturally into another. Damorbel, what would you say the units are for the average of the Maxwell-Boltzmann distribution? --Vaughan Pratt (talk) 22:19, 3 November 2011 (UTC)

Vaughan Pratt, why do you want to start a discussion on the Maxwell-Boltzmann distribution here? --Damorbel (talk) 06:52, 4 November 2011 (UTC)
Why? In order to illustrate that "temperature", to be a meaningful concept, requires a system of many parts in equilibration with another system at the same temperature. What determines such a "temperature" is the amount of entropy generated in a system for each bit of energy added or subtracted. Heat will flow in the direction in which the largest entropy change is generated per unit of transferred heat energy, since that process is the one that causes an increase in entropy in the universe, which means it happens spontaneously. That is all that "temperature" means: it defines the direction of heat flow. Heat flow is a meaningless concept when talking about individual atoms or particles, and one cannot even figure out the direction of heat flow if the parts of two reservoirs or objects that are in contact are not first in thermodynamic equilibrium within themselves. Each part that touches must have a "temperature." The heat flow direction defines these temperatures; temperature is a shorthand for figuring out how much potential systems have to transfer heat to some other object, after their energy contents are separately equilibrated.

For example, consider a 22.4 L box whose interior is at absolute zero, into which two 10 gram crystals of solid neon (also at absolute zero, or as close to it as you like) have been fired by cannons in opposite directions, at 1100 m/s relative to the box. As they head toward impact with each other in the center of the box, it is sealed. Question: what is the temperature of this system, according to you? What is the temperature of each neon atom, as seen in the lab frame? What is the entropy of the system?

Suppose the crystals miss each other and bounce between the walls of the box, perfectly elastically. If thermal contact could be made between this box and another box of neon of the same size, filled with gas at STP, which direction would heat flow?

Now, suppose the crystals are allowed to break up and fill the box with neon gas as their kinetic energy is distributed randomly, in a Maxwell-Boltzmann distribution. Has the temperature of the system changed? Has the entropy changed? What is the direction of heat flow now between this box and the box holding the same amount of neon at STP?

Note: I suppose all this amounts to much the same as asking what would happen if Maxwell's demon managed to take the mole of atoms in a 22.4 L box of neon at standard temp and pressure, and get them going all at the same velocity, in 3 orthogonal directions, all while conserving momentum and energy. Clearly the entropy changes. Does the temperature change? That's a question for you. Starting with the crystals, I've merely run that thought experiment in the opposite direction, in the direction of spontaneity. SBHarris 19:16, 4 November 2011 (UTC)

Sbharris, nothing that you write above concerns the Boltzmann constant or its revision by the International Committee for Weights and Measures. --Damorbel (talk) 09:14, 5 November 2011 (UTC)
Alas, nor does anything you've written, so long as you continue to insist that "temperature" is anything other than a statistical property of an aggregate system, as entropy is. This affects how constants are measured. The Boltzmann constant is always inseparably connected to temperature, because it connects the temperature and energy scales. However, although there are some physical constants that can be measured from properties of single atoms in traps, the Boltzmann constant, since it requires a connection to "temperature," is not one of them. Never has been, and it never will be. Got that? If you think otherwise, get over it. SBHarris 21:09, 5 November 2011 (UTC)
How then does your argument square with the definition of the Boltzmann constant as "the kinetic energy of a particle", any particle - it doesn't have to be an atom or a molecule - "at a temperature T"? The particle can easily be as big as a pollen grain (for particles many times larger than an atom or a molecule see Brownian motion and 'the Feynman Lectures on Physics', v1 p41-1 ff); the only requirement is for the particle, whatever its size, to exchange energy with other particles (also of any size) by means of random collisions. I put my often repeated question to you - 'what size do these particles have to be for you to recognise them as having a temperature, according to the definition of the Boltzmann constant?'
PS For simplicity I am not considering 'numbers of degrees of freedom per particle' here, a particle's number of degrees of freedom is a separate matter. --Damorbel (talk) 10:44, 6 November 2011 (UTC)
If a pollen grain is moving at the speed of sound, what is its temperature? My understanding is that temperature is not defined when only the speed is known. As you implied above, temperature is related to collisions, which itself implies a large number of particles. So, it is not the size of the particles that matters, but the number of particles per unit volume. This leads immediately to an interesting question - What is the temperature of empty space? Q Science (talk) 20:15, 6 November 2011 (UTC)
"If a pollen ... what is its temperature". The short answer is that it depends on the mass of the particle, because you have defined the velocity; the energy of any particle is related to temperature by the product of kB and T. In kinetic theory the particles collide with each other and with the walls of the vessel confining them; this means their speed and direction are constantly changing and, since they are confined in some sort of container, their net velocity is zero.
If you check kinetic theory you will see that the kinetic energy 1/2 mv^2 is equal to 3/2 kBT; you can see from this that the temperature is independent of the mass m of the particle, but the velocity v is proportional to the square root of the temperature (and so is the speed of sound). There is a better guide to gases and kinetic theory here [1] --Damorbel (talk) 21:38, 6 November 2011 (UTC)
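The scaling claimed here is easy to check numerically. Below is a minimal Python sketch (the neon example and the temperatures are my own illustrative choices, not from this discussion) of the rms speed that follows from 1/2 m<v^2> = 3/2 kBT:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI)

def rms_speed(T, m):
    """RMS speed from (1/2) m <v^2> = (3/2) kB T, i.e. v_rms = sqrt(3 kB T / m)."""
    return math.sqrt(3.0 * kB * T / m)

m_neon = 20.18e-3 / 6.02214076e23  # kg per neon atom (molar mass / Avogadro)

print(rms_speed(300.0, m_neon))                              # ~609 m/s at 300 K
print(rms_speed(1200.0, m_neon) / rms_speed(300.0, m_neon))  # 2: v is proportional to sqrt(T)
```

Note that the mass enters the speed but not the energy: quadrupling T doubles v_rms for any particle mass.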
Which is hotter, a one gram hailstone moving at 1,000 m/s or a one gram red-hot cinder moving at 2 m/s ? SpinningSpark 22:37, 6 November 2011 (UTC)
Spinningspark, the best picture I can paint for you is the Armour-piercing discarding sabot. This weapon is fired from a gun when the propellant (at low temperature) is ignited, which turns the propellant and its stored energy into a hot, high-pressure gas. This gas expands in the gun's barrel, doing work on the projectile by accelerating it out of the barrel. 'Doing work on the projectile' cools the gas because its thermal (random kinetic) energy is changed into the (forward) kinetic energy of the projectile; the projectile remains fairly cool while in the air, but it is carrying a large amount of the (thermal) energy of the gas with it, because it is perhaps 2 kg of uranium moving at between 1 and 2 km/s, thus about 6 MJ. When the projectile hits its target the velocity is reduced to zero by friction, and the energy, which originally came from the propellant, is released as heat in a very small area of the target.
Thus the chemical energy in a cold compound (the propellant) is transferred to a (hot) gas (kinetic energy of the gas particles), then to a (fairly) cool projectile, giving it a lot of forward kinetic energy, and finally to a very hot target (hot because of a lot of thermal (kinetic) energy).--Damorbel (talk) 07:20, 7 November 2011 (UTC)
Damorbel, I agree with your description except that the velocity should be the average velocity and not the instantaneous velocity. The instantaneous velocity can be anything from zero to an extremely high value. However, temperature is defined to be related to only the average velocity assuming that the instantaneous velocity follows a Maxwell distribution. Your reference supports this interpretation. Q Science (talk) 23:08, 6 November 2011 (UTC)
I don't see how you can say that the temperature of a particle can only be related to the average particle energy. Sure, when you are trying to measure temperature with a thermometer, what you measure is the average energy of the particles, but that is a limitation of the thermometer, not a requirement of the physics. Other ways of measuring temperature (defined as 'energy per particle') are more sensitive, one example being the photoelectric effect, where electrons are ejected from a material only if the impacting particle has sufficient energy. --Damorbel (talk) 07:20, 7 November 2011 (UTC)

Energy, not temperature. You can't talk about the temperature of an individual photon, either. Energy, yes. Temperature, no. SBHarris 07:28, 7 November 2011 (UTC)

According to Feynman ('the Feynman Lectures on Physics', v1 p41-1 ff) temperature is 'energy per particle'. He is not the only one, but I venture to say he is a reliable source. Now perhaps you wish to check if photons are particles, we can follow that path if you like, please say if you so wish; however on the basis of energy 'per particle' photons also have a temperature, just as they have momentum even though they have no mass! --Damorbel (talk) 09:38, 7 November 2011 (UTC)
My copy of Feynman is not searchable and I can not find your quote. However, the paper continuously refers to "mean kinetic energy". Q Science (talk) 14:46, 7 November 2011 (UTC)
On p41-1 l.15 (of the small text) Feynman writes "the mean kinetic energy of a small particle suspended in a liquid or a gas will be 3/2kT even though...". Now the kinetic energy is 1/2 mv^2. I don't quite see why the mean energy of a large number of atoms should represent the temperature but the instantaneous energy should not represent the instantaneous temperature. Most thermometers are slow to react and need time to measure a temperature, but slow reaction is not a requirement for a temperature to exist; it is not the thermometer that produces the temperature. The possibility exists (and temperature-sensitive molecular interactions are a good example) of a method of determining the temperature of an individual particle. Further, the fact that a large number of molecules reach a temperature such as a melting point means that they have individually acquired enough energy to break the crystalline bonds holding the solid together; it does not mean that a mole or some other large measure is needed for a melting point to be established. --Damorbel (talk) 16:00, 7 November 2011 (UTC)
Because A implies B cannot be used to argue that B implies A. Because the mean energy per particle can be calculated from the temperature doesn't imply that the energy of a single particle tells you anything about the temperature. As in my example above, you can calculate GDP per capita (mean income) for a country, but no one person has a "mean income." Individual people have individual incomes. Temperature is already a mean quantity, and no individual quantity tells you anything about it, any more than I can take the income of a single person and try to infer a national mean income from it.

Try going back to your idea that individual photons have a temperature. Certainly a collection of photons at different frequencies in a blackbody distribution have a collective temperature, but they are a collection. What about one photon of given energy? What about a monochromatic beam of photons at a single energy?

This is not an academic exercise. Your microwave oven heats things that are already so hot they are emitting in the infrared (at about 0.001 cm). But the photons from the oven have a wavelength of several cm. So why does the energy go from oven to food, and not the other way around? Could it be because you're not talking about two things of different temperature at all? The food has a temperature, but what about the microwave output? Remember, the photons from the Big Bang have a wavelength of about 0.1 cm, so your microwave oven photons are "colder" than that, if you insist that their temperature is an individual thing, and even when there are great numbers, it doesn't require a distribution. So by your theory, they shouldn't be able to heat anything much above 0.2 Kelvin or so. My oatmeal gets considerably hotter. In fact, you can put a neon glow tube into a microwave (with a cup of water next to it for a load) and see the neon plasma ionize at thousands of degrees K. What's going on, pray tell? Something wrong with your theory?

A beam of monochromatic microwaves is a lot like a beam of atoms all moving in one direction at one velocity. You think it has a "temperature" you can calculate as energy/particle. I'm here to tell you that since it has an entropy/energy as low as you like (depending on how monochromatic it is) it can induce temperatures as high as you like. Think about that next time you see plasma sparks at very high temps, from metal points inside your microwave oven. SBHarris 17:29, 7 November 2011 (UTC)
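The microwave argument above can be put in rough numbers. If a single photon were assigned a "temperature" by equating its energy hc/λ with kBT, an oven photon would come out far colder than the food it heats. A hedged Python sketch (the 12.2 cm oven wavelength, corresponding to roughly 2.45 GHz, is my own illustrative figure):

```python
kB = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
c = 2.99792458e8    # speed of light, m/s

def naive_photon_temperature(wavelength_m):
    """The naive 'energy per particle' reading: the T for which kB*T = h*c/lambda."""
    return h * c / (wavelength_m * kB)

print(naive_photon_temperature(0.122))  # ~0.12 K for a microwave-oven photon
print(naive_photon_temperature(0.001))  # ~14 K for a photon at the 0.1 cm scale mentioned above
```

Yet such a beam can ionize neon at thousands of kelvin, which is the point: a monochromatic beam's heating ability is set by its low entropy per unit energy, not by its energy per photon.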

Sbharris When your neon tube is put in the microwave, it is the electric field generated by a very large number of electrons oscillating in a magnetron oscillator that ionises the gas, just as it can make sparks (sparks are a high-pressure version of a glow discharge).
Further you write:- "A beam of monochromatic microwaves is a lot like a beam of atoms all moving in one direction at one velocity. You think it has a "temperature" you can calculate as energy/particle. I'm here to tell you that since it has an entropy/energy as low as you like (depending on how monochromatic it is) it can induce temperatures as high as you like." ---- I'm not sure about the microwave beam being equivalent to a beam of particles - particles do not travel at the speed of light - but for the rest you are correct. A large collection of mono-energetic particles does not have a Maxwell-Boltzmann distribution of energy; the system is in severe disequilibrium. The energy that your beam's particles have will appear somewhere in a Maxwell-Boltzmann distribution, a distribution that will have a peak at a wavelength corresponding to some temperature, and this is the maximum temperature your beam can heat anything to.--Damorbel (talk) 13:22, 8 November 2011 (UTC)
Damorbel, I found the second quote in my copy of Feynman. Thanks. It is my understanding that temperature is defined as a bulk property of matter and is not defined for individual atoms. Think of this as just one of the conventions of language. As far as I can tell, most of the editors here, as well as Feynman, are using this convention. Q Science (talk) 20:41, 7 November 2011 (UTC)
Q Science, your argument leaves me baffled! The formula for the energy of a particle with three degrees of freedom is E = 3/2 kBT, where T is temperature; by rearranging, T = 2/3 E/kB, so the temperature T of the particle is explicit - it isn't the temperature of any other particle, how can it be? The basic principle of kinetic theory is that the particles making up the ensemble are independent except for random collisions. In a given sample the particles have a range of energies; if the energy per particle is expressed as 1/2 mv^2 or 3/2 kBT, what is the difference? The collision process is always going to produce a wide range of particle energies; calling it 'kinetic energy' or 'particle temperature' is identical.--Damorbel (talk) 13:01, 8 November 2011 (UTC)
The difference is that you can't seem to understand the difference between the energy of a single particle E, and the average (mean) energy of a particle in a collection of particles <E>. You're told about the latter, and you assume the former. But they are not the same. Feynman says in vol. 1 chapter 41 "the mean kinetic energy of a small particle suspended in a liquid or a gas will be 3/2kT even though...". Do you see the word mean? He is talking about the average energy per particle, not the energy of each particular particle (since obviously every particle doesn't have the average energy). You write "The formula for the energy of a particle with three degrees of freedom is E = 3/2 kBT where T is temperature," but that is incorrect, and is not the one Feynman gives in chapter 39 on gas kinetic theory. That E is the average energy per particle, not the energy of a particle (any given particle). The formula you give is wrong as you define the variables. E there cannot be the energy of a single particle. Rather, E there in that formula is <E>, the mean/average E. No text you have says anything else, yet you refuse to believe your texts. You refuse to believe Feynman, even though you quote him! Why don't you look at chapter 39 instead of chapter 41, where you will find that every time Feynman talks about the energy of gas molecules in the context of temperature, he uses <E>, the mean energy. And he defines temperature in terms of the mean energy. That's how temperature is defined. Once it is defined, any energy associated with temperature must be a mean energy. Several people have told you that above, and it's not penetrating!
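The distinction between E and <E> can be illustrated numerically: sampling velocity components from the equilibrium (Gaussian) distribution gives individual kinetic energies scattered widely, while their mean comes out at 3/2 kBT. A small Python sketch (the sample size, seed, and neon mass are my own assumptions):

```python
import random
import statistics

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0          # K, illustrative temperature
m = 3.35e-26       # kg, roughly the mass of one neon atom

# In equilibrium each velocity component is Gaussian with variance kB*T/m.
sigma = (kB * T / m) ** 0.5
random.seed(42)  # reproducible sample

energies = []
for _ in range(100_000):
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    energies.append(0.5 * m * (vx * vx + vy * vy + vz * vz))

mean_E = statistics.fmean(energies)
print(mean_E / (1.5 * kB * T))   # ~1.0: the *mean* energy is (3/2) kB T
print(max(energies) / mean_E)    # several times the mean: no single particle "has" T
```

The second line is the whole dispute in one number: individual particle energies range from near zero to many times <E>, so only the ensemble average corresponds to the temperature.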

I'm disappointed in you about the microwaves. Yes, photons are particles (see photoelectric effect and Compton effect). You can have one photon. You can't switch to talking about the electric field of an EM wave when it suits you, but talk about individual particles of light (as you do above-- you brought the subject up) when you think that will serve your argument. Nature must do the same thing no matter which way you choose to view it/her.

Yes, you're quite right that the maximum temperature a beam can heat anything to is the slice out of the Stefan-Boltzmann power law it represents for a black body, but that depends on the narrowness of frequency, and the beam's maldistribution of frequencies away from a black-body spectrum (which is the analog of the Maxwell-Boltzmann distribution for boson photons). But this does not fit your definition of temperature. First, your definition clearly doesn't work for a single photon. Second, it allows me to adjust the "heating power" of a beam of a given energy simply by making it more monochromatic, while keeping its power the same. If "temperature" is simply energy per particle in any circumstance (or an average of each particle energy for any particle collection), I shouldn't be able to do that, since the number of particles and the energy are not changed by this, yet the heating power of the beam (what it can do to the temperatures of objects) is.

Finally, even if you don't accept the quantum theory of EM radiation, you can replace a microwave beam with a beam of atoms, all moving at a velocity which has a very narrow spectrum (that is, they are all going at nearly the same velocity, as closely as physically possible). I did that in my previous examples and you missed the point. The smaller you make such a beam's velocity spread, the hotter that beam can heat another collection of gas you aim it at. Do you understand this point? If I shoot a beam of atoms at a bottle of gas, even if the beam atoms have far less kinetic energy than the average energy of the molecules in the gas in the bottle, I will heat the bottle and the gas, so long as the power of the atom beam exceeds the power of the velocity-spread channel in the Maxwell-Boltzmann distribution that I aim it at (and assuming that the incoming gas molecules can bounce off the bottle, so they can transfer some energy without having to stay behind and be equilibrated). If the atoms in an object are constrained (as in solid objects), a bombardment by atoms of any temperature - even close to absolute zero - will heat the object, so long as the density of the beam is high enough. That's more or less what you do when you throw a baseball at a wall! The baseball can be cooler than the wall (in terms of what you measure when you stick a thermometer in both), and can even be colder when you count its kinetic energy, and yet it will still heat the wall when it strikes, and you can keep that up till the wall is very hot. The point is that you can arrange this with a baseball so the energy transfer is only in one direction, because the atoms in the baseball are only moving in one direction.

You could go farther and hit a paddle-wheel with slow baseballs outside a container of gas, and this wheel could turn a crank connected to a fan inside the box that heats the gas by friction. This works no matter what the speed of the baseball, since the wheel always turns in only one direction, even with a slow cold ball. You could do the same with a gas beam, turning a paddle wheel outside the box connected to a fan inside. You begin to see the role entropy plays in energy transfer, since none of this would work if the gas outside didn't have not only energy per particle, but DIRECTED energy per particle (low entropy per energy).

The gas certainly has a temperature, but what about the beam? (or the cold baseball?). If you're going to insist the beam does also, and insist that it is defined as the mean kinetic energy of the beam atoms (even though it is maldistributed) then you must say this is a situation in which heat is transferred from cold to hot, spontaneously. It's exactly the same situation as a microwave beam or laser directed into a hole in a black body radiator. Thus, we either need to give up the second law of thermodynamics (heat goes from hot to cold, those being defined in terms of temperature), or else we must give up your (private) definition of "temperature." Which is it to be? SBHarris 16:14, 8 November 2011 (UTC)

User Sbharris you wrote "Do you see the word mean?" My response [2] to Heald makes it quite clear that, over time, all particles in an ensemble will have the same average temperature; also, at any instant, the sum of the energies (which have the Maxwell-Boltzmann distribution) of all the particles, divided by the number of particles, gives the ensemble temperature. BTW you write "That E is the average energy per particle, not the energy of a particle." Do you mean by this that "the energy of a particle" is not an allowable concept? That a photon cannot be considered as a particle with an energy E = hν? That the mechanical energy of (massive) particles cannot be E = 3/2 kBT?
Further you write "You can't switch to talking about the electric field of an EM wave when it suits you". Only when it is appropriate. For a microwave oven the photon energy is about 10^-5 that of infrared (IR) photons. So to get any useful transfer of energy at microwave frequencies the magnetron has perhaps 10^20 electrons oscillating in its cavities; counting the photons is possible, but the old-fashioned way of measuring the electric and magnetic fields (or electromagnetic field) due to the electrons and their flow is much more familiar and has few if any disadvantages.
Finally I have not yet had an answer to the question I posed about ensembles of particles not in equilibrium, thus not having a Maxwell-Boltzmann distribution. What is the temperature of such an ensemble? What is the meaning of the average energy of the particles in such an ensemble? --Damorbel (talk) 14:11, 9 November 2011 (UTC)
And another final question that has not been answered. If an atom or a molecule is too small to have a temperature according to the definition of the Boltzmann constant, what then is the smallest particle that can have a temperature? --Damorbel (talk) 18:04, 9 November 2011 (UTC)
It isn't that an atom is too small, the problem is the number of atoms per unit volume. Given a container, if the probability of an atom hitting another atom is "much larger" than the probability of hitting the wall of the container, then there are enough atoms. And no, I don't have a value for "much larger". BTW, this is why I think there is a problem providing a temperature for the thermosphere (more than 1,500 °C), there aren't enough molecules per unit volume. For instance, "above 160 kilometers ... the density is so low that molecular interactions are too infrequent to permit the transmission of sound". Q Science (talk) 07:44, 10 November 2011 (UTC)
Q Science, you write "the problem is the number of atoms per unit volume" Yes I agree completely. For the Maxwell-Boltzmann distribution to apply the particles must be exchanging energy in a random way, if their density is so low that they are exchanging momentum with the container walls more often than with each other this is not the case. You point out that "above 160 kilometers ... the density is so low that molecular interactions are too infrequent to permit the transmission of sound", too true! The transmission of sound requires the exchange of momentum by the particles of gas, that is why the speed of sound is a function of the square root of the gas temperature T, but only if the particles are able to exchange (most of their) energy (momentum) by collision, no collisions = no sound - at all! --Damorbel (talk) 13:11, 10 November 2011 (UTC)
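The sqrt(T) dependence of the speed of sound mentioned just above is quick to verify for an ideal gas, where v = sqrt(γRT/M). A Python sketch (the dry-air values for γ and M are my own illustrative choices):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K); note R = NA * kB

def speed_of_sound(T, gamma=1.4, M=0.028964):
    """Ideal-gas speed of sound, v = sqrt(gamma * R * T / M).
    Defaults are approximate values for dry air."""
    return math.sqrt(gamma * R * T / M)

print(speed_of_sound(293.15))                                # ~343 m/s at 20 C
print(speed_of_sound(4 * 293.15) / speed_of_sound(293.15))   # 2: v is proportional to sqrt(T)
```

Of course, as the exchange above notes, the formula presupposes enough collisions per unit volume for sound to propagate at all.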

Hello everyone. Just to let you know, I have made a minor edit and replaced the reference "Boltzmann's constant" with "the Boltzmann constant", merely to ensure consistency throughout the article. However, I did not touch the quotation from Planck's Nobel Lecture - which proved to be accurate, nor the occurrence in the title of item #7 in the "References" list - which I cannot verify. --NicAdi 14:35, 29 December 2011 (UTC)

Archived

  • I have archived the hugely long discussion on this page. Since no one is offering sourced proposals for improving this article it really does not belong here per WP:TALK. Please do not reopen it. SpinningSpark 19:04, 7 November 2011 (UTC)
  • Please do not act unilaterally without seeking consensus, as (last I checked) nobody died and made you king. Per WP:BRD, other comments are needed.

    Personally, I'm learning a lot from this discussion about how to present this difficult subject to readers who don't understand it well. SBHarris 19:26, 7 November 2011 (UTC)

  • Spinningspark, I don't agree with your archiving. You complain about the lack of sources; there are plenty of sources, too many is not necessarily helpful. You don't comment on the sources given, how do you maintain this position? --Damorbel (talk) 19:49, 7 November 2011 (UTC)
  • Wikipedia is not a forum. It's not a place to learn "how to present this difficult subject to readers who don't understand it well", valuable though that may be. Damorbel has had long enough on this (policy would say far too long), and so far hasn't put up a single source that explicitly supports his contention.
  • Let's give Damorbel a chance to say whether he accepts SBHarris's last point about the microwave, so we can put this whole thing to bed with consensus achieved. That would be the best outcome. But if Damorbel doesn't accept it, I say let's call the discussion dead there, and archive it on that basis. Jheald (talk) 20:18, 7 November 2011 (UTC)
  • My problem with the "archive" is that it blanked the history page. On other pages, that does not happen. Please fix it. Q Science (talk) 20:45, 7 November 2011 (UTC)
  • Yeah, archiving should be by copy-and-paste, not redirect. We're going to need a history-merging specialist to patch this up. Jheald (talk) 20:56, 7 November 2011 (UTC)
  • Worse than a redirect-- this was an "archive" by page-move and redirect, which effectively moved the page's revision history [3] to the archive page's history, rather than leaving it at the main article talk page revision history, where it belongs. So per WP:MOVE it's going to take an admin to fix it straightforwardly. The real problem is that a new page for Talk:Boltzmann constant was created, and thus its page-change history along with it. Now it can't be written over by an ordinary editor. What a FUBAR. SBHarris 23:35, 7 November 2011 (UTC)
  • {{Histmerge|Talk:Boltzmann constant/Archive 1}}{{Histmerge|Talk:Boltzmann constant/Archive 2}}
    I added the two Histmerge tags and a comment at Wikipedia:Cut_and_paste_move_repair_holding_pen Q Science (talk) 06:52, 8 November 2011 (UTC)
  • I have now archived 4 old discussions back to Talk:Boltzmann constant/Archive 2. Anthony Appleyard (talk) 10:35, 8 November 2011 (UTC)
Anthony Appleyard - thanks! --Damorbel (talk) 12:31, 8 November 2011 (UTC)

Latest Revision

I have removed the text:-

, which must necessarily be observed at the collective or bulk level

This is quite wrong. There is no need to 'observe temperature at the bulk level'.

Temperature is an intensive quantity, which means it is independent of the amount (number of particles) present.

Surely it takes only a little imagination to realise that temperature is the measure of the energy of a particle, and the average energy of N particles when there are a number (N) of particles in the system? (The system must have only one temperature, i.e. it must be in equilibrium, of course.)

As for 'observe', what is that supposed to mean? In quantum systems, particles with insufficient energy are unable to initiate reactions; check Einstein's Nobel-prize-winning photoemission paper of 1905. In this paper he showed how only photons with sufficient energy (= high enough temperature) were able to displace electrons. --Damorbel (talk) 15:48, 5 December 2012 (UTC)

Latest Revision (2)

The argument for the revision above also applies to all relationships in the article between microscopic (particle-level) effects and macroscopic (or bulk) systems.

Macroscopic systems have an undefined number of particles; a Mole (unit) is an example, with NA (the Avogadro number) of particles, but a number 10^6 smaller changes nothing.

Having a large number of particles in a system places an additional requirement on the (Boltzmann constant) relationship between particle energy and temperature. To establish a measurable temperature for such a system, the average energy of the particles must be related to the temperature by the Boltzmann constant.

I have removed or modified a number of texts in the article :-

1/ The Boltzmann constant, k, is a bridge between macroscopic and microscopic physics, since temperature (T) makes sense only in the macroscopic world, while the quantity kT gives a quantity of energy which is on the order of the average energy of a given atom in a substance with a temperature T.

It is not "a bridge.... etc." There are various requirements for systems with > 1 particle; one is that the particles are able to exchange energy in a random way - this is the basis of thermal equilibrium, and the Maxwell-Boltzmann distribution describes the energy distribution of such a system. Another multi-particle system with a defined temperature is one where all the particles have the same energy; this is unusual because it is not an equilibrium condition, but it can arise at low densities where the particles seldom exchange energy through collisions.

This statement:-

while the quantity kT gives a quantity of energy which is on the order of the average energy of a given atom in a substance with a temperature T.

is wildly untrue!

In particle physics it is the molecule, not the atom, that is the relevant unit. Polyatomic molecules such as carbon dioxide have a much higher heat capacity than monatomics such as helium; diatomics (H2, O2) are in between. To understand the reasons why, the article Degrees of freedom (physics and chemistry) helps explain.

Please, gentlemen, argue and discuss before inserting such statements!

Therefore the section "Bridge from macroscopic to microscopic physics" contributes nothing to the Boltzmann constant article and will be deleted. --Damorbel (talk) 07:38, 6 December 2012 (UTC)

"Role in the equipartition of energy" ?

The Boltzmann constant will shortly replace the Kelvin as one of the seven base units in the International System of Units (SI) because it can be determined more accurately than the triple point (see http://iopscience.iop.org/0026-1394/48/3/008/pdf/0026-1394_48_3_008.pdf)

Currently the Boltzmann constant can be determined to 6 places (kB = 1.380 651(17) × 10^−23 J K^−1) by measuring Johnson noise. There is no way the Boltzmann constant can be regarded as:-

the thermal energy carried by each microscopic "degree of freedom" in the system is on the order of magnitude of kT/2

There are a number of notable features here: the constant is measured using the electron temperature, not the gas temperature, so the article, in limiting itself to atoms of gas, is far too restrictive.
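(A hedged sketch of the Johnson-noise route to k mentioned above; the numbers below are synthetic, for illustration only, not measurement data. Nyquist's formula gives the mean-square thermal noise voltage across a resistor R at temperature T over bandwidth Δf as ⟨V²⟩ = 4kTRΔf, so a measured ⟨V²⟩ can be inverted for k.)

```python
# Sketch: inferring k from Johnson (thermal) noise via Nyquist's formula
#     <V^2> = 4 * k * T * R * df
def boltzmann_from_johnson_noise(v_squared, T, R, df):
    """Infer k from a measured mean-square noise voltage (all SI units)."""
    return v_squared / (4 * T * R * df)

# Synthetic example: the noise a 10 kOhm resistor at 300 K would show
# over a 10 kHz bandwidth, then recover k from it.
k_true = 1.380649e-23
T, R, df = 300.0, 10e3, 10e3
v_sq = 4 * k_true * T * R * df
print(boltzmann_from_johnson_noise(v_sq, T, R, df))  # recovers k_true
```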

The Boltzmann constant is applicable to all particles, even grains of pollen, as observed by Einstein in his 1905 paper on Brownian motion entitled :-

On the Motion of Small Particles Suspended in a Stationary Liquid, as Required by the Molecular Kinetic Theory of Heat --Damorbel (talk) 08:09, 6 December 2012 (UTC)
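(A sketch of the Brownian-motion connection cited above, not part of the original comment: the Stokes-Einstein relation from Einstein's 1905 paper gives the diffusion coefficient of a small sphere of radius r in a fluid of viscosity η as D = kT/(6πηr). Observing the diffusion of pollen-sized grains therefore gives access to k itself. The parameter values below are illustrative assumptions.)

```python
import math

# Sketch: Stokes-Einstein diffusion coefficient D = k*T / (6*pi*eta*r)
k = 1.380649e-23  # Boltzmann constant, J/K

def diffusion_coefficient(T, eta, r):
    """Diffusion coefficient of a sphere of radius r (SI units)."""
    return k * T / (6 * math.pi * eta * r)

# A 1-micron-radius grain in water (eta ~ 1e-3 Pa s) at 293 K:
D = diffusion_coefficient(293.0, 1e-3, 1e-6)
print(D)  # on the order of 2e-13 m^2/s
```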

Role in the equipartition of energy (2)

The section Role in the equipartition of energy has no content about the equipartition of energy. I have inserted a link. --Damorbel (talk) 08:58, 6 December 2012 (UTC)

I have changed the link into a {{main}} template.
Since you're aware of the article on Equipartition of energy, which is a good thing, can I direct your attention to the section Equipartition of energy#Failure_due_to_quantum_effects.
It's the section which starts

The law of equipartition breaks down when the thermal energy kBT is significantly smaller than the spacing between energy levels. Equipartition no longer holds because it is a poor approximation to assume that the energy levels form a smooth continuum, which is required in the derivations of the equipartition theorem above.

This is why it is a mistake to think of temperature as the average energy per degree of freedom. In quantum systems, where the energy levels no longer form a smooth continuum, that ceases to be true, and one needs to move to a more sophisticated idea of temperature.
In turn, that is why it is important to say that "the thermal energy carried by each microscopic "degree of freedom" in the system is on the order of magnitude of kT/2", rather than saying it is equal to kT/2 -- because
  • (1) in quantum systems thermal activity in those degrees of freedom can get "frozen out", depending on the density of states; and
  • (2) even in classical systems (like your Brownian motion system), it is only as an average over all the molecules in the system that the thermal energy equals kT/2 -- the thermal energy of a particular individual degree of freedom will have a probability distribution, so at any particular time will only be on the order of kT/2.
Jheald (talk) 13:57, 6 December 2012 (UTC)
I have also reverted your substantial deletion to the section "Bridge from macroscopic to microscopic physics"
The key point the section is making is that PV=nRT is an equation that is entirely about moles of gas -- the quantities involved can all be measured at an entirely macroscopic level. But changing from nR to Nk moves that frame of reference to thinking about individual molecules of gas. It opens a doorway to a microscopic view of thermal physics, where one then looks to relate pressure to the kinetic theory of molecules, and temperature to the properties of ensembles of molecules.
I think that is a valuable proposition, and one worth keeping in the article. Jheald (talk) 14:07, 6 December 2012 (UTC)
I'm sorry you chose to undo my contribution before this discussion. You write:-
This is why it is a mistake to think of temperature as the average energy per degree of freedom
on the reasoning that, because of "Failure due to quantum effects" :-
Equipartition no longer holds because it is a poor approximation to assume that the energy levels form a smooth continuum
In thermal systems where the molecules have sufficient energy to excite quantum transitions, the energy in the quantum states is 'in quanta', i.e. fixed lumps in common parlance. Being fixed, these quanta are not part of the kinetic energy of the particles; for this reason kinetic (or thermal) physics excludes energy stored in quantum levels. The same is seen in other, non-kinetic (i.e. potential) energy forms such as chemical bonds and intermolecular forces such as van der Waals forces. These arguments are fundamental to quantum theory and are the basis of Planck's law of radiation.
The equipartition of energy article also refers to other cases where it 'breaks down'. It has, for coupled oscillators :-
equipartition often breaks down for such systems, because there is no exchange of energy between the normal modes. In an extreme situation, the modes are independent and so their energies are independently conserved
It should, of course, be obvious that systems that cannot exchange energy will not partition energy equally. Again, the definition of thermal systems requires that the system particles freely exchange energy according to the Maxwell Boltzmann distribution; quantum interactions are not, by definition, free exchanges of energy.
Further, about the "Bridge from macroscopic to microscopic physics":-
The Boltzmann constant is relevant to all sorts of macroscopic physics and this should be mentioned in the article, but this constant is a property of a particle, any particle, that can exchange mechanical energy (difficult for neutrinos!). But atoms, molecules, electrons, protons, etc. all qualify as particles. Electrons are really interesting in this respect because, in semiconductors at working temperature, the band gap of a doped semiconductor is near the thermal energy of an electron (see carrier generation and recombination), and the working temperature influences the conduction process much more than in (metal) conductors. All of these matters are a consequence of the Boltzmann relation and should be mentioned in the article, but they are of secondary relevance. Matters of primary interest in this article are the value of the constant and how it is measured - such information is entirely relevant yet doesn't appear in the article.
Now I don't want to indulge in an edit war so I'm not going to restore your 'undo' but I do think you should consider doing it yourself, if you are convinced by the arguments I have presented (for the second time).
Have a nice day! --Damorbel (talk) 11:31, 7 December 2012 (UTC)
So "quanta are not part of the kinetic energy of the particles" ?
That's interesting. Have you considered applying that statement to diatomic molecules in a gas at room temperature? Most people would consider the energy of the molecules stretching and unstretching (vibration) or tumbling (rotation perpendicular to their axis) as exactly what they mean as part of the kinetic energy of the molecules.
[Figure: Specific heat capacity of a diatomic gas (real gases)]
[Figure: Specific heat capacity of a diatomic gas (idealised)]
But if you look at the real data, in terms of the heat capacity per molecule, this is exactly where you don't see equipartition -- you don't see a neat heat capacity of (1/2)k per degree of freedom. Instead you see much messier data, with the molecules only reaching a heat capacity of (7/2)k at high temperatures -- and if you look at the halogens, you can see the heat capacity even slightly overshoot (7/2)k.
You write: "Being fixed these quanta are not part of the kinetic energy of the particles; for this reason kinetic (or thermal) physics excludes energy stored in quantum levels."
But thermal physics precisely does not exclude these rotational and vibrational energies: they are vital to include to accurately account for the heat capacity.
Again, the definition of thermal systems requires that the system particles freely exchange energy according to the Maxwell Boltzmann distribution; quantum interactions are not, by definition, free exchanges of energy.
Thermal systems need to be able to exchange energy, but not necessarily according to a Maxwell Boltzmann distribution. The distribution you actually get will be a combination of the Boltzmann distribution and the density of states, which might not be either quadratic or continuous.
Diatomic gas molecules very much are thermal systems. Their rotation energies and (at higher temperatures) vibration energies are thermalised, and are readily exchanged. The issue is, they have a ladder of possible energies rather than a continuum of possible energies, so exchange of energy between them knocks them up and down this ladder. And this means their heat capacity departs from k/2 per degree of freedom. But the heat capacity is not zero -- there is a distribution of occupation levels on the energy ladders, so you can't say that these degrees of freedom are not thermalised.
Finally you write: "The Boltzmann constant ... is a property of a particle, any particle, that can exchange mechanical energy"
It isn't. The Boltzmann constant is a property of the system of units used to measure entropy. It sets the mapping between bits and Joules per Kelvin.
A discussion of the history of how this constant of proportionality has been determined might be useful, from R/NA, through various quantum-mechanical measurements, to today when it is now proposed to be set by fiat like the speed of light. But the bottom line is that the Boltzmann constant is essentially to do with how we define a Kelvin, rather than being a property of a particle. Jheald (talk) 12:59, 7 December 2012 (UTC)
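(A sketch of the "k sets the mapping between bits and joules per kelvin" point above, not part of the original comment: an entropy of n bits corresponds to n·k·ln 2 in conventional thermodynamic units. The function names are mine.)

```python
import math

# Sketch: k as the conversion factor between information units of
# entropy (bits) and thermodynamic units (J/K).
k = 1.380649e-23  # Boltzmann constant, J/K

def bits_to_joules_per_kelvin(bits):
    return bits * k * math.log(2)

def joules_per_kelvin_to_bits(s):
    return s / (k * math.log(2))

# One bit of entropy, in conventional thermodynamic units:
print(bits_to_joules_per_kelvin(1.0))  # ~9.57e-24 J/K
```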

Jheald, commenting upon my assertion that:-

"quanta are not part of the kinetic energy of the particles".

And then you write:-

But thermal physics precisely does not exclude these rotational and vibrational energies, they are vital to include to accurately account for the heat capacity

What you refer to as these rotational and vibrational energies are not quantum states of the particles and I certainly do not exclude them from the particle energy contributing to heat capacity etc., etc.; of course they do! All I am saying is that the energy locked in particle quantum states is not part of the kinetic energy = (1/2)mv^2 that is exchanged during thermal collisions.

Again you write:-

Thermal systems need to be able to exchange energy, but not necessarily according to a Maxwell Boltzmann distribution.

Who ever said they did? The Maxwell Boltzmann distribution only applies in equilibrium conditions i.e. when the particles exchange energy freely and with equal probability.

You write :-

Diatomic gas molecules ..... Their rotation energies and (at higher temperatures) vibration energies are thermalised,...

Yes, because at low temperatures the vibration energies are too small to overcome the diatomic binding forces, thus there is little or no energy in the (elastic) binding at low temperatures.

Further, you write:-

The issue is, they have a ladder of possible energies rather than a continuum of possible energies, so exchange of energy between them knocks them up and down this ladder. And this means their heat capacity departs from k/2 per degree of freedom.

Not true. When energy is locked in a quantum state it only has quantum degrees of freedom, which will not be activated by quantum events with lower energy; that is why Einstein got the 1921 Nobel Prize. The energy in the inner quantum states of atoms is relatively high; for the inner electrons of metals it is extremely high, e.g. X-rays. The quantum states of molecules are in the range of thermal energy, 1.24 meV - 1.7 eV, which corresponds to the kinetic energy of molecules at intermediate temperatures, so there is a great deal of overlap, effectively leading to a continuum, as can be seen from the Planck energy spectrum, which never goes to zero.

You write:-

so you can't say that these degrees of freedom are not thermalised.

Ultimately, at high enough temperatures, all atoms will be stripped of their electrons i.e. 100% ionised and all the electrons and protons will be thermal and none of them will be in particle aggregates such as atoms and molecules. Such conditions are said to exist at the centre of stars.

The reason for my original deletion, which you 'undid', was to eliminate the very real confusion between the Boltzmann constant and the application of it. The Boltzmann constant is a very simple but important ratio between temperature and particle energy. It has the same dimension as entropy, joules/K, but entropy is about a system of particles whereas the Boltzmann constant is about a single degree of freedom. In view of all this I would like you to undo the deletion of my contribution.--Damorbel (talk) 10:18, 8 December 2012 (UTC)

Just to be absolutely clear. The "rotational and vibrational energies" are quantised (see eg our articles on Rotational spectroscopy and on Molecular vibration), and this quantisation is important.
Of course atomic and molecular electron orbitals are quantised too, but it wasn't quantisation of electrons I was discussing above. I was specifically discussing the quantisation of energies of the rotational and vibrational kinetic degrees of freedom of the molecules themselves as whole molecules.
The rotational and vibrational energies do indeed have a ladder of states, and the occupation of these is thermalised. It is a reasonably correct description to say that two molecules get knocked up and down these ladders, interchanging energy, when they collide.
At everyday temperatures, as the heat capacity plots make clear, these ladders of energies are not frozen out -- otherwise the heat capacity would be stuck at (3/2) k; nor are they so dense compared to the thermal energy that they can be treated as a continuum -- otherwise the heat capacity would be the full (7/2) k. Instead they have to be treated for what they are -- a thermalised ladder of quantum energy states. As a result, as I wrote above, this means the heat capacity per molecule departs from k/2 per kinetic degree of freedom.
So it's quite wrong to say things like "kinetic (or thermal) physics excludes energy stored in quantum levels"; and it's better to say "the thermal energy carried by each microscopic "degree of freedom" in the system is on the order of magnitude of kT/2", rather than saying it is equal to kT/2, as further discussed in my first post above.
And that's why it's not a very good idea to think of the Boltzmann constant primarily as a ratio of a temperature to an equipartition energy. The Boltzmann constant is more universal than that. It fixes the scale for our customary units of entropy -- whether the entropy associated with a single degree of freedom (and yes, you can have entropy associated with a single degree of freedom), or whether associated with a system of particles.
Equipartition of energy follows as a consequence if there is a smooth quadratic density of states. Non-smooth or non-quadratic densities of states don't produce equipartition. The Boltzmann constant fixes the units of entropy. This is the relationship that is fundamental. That, through the relation (1/T) = dS/dE, is then what fixes the numerical temperature associated with idealised equipartition. Jheald (talk) 12:12, 8 December 2012 (UTC)
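(A sketch of the "thermalised ladder of quantum energy states" argument above, not part of the original comment: for a single quantum harmonic oscillator mode with level spacing hf, the heat capacity is C = k·x²·eˣ/(eˣ−1)² with x = hf/(kT). It freezes out (C → 0) when kT ≪ hf and only approaches the classical equipartition value k when kT ≫ hf. The level spacing chosen below is an illustrative assumption.)

```python
import math

# Sketch: heat capacity of one quantised vibrational mode (Einstein model),
#     C = k * x^2 * e^x / (e^x - 1)^2,  with x = hf / (k*T)
k = 1.380649e-23  # Boltzmann constant, J/K

def mode_heat_capacity(T, level_spacing):
    """Heat capacity (J/K) of a single harmonic mode with spacing hf."""
    x = level_spacing / (k * T)
    ex = math.exp(x)
    return k * x * x * ex / (ex - 1.0) ** 2

hf = 5.0e-20  # a typical vibrational quantum, ~0.3 eV (illustrative)
print(mode_heat_capacity(300.0, hf) / k)    # nearly 0: frozen out
print(mode_heat_capacity(30000.0, hf) / k)  # approaches 1: classical limit
```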
You write:-
Just to be absolutely clear. The "rotational and vibrational energies" are quantised
This is not an informed argument. Can you name me an interaction that is not quantised?
I was specifically discussing the quantisation of energies of the rotational and vibrational kinetic degrees of freedom of the molecules themselves as whole molecules.
And my answer was that the thermal (i.e. translational) energies are of the same order and are thus indistinguishable.
You write :-
So it's quite wrong to say things like "kinetic (or thermal) physics excludes energy stored in quantum levels"
The thermal energies are of the order 0.025 - 1.00 meV; these thermal photons will do nothing to quantum states 1.1eV and higher, their energy is excluded from the kinetic interactions; thus there is no equipartition with these quantum states.
You write :-
The Boltzmann constant fixes the units of entropy
How can it do this?
The Boltzmann constant is a microscopic unit (per degree of freedom) with one temperature, T = Q/kB. Entropy is a macroscopic quantity dependent on the energy distribution in a system of particles. --Damorbel (talk) 13:18, 8 December 2012 (UTC)
S = k ln W

Or, if you want to be slightly more sophisticated about it,

S = −k Σi pi ln pi
So if you've got a probability distribution, it's got an entropy. That can be probabilities associated with one degree of freedom; it can be the probability distribution of the combined state of a whole system of particles. It doesn't matter. If it has a probability distribution, then it has an entropy. What k does is just change the units from nats (or bits) to whatever you want to use as your preferred measure of entropy -- kilocalories per Rankine, or kilogram (furlongs per fortnight) squared per Rømer, or whatever.
That's how it fixes your units of entropy. Then, once you have chosen units for energy and entropy, temperature follows directly, from (1/T) = dS/dE.
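(A sketch of the "any probability distribution has an entropy" point, not part of the original comment: the Gibbs/Shannon entropy in nats, −Σ pᵢ ln pᵢ, becomes a thermodynamic entropy in J/K on multiplying by k. The function names are mine.)

```python
import math

# Sketch: entropy of a probability distribution in nats, and the same
# entropy converted to J/K by multiplying by the Boltzmann constant.
k = 1.380649e-23  # Boltzmann constant, J/K

def entropy_nats(probs):
    """Gibbs/Shannon entropy -sum(p * ln p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def entropy_joules_per_kelvin(probs):
    return k * entropy_nats(probs)

# A single two-state degree of freedom with equal probabilities:
probs = [0.5, 0.5]
print(entropy_nats(probs))               # ln 2 ~ 0.693 nats
print(entropy_joules_per_kelvin(probs))  # ~9.57e-24 J/K
```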
Now, to try just once more to get over my other point to you:
The energies of molecular rotation and vibration at room temperature are kinetic, are thermal, and are quantised -- and the fact of that quantisation matters, because it directly affects the heat capacity associated with the degree of freedom.
I'm not talking about electrons here. I'm not talking about photons. I'm talking about the kinetic rotation energy of the molecules themselves, which is interchanged when they bash into each other.
Talking about temperature as related to the average energy per degree of freedom might be okay when introducing the subject to children; but it's not a good long-term standpoint, because there are too many systems for which it is not true, either because the density of states is not perfectly quadratic, or (as above) because it's quantised, with energy gaps of a size that means the quantisation cannot be ignored. A better long-term strategy is to think of temperature as (1/T) = dS/dE, because that's more fundamental, more general, and encompasses equipartition as a special case when the density of states is smooth and quadratic. Jheald (talk) 14:31, 8 December 2012 (UTC)

The article is about the Boltzmann constant, but Jheald writes about :-

if you've got a probability distribution, it's got an entropy.

The article is about the Boltzmann constant which is the energy of a single particle per Kelvin, but Jheald writes about:-

That's how it fixes your units of entropy.

As if entropy were a constant like the Boltzmann constant!

The article is about the Boltzmann constant, but Jheald writes about :-

I'm not talking about electrons here. I'm not talking about photons. I'm talking about the kinetic rotation energy of the molecules themselves, which is interchanged when they bash into each other.

Oh dear! Electrons? Photons? The kinetic rotation energy of the molecules? Yes Jheald, I know what you're talking about and it isn't the Boltzmann constant!

The energies of molecular rotation and vibration at room temperature are kinetic, are thermal, and are quantised -- and the fact of that quantisation matters, because it directly affects the heat capacity associated with the degree of freedom.

Oh dear! Where is the Boltzmann constant in this?

Now I know that you have nothing to contribute to the article on the Boltzmann constant, what a shame!

And now I feel free to restore my contribution after your deletion. --Damorbel (talk) 18:00, 8 December 2012 (UTC)

I have proposed at WT:PHYSICS that you be community banned from further editing articles and talk pages related to thermodynamics. Sometimes basic competence is required. Jheald (talk) 21:37, 8 December 2012 (UTC)
What part of the article is being discussed here? I think the discussion is motivated by the deletion (and reinstatement) of Boltzmann_constant#Bridge_from_macroscopic_to_microscopic_physics, but it wanders all over the place. It should be simple to resolve the issue - find sources for the content, then archive this discussion. The burden of evidence is on the person who adds or restores material; but Damorbel should allow others some time to find the sources. RockMagnetist (talk) 18:44, 11 December 2012 (UTC)
I removed the tags and added a source in that section... Maschen (talk) 18:58, 11 December 2012 (UTC)