Wikipedia:Reference desk/Archives/Science/2014 January 19

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


January 19


Bleach


If bleach is so harmful to human health, why is it used so widely as a cleaning product and even in swimming pools we swim in? Clover345 (talk) 12:21, 19 January 2014 (UTC)[reply]

It's a matter of use and concentration. Dihydrogen monoxide is deadly in large quantities, too, and yet it's in nearly every kind of food and even indispensable for our water supply. I don't think that bleach is used in swimming pools - chlorine, which is an active ingredient in many kinds of bleach, is also used for disinfecting swimming pools, but I think the delivery mechanism is quite different. I could be wrong, though - chlorine bleach, ubiquitous in the US, is very rare in Germany. --Stephan Schulz (talk) 12:29, 19 January 2014 (UTC)[reply]
See bleach for more information. There are lots of different bleaches.--Shantavira|feed me 13:56, 19 January 2014 (UTC)[reply]
I believe Sodium hypochlorite is used in swimming pools sometimes, but generally in home swimming pools rather than larger public ones, which may use direct chlorination (i.e. with chlorine gas). We do have an article Swimming pool sanitation which suggests something similar, although it also suggests only large commercial public swimming pools use direct chlorination.
(Of course not everyone uses chlorination; bromination is another alternative and there are other possible methods.)
However, people may or may not use sodium hypochlorite in the form of bleach (by which I mean something sold as bleach in the laundry or cleaning section). I think pool supply stores may generally have more specialised products; these may include tablets or powder/granules, which I think are normally calcium hypochlorite or sometimes lithium hypochlorite, or sodium hypochlorite solutions in higher concentrations, probably not marketed as bleach. But provided you avoid junk [1], I'm not sure there's any advantage to choosing any of these; I imagine the best is whatever gives you the appropriate concentration for the lowest cost and is easy to store and safely apply. (The only real concern seems to be the calcium contributing to water hardness [2], and of course the difference between storing and applying a solid vs a liquid. I believe you normally have to predissolve the solids to ensure they dissolve properly.)
I acknowledge, as per the article, swimming pool sanitation can be complicated, with regular testing etc. required. It seems one common addition is cyanuric acid, or alternatively products which contain the already chlorinated cyanurates Trichloroisocyanuric acid or Dichloroisocyanuric acid (these are commonly called stabilised chlorines in pool supplies), as they greatly increase the half-life of the chlorination (increased sunlight stability). And you also have to deal with the pH etc.
But it does seem to me there's also a degree of lack of knowledge. For example, there seems to be a concern over the build-up of cyanuric acid when you use the chlorinated cyanurates, and the need for water replacement or higher levels of chlorine [3] [4]. But I don't understand why this is a problem; it seems to me the logical thing is to use stabilised chlorines initially and then, once you reach a suitable level, only add unstabilised chlorines (whether bleach, sodium hypochlorite solutions for pools, calcium hypochlorite tablets without cyanuric acid or whatever) until cyanuric acid levels drop below the level you want. (This site [5] seems to agree.) Alternatively it may be easier to add the cyanuric acid separately.
So it wouldn't surprise me if people are using calcium hypochlorite or even sodium hypochlorite solutions sold by pool supply companies when they are actually paying more to achieve the same chlorination as they would using ordinary sodium hypochlorite bleach, simply because they never considered it, i.e. not for convenience or anything else.
BTW, AFAIK the situation is similar for disinfection of water. Municipal water supplies usually use direct chlorination, but simply adding sodium hypochlorite bleach is recommended for smaller-scale operations such as after a natural disaster or in areas without potable water supplies. In fact, the CDC recommends sodium hypochlorite over calcium hypochlorite tablets because of concerns over the quality and consistency of tablets. [6] I would have thought the same concerns apply to bleach/sodium hypochlorite, but I guess you don't have to worry about size and perhaps the variation is lower.
Nil Einne (talk) 19:12, 19 January 2014 (UTC)[reply]

Many people don't realise that one of the popular sterilising fluids for baby's bottles [7] is basically a dilute solution (16.5%) of Sodium hypochlorite (bleach). The manufacturers do say, however, that "The purification process during the manufacture ensures complete removal of all heavy metal ions, which would normally act as a catalyst to chemically break down many hypochlorites, causing instability". This product can also be used, at the right dilution, for purifying drinking water. Richerman (talk) 19:39, 19 January 2014 (UTC)[reply]

Bleach is, according to the EPA, a method of last resort for purifying drinking water. I learned the same advice from "extreme survival"-style camping: bleach is more effective than most of your ordinary camping-gear (like a filter or iodine droplets) at killing the nasties in your drinking water; it's more potent, lasts longer, acts faster, and it can kill you if you use too much in your water purification process. Nimur (talk) 20:06, 19 January 2014 (UTC)[reply]
You seem to have indented under the wrong reply - I didn't suggest using household bleach to disinfect water. The 16.5% solution is much safer to use. Richerman (talk) 20:54, 19 January 2014 (UTC)[reply]
There would seem to be very little difference between using a weak bleach solution (reading the FAQ, it's 2% *bleach*, 16.5% *salt*) and just using less bleach. The advantage of the baby bottle cleaner seems to come from removing all the heavy metal ions, favouring a safer decomposition of the bleach. Since the ions would presumably be present in the water anyway, you lose this advantage whether you use the solution or just straight bleach. MChesterMC (talk) 09:48, 20 January 2014 (UTC)[reply]
Sorry my mistake, it is a 2% solution. However, using a concentrate of a lower dilution (which is actually non-toxic according to their blurb) means that there is no chance of poisoning yourself if you get the dilution wrong. Richerman (talk) 17:16, 20 January 2014 (UTC)[reply]
The CDC suggests a bit lower (0.5-1%), although they don't seem to have a problem with diluting bleach to get it [8]; but if you're going to store it, you may need to raise the pH so it doesn't degrade too quickly (their info isn't really aimed at the consumer). Interestingly, it seems even full-strength bleach isn't as dangerous as you may think. They mention many accidental ingestions don't have significant adverse health effects, although you may be able to kill yourself with 250-500 mL (they suggest it's unlikely people will accidentally ingest such large quantities because of the taste). On the whole, it sounds like you'd have to get things very wrong, although I guess you may think the water is supposed to taste that bad. (It goes without saying you shouldn't intentionally test any of this.) Of course there are other risks like eye injuries or chlorine gas poisoning from careless use or accidental mixing. Nil Einne (talk) 16:38, 21 January 2014 (UTC)[reply]
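The dilution arithmetic behind figures like these is just C1·V1 = C2·V2. A minimal Python sketch; the 5% stock strength below is a hypothetical example, not a figure taken from any product or from the CDC guidance cited above.

# Simple dilution arithmetic: how much stock hypochlorite solution plus water
# makes a target volume at a target strength (C1*V1 = C2*V2).
def dilution(stock_pct, target_pct, target_volume_l):
    """Return (stock_volume_l, water_volume_l) for target_volume_l of target_pct solution."""
    stock_volume = target_volume_l * target_pct / stock_pct
    return stock_volume, target_volume_l - stock_volume

# Hypothetical example: dilute a 5% stock to 0.5%, making 1 litre in total.
stock, water = dilution(stock_pct=5.0, target_pct=0.5, target_volume_l=1.0)
print(f"{stock:.2f} L stock + {water:.2f} L water")  # 0.10 L stock + 0.90 L water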

Organic livestock more humane?


Someone told me that organic livestock is more humane from an ethical point of view than regular livestock. Is this true? I've always been skeptical of organic food claims being more healthy and better for the environment, so I'm also skeptical of this claim. ScienceApe (talk) 17:42, 19 January 2014 (UTC)[reply]

Exposing "organic livestock" to an environment with a high degree of organic chemistry (extra amounts of pesticides and fertilizers, etc.) seems not to be very humane or ethical.   71.20.250.51 (talk) 18:04, 19 January 2014 (UTC)[reply]
There is no reason for organic raising to be more ethical. The animals can still be slaughtered inhumanely and kept in terrible conditions without pesticides. The two are unrelated. Mingmingla (talk) 18:18, 19 January 2014 (UTC)[reply]
  • Let me point you to our article on organic beef. The part of the USDA rules that is relevant is that organic beef cattle have to be born and raised on pasture, and more importantly, must have unrestricted outdoor access. That means they don't spend time in feedlots, which are usually pretty unpleasant. Looie496 (talk) 18:21, 19 January 2014 (UTC)[reply]
(e/c) And presumably such animals benefit from, for example, not being fattened artificially. We could do with an article on organic livestock; it is mentioned only briefly in our organic farming article. The free range article is probably the closest we have.--Shantavira|feed me 18:24, 19 January 2014 (UTC)[reply]
Actually, the USDA says "Due to the number of variables involved in pasture-raised agricultural systems, the USDA has not developed a federal definition for pasture-raised products" which means, for example, that an animal which lived its entire life in a pasture and an animal which saw one once on the way to slaughter could both be labelled as "pasture" cattle. Matt Deres (talk) 19:26, 19 January 2014 (UTC)[reply]

The standards for the USDA and the UK Soil Association are here (articles.mercola.com/sites/articles/archive/2010/03/25/new-definitions-for-organic-meat-and-milk-issued.aspx) [unreliable fringe source?] and here. Basically you need to check the label to see which standards apply. There are, of course, other labels that guarantee ethical standards for livestock that are not necessarily organically reared, such as some of the ones shown here. Richerman (talk) 18:56, 19 January 2014 (UTC)[reply]

One of the criteria the USDA lists for Organic Beef is that the cattle "Never receive antibiotics". My friend told me that consuming animals that have been given antibiotics can lead to antibiotic resistance. Is this true? ScienceApe (talk) 16:33, 20 January 2014 (UTC)[reply]

The thing to think about is antibiotic resistance in bacteria, not in humans or cattle. And yes, *every* use of antibiotics technically favors antibiotic-resistant bacteria. This is why they don't use antibiotic cleaners in space! This is also why doctors stress not quitting an antibiotic course early. If you want more info on this topic, you should probably open a new thread. SemanticMantis (talk) 16:47, 20 January 2014 (UTC)[reply]
(ec) It is possible for the bacterial population in your gut to become resistant, but even if that didn't happen, the indiscriminate use of antibiotics causes resistant organisms to evolve and get into the environment; see: [9]. Richerman (talk) 16:54, 20 January 2014 (UTC)[reply]

Damaged molecules?

  Resolved
 – 17:43, 20 January 2014 (UTC)

A "nutrition expert" interviewed on a TV health segment mentioned that commercial processing of vegetable oils "damages the oil molecule". This sounds like BS to me; either a molecule is what it is, or it isn't, right? Or, is it possible to "damage a molecule" such that it looses its health benefit? ~:71.20.250.51 (talk) 21:28, 19 January 2014 (UTC)[reply]

Well, sure; molecules can get changed in all kinds of ways. Just consider what a hamburger patty goes through as it goes from being raw to being cooked to being burnt to being nothing but charcoal. Eating that lump of black char is unlikely to provide much in the way of vitamins. Could you provide more detail about what the oil supposedly went through to become less nutritious? Did it have to do with hydrogenation? Matt Deres (talk) 21:50, 19 January 2014 (UTC)[reply]
That changes one molecule into another molecule (or molecules). Processing doesn't "damage" an oil molecule (right?). Re: "more details" - this post is in response to what seemed like mumbo-jumbo from an "expert" on TV (the processed oil you buy at a store is no good, etc.); this was on network programming, not an infomercial. ~:71.20.250.51 (talk) 22:04, 19 January 2014 (UTC)[reply]
Sure. Damage = change from useful to less useful. If I take a hammer to a window, I change it. Into something less useful. Changes to oil molecules that make them less useful could reasonably be described as damage. --Jayron32 23:41, 19 January 2014 (UTC)[reply]
You can "damage" some molecules - like when you denature proteins (cooking egg whites, for instance). I don't think this is possible for oils. The usual problem with oils is when the molecules are changed into other molecules by adding hydrogen to it. Trans fat is created this way. 75.41.109.190 (talk) 22:33, 19 January 2014 (UTC)[reply]
So, (let's say) you take oil from the Borobudur Ubatuba plant, and send some to the EvilFood Industrial Processing Corp., and some to the GreenyGood Food Coddler's Co-op., you could end up with two different Borobudur Ubatuba oils? ~:71.20.250.51 (talk) 00:09, 20 January 2014 (UTC)[reply]
  • The most common processes that damage oils and other lipids are known as rancidification -- that article will give you more information. However there are also other processes that can occur. For example, if you put oil together with something that is strongly alkaline, the result is soap -- something you don't want in your food. Looie496 (talk) 05:15, 20 January 2014 (UTC)[reply]

This is becoming an argument about terminology. Sure, if you "change" a molecule, you'll have to either change the number and type of atoms within it (which makes a "different" molecule) - or you might possibly make it be folded differently.
Cyclohexane (to pick a very simple example) can fold into several different shapes (called "Chair", "Half Chair", "Boat" and "Twist-boat") - those forms all have the same chemical formula, the same number and type of atoms - and they are even connected together in the same way - so they are all "Cyclohexane". But I suppose that if you wanted "chair-Cyclohexane" and some process turned it into "boat-Cyclohexane" - you might say that it had been "damaged" by the process because the chemical properties have been slightly changed...but it's only a matter of terminology to say whether that's the "same" molecule or a "different" molecule -and "damaged" is such a loaded word in a situation where the molecule can trivially be flipped back into the preferred form. When we get to something as complicated as an "oil" - molecules will be changing and flipping shape all the time - so if you just do nothing whatever to a bucket of the stuff, it would slowly change (by some definitions of the word "change"). Perhaps that can be considered "damage" or perhaps "improvement" - or "unchanged" - depending on what use you're planning to put it to.
But even when the number of atoms changes - we have a terminology issue in the case of any long-chain molecule. We simply don't consider a molecule with 10,000 repeated units to be a "different" molecule from one with (say) 9,500 units. So you could take a polyethylene molecule, chop it in half - and you'd still say that you have "polyethylene" - even though the properties have changed and the strict chemical formula is different.
So I'm not sure this question is really very meaningful without a lot of clarification about the terminology we're using - and the purpose to which it'll ultimately be used.
SteveBaker (talk) 16:27, 20 January 2014 (UTC)[reply]
Thank you everyone, this does clarify the subject for me. I hadn't considered Protein folding, Molecular clustering, etc. Perhaps the "expert" wasn't totally off-base after all. ~Eric:71.20.250.51 (talk) 17:43, 20 January 2014 (UTC)[reply]
Well, I wouldn't go that far! Nutritionalists (especially the ones interviewed on TV shows) aren't chemists! There is no such thing as "an oil molecule" - vegetable oil is a complicated mixture of many different chemicals - and clearly it's very likely that any sort of processing will change both configuration and chemical formula of the resulting molecules. The big question is whether this constitutes "damage" - in the sense of the oil not being so useful after this processing stage. Clearly, these commercial processes wouldn't be applied to the oil if there was no benefit to the manufacturer - perhaps they're improving shelf life - or making the stuff clearer and more visually appealing in a bottle on the supermarket shelves. It's plausible that something they do to "improve" it from their perspective also "damages" it from the perspective of the nutritionalist by (say) removing some flavors, vitamins or fibre. I'd bet that almost any change could be viewed as "damage" from someone's point of view.
The nutritionalist's answer is (IMHO) premium bullshit because it's a vague and lazy description. Had we been told "commercial processing lowers the boiling point, making it harder to cook without it smoking"...or..."commercial processing removes the longer chain molecules ruining the texture of bread made with it"...or whatever - then I'd listen and take note - but this sounds far too much like a blanket "Commercially processed anything is unspeakably horrible" kind of a claim, tossing in the word "molecule" because it sounds scientific - which is common amongst people who speak about food on mindlessly stupid TV health shows!
SteveBaker (talk) 00:28, 21 January 2014 (UTC)[reply]

A doubt about metric expansion of space: have the galaxies that are now, for example, receding from us at 100 km/s always receded from us at that speed, or did they recede more slowly when they were nearer? If the latter, this would conflict with the estimates of the universe's age, because those were calculated assuming that the recession was constant. 95.235.225.100 (talk) 22:06, 19 January 2014 (UTC)[reply]

No one assumes the recession is constant through time. The rate of expansion changes through time due to both the slowing effect of gravitational attraction and the accelerating effect of dark energy. Dragons flight (talk) 05:28, 20 January 2014 (UTC)[reply]
Coming up with an estimate of the age of the universe by simply assuming that the Hubble parameter has had about the same value during the life of the universe works surprisingly well. Even Alan Sandage's 1958 estimate for H of 75 (km/s)/Mpc amounts to a Hubble time of 13.05 billion years, and the current measured value for H of 67.80±0.77 (km/s)/Mpc amounts to a Hubble time of 14.43 billion years, both of which are surprisingly close to the current best measurement for the age of the universe of 13.798±0.037 billion years. However, it's been a long time since the determination of the age of the universe was done so crudely by professionals in the field.
Yes, it was discovered in 1998 that the expansion of the universe is accelerating slightly (see Accelerating universe), but the surprise in that observation wasn't that the Hubble parameter hasn't been precisely constant, but rather that the expansion of the universe wasn't decelerating somewhat. I.e., before 1998's observation, the presumption was that the deceleration parameter was likely positive, not that the deceleration parameter was zero.
The best current measurements of the age of the universe use observational data from NASA's Wilkinson Microwave Anisotropy Probe and the Planck spacecraft, and use the complicated ΛCDM model of the universe, which is a lot more sophisticated than simply assuming that the Hubble parameter has been constant, and estimating the age of the universe as its inverse. Red Act (talk) 06:50, 20 January 2014 (UTC)[reply]
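A quick Python sketch reproducing the Hubble-time figures quoted above: the naive age estimate is just 1/H, converted from (km/s)/Mpc into years.

KM_PER_MPC = 3.0857e19       # kilometres in a megaparsec
SECONDS_PER_YEAR = 3.156e7   # approximate length of a year in seconds

def hubble_time_gyr(H):
    """Hubble time 1/H in billions of years, for H in (km/s)/Mpc."""
    return KM_PER_MPC / H / SECONDS_PER_YEAR / 1e9

print(hubble_time_gyr(75.0))   # ~13.0 billion years (Sandage's 1958 value)
print(hubble_time_gyr(67.80))  # ~14.4 billion years (the current value quoted above)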
 
The image on the right shows why the naive estimates of universe age based on constant expansion speed worked so well—the linear extrapolation (solid line) happens to intersect the horizontal axis at almost the same place as the ΛCDM curve (dashed line). -- BenRG (talk) 20:28, 21 January 2014 (UTC)[reply]

I meant: were the same galaxies, for example 6.9 billion years ago (half the universe's age), receding from us at (approximately) half the speed of today, or was their velocity more or less that of the present? Thanks for answering. 93.45.32.204 (talk) 16:56, 20 January 2014 (UTC)[reply]

Their past velocities relative to us would be similar to present. Dragons flight (talk) 01:41, 21 January 2014 (UTC)[reply]
This is a good question, and can be answered with mathematical precision. The universe's expansion rate is given by the Friedmann equations. In particular, the Hubble "constant" at any given time is:
H(a) = H_0 \sqrt{\Omega_R a^{-4} + \Omega_M a^{-3} + \Omega_k a^{-2} + \Omega_\Lambda}
where H is the Hubble constant, H_0 is its present value, a is the scale factor (1 today, 0.5 when the universe was 50% the modern size, etc), and the rest are constants. In our universe, following the lambda-CDM model, \Omega_R and \Omega_k are nearly 0, \Omega_M \approx 0.3, and \Omega_\Lambda \approx 0.7. Since H is by definition da/dt divided by a, we have:
\frac{da}{dt} = H_0 \sqrt{\Omega_M / a + \Omega_\Lambda a^2}
To my surprise, this can be solved analytically, and the result is:
a(t) = \left(\frac{\Omega_M}{\Omega_\Lambda}\right)^{1/3} \sinh^{2/3}\!\left(\tfrac{3}{2}\sqrt{\Omega_\Lambda}\, H_0 t\right)
Next we just need to compute da/dt:
\frac{da}{dt} = \left(\frac{\Omega_M}{\Omega_\Lambda}\right)^{1/3} \sqrt{\Omega_\Lambda}\, H_0 \,\frac{\cosh\!\left(\tfrac{3}{2}\sqrt{\Omega_\Lambda}\, H_0 t\right)}{\sinh^{1/3}\!\left(\tfrac{3}{2}\sqrt{\Omega_\Lambda}\, H_0 t\right)}
Click here to see a graph of this function. The x axis represents time, where 1 is roughly the current age of the universe. The y axis is the ratio of your hypothetical galaxy's recession speed to its current recession speed. For example, the graph is roughly 0.95 at a time of 0.5. This means that 6.8 billion years ago, a galaxy that's receding at 100 km/s today was receding at 95 km/s.
BTW can someone please check my calculations? I have an inkling that something's wrong, because I don't remember the Friedmann equations being analytically solvable for lambda-CDM and was shocked that they had a simple solution. --Bowlhover (talk) 08:32, 21 January 2014 (UTC)[reply]
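A minimal numerical cross-check in Python that the closed form above does satisfy the Friedmann equation, assuming \Omega_M = 0.3 and \Omega_\Lambda = 0.7 and working in units where H_0 = 1 (these are the approximate lambda-CDM figures, not necessarily the exact values used for the graph).

import math

Om, OL, H0 = 0.3, 0.7, 1.0
k = 1.5 * math.sqrt(OL) * H0   # growth rate inside sinh
C = (Om / OL) ** (1.0 / 3.0)   # prefactor of the scale factor

def a_analytic(t):
    """Closed-form scale factor a(t) = C * sinh^(2/3)(k*t)."""
    return C * math.sinh(k * t) ** (2.0 / 3.0)

def dadt_analytic(t):
    """Time derivative of the closed-form a(t)."""
    return C * math.sqrt(OL) * H0 * math.cosh(k * t) / math.sinh(k * t) ** (1.0 / 3.0)

def dadt_friedmann(a):
    """da/dt from the Friedmann equation: a * H0 * sqrt(Om*a^-3 + OL)."""
    return a * H0 * math.sqrt(Om * a ** -3 + OL)

# The two expressions for da/dt should agree at any time if the closed form is right.
for t in (0.2, 0.5, 1.0, 2.0):
    a = a_analytic(t)
    print(f"t={t}: analytic {dadt_analytic(t):.6f}  friedmann {dadt_friedmann(a):.6f}")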
I found the same solution and I was surprised for the same reason, but I don't see how it can be wrong. Maybe we should publish. -- BenRG (talk) 20:28, 21 January 2014 (UTC)[reply]
Outside my expertise, but this unpublished manuscript seems relevant "Solution of the Friedman Equation, Determining ... Acceleration and Age of the Universe "[10], also these course notes [11] Still, just a preprint, and I don't have time to compare your work to theirs :) SemanticMantis (talk) 15:08, 22 January 2014 (UTC)[reply]

Situations where a law change allowed researchers to gather useful data due to before/after comparisons?


I'm looking for examples where important social science data was able to be gathered because a new law went into effect, which allowed for before/after comparisons.

I'm especially interested in examples where the new law was initially rolled out to only a subset of the population, and that population subset was determined by something random like the last digit of their SSN or license plate, which allowed researchers to treat it as an [inadvertent] randomized controlled trial.

Anywhere in the world is fine. --Hirsutism (talk) 22:22, 19 January 2014 (UTC)[reply]

I'm not sure this is exactly what you're after - because it's not "social science" and not exactly a "law" that was involved. But when the 9-11 attacks in September 2001 forced every passenger plane in the USA (and most in Canada) to be grounded for three days, researchers were finally able to deduce what effect their contrails had on the weather - showing a variation of between one and two degrees centigrade while the aircraft were grounded. This is an experiment that would be impossible to conduct without a fortuitous governmental imposition like that. Fortuitous things like long power-outages reveal all sorts of interesting things such as changes in birth rate, crimes and even literacy (parents read to their TV-deprived children after that massive power outage in the Northern USA - and enough of them stuck with doing it after the power came back on to make a measurable difference to childhood literacy rates over the next year or so)...but again, that's not a law that was passed. SteveBaker (talk) 16:09, 20 January 2014 (UTC)[reply]
Cite, please, for the "1–2 degrees Celsius"? --50.100.193.107 (talk) 22:05, 21 January 2014 (UTC)[reply]
There's a bit on it in our contrail article, and you can follow references from there. Looks like the day/night temperature swing was slightly larger during that period: Contrail#September_11.2C_2001_climate_impact_study Katie R (talk) 16:32, 22 January 2014 (UTC)[reply]
  • What you are looking for is a specific type of Natural experiment. That term is very common in biological sciences (e.g. before/after analysis of a hurricane), but it applies equally well to the social sciences, when the researchers are not in control of the treatment or grouping. Here is an example I recall, "Does Daylight Saving Time Save Energy? Evidence from a Natural Experiment in Indiana" [12], that made use of different counties changing their time at different times. It's in an economics journal, and is focused on energy, but the ultimate drivers are sociological processes. Anyway, with the right term in mind, searching google scholar for /"natural experiment" law sociology/ (e.g. here [13]) produces myriad relevant hits, touching on child labor, recycling practices, etc., so that should get you started. SemanticMantis (talk) 17:07, 20 January 2014 (UTC)[reply]

One example is that in 2008 Oregon expanded Medicaid to include low-income uninsured adults. However, there was a lottery, as there were only 30,000 openings for 90,000 applicants. Researchers have now been able to compare health care usage and health outcomes for two very similar groups where the only difference appears to be random selection into the insured vs. uninsured group. See: http://www.latimes.com/science/sciencenow/la-sci-sn-expanding-medicaid-increases-emergency-room-visits-study-finds-20140102,0,2102014.story#axzz2rA2rAqvD — Preceding unsigned comment added by 166.107.101.87 (talk) 20:31, 22 January 2014 (UTC)
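A bare-bones Python sketch of the kind of comparison such a lottery allows - the difference in mean outcomes between the randomly selected groups, with a rough confidence interval. The numbers below are made-up placeholders, not data from the Oregon study.

import math
import statistics

# Hypothetical outcome per person (e.g. ER visits) for lottery winners and losers.
winners = [2, 0, 1, 3, 1, 2, 4, 0, 1, 2]
losers = [1, 0, 1, 2, 0, 1, 3, 0, 0, 1]

def mean_difference_ci(a, b, z=1.96):
    """Difference in means with an approximate 95% confidence interval."""
    diff = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    return diff, (diff - z * se, diff + z * se)

diff, (low, high) = mean_difference_ci(winners, losers)
print(f"difference in mean outcomes: {diff:.2f}, approx. 95% CI ({low:.2f}, {high:.2f})")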