Wikipedia:Reference desk/Archives/Humanities/2010 September 4

Humanities desk
Welcome to the Wikipedia Humanities Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


September 4

Absolute pitch

Hi again all. My music teacher last semester said that it's very likely that I have, or could develop with some training, absolute pitch. (This is probably based on the facts that I have played piano for almost all my life, I speak a tonal language natively, and both my grandparents have it.) I know what it is, but my question is: how do I know if I have it, and how can I train or develop it? 99.13.222.181 (talk) 00:03, 4 September 2010 (UTC)[reply]

If I remember correctly, absolute pitch is genetic, not learned. If you have it, you can develop it; if you don't, you can't. All absolute pitch means, effectively, is that you can recall and reproduce pitch accurately from memory: in other words, if I say "Sing a C", you can produce the correct pitch on cue. The easiest way to test whether you have absolute pitch is to pick a note on the keyboard, hit it a few times and try to memorize it, and then come back to the keyboard the next day and try to sing the note again, testing it by hitting the same key afterward. If you get it right (or very close) then you likely have it. --Ludwigs2 00:17, 4 September 2010 (UTC)[reply]
As always Wikipedia has an article or two about this... Absolute_pitch#Nature_vs._nurture, Ear training--Aspro (talk) 07:20, 4 September 2010 (UTC)[reply]
This is often known as perfect pitch, which relates to the same article page. However, there is another "gift" - I have met only one person with it - the ability to "see music". MacOfJesus (talk) 20:39, 4 September 2010 (UTC)[reply]
That's a form of synesthesia, which isn't all that uncommon among musicians. ---Sluzzelin talk 20:42, 4 September 2010 (UTC)[reply]
Thank you; I did not know its name. Beethoven apparently had it, which would explain why his deafness did not hinder him. MacOfJesus (talk) 20:50, 4 September 2010 (UTC)[reply]
I don't think we need to adduce synaesthesia in Beethoven's case, just because he was deaf. Close your eyes, turn off all external music, and silently imagine someone singing your country's national anthem. Or, imagine them singing it deliberately off key. You've got it, haven't you? But does that mean you're experiencing synaesthesia? No. If, whenever you hear music with your eyes open, you see letters, numbers, colours, shapes; you get certain tastes in your mouth; you experience certain smells - or vice-versa - that's synaesthesia. -- Jack of Oz ... speak! ... 22:25, 4 September 2010 (UTC)[reply]
Thank you. This one person, who reached dizzy heights in music, convinced me that he had something that was different. His ability to "balance" music was extraordinary. I take your point that a learning process can bring this about, but to what level? MacOfJesus (talk) 07:33, 5 September 2010 (UTC)[reply]
The question is, why would you want perfect pitch? It would presumably be fixed to the pitches of whatever instrument you used to learn it, making other pitches and tunings sound 'off'. Would you learn concert pitch? What will you do if you have to play an instrument that is tuned to a different pitch? How much recorded music will sound distractingly 'off' to you? Far better to learn relative pitch and (possibly) a reference sound for a given note, so you can recall it if needed. Although even relative pitch can be distracting: I wish someone had told me about just intonation and equal temperament when I was tuning and retuning my violin, not understanding why the strings could seem to be tuned to perfect fifths and yet a perfectly in-tune scale (based on scales I played on the piano) could sound a little off. Nowadays I hear a range of acceptable tunings as I play two strings together, choosing how 'perfect' I want it to be. 86.164.78.91 (talk) 17:15, 5 September 2010 (UTC)[reply]

One last question, I promise! In older English you see the long s. What are the rules about where to use the long s? I need to use it in a project —Preceding unsigned comment added by 99.13.222.181 (talk) 00:16, 4 September 2010 (UTC)[reply]

It was used most places except after another long "s" or at the end of a word. By the way, I don't think you have a real long s character above; our article has "ſ"... AnonMoos (talk) 01:10, 4 September 2010 (UTC)[reply]
The long s looks different in different fonts. While the glyph used above is Unicode U+0283 (Voiceless postalveolar fricative - not to be confused with U+01A9, Esh (letter), or U+222B, Integral symbol), it does have the same look (on my computer) as some of the more script style long s's - like the first one in the long s article, whereas the actual long s glyph (U+017F) looks more like the one in the Paradiſe Loſt example on that page. (Personally, I prefer it with the descender, but realize that would be more appropriately handled by changing the font.) -- 174.21.233.249 (talk) 03:42, 4 September 2010 (UTC)[reply]
The "ʃ" that you use in the title is or is very similar to what we use in my Phonetics class for the French sound "ch", which is like the English sound "sh". The "ſ" proposed above looks more correct to me. Falconusp t c 04:24, 4 September 2010 (UTC)[reply]
The long s stops at the line and has no descender. It is generally used as the first of a pair of s's in a word, or as a single s. The small s started out as the form used at the end of a word. --TammyMoet (talk) 08:55, 4 September 2010 (UTC)[reply]
You don't need to use long s, just recognize it, and enjoy the Flanders and Swann rendition of the old fong "Greenfleeves".--Wetman (talk) 18:20, 4 September 2010 (UTC)[reply]
Or Shakespeare's "Where the bee fucks there fuck I". (Sorry.) AndyJones (talk) 18:19, 5 September 2010 (UTC)[reply]

Family Fable

There is a story in my family that I was always told, and I was wondering how I can do some (academic) research on the subject. It goes like this: Brigget Von Schyler was to be married in the morning. Alas, her fiancé died in an unforeseen carriage wreck in which he drowned. Brigget was consoled by the [insert noble title] and taken into his castle to help with chores. Unbeknownst to Brigget, the duke had actually had her fiancé killed so he could take her as his mistress! Business being done, Fritz Carl Schyler Raadt is born. Being an illegitimate child, he soon emigrates to Montreal and then to Buffalo, New York, in the late 19th century. Fritz's son's son's son's son is me.

Some facts are known: Fritz was indeed raised at the Orebygaard estate and did immigrate to Montreal, and his mother was named Brigget. No one, however, knows who his father was.

The story takes yet another mysterious twist: Fritz's grandniece visited Lolland in the 1930s. Upon inspection of the castle she found that the "hall of portraits" was sealed off, so she was unable to compare Fritz's photograph with a portrait of the noble in question.

Even more interesting, I have recently learned from a comrade of mine who knows Danish that my last name (Raadt) can be translated as 'advisory committee,' which seemingly derives from a committee convened to name this recently born illegitimate child: he was named after the committee itself.

Altogether quite interesting. I want to know how to get some more facts. Any help would be greatly appreciated! schyler (talk) 01:18, 4 September 2010 (UTC)[reply]

PS. I am posting a related question to translate a letter into Danish to mail to the Bed and Breakfast that now occupies the estate.

Your best bet is to consult genealogy resources in the area where this is proposed to have happened (see Genealogy#Genealogical_research_process if you haven't already). Frequently the village churches recorded births, deaths, marriages and the like. As you have names of people, you can look them up to confirm they existed and lived in the appropriate time period. Occasionally, the record will also indicate what people's jobs were, and where they were employed. As the events have to do with a nobleman, there may also be some indication of them in the local paper. Looking up church records and newspaper records may require visiting the locality in person (although a few places have put records online). On this side of the Atlantic, you can double check ship manifests to ensure that Fritz actually immigrated when and from where he said he did, as opposed to making the story up to impress his wife/grandchildren. (Which happens surprisingly often.) (Also, while I wouldn't necessarily recommend pursuing it, I will note that as a direct male-line descendant of Fritz, it would be possible for you to confirm/disprove the identity of Fritz's father if you could locate a (legitimate) male-line descendant of the nobleman, and apply Genetic genealogy.) -- 174.21.233.249 (talk) 04:00, 4 September 2010 (UTC)[reply]
"Even more interesting, I have recently discovered from a comrade of mine who knows Danish that my last name (Raadt) can be translated 'Advisory Committee,' which seemingly comes from a committee which was convened to name this recently born illegitimate child, and naming him after the committee which was formed to name him." This sounds highly unlikely. In Danish it would be "Raad" or "Raadet", and I have never heard of a committee being formed to name illegimate children nor that such a committee would name the child after itself. Such a name would sound odd to Danes. I suspect "Raadt" is of Dutch origin, at least there seems to be a lot of people on Google from the Netherlands and South Africa of that name. You may also consider that the name of the ancestor may have been "Birgit" or "Birgitte" or something similar. "Brigget" is not a Danish or German name, in fact I am not sure it is a name at all. --Saddhiyama (talk) 09:23, 4 September 2010 (UTC)[reply]
You may also want to check out the Danish Wikipedia article on Orebygaard. It has a list of owners, showing the noble family in question would have to be Rosenørn-Lehn, of which the progenitor was Otto Ditlev Rosenørn-Lehn (1821-92), who was the foreign minister of Denmark for 21 years. Christian IX of Denmark, George I of Greece and Alexander III of Russia were present at his funeral. It is a nice story, but I must say that it often turns out that the truth is a lot more prosaic than the family legends. --Saddhiyama (talk) 09:32, 4 September 2010 (UTC)[reply]

Greatest disparity in standard of living between neighboring countries

 
[Map] This map may help as well. schyler (talk)

I was just thinking about differences between San Diego and Tijuana. This made me wonder, which two neighboring countries have the greatest difference in standard of living? My hunch is to say North and South Korea have the greatest difference. Other than these two, what are some other prominent examples? Where would the difference between the US and Mexico rank? Do they have the greatest difference in the Western Hemisphere? (not a homework question, curious) Thanks, 24.62.245.13 (talk) 03:00, 4 September 2010 (UTC)[reply]

The Gini coefficient measures this. schyler (talk) 03:25, 4 September 2010 (UTC)[reply]
Not exactly. The Gini coefficient measures the income disparity within a country; it's not terribly useful in measuring differences in overall or average standard of living between two different countries. For example, by Gini coefficient, China and the U.S. have comparable values; that doesn't mean that China and the U.S. have the same standard of living, it means that they have a similar "spread" of standard of living within their borders. That is, the difference between the richest and poorest Americans is similar to the difference between the richest and poorest Chinese. What the OP would want to compare is something like GDP per capita or GDP (PPP) per capita. Perusing List of countries by GDP (PPP) per capita and the map below will give better results. Using the CIA World Fact Book numbers from the List article, it looks like Zimbabwe ($100 per capita) vs. South Africa ($10,100) is a pretty stark difference; it represents a difference of over 100-fold, or 10,000%. Using the same set of data, the U.S. ($46,400) vs. Mexico ($13,500), at about 3.5-fold (about 350%), isn't nearly as great. South Korea ($28,000) vs. North Korea ($1,900) is also pretty big at about 15-fold (1,500%).—Preceding unsigned comment added by Jayron32 (talkcontribs) 03:54, 4 September 2010
 
[Map] This map is better for the comparison the OP wants to do.
Botswana ($13,100) > Zimbabwe actually gives a higher differential than South Africa > Zimbabwe - at least 130 times, using those figures. Some other interesting differences:
Libya $15,200 > Niger $700 (21.7 times)
Equatorial Guinea $36,600 > Cameroun $2,300 (15.9 times)
Kuwait $54,100 > Iraq $3,600 (15.0 times)
Israel $28,400 > West Bank / Gaza $2,900 (9.8 times)
Oman $23,900 > Yemen $2,500 (9.6 times)
Argentina $13,800 > Paraguay $4,100 (3.4 times)
Instructive! Ghmyrtle (talk) 08:12, 4 September 2010 (UTC)[reply]
Spain 29k > Morocco 4.6k (~7 times) Steewi (talk) 04:35, 6 September 2010 (UTC)[reply]
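The arithmetic behind these fold figures is just the ratio of the two GDP (PPP) per capita values. A minimal Python sketch, using the CIA World Factbook numbers quoted above (figures as cited in this thread, not current data):

    # Fold difference in GDP (PPP) per capita between neighbouring countries,
    # using the c. 2009 CIA World Factbook figures quoted in this thread.
    gdp_ppp = {
        "Zimbabwe": 100, "Botswana": 13100, "South Africa": 10100,
        "North Korea": 1900, "South Korea": 28000,
        "Mexico": 13500, "United States": 46400,
    }
    pairs = [("Botswana", "Zimbabwe"), ("South Africa", "Zimbabwe"),
             ("South Korea", "North Korea"), ("United States", "Mexico")]
    for rich, poor in pairs:
        print(f"{rich} vs {poor}: {gdp_ppp[rich] / gdp_ppp[poor]:.1f}x")
    # Botswana vs Zimbabwe: 131.0x
    # South Africa vs Zimbabwe: 101.0x
    # South Korea vs North Korea: 14.7x
    # United States vs Mexico: 3.4x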

Thanks a lot, this turned out to be a much different list than I expected. 24.62.245.13 (talk) 19:54, 7 September 2010 (UTC)[reply]

Without providing data, just off the top of my head: Saudi Arabia / Yemen; North Korea / South Korea; Australia / Papua New Guinea; Singapore / Indonesia; Russia / Finland; Hong Kong / China (if looking at WTO, ADB, Interpol or other “country” members); and Japan / Russia. DOR (HK) (talk) 06:53, 8 September 2010 (UTC)[reply]

Selective service system

Will Barack Obama remove the selective service system for good in the near future, and if not, when will the selective service system be completely removed and abolished? Is it close to being abolished at the moment? I heard that troops in Iraq were withdrawn, so I think that's an indication that the selective service system will be abolished. Am I right? — Preceding unsigned comment added by 114.72.226.126 (talk)

I don't think the connection you're making between the Iraq withdrawal and the Selective Service System is necessarily valid - the USA has concluded many conflicts and undertaken many troop withdrawals since the inception of the system and hasn't removed it. I haven't seen anything to suggest there are any concrete plans to remove the system entirely - the general idea being that it's much easier to keep such a system ticking over without actually conscripting anyone, rather than having to create it again from scratch should conscription become urgently necessary in the future. ~ mazca talk 11:57, 4 September 2010 (UTC)[reply]
President Ford ended selective service registration in 1975 (largely as a backlash to the Vietnam War), but it was reinstated in 1980 by President Carter (and retroactively applied to most of the people who had been exempt). See Selective Service System#History. So it's not unprecedented for it to be discontinued after a long, unpopular war. However, if I were a betting man (and I don't have a crystal ball, mind you), I wouldn't put money on it. The war in Iraq (unlike Vietnam) was fought by a volunteer army, and there hasn't been any backlash against a draft, because there hasn't been one. Buddy431 (talk) 13:08, 4 September 2010 (UTC)[reply]

But haven't people finally realized conscription is a bad thing and should be completely abolished? I mean, Australia and the UK have abolished it. And Australia is a small country nowhere near the military power of the US, and yet they've still abolished it. I don't get why the US, having one of the strongest militaries in the world, wants to keep the selective service. Is there a valid reason for this? Could someone explain? And anyway, I thought the US was a modern country that has moved on from conscription, whereas countries like Singapore aren't modern, so conscription hasn't been removed there. How come it's been removed in Australia, a small country, but not in the US? Thanks guys ... —Preceding unsigned comment added by 114.72.242.18 (talk) 13:11, 4 September 2010 (UTC)[reply]

There's a big difference between actually conscripting people and simply keeping the system active in case of a possible need to conscript people in the future. I'm not sure I take your point that conscription is unambiguously "a bad thing and should be completely abolished" - the reason it's generally been abandoned in recent decades is that there hasn't been a particularly manpower-intensive, wide-ranging war being fought. In World War II, for example, the outcome may have been very different if the USA, Britain, etc. had not chosen to implement it. I suspect the US government keeps the Selective Service System around so that, should a massive military problem occur and conscription become necessary again, it could be implemented with minimal difficulty. ~ mazca talk 13:41, 4 September 2010 (UTC)[reply]

OK. But Britain has completely abolished it. So if Britain had any plans to bring it back or implement it, wouldn't they have done something similar to the US with the selective service and all? They didn't. That's kind of what I'm asking. Britain has abolished it. That suggests people at least in Britain (and don't forget Australia) have moved on from the idea of a draft. Even if a war were to break out, the draft wouldn't be reinstated in Britain, would it? So the US keeping it means something quite different. And my other question is more on this "moving on" business. I mean, in the Middle Ages people were so religious they used to persecute people who dared speak against the church. That's considered ridiculous nowadays. That will never happen again. And as people become more educated, other things like that are changing as well. Like racism has stopped, whereas it was common a few decades ago. Women are now considered equal to men. That kind of thing. What I'm asking is, does this trend have any likely impact on the idea of conscription? It seems so. Slowly countries are removing the idea of national service and abolishing it by law. Like Spain. So this means that even if a huge war started, since conscription is abolished by LAW, no one can be conscripted. Will this likely happen in the US? Don't the recent revolutions also mean that a world war is impossible, since people now think more before they act? And my other question was that the US has the biggest military in the world. They'll be stronger than any other military in the world. So what's the need for having conscription? I can't think of an emergency that would require more than 20 million professional, highly trained soldiers.

What makes you think Singapore isn't modern? It's one of the most developed countries in Asia. Moreover, both China and India have bigger militaries. :/ 76.230.146.8 (talk) 15:30, 4 September 2010 (UTC)[reply]
Your approach is one of Teleology. It's not uncommon; all humans like to use it - whatever our views, we say we're making progress, going forward, as if there is some ideal endpoint where everything's perfect that we will eventually reach if we keep going. It's easy, comforting, but completely wrong. Examine, for example, the thousand years of regression Europe underwent after the fall of the Roman Empire - technologically we overtook Rome a few centuries ago; socially, in terms of minority rights and attitudes, there's a good argument to be made that we're barely their equals (I'm aware I'm describing this in a teleological manner. As I said, not uncommon). Your assertions that religious extremism, minority oppression, and military conscription will never return are optimistic, but baseless. We can but hope --Saalstin (talk) 17:03, 4 September 2010 (UTC)[reply]
From a UK perspective, conscription ended a long time ago. It keeps being raised as worthwhile, predominantly from a right-wing political perspective, as a panacea. There are a number of reasons why the military wouldn't want to implement it in our current situation. In fact it's not been abolished; it's just not implemented. Manning figures do account for a need for a massive increase in manpower, but the circumstances that would lead to that are quite extreme.
From a practical perspective there is neither the training infrastructure to accommodate the sudden increase in numbers, nor really the time available to integrate the new personnel into the system. Training for modern warfare takes a long time, even infanteer training takes over six months.
ALR (talk) 19:46, 4 September 2010 (UTC)[reply]
I don't see any link between being "modern" and not having selective service. Plenty of industrially and culturally "modern" (by Western standards) states have conscription laws on the books, even if they haven't been used for a long time. And the idea that "people think more before they act" these days is pretty much unsupportable. The idea that we won't have any "world wars" of any major sort in the future is also pretty much just speculation and not tied to any actual recent events. (In fact, one could argue that the possibility of major conflicts has been greater in the last 10 years than it was in the 10 years before that). --Mr.98 (talk) 19:51, 4 September 2010 (UTC)[reply]
Countries that drop selective service will reinstate it if they need a larger number of soldiers than volunteers will supply. ←Baseball Bugs What's up, Doc? carrots→ 19:56, 4 September 2010 (UTC)[reply]
As far as the UK is concerned, I don't think anyone is advocating conscription to meet any military need - and the sort of military operations that the UK has been involved in recently require highly trained professional personnel - conscripts would probably be quite unsuitable. You do hear of people (typically right-wing, as someone mentioned above) saying that "National Service" (as it used to be called here) should be reintroduced in some form, but this would be for the purpose of instilling discipline etc. in young people, or just getting them off the streets, and possibly achieving some (non-military) purpose as a by-product. --rossb (talk) 20:35, 4 September 2010 (UTC)[reply]
Last I heard, the Swiss still do just that: Every able-bodied adult male is expected to have the skills to defend the country if necessary. Rest assured that if the need arises, the U.S., the U.K. and anyone else who currently lacks a draft, will be quick to reinstate it... or to try to, anyway. ←Baseball Bugs What's up, Doc? carrots→ 20:39, 4 September 2010 (UTC)[reply]
The Swiss system is a key part of their political neutrality position, every able bodied male is a member of the military reserve and is required to fulfil a reserve training commitment accordingly.
From a command perspective I don't see any realistic circumstance where it would happen. Even when we were waiting for the Third Shock Army in the Fulda Gap there was no expectation of requiring a conscription system. We've moved quite far beyond a situation where we need a low-skill, massive force. From a UK perspective it would also lead to a massive change in how we conduct warfare. The whole concept of Mission Command, which is fundamental to how we in the UK conduct warfare, is predicated on a motivated, educated and well trained individual who has made the choice to be in the force.
ALR (talk) 21:24, 4 September 2010 (UTC)[reply]
Enemies have a way of screwing up your plans. ←Baseball Bugs What's up, Doc? carrots→ 22:16, 4 September 2010 (UTC)[reply]
From a British Military perspective our main enemies are in Whitehall and the White House.
ALR (talk) 22:28, 4 September 2010 (UTC)[reply]
So most of your Heathrow Airport bombings originate in the USA??? ←Baseball Bugs What's up, Doc? carrots→ 22:32, 4 September 2010 (UTC)[reply]
The Heathrow farce wasn't a threat to the British Military.
ALR (talk) 22:36, 4 September 2010 (UTC)[reply]
I'm sure Neville Chamberlain thought the same way in 1938. ←Baseball Bugs What's up, Doc? carrots→ 22:38, 4 September 2010 (UTC)[reply]
I'm unsurprised that the subtlety is lost on you, but the biggest threat to those of us who are in the military is the political decisions of those who direct our actions, make decisions on the funding that we receive, and legitimise our actions.
From a personal perspective as a former Squadron Commander that means that politicians are responsible for those decisions that constrained my actions in command, defined what I was expected to achieve, and what I was expected to achieve it with.
If we restrict our attentions purely to the Counter-Insurgency and Counter-Terrorism domains then political decision making has a direct influence on the threats that we face. The two are very different beasts, with different drivers, and different mitigations and actions required. You may be aware that Chamberlain was dealing with a state actor in 1938, not a terrorist threat. The dynamics in dealing with each of those are mildly different, as I'm sure you appreciate.
ALR (talk) 22:48, 4 September 2010 (UTC)[reply]
General James Mattoon Scott had similar thoughts, but he went just a little too far. -- Jack of Oz ... speak! ... 23:21, 4 September 2010 (UTC)[reply]

Public opinion of conscription

If conscription was introduced suddenly because of a World War III or something like that, what would be the public opinion of it? (based on current or recent polls) Would people start revolting so that the government couldn't conscript anyone anyway? (based on current or recent polls) And would it be possible to dodge a draft like that by just going to another country, or is it harder to do that nowadays because of technology? (based on federal laws) I hear the current US Senate has a 402 - 2 majority against conscription. How would this number change in a world war? (based on recent wars such as the Vietnam War) Last question: why does the US, having the biggest military in the world, want to have a conscription system called selective service? (this is kind of a continuation of a question above, but it's a different question so I ask it here). Notice the things in brackets are the information I'd like my answers in (like the public opinion question can be answered based on recent polls), so I'm not asking for opinions or anything. —Preceding unsigned comment added by 114.72.208.57 (talk) 23:42, 4 September 2010 (UTC)[reply]

Volunteers are reminded that we deal in factual information on the Reference Desk, rather than speculation, and that parts of the above question may not be answerable. Please confine your responses to public opinion survey data, references to scholarly works, and other reliable sources. The Reference Desk is not a debating forum. TenOfAllTrades(talk) 23:56, 4 September 2010 (UTC)[reply]
The U.S. Senate has only 100 members, not 404. The U.S. House of Representatives has 435. So your numbers seem off.
Where conscription is concerned, the "leading edge" is no longer about military service, but various plans (highly unpopular outside a certain sector of the government) to force "volunteer" service in a "civilian national security force" (see Rahm Emanuel, who served as a volunteer for the IDF). In other countries, as I understand it, volunteer service is often presented as a draft alternative for conscientious objectors, sometimes with a longer required term. To many Americans the idea seems foreign, unfree, communist, exploitative, and seems likely if not intended to damage cultural variations.
Where repealing the draft is concerned, it seems like Democrat-controlled legislatures (but alas, not so much those controlled by Republicans) profess a great limitation on their time to take action, and many good bills, e.g. GINA, have been lost for many successive years simply because they "didn't get to it". Oddly, they always seem to have time for symbolic resolutions and little holidays no one ever heard of. Wnt (talk) 14:31, 9 September 2010 (UTC)[reply]

WYSIWYG and WYSIWYM

Having read the articles, I still don't get why anyone would nowadays use the second one. A modern graphical tool for layout could, theoretically, have all the options that a non-graphical tool has, and indeed even export the result in LaTeX. What is wrong with seeing what the output will be?--Quest09 (talk) 13:07, 4 September 2010 (UTC)[reply]

Nothing wrong with WYSIWYG, of course. Some people feel they get better control, better precision, and better understanding of the underlying structure of a thing if they can see a representation of the markup.
This is especially true for non-print formats like web-pages where there is no single "what you get". (A web page looks different depending on browser type and version, browser font settings, size of browser window, zoom levels, etc, etc.) APL (talk) 00:01, 5 September 2010 (UTC)[reply]
The distinction between the two is that you tell a traditional WYSIWYG editor "this is italicized text in 24 pt sans-serif font", whereas you tell a WYSIWYM editor "this is a chapter heading" and "chapter headings should be italicized in 24 pt sans-serif font". There are several benefits to this. For one, it ensures consistency between all similar elements, and it makes it easy to change things (for example, if you wanted chapter headings to be bold, 20 pt serif font instead, a traditional WYSIWYG editor would make you track down and change all 20 chapter headings individually, whereas a WYSIWYM editor has a single location you change; see the sketch below). Another thing frequently cited as an advantage is separating producing content from formatting issues. It's not uncommon in a WYSIWYG editor to be typing along, then come across a differently formatted section, and then waste 5-10 minutes getting it to look right. A WYSIWYM editor encourages you to produce all the content first, and only after you have completed that do you spend time tweaking the document to make it look right. It's this last point which is probably the reason for any conscious decision not to let you see the final formatted document while you're editing it. The creator of the WYSIWYM tool doesn't want you to get "distracted" by formatting while in the content-production phase. (There are also technical reasons - most WYSIWYM systems like LaTeX rely on batch-process toolchains, where it's computationally intensive to reformat a document, so you're not going to get the instant feedback you get in a WYSIWYG editor. But, of course, the reason for that batch design is the "separation of presentation and content" philosophy of the WYSIWYM tool creators.) But you're correct in that you could have a semantic markup editor with instant formatting feedback, though "purists" may not think of it as WYSIWYM, as that term is specifically a counter-reaction to WYSIWYG. -- 174.21.233.249 (talk) 00:58, 5 September 2010 (UTC)[reply]
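A minimal LaTeX sketch of that "single location" idea (the \chapterhead command is invented here for illustration; a real document would more likely restyle \section via a package such as titlesec):

    \documentclass{article}
    % Define the look of a chapter heading in ONE place.
    % Editing this single definition restyles every heading in the document.
    \newcommand{\chapterhead}[1]{%
      {\itshape\sffamily\fontsize{24}{29}\selectfont #1\par}}

    \begin{document}
    \chapterhead{Chapter One}
    Some text...
    \chapterhead{Chapter Two}
    More text...
    \end{document}

To switch every heading to bold 20 pt serif, you change only the \newcommand line.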
Mathematical notation is easier to read in its rendered form (Γ ⊢ s₀,ᵢ ∈ r  f(i,s) ⇒ Γ ⊢ f·s ∈ 𝕏), but it's way easier to write and edit the LaTeX \Gamma \vdash s_{0,i} \in r \quad f(i,s) \Rightarrow \Gamma \vdash f \cdot s \in \mathbb{X} than to muck around with a symbol map and click around in an equation editor. You can even write macros to simplify your textual notation, which makes life easier when you decide to adjust your notation. Paul (Stansifer) 03:56, 5 September 2010 (UTC)[reply]
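For instance, a minimal sketch of the macro idea (the \entails macro name is invented for illustration):

    \documentclass{article}
    \usepackage{amssymb}  % for \mathbb

    % Name the judgement notation once; changing this one line changes it everywhere.
    \newcommand{\entails}[2]{#1 \vdash #2}

    \begin{document}
    \[
    \entails{\Gamma}{s_{0,i} \in r} \quad f(i,s)
      \Rightarrow \entails{\Gamma}{f \cdot s \in \mathbb{X}}
    \]
    \end{document}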

national anthem

I noticed that most national anthems are happy (duh). Are there any sad ones? By sad I don't mean violent, because violent can well be jubilant (e.g., in La Marseillaise, they sing happily about how their enemies' blood will water their fields). Thanks. 76.230.146.8 (talk) 15:23, 4 September 2010 (UTC)[reply]

How about India's? It's actually in Bengali, written by Tagore. It's not sad, but it's boring. That's worse than being sad, no?  Jon Ascton  (talk) 07:01, 5 September 2010 (UTC)[reply]
I'd like to hear of one. Hatikvah isn't really a candidate, because as our article says: The harmony of Hatikvah is arranged modally and mostly follows a minor scale, which is often perceived as mournful in tone and is rarely encountered in national anthems. However, as the title "The Hope" and the words suggest, the import of the song is optimistic and the overall spirit uplifting. Marnanel (talk) 15:30, 4 September 2010 (UTC)[reply]
The Japanese Kimi ga yo is pretty slow and with prolonged vocals. I'm not sure if that counts as "sad", but there you have it. TomorrowTime (talk) 15:47, 4 September 2010 (UTC)[reply]
Well, in that case, the UK's God Save The Queen would be a candidate - it's not sad, just very boring (my opinion). I personally find Kimi Ga Yo to be a bit more interesting. --KägeTorä - (影虎) (TALK) 16:01, 4 September 2010 (UTC)[reply]
National anthems are (as a rule) a pro-nation celebration of the nation's existence. You might find some different emotional tones (for instance the American national anthem is a bit whiny, IMO), but you're not going to find one that's sad - who'd choose a mournful anthem to celebrate their nation? Club music is danceable, rock music is loud, folk music is acoustic, and anthems are uplifting - that's the genre. --Ludwigs2 16:17, 4 September 2010 (UTC)[reply]
"Whiny"? Have you read the verse where they mock the fleeing British military? ←Baseball Bugs What's up, Doc? carrots→ 19:30, 4 September 2010 (UTC)[reply]
"Shhh! Be vewy, vewy quiet. We're fishing for wabbits!" 87.81.230.195 (talk) 22:32, 4 September 2010 (UTC)[reply]
Rabbits are primarily land-dwellers. You might be thinking of beaver. Better head to your local singles bar and check them out. ←Baseball Bugs What's up, Doc? carrots→ 22:46, 4 September 2010 (UTC)[reply]
(Tell it to President Carter!) APL (talk) 23:56, 4 September 2010 (UTC)[reply]
Hungary's anthem, "Himnusz", isn't too cheerful. And the lyrics get more miserable as it goes on - the last verse starts "Pity, God, the Magyar, then...". Warofdreams talk 19:38, 4 September 2010 (UTC)[reply]
I've always wondered about the Moroccan anthem, the "Hymne Chérifien". The lyrics are very optimistic, but the song is in minor and, though perhaps not sad, the music does sound grim. ---Sluzzelin talk 22:55, 4 September 2010 (UTC)[reply]
I'd imagine that to be rare at best, for reasons cited above. I suppose you could generate this with a historical perspective - the English translation of the Soviet anthem began 'Unbreakable Union of freeborn Republics/Great Russia has welded forever to stand!'. Bright, self-confident, optimistic, arguably every single claim in those two lines proven false. For those who believed in it, and witnessed its destruction, the benefit of hindsight must make that at least slightly bitter. --Saalstin (talk) 13:24, 5 September 2010 (UTC)[reply]

If Google is owned by two Jews, why does it include links to antisemitic and racist websites?

It seems a puzzle. Can anyone explain it?--SuperFeminineState (talk) 16:12, 4 September 2010 (UTC)[reply]

Because it doesn't censor its results, more or less. Because believing in freedom of speech means tolerating opinions you don't agree with. See their explanation that comes up when you search for things like "Jew", as well. --Mr.98 (talk) 18:16, 4 September 2010 (UTC)[reply]
Google's great claim is its credibility: that it uses complicated distortion-suppressing heuristics to pick search results and doesn't stack its answers (as do almost all other search engines) to favor paid advertisers or a particular government. If it started doing that, apart from the principles involved (allowing users to conduct their own research and reach their own conclusions, as well as fighting censorship), it would in fact feed and give a spurious credibility to the conspiracy theories of such sites. —— Shakescene (talk) 05:15, 5 September 2010 (UTC)[reply]


This may not be an appropriate answer for the RD, but it is nevertheless a humorous anecdote pointing to Jewish creativity. Those with an antisemitic bent of mind may even find it offensive! Read at your own risk.
Google is here to tell you about everything, OK? Historically, Jews had the God-given responsibility to tell us everything. Believe me, we'd be lost without them. First came Moses, a Jew. There was no Google then, so when we questioned him, "Where is everything?", he pointed to the sky. Then came Jesus, a Jew. When we asked him where everything was, he pointed a little lower, i.e. to his heart. Then came Karl Marx, a Jew. We asked him the same question; he pointed towards his stomach. Then comes another Jew, Sigmund Freud. When questioned where his everything was, he pointed even lower than Marx did, that is, to his you-know-what. Then comes the greatest, the most intelligent Jew ever, Einstein. He didn't point anywhere but answered, "Everything's relative!"
You see, now we have their Google to tell us where everything is, and it works.  Jon Ascton  (talk) 02:30, 5 September 2010 (UTC)[reply]
And where did Groucho Marx point? APL (talk) 08:06, 5 September 2010 (UTC)[reply]
Peter Ustinov put it simply: "The Jews are a wonderful people. They produced Jesus Christ and Karl Marx, two of the most influential men in history. And they had the sense to believe neither of them." (unsourced) Cuddlyable3 (talk) 13:57, 5 September 2010 (UTC)[reply]
Cuddlyable3, you of all people would appreciate knowing the correct Ustinov quote is: "I believe that the Jews have made a contribution to the human condition out of all proportion to their numbers: I believe them to be an immense people. Not only have they supplied the world with two leaders of the stature of Jesus Christ and Karl Marx, but they have even indulged in the luxury of following neither one nor the other." [1]. Cheers. -- Jack of Oz ... speak! ... 19:56, 5 September 2010 (UTC)[reply]
OK, you win the prize for "most nonsensical and useless answer of the day." Congrats. --Mr.98 (talk) 03:03, 5 September 2010 (UTC)[reply]
Mr.98 calls my response the "most nonsensical and useless answer of the day". Isn't that a bit wicked?  Jon Ascton  (talk)

Theory of knowledge

G'day all! I'm taking a TOK class as a graduation requirement. Some of the stuff we're examining (the professor stresses that we not say "learning about" because that suggests rote memorization) implies that you should know it before doing any other kind of learning, so you can look at it critically and not just take everything at face value. Then why don't schools teach TOK in elementary school instead of all the way up at the end of high school? It doesn't really require any prerequisites other than an open mind, which elementary schoolers have better than high-schoolers, I would imagine. 99.13.223.146 (talk) 17:48, 4 September 2010 (UTC)[reply]

Here is a long response that isn't entirely specific to your question, which has a lot of different dimensions, but perhaps will raise some of the "theoretical"/non-practical (e.g., not related to how curriculum choice actually works) issues.
In my experience there are generally two major approaches to what to do about things like this. One is to say, "teach them a bunch of facts and basic stuff first, then when they have gotten all that more or less down, teach them how the whole edifice that said facts are built up on is somewhat more pliable than they've been led to believe." The other says, "teach them that facts and theories are pliable first, and they'll learn things more deeply from that point forward." There are legitimate arguments on each side.
Just as an example, when I was in high school, "conceptual math" was all the rage. "Teach them the basic concepts, don't make them do rote memorization!" Well, it sounds nice on paper, but I and most of my classmates, from what I gather, now know some basic concepts in an OK way but have a really hard time actually trying to apply them, because a lot of those "rote memorization" steps are really just about training your mind to do certain types of common problems. I vaguely remember how matrices are supposed to operate, for example, from a fundamental level, but could I do matrix multiplication without looking up how to do it again? Not really. Can I quickly divide and multiply numbers in my head? No. This is anecdotal, of course, and perhaps on the whole people who do these kinds of programs do better than those who do the "traditional" approach, but I remain a bit skeptical. I don't think it served me well, in any case. I think if I had done something a little more "disciplined," I'd have developed more of the patterns that would keep me from getting hung up on the really basic math stuff, and would have let me branch off more intelligently later on. As it is, I'm basically not numerate, and find even calculating a 15% tip in my head very frustrating. Obviously the school doesn't take all of the blame for this, but I'm not sure a focus on "concepts" was the right path for me (I'm the kind of student that finds "concepts" ridiculously easy, and "implementation" quite hard, so I skirted through high school math, and then hit a total brick wall when I got to college).
Another anecdote. At one university I was at, they teach history by basically throwing facts and memorization at you for the first couple of years, and then gradually making the classes more specific and heady as you go through the program, and finally you end up learning methodology in your very last class. That is, you start by learning "just historical facts," you end by learning "how historical facts are made." At another university I was at, they try to start with "how historical facts are made" before you actually have any "historical facts" given to you. Then presumably once you know how they are made, you go out and learn a bunch of new facts. My experience is that the history undergraduates at the first university do better, because learning "the facts" is more about getting the baseline information in place that you'll later be able to add to, modify, and poke holes in, as you accumulate more information and experience. (Having a basic feeling for what happened in World War II, for example, makes assimilating new facts, details, and arguments about World War II easier.) At the second university, the students often forget the theory/methodology stuff pretty quickly, because there isn't anything for it to "adhere" to, or they decide, "well, all facts are arbitrary and all historians are biased, so who is to say which is true?" Which is kind of a flip, 19-year-old response that doesn't really encourage taking the "learning the facts" stuff seriously. I'm sure there are exceptional students who really benefit from the latter case, but my general inclination is to think that the great majority benefit from the former. Again, just anecdotal. Not data.
All of which is to say... I think there could be two very different philosophies at play here. One is, "teach the theory of knowledge first, and the knowledge second," and the other is the exact opposite. My personal experience leads me to think that teaching the facts first, and the theory second, works better (and is more fun, anyway — who doesn't like to find out, when they get to college, that everything they've learned is somewhat false?). But there are good arguments on the other side of it as well, and plenty of room for honest disagreement.
I've always thought basic psychological biases should be taught earlier on than they are. A book like How We Know What Isn't So should probably be required high school reading for all students, I think, because it's about the limits of human knowledge in a very practical, "how not to get suckered and duped" kind of way, without tying it up with too much abstract philosophy, and roots everything in common human experiences and familiar human cognitive biases (e.g. belief in lucky numbers, "hot streaks" of sports players, etc.). --Mr.98 (talk) 18:29, 4 September 2010 (UTC)[reply]


(e/c) TOK = Theory of Knowledge, yes? And I'm interested - you say you're in high school, but high schools do not normally get into theory of knowledge stuff, nor do high schools generally call teachers 'professor' - color me a bit confused.
at any rate, part of the problem you're dealing with is the education model that's been in place in the US (and most English speaking countries) since the 1950s - it's a corrupted Dewey model. Basically what's happened is that high schools and grammar schools have adopted the idea that they are preparing students for entry into practical life, rather than training students in the rudiments of higher thought: this means that the schools focus on cramming both technical information and certain rules of self-discipline and obedience into students - the kind of material that will prepare students for work-a-day blue-collar or low-level white-collar employment. Critical thinking skills are not generally considered necessary for such jobs, and are sometimes viewed as an impediment. Private schools (which I assume you are attending) are usually more attentive to critical thinking skills because they are aiming to place their students in upper-level professional or academic positions, but private schools generally don't extend below 8th (or sometimes 6th) grade, and are obligated to meet standards imposed by government bodies (which standards are usually based on the corrupted Dewey model).
really, we should be teaching children independent critical thought down as low as 4th/5th grade (that's when you first start getting students capable of formal operational thought), but... --Ludwigs2 18:41, 4 September 2010 (UTC)[reply]

Passport types

Machine-readable passport says that the second character of the machine-readable zone is used, by certain countries, to indicate the document "type (for countries that distinguish between different types of passports)". Is there a uniform list of such codes, or are these codes defined by each issuing country? Can the codes used for passports (or other travel documents) be found online (I am especially interested in Austrian documents) or elsewhere? Apokrif (talk) 18:38, 4 September 2010 (UTC)[reply]

The passports I've seen in the UK all have a "more-than sign" as the second letter - is this helpful? MacOfJesus (talk) 21:51, 4 September 2010 (UTC)[reply]
The less-than sign (<) is used as a filler character in machine-readable passports. (It's used in place of a blank, because the symbols are readily countable for manual entry and automatic character recognition.) It appears that a filler character is widely used to mean 'regular passport', based on Bernard's experience with UK passports, the image of a U.S. passport page in our article, plus the output from Google Image search for a handful of other countries: Israel, Kazakhstan, Australia, Slovenia. The same appears to be true for Austria. Unfortunately, I haven't been able to locate images of (or data regarding) the codes used to indicate diplomatic, government, military, emergency, or other passport classes. TenOfAllTrades(talk) 23:51, 4 September 2010 (UTC)[reply]
This Swiss emergency passport uses the letter "D" (but the picture is barely readable) and the MRZ on this Latvian travel document for stateless persons begins with "PB". On these US refugee travel documents [2], the first two characters are "TP" instead of "P something": does this mean that "T" is used instead of P for travel documents which are not passports ? But Machine-readable_passport#Official_Travel_Documents_.28e.g._identity_cards.29 says that besides P, the first character can be only I, A or C. Moreover, document 9303 (page 57) of the ICAO says that "In documents other than passports, e.g. [...]refugee travel document[...], the first character of the document code should be P." (as is the case on this Canadian travel document). Apokrif (talk) 12:16, 6 September 2010 (UTC)[reply]
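For anyone experimenting with these codes, a minimal Python sketch of reading the two characters in question (the sample line is ICAO's fictional "Utopia" specimen from Doc 9303; the interpretation comments reflect only the conventions discussed above, not an authoritative table):

    # Read the document code from the first line of a TD3 (passport-size) MRZ.
    # First character: broad document class ('P' for passports; per the thread,
    # 'I', 'A', 'C' for other documents, and apparently 'T' in some US cases).
    # Second character: an issuer-defined type letter, or '<', the filler
    # character, which in practice usually indicates a regular passport.
    def document_code(mrz_line1):
        return mrz_line1[0], mrz_line1[1]

    line1 = "P<UTOERIKSSON<<ANNA<MARIA<<<<<<<<<<<<<<<<<<<"
    cls, typ = document_code(line1)
    print(cls, "regular (filler)" if typ == "<" else "type " + typ)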

Disease in the age of exploration

When European explorers began to colonize the Americas, they introduced many diseases that wiped out countless native Americans. How come this didn't happen the other way around as well? I mean, there must have been loads of diseases that the Spanish, British and French had never come into contact with and thus weren't immune to. Why didn't they perish like the native Americans did? —Preceding unsigned comment added by 92.251.213.233 (talk) 23:15, 4 September 2010 (UTC)[reply]

For a good article on this topic, check out The Straight Dope: Why did so many Native Americans die of European diseases but not vice versa?
Short version: Europeans had been living in close contact with both farm animals and a wider assortment of foreigners, which gave them better immunity and deadlier diseases. (They did, of course, bring some diseases back with them, but those didn't have nearly as devastating effects.) APL (talk) 23:47, 4 September 2010 (UTC)[reply]
You might want to read Guns, Germs, and Steel. In general, it was because Europe was so full of packed-together people that they had gone through just about every disease nature could impose on them, whereas the peoples of the Americas lived in small, generally isolated groups that didn't pass diseases on to each other. Everard Proudfoot (talk) 23:49, 4 September 2010 (UTC)[reply]
However, see Syphilis#Origins. Everard Proudfoot (talk) 23:51, 4 September 2010 (UTC)[reply]
Jared Diamond's Guns, Germs, and Steel: The Fates of Human Societies 1997, will give you some perspective. An analogous question: when the Isthmus of Panama joined the island continent of South America to the world-continent, in the shape of North America, why did North American mammals overrun South America, triggering a wave of extinctions, whereas only armadillos and possums made the reverse trip? You may also be interested in Alfred Crosby's Ecological Imperialism: The Biological Expansion of Europe, 900-1900 (1986) and its concept of portmanteau fauna from the world-continent, introduced all over the formerly isolated parts of the planet.--Wetman (talk) 23:50, 4 September 2010 (UTC)[reply]
(ec) Well, it's likely there were diseases passed the other way. Syphilis is the classic case of a disease that is thought to have likely originated in the Americas, for example. (It's hard to know for sure; historical records are not so great on this point, and distinguishing a "new disease" from existing ones is a tricky matter, even today, certainly historically.) A broader question is why a Native American version of the common cold, though, never had the same effect. A few speculations of mine: 1. It's different when you're traveling to someplace versus traveling from. If there was a disease that killed Europeans with the same virulence that smallpox or the cold had on the Native Americans, it'd probably have killed them on the boat ride back. 2. Europeans may have had boosted immune systems anyway, from their practice of living in cities and their already increased means of travel. They likely would have been pre-exposed to diseases from Asia, Africa, other parts of Europe, and so on, by the time they were exploring the Americas. --Mr.98 (talk) 23:51, 4 September 2010 (UTC)[reply]
The main issue is that Native Americans went a long distance with a very limited number of people to get to the New World. Some figures, which I don't believe, have been as low as 20 founders, if I recall correctly. Because this was a long, slow migration through (most suppose) arctic conditions, a disease would have to linger in some person for many years in order to keep a foothold in the population, and animal hosts were largely unavailable. Oh, and remember that the Americas don't have other great apes besides humans.
There is an idea that syphilis rode with Columbus back to Europe, but it's not certain. Wnt (talk) 14:37, 9 September 2010 (UTC)[reply]