Wikipedia:WikiProject Conservatism



Welcome to WikiProject Conservatism! This is a friendly and fun place where editors can ask questions, meet new colleagues, and join A-Team collaborations to create prestigious, high-quality A-Class articles. Whether you're a newcomer or a regular, you'll receive encouragement and recognition for your achievements with conservatism-related articles. This project does not extol any point of view, political or otherwise, other than that of a neutral documentarian.

  • Have you thought about submitting your new article to "Did You Know"? It's the easiest and most fun way to get your creation on the Main Page. More info can be found in our guide "DYK For Newbies."
  • We're happy to assess your new article as well as developed articles. Make a request here.
  • Experienced editors may want to jump right in and join an A-Team. While A-Class is more rigorous than a Good Article, you don't have to deal with the lengthy backlog at GA. If you already have an article you would like to promote, you can post a request for co-nominators here.
  • Do you have a question? Just ask

Alerts

A broad collection of discussions that could lead to major changes to related articles

Did you know

Articles for deletion

Proposed deletions

Categories for discussion


Templates for discussion

Redirects for discussion

Good article nominees

Good article reassessments

Requested moves

Articles to be merged

Articles to be split

Articles for creation

Hot articles

A list of related articles with the most (recent) edits
392 edits Springfield, Ohio, cat-eating hoax
197 edits List of Donald Trump 2024 presidential campaign endorsements
109 edits Am I Racist?
99 edits 2024 Liberal Democratic Party (Japan) presidential election
93 edits Laura Loomer
49 edits Lauren Chen
39 edits Lily Tang Williams
27 edits Madrid Forum
26 edits United National Movement (Georgia)
25 edits Donald Trump 2024 presidential campaign

These are the articles that have been edited the most within the last seven days. Last updated 15 September 2024 by HotArticlesBot.

A list of related articles with the most (recent) views

This is a list of pages in the scope of Wikipedia:WikiProject Conservatism along with pageviews.

To report bugs, please write on the Community tech bot talk page on Meta.

List

Period: 2024-08-01 to 2024-08-31

Total views: 57,224,031

Updated: 22:46, 5 September 2024 (UTC)

Rank Page title Views Daily average Assessment Importance
1 Donald Trump 1,545,154 49,843 B High
2 Project 2025 1,453,232 46,878 B High
3 Ronald Reagan 561,622 18,116 FA Top
4 George W. Bush 499,537 16,114 GA High
5 Richard Nixon 450,871 14,544 FA High
6 Vladimir Putin 435,383 14,044 B High
7 George H. W. Bush 418,957 13,514 B High
8 Adam Kinzinger 398,020 12,839 B Low
9 Theodore Roosevelt 363,726 11,733 B High
10 Family of Donald Trump 353,705 11,409 B Low
11 Ann Coulter 348,972 11,257 B Mid
12 Republican Party (United States) 347,720 11,216 B Top
13 Attempted assassination of Donald Trump 337,580 10,889 B Low
14 Gerald Ford 327,327 10,558 C High
15 Winston Churchill 320,452 10,337 GA Top
16 Dwight D. Eisenhower 304,825 9,833 B High
17 John McCain 291,052 9,388 FA Mid
18 Benjamin Netanyahu 289,481 9,338 B Mid
19 Candace Owens 266,465 8,595 B Low
20 Red states and blue states 249,015 8,032 C Mid
21 Zionism 245,844 7,930 B Low
22 Anna Paulina Luna 243,770 7,863 B Low
23 Rishi Sunak 239,937 7,739 B High
24 Jordan Peterson 218,393 7,044 C Low
25 Dick Cheney 212,937 6,868 GA Mid
26 Jon Voight 203,384 6,560 C Low
27 John Wayne 200,188 6,457 B Low
28 Mike Pence 199,718 6,442 B Mid
29 Chuck Norris 197,934 6,384 B Low
30 Margaret Thatcher 197,697 6,377 A Top
31 1964 United States presidential election 189,248 6,104 C Mid
32 Laura Ingraham 188,719 6,087 C Mid
33 Cold War 185,345 5,978 C Top
34 Nick Fuentes 184,201 5,941 B Low
35 Sarah Palin 184,006 5,935 C Mid
36 Far-right politics 183,688 5,925 B Low
37 George Santos 177,794 5,735 B Low
38 Agenda 47 174,814 5,639 C Top
39 Nikki Haley 173,138 5,585 B Low
40 Mitt Romney 171,505 5,532 FA High
41 Fyodor Dostoevsky 165,325 5,333 B Low
42 Ben Shapiro 164,303 5,300 C Mid
43 Charlie Kirk 163,185 5,264 C Low
44 Stephanie Grisham 162,471 5,241 C Low
45 Robert Duvall 160,021 5,161 B Low
46 Bharatiya Janata Party 157,147 5,069 GA High
47 Liz Truss 155,970 5,031 FA Mid
48 Tucker Carlson 153,833 4,962 B High
49 Rupert Murdoch 153,772 4,960 B Low
50 Herbert Hoover 153,261 4,943 B Mid
51 Stephen Baldwin 153,214 4,942 B Low
52 Grover Cleveland 151,768 4,895 FA Mid
53 Bangladesh Nationalist Party 151,491 4,886 C High
54 Kellyanne Conway 147,753 4,766 B Low
55 Taliban 147,145 4,746 B High
56 James Caan 145,694 4,699 C Low
57 Dan Quayle 144,805 4,671 B Mid
58 William McKinley 143,687 4,635 FA Low
59 QAnon 143,605 4,632 GA Mid
60 The Heritage Foundation 141,435 4,562 B High
61 Kelsey Grammer 141,140 4,552 B Low
62 Imran Khan 140,627 4,536 B Low
63 James A. Garfield 138,366 4,463 FA Low
64 Nigel Farage 137,877 4,447 B Mid
65 Calvin Coolidge 134,452 4,337 FA High
66 Warren G. Harding 133,003 4,290 FA Low
67 Kari Lake 132,714 4,281 C Low
68 John Malkovich 132,423 4,271 C Low
69 James Stewart 132,025 4,258 GA Low
70 Douglas Murray (author) 131,951 4,256 C Low
71 Chester A. Arthur 131,602 4,245 FA Low
72 Laura Loomer 128,025 4,129 C Low
73 Boris Johnson 126,814 4,090 B High
74 Anthony Scaramucci 125,712 4,055 C Low
75 Marjorie Taylor Greene 124,135 4,004 GA Low
76 Shinzo Abe 124,029 4,000 B Mid
77 Riley Gaines 123,195 3,974 B Unknown
78 Donald Trump 2024 presidential campaign 122,975 3,966 B Low
79 Matt Walsh (political commentator) 122,444 3,949 C Low
80 Constitution of the United States 121,606 3,922 B High
81 Lauren Boebert 120,039 3,872 B Low
82 Ayn Rand 118,121 3,810 GA Mid
83 William Howard Taft 116,893 3,770 FA Mid
84 Shirley Temple 115,720 3,732 B Low
85 Ted Cruz 115,131 3,713 B Mid
86 Condoleezza Rice 113,663 3,666 B Mid
87 Spiro Agnew 113,350 3,656 FA Mid
88 Clark Gable 112,149 3,617 B Low
89 Charles de Gaulle 110,253 3,556 B Mid
90 George Wallace 110,028 3,549 B Mid
91 James Woods 108,432 3,497 Start Low
92 Francisco Franco 108,415 3,497 C Mid
93 Chiang Kai-shek 107,578 3,470 C Low
94 Muhammad Ali Jinnah 106,432 3,433 FA High
95 Javier Milei 105,755 3,411 B Mid
96 Greg Gutfeld 105,710 3,410 C Low
97 Mary Matalin 103,283 3,331 C Low
98 Mike Lindell 103,126 3,326 C Low
99 Anders Behring Breivik 101,899 3,287 C Low
100 Milo Yiannopoulos 101,880 3,286 C Low
101 Libertarianism 101,553 3,275 B High
102 Ben Carson 101,537 3,275 C Low
103 Brett Cooper (commentator) 101,270 3,266 Stub Low
104 John Kennedy (Louisiana politician) 100,077 3,228 C Low
105 Bo Derek 99,937 3,223 Start Low
106 Falun Gong 99,499 3,209 B Mid
107 Lindsey Graham 98,055 3,163 C Low
108 Charlton Heston 97,854 3,156 B Low
109 Conservative Party (UK) 97,514 3,145 B High
110 Lara Trump 97,420 3,142 C Low
111 Mitch McConnell 97,343 3,140 B Mid
112 Gadsden flag 96,267 3,105 B Low
113 Ron DeSantis 95,189 3,070 B Mid
114 Nancy Reagan 94,769 3,057 B Mid
115 Clarence Thomas 94,612 3,052 B Mid
116 Mike Johnson 93,832 3,026 C Low
117 Neoliberalism 93,777 3,025 B Top
118 Dan Crenshaw 92,584 2,986 B Low
119 Paul Ryan 91,353 2,946 C Mid
120 Groypers 91,279 2,944 B Low
121 Benjamin Harrison 91,185 2,941 FA Low
122 Jeanine Pirro 89,968 2,902 B Low
123 Recep Tayyip Erdoğan 88,917 2,868 B High
124 Roger Stone 88,877 2,867 C Low
125 Scott Baio 88,660 2,860 Start Low
126 Patricia Heaton 88,524 2,855 C Low
127 Whig Party (United States) 87,179 2,812 C Low
128 Dave Mustaine 86,817 2,800 C Low
129 Otto von Bismarck 86,569 2,792 B High
130 Tradwife 86,449 2,788 B Low
131 Reform UK 86,448 2,788 C High
132 Sheldon Adelson 86,281 2,783 C Low
133 Fox News 86,262 2,782 C Mid
134 Thomas Sowell 85,223 2,749 C Mid
135 Rudy Giuliani 84,497 2,725 B Mid
136 Victoria Jackson 83,579 2,696 Start Low
137 David Duke 83,253 2,685 B Mid
138 Geoff Duncan 82,905 2,674 Start Unknown
139 Left–right political spectrum 82,834 2,672 C Top
140 Ron Paul 82,386 2,657 C Mid
141 Chris LaCivita 81,786 2,638 C Low
142 Enoch Powell 80,693 2,603 B High
143 Fred MacMurray 80,667 2,602 C Low
144 Rivers of Blood speech 80,667 2,602 C Low
145 Angela Merkel 80,502 2,596 GA High
146 Nancy Mace 80,360 2,592 B Low
147 Corey Lewandowski 80,084 2,583 C Low
148 House of Bourbon 79,965 2,579 B High
149 Paul von Hindenburg 79,678 2,570 C Mid
150 False or misleading statements by Donald Trump 79,663 2,569 B Low
151 Neil Gorsuch 79,612 2,568 B Mid
152 John Locke 79,606 2,567 C Top
153 Right-wing politics 79,387 2,560 C Top
154 Dana Perino 79,222 2,555 C Low
155 Dmitry Medvedev 79,171 2,553 C High
156 Bing Crosby 78,463 2,531 B Low
157 Deng Xiaoping 78,241 2,523 B Low
158 Barry Goldwater 77,249 2,491 B High
159 Kayleigh McEnany 77,075 2,486 C Low
160 Bob Dole 76,422 2,465 B Low
161 David Cameron 75,523 2,436 B Top
162 Arthur Wellesley, 1st Duke of Wellington 75,450 2,433 B Low
163 Charles Lindbergh 75,290 2,428 B Low
164 Gary Sinise 74,679 2,409 C Low
165 Karl Malone 74,555 2,405 Start Low
166 Liz Cheney 73,742 2,378 B High
167 Alternative for Germany 73,114 2,358 C Low
168 Jane Russell 72,946 2,353 B Low
169 Iran–Contra affair 71,835 2,317 GA Low
170 Strom Thurmond 71,805 2,316 B Mid
171 Joe Kent 71,615 2,310 C Low
172 Johann Wolfgang von Goethe 71,224 2,297 B Low
173 Tim Scott 71,043 2,291 C Low
174 Stephen Miller (political advisor) 70,887 2,286 Start Low
175 Steve Bannon 70,815 2,284 B Mid
176 Dinesh D'Souza 70,415 2,271 B Mid
177 1924 United States presidential election 70,389 2,270 C Low
178 Neoconservatism 70,263 2,266 C Top
179 Jeb Bush 69,270 2,234 B Low
180 Milton Friedman 67,789 2,186 GA High
181 Rumble (company) 67,785 2,186 Start Low
182 Sarah Huckabee Sanders 67,747 2,185 C Low
183 Rutherford B. Hayes 67,499 2,177 FA Low
184 Truth Social 67,019 2,161 B Low
185 Kristi Noem 66,821 2,155 B Low
186 Gary Cooper 66,745 2,153 FA Mid
187 Angie Harmon 66,265 2,137 C Low
188 Theresa May 66,150 2,133 B Mid
189 Conservatism 66,086 2,131 B Top
190 Reform Party of the United States of America 66,075 2,131 C Low
191 Kevin McCarthy 65,815 2,123 C Low
192 Dan Bongino 65,424 2,110 C Unknown
193 Mike Gabbard 65,366 2,108 Start Unknown
194 Blaire White 64,470 2,079 Start Low
195 Chick-fil-A 64,428 2,078 C Low
196 John Roberts 64,416 2,077 B High
197 Viktor Orbán 64,288 2,073 C Mid
198 Proud Boys 64,283 2,073 C Low
199 Matt Gaetz 64,229 2,071 C Low
200 Laura Bush 63,724 2,055 GA Low
201 Rick Scott 63,093 2,035 C Low
202 Michael Steele 62,719 2,023 B Low
203 Ginger Rogers 62,361 2,011 C Unknown
204 Craig T. Nelson 62,196 2,006 Start Unknown
205 Greg Abbott 62,097 2,003 B Mid
206 Rush Limbaugh 61,954 1,998 B High
207 Jack Kemp 61,804 1,993 GA Mid
208 Tom Clancy 61,779 1,992 C Low
209 Barbara Stanwyck 61,764 1,992 B Low
210 T. S. Eliot 61,704 1,990 B Low
211 Rasmussen Reports 61,358 1,979 Start Low
212 McCarthyism 61,224 1,974 C High
213 Newt Gingrich 60,763 1,960 GA High
214 Bob Hope 59,701 1,925 B Low
215 Ted Nugent 59,406 1,916 C Low
216 John Major 59,311 1,913 B High
217 Jair Bolsonaro 59,070 1,905 B Mid
218 Brett Kavanaugh 58,057 1,872 B High
219 Sean Hannity 58,011 1,871 B Mid
220 Linda McMahon 57,902 1,867 B Low
221 Capitalism 57,284 1,847 C Top
222 Jatiya Party (Ershad) 57,154 1,843 C Mid
223 Unification Church 56,959 1,837 B Unknown
224 Anthony Eden 56,948 1,837 B Mid
225 Donald Rumsfeld 56,946 1,836 B Mid
226 Byron Donalds 56,828 1,833 C Low
227 Neville Chamberlain 56,641 1,827 FA Mid
228 Redneck 56,441 1,820 C Low
229 Make America Great Again 56,285 1,815 C Low
230 Bill O'Reilly (political commentator) 56,235 1,814 B Mid
231 The Epoch Times 55,604 1,793 B Low
232 Denis Leary 55,422 1,787 C NA
233 Daily Mail 55,219 1,781 B Mid
234 Amy Coney Barrett 54,961 1,772 C Low
235 Opus Dei 54,799 1,767 C Mid
236 Melissa Joan Hart 54,642 1,762 B Low
237 White supremacy 54,160 1,747 B Low
238 Orange County, California 54,088 1,744 B Mid
239 The Gateway Pundit 53,483 1,725 C Unknown
240 Lil Pump 53,338 1,720 B Low
241 Pat Sajak 52,839 1,704 C Low
242 Jacobitism 52,520 1,694 B High
243 Tom Cotton 52,500 1,693 C Low
244 Presidency of Donald Trump 52,259 1,685 B Low
245 John Layfield 52,150 1,682 B Low
246 Liberal Democratic Party (Japan) 51,685 1,667 C High
247 Lee Hsien Loong 51,337 1,656 C Mid
248 Pat Buchanan 51,221 1,652 B Mid
249 Meghan McCain 50,905 1,642 C Low
250 Rachel Campos-Duffy 50,472 1,628 Start Low
251 The Daily Wire 50,422 1,626 C Low
252 Kelly Ayotte 50,360 1,624 C Low
253 Rand Paul 50,354 1,624 GA Mid
254 Ben Sasse 49,936 1,610 B Low
255 The Wall Street Journal 49,828 1,607 B Mid
256 Mullah Omar 49,812 1,606 B High
257 Tim Pawlenty 49,639 1,601 B Mid
258 Federalist Party 49,113 1,584 C Low
259 Nicolas Sarkozy 49,046 1,582 B High
260 Shiv Sena 49,020 1,581 C Unknown
261 Mahathir Mohamad 48,852 1,575 GA High
262 Tomi Lahren 48,154 1,553 Start Low
263 Larry Hogan 47,627 1,536 B Low
264 David Mamet 47,268 1,524 C Low
265 Stacey Dash 47,175 1,521 C Low
266 Edward Teller 47,032 1,517 FA Low
267 Trumpism 46,964 1,514 B Mid
268 Michael Farmer, Baron Farmer 46,833 1,510 C Low
269 Rick Wilson (political consultant) 46,792 1,509 Stub Low
270 Benjamin Disraeli 46,150 1,488 FA Top
271 Muhammad Zia-ul-Haq 46,051 1,485 B High
272 Patriotic Alternative 45,977 1,483 C Low
273 United Russia 45,909 1,480 B High
274 Pat Boone 45,755 1,475 C Low
275 Last Man Standing (American TV series) 45,755 1,475 B Low
276 Likud 45,641 1,472 C Low
277 Victor Davis Hanson 45,486 1,467 B Mid
278 Trump derangement syndrome 45,467 1,466 C Mid
279 Terri Schiavo case 45,378 1,463 GA Low
280 Conservatism in the United States 45,201 1,458 B Top
281 Priti Patel 44,558 1,437 C Unknown
282 Edward Heath 44,508 1,435 B High
283 Liberty University 44,489 1,435 B Mid
284 Tea Party movement 44,486 1,435 C Mid
285 Don King 44,068 1,421 B Low
286 William F. Buckley Jr. 43,823 1,413 C High
287 Dave Ramsey 43,821 1,413 C Unknown
288 Manosphere 43,768 1,411 Start Low
289 National Rally 43,741 1,411 GA High
290 Elisabeth Hasselbeck 43,558 1,405 C Low
291 John C. Calhoun 43,543 1,404 FA Low
292 John Rocker 43,405 1,400 C Unknown
293 Nawaz Sharif 43,158 1,392 B Unknown
294 Right-wing populism 42,697 1,377 C Low
295 Boogaloo movement 42,678 1,376 B Low
296 Harold Macmillan 42,665 1,376 B High
297 Loretta Young 42,629 1,375 C Low
298 Ustaše 42,621 1,374 C High
299 Antonin Scalia 42,169 1,360 FA High
300 The Daily Telegraph 42,169 1,360 C Low
301 Patriots for Europe 42,070 1,357 C Low
302 Jemima Goldsmith 42,029 1,355 C Unknown
303 Thomas Massie 41,167 1,327 B Low
304 Chris Christie 41,141 1,327 C Low
305 Mike Huckabee 40,957 1,321 B Mid
306 Park Chung Hee 40,805 1,316 C Low
307 2016 Republican Party presidential primaries 40,592 1,309 B Mid
308 The Times of India 40,568 1,308 C Mid
309 Fred Thompson 40,495 1,306 B Low
310 Jackson Hinkle 40,154 1,295 B Low
311 Marco Rubio 40,049 1,291 B Mid
312 Chuck Grassley 39,917 1,287 C Mid
313 Conservative Party of Canada 39,749 1,282 B High
314 John Birch Society 39,560 1,276 C Low
315 Booker T. Washington 39,530 1,275 B Low
316 Jacob Rees-Mogg 39,309 1,268 C Low
317 Libs of TikTok 39,242 1,265 C Low
318 Laissez-faire 38,908 1,255 C Top
319 Scott Jensen (Minnesota politician) 38,907 1,255 Start Unknown
320 John Warner 38,668 1,247 C Low
321 Betsy DeVos 38,617 1,245 C Mid
322 Walter Brennan 38,543 1,243 C Low
323 New York Post 38,531 1,242 C Low
324 Blue Dog Coalition 38,344 1,236 C Low
325 Samuel Alito 38,308 1,235 C Mid
326 Martin Heidegger 37,935 1,223 C Low
327 Frank Luntz 37,859 1,221 B Low
328 Bourbon Restoration in France 37,748 1,217 C High
329 Tudor Dixon 37,718 1,216 B Low
330 Karen Pence 37,573 1,212 C Low
331 Michael Reagan 37,440 1,207 C Low
332 Alt-right 37,357 1,205 C Mid
333 The Fountainhead 37,315 1,203 FA Low
334 James Cagney 37,302 1,203 GA Low
335 Bible Belt 37,038 1,194 C Low
336 Marsha Blackburn 36,945 1,191 C Low
337 Suella Braverman 36,813 1,187 C Low
338 Classical liberalism 36,788 1,186 B Top
339 Roger Ailes 36,599 1,180 C Mid
340 William Barr 36,538 1,178 B Unknown
341 Christopher Luxon 36,202 1,167 B Unknown
342 Jim Hagedorn 36,155 1,166 Start Low
343 Tricia Nixon Cox 36,027 1,162 Start Low
344 2024 Liberal Democratic Party (Japan) presidential election 35,993 1,161 C Unknown
345 John Bolton 35,894 1,157 C Mid
346 Edmund Burke 35,873 1,157 B Top
347 Honoré de Balzac 35,651 1,150 FA High
348 Dark Enlightenment 35,629 1,149 Start Mid
349 Gary Johnson 35,600 1,148 GA High
350 Gretchen Whitmer kidnapping plot 35,311 1,139 C Low
351 Robert Davi 35,310 1,139 Start Low
352 Joe Arpaio 35,200 1,135 B Low
353 InfoWars 35,177 1,134 C Low
354 Jim Jordan 35,162 1,134 B Low
355 Stephen Harper 35,142 1,133 GA High
356 Oliver North 35,065 1,131 C Mid
357 Johnny Ramone 35,027 1,129 C Low
358 Gavin McInnes 34,857 1,124 C Low
359 Tommy Tuberville 34,737 1,120 B Low
360 Michael Waltz 34,653 1,117 Start Low
361 Mark Rutte 34,416 1,110 C High
362 Glenn Beck 34,279 1,105 B Mid
363 Julius Evola 34,128 1,100 B Low
364 Mike Lee 34,043 1,098 C Low
365 Mark Levin 33,927 1,094 Start High
366 Thomas Mann 33,911 1,093 C Mid
367 Jeff Sessions 33,814 1,090 Start Unknown
368 Éamon de Valera 33,583 1,083 B High
369 Tom Wolfe 33,369 1,076 B Low
370 Islamism 33,285 1,073 B High
371 UK Independence Party 33,195 1,070 B Low
372 John Boehner 33,158 1,069 Start High
373 Tom Tugendhat 32,951 1,062 B Low
374 Swift Vets and POWs for Truth 32,878 1,060 C Low
375 Elaine Chao 32,712 1,055 B Low
376 Madison Cawthorn 32,711 1,055 C Low
377 Political spectrum 32,656 1,053 C Top
378 Lisa Murkowski 32,422 1,045 C High
379 Nuclear family 32,408 1,045 Start Low
380 Franz von Papen 32,280 1,041 B Low
381 Barack Obama citizenship conspiracy theories 32,226 1,039 B Low
382 Arianna Huffington 32,079 1,034 B Low
383 Constitution Party (United States) 31,973 1,031 C Low
384 Southern strategy 31,945 1,030 B High
385 Republican National Committee 31,839 1,027 C Mid
386 Phil Robertson 31,559 1,018 C Low
387 American Independent Party 31,513 1,016 C Low
388 Rick Perry 31,239 1,007 B Mid
389 Tammy Bruce 30,988 999 Start Low
390 Marine Le Pen 30,898 996 B Low
391 Hillsdale College 30,867 995 C Low
392 White genocide conspiracy theory 30,854 995 B Low
393 Brothers of Italy 30,607 987 B Mid
394 Peter Hitchens 30,537 985 B Unknown
395 Drudge Report 30,418 981 B Mid
396 Charles Koch 30,336 978 B Low
397 Friedrich Hayek 30,072 970 B Top
398 Maria Butina 29,984 967 C Low
399 Gretchen Carlson 29,937 965 B Low
400 Europe of Sovereign Nations 29,914 964 C High
401 Turning Point USA 29,912 964 C Low
402 Christian Democratic Union of Germany 29,896 964 C High
403 S. E. Cupp 29,804 961 Start Unknown
404 Koch family 29,755 959 Start High
405 RealClearPolitics 29,729 959 C Mid
406 Lillian Gish 29,603 954 C Low
407 Catturd 29,558 953 C Low
408 Freedom Caucus 29,300 945 C Low
409 History of the Republican Party (United States) 29,172 941 B High
410 BC United 29,159 940 C Mid
411 Newsmax 29,135 939 Start Low
412 John Kasich 28,994 935 B Mid
413 David Koch 28,951 933 C Mid
414 James Cleverly 28,898 932 C Low
415 Cultural Marxism conspiracy theory 28,824 929 B Low
416 Presidency of Ronald Reagan 28,757 927 C High
417 Ronny Jackson 28,742 927 C Low
418 Fianna Fáil 28,525 920 B Low
419 Grey Wolves (organization) 28,384 915 B Mid
420 Mandate for Leadership 28,313 913 Start Low
421 Paul Dans 28,307 913 Start Low
422 John Nance Garner 28,200 909 Start High
423 Steven Crowder 28,055 905 C Mid
424 Rick Santorum 28,006 903 B Mid
425 Jacob Chansley 27,913 900 B Low
426 Tories (British political party) 27,896 899 C High
427 Alessandra Mussolini 27,865 898 B Unknown
428 Mike DeWine 27,830 897 B Low
429 Bill Kristol 27,803 896 B High
430 Dennis Miller 27,763 895 Start Low
431 Ward Bond 27,746 895 C Low
432 Elise Stefanik 27,733 894 B Low
433 White movement 27,685 893 B Mid
434 Samuel Taylor Coleridge 27,610 890 C Top
435 Reaganomics 27,543 888 B Mid
436 Charlie Crist 27,358 882 B Low
437 Breitbart News 27,300 880 C Mid
438 What Is a Woman? 27,196 877 B Low
439 Morgan Ortagus 27,036 872 C Unknown
440 Jamaat-e-Islami (Pakistan) 27,007 871 B Low
441 John Connally 26,991 870 B Mid
442 John O'Hurley 26,982 870 Start Low
443 Alpha and beta male 26,916 868 Start Low
444 Bobby Jindal 26,904 867 B Mid
445 Donald Trump 2000 presidential campaign 26,738 862 GA Mid
446 Profumo affair 26,622 858 FA Mid
447 William Rehnquist 26,594 857 B High
448 Frankfurt School 26,579 857 B Low
449 2020 Republican Party presidential primaries 26,554 856 B Mid
450 Kataeb Party 26,459 853 B Low
451 Economic policy of the Donald Trump administration 26,401 851 Start Low
452 Dave Rubin 26,393 851 C Low
453 First impeachment of Donald Trump 26,382 851 B High
454 The Daily Caller 26,357 850 C Mid
455 Natural law 26,341 849 C Top
456 Will Cain 26,340 849 Start Mid
457 Presidency of George W. Bush 26,278 847 C High
458 Herman Cain 26,242 846 C Mid
459 Matt Bevin 26,184 844 GA Low
460 Steele dossier 26,141 843 B Low
461 2012 Republican Party presidential primaries 26,093 841 B Mid
462 2008 California Proposition 8 26,085 841 B Mid
463 David Frum 26,079 841 C Low
464 Chuck Woolery 26,026 839 C Low
465 Social stratification 25,980 838 C High
466 Fine Gael 25,820 832 B High
467 Ben Stein 25,726 829 C Low
468 Orson Scott Card 25,635 826 B Low
469 Jesse Lee Peterson 25,611 826 Start Low
470 Dixiecrat 25,530 823 Start Mid
471 Justice and Development Party (Turkey) 25,468 821 B Low
472 Justin Amash 25,467 821 B Low
473 Alec Douglas-Home 25,372 818 FA Low
474 Kwasi Kwarteng 25,332 817 B Low
475 Michele Bachmann 25,287 815 B Mid
476 Phyllis Schlafly 25,268 815 B High
477 Lauren Southern 25,257 814 Start Mid
478 Rule of law 25,247 814 C Top
479 Blake Masters 25,189 812 C Unknown
480 Paleoconservatism 25,111 810 C Top
481 Ernst Jünger 25,083 809 Start Top
482 Jerry Falwell 25,059 808 B High
483 Louis B. Mayer 25,034 807 C Low
484 Fairness doctrine 24,958 805 C Mid
485 Moms for Liberty 24,827 800 B Low
486 Illegal immigration to the United States 24,803 800 B Low
487 Anti-communism 24,756 798 B Mid
488 People's Action Party 24,694 796 C Mid
489 Karl Rove 24,652 795 B Mid
490 Anita Bryant 24,641 794 B High
491 Liberal Party of Australia 24,613 793 C High
492 Lou Dobbs 24,427 787 C Mid
493 Lawrence B. Jones 24,419 787 Start Unknown
494 BioShock Infinite 24,416 787 B Low
495 Free Democratic Party (Germany) 24,395 786 C Mid
496 Katie Britt 24,247 782 C Low
497 Liaquat Ali Khan 24,233 781 B Low
498 Promised Land 24,119 778 C Low
499 Steve Schmidt 24,102 777 C Low
500 GypsyCrusader 24,035 775 C Low

Watchlist for today

A list of edits made to the popular articles above just today
For a watchlist going back several more days, visit: Wikipedia:WikiProject Conservatism/Recent changes



List of abbreviations (help):
  • D – Edit made at Wikidata
  • r – Edit flagged by ORES
  • N – New page
  • m – Minor edit
  • b – Bot edit
  • (±123) – Page byte size change

15 September 2024

New articles

A list of semi-related articles that were recently created

This list was generated from these rules. Questions and feedback are always welcome! The search runs daily and covers the most recent ~14 days of results. Note: some articles may not be relevant to this project.

Rules | Match log | Results page (for watching) | Last updated: 2024-09-14 20:04 (UTC)

Note: The list display can now be customized by each user. See List display personalization for details.

In The Signpost

July 2018
DISCUSSION REPORT
WikiProject Conservatism Comes Under Fire

By Lionelt

WikiProject Conservatism was a topic of discussion at the Administrators' Noticeboard/Incidents (AN/I). Objective3000 started a thread in which he expressed concern regarding the number of RFC notices posted on the Discussion page, suggesting that such notices "could result in swaying consensus by selective notification." Several editors participated in the relatively brief six-hour discussion. The assertion that the project is a "club for conservatives" was countered by editors listing examples of users who "profess no political persuasion." It was also noted that notifying WikiProjects of ongoing discussions is explicitly permitted by the WP:Canvassing guideline.

At one point the discussion segued to feedback about The Right Stuff. Member SPECIFICO wrote: "One thing I enjoy about the Conservatism Project is the handy newsletter that members receive on our talk pages." Atsme praised the newsletter as "first-class entertainment...BIGLY...first-class...nothing even comes close...it's amazing." Some good-natured sarcasm was offered with Objective3000 observing, "Well, they got the color right" and MrX's followup, "Wow. Yellow is the new red."

Admin Oshwah closed the thread with the result "definitely not an issue for ANI" and directed editors to the project Discussion page for any further discussion. Editor's note: the design and color of The Right Stuff were originally chosen to mimic an old paper newspaper.

Add the Project Discussion page to your watchlist for the "latest RFCs" at WikiProject Conservatism Watch (Discuss this story)

ARTICLES REPORT
Margaret Thatcher Makes History Again

By Lionelt

Margaret Thatcher is the first article promoted at the new WikiProject Conservatism A-Class review. Congratulations to Neveselbert. A-Class is a quality rating ranked higher than GA (Good Article), though its criteria are not as rigorous as those for FA (Featured Article). WikiProject Conservatism is one of only two WikiProjects offering A-Class review, the other being WikiProject Military History. Nominate your article here. (Discuss this story)
RECENT RESEARCH
Research About AN/I

By Lionelt

Reprinted in part from the April 26, 2018 issue of The Signpost; written by Zarasophos

Of the more than one hundred editors questioned, only twenty-seven (27%) are happy with the way reports of conflicts between editors are handled on the Administrators' Noticeboard/Incidents (AN/I), according to a recent survey. The survey also found that dissatisfaction has varied reasons, including "defensive cliques" and biased administrators, as well as fear of a "boomerang effect" due to the lack of a rule defining the scope of AN/I reports. The survey also included an analysis of available quantitative data about AN/I. Some notable takeaways:

  • 53% avoided making a report because they feared it would not be handled appropriately
  • "Otherwise 'popular' users often avoid heavy sanctions for issues that would get new editors banned."
  • "Discussions need to be clerked to keep them from raising more problems than they solve."

In the wake of Zarasophos' article, editors discussed the AN/I survey at The Signpost and also at AN/I. Ironically, a portion of the AN/I thread was hatted due to "off-topic sniping." To follow up on the problems identified by the research project, the Wikimedia Foundation Anti-Harassment Tools team and Support and Safety team initiated a discussion. You can express your thoughts and ideas here.

(Discuss this story)




Is Wikipedia Politically Biased? Perhaps


A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.


Report by conservative think-tank presents ample quantitative evidence for "mild to moderate" "left-leaning bias" on Wikipedia

A paper titled "Is Wikipedia Politically Biased?"[1] answers that question with a qualified yes:

[...] this report measures the sentiment and emotion with which political terms are used in [English] Wikipedia articles, finding that Wikipedia entries are more likely to attach negative sentiment to terms associated with a right-leaning political orientation than to left-leaning terms. Moreover, terms that suggest a right-wing political stance are more frequently connected with emotions of anger and disgust than those that suggest a left-wing stance. Conversely, terms associated with left-leaning ideology are more frequently linked with the emotion of joy than are right-leaning terms.
Our findings suggest that Wikipedia is not entirely living up to its neutral point of view policy, which aims to ensure that content is presented in an unbiased and balanced manner.

The author (David Rozado, an associate professor at Otago Polytechnic) has published ample peer-reviewed research on related matters before, some of which was featured e.g. in The Guardian and The New York Times. In contrast, the present report is not peer-reviewed and was not posted in an academic venue, unlike most of the research we usually cover here. Rather, it was published (and possibly commissioned) by the Manhattan Institute, a conservative US think tank, which presumably found its results not too objectionable. (Also, some – broken – URLs in the PDF suggest that Manhattan Institute staff members were involved in the writing of the paper.) Still, the report indicates an effort to adhere to various standards of academic research publications, including some fairly detailed descriptions of the methods and data used. It is worth taking it more seriously than, for example, another recent report that alleged a different form of political bias on Wikipedia, which had likewise been commissioned by an advocacy organization and authored by an academic researcher, but was met with severe criticism by the Wikimedia Foundation (which called it out for "unsubstantiated claims of bias") and volunteer editors (see prior Signpost coverage).

That isn't to say that there can't be some questions about the validity of Rozado's results, and in particular about how to interpret them. But let's first go through the paper's methods and data sources in more detail.

Determining the sentiment and emotion in Wikipedia's coverage

The report's main results regarding Wikipedia are obtained as follows:

"We first gather a set of target terms (N=1,628) with political connotations (e.g., names of recent U.S. presidents, U.S. congressmembers, U.S. Supreme Court justices, or prime ministers of Western countries) from external sources. We then identify all mentions in English-language Wikipedia articles of those terms.

We then extract the paragraphs in which those terms occur to provide the context in which the target terms are used and feed a random sample of those text snippets to an LLM (OpenAI’s gpt-3.5-turbo), which annotates the sentiment/emotion with which the target term is used in the snippet. To our knowledge, this is the first analysis of political bias in Wikipedia content using modern LLMs for annotation of sentiment/emotion."

The sentiment classification rates the mention of a term as negative, neutral, or positive. (For the purpose of forming averages, this is converted into a quantitative scale from -1 to +1.) See the end of this review for some concrete examples from the paper's published dataset.

The emotion classification uses "Ekman’s six basic emotions (anger, disgust, fear, joy, sadness, and surprise) plus neutral."

The annotation method used appears to be an effort to avoid the shortcomings of popular existing sentiment analysis techniques, which often rate only the overall emotional stance of a given text without determining whether it actually applies to a specific entity mentioned in it (or in some cases even fail to handle negations, e.g. by classifying "I am not happy" as a positive emotion). Rozado justifies the "decision to use automated annotation" (which presumably rendered considerable cost savings, also by resorting to OpenAI's older GPT-3.5 model rather than the more powerful but more expensive GPT-4 API released in March 2023), citing "recent evidence showing how top-of-the-rank LLMs outperform crowd workers for text-annotation tasks such as stance detection." This is indeed becoming a more widely used choice for text classification. But Rozado appears to have skipped the usual step of evaluating the accuracy of this automated method (and possibly improving the prompts it used) against a gold-standard sample from (human) expert raters.
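
To make the described pipeline concrete, here is a minimal Python sketch of how such LLM-based annotation and scoring could be implemented. It is an illustration only, not the paper's actual code: the prompt wording, the use of OpenAI's Python client, and the helper names are assumptions.

# Minimal sketch (assumptions: OpenAI Python client >= 1.0, illustrative prompt wording).
from collections import defaultdict
from statistics import mean

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
SCORE = {"negative": -1, "neutral": 0, "positive": 1}

def annotate_sentiment(term: str, snippet: str) -> str:
    """Ask the model how `term` is portrayed in `snippet`; return a one-word label."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0,
        messages=[{
            "role": "user",
            "content": (
                f"In the following text, is the entity '{term}' portrayed with "
                f"negative, neutral, or positive sentiment? Answer with one word.\n\n{snippet}"
            ),
        }],
    )
    label = response.choices[0].message.content.strip().lower().rstrip(".")
    return label if label in SCORE else "neutral"  # fall back on unexpected output

def average_sentiment(mentions: dict[str, list[str]]) -> dict[str, float]:
    """Map labels to the -1..+1 scale described above and average them per term."""
    scores = defaultdict(list)
    for term, snippets in mentions.items():
        for snippet in snippets:
            scores[term].append(SCORE[annotate_sentiment(term, snippet)])
    return {term: mean(values) for term, values in scores.items()}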

Selecting topics to examine for bias

As for the selection of terms whose Wikipedia coverage to annotate with this classifier, Rozado does a lot of due diligence to avoid cherry-picking: "To reduce the degrees of freedom of our analysis, we mostly use external sources of terms [including Wikipedia itself, e.g. its list of members of the 11th US Congress] to conceptualize a political category into left- and right-leaning terms, as well as to choose the set of terms to include in each category." This addresses an important source of researcher bias.

Overall, the study arrives at 12 different groups of such terms:

  • Eight of these refer to people (e.g. US presidents, US senators, UK members of parliament, US journalists).
  • Two are about organizations (US think tanks and media organizations).
  • The other two groups contain "Terms that describe political orientation", i.e. expressions that carry a left-leaning or right-leaning meaning themselves:
    • 18 "political leanings" (where "Rightists" receives the lowest average sentiment and "Left winger" the highest), and
    • 21 "extreme political ideologies" (where "Ultraconservative" scores lowest and "radical-left" has the highest – but still slightly negative – average sentiment)

What is "left-leaning" and "right-leaning"?

As discussed, Rozado's methods for generating these lists of people and organizations seem reasonably transparent and objective. It gets a bit murkier when it comes to splitting them into "left-leaning" and "right-leaning", where the chosen methods remain unclear and/or questionable in some cases. Of course there is a natural choice available for US Congress members, where the confines of the US two-party system mean that the left-right spectrum can easily be mapped to Democrats vs. Republicans (disregarding a small number of independents or libertarians).

In other cases, Rozado was able to use external data about political leanings, e.g. "a list of politically aligned U.S.-based journalists" from Politico. There may be questions about construct validity here (e.g. it classifies Glenn Greenwald or Andrew Sullivan as "journalists with the left"), but at least this data is transparent and determined by a source not invested in the present paper's findings.

But for example the list of UK MPs used contains politicians from 14 different parties (plus independents). Even if one were to confine the left vs. right labels to the two largest groups in the UK House of Commons (Tories vs. Labour and Co-operative Party, which appears to have been the author's choice judging from Figure 5), the presence of a substantial number of parliamentarians from other parties to the left or right of those would make the validity of this binary score more questionable than in the US case. Rozado appears to acknowledge a related potential issue in a side remark when trying to offer an explanation for one of the paper's negative results (no bias) in this case: "The disparity of sentiment associations in Wikipedia articles between U.S. Congressmembers and U.K. MPs based on their political affiliation may be due in part to the higher level of polarization in the U.S. compared to the U.K."

 
Most negative sentiment among Western leaders: Former Australian PM Tony Abbott
Most positive sentiment among Western leaders: Former Australian PM Scott Morrison

This kind of question becomes even more complicated for the "Leaders of Western Countries" list (where Tony Abbott scored the most negative average sentiment, and José Luis Rodríguez Zapatero and Scott Morrison appear to be in a tie for the most positive average sentiment). Most of these countries do not have a two-party system either. Sure, their leaders usually (as in the UK case) hail from one of the two largest parties, one of which is more to the left and the other more to the right. But it certainly seems to matter for the purpose of Rozado's research question whether that major party is more moderate (center-left or center-right, with other parties between it and the far left or far right) or more radical (i.e. extending all the way to the far-left or far-right end of the spectrum of elected politicians).

What's more, the analysis for this last group compares political orientations across multiple countries. This brings us to a problem that Wikipedia's Jimmy Wales had already pointed to back in 2006, in response to a conservative US blogger who had argued that there was "a liberal bias in many hot-button topic entries" on English Wikipedia:

"The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. ... The idea that neutrality can only be achieved if we have some exact demographic matchup to [the] United States of America is preposterous."

We already discussed this issue in our earlier reviews of a notable series of papers by Greenstein and Zhu (see e.g.: "Language analysis finds Wikipedia's political bias moving from left to right", 2012), which had relied on a US-centric method of defining left-leaning and right-leaning (namely, a corpus derived from the US Congressional Record). Those studies form a large part of what Rozado cites as "[a] substantial body of literature [that]—albeit with some exceptions—has highlighted a perceived bias in Wikipedia content in favor of left-leaning perspectives." (The cited exception is a paper[2] that had found "a small to medium size coverage bias against [members of parliament] from the center-left parties in Germany and in France", and identified patterns of "partisan contributions" as a plausible cause.)

Similarly, 8 out of the 10 groups of people and organizations analyzed in Rozado's study are from the US (the two exceptions being the aforementioned lists of UK MPs and leaders of Western countries).

In other words, one potential reason for the disparities found by Rozado might simply be that he is measuring an international encyclopedia with a (largely) national yardstick of fairness. This possibility shouldn't lead us to dismiss his findings too easily. But it is a bit disappointing that it is nowhere addressed in the paper, even though Rozado diligently discusses some other potential limitations of the results. For example, he notes that "some research has suggested that conservatives themselves are more prone to negative emotions and more sensitive to threats than liberals", but points out that the general validity of those research results remains doubtful.

Another limitation is that a simple binary left vs. right classification might be hiding factors that can shed further light on bias findings. Even in the US with its two-party system, political scientists and analysts have long moved to less simplistic measures of political orientations. A widely used one is the NOMINATE method, which assigns members of the US Congress continuous scores based on their detailed voting record, one of which corresponds to the left-right spectrum as traditionally understood. One finding based on that measure that seems relevant in the context of the present study is the (widely discussed but itself controversial) asymmetric polarization thesis, which argues that "Polarization among U.S. legislators is asymmetric, as it has primarily been driven by a substantial rightward shift among congressional Republicans since the 1970s, alongside a much smaller leftward shift among congressional Democrats" (as summarized in the linked Wikipedia article). If, for example, higher polarization were associated with negative sentiments, this could be a potential explanation for Rozado's results. Again, this has to remain speculative, but it seems to be another notable omission in the paper's discussion of limitations.

What does "bias" mean here?

A fundamental problem of this study, which, to be fair, it shares with much fairness and bias research (in particular on Wikipedia's gender gap, where many studies similarly focus on binary comparisons that are likely to appeal to an intuitive sense of fairness), consists of justifying its answers to the following two basic questions:

  1. What would be a perfectly fair baseline, a result that makes us confident to call Wikipedia unbiased?
  2. If there are deviations from that baseline (often labeled disparities, gaps or biases), what are the reasons for that – can we confidently assume they were caused by Wikipedia itself (e.g. demographic imbalances in Wikipedia's editorship), or are they more plausibly attributed to external factors?

Regarding 1 (defining a baseline of unbiasedness), Rozado simply assumes that this should imply statistically indistinguishable levels of average sentiment between left- and right-leaning terms. However, as one leading scholar on quantitative measures of bias has cautioned, "the 'one true fairness definition' is a wild goose chase" – there are often multiple different definitions available that can all be justified on ethical grounds, yet are often contradictory. Above, we already alluded to two potentially diverging notions of political unbiasedness for Wikipedia (using an international instead of a US metric for left vs. right leaning, and taking into account polarization levels for politicians).
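
For illustration, the baseline check that this assumption implies could look like the following minimal Python sketch, operating on per-term average sentiment scores computed as described earlier; the choice of Welch's t-test (and the significance threshold) is an illustrative assumption, not necessarily the report's actual statistical procedure.

# Sketch of comparing average sentiment between term groups (assumption: SciPy is available).
from scipy import stats

def compare_groups(left_scores: list[float], right_scores: list[float], alpha: float = 0.05) -> dict:
    """Welch's t-test on per-term mean sentiment (each value on the -1..+1 scale)."""
    t_stat, p_value = stats.ttest_ind(left_scores, right_scores, equal_var=False)
    indistinguishable = p_value >= alpha  # "unbiased" under this particular definition
    return {"t": float(t_stat), "p": float(p_value), "indistinguishable": indistinguishable}

# Hypothetical usage with made-up per-term averages:
# compare_groups([0.10, 0.05, 0.20], [-0.05, 0.00, -0.10])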

But yet another question, highly relevant for Wikipedians interested in addressing the potential problems reported in this paper, is how much its definition lines up with Wikipedia's own definition of neutrality. Rozado clearly thinks that it does:

Wikipedia’s neutral point of view (NPOV) policy aims for articles in Wikipedia to be written in an impartial and unbiased tone. Our results suggest that Wikipedia’s NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles.

WP:NPOV indeed calls for avoiding subjective language and for not expressing judgments and opinions in Wikipedia's own voice, and Rozado's findings about the presence of non-neutral sentiments and emotions in Wikipedia articles are of some concern in that regard. However, that is not the core definition of NPOV. Rather, it refers to "representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic." What if the coverage of the terms examined by Rozado (politicians, etc.) in those reliable sources, in their aggregate, were also biased in the sense of Rozado's definition? US progressives might be inclined to invoke comedian Stephen Colbert's snarky dictum that "reality has a liberal bias". Of course, conservatives might object that Wikipedia's definition of reliable sources (having "a reputation for fact-checking and accuracy") is itself biased, or applied in a biased way by Wikipedians. For some of these conservatives (at least those who are not also conservative feminists), it may be instructive to compare examinations of Wikipedia's gender gaps, which frequently focus on specific groups of notable people, as in Rozado's study. And like him, they often implicitly assume a baseline of unbiasedness that implies perfect symmetry in Wikipedia's coverage – i.e. the absence of gaps or disparities. Wikipedians often object that this is in tension with the aforementioned requirement to reflect coverage in reliable sources. For example, Wikipedia's list of Fields medalists (the "Nobel prize of Mathematics") is 97% male – not because of Wikipedia editors' biases against women, but because of a severe gender imbalance in the field of mathematics that is only changing slowly, i.e. factors outside Wikipedia's influence.

All this brings us to question 2 above (causality). While Rozado uses carefully couched language in this regard ("suggests" etc., e.g. "These trends constitute suggestive evidence of political bias embedded in Wikipedia articles"), such qualifications are unsurprisingly absent in much of the media coverage of this study (see also this issue's In the media). For example, the conservative magazine The American Spectator titled its article about the paper "Now We've Got Proof that Wikipedia is Biased."

Commendably, the paper is accompanied by a published dataset, consisting of the analyzed Wikipedia text snippets together with the mentioned term and the sentiment or emotion identified by the automated annotation. For illustration, below are the sentiment ratings for mentions of the Yankee Institute for Public Policy (the last term in the dataset, as a non-cherry-picked example), with the term bolded:

Dataset excerpt: Wikipedia paragraphs with sentiment for "Yankee Institute for Public Policy"
positive "Carol Platt Liebau is president of the Yankee Institute for Public Policy.Liebau named new president of Yankee Institute She is also an attorney, political analyst, and conservative commentator. Her book Prude: How the Sex-Obsessed Culture Damages Girls (and America, Too!) was published in 2007."
neutral "Affiliates

Regular members are described as ""full-service think tanks"" operating independently within their respective states.

Alabama: Alabama Policy Institute
Alaska: Alaska Policy Forum
[...]
Connecticut: Yankee Institute for Public Policy
[...]
Wisconsin: MacIver Institute for Public Policy, Badger Institute, Wisconsin Institute for Law and Liberty, Institute for Reforming Government
Wyoming: Wyoming Liberty Group"
positive "The Yankee Institute for Public Policy is a free market, limited government American think tank based in Hartford, Connecticut, that researches Connecticut public policy questions. Organized as a 501(c)(3), the group's stated mission is to ""develop and advocate for free market, limited government public policy solutions in Connecticut."" Yankee was founded in 1984 by Bernard Zimmern, a French entrepreneur who was living in Norwalk, Connecticut, and Professor Gerald Gunderson of Trinity College. The organization is a member of the State Policy Network."
neutral "He is formerly Chairman of the Yankee Institute for Public Policy. On November 3, 2015, he was elected First Selectman in his hometown of Stonington, Connecticut, which he once represented in Congress. He defeated the incumbent, George Crouse. Simmons did not seek reelection in 2019."
negative "In Connecticut the union is closely identified with liberal Democratic politicians such as Governor Dannel Malloy and has clashed frequently with fiscally conservative Republicans such as former Governor John G. Rowland as well as the Yankee Institute for Public Policy, a free-market think tank."
positive "In 2021, after leaving elective office, she was named a Board Director of several organizations. One is the Center for Workforce Inclusion, a national nonprofit in Washington, DC, that works to provide meaningful employment opportunities for older individuals. Another is the William F. Buckley Program at Yale, which aims to promote intellectual diversity, expand political discourse on campus, and expose students to often-unvoiced views at Yale University. She also serves on the Board of the Helicon Foundation, which explores chamber music in its historical context by presenting and producing period performances, including an annual subscription series of four Symposiums in New York featuring both performance and discussion of chamber music. She is also a Board Director of the American Hospital of Paris Foundation, which provides funding support for the operations of the American Hospital of Paris and functions as the link between the Hospital and the United States, funding many collaborative and exchange programs with New York-Presbyterian Hospital. She is also a Fellow of the Yankee Institute for Public Policy, a research and citizen education organization that focuses on free markets and limited government, as well as issues of transparency and good governance."
positive "He was later elected chairman of the New Hampshire Republican State Committee, a position he held from 2007 to 2008. When he was elected he was 34 years old, making him the youngest state party chairman in the history of the United States at the time. His term as chairman included the 2008 New Hampshire primary, the first primary in the 2008 United States presidential election. He later served as the executive director of the Yankee Institute for Public Policy for five years, beginning in 2009. He is the author of a book about the New Hampshire primary, entitled Granite Steps, and the founder of the immigration reform advocacy group Americans By Choice."

Briefly


Other recent publications

Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

How English Wikipedia mediates East Asian historical disputes with Habermasian communicative rationality

From the abstract: [3]

"We compare the portrayals of Balhae, an ancient kingdom with contested contexts between [South Korea and China]. By comparing Chinese, Korean, and English Wikipedia entries on Balhae, we identify differences in narrative construction and framing. Employing Habermas’s typology of human action, we scrutinize related talk pages on English Wikipedia to examine the strategic actions multinational contributors employ to shape historical representation. This exploration reveals the dual role of online platforms in both amplifying and mediating historical disputes. While Wikipedia’s policies promote rational discourse, our findings indicate that contributors often vacillate between strategic and communicative actions. Nonetheless, the resulting article approximates Habermasian ideals of communicative rationality."

From the paper:

"The English Wikipedia presents Balhae as a multi-ethnic kingdom, refraining from emphasizing the dominance of a single tribe. In comparison to the two aforementioned excerpts [from Chinese and Korean Wikipedia], the lead section of the English Wikipedia concentrates more on factual aspects of history, thus excluding descriptions that might entail divergent interpretations. In other words, this account of Balhae has thus far proven acceptable to a majority of Wikipedians from diverse backgrounds. [...] Compared to other language versions, the English Wikipedia forthrightly acknowledges the potential disputes regarding Balhae's origin, ethnic makeup, and territorial boundaries, paving the way for an open and transparent exploration of these contested historical subjects. The separate 'Balhae controversies' entry is dedicated to unpacking the contentious issues. In essence, the English article adopts a more encyclopedic tone, aligning closely with Wikipedia's mission of providing information without imposing a certain perspective."

(See also excerpts)

Facebook/Meta's "No Language Left Behind" translation model used on Wikipedia

From the abstract of this publication by a large group of researchers (most of them affiliated with Meta AI):[4]

"Focusing on improving the translation qualities of a relatively small group of high-resource languages comes at the expense of directing research attention to low-resource languages, exacerbating digital inequities in the long run. To break this pattern, here we introduce No Language Left Behind—a single massively multilingual model that leverages transfer learning across languages. [...] Compared with the previous state-of-the-art models, our model achieves an average of 44% improvement in translation quality as measured by BLEU. By demonstrating how to scale NMT [neural machine translation] to 200 languages and making all contributions in this effort freely available for non-commercial use, our work lays important groundwork for the development of a universal translation system."

"Four months after the launch of NLLB-200 [in 2022], Wikimedia reported that our model was the third most used machine translation engine used by Wikipedia editors (accounting for 3.8% of all published translations) (https://web.archive.org/web/20221107181300/https://nbviewer.org/github/wikimedia-research/machine-translation-service-analysis-2022/blob/main/mt_service_comparison_Sept2022_update.ipynb). Compared with other machine translation services and across all languages, articles translated with NLLB-200 has the lowest percentage of deletion (0.13%) and highest percentage of translation modification kept under 10%."

"Which Nigerian-Pidgin does Generative AI speak?" – only the BBC's, not Wikipedia's

From the abstract:[5]

"Naija is the Nigerian-Pidgin spoken by approx. 120M speakers in Nigeria [...]. Although it has mainly been a spoken language until recently, there are currently two written genres (BBC and Wikipedia) in Naija. Through statistical analyses and Machine Translation experiments, we prove that these two genres do not represent each other (i.e., there are linguistic differences in word order and vocabulary) and Generative AI operates only based on Naija written in the BBC genre. In other words, Naija written in Wikipedia genre is not represented in Generative AI."

The paper's findings are consistent with an analysis by the Wikimedia Foundation's research department that compared the number of Wikipedia articles to the number of speakers for the top 20 most-spoken languages, where Naija stood out as one of the most underrepresented.

"[A] surprising tension between Wikipedia's principle of safeguarding against self-promotion and the scholarly norm of 'due credit'"

From the abstract:[6]

Although Wikipedia offers guidelines for determining when a scientist qualifies for their own article, it currently lacks guidance regarding whether a scientist should be acknowledged in articles related to the innovation processes to which they have contributed. To explore how Wikipedia addresses this issue of scientific "micro-notability", we introduce a digital method called Name Edit Analysis, enabling us to quantitatively and qualitatively trace mentions of scientists within Wikipedia's articles. We study two CRISPR-related Wikipedia articles and find dynamic negotiations of micro-notability as well as a surprising tension between Wikipedia’s principle of safeguarding against self-promotion and the scholarly norm of “due credit.” To reconcile this tension, we propose that Wikipedians and scientists collaborate to establish specific micro-notability guidelines that acknowledge scientific contributions while preventing excessive self-promotion.

See also coverage of a different paper that likewise analyzed Wikipedia's coverage of CRISPR: "Wikipedia as a tool for contemporary history of science: A case study on CRISPR"

"How article category in Wikipedia determines the heterogeneity of its editors"

From the abstract:[7]

" [...] the quality of Wikipedia articles rises with the number of editors per article as well as a greater diversity among them. Here, we address a not yet documented potential threat to those preconditions: self-selection of Wikipedia editors to articles. Specifically, we expected articles with a clear-cut link to a specific country (e.g., about its highest mountain, "national" article category) to attract a larger proportion of editors of that nationality when compared to articles without any specific link to that country (e.g., "gravity", "universal" article category), whereas articles with a link to several countries (e.g., "United Nations", "international" article category) should fall in between. Across several language versions, hundreds of different articles, and hundreds of thousands of editors, we find the expected effect [...]"

"What do they make us see:" The "cultural bias" of GLAMs is worse on Wikidata

From the abstract:[8]

"Large cultural heritage datasets from museum collections tend to be biased and demonstrate omissions that result from a series of decisions at various stages of the collection construction. The purpose of this study is to apply a set of ethical criteria to compare the level of bias of six online databases produced by two major art museums, identifying the most biased and the least biased databases. [...] For most variables the online system database is more balanced and ethical than the API dataset and Wikidata item collection of the two museums."

References

  1. ^ Rozado, David (June 2024). "Is Wikipedia Politically Biased?". Manhattan Institute. Dataset: https://doi.org/10.5281/zenodo.10775984
  2. ^ Kerkhof, Anna; Münster, Johannes (2019-10-02). "Detecting coverage bias in user-generated content". Journal of Media Economics. 32 (3–4): 99–130. doi:10.1080/08997764.2021.1903168. ISSN 0899-7764.
  3. ^ Jee, Jonghyun; Kim, Byungjun; Jun, Bong Gwan (2024). "The role of English Wikipedia in mediating East Asian historical disputes: the case of Balhae". Asian Journal of Communication: 1–20. doi:10.1080/01292986.2024.2342822. ISSN 0129-2986.   (access for Wikipedia Library users)
  4. ^ Costa-jussà, Marta R.; Cross, James; Çelebi, Onur; Elbayad, Maha; Heafield, Kenneth; Heffernan, Kevin; Kalbassi, Elahe; Lam, Janice; Licht, Daniel; Maillard, Jean; Sun, Anna; Wang, Skyler; Wenzek, Guillaume; Youngblood, Al; Akula, Bapi; Barrault, Loic; Gonzalez, Gabriel Mejia; Hansanti, Prangthip; Hoffman, John; Jarrett, Semarley; Sadagopan, Kaushik Ram; Rowe, Dirk; Spruit, Shannon; Tran, Chau; Andrews, Pierre; Ayan, Necip Fazil; Bhosale, Shruti; Edunov, Sergey; Fan, Angela; Gao, Cynthia; Goswami, Vedanuj; Guzmán, Francisco; Koehn, Philipp; Mourachko, Alexandre; Ropers, Christophe; Saleem, Safiyyah; Schwenk, Holger; Wang, Jeff; NLLB Team (June 2024). "Scaling neural machine translation to 200 languages". Nature. 630 (8018): 841–846. Bibcode:2024Natur.630..841N. doi:10.1038/s41586-024-07335-x. ISSN 1476-4687. PMC 11208141. PMID 38839963.
  5. ^ Adelani, David Ifeoluwa; Doğruöz, A. Seza; Shode, Iyanuoluwa; Aremu, Anuoluwapo (2024-04-30). "Which Nigerian-Pidgin does Generative AI speak?: Issues about Representativeness and Bias for Multilingual and Low Resource Languages". arXiv:2404.19442 [cs.CL].
  6. ^ Simons, Arno; Kircheis, Wolfgang; Schmidt, Marion; Potthast, Martin; Stein, Benno (2024-02-28). "Who are the "Heroes of CRISPR"? Public science communication on Wikipedia and the challenge of micro-notability". Public Understanding of Science. doi:10.1177/09636625241229923. ISSN 0963-6625. PMID 38419208. blog post
  7. ^ Oeberst, Aileen; Ridderbecks, Till (2024-01-07). "How article category in Wikipedia determines the heterogeneity of its editors". Scientific Reports. 14 (1): 740. Bibcode:2024NatSR..14..740O. doi:10.1038/s41598-023-50448-y. ISSN 2045-2322. PMC 10772120. PMID 38185716.
  8. ^ Zhitomirsky-Geffet, Maayan; Kizhner, Inna; Minster, Sara (2022-01-01). "What do they make us see: a comparative study of cultural bias in online databases of two large museums". Journal of Documentation. 79 (2): 320–340. doi:10.1108/JD-02-2022-0047. ISSN 0022-0418.   / freely accessible version


Tasks

Here are some tasks awaiting attention: