Welcome to talk with me

Welcome :)--Haiyizhu (talk) 04:24, 24 January 2011 (UTC)Reply

Answers about Collaboration of the Week

Here are my answers to your questions. Thanks for your questions and let me know if you need more information. mheart (talk) 20:40, 24 January 2011 (UTC)Reply

1. How many times (approximately) have you participated in the alternative music collaboration of the week?

Maybe ten articles so far - perhaps five editing sessions per article.
Collaboration is difficult to follow. When I choose to collaborate, the project becomes my main focus on Wikipedia, and I must remember to check the page often for another editor's response. I've been looking for a bot that could notify me when someone has responded to or changed one of my edits. I'd love to find one that would send an IM.
The collaboration of the week is chosen by a member of the project, most often WesleyDodds. The articles I've collaborated on are often edited by him as well.

2. Why did you participate in the alternative music collaboration of the week?

Primarily to expand my own knowledge of alternative music and to be introduced to new music through Wikipedia. To an extent, I trust Wikipedia as a credible source: the community is full of people who make it their goal to look for missing citations, bias in articles, and vandalism... and there are also many people who have specific pieces of knowledge about a topic to contribute, and the sources to back them up. That's appealing.
To me, numbers (like having a number one single in a country for 30 straight weeks) would speak louder than a stranger's opinion, but it's also really cool that people who are passionate about the bands edit the articles to include details that a casual listener may not have known or picked up on. That's another appeal of participating in WikiProjects - you're virtually surrounded by peers who are into the topic, and you all have the common goal of sharing knowledge together.

3. Do you feel you become a better editor through participating in collaboration? If so, could you provide some examples?

I do feel as though I have improved as an editor. A good way to learn is by watching someone else, and people have their own editing styles here.

I don't know if this counts as being specifically part of the alternative music collaboration of the week, but the Soundgarden article was once chosen for the collaboration. At another time, I assisted another user in editing the Soundgarden article to obtain good article status on Wikipedia. Cannibaloki and I pushed each other to get the article up to good standards, and our respective know-how about editing and the band ultimately brought the Soundgarden article into good article territory.

4. How did alternative music collaboration of the week change your behaviors? What causes the changes?

The alternative music collaboration of the week was the first part of a WikiProject that made me think about Wikipedia as a collaborative medium and a place of discussion. Suddenly, Wikipedia became more than just an encyclopedia. It's a real place where people communicate, make friends, and celebrate a shared thirst for knowledge.

5. Are you still participating in the alternative music collaboration of the week? Why or why not?

I haven't participated recently due to moving and final exams.

Talkback

Hello, Haiyizhu. You have new messages at Jsayre64's talk page.
You can remove this notice at any time by removing the {{Talkback}} or {{Tb}} template.

Talkback

Hello, Haiyizhu. You have new messages at Chevymontecarlo's talk page.
You can remove this notice at any time by removing the {{Talkback}} or {{Tb}} template.

Hi, I replied to your questions over on my talk page. Hope they are useful in some way. Chevymontecarlo 19:00, 25 January 2011 (UTC)Reply

Collaboration answers

Hi Haiyizhu,

Thanks for your interest in Wikipedia. Here are my answers to your five questions.

1. I've taken part in Wikiproject Oregon's Collaboration of the Week between 10 and 20 times since 2007.

2. I usually work more or less alone on topics that interest me. An Oregon collaboration from time to time adds variety.

3. Collaborations help me see how others view things and how they go about their work. They know things that I don't know, and I learn from them by observation or discussion. For example, when I took part in a collaboration to improve articles about Oregon hospitals, I imitated what others were doing and asked their advice.

4. The collaborations did not change my basic behaviors.

5. I haven't participated as much lately because I've spent increasing amounts of time reviewing, especially at WP:PR. I have a wide variety of Wikipedia interests, many of which are collaborative.

Hope this helps. Finetooth (talk) 02:40, 30 January 2011 (UTC)Reply

Re: Survey Questions

Q1. How many times (approximately) have you participated in Alternative Music Collaboration of the Week?

Probably only 2 or 3 times if I remember correctly.--Michig (talk) 20:33, 31 January 2011 (UTC)Reply

Q2. How much do you learn from participating in Alternative Music Collaboration of the Week? A. A lot; B. A little bit; C. Not at all (please skip Q3 if you choose C)

B --Michig (talk) 20:33, 31 January 2011 (UTC)Reply

Q3. What did you learn from participating in Alternative Music Collaboration of the Week? Please provide examples if possible.

A few facts about the band(s) in question.--Michig (talk) 20:33, 31 January 2011 (UTC)Reply

Q4. Do you have any negative experience of Alternative Music Collaboration of the Week?

I didn't get involved in many since a lot of the artists selected either were not of much interest to me, seemed somewhat peripheral to 'alternative music', or were bands for which I didn't have any sources to use. The one I contributed to most didn't get a lot of input from other editors. --Michig (talk) 20:33, 31 January 2011 (UTC)Reply

Q5. What do you think are some of the reasons for Wikiproject Alternative Music's cancellation of collaboration of the week?

Lack of interest I would suspect. The collaborations seemed to attract a limited number of editors, and if they no longer had time or lost interest it wasn't going to carry on. --Michig (talk) 20:33, 31 January 2011 (UTC)Reply

Best of luck with your research.--Michig (talk) 20:33, 31 January 2011 (UTC)Reply

Responses to your survey

Q1. How many times (approximately) have you participated in Alternative Music Collaboration of the Week?

About 25 or so, counting times when I just made a few grammatical or stylistic corrections, added a review, or something like that. Only 3 or so where I've really taken an interest and made many edits.

Q2. How much do you learn from participating in Alternative Music Collaboration of the Week? A. A lot; B. A little bit; C. Not at all (please skip Q3 if you choose C)

B.

Q3. What did you learn from participating in Alternative Music Collaboration of the Week? Please provide examples if possible.

I began listening to Britpop artists more and acquainting myself better with music criticism, such as what's found in reviews. I couldn't tell you a list of facts I learned since it's kind of an amalgamated thing.

Q4. Do you have any negative experience of Alternative Music Collaboration of the Week?

It was sometimes frustrating to see how slowly things went, but we can't expect any better if we aren't willing to work hard for it ourselves. I also remember a fair amount of confusion and controversy over which artists could be considered "alternative" or "indie" enough to fit within the project, especially Linkin Park and other nu-metal bands.

Q5. What do you think are some of the reasons for Wikiproject Alternative Music's cancellation of collaboration of the week?

Well, WesleyDodds told me in early '09 that people were beginning to participate less because they each had moved on to other projects of their own. I, for one, began working less with this project and more on video game-related articles, where I could apply my effort better. Since I've gotten a few articles to good and featured status in that realm, I was encouraged to keep at it, which didn't leave much room for this project. I suspect it's the same with other participants, or rather that we helped ease each other out of the project.

Hope I helped. Tezero (talk) 00:04, 5 February 2011 (UTC)Reply

Interview

I'd rather not do any interviews. Sorry. Finetooth (talk) 21:32, 30 October 2011 (UTC)Reply

Re: Interview

Sorry, but lately I've been having some problems with the messenger, and I'm unable to use it. Regards. Tintor2 (talk) 15:29, 8 November 2011 (UTC)Reply

Interview

I am willing to take part in the interview, and can be reached through Windows Live Messenger.
I can be contacted through MSN at merlinsorca@hotmail.com
There is really no specific time I need to do the interview. Most of the time, I'll be logged in whenever I'm on the computer. If you find me online today or during this week, you can contact me whenever you have the time! Merlinsorca 16:18, 8 November 2011 (UTC)Reply

Re: Interview

Hi. I'll have to decline on the interview as I don't think I've ever participated in any of the collaborations of the week/month. Good luck with your research. TH1RT3EN talkcontribs 01:02, 9 November 2011 (UTC)Reply

Interview

Hi, I'm willing and able to participate in a text-based interview via Skype (my Skype name is game-guru999). I'm free all weekend and would prefer doing it then, but I'll also be available on Thursday evening (between 5 and 8 PM, UTC+1) and Friday afternoon (between 3 and 6 PM).

Regards, Game-Guru999 (talk) 16:58, 9 November 2011 (UTC)Reply

 
Hello, Haiyizhu. You have new messages at Game-Guru999's talk page.
Message added 14:05, 10 November 2011 (UTC). You can remove this notice at any time by removing the {{Talkback}} or {{Tb}} template.Reply

Interview - Gaming Collab

I would be happy to try to find time to answer questions about the video game collaboration of the week. To be fair, when I started using it, it had been decreasing in use. It's been changed from weekly to monthly and I haven't been watching it too closely, but I can answer questions about selection, my involvement, or how it had been used historically. I'm likely more available for a text chat than anything else. —Ost (talk) 22:45, 14 November 2011 (UTC)Reply

Interview

Hi Haiyi,

Sorry it took me a while to respond; I haven't been on Wikipedia much recently. I can try to help as much as I can, although I haven't edited much on WP in a year or two. If you want to talk, probably the best way would be to email me your questions, and I can then respond. If you are still interested, shoot me an email @ pdwinfre (at) loyno (dot) edu. --Samwisep86 (talk) 16:46, 23 November 2011 (UTC)Reply

WP:VG Newsletter feature

Hi Haiyizhu, I wanted to let you know that there was some discussion of your research work on Wikipedia published in the Features section of The WikiProject Video Games Newsletter Volume 5, Number 3. If there is any concern that you have been misquoted or that your views have been distorted in any way then please let me know so we can issue a correction in the next newsletter. Thank you. -Thibbs (talk) 13:50, 17 October 2012 (UTC)Reply

ORCID

Given your work, WP:ORCID may be of interest. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:57, 24 November 2014 (UTC)Reply

Can you help verify translations of articles from Chinese

Hello Haiyizhu,

Would you be able to help evaluate the accuracy of translations of Wikipedia articles from Chinese to English Wikipedia?

This would involve evaluating a translated article on the English Wikipedia by comparing it to the original Chinese article, and marking it "Pass" or "Fail" based on whether the translation faithfully represents the original. Here's the reason for this request:

There are a number of articles on English Wikipedia that were created as machine translations from different languages, including Chinese, using the Content Translation tool, sometimes by users with no knowledge of the source language. The config problem that allowed this to happen has since been fixed, but this has left us with a backlog of articles whose accuracy of translation is suspect or unknown, including some articles translated from Chinese. In many cases, other editors have come forward later to copyedit and fix any English grammar or style issues, but that doesn't necessarily mean that the translation is accurate, as factual errors from the original translation may remain. To put it another way: Good English is not the same as good translation.

If you can help out, that would be great. Here's a sample of the articles that need checking:

All you have to do is compare the English article to the Chinese article and mark it "Pass" or "Fail" (templates {{Pass}} and {{Fail}} may be useful). (Naturally, if you feel like fixing an inaccurate translation and then marking it "Pass", that's even better, but it isn't required.)

If you can help, please let me know. Thanks! Mathglot (talk) 22:22, 3 June 2017 (UTC)Reply

Welcome to The Wikipedia Adventure!

Hi Haiyizhu! We're so happy you wanted to play to learn, as a friendly and fun way to get into our community and mission. I think these links might be helpful to you as you get started.

-- 02:07, Thursday, October 19, 2017 (UTC)



AI model governance

Imagine you’ve just spent 27 minutes working on what you earnestly thought would be a helpful edit to your favorite article. You click that bright blue “Publish changes” button for the very first time, and you see your edit go live! Weeee! But 52 seconds later, you refresh the page and discover that your edit has been reverted and wiped off the planet.

However, your edit was not reverted by a human editor working alone; an AI system called ORES has been contributing to this rapid judgement of hundreds of thousands of editors' work on Wikipedia. ORES is a machine learning system that automatically predicts edit and article quality to support editing tools on Wikipedia. For example, when you go to RecentChanges, you can see whether an edit is flagged as damaging and should be reviewed. This flag is based on ORES predictions, and RecentChanges even lets you interact with ORES by changing the sensitivity of the algorithm to "High (flags more edits)" or "Low (flags fewer edits)".
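To make the threshold idea concrete, here is a minimal sketch, assuming an edit is flagged when its predicted damaging probability meets or exceeds a chosen threshold. This is our own illustration, not ORES's actual code; the revision IDs, scores, and threshold values are made up. Under that assumption, a lower threshold flags more edits (the "High" sensitivity setting) and a higher threshold flags fewer (the "Low" setting).

```python
# Illustrative sketch only -- not ORES's actual implementation.
# Hypothetical ORES-style scores: probability that each edit is damaging.
edit_scores = {
    123456: 0.91,   # likely vandalism
    123457: 0.30,   # borderline
    123458: 0.02,   # almost certainly fine
}

def flag_edits(scores, threshold):
    """Return the revision IDs whose damaging probability >= threshold."""
    return [rev_id for rev_id, p in scores.items() if p >= threshold]

# "Low" sensitivity (high threshold, flags fewer edits)
print(flag_edits(edit_scores, threshold=0.86))  # [123456]
# "High" sensitivity (low threshold, flags more edits)
print(flag_edits(edit_scores, threshold=0.26))  # [123456, 123457]
```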

In this discussion post, we want to invite editors to discuss the following four potential ORES models. Among these four models, which one do you think presents the best outcomes, and which would you recommend for the English Wikipedia community to use? Why?

ABOUT US: We are a group of HCI researchers at Carnegie Mellon University, and we are inviting editors to discuss the values of Wikipedia as they relate to ORES. We aim to help build a better Wikipedia community by engaging editors in the design and development of AI tools. More details are available at our research metapage.


Model Card One

Performance

Model card for the 86% threshold (by the researchers)
Group       | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate
Overall     | 97.7%    | 0.1%                | 56.8%               | 1.8%
Newcomer    | 92.7%    | 0.9%                | 55.0%               | 6.2%
Experienced | 99.6%    | 0.0%                | 83.6%               | 0.1%
Anonymous   | 92.3%    | 0.3%                | 53.8%               | 6.6%
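For readers unfamiliar with the column names, they follow the standard confusion-matrix definitions: the false positive rate is the share of genuinely good edits that the model wrongly flags as damaging, the false negative rate is the share of genuinely damaging edits that slip through as good, and we read the "Damaging Rate" column as the share of a group's edits that the model flags as damaging. The sketch below illustrates these formulas with made-up labels; it is not the data behind these tables.

```python
# Illustrative sketch of the model-card metrics, using made-up data.

def model_card_metrics(true_labels, predicted_labels):
    """Compute accuracy, false positive rate, false negative rate,
    and flag ("damaging") rate from binary labels, where True means
    "damaging"."""
    pairs = list(zip(true_labels, predicted_labels))
    tp = sum(1 for t, p in pairs if t and p)          # damaging, flagged
    tn = sum(1 for t, p in pairs if not t and not p)  # good, not flagged
    fp = sum(1 for t, p in pairs if not t and p)      # good, wrongly flagged
    fn = sum(1 for t, p in pairs if t and not p)      # damaging, missed
    total = len(pairs)
    return {
        "accuracy": (tp + tn) / total,
        "false_positive_rate": fp / (fp + tn),  # good edits wrongly flagged
        "false_negative_rate": fn / (fn + tp),  # damaging edits missed
        "damaging_rate": (tp + fp) / total,     # share of edits flagged
    }

# Ten hypothetical edits: True = actually damaging, False = good.
actual    = [True, False, False, False, True, False, False, False, False, False]
predicted = [True, False, True,  False, False, False, False, False, False, False]
print(model_card_metrics(actual, predicted))
# {'accuracy': 0.8, 'false_positive_rate': 0.125,
#  'false_negative_rate': 0.5, 'damaging_rate': 0.2}
```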

Explanation

Advantages:

1. For anonymous and newcomer editors, only a small proportion of their edits is considered damaging.
2. The algorithm at this threshold treats experienced and non-experienced editors as similarly as possible.

Disadvantages:

1. For all three groups of editors, a large proportion of their damaging edits will be considered good by the algorithm.

Model Card Two

Performance

Model card for the 61% threshold (by the researchers)
Group       | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate
Overall     | 98.0%    | 1.6%                | 11.5%               | 5.0%
Newcomer    | 95.9%    | 3.8%                | 7.0%                | 14.3%
Experienced | 99.8%    | 0.0%                | 41.8%               | 0.3%
Anonymous   | 91.6%    | 8.3%                | 9.2%                | 19.7%

Explanation

Advantages:

1. For all three groups of editors, only a small proportion of their good edits will be identified as damaging by the algorithm.

Disadvantages:

1. For all three groups of editors, a relatively large proportion of their damaging edits will be considered good by the algorithm.


Model Card Three

Performance

Model card for the 42% threshold (by the researchers)
Group       | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate
Overall     | 95.5%    | 4.6%                | 1.6%                | 8.2%
Newcomer    | 90.8%    | 10.4%               | 0.5%                | 21.0%
Experienced | 99.9%    | 0.0%                | 11.9%               | 0.4%
Anonymous   | 80.0%    | 23.1%               | 0.6%                | 33.6%

Explanation

Advantages:

1. The algorithm can correctly classify most edits by experienced editors.

Disadvantages:

1. The algorithm at this threshold treats experienced and non-experienced editors quite differently.


Model Card Four

Performance

Model card for the 26% threshold (by the researchers)
Group       | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate
Overall     | 90.5%    | 9.9%                | 0.8%                | 13.3%
Newcomer    | 78.7%    | 24.2%               | 0.0%                | 33.2%
Experienced | 99.9%    | 0.1%                | 7.5%                | 0.5%
Anonymous   | 58.0%    | 48.6%               | 0.2%                | 55.7%

Explanation

Advantages:

1. Only a very small proportion of damaging edits will be considered good by the algorithm.

Disadvantages:

1. The algorithm at this threshold treats experienced and non-experienced editors very differently.