Talk:Impact factor/Archive 1

Archive 1

The ISI impact factor list is currently not publicly available; it requires a subscription. (I don't know whether it was publicly available in the past.) -- Zeyar Aung

It was never free. It is copyrighted by ISI - Thomson Scientific. If you are interested in "alternative" sources, see this. Karol 19:52, 15 December 2005 (UTC)
I propose that links to both the Journal Impact Factor mentioned by Karol and the Hirsch number be added to this article in a section called "alternative impact metrics." Right now the article is something of an advertisement for ISI. I'm not going to just go ahead and make the change because I appreciate that impact factor is already a well-crafted article that someone has taken pains over. Alison Chaiken 01:23, 28 December 2005 (UTC)

This is by no means an advertisement for ISI: it is a straightforward description of one part of an extremely important bibliographic tool, available at all major universities (for a very high price to the university library). It's a world standard (for better or worse). Of course there are alternatives, and they should be included.

The page as of September 9, 2006 has indirect links to sources of illegal content, and these links are being removed as copyright violations in accordance with Wiki policy. DGG 03:38, 10 September 2006 (UTC)

Title caps

Should this be Impact factor ? - FrancisTyers 14:34, 15 December 2005 (UTC)

I think it should. Note that Impact factor redirects to here. On the other hand, it is notoriously used with both words capitalized, as a proper name. I think ISI - Thomson Scientific is responsible for this :) Karol 19:45, 15 December 2005 (UTC)
Of course it should. No question. --Stemonitis 09:30, 19 December 2005 (UTC)

Can't say I agree. It has a name, and its name should be used as it is by the creator. With both capitals. But as I seem to be in a minority I won't change it. DGG 03:38, 10 September 2006 (UTC)

Result

Page moved as requested. WhiteNight T | @ | C 17:09, 29 December 2005 (UTC)


Skewness

I would have thought that the number of citations followed a Poisson distribution. Jeremymiles 16:23, 21 February 2006 (UTC)

I think they might follow the Zipf bibliometric distribution -- 80% of the refs are generally to 20% of the articles
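To make the skewness point concrete, here is a minimal sketch (not from the original discussion; the Pareto shape parameter and article count are assumed purely for illustration) of how a heavy-tailed, 80/20-style citation pattern pushes the mean, which is what an impact-factor-style average reports, well above the median article:

```python
import random
import statistics

# Toy model of a skewed citation distribution: a few articles collect most
# of the citations, as in the "80/20" pattern mentioned above.
# Parameters are illustrative only, not fitted to real JCR data.
random.seed(0)

n_articles = 1000
# Heavy-tailed draw: most articles get few citations, a handful get very many.
citations = [int(random.paretovariate(1.5)) - 1 for _ in range(n_articles)]

mean_cites = statistics.mean(citations)      # what an impact-factor-like average reports
median_cites = statistics.median(citations)  # the "typical" article

top_20_percent = sorted(citations, reverse=True)[: n_articles // 5]
share_top_20 = sum(top_20_percent) / max(sum(citations), 1)

print(f"mean citations per article:   {mean_cites:.2f}")
print(f"median citations per article: {median_cites:.2f}")
print(f"share of all citations received by the top 20% of articles: {share_top_20:.0%}")
```

Under distributions like this, the journal-level average says little about how often any particular article will be cited.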

objectivity and coverage

Nevertheless, I do agree that the article reads like an advertisement. In particular, I would appreciate it if somebody with the time and expertise could refresh my memory and complete the information on ISI's “wide international coverage”: how journals are selected for that coverage, and what the consequences of the selection are (nationally oriented research tends to be disregarded). The assertion “An objective measure of the importance of different publications” should, in my opinion, be weighed against the absence of any explanation of where the citations are counted in the phrase “The impact factor […] to be the average number of times published papers are cited up […]”. The word “objective” should rather be “relative to the journal coverage of the database”. I would also like to add this reference to the external links ("du bon usage du facteur d'impact" -- on the proper use of the impact factor): http://www.udfapse.lib.ulg.ac.be/aide_publication/facteur%20d'impact.htm

Will add. DGG 00:50, 15 September 2006 (UTC)

Re: Impact factors of selected scientific journals in 2001

I understood that IF listings were copyrighted, and that Thomson has patrolled this vigorously since they took them over; should we link to this? Espresso Addict 22:35, 28 August 2006 (UTC)

The link to this data must be removed. It links to an essentially complete copy of the 2001 data in JCR. I would guess that this year was chosen because it is the year that ISI makes available at no charge to library schools, in order to familiarize their students with the system. Conceivably the people posting it were under the mistaken impression that it was being made generally available.

I shall edit to remove the link, to anticipate my friends at Thomson/ISI. There is another link to copyrighted ISI data on this Wiki page: the external link to "List of impact factors from recent years", which is a link to a site that says "The lists of Impact Factors have been removed from this site due to copy right violations. Below are the lists available elsewhere." (and then proceeds to give links to other illegal copies).

I shall remove it as well (and check Wiki's related pages to see if there are any more).

--I am following the instructions on http://en.wikipedia.org/wiki/Wikipedia:Copyrights

The Wiki "Get Cited" page links to an independent database apparently compiled by asking people to join and enter the reference lists from their papers. It appears to be legitimate, and I will edit its page to describe it in more detail.

DGG 03:24, 10 September 2006 (UTC)

edited first half for clarity, specificity and NPOV

To me, the article pre-edit exhibited a POV deprecating the use of impact factors. This is one of the possibly reasonable positions, but it is not the majority position. As shown by the wide availability and universal use of these IFs, the majority position is that they are valid, although other measures should also be considered. There are, naturally, more recent references proposing other measures than defending the existing ones, but what else could be expected? One cannot get a paper published saying that h values are unreasonable and that IFs should be used instead. I could fish out comments from the various lists, but so could the reader. I'm not including the comments from various lists suggesting that those seeking other metrics are those whose journals don't show up well. I could correlate if I wanted to, but this isn't a product review.

The second half has most of the alternatives, and I promise not to delete any of them.

I've used IF for convenience, but I'll change them to match ISI practice. I ask again about capitalization for Impact Factor, since capitals are the consistent ISI usage. I'll follow the apparent policy here.

As in the previous article, I didn't link to ISI every time it is mentioned, and I didn't link to subject fields used as examples. I could bold them, but whom does linking or bolding help? I have not yet checked that all ISI links are reachable without subscription--I will, but I need to use a different computer.

If someone disagrees with me on the NPOV, I wish they'd comment here before reverting or changing drastically. DGG 01:14, 14 September 2006 (UTC)

My comments are mostly going to be about some things that I disagree with, but don't let that discourage you. I'm taking the time to write this because of your promising start. I didn't fix any of the things I mention because you may want to do that yourself.
  • Use bold when you use the article title (and any redirects pointing here) for the first time (preferably in the first sentence or paragraph). Not everyone agrees on when to set links, but most terms should only be linked once per article (or per section, in some cases). Linking ISI once, as you said, is perfect; I'd link subject fields used as examples, too. Having a look at some featured articles and at the MoS may help.
  • Try never to make things worse than they were before. Your edits removed the interwiki links (to the article in other languages) and added a category a second time. There's a half-finished sentence now ("Most of these factors are thoughly didiscussed on the"), new spelling mistakes (Citaation, posiible), and a sentence ending in two periods.
  • The intro was better before because the first sentence explained what the article is about – now it says who invented and owns it.
  • This sentence needs explanation for readers not versed in the subject at hand: "There can be no Impact Factor for an article, because the denominator is always 1"
  • "The best explanation of the details and the proper use are at the [Thomson website]" is bad style. What we can or need to say belongs into the article body. If we want to suggest further reading, then that goes into "External links". On the other hand, if you want to point to an external source you used as a reference, use <ref> (there are examples in the article).
  • Capitalization would probably best follow what 3rd party publications on the subject usually do — the creators/owners of such terms often have some very odd ideas of how to make their special term even more special.
  • As for POV on IFs: IFs do measure something related to some quality of journals. The use of IFs in the evaluation of producers (institutes, teams, researchers), however, has been an unmitigated disaster – the serials crisis is the least of the problems it helped create. Even Garfield himself calls that an abuse. And there's little controversy about this because it's just too obvious.
  • It is certainly true that journals with a low impact factor have an incentive to question the validity of IFs. But the criticism quoted in the article was made by Nature, which, as you know, has a huge IF.


Rl 08:04, 14 September 2006 (UTC)

reply to comments

In general I do concede that a writer must follow the house style. Therefore:

  • I apologize for the typos and thank you for the corrections.
  • The interwiki links were an accidental deletion. I will restore them from previous versions.
  • I've rechecked, and 1/4 of the authoritative sources use both caps, 1/4 the first only, and 1/2 both lowercase. In titles, all use both caps. (And, to my surprise, ISI uses l.c. in its help pages.) I will adjust.
  • I still disagree with links to subject fields on grounds of readability, and the examples show exactly what I dislike. I think it's valid only if the target article has a relevant part, and then it should be a link to that specific section or anchor.
    • But since other people will probably add them if I don't it might as well be me. At least in WP it's easy.
  • For the opening sentence, I will revise to include the definition if possible. (you're right, at least in terms of WP style--but it is very hard to define IF accurately in one understandable sentence.)
  • As for the way I did the external links, I modeled on what I saw on other articles. I'll look at the help again, for there seem to be a number of different options there.
  • I'll say more about the equation, and if you think my explanation inadequate, tell me. This is the sort of thing one cannot well judge by oneself. It's for the reader to judge whether one is clear; saying "it would have been clear if you had read more carefully" is never right.

For more substantive matters:

  • If I did not make it clear that there is significant criticism of its use to evaluate institutions & individuals, I'll make it clearer. There is, however, a distinction between relatively appropriate ways of using it and totally ludicrous ones, which I will expand on.
  • As for the effect on the journal crisis, I refer to the fallacy of libraries always buying the highest-IF journals. I don't think it is a major effect, for they generally judge otherwise. I see no other effect, so if you do, please tell me so I can think about it.

For more general WP policy

  • As for the use of the Thomson website, I will try to expand what I do say instead of leaving it all to them. But the Thomson help pages are the best explanation possible, and much too extensive to include here. This is not a public domain source, and too much would be necessary to be fair use. I'm sure they'd permit me to insert it here, but it wouldn't be GFDL. I do intend to discuss the general issue at some more appropriate place, because if the WP policy is what I think it is, I think it should be reconsidered. It might have been appropriate as a way to get started, and I can see it might be unavoidable for certain descriptive articles. If it's a matter of stable links, there are other ways to achieve that.

I do not expect to make all these improvements tonight. :) DGG 00:50, 15 September 2006 (UTC)

I have now made most of the corrections, expanded some of the later paragraphs, and added the links to other wikis. DGG 02:42, 15 September 2006 (UTC)

  • New typos, literal quote mangled, one external link to pubmedcentral that used to work broken, repeated sentences in the intro
I fixed all that, but I suggest using the "Show preview" and "Show changes" buttons liberally before saving a page, or checking the differences between revisions in the history tab
  • "impact factor" casing not fully consistent yet
  • I advise against adding more articles to external links; we should quote relevant facts in the text and then add the articles as references
  • It is of course not possible to "define IF accurately in one understandable sentence". But that's not the point. The first sentence just offers a rough description aimed at people who may a) have no clue what the term means or b) are wondering if they arrived at the right place. It is an attempt at capturing the essence, accurate definitions come later.
  • The article as is contains a lot of valuable information but in a barely organized way. What it needs most is a clear structure (e.g. Overview, History, Use/Impact/Criticism, Alternatives).
  • regarding the "more substantative matters" – criticism and serials crisis: I don't want to add more to the article as it is now. It would just make the necessary refactoring of the content harder.
  • Fair use is not an issue. We can publish the relevant facts in our own words.
Rl 07:55, 15 September 2006 (UTC)

what is important, and what can be changed

Dear Rl, I'm glad for the proofreading--you may not believe it, but I do transfer the content into MS Word for a spell check first (I use MS Word because I've built up a good dictionary there of the terms I use). Besides yourself, there seem to be many proofreaders looking around, a practice I approve of; I can even do it myself when I haven't written the text. Some of what you advise are style changes I can make, but for others, you may be misunderstanding what I am doing.

1. I will indeed describe it in one sentence, unreadable though it may turn out to be; you can then give it a try. I notice a great many leads have extremely long and convoluted sentences. I know how to do that too.
2. Neither I nor anyone else can rephrase the help pages of JCR to be as good as the originals. As I understand it, that is one of the purposes of outside links: detailed explanations that others have done better. If it is considered better style here, as you suggest, I will put in summary sentences of my own and then use the outside material as references, as far as I can find subscription-free material. (There is a good deal of other non-subscription writing by Garfield on the web.) I know when someone is clearer than I am, Garfield for example. BTW, by my standards, a detailed paraphrase is plagiarism. Plagiarism is borrowing expressions and arrangement--you cannot make a writing of your own from someone else's work (it is much stricter than fair use in copyright). I do see signs that some do it here, and I'm keeping examples for when I'm experienced enough to be taken seriously here. I would be taken very seriously already if I put them on liblicense, but not even I would go outside first. I don't think anyone has reported on this outside WP yet, but I'd rather influence than criticise.
2A. When I started, the external links were all to subscription sources--I am still changing this.

3. Yes, it's disorganized. It's disorganized because I have not yet done the later sections--I am very reluctant to alter all of this at once, because it has clearly been inserted by people whom I consider distinct small minorities. (BTW, for serious discussion of the alternate measures, see the SIGMETRICS list and JASIST.) If I have to prove they're minorities I will, but all I'm doing is reducing their share from 75% to 25%, which is probably twice what their arguments deserve. If you are suggesting that it may be better to go point by point than to divide, you may be right. I did see the overlap as I was writing.
3. There is no history. JCR was created, and has remained the same except for the better readability as an e-resource.
4. The substantive issues concern my removing material, not adding it. The entire skewness section is wrong. The influence on libraries is wrong. The h calculations I think are wrong, but that's the one part where there is some serious work to be accounted for. Someone mentioned on the talk page the Anglo-American centrism, and this does need expansion.
5. I am aware that I can write or revise a complex page in a subpage of my space. But I always like to go by stages, and I want feedback as I go. I will try that, and I may even try writing in MS Word. I have my own good macros for the HTML, and I could probably write some for the wiki style. But the next version of MS Word has a more complicated scripting language which I have yet to learn.
5a. Equally important, for some of this, if I make too many changes at once, I suspect that people whose ideas I discarded would revert. I am willing to be corrected, but not for my work to be futile.
6. If things I truly care about, like point 2, are not the custom here, I don't belong here. If articles like this one I found are considered NPOV, ditto.

7. BTW, the real problem I have here is the necessary changes in interconnected articles. I may try to figure out a way to do this offline. But if it isn't fast enough, it throws out other people's edits. DGG 06:50, 16 September 2006 (UTC)

Most of your arguments and plans sound reasonable – it's good to have you here. Some of your assertions, though, are likely to be opposed by some. I advise you not to assume that the current article reflects the extent of our knowledge – it represents what has been written given several limitations like time, other responsibilities, and knowledge. (For instance, doing their own proofreading and fixing the mistakes for others makes some people slower.)
The fact that there are many lousy leads doesn't mean that complicated intros are a good thing.
External links are for content that either doesn't belong in an encyclopedia or that we haven't gotten around to covering ourselves yet. But the article must be complete without the content of external sources.
A "detailed paraphrase" may be a copyvio. Plagiarism, however, is passing off someone else's work as your own, something you don't do if you give your source. If you present the facts of a text in your own words and name the source of your facts, then it's neither a copyvio nor plagiarism. I am aware that there are those who try to extend copyright way beyond that – but so far, we don't need a license just to be a secondary source on anything.
Please specify exactly what statements you disagree with.
The disorganized state of the article is of course not your fault, it's just something that needs to be fixed.
What I meant with history: the vision Garfield had when creating the impact factor, initial public reaction, increase in the number of journals covered, etc. But I don't insist on having a history section, we just need something better than what we have now.
The entire skewness section is most certainly not wrong. I remember reading the quote in Nature (I can't verify the quote, but it certainly catches the spirit of what I read), so that's a seemingly relevant source accurately reported. The following paragraph does have a quirk or two, but looks mostly correct as well. Can you say specifically what statements you object to, or suggest a better wording?
I recommend either re-shuffling the content (without losing any) or fixing the statements you disagree with. Mixing both kinds of edits together can be quite confusing. You may want to take a look at {{inuse}} if you plan to do a major surgery, it may come in handy.
Rl 21:29, 17 September 2006 (UTC)
Thanks for info on the template; I was hoping there might be such a device.
Eugene Garfield did so many related things in bibliometrics that his article needs expansion--maybe we can best do some of the history there, but I do know what you mean.
The skewness is not wrong, precisely, but is rather just a manifestation of a more general phenomenon, Bradford's law. There are so many ways of presenting this and so many difficulties that I haven't even begun to work on it. Enough talk--I'll try to work on the article instead.


Hi, I am a Harvard researcher who manually tabulates new ISI rankings by journals of different scientific disciplines. My tabulation of the ISI impact ranking list is unique--it breaks the list down by scientific discipline: general science, medicine, diabetes, CVD, cancer, obesity/nutrition, epidemiology and public health, etc.

This information took a lot of hard work and time to tabulate, and it is thus a unique contribution for interested readers and the scientific community, which is why I included it on a weblog. Also, because I tabulated these numbers from each scientific journal myself (from their general "about" pages), it does not violate ISI copyright.

After a previous discussion with another Wiki user (Rl), he agrees that given the nature of the unique tabulation of data (not available anywhere else), it is fair to provide one link on the impact factor list. Let me know if others have feedback; if not, perhaps Rl can add back the links, so as to maintain neutrality. —The preceding unsigned comment was added by Epiding (talkcontribs) 04:24, 19 September 2006.

Note: I removed the quotation from the talk page that broke formatting in several interesting ways. The comment referred to above is this one. Rl 08:06, 19 September 2006 (UTC)
I'll admit that said comment was not as clear as it could have been – my main point was that if the blog link belongs anywhere, it belongs in this article, and nowhere else. External links should be added sparingly (references are a different matter), links to blogs are often problematic, and ad-infested external sites are where the line to "nothing is better" is usually crossed. – That said, the limits of copyright protection for ISI's impact factors are untested (to me, anyhow; anyone know precedents?), and ISI would have a vested interest in exaggerating the scope, so I don't object to the link (which has only a few impact numbers out of thousands) on copyright grounds. Rl 08:06, 19 September 2006 (UTC)
Look at how much is published in the literature--very little is--we want to set a good example, not fight with ISI. Let me see if I can reach an agreement with ISI. I think I know the right people. DGG 08:34, 19 September 2006 (UTC)
Excellent. Maybe you can get them to clarify a few things:
  • Is it okay for us to note current impact factors in the articles on specific journals (see Nature, 3rd paragraph, for an example) ? How about previous impact factors?
  • Is it okay for us to list the most highly rated journals sorted by area, as Epiding did? With or without their impact factors?
Please note that answers need to be verifiable in order to be meaningful (for instance, if you get an email, you may add – with their permission, of course – their reply to this talk page, along with all the email headers). Rl 09:05, 19 September 2006 (UTC)
I am planning to ask first for something which will serve everyone in and out of WP, which is for ISI to make one sample year available OA. That is a very basic question, and it will take personal contact. For the specific instances of current or historical data, if asked directly, any legal department will usually ask anyone who wishes to post to obtain permission each time, which would not be GFDL, and not imho a useful technique for our purposes, for Epiding would need to ask himself about his own blog. I know WP moves quickly, but ISI does not. And see fair use, a very good article. DGG 03:48, 20 September 2006 (UTC)
OA is not a license and not necessarily compatible with the GFDL. Rl 06:34, 20 September 2006 (UTC)
I understand the difference perfectly well, Rl, and said what I meant to say. OA from ISI will be legal, and we can link to it. I know it is not sufficient to insert.
WP needs GFDL but someone's blog doesn't--it just has to be an authorized use. DGG 04:26, 21 September 2006 (UTC)

Illegal site?

The link below has been removed from the article because it is an "illegal site" – according to the edit summary:

I'm not sure how helpful a French article on impact factors is to readers, so I'm not advocating its inclusion. However, I challenge anyone to make a coherent argument that the linked site or article are illegal. Rl 08:18, 4 November 2006 (UTC)

Sure. The site had impact factors on its site. It removed them, and says:

"The lists of Impact Factors have been removed from this site due to copy right violations. Below are the lists available elsewhere." So the site admits that posting list of impact factors reproducing identical or to a considerable extent such factors is a copyright violation. To then link to some other places that have them is ingenious, and equally wrong. "I will not sell you heroin, but I know several people who will be glad to: here are their names and phone numbers."

Fortunately, when dealing with electronic resources, anyone can produce a citation index and derive impact factors from scratch. I encourage those who support this approach--which I support as well--to become active in one of the Open Access Citation Index workgroups--in fact I am, personally.

But I have respect for your position, and I will not immediately remove it, pending further discussion--and because there would be no way to avoid a revert war if we continued that way. DGG 15:31, 9 November 2006 (UTC)

I must be missing something here. I was referring to a single web page titled Du bon usage du facteur d'impact, linked above. It contains exactly one link – to ISI. It doesn't contain any mention of copyright, either. You must be talking about a different page, then (link, please?). – And the comparison of what would constitute a relatively minor copyvio with selling heroin doesn't seem appropriate to me. Rl 18:39, 9 November 2006 (UTC)

apology

There does indeed seem to have been a mix-up, and I seem to have deleted the wrong page. I have restored it. The problem page is http://www.sciencegateway.org/impact/, which is a page providing a number of links to copyright-violating lists of impact factors. I left it in, and labelled it for what it is. Yes, heroin is an exaggeration. But it is, quite literally, grand theft. These lists cost over $10,000 per year. DGG 05:01, 10 November 2006 (UTC) There is nothing wrong with the rather good French page.

Oh, I see. I removed the sciencegateway page. It looks like a site thrown together to make advertising money with a high pagerank, and there's no need for us to support that. – However, what the site does is not "grand theft" at all: a) copyright infringement is not theft by law, and b) very few people who'd pay ISI $10,000 for such a list would rely on a random list on the internet for whatever important use case they have, so the damage to Thomson should be pretty much zero anyhow. Rl 07:01, 10 November 2006 (UTC)

proportion

I was not aware that it attempted to include such fields in the first place--keep it in proportion--it is hardly the main objection. DGG 04:20, 2 December 2006 (UTC)

Next time, you may want to specify what you are talking about ("it"? "such fields"?). — I don't like where the article is going: not only do the recent edits add further unsourced stuff, the addition even fails to mention which "leading journals" are allegedly omitted. This is worse than useless. Rl 08:21, 2 December 2006 (UTC)
Glad to discuss.
  • Such fields = the fields the previous edit to the article mentioned, in the edit I partially reverted: the applied business fields such as marketing. I was trying to be tactful to the editor concerned ;)
  • Where is it going? There has been very little change lately, though it might be time to add some more. Where do you want it to go?
  • Stuff like possible objections, such as saying something should include more, does not really need a source, though there are plenty on the web--you are welcome to insert them, or query the ones you have in mind with {{fact}}. DGG 04:14, 3 December 2006 (UTC)
  • Your comment was reasonably clear to those watching the article and seeing the concurrent edits, but those who just happen to read the talk page sometime later... :-). — The concern I have about the page is that it reads like original research, and with people adding more unsourced claims it's getting worse (and IMO there are way too many for using {{fact}}). We cannot say that leading journals are being omitted without at least giving some examples so readers have a chance at forming an opinion. Rl 07:50, 3 December 2006 (UTC)
I think I can add a link that will take account of some of the criticism in summary, but it is neither easy nor useful to ref a list of pros and cons at each point. As for the new paragraph, I did not want to remove completely an addition made by an unfamiliar editor. DGG 20:29, 3 December 2006 (UTC)

field of biology

The paragraph just added, which I removed, has no sources and is at best overgeneral. There are many fields of biology--I suppose the meaning is experimental biomedicine, because that's the only part of biology that has very high citation rates. I frankly don't see the relevance--we've already said that it isn't valid across fields. Second, I don't think it's even true--the high citation rate comes more from the publication of small segments of research--I can't imagine why there would be more back and forth than in any other field--all scholarship works that way to a considerable extent. But if it's a sourced criticism, then it should go in. Is it? DGG 06:34, 19 June 2007 (UTC)

Politics of Knowledge

This article was superficial on the whole, ignoring the larger political context that shapes what is considered important by journals, editors and the like. Just consider Thomas Kuhn's work on paradigms and power and you can see how limited bean counting exercises like this can become. Just imagine the situation of poor Albert Einstein when he had a low impact factor. According to the primary author of this text, the implication is that maybe he should have stayed home and given up on physics. One might also consider the work of Noam Chomsky on filtering systems in the media and academic life, the various ways in which radical intellectuals are denied tenure, among other factors, if that is not asking too much. —Preceding unsigned comment added by 130.237.148.186 (talkcontribs)

All this may be true, but irrelevant. The subjects you want to discuss are 1/ bibliometrics in general--impact factor is just one of its constructs--and 2/ the academic tenure system (or the research evaluation system); again, impact factors are but one of the measuring devices used there. This article is about impact factor, not the social organisation of science and higher education. DGG (talk) 09:07, 31 August 2007 (UTC)

average impact

It is certainly true that the IF measures the average impact, and that by Bradford's law a small minority of the articles accounts for most of the citations, as confirmed empirically hundreds of times. Thus, of course, the high-impact articles have a greater impact than the average, and the IF of a journal is not the same as the impact of any particular article in it. This does not, however, address the predictive power, which is a statistical question: on average, an article in a high-IF journal does get cited more. The paragraph added needs rewording to be less dogmatic. DGG (talk) 00:30, 22 September 2007 (UTC)
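A hypothetical worked example (the numbers are invented for illustration, not taken from any journal) of how a single highly cited article pulls the average well above what a typical article receives:

```latex
\[
\text{100 citable items: one cited 201 times, the other 99 cited once each}
\]
\[
\text{IF} = \frac{201 + 99 \times 1}{100} = \frac{300}{100} = 3,
\qquad \text{median citations per item} = 1 .
\]
```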

Deleted reference added in right place

I have seen that the reference to [1] has been deleted.

I think it's because it was not in the right place. I have added it back where the article talks about the faults of the impact factor.

I'd like to comment that the name of the paper doesn't reflect its content. It talks about measuring the impact factor for journals in general, but shows the example of "Logic Programming".

--El Pantera 20:09, 10 October 2007 (UTC)

Eigenfactor.org

I suppose a link to eigenfactor.org should be added to "other methods" as well. —Preceding unsigned comment added by 193.60.222.2 (talk) 15:16, 2 November 2007 (UTC)

SJR - SCImago Journal Ranking

I think that, as in the case of eigenfactor.org, a link to scimagojr.com should be added.

Extracted from scimagojr.com:

The SCImago Journal & Country Rank is a portal that includes the journals and country scientific indicators developed from the information contained in the Scopus® database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains.

This platform takes its name from the SCImago Journal Rank (SJR) indicator, developed by SCImago from the widely known algorithm Google PageRank™. This indicator shows the visibility of the journals contained in the Scopus® database from 1996. —Preceding unsigned comment added by 83.52.70.242 (talk) 19:11, 22 January 2009 (UTC)

Published article to consider for citation

I found this while looking at the all-time most-accessed articles on BioMed Central. This article ranks as the 19th most-accessed article, with >41,000 access hits, as of 2007-12-21. --User:Ceyockey (talk to me) 00:22, 22 December 2007 (UTC)

Dong, Peng; Loh, M.; Mondry, A. (2005-12-05). "The "impact factor" revisited". Biomedical Digital Libraries. 2 (7). BioMed Central. doi:10.1186/1742-5581-2-7. PMID 16324222. Retrieved 2007-12-21. This narrative review explains how the IF is calculated, how bias is introduced into the calculation, which questions the IF can or cannot answer, and how different professional groups can benefit from IF use.

Incommensurable

I do not agree with the notion that IFs of journals of different research fields may not be compared in general. This can be true for areas with different research outputs; e.g., efforts in mechanical engineering go into patents rather than papers, as said in the article, resulting in low IFs. However, fields which do share a publication policy are in my opinion comparable. Mainstream areas attract more researchers, generating more papers, citations and higher IFs than others, granted. But more researchers also means more competition and a tougher arena to excel in. The case is the same with journals competing for contributions. In that respect the higher IF should be regarded as a reward for market performance, and I think this is very much in accord with the original idea of the IF being a measure of reputation. Amanitin (talk) 18:39, 2 January 2008 (UTC)

The question is how close. The citation densities of, say, molecular biology and evolutionary biology are widely different, and the reputation is always within one field or another. It is similar in other cases. What example do you have in mind, though? DGG (talk) 07:35, 3 January 2008 (UTC)
I am not quite sure about this: "the reputation is always within one field or another". Anyway, as said, if the two fields share the same publication procedures, meaning peer-reviewed research papers, then I think the IF differences are relevant, because through citation numbers they also indicate competitiveness. In deep-water oceanology there are, let's say, 4 journals; you have to beat 3 to be the best, IF 2. In pharmacology there are 400; you beat 399, IF 25. Amanitin (talk) 21:21, 3 January 2008 (UTC)

Misuse of IF problems

Two of the bullet points under Misuse of IF essentially state the same thing. One of them says that the use of an absolute value for IF is useless and the other says that comparing multiple subjects is impossible using the IF. I just think that this should be fixed to clean up this page. —Preceding unsigned comment added by Katosepe (talkcontribs) 21:11, 4 June 2008 (UTC)

The "misuse" and "debate" sections overlap and repeat each other at parts. This needs cleanup. Perhaps these sections should be reorganized into "criticisms of the IF" and "misuse of the IF". For instance, the fact that some fields are not adequately covered falls under criticism, not misuse (currently this is listed under both). --Crusio (talk) 21:36, 30 June 2008 (UTC)

I also felt there needed to be some cleanup because of overlap. I've started to put some of the points under separate headings, but I still see some overlap and also an overly verbose treatment of some points. I didn't want to step on anyone's toes, so I edited as little as possible. I also added some more recent references. --Brembs (talk) 10:55, 30 July 2008 (UTC)

Haven't had time to look at this in detail, but my first impression is that you have been doing a pretty good job! I'd say, be bold and go ahead with cutting/paring down. One remaining problem is that many assertions in the article remain unsourced and therefore may constitute original research. --Crusio (talk) 11:03, 30 July 2008 (UTC)

Claim about citations

However, it is almost universal for articles in a journal to cite primarily its own articles, for those are the ones of the same merit in the same special field.

That's just not true in my experience, at least in biology. Most articles cite more things from other journals than from the journal they're in. 72.75.66.203 (talk) 23:10, 13 November 2008 (UTC)

The meaning is only that it is normal for the single journal most cited by any particular journal to be itself. 21:27, 15 November 2008 (UTC)
That's not the meaning of "primarily", whatever the author may have intended. In any case, I wouldn't say it's "almost universal" for an article to cite the journal it's in more than any other journal. In fact, think about manuscripts that are submitted to one journal, rejected, and submitted to another, without changes. 72.75.66.203 (talk) 21:32, 15 November 2008 (UTC)

What date each year are impact factors published by ISI?

Does anyone know? It would be good to add this to the article. A page here suggests it is published in January [2] —Preceding unsigned comment added by 194.83.141.120 (talk) 12:16, 20 November 2008 (UTC)

I removed two links that required privileged access because they were not accessible to me as an editor or (presumably) the vast majority of the people who use wikipedia. User:Crusio undid this saying "that's not a reason to remove them".

I am not going to make a big deal about this if somebody undoes my undo, but I don't think links to private websites have any place in wikipedia. It says right below this screen Encyclopedic content must be verifiable.

How am I (or anybody else) supposed to know what lurks behind that log-in screen? —Preceding unsigned comment added by Compo (talkcontribs) 15:12, 16 July 2009 (UTC)

  • Many articles in WP are sourced to such subscription sites, for instance major newspaper sites accessible to subscribers only. And how about print references, would you remove those, too, just because they're not in your local library? I have no time now to look it up, but the relevant policies all say that whether a site is a subscription site or open access does not figure into whether it is a reliable source or not. I suggest you self-revert after checking this. --Crusio (talk) 15:54, 16 July 2009 (UTC)
Many, if not most, academic sources are behind pay-walls. That is very unfortunate, but a fact nonetheless. Therefore, references do not have to be easily accessible for everyone (but they must be properly cited). In this case, however, they were external links, not references. That is why I agree with the removal. Rl (talk) 06:33, 17 July 2009 (UTC)

I don't think the 'if a book's not in your local library' argument holds water. To my mind, an external link is supposed to be a link to a source of information for the users of the encyclopaedia. Sites that require a user to be registered before they can gain access do not seem to fulfil this criterion.

If this was site that said "register with us for only 100€ and we will send you this information for free" rather than "if you are a member of a university we will let you see this", I suspect we would not be having this discussion.

Compo (talk) 10:04, 17 July 2009 (UTC)

  • Many universities don't have access to this, but that is not the point at all. This article is about the impact factor. The link you removed is to the website where those impact factors are published. Impact factors can only be calculated using a big database that costs huge amounts of money to maintain, so you really can't expect them to give away that stuff for free. It's a bit strange to remove from an article on a certain subject the very link to that subject, just because this is a subscription link. --Crusio (talk) 13:49, 17 July 2009 (UTC)
I suspect that the difference between myself and User:Crusio is largely philosophical - I am not sure that it can be simply resolved. When User:Crusio says "but that is not the point" my thought is "but that is exactly the point".
I don't have any problem with things like Thomson ISI being referenced, but I do object to external links that are not accessible. Perhaps the easiest way to resolve this is to simply "cite" the deleted links so that they appear in the reference section.
That way, external links remain as freely accessible sources of information that can be used by anybody who is interested in finding out more, and references remain as sources of authority that can, ultimately, be verified in one way or another.
Compo (talk) 05:25, 18 July 2009 (UTC)
  • Yesterday I had a moment to look up the relevant policies (but stupidly enough did not save the links). You are right, restricted-access sites are discouraged for external links (although if they are very important to the article, exceptions can be made), but there is no problem with references. I'll change the Thomson link to a reference later today. Happy editing! --Crusio (talk) 06:50, 18 July 2009 (UTC)

There is a lot of debate about the term and method

http://scholar.google.com/scholar?hl=en&q=allintitle%3A+journal+impact+factor&as_ylo=2008&as_yhi=2009&btnG=Search

http://scholar.google.com/scholar?hl=en&q=allintitle%3A+journal+impact+factor&as_ylo=&as_yhi=&btnG=Search --222.64.17.99 (talk) 23:54, 5 August 2009 (UTC)

If you require me to

Talk before re-re-adding

Then the claim that anyone can edit on the following page should be changed accordingly

http://en.wikipedia.org/wiki/Main_Page --222.64.17.99 (talk) 00:06, 6 August 2009 (UTC)

Your comments do not suggest that you take discussion or consensus seriously. You might find it useful to watch some talk page conversations before deciding how to interpret the feedback you have been receiving. No one has argued that the type of information that you seek ought not appear in the page. We have made only the rather innocuous point that a section header should not be included for a section that does not exist. As soon as you have some relevant information and the references to support that information, go right ahead and add it to the mainpage together with the new heading.
— James Cantor (talk) 01:40, 6 August 2009 (UTC)

Priority

One of the top priorities early in this article should be to simply state what a desired impact factor is for a good journal. In other words, somebody just say something like "an impact factor of five would be preferable to an impact factor of 1!!!" I'm reading this article and I can't just find out what I'm looking for when I see an impact factor on a web site. That should be one of the first things you do in a wikipedia article. Explain the basic shit in a basic way! Get to the point! Explaining the equation that generates an impact factor is fine, but the first priority is to tell someone what they're looking for, not how it's generated. 75.25.38.57 (talk) 01:10, 26 August 2009 (UTC)

  • Jim, when you have found the time to read the article, you'll know why what you are asking is not possible. There is considerable disagreement about "what it means". I also suggest you tone it down a bit and stay civil. In addition, please sign your comments by adding four tildes (~) at the end of your comment. Thanks. --Crusio (talk) 23:19, 25 August 2009 (UTC)
I agree with Crusio on both counts.— James Cantor (talk) 13:34, 26 August 2009 (UTC)

Fair enough, but it seems like there has to be a way to get more context early on in the article with a History section or something. I just don't think it makes sense to go directly into the related indices and calculation when the average reader probably has more general questions at that point. I'm assuming Thomson had an intended outcome when he created this system; maybe the article could focus on that first and then get into the arguments about whether he accomplished his goal. Also, why would any journal list its impact factor if the very meaning of it is so disputed? As a reader, I saw an impact factor on a web site, went to wikipedia to find out what it was all about, and still don't feel that anyone has explained why a website or journal would list its impact factor. I know that sometimes a question can't be answered, but it seems that the most general questions in a reader's mind should be addressed earlier in any article. For instance, the statement in General that the number of citations isn't an amazingly relevant measure of anything is interesting, but I would like to hear why the person who invented this system went to the trouble of doing so early on in the article, or perhaps something on why journals and websites display a "meaningless" number. I just rarely have this hard of a time getting "the point" out of a wikipedia article. I enjoy hearing about the debate associated with the topic, but first I just want to know why this system is used at all. In an article on fire hoses there might be a debate on their effectiveness, but first there would be an explanation of what a fire hose is intended to be used for and how. 75.25.38.57 (talk) 16:52, 26 August 2009 (UTC)

As you probably have noticed, I'm new to the talk page, so I hope you will all humor me one more time. To be more specific, in the second sentence of the article: "It is frequently used as a proxy for the relative importance of a journal within its field." How? How do people use it when they use it frequently? There is much discussion about the fact that theoretically impact factors could be useful, but the article almost avoids addressing how people who do use them, foolishly or not, go about doing so. Do some people just like big numbers and other people like the number one, so they look for that? Do people who prefer the number seven only read journals with an impact number of seven? Or, if there's no way to explain any of that, then maybe just say that: that it's totally irrational or too complex to be useful. In some way that should be addressed, I think. 75.25.38.57 (talk) 17:19, 26 August 2009 (UTC)

  • What you are saying now actually makes sense to me. Perhaps this article has too much been edited by people who know all about it and therefore don't see it with "fresh eyes" any more, in contrast with you. Unfortunately, I'm way too busy with other things to look into this now, but perhaps other editors here get your message, too. --Crusio (talk) 17:54, 26 August 2009 (UTC)
  • Despite what I just said, I have made a start, but this article is an awful mess, I'm afraid... Now I really have to do some real-life work... --Crusio (talk) 18:41, 26 August 2009 (UTC)

I agree that there's much left to be done but that introduction definitely makes me a lot happier, and I have a real screen name now... so improvement all around. Skiingdemon (talk) 21:39, 27 August 2009 (UTC)

Unsourced statements v. original research

Maybe it is time to point out that there is a crucial difference between unsourced statements and original research. Original research is not accepted in Wikipedia and therefore removed. Unsourced statements that are likely or obviously correct and published should be fixed by adding references, not by deleting them. For instance, the abuse of the IF in the evaluation of individual researchers is well documented but seems to have disappeared from the article. Rl (talk) 13:58, 2 September 2009 (UTC)

  • You're absolutely right, in my zeal I cut that away without noticing that it was now completely gone. I have rephrased the existing misuse statement, as it is referenced by the article cited. I will, in fact, continue working on this article and probably re-add some of the deleted stuff (but now sourced) over the coming days. I have collected a bunch of references on the IF and its use/abuse. Unfortunately, I have other work to do, too (in the real world...) and cannot work on this but intermittently. Please have some patience until I'm done. If you know of good references, please post them here. Thanks! --Crusio (talk) 14:22, 2 September 2009 (UTC)
Here are some that I found useful (sorry about the sloppy formatting):
  • Brown, H. How impact factors changed medical publishing--and science BMJ, 2007, 334, 561-564
  • Garfield, E. Der Impact Faktor und seine richtige Anwendung Anaesthesist, 1998, 47, 439-441 english version
  • Garfield, E. The Agony and the Ecstasy--The History and Meaning of the Journal Impact Factor International Congress on Peer Review and Biomedical Publication, 2005
  • Editor's choice: Let's dump impact factors BMJ, 2004, 329, 0
  • Nature Not-so-deep impact Nature, 2005, 435, 1003-1004
  • Monastersky, R. The Number That's Devouring Science Chronicle of Higher Education, 2005, 52, A12
  • Dong, P.; Loh, M. & Mondry, A. The "impact factor" revisited Biomedical Digital Libraries, 2005, 2, 7
  • PLoS, The Impact Factor Game: It is time to find a better way to assess the scientific literature PLoS Medicine, 2006, 3, e291
  • Rossner, M.; Epps, H. V. & Hill, E. Show me the data The Journal of Cell Biology, 2007, 179, 1091-1092
Rl (talk) 19:11, 2 September 2009 (UTC)

Definition

I think the definition is wrong. —Preceding unsigned comment added by 140.78.69.159 (talk) 16:04, 13 December 2010 (UTC)

  • As far as I can see, the definition is identical to what Garfield says on p5 of that paper. Can you be more specific about what exactly is wrong? --Crusio (talk) 09:05, 6 July 2011 (UTC)
  • The citation count is only for a 1-year period, not 2. The papers cited are from a 2-year period. So, for example, the impact factor of journal J for 2009 is the citations made in 2009 to papers published in J during 2008 and 2007, divided by the number of those papers (rendered as a formula below). Quoting that paragraph from his talk, "A journal’s impact factor is based on two elements: the numerator, which is the number of cites in the current year to any items published in the journal in the previous 2 years; and the denominator, the number of substantive articles (source items) published in the same 2 years." — Preceding unsigned comment added by 78.130.24.119 (talk) 10:36, 6 July 2011 (UTC)
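Rendering the example above as a formula, following Garfield's description quoted in that comment:

```latex
\[
\text{IF}_{2009}(J) \;=\;
\frac{\text{citations made in 2009 to items published in } J \text{ in 2007 and 2008}}
     {\text{number of citable items published in } J \text{ in 2007 and 2008}}
\]
```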

This is fake; these are old numbers published years ago, as can be seen here (from 2007): http://www2b.abc.net.au/science/k2/stn/newposts/3142/post3142056.shtm bersbers 09:38, 12 July 2011 (UTC) — Preceding unsigned comment added by Bersbers (talkcontribs)

i lost the edit war (battle)

Guillaume2303 is a shrewd Wikipedian: I think he knew better than I that if we simply undo each other’s work tit for tat, the outcome would default in his favor. Touché.

As to the substance, Guillaume2303 objects to the reference to the content factor (CF) as being, in his view, both too old (insufficiently novel) and too new (itself not cited by others yet, even though most papers are not cited much by their 7th day). It can't be both, no?

In any case, as detailed on this page, there are controversies surrounding the impact factor (IF). The Content Factor (which is 'total citations' expressed in units comparable to those of the impact factor) might be a remedy. In this new study, the CF was seen to be much (much) more strongly correlated with importance than the IF.

This point is not worthy of blithe dismissal. The IF supposedly is (or is perceived to be) related to importance. The CF allusion discusses a study that actually examined this. It adds, at a small cost in words, to the readers' understanding.

4081xsn (talk) 19:32, 30 July 2012 (UTC)

journals with no impact factor

Could someone include a section on what it means (if anything) for a journal to have no impact factor ranking? Generally it means that the journal is not indexed by Thomson Reuters; however, does this mean anything for a journal? Are journals in the US/Europe more likely to be indexed than journals in South America, Africa, Asia, or international journals? Also, what about journals that get banned (and therefore have no impact factor for a number of years)? http://admin-apps.webofknowledge.com/JCR/static_html/notices/notices.htm --159.178.247.145 (talk) 14:48, 10 September 2012 (UTC)

rename to Impact Factor (capitalized)

This is a proprietary product, not synonymous with "citation impact metric". I'd also like to use {{Italic title}}. Fgnievinski (talk) 03:10, 6 November 2014 (UTC)

  • Not even TR uses "Impact FactorTM" or something like that. It's not a product, it's an index proposed by Garfield decades ago and before ISI (now TR) started publishing the JCR. It's not the title of a book, journal, or film or something like that, so italicizing is not warranted either. --Randykitty (talk) 09:27, 6 November 2014 (UTC)
The article is confusing TR's IF with generic citation impact metrics. IF is calculated exclusively by Thomson-Reuters. There are links across WP linking to IF when they should be linking to citation impact instead. IF is just the most famous incarnation of this generic concept. I'll continue this discussion here: Template talk:Infobox journal#Eigenfactor?. Fgnievinski (talk) 19:06, 6 November 2014 (UTC)

It seems to be quite hard to get to a nice list of journals with their impact factors. A very good external link would be http://admin-apps.webofknowledge.com/JCR/JCR . If there is no reason against it, I would like to add this to the external links section, so that one can easily find this nice listing. Any reasons against it or comments on that before I add it? 134.60.31.73 (talk) 11:48, 19 June 2015 (UTC)

Source for 11,000 journals

The ISI Web of Knowledge indexes more than 11,000 science and social science journals.[5][6]

I couldn't find the number (11,000) in either of those two sources. --MartinThoma (talk) 10:33, 13 October 2015 (UTC)

numeric range?

This article sucks! All I wanted to know was how to interpret 0.68. Is that high, or low? The first sentence should explain that! — Preceding unsigned comment added by 91.146.180.235 (talk) 13:39, 17 November 2015 (UTC)

New important sources on impact factors

Following are important recent sources on "impact factors" should be kept at lead paragraph. All are from prominent sources like PLOS, eLife, EMBO Journal, The Royal Society, Nature (journal) and Science (journal).

Impact factors published by Thomson Reuters is crude and also misleading. Impact factors effectively undervalues papers in subjects that are low citations or have less popular irrespective of quality and novelty[1][2]. Collaborative article from Université de Montréal, Imperial College London, PLOS, eLife, EMBO Journal, The Royal Society, Nature (journal) and Science (journal), posted on BioRxiv to follow citation distributions and share metrics as measures of publication quality, these metrics are alternative to widely practicing impact factors[3][4]. Jessie1979 (talk) 18:21, 2 August 2016 (UTC)

Please stop adding this POV paragraph ("crude and misleading" etc.). The article already has a section for criticisms. Putting POV in the lead and not representing it as such is unencyclopedic and against wiki policy. — Preceding unsigned comment added by 164.67.77.247 (talk) 19:38, 3 August 2016 (UTC)

No one could seriously argue that saying "crude and misleading" is not POV. Also the paragraph is a poorly written grammatical nightmare that doesn't even have proper subject-verb agreement. Please stop edit warring by repeatedly reinserting this problematic paragraph. — Preceding unsigned comment added by 23.242.207.48 (talk) 03:09, 4 August 2016 (UTC)

Some Suggested Updates to First Section

The current second paragraph of the article uses subjective language that is presented as fact - "crude and misleading" and "effectively undervalues." Request to remove this type of language, and place discussion of the articles referenced from Science, Nature, etc. in the "Criticisms" section, not in the lead paragraph, to help keep the article neutral.

Current: Following are important recent sources on "impact factors" should be kept at lead paragraph. All are from prominent sources like PLOS, eLife, EMBO Journal, The Royal Society, Nature (journal) and Science (journal).

"Impact factors published by Thomson Reuters is crude and also misleading. Impact factors effectively undervalues papers in subjects that are low citations or have less popular irrespective of quality and novelty[1][2]. Collaborative article from Université de Montréal, Imperial College London, PLOS, eLife, EMBO Journal, The Royal Society, Nature (journal) and Science (journal), posted on BioRxiv to follow citation distributions and share metrics as measures of publication quality, these metrics are alternative to widely practicing impact factors[3][4]." Mollym3500 (talk) 16:12, 4 August 2016 (UTC)

The paragraph in question has since been removed by other editors. Altamel (talk) 19:22, 5 August 2016 (UTC)

Thanks for the discussion; I removed "crude and misleading". I had kept it because the same wording appears in a prominent source, Nature (journal). These criticisms should be highlighted because high-quality scientific publishers have unanimously raised concerns about impact factors. Jessie1979 (talk) 07:35, 6 August 2016 (UTC)

Those criticisms already exist in the "Criticisms" section. I've moved your citations there and fixed the grammar and POV. There appears to be a consensus on not including the material in the lead. 99.47.245.32 (talk) 18:51, 6 August 2016 (UTC)

Based on the response from major scientific publishers, we should keep this paragraph in the lead. Jessie1979 (talk) 11:47, 9 August 2016 (UTC)

Request to please move this line to the "Criticisms" section where it is more relevant. Mollym3500 (talk) 14:57, 9 August 2016 (UTC)

We have a consensus. I have placed the criticisms in the criticisms section, where they clearly belong. This should not be controversial. I've even preserved all the citations. A single editor has been fighting an edit war to do otherwise and it is getting tiresome. Please, stop, Jessie1979. 164.67.77.247 (talk) 22:45, 9 August 2016 (UTC)

Someone removed the Nature source; I am keeping it again and would like to shift it to the criticism section. It's not vandalism; I am giving respect to well-respected scientific publishers and keeping their comments.  Jessie1979 (talk) 06:16, 10 August 2016 (UTC)

A Few Suggested Updates to Article

Hello - there's some outdated and/or erroneous info in the article as it currently stands, so I'm suggesting some updates that would help bring it up to date. Please see below for specific suggestions, and thank you for your consideration and assistance.

First paragraph: Request to add as the second sentence: "Specifically the impact factor comprises all citations to the journal made in year y to items published in years y-1 and y-2 divided by the number of substantive scholarly items published in years y-1 and y-2." Request to edit the sentence: "Impact factors are calculated yearly starting from 1975 for those journals that are listed (remove current "indexed") in the Journal Citation Reports."
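
For readers following this request, here is a minimal Python sketch of the two-year calculation described in the proposed sentence. All figures and variable names are invented for illustration and do not describe any real journal.

 # Hypothetical counts, for illustration only.
 citations_y_to_prior_two_years = 210   # citations made in year y to items published in y-1 and y-2
 citable_items_prior_two_years = 70     # substantive scholarly items published in y-1 and y-2
 impact_factor = citations_y_to_prior_two_years / citable_items_prior_two_years
 print(round(impact_factor, 1))         # prints 3.0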

Request to remove line "Web of Science (previously known as (ISI) Web of Knowledge) is publishing Journal Citation Reports." as this is inaccurate - Web of Science is itself a solution within the IP & Science division of Thomson Reuters (http://ipscience.thomsonreuters.com/product/web-of-science/).

Request to remove the second paragraph or perhaps place it in a new section, "News." The current sale status is not relevant to the lede. If kept, please remove the Web of Science reference as, again, Web of Science does not publish Impact Factor. It is itself a solution within the IP & Science division of Thomson Reuters.

Calculation Section: Third paragraph: Request to change from current "Occasionally, Thomson Reuters assigns an impact factor to new journals with less than two years of indexing, based on partial citation data.[2][3]" to the following: "Occasionally, Thomson Reuters assigns an impact factor to new journals with less than two years of indexing.[2][3] The calculation always uses two complete and known years of item counts, but for new titles one of the known counts is zero."
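
As a rough illustration of the "partial data" wording proposed above, here is a small Python sketch; it assumes a hypothetical journal first indexed in year y-1, so the y-2 counts are simply zero, and all numbers are invented.

 # Hypothetical counts for a newly indexed journal; the y-2 counts are zero by construction.
 citations = {"y-1": 40, "y-2": 0}
 citable_items = {"y-1": 25, "y-2": 0}
 partial_impact_factor = sum(citations.values()) / sum(citable_items.values())
 print(round(partial_impact_factor, 2))   # prints 1.6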

Use: Request to change reference to "ISI Web of Knowledge" to "Web of Science Core Collection (https://en.wikipedia.org/wiki/Web_of_Science)."

Criticisms - Editorial Policies that Affect Impact Factor: Second paragraph, request to change from current: "As a result of negotiations over whether items are "citable", impact factor variations of more than 300% have been observed. Items considered to be uncitable—and thus are not incorporated in impact factor calculations—can, if cited, still enter into the numerator part of the equation despite the ease with which such citations could be excluded. " to "As a result of reconsideration of whether items are "citable", impact factor variations of more than 300% have been observed. Items considered to be uncitable—and thus are not incorporated in impact factor calculations—can, if cited, still enter into the numerator part of the equation."
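
To make the numerator/denominator asymmetry in that quoted passage concrete, here is a minimal Python sketch with invented figures; the item categories and counts are purely illustrative.

 # Citations to front matter (editorials, letters) still count in the numerator,
 # while those items are excluded from the denominator of citable items.
 citations_to_research_articles = 300
 citations_to_front_matter = 60
 citable_items = 100                      # research articles only
 if_excluding_front_matter = citations_to_research_articles / citable_items                               # 3.0
 if_including_front_matter = (citations_to_research_articles + citations_to_front_matter) / citable_items # 3.6
 print(if_excluding_front_matter, if_including_front_matter)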

Thanks again for your consideration of this edit request. Mollym3500 (talk) 18:29, 19 August 2016 (UTC)

Most of these requests seem to have either been taken care of, or have become inapplicable due to edits that have been made in the meantime. Regarding those that remain:
Calculation insertion in lede: not done. I don't think this kind of specificity is required for the lede, which should be a summary. Exact calculation is explained in detail in the section immediately following.
indexed -> listed: done.
Calculation: Done. -- Elmidae (talk · contribs) 19:15, 25 September 2016 (UTC)

quick fix to make (no time)

Hi, I only have time to mention it here, not to fix it without making a mess: ref. 8 = ref. 48, Jalalian M (2015). "The story of fake impact factor companies and how we detected them". Electron Physician 7 (2): 1069–72. doi:10.14661/2015.1069-1072. Harald88 (talk) 13:08, 6 June 2016 (UTC)

Fixed. --Floatjon (talk) 21:53, 30 October 2016 (UTC)

Hello fellow Wikipedians,

I have just added archive links to one external link on Impact factor. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

 An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot II (Talk to my owner: Online) 02:49, 9 January 2016 (UTC)

Fixed by expanding ref to use archiveurl= instead of wayback template. Has been checked. --Floatjon (talk) 21:56, 30 October 2016 (UTC)

Hello fellow Wikipedians,

I have just modified one external link on Impact factor. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 18:33, 9 April 2017 (UTC)

Hello fellow Wikipedians,

I have just modified 2 external links on Impact factor. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

 An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 08:08, 7 November 2017 (UTC)

Hello fellow Wikipedians,

I have just modified one external link on Impact factor. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

 An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 09:59, 12 November 2017 (UTC)

Concern over the Undoing of My Revisions of the Impact Factor Page

Hi Wikipedians,

I am writing to express concern over the repeated undoing of my revisions of the impact factor page.

On one occasion, I cited recent research results revealing that the presence of simple hyphens in the titles of academic papers adversely affects citation counts and journal impact factors. The administrator undid my revision, pointing out that the new information was "only just published, wait and see whether it gets any traction".

In fact, the article was cited by major international journals, newspapers, and blogs. Both Web of Science and Scopus denied the allegation, and the original authors responded. The amount of interaction was much greater than for an average paper that was "only just published [waiting to] see whether it gets any traction".

On another occasion, I cited the announcement by the University of Cambridge that it has joined the San Francisco Declaration on Research Assessment (DORA) as "an important step for the University, particularly for early career researchers where all too often career progression is based on judgments using flawed metrics." The administrator undid my revision, pointing out that there was "no need to include every single signatory [of DORA] and what they said about it".

In fact, my revision was a second attempt to highlight flaws in impact factors. I have made no attempt to “include every single signatory and what they said about it”.

I would appreciate any channel for appeal. I am just a lone crusader who wants balanced views to be shown in Wikipedia. I do not have the time and energy for edit wars.

-- Laiwoonsiu (talk) 04:20, 21 July 2019 (UTC)

  • Per WP:BOLD, if your edits are challenged, then take this to the talk pages of those articles and try and convince other editors of the need to make them. Hope this helps. --Randykitty (talk) 16:30, 20 July 2019 (UTC) (Copied from my talk page in response to the same query there).
  • Thank you for your kind advice. Following the recommendation, I have submitted my concerns to the relevant Wikipedia pages. In particular, the above is my concern over the undoing of my revisions in the impact factor page. Your understanding and consideration will be appreciated. -- Laiwoonsiu (talk) 14:48, 21 July 2019 (UTC)
  • Dear Wikipedians,
Although I proclaim myself a crusader, I do so only for the sake of software correctness and robustness. I do not have any intention to denounce Web of Science or Google in my Wikipedia edits. In fact, WoS provides a very sophisticated "analytics" system, much, much more than "raw data" in the data-information-knowledge-wisdom pyramid. Similarly, Google provides an unparalleled search engine. Our job as software testers is to verify whether there are failures, rather than to denounce these systems. The former is meant to be supportive.
For this reason, a bug is a bug. It may take some time to fix in a large and complex system. In the meantime, academics may be well advised to avoid hyphens in their paper titles. Such is the usefulness of my editing. I do not have the energy for infinite debates. I will leave the matter to the decision of Wikipedians.
-- Laiwoonsiu (talk) 04:30, 23 July 2019 (UTC)

Content moved over from Open access

I've moved over a couple of sections from the open access page, written by nemo bis, since I think they fit better in the #criticism section here than they did at their previous location (discussion). They probably still need a bit of editing for tone and consistency with the rest of the page. T.Shafee(Evo&Evo)talk 11:41, 31 December 2019 (UTC)

Criticism & Institutional Responses

The criticism section is largely focussed on the statistical flaws in assuming the journal impact factor to be a proxy for research quality. It needs to be supplemented with more discussion of the social implications of the journal impact factor (i.e. how the JIF affects the behaviour and research of scientists). This in turn should be accompanied by more information on the growing number of institutional responses to the use of the impact factor. Xcia0069 (talk) 14:53, 11 August 2021 (UTC)

Institutional Responses

I added in some more evidence here, including further detail of the 2021 dispute in the Netherlands over the use of the impact factor. Two other changes are worth mentioning: the DFG policy change in 2010 did not actually specifically mention removing JIF numbers from CVs, and I could not find a reference to the NSF banning the impact factor (but would be happy if someone else could!). Xcia0069 (talk) 19:12, 14 August 2021 (UTC)

Criticism

A few of the terms used in the headings in the Criticism section did not use very clear English, so I updated them. I am happy to hear better alternatives! I also edited some of these sub-sections to improve the flow of the text, but they still require a bit of editing. Xcia0069 (talk) 19:18, 16 August 2021 (UTC)

Proposal to move/change this article from 'Impact factor' to 'Journal Impact Factor'

This article pertains to a proprietary journal-level metric. There is a lot of confusion and controversy as to what it is and what it can validly be used for. I think it would help clear up some of the confusion if the title of this article were Journal Impact Factor, to indicate more clearly and properly that this is a journal-level metric. I appreciate that merely 'Impact Factor' (without the word Journal) is in common usage, but I think the common usage is imprecise.

Furthermore, this page really refers only to Clarivate's proprietary Journal Impact Factor. There are other 'impact factors' out there, such as the Index Copernicus impact factor.

I think this article should be moved to 'Journal Impact Factor' to make it clearer and more precise what this article is describing. Metacladistics (talk) 11:06, 19 August 2021 (UTC)

I support this suggestion :) 2A00:23C8:810D:BA00:C8E7:C7B6:9609:1FD8 (talk) 15:04, 20 August 2021 (UTC)
I greatly support this change. Medhekp (talk) 18:50, 17 October 2021 (UTC)
I would oppose using capital letters. English only uses capital letters for proper nouns, implying that this might be some company's product name, or the name of an actual journal. The article seems to describe a concept, so it should use lower-case letters (except the first one, since titles capitalize the first word). The body seems to use lower case too. I would favor journal impact factor instead. W Nowicki (talk) 20:39, 18 October 2021 (UTC)

About adding a restricted access subcategory

Randykitty and I have different ideas about what is relevant to the criticism category. I want to add the fact that the Journal Impact Factor is restricted to viewers, as a subscription to the Clarivate product JCR (Journal Citation Reports) is required in order to view a journal's impact factor.

Here is the diff https://en.wikipedia.org/w/index.php?title=Impact_factor&diff=next&oldid=1050511592

I would also like to know why Randykitty thinks it is irrelevant.

I also ask for a vote or a reasoned discussion of this edit. Many thanks. Medhekp (talk) 09:32, 18 October 2021 (UTC)

I completely agree with you. Check my edit and discussion below! Apparently, Randykitty is against any edit highlighting the commercial exploitation of the impact factor by Clarivate. The present article makes the impact factor synonymous with the JCR impact factor, which is a product of Clarivate. KaramMaleki (talk) 20:36, 23 November 2021 (UTC)

Scientific aspect of impact factor (commercializing a scientific topic)

Randykitty perfectly safeguards the business interest of Clarivate by reverting any change undermining their role. This article is more like a business profile for Clarivate.

Impact factor is a scientometric measure and does not belong to Clarivate. The popularity of their business does not grant them a monopoly. It is totally inappropriate to include "calculated by Clarivate" as part of the definition in the first line. It is a shame that a scientific topic is written commercially. KaramMaleki (talk) 17:48, 23 November 2021 (UTC)

  • Well, I must earn all those $$$ that Clarivate pays me to be their stooge. Apart from that, when somebody talks about an impact factor, they mean Clarivate's 2-year IF. None of the alternatives come even close, not the different measures you can find in the JCR nor those provided by Scopus. We may regret that, but it's the reality and that is what we have to present. Like it or not, but Clarivate effectively has a monopoly. --Randykitty (talk) 18:00, 23 November 2021 (UTC)
  • Reporting a monopoly is different from legitimizing it. An independent encyclopedia/article does not do the latter. The correct way is: "Impact factor is X. Currently, Y is the only source". The current version says the impact factor is what Clarivate produces, just as the iPhone is what Apple manufactures. The name Clarivate appears twice in the first line. It is even mentioned before the impact factor itself is explained. Keep in mind that most readers of this article are not fully familiar with impact factors or scientometrics. Therefore, the current version is utterly misleading. Many people believe the Impact Factor is Clarivate's iPhone because of this misleading Wikipedia article. This is why I mentioned safeguarding their business interest. KaramMaleki (talk) 19:08, 23 November 2021 (UTC)
  • The criticism section was indeed the basis of my outcry. First, all my edits were in line with the criticisms extensively referenced in the corresponding section. Second, 80% of the article is a criticism of the JCR Impact Factor, yet the opening section implies that the impact factor is the JCR Impact Factor. Note that many people do not read far beyond the opening section, and probably not the whole article in other languages. Third, one may argue that Clarivate is the only source of journal impact factors; I provided an alternative resource for freely available impact factors, and you deleted it as spam. KaramMaleki (talk) 23:34, 23 November 2021 (UTC)
  • The lead concludes with "it has recently come under attack for distorting good scientific practices". And while exaly could perhaps be mentioned in an appropriate place in the body of the article (provided there's at least one independent reliable source discussing it in depth), I don't see any evidence that even a small fraction of researchers pays even passing attention to it, so mentioning it in the lead is WP:UNDUE. Given that your draft doesn't include even an in-passing mention of Exaly, I doubt that there are more substantial sources, so at this point mentioning it in the article is indeed just spam. Exaly is not new to this; you surely must have noticed that an article on it has been deleted multiple times in the past and is currently salted. --Randykitty (talk) 09:11, 24 November 2021 (UTC)
  • I created the page and mentioned exaly because I believe it is quite comprehensive for a free service (at least when the main service needs a subscription). I admit I was not observant of the Wikipedia rule for secondary sources. Someone else may write it again according to the Wikipedia rules. However, I disagree with you on two points: first, that if something is not popular it is not worth mentioning; second, that if something is not backed by secondary sources it is spam. Spam is irrelevant stuff. I mentioned a free service providing all the information provided by the paid service; you cannot call it irrelevant, trivial, or spam. In any case, my concern is that the article is misleading by making the impact factor a product of Clarivate. To clarify my point, let me write the first sentence of the page Search engine in the style of this article: "Search engine is a software system designed by Google for searching the internet based on Google's indexed database." I wrote the sentence word for word from the current article. KaramMaleki (talk) 01:33, 26 November 2021 (UTC)
  • You're comparing apples and pears. In the case of search engines, despite Google's dominance, basically everyone knows that there are valid alternatives. In the case of impact factors, hardly anybody has ever heard of Exaly, and if people talk about IFs, they mean those IFs that are calculated by Clarivate. Popularity has nothing to do with it: something unpopular may be notable and something popular may not; it all depends on the existence of reliable secondary sources that are independent of the subject and discuss it in depth. I haven't seen any such sources about Exaly yet. --Randykitty (talk) 07:50, 26 November 2021 (UTC)
  • It is not about Exaly. I mentioned it as evidence that there are free sources for the impact factor too. SCI's "Cites / Doc. (2 years)" is technically an impact factor, but they don't call it that because SCI belongs to Scopus, which is owned by Elsevier, which has shared business interests with Clarivate. 95% of authors have never heard the name Clarivate (but most likely have heard of ISI). Even WoS is not the most popular academic search engine; most authors prefer Google Scholar, Scopus, etc. My point is that there is no justification for mentioning Clarivate twice in the first line. I believe the first line should be the conceptual definition of the impact factor without highlighting who provides it. The subsequent lines can stress that Clarivate is the main provider, as opposed to the only provider, of the impact factor. As a fun fact for the comparison of apples and pears: 50% of people believe Google is part of the internet, and 60% of people believe the internet is a facility run by the US government (I might be able to find secondary sources :D) KaramMaleki (talk) 18:31, 26 November 2021 (UTC)
All reputable alternatives to the JCR JIF are careful not to call themselves a JIF; only fake alternatives would call themselves a JIF. That's the gist of the sources cited in the counterfeit section. fgnievinski (talk) 05:20, 1 December 2021 (UTC)

History

I think we should expand the history section to cover recent developments and applications. What do you think? Any ideas about sources? --Karlaz1 (talk) 14:03, 21 February 2022 (UTC)

Society impact

article in society impact 2407:5200:400:1C29:D91F:CC82:B1FA:5404 (talk) 13:51, 31 July 2023 (UTC)

journals with and without impact factors

The article describes that new journals only get an impact factor after 2-3 years, which makes sense. So, when an article like this one talks about journals without impact factors, is it basically talking about young (new) journals? Or is there another reason a journal would not get an impact factor? Azertus (talk) 14:01, 4 September 2023 (UTC)

  • Journals don't automatically get an impact factor. Their publisher has to apply for that and provide lots of information. Whether or not a journal then gets indexed depends on whether it meets Clarivate's criteria. And even when indexed, journals are constantly evaluated and may get delisted if their quality goes down. Some journals never manage to get an IF (predatory journals, for example). See Journal Citation Reports. Hope this helps. --Randykitty (talk) 15:52, 4 September 2023 (UTC)
  1. ^ "Time to remodel the journal impact factor". Nature (journal). 535 (466). 2016. doi:10.1038/535466a.
  2. ^ John Bohannon (2016). "Hate journal impact factors? New study gives you one more reason". Science (journal). doi:10.1126/science.aag0643.
  3. ^ Veronique Kiermer (2016). "Measuring Up: Impact Factors Do Not Reflect Article Citation Rates". PLOS. doi:orcid.org/0000-0001-8771-7239. {{cite journal}}: Check |doi= value (help)
  4. ^ "Ditching Impact Factors for Deeper Data". Retrieved 2016-07-29.