***** To join INSNA, visit http://www.insna.org *****
One finding stands out to me in Professor Martín's very useful set of sources on the h-index, and it ties in with other comments: "The use of the h-index could provoke changes in the publishing behaviour of scientists, such an artificial increase in the number of self-citations distributed among the documents on the edge of the h-index". An ongoing issue with any bibliometric measure of success, prominence, or reputation is that it can produce systematic feedback effects on the very phenomena it measures. There are fewer problems if measures of prominence (error included) are treated as such in isolation, but if they are relied upon in publication or citation decisions, those sources of "error" will be compounded and re-integrated into the citation graph in ways that would be difficult to separate from prior "true" prominence by bibliometric means.

At a minimum, I think it's safe to say that there is no single quantitative measure I could use as a "hallmark" of success in my field, even setting aside problems of dataset sourcing. Google Scholar says Stapel still has an h-index of 26 on his publications since 2011, after all!

Robert Marriott

On Tue, Jan 12, 2016 at 3:39 PM, Manuel Jesús Cobo Martín <[log in to unmask]> wrote:
Dear Valdis,

The h-index was introduced by Hirsch (Hirsch JE (2005) An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences 102:16569-16572, doi: 10.1073/pnas.0507655102), and it is defined as: "A scientist has index h if h of his or her Np papers have at least h citations each and the other (Np - h) papers have <= h citations each."
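
For concreteness, here is a minimal sketch in Python (not part of the original thread) of how h can be computed from a list of per-paper citation counts:

    def h_index(citations):
        """Return the largest h such that h papers have at least h citations each."""
        ranked = sorted(citations, reverse=True)  # citation counts, highest first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Example: five papers cited [10, 8, 5, 2, 1] times give h = 3,
    # because three papers have at least 3 citations each.
    print(h_index([10, 8, 5, 2, 1]))  # -> 3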

Although an h-index of 16 could be high for some disciplines, for others it could be low. The best way to know whether an h-index is high is to compare it with the h-indices of relevant colleagues in the same field/discipline.

Finally, you can calculate the h-index in different bibliometric databases (ISI Web of Science, Scopus, Google Scholar, etc.). The h-index can vary significantly between them. For example, getting a citation in ISI Web of Science is more difficult than getting one in Google Scholar.

Also, altmetrics such as paper downloads or views are not taken into account in the computation of the h-index, since it only takes into account the citations received. But some studies argue that altmetrics contribute positively to getting more citations, and therefore to a higher h-index.

If you would like to read more about this topic, you might be interested in this website maintained by my colleagues: http://sci2s.ugr.es/hindex

Best Regards,

Manuel Jesús.

@======================================================================@
Manuel Jesús Cobo Martín (Assistant professor).
 ----------------------------------------------------------------------
Dept. Computer Science
University of Cádiz. 11202 Algeciras SPAIN
e-mail: [log in to unmask]
e-mail: [log in to unmask]
Personal URL: http://sci2s.ugr.es/members#MJCobo
twitter: @mjcobomartin
Research ID: http://www.researcherid.com/rid/C-5581-2011
ORCID: https://orcid.org/0000-0001-6575-803X
Google Scholar profile: http://scholar.google.com/citations?user=sc5Kz0oAAAAJ
ResearchGate: https://www.researchgate.net/profile/Manuel_Cobo
 ----------------------------------------------------------------------
Research group "Soft Computing and Intelligent Information Systems"
URL Research group: http://sci2s.ugr.es
@======================================================================@


On 12/01/2016 at 21:08, Valdis Krebs wrote:
Happy 2016 to all!

I was reading an article on academic publications ( https://www.timeshighereducation.com/news/the-1-per-cent-at-the-centre-of-research/2014812.article )  and saw this statement…

… an h-index of 16 or more, which is regarded as the “hallmark of a successful scientist” ...

Is that true???  For both hard and social sciences?  And if true, which calculation do you trust? Is the Google Scholar h-Index a trusted source?

This brings up another question… How do all of the websites for papers/research, such as arXiv.org, academia.edu, researchgate.net, etc., affect paper rankings? Maybe many of these papers never make it into an “official journal”? These sites probably keep stats on papers browsed, papers downloaded, and so on. A downloaded paper shows greater interest/attention than a browsed paper. Are these online publications omitted from, or included in, h-index calculations? Does anyone know how all of this works?

Thanks,

Valdis


_____________________________________________________________________ SOCNET is a service of INSNA, the professional association for social network researchers (http://www.insna.org). To unsubscribe, send an email message to [log in to unmask] containing the line UNSUBSCRIBE SOCNET in the body of the message.