Metrics, Metrics, (Alt)metrics

By Amy Atchison


If it sometimes feels like success in academia boils down to metrics, that’s because in many US institutions it does (sadly) boil down to metrics. We all know that it isn’t just the number of publications you have. It’s also the impact factor of the journals in which those articles were published, the citation count per article, and your h-index score. (And don’t get me started on the non-research metrics, like course evaluations, which we all know are notoriously flawed. See here, here, and here.) Those measures are all widely used. But a recent Twitter thread indicated to me that some political scientists may not be aware of a new(ish) measure that can quantify the use of your non-publication outputs as well as your social media reach: altmetrics. This is helpful if your institution puts a premium on public engagement.

Altmetrics are simply alternative measures of scholarly reach/output. They include measures like the number of downloads of your work from your institutional repository or the number of mentions on social media. The usage metrics provided by Academia.edu or ResearchGate are also considered alternatives to traditional metrics (use the latter with caution, though).

In this post, I focus on Altmetric Badges from Altmetric.com because they aggregate many sources of attention and because many leading journals have started adding Badges to their sites. I give a brief overview of altmetrics, including how they can be used in promotion and tenure (P&T) applications, as well as the pros/cons. I end the post with a brief overview of the problems inherent in many of the traditional measures we use to evaluate scholarship (citation counts, journal impact factors, etc.) since I have found that almost no one tells people these things in grad school. (But they’re helpful to know!)

About Altmetrics & the Altmetric Badge

The Altmetrics Manifesto was a response to an academic environment in which (1) traditional metrics were under fire from many quarters of the academy (see criticisms overview, below) and (2) scholars were increasingly engaging with social media and blogging. The authors of the manifesto promoted alternative metrics to address both issues. Since then, use and tracking of altmetrics has grown, and many large academic publishers now include Altmetric.com’s “attention” scores for the articles they publish. The attention score tracks your “citations on Wikipedia and in public policy documents, discussions on research blogs, mainstream media coverage, bookmarks on reference managers like Mendeley, and mentions on social networks such as Twitter.”

The circle pictured here is what you see on journal/publisher sites; it’s called an Altmetric Badge. At the center is the article’s attention score; the “donut” is the colored ring around it. The attention score isn’t an exact measure of your reach; it’s a weighted estimate based on the volume of mentions, the media type, and who is sharing your research. Each color in the donut represents a different media type (blogs, Twitter, etc.).

[Image: an example Altmetric Badge]
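
To make “weighted estimate” concrete, here is a minimal sketch of the general idea, with invented weights and counts. Altmetric’s actual scoring is more involved (it also adjusts for the audience of each source), so treat this as an illustration of the logic, not their algorithm:

```python
# Illustration only: invented weights, not Altmetric's actual values.
# The core idea of a source-weighted attention score: each media type
# contributes (number of mentions x weight), so one news story can
# count for more than many tweets.

SOURCE_WEIGHTS = {
    "news": 8.0,       # hypothetical weight per news story
    "blog": 5.0,       # hypothetical weight per blog post
    "twitter": 1.0,    # hypothetical weight per tweet
    "facebook": 0.25,  # hypothetical weight per FB mention
}

def attention_score(mentions):
    """Weighted sum of mentions across media types."""
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Example: 2 news stories, 1 blog post, 30 tweets -> 2*8 + 1*5 + 30*1 = 51
print(attention_score({"news": 2, "blog": 1, "twitter": 30}))  # 51.0
```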

While not all journals track altmetrics yet, it’s interesting to note that the social sciences & humanities seem to be early adopters, which may indicate that our colleagues are seeing utility in the measure. If the journal uses Altmetric Badges, your Badge will show up on the journal’s site automatically once your article has received some online love (Tweets, FB mentions, blog posts, news articles, etc.). Finding your attention score is easy: just find your article on the journal/publisher site, then click on “Metrics.” (Altmetric Badges often appear next to the article listing in the online table of contents for the issue, as well.)

If your article has an attention score, the Altmetric Badge will typically be the first thing listed in the metrics. If you can’t find the metrics button on the publisher’s website, you can always add Altmetric.com’s “bookmarklet” to your browser; click it and it gives you the Altmetric Badge of any article you’re reading (if it has an attention score, of course). When you click a Badge, it takes you to a breakdown of the score’s components, including the number of mentions, where the attention is coming from, the professions of the people engaging with the work, and how the score compares to other articles.
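
If you’d rather pull attention data programmatically than click through Badges, Altmetric.com also offers a free public API keyed by DOI. Here’s a minimal sketch in Python; the endpoint is the one Altmetric documents, but treat response fields beyond `score` as assumptions and check their API docs (the free tier returns a 404 for articles with no recorded attention):

```python
# Minimal sketch: fetch an article's Altmetric attention score by DOI.
# Endpoint: https://api.altmetric.com/v1/doi/<doi> (free public API).
import requests

def get_attention_score(doi):
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # no attention recorded for this DOI yet
        return None
    resp.raise_for_status()
    return resp.json().get("score")

# Hypothetical DOI, for illustration only:
score = get_attention_score("10.1017/S0000000000000000")
print(score if score is not None else "No attention score yet")
```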

If your article doesn’t have an attention score and you’d like to change that, check out these tips and tricks. I’ve found that Twitter feeds like Women Also Know Stuff (@womenalsoknow) and PSA Women & Politics (@PSAWomenPol) have been great ways to promote my work on social media. The WPSA also has a Twitter account where they are happy to promote your work (@theWPSA). Just tag or @ them.

Can I use this in my Tenure & Promotion File?

I included altmetrics data in my tenure file, so yes, you probably can. The nice folks at Altmetric.com even have a handy-dandy infographic guide (and longer post) to help you incorporate altmetrics into your tenure and promotion applications appropriately. The challenge may be that some of the people on your committee won’t know what altmetrics are; I included the “what are altmetrics?” page from Altmetric.com in my tenure file to explain.

If you’re not immediately up for tenure or promotion, you can start laying the groundwork now by talking to your department/likely tenure voters about altmetrics.

Some things that might be helpful here:

  • Many research libraries have LibGuides on altmetrics, including sections for using altmetrics for tenure and promotion, like this one from Oregon State University. Check your library website—you might even get institution-specific tips. Your subject librarian might also be willing to do a brief presentation to your department.
  • Altmetrics fill in some gaps left by traditional measures like citation counts. For example, altmetrics can tell you if your publications have been cited in non-academic formats (public policy documents, practitioner publications, news media) and demonstrate the reach of your non-publication outputs like datasets.
  • Because an article’s citations typically peak 3-5 years after publication, altmetrics can be an early indicator of the type of impact an article may have. This is helpful if you are a junior scholar whose work has not yet had time to accumulate citations.
  • The available evidence suggests a positive association between altmetrics and later citations—it makes sense that the more eyes on your research, the greater the likelihood that it’ll get cited. Keep in mind, though, that the Altmetric.com folks only started tracking attention metrics in 2011, so there isn’t strong longitudinal data yet. Bottom line: don’t oversell the altmetrics-citation link.

The biggest thing to keep in mind if you’re using altmetrics in your tenure or promotion application is that your attention score is a complementary statistic that can be helpful if used appropriately. It’s just one more tool in your P&T toolbox.

Pros and Cons of Altmetrics

People more knowledgeable than I have put together lists of the advantages, and I think this article’s list is pretty succinct, so I’ve pasted it here for you (verbatim):

Altmetrics offer four potential advantages:

  • A more nuanced understanding of impact, showing us which scholarly products are read, discussed, saved and recommended as well as cited.
  • Often more timely data, showing evidence of impact in days instead of years.
  • A window on the impact of web-native scholarly products like datasets, software, blog posts, videos and more.
  • Indications of impacts on diverse audiences including scholars but also practitioners, clinicians, educators and the general public.

Also, the part where altmetrics may be predictive of citation patterns is a pretty big pro. Of course, it could be a con if your article gets very little attention…but you probably wouldn’t highlight a lack of altmetrics/citations in your tenure or promotion application anyway.


Turning to the disadvantages, the Altmetric.com site itself has a pretty good list of limitations, including full explanations of the following:

  • Altmetrics don’t tell the whole story
  • Like any metric, there’s a potential for gaming of altmetrics
  • Altmetrics are relatively new; more research into their use is needed

On that last one, it’s more than just “more research is needed.” (How many times have we all typed that phrase?) It’s also that Altmetric Badges are still evolving and do not include every possible source that could be mentioning your work. For example, I’ve written two LSE Impact Blog posts that reference two separate PS articles, but those posts are not reflected in my Badges for those articles because Altmetric.com does not yet monitor the LSE blogs. However, this is manageable: I’ve put in a suggestion to add LSE to the monitored blogs; users can always suggest new sites for them to monitor.

To the Altmetric.com list, I would add just one more limitation:

  • Just as databases sometimes miss citations to your work (see below), sometimes Altmetric.com misses some of the attention being paid to your work. It could be a typo in the Tweet, it could be a glitch, or (most likely) it could be that you didn’t link to a version of the article that has a DOI. Keep in mind that tracking works best when the DOI (or book ISBN) is available—so use the publisher’s link to your work. And if they do miss one, they have a mechanism for you to submit missed attention.

Finally, you should be aware that there are people who are actively hostile to altmetrics – some people really don’t like change, I guess. So if you run into one of those at your institution, proceed with caution.

Now, having just told you the drawbacks to altmetrics, it’s worthwhile to also mention the most commonly criticized drawbacks to traditional academic metrics (like journal impact factor). In conversations with colleagues, it seems that there’s not a lot of understanding of the flaws in our measures of academic “success.”

You can stop reading here if you’ve already got a grip on the common criticisms of article counts, journal impact factors, citation counts, and h-index scores. Otherwise, read on, Macduff.

The Drawbacks of Traditional Metrics

Altmetrics came about, in part, as a reaction against traditional academic metrics such as article counts, journal impact factors, citation counts, and h-index scores. All of these metrics have utility, don’t get me wrong, but they all have some serious issues, as well. Despite these issues, many institutions’ P&T decisions are heavily reliant on these measures; it’s this overreliance on them as a measure of a scholar’s worth that many people find to be objectionable.

First, number of articles seems straightforward, does it not? Just count how many articles you’ve published. Easy-peasy. But it really isn’t that straightforward. Does your department “count” co-authored publications? If so, do they count as full or partial publications? What if you’re the second author? Is your solo-or-dual-authored article counted the same as that article on which your colleague in the natural sciences is literally the 17th author? What about pedagogy articles? What about articles published outside of the discipline? I could go on, but you get the point: article count isn’t as easy as it seems.

Next we get to journal impact factor (JIF), which has so many flaws that I can’t get to all of them in the space allotted. So, let me focus on just one: the JIF was never intended to be applied to individual publications. It’s a fairly simple metric, typically described as the average number of citations per article: citations received in a given year to articles the journal published in the previous two years. It is a proprietary score calculated by Thomson Reuters and published in the Journal Citation Reports. Of note is the fact that, technically, the calculation isn’t exactly the average of the articles’ citations; it’s citations divided by “items” published in the journal over the two years under review, whatever “items” means (apparently Thomson Reuters is not overly forthcoming on that score). That aside, the average number of citations per article is misleading, at best.
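
To make the calculation concrete, here’s the arithmetic as a short sketch. The citation total is invented, chosen only so the result lands on a plausible score; the real inputs are Thomson Reuters’ proprietary counts:

```python
# 2015 JIF, roughly: citations received in 2015 to items the journal
# published in 2013-2014, divided by the number of "items" from 2013-2014.
# Numbers below are hypothetical, for illustration only.

citations_2015_to_2013_14_items = 396  # invented citation total
items_published_2013_14 = 115          # invented "citable items" count

jif_2015 = citations_2015_to_2013_14_items / items_published_2013_14
print(round(jif_2015, 2))  # 3.44
```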

I’ll use the APSR as an example. The APSR’s 2015 score, 3.44, would typically be interpreted to mean that 2013-14 APSR articles were cited an average of 3.44 times in 2015. So, in the absence of the Thomson Reuters data that were used to calculate the impact factor, I used Publish or Perish to pull all (115) APSR articles from 2013 and 2014; I found that nearly 20% have fewer than 3.44 citations, and almost 9% have zero citations. (It could be worse; apparently the share of articles under the average in some journals can be as high as 75%.) The APSR’s 2015 JIF is strongly influenced by the 10% of articles that were cited more than 100 times (including one with 728 citations). With that in mind, the bottom line is that if departments are using JIFs as an indicator of article quality, then the authors of the never-cited papers are essentially free-riding on other authors’ citation counts. This only scratches the surface of academics’ concerns regarding JIFs. See here, here, and here for more.
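
The free-riding point is really a point about skew: a handful of blockbuster articles drag the mean far above what a typical article achieves. A toy demonstration with an invented citation distribution (not the actual APSR data):

```python
# Toy demonstration: skewed citation distributions inflate the mean.
# Counts below are invented, not the actual APSR numbers.
from statistics import mean, median

citations = [0] * 10 + [1] * 25 + [2] * 30 + [4] * 25 + [8] * 10 + [40] * 4 + [500]

avg = mean(citations)
print(round(avg, 2))      # mean ("impact factor" logic) -> 8.81
print(median(citations))  # median: what a typical article gets -> 2
below = sum(c < avg for c in citations) / len(citations)
print(f"{below:.0%} of articles fall below the journal's 'average'")  # ~86%
```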

Moving on to citation counts: those are a bit thorny, as well (and are another problem with JIFs). How many citations has your article received? Well, that depends on which database you ask (Google Scholar, Web of Science, or Scopus). Citation counts come from the reference sections of published papers, and each database uses different rules for counting those citations, so the counts can be very different. To use myself as an example, Google Scholar indicates that my 2015 PS paper (with Jonathan K. Bull) has received 25 citations; Scopus says it has 13, and Web of Science says 9. Not only do counts vary wildly, but using citation counts as a measure of a junior scholar’s worthiness for tenure is seriously problematic, since citations typically peak 3-5 years after publication, and some research indicates the lag is longer for female-authored articles. This is likely because women’s articles aren’t cited at the same rate as men’s. Side note/shameless self-promotion: my recent research indicates that women can negate men’s gender citation advantage by making their work open access after publication.

Finally, let’s talk h-index scores. Because the h-index is built from citation counts, it inherits all of the problems just described, and that’s only one of the many reasons h-index scores aren’t the world’s best measure of scholar quality. Others include the lack of comparability across disciplines or between ranks, plus the fact that non-article publications (from books to blog posts) are excluded, meaning not all of a scholar’s work is incorporated into the score. How can you measure “scholar quality” if a scholar’s full body of work is not being assessed? And, of course, there’s also the “which database?” question, because each one calculates its own h-index. Google Scholar says my h-index is 4, Scopus says 2, and Web of Science says 1 (Web of Science doesn’t seem to be using all of my articles, either). So, for all of these reasons & more, the h-index truly tells you very (very) little about the quality of an author’s work. Use it with extreme caution.
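
For anyone fuzzy on the definition: your h-index is the largest h such that h of your papers each have at least h citations. A quick sketch (the per-paper counts are hypothetical), which also shows why the databases disagree—feed it each database’s citation counts and you get each database’s h-index:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Same scholar, different databases (hypothetical per-paper counts):
print(h_index([25, 13, 9, 4, 2]))  # e.g., Google Scholar's counts -> 4
print(h_index([13, 5, 2, 1, 0]))   # e.g., Scopus's sparser counts -> 2
```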

Wrapping Up

The TL;DR version of this post is this: citation count isn’t the only measure of impact; check out altmetrics as a complement to more traditional measures of scholarly output/quality; traditional measures are flawed; and any measure can be misused. Knowing how they can be misused helps you put them into context when applying for tenure and promotion.

Amy L. Atchison is associate professor of political science and international relations at Valparaiso University.  Her work has appeared in PS: Political Science and Politics, Politics & Gender, the Journal of Women, Politics, & Policy, the Journal of Political Science Education, and Poverty & Public Policy. Her Twitter handle is @Dr_Atchison. 
