Don’t have time to stay on top of the most important Open Science and Altmetrics news? We’ve gathered the very best of the month in this post. Read on!
UK researchers speak out on assessment metrics
There are few issues more polarizing in academia right now than research assessment metrics. A few months back, the Higher Education Funding Council for England (HEFCE) asked researchers to submit their evidence and views on the issue, and to date many well-reasoned responses have been shared.
Some of the highlights include Ernesto Priego’s thoughtful look at the evidence for and against; this forceful critique of the practice, penned by Sabaratnam and Kirby; a call to accept free market forces “into the internal dynamics of academic knowledge production” by Steve Fuller; and this post by Stephen Curry, who shares his thoughts as a member of the review’s steering group.
Also worth a look is Digital Science’s “Evidence for excellence: has the signal overtaken the substance?”, which studies the unintended effects that past UK assessment initiatives have had on researchers’ publishing habits.
Though the HEFCE’s recommendations will mainly affect UK researchers, the steering group’s findings may set a precedent for academics worldwide.
Altmetrics researchers agree: we know how many, now we need to know why
Researchers gathered in Bloomington, Indiana on June 23 to share cutting-edge bibliometrics and altmetrics research at the ACM WebScience Altmetrics14 workshop.
Some of the highlights include a new study that finds that only 6% of articles that appear in Brazilian journals have 1 or more altmetrics (compared with ~20% of articles published in the “global North”); findings that use of Twitter to share scholarly articles grew by more than 90% from 2012 to 2013; a study that found that most sharing of research articles on Twitter occurs in original tweets, not retweets; and a discovery that more biomedical and “layman” terms appear in the titles of research shared on social media than in titles of highly-cited research articles.
Throughout the day, presenters repeatedly emphasized one point: high-quality qualitative research is now needed to understand what motivates individuals to share, bookmark, recommend, and cite research outputs. In other words, we increasingly know how many altmetrics research outputs tend to accumulate and how those metrics correlate with one another; now we need to know why research is shared on the social Web in the first place, and how those motivations influence various flavors of impact.
Librarians promoting altmetrics like never before
This month’s Impactstory blog post, “4 things every librarian should do with altmetrics,” has generated a lot of buzz and some great feedback from the library community. But it’s just one part of a month filled with librarians doin’ altmetrics!
To start with, College & Research Libraries News named altmetrics a research library trend for 2014, and based on just the explosion of librarian-created presentations on altmetrics in the last 30 days alone, we’re inclined to agree! Plus, there were librarians repping altmetrics at AAUP’s Annual Meeting and the American Library Association Annual Meeting (here and here), and the Special Libraries Association Annual Meeting featured our co-founder, Heather Piwowar, in two great sessions and Impactstory board member, John Wilbanks, as the keynote speaker.
More Open Science & Altmetrics news
- NIH makes a major change to the biosketch format: now, applicants will be asked to list their accomplishments instead of their publications in their NIH biosketches. As you can see from the blog post’s comments, some researchers are up in arms. But supporters of the change point out that it moves science away from defining success using a very narrow format, the journal article.
- Your input is needed on NISO’s altmetrics recommended practices: the standards-setting organization NISO has requested comments on a number of action items and recommended practices related to altmetrics. Their final report will likely have consequences for how publishers, universities, and scholars access and use altmetrics. Interested? You can submit your feedback through July 18, 2014.
- Altmetrics companies Plum Analytics and Altmetric roll out great new features: in June, Plum Analytics rolled out a new data visualization for their platform, PlumX, called the Plum Print, became the first altmetrics company to add download and pageview count information from library database vendor EBSCO, and teamed up with research foundation Autism Speaks to share “powerful and actionable information about the research they fund.” And the fine folks at Altmetric released their institutional edition and added a crucial new metric to their reports: citations in policy documents. Congrats, guys!
- You can now export Impactstory metrics directly to your CV: not long after we published our thoughts on why you should put altmetrics on your CV, post-doc and Impactstory Advisor Guillaume Lobet created an Impactstory CV generator. Provide your Impactstory handle and his tool exports your CV in HTML, PDF, LaTeX, or Markdown format. Check it out now!
- Save the date for Europe’s first altmetrics conference: 1am: London will be held September 25-26 at the Wellcome Trust’s offices in London, and (based on the rumors we’ve heard about the lineup) is shaping up to be great! For more information as it is released, watch altmetricsconference.com.
Stay connected
We share altmetrics and Open Science news as it happens on our Twitter, Google+, Facebook, and LinkedIn pages. And if you don’t want to miss next month’s news roundup, remember that you can sign up to get these posts and other Impactstory news delivered straight to your inbox.