Let’s value the software that powers science: Introducing Depsy

Today we’re proud to officially launch Depsy, an open-source webapp that tracks research software impact.

We made Depsy to solve a problem: in modern science, research software is often as important as traditional research papers–but it’s not treated that way when it comes to funding and tenure. There, the traditional publish-or-perish, show-me-the-Impact-Factor system still rules.

We need to fix that. We need to provide meaningful incentives for the scientist-developers who make important research software, so that we can keep doing important, software-driven science.

Lots of things have to happen to support this change. Depsy is a shot at making one of those things happen: a system that tracks the impact of software in software-native ways.

That means not just counting up citations to a hastily-written paper about the software, but actual mentions of the software itself in the literature. It means looking at how software gets reused by other software, even when it’s not cited at all. And it means understanding the full complexity of software authorship, where one project can involve hundreds of contributors in multiple roles that don’t map to traditional paper authorship.

OK, this sounds great, but how about some specifics? Check out these examples:

  • GDAL is a geoscience library. Depsy finds this cool NASA-funded ice map paper that mentions GDAL without formally citing it. Also check out key author Even Rouault: the project commit history demonstrates he deserves 27% credit for GDAL, even though he’s overlooked in more traditional credit systems.
  • lubridate improves date handling for R. It’s not highly-cited, but we can see it’s making a different kind of impact: it’s got a very high dependency PageRank, because it’s reused by over 1000 different R projects on GitHub and CRAN (a minimal sketch of the PageRank computation follows this list).
  • BradleyTerry2 implements a probability technique in R. It’s only directly reused by 8 projects—but Depsy shows that one of those projects is itself highly reused, leading to huge indirect impacts. This indirect reuse gives BradleyTerry2 a very high dependency PageRank score, even though its direct reuse is small, and that makes for a better reflection of real-world impact.
  • Michael Droettboom makes small (under 20%) contributions to other people’s research software–contributions that are easy to overlook. But the contributions are meaningful, and they’re to high-impact projects, so in Depsy’s transitive credit system he ends up as a highly-ranked contributor. Depsy can help unsung heroes like Michael get rewarded.
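
Depsy’s real pipeline is more involved, but the core mechanics behind dependency PageRank and transitive credit are easy to sketch. Here’s a minimal, illustrative Python version: the package graph, contribution shares, and damping factor below are invented for this example, not Depsy’s actual data or code.

```python
# A toy dependency graph. Edges point from a package to the packages
# it depends on, so credit for reuse flows "down" to dependencies.
deps = {
    "app_a": ["mid_lib"],
    "app_b": ["mid_lib"],
    "mid_lib": ["niche_lib"],  # mid_lib is heavily reused...
    "niche_lib": [],           # ...and is niche_lib's only dependent
}

def dependency_pagerank(deps, damping=0.85, iters=50):
    """Plain PageRank over the dependency graph."""
    nodes = list(deps)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for pkg, ds in deps.items():
            if ds:
                # Split this package's rank among its dependencies.
                for d in ds:
                    new[d] += damping * rank[pkg] / len(ds)
            else:
                # Dangling node: spread its rank evenly across all nodes.
                for n in nodes:
                    new[n] += damping * rank[pkg] / len(nodes)
        rank = new
    return rank

rank = dependency_pagerank(deps)
# niche_lib scores high despite having a single direct dependent,
# because that dependent (mid_lib) is itself heavily reused.

# Transitive credit: each contributor's commit share in a project,
# weighted by that project's impact, summed over projects.
shares = {"contributor_x": {"mid_lib": 0.15, "niche_lib": 0.10}}
credit = {
    person: sum(frac * rank[pkg] for pkg, frac in project_shares.items())
    for person, project_shares in shares.items()
}
print(rank)
print(credit)
```

This is why small contributions to high-impact projects can outrank large contributions to unused ones: the project’s rank multiplies the contributor’s share.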
     

Depsy doesn’t do a perfect job of finding citations, tracking dependencies, or crediting authors (see our in-progress paper for more details on limitations). It’s not supposed to. Instead, Depsy is a proof-of-concept to show that we can do them at all. The data and tools are there. We can measure and reward software impact, like we measure and reward the impact of papers.

Embed impact badges in your GitHub README

Given that, it’s not a question of whether research software becomes a first-class scientific product, but when and how. Let’s start having those when-and-how conversations (here are some great places for that). Let’s improve Depsy, let’s build systems better than Depsy, and (most importantly) let’s start building the cultural and political structures that can use these systems.

For lots more details about Depsy, check out the paper we’re writing (and contribute!), and of course Depsy itself. We’re still in the early stages of this project, and we’re excited to hear your feedback: hit us up on Twitter, in the comments below, or in the Hacker News thread about this post.

Depsy is made possible by a grant from the National Science Foundation.
Edit (Nov 15, 2015): changed embed image to match new badge.

Better than a free Ferrari: Why the coming altmetrics revolution needs librarians

This post was originally published as the foreword to Meaningful Metrics: A 21st Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact [paywall, embargoed for 6 months]. It’s also persistently archived on figshare.

A few days ago, we were speaking with an ecologist from Simon Fraser University here in Vancouver, about an unsolicited job offer he’d recently received. The offer included an astonishing inducement: anyone from his to-be-created lab who could wangle a first or corresponding authorship of a Nature paper would receive a bonus of one hundred thousand dollars.

Are we seriously this obsessed with a single journal? Who does this benefit? (Not to mention, one imagines the unfortunate middle authors of such a paper, trudging to a rainy bus stop as their endian-authoring colleagues roar by in jewel-encrusted Ferraris.) Although it’s an extreme case, it’s sadly not an isolated one. Across the world, A Certain Kind of administrator is doubling down on 20th-century, journal-centric metrics like the Impact Factor.

That’s particularly bad timing, because our research communication system is just beginning a transition to 21st-century communication tools and norms. We’re increasingly moving beyond the homogeneous, journal-based system that defined 20th century scholarship.

Today’s scholars increasingly disseminate web-native scholarship. For instance, Jason’s 2008 tweet coining the term “altmetrics” is now more cited than some of his peer-reviewed papers. Heather’s openly published datasets have gone on to fuel new articles written by other researchers. And like a growing number of other researchers, we’ve published research code, slides, videos, blog posts, and figures that have been viewed, reused, and built upon by thousands all over the world. Where we do publish traditional journal papers, we increasingly care about broader impacts, like citation in Wikipedia, bookmarking in reference managers, press coverage, blog mentions, and more. You know what’s not capturing any of this? The Impact Factor.

Many researchers and tenure committees are hungry for alternatives, for broader, more diverse, more nuanced metrics. Altmetrics are in high demand; we see examples at Impactstory (our altmetrics-focused non-profit) all the time. Many faculty share how they are including downloads, views, and other alternative metrics in their tenure and promotion dossiers, and how evaluators have enthused over these numbers. There’s tremendous drive from researchers to support us as a nonprofit, from faculty offering to pay hundreds of extra dollars for profiles, to a Senegalese postdoc refusing to accept a fee waiver. Other altmetrics startups like Plum Analytics and Altmetric.com can tell you similar stories.

At higher levels, forward-thinking policy makers and funders are also seeing the value of 21st-century impact metrics, and are keen to realize their full potential. We’ve been asked to present on 21st-century metrics at the NIH, NSF, the White House, and more. It’s not these folks who are driving the Impact Factor obsession; on the contrary, we find that many high-level policy-makers are deeply disappointed with 20th-century metrics as we’ve come to use them. They know there’s a better way.

But many working scholars and university administrators are wary of the growing momentum behind next-generation metrics. Researchers and administrators off the cutting edge are ill-informed, uncertain, afraid. They worry new metrics represent Taylorism, a loss of rigor, a loss of meaning. This is particularly true among the majority of faculty who are less comfortable with online and web-native environments and products. But even researchers who are excited about the emerging future of altmetrics and web-native scholarship have a lot of questions. It’s a new world out there, and one that most researchers are not well trained to negotiate.

We believe librarians are uniquely qualified to help. Academic librarians know the lay of the land, they keep up-to-date with research, and they’re experienced in providing leadership to scholars and decision-makers on campus. That’s why we’re excited that Robin and Rachel have put this book together. To be most effective, librarians need to be familiar with the metrics research, which is currently advancing at breakneck speed. And they need to be familiar with the state of practice–not just now, but what’s coming down the pike over the next few years. This book, with its focus on integrating research with practical tips, gives librarians the tools they need.

It’s an intoxicating time to be involved in scholarly communication. We’ve begun to see the profound effect of the Web here, but we’re just at the beginning. Scholarship is on the brink of a Cambrian explosion, a breakneck flourishing of new scholarly products, norms, and audiences. In this new world, research metrics can be adaptive, subtle, multi-dimensional, responsible. We can leave the fatuous, ignorant use of Impact Factors and other misapplied metrics behind us. Forward-thinking librarians have an opportunity to help shape these changes, to take their place at the vanguard of the web-native scholarship revolution. We can make a better scholarship system, together. We think that’s even better than that free Ferrari.

Farewell to Stacy

We’ve made a lot of happy announcements here on our blog, but today we’re making a sad one: Friday was Stacy’s last day at Impactstory. We’re eliminating our Director of Marketing position, because we need to focus significantly less on marketing and significantly more on finding product-market fit. We’re at a point where we must double down on understanding our users’ needs, and building the product it takes to meet them.

Stacy accomplished amazing things at Impactstory. Here are just a few:

  • Turned our blog into the top source of information on altmetrics (not just our opinion…we’ve had lots of folks tell us this) for thousands of readers
  • Authored a terrific free e-book on how to raise the profile of scholars’ research
  • Created and ran our successful Advisor program, which now includes researchers and librarians from all over the world
  • Quintupled our followers on Twitter

Stacy is amazing. She’s smart, thorough, engaging, and a terrific combination of idealistic and practical. We’re so proud to have worked beside her.

Impactstory is going to move forward. We’re going to keep learning, keep improving, and we’re ultimately going to transform the world of scholarly communication–thanks in part to the great work that Stacy’s done. That’ll happen. But today, we miss our teammate, and our friend.

 

PS If you want to hire someone awesome, drop Stacy a line at stacykonkiel@fastmail.fm. Drop us a line and we’ll tell you in more detail just how awesome she is.

 

Open Science & Altmetrics Monthly Roundup (January 2015)

2015 kicked off with good news about Nature Publishing Group’s increased commitment towards Open Access, the launch of Frontiers’ research impact social network, Loop, and seven more cool developments in the world of Open Science and altmetrics. Read on!

Nature Publishing Group’s OA journals go CC-BY

Twenty Open Access journals published by Nature Publishing Group recently made the move to offering CC-BY by default. Previously, CC-BY-NC was the default license available for most NPG OA journals, and many authors had to pay higher article processing charges to use a CC-BY license. We applaud this move, which was one of many towards Open Access that NPG made in 2014. For more information, read Claire Calder’s recap of her team’s efforts on the Of Schemes and Memes blog.

How to pay for Gold Open Access fees, even if you’re not well-funded

Self-described “scientific have-not” Zen Faulkes recently blogged about the many strategies he uses to pay for the article processing charges (APCs) his Open Access publications incur. They include: finding OA journals that waive APCs, petitioning his department chair, and sometimes asking co-authors at other institutions to cover the costs. It’s a great read for anyone concerned about making their work Open Access who lacks grant funding to cover the fees. Read the full list on Dr. Zen’s blog.

Elsevier acquires news monitoring service NewsFlo

Elsevier announced their acquisition of NewsFlo this month. The news monitoring service–which mines over 50,000 news outlets for mentions of research articles–will be integrated into reference management and social bookmarking site Mendeley. The acquisition will pave the way for new altmetrics reports for articles and other content added to the platform. Currently, Altmetric.com is the only altmetrics aggregator that reports mainstream media mentions. More information on the acquisition can be found on TechCrunch.

Other open science & research metrics news

  • Altmetrics strategy meeting recap available for all to read: In December, altmetrics researchers and organizations from around the world convened at the PLOS headquarters in San Francisco to discuss ways to improve metrics for all. A report of the meeting’s results is now available on Figshare.

  • Frontiers launches new research impact social network, Loop: Loop is designed to bring together download and pageview metrics from a variety of publisher and academic websites into a researcher-centered profile. (Currently, these metrics are only sourced from Nature Publishing Group and Frontiers journals.) Researchers can follow each other’s profiles to get updates on new publications, and authors’ research networks (sourced from article co-author lists) can be easily explored. The free service plans to monetize in the future by possibly selling ads or selling its users’ data to advertisers. You can learn more about the service on the Loop website.

  • The many ways in which researchers use the scientific literature (hint: it ain’t only about citations): Paleontologist Andy Farke shared how he uses articles in his day-to-day work, and (not surprisingly) “citing in his own papers” isn’t high on the list. Instead, he uses articles to inform his teaching, when reviewing manuscripts, to help prepare him for talking to the public and the media about newly published studies, and more. So why then does academia value citations over the other ways we can measure articles’ use? Read Andy’s full list on his blog.

  • Impactstory Advisor of the Month, Chris Chan, on the library’s role in scholcomm innovations: We recently chatted with Chris on his work to bring ORCID to his campus, and what he thinks all librarians should do to foster the adoption of emerging scholarly communication technologies at their universities. Read the full interview on the Impactstory blog.

  • New resources available for librarians interested in altmetrics: We recently published two LibGuides (one for researchers and one for librarians) that can help librarians do altmetrics outreach at their university. We’re also now hosting virtual “office hours”, where librarians can message Stacy (our Director of Marketing & Research who’s also an academic librarian) to chat and ask questions about altmetrics and Impactstory. And for those in search of altmetrics professional development opportunities, Library Juice Academy is hosting an altmetrics & bibliometrics course.

  • 85% of research data is uncited & only 4-9% has altmetrics: A new study digs deep into citations and altmetrics for research data. Read the full study on arXiv.

Want updates like these as-they-happen? Follow us on Twitter! In addition to open science and altmetrics news, you’ll also get news from Impactstory–the only altmetrics non-profit.

Steal these altmetrics LibGuides!

Screenshot of the "Ultimate Guide to Altmetrics (Librarian Edition)" libguide page

We’re pleased to announce yet another altmetrics resource for librarians: ready-to-reuse altmetrics LibGuides!

As an academic librarian, I know how hard it can be to find and compile timely, trustworthy resources on a topic like altmetrics. That’s why I’ve created two altmetrics LibGuides, now available for reuse under a CC-BY 4.0 license.

These “Ultimate Guides to Altmetrics” can help researchers and librarians better understand the benefits (and limitations) of altmetrics. They include:

  • Examples of ways that researchers have used altmetrics in their CVs and for tenure and grants

  • Up-to-date tutorials on finding citation counts and altmetrics for articles, books, data, software, and more

  • Detailed comparisons of Altmetric.com, Impactstory, and PlumX

  • Curated videos, handouts, presentations, and other librarian-created altmetrics outreach materials

These are among the most up-to-date, comprehensive altmetrics LibGuides currently available. Check them out, I think you’ll agree:

The Ultimate Guide to Altmetrics (Researcher Edition)

The Ultimate Guide to Altmetrics (Librarian Edition)

Reuse these LibGuides at your own library or pass them along to a colleague. And please do let me know if you have questions or suggestions for improvements (team@impactstory.org).

Impactstory Advisor of the Month: Chris Chan (January 2015)

Photograph of Chris Chan

The first Impactstory Advisor of the Month for 2015 is Chris Chan, Head of Information Services at Hong Kong Baptist University Library.

We interviewed Chris to learn more about his crucial role in implementing ORCID identifiers for HKBU faculty, and also why he’s chosen to be an Impactstory Advisor. Below, he also describes his vision for the role librarians can play in bringing emerging scholarly communication technologies to campus–a vision with which we wholeheartedly agree!

Tell us a bit about your role as the Head of Information Services at the Hong Kong Baptist University Library.

My major responsibilities at HKBU Library include overseeing our instruction and reference services, and advising the senior management team on the future development and direction of these services. I’m fortunate to work with a great team of librarians and paraprofessionals, and never tire of providing information literacy instruction and research help to our students and faculty.

Scholarly communication is a growing part of my duties as well. As part of its strategic plan, the Library is exploring how it can better support the research culture at the University. One initiative that has arisen from this strategic focus is our Research Visibility Project, for which I am the coordinator.

Why did you initially decide to join Impactstory?

Scholarly communication and bibliometrics have been of great interest to me ever since I first encountered them as a newly-minted academic librarian. Furthermore, the strategic direction that the Library is taking has made keeping up to date with the latest developments in this area a must for our librarians.

When I came across Impactstory I was struck by how useful and relatively straightforward (even in that early incarnation) it was for multiple altmetrics to be presented in an attractive and easy to understand way. At the time, I had just been discussing with some of our humanities faculty how poorly served they were by traditional citation metrics. I saw immediately in Impactstory one way that this issue could be addressed.

Why did you decide to become an Advisor?

As mentioned above, in the past year or so I have become heavily involved in our scholarly communication efforts. When the call for applications to be an Advisor came out, I saw it as an opportunity to get the inside scoop on one of the tools that I am most enthusiastic about.

What’s your favorite Impactstory feature?

I would have to say that my favourite feature is the ability to add an ORCID iD to the Impactstory profile! More on why that is below.

You’ve been hard at work recently implementing ORCID at HKBU. (I especially like this video tutorial you produced!) How do you envision the library working in the future to support HKBU researchers using ORCID and other scholarly communication technologies?

Academic libraries around the world are re-positioning themselves to ensure that their collections and services remain relevant to their members. The scholarly communication environment is incredibly dynamic, and I think that librarians have an opportunity to provide tremendous value to our institutions by serving as guides to, and organizers of, emerging scholarly communication technologies.

Our ORCID initiative at HKBU is a good example of this. We have focused heavily on communicating the benefits of having an ORCID iD and how in the long run this will streamline research workflows and ensure scholars receive the proper credit for their work. Another guiding principle has been to make adoption as painless as possible for our faculty. They will be able to create an ORCID iD, connect it with our system, and automatically populate it with their latest five years of research output (painstakingly checked for accuracy by our team), all in just a few minutes.

I believe that as information professionals, librarians are well-positioned to take on such roles. Also, in contrast to some of our more traditional responsibilities, these services bring us into close contact with faculty, raising the visibility of librarians on campus. These new relationships could open doors to further collaborations on campus.

Thanks, Chris!

As a token of our appreciation for Chris’s outreach efforts, we’re sending him an Impactstory travel mug from our Zazzle store.

Chris is just one part of a growing community of Impactstory Advisors. Want to join the ranks of some of the Web’s most cutting-edge researchers and librarians? Apply to be an Advisor today!

Open Science & Altmetrics Monthly Roundup (December 2014)

In this month’s roundup: a university allegedly attempts to hire its way to the top of the rankings, NISO’s altmetrics initiative enters its next phase, and seven other ways December was an interesting month for Open Science and altmetrics. Read on!

An altmetrics-flavored look back at 2014

What were the most popular articles of 2014? Altmetric.com let us know with their year-end roundup, which detailed the 100 most shared and discussed scholarly articles of 2014. At the top of the list was the controversial “emotional contagion” study co-authored by Facebook researchers. See more highlights on the full list, and download the full altmetrics data for the study on Figshare.

Did our altmetrics predictions for 2014 come true? Back in February, we placed some bets on how the field would evolve throughout 2014, and as expected we got some right and some wrong. Probably our biggest win? That more researchers would become empowered to show the world how they’re winning by using altmetrics. Our biggest miss? Sadly, that altmetrics did not become more “open”.

University criticized for alleged attempts to hire its way to the top of rankings

The Daily Cal reports that staff from King Abdulaziz University in Saudi Arabia recently contacted several University of California professors with opportunities to be hired as “distinguished adjunct professors.” Respected researchers are regularly contacted with job opportunities, but this was different, according to the article:

“KAU offered [Jonathan Eisen] $72,000 per year and free business-class airfare and five-star hotel stays for him to visit KAU in Jeddah, Saudi Arabia…In exchange, Eisen was told he would be expected to work on collaborations with KAU local researchers and also update his Thomson Reuters’ highly cited researcher listing to include a KAU affiliation. He would also be expected to occasionally publish some scientific journal articles with the Saudi university’s name attached.”

Eisen and other scientists interviewed suggest that their high citation rates are at the heart of KAU’s interest, as their affiliation with KAU would boost the university’s international rankings. Read more on The Daily Cal.

NISO votes to create standards for altmetrics

NISO approved Phase 2 of the organization’s altmetrics initiative in December, which will include the creation of standards and recommended practices on the following:

Phase 2 of the project will be to develop standards or recommended practices in the prioritized areas of definitions, calculation methodologies, improvement of data quality, and use of persistent identifiers in alternative metrics. As part of each project, relevant use cases and how they apply to different stakeholder groups will be developed.

This should come as no surprise to those who’ve been following NISO-related altmetrics developments. In September, NISO released results from their community survey, which showed more concern with standards and definitions than issues like gaming.

Want to have a voice in altmetrics standards development? Join any of NISO’s four working groups before February 1, 2015. More information can be found on the NISO website.

We’ll be watching future developments with interest, as any standards and recommended practices developed will affect the way we and other altmetrics aggregators collect, display, and archive altmetrics data in Impactstory profiles.

Other altmetrics & open science news

  • arXiv hits the 1 million paper milestone: Nature News reports that one of the world’s most famous and respected preprint servers, arXiv, is now home to more than 1 million articles and receives 10 million download requests per month. Incredibly, arXiv manages to make this treasure-trove of scholarly information freely available to the public at a cost of less than $10 per paper–much less than the reported $50 million per year it takes to operate Science. For an overview of arXiv’s history, check out Nature News.

  • New altmetrics studies confirm that citations don’t correlate with quality (or do they?), OA publications get more downloads & more: Five studies of interest to the altmetrics community were publicized in December. They included a study that shows a lack of correlation between citations and quality (as measured by expert peer review); another, conflicting study that may hold the “secret sauce” formula for turning citations into indicators of quality; a study that found–yet again–that Open Access publications receive more downloads; the results of one conference’s experiment with peer review, which showed that peer review is “close to random” in terms of what reviewers agree to accept and reject; and a paper on “negative links,” which may have future applications for context-aware altmetrics.

  • Meet Open Science champion and Impactstory Advisor Dr. Lorena Barba: We recently interviewed Lorena to learn more about her lab’s Open Science manifesto, her research in computational methods in aeronautics and biophysics, and George Washington University’s first Massive Open Online Course, “Practical Numerical Methods with Python”. To read more about her work, visit the Impactstory blog.

  • Altmetrics can help measure attention and influence for education-oriented journal articles: PLOS Computational Biology recently shared a thoughtful perspective on an editorial titled “An Online Bioinformatics Curriculum.” To look at citations to the 2012 paper, you’d think it wasn’t a success–but you’d be wrong. PLOS’s article-level metrics show that the editorial has been viewed over 77,000 times, bookmarked more than 300 times, and has received a great deal of attention on social media. It’s just one more example of how altmetrics can measure kinds of attention and influence beyond those academia traditionally values.

  • Nature takes a lot of heat for its experiment in free-to-access–but not Open Access–journal articles: Nature Publishing Group announced its intent to offer free access to articles in many of its journals over the next year. The plan allows those with subscription access to an article to generate a link that will allow others to read the article for free–but not download or copy the article’s content. Many scientists criticized the move, pointing out the many restrictions placed on the shared content. We also shared our concerns, particularly with respect to the negative effects the program could have on altmetrics. But many scientists also lauded Nature’s experiment, and shared their appreciation for NPG’s attempt to make content more accessible. To learn more, check out Timo Hannay’s blog and John Wilbanks’ thoughts on “Nature’s Shareware Moment.”

  • Impactstory’s “30-Day Impact Challenge” released as an ebook: To download a free copy of our new ebook based on the popular November Impact Challenge, visit our blog. You can also purchase a copy for your Kindle.

Want updates like these delivered to your inbox weekly? Sign up for our open science & altmetrics newsletter! In addition to open science and altmetrics news, you can also get news from us here at Impactstory, the only altmetrics non-profit.

Announcing Impactstory “office hours” for librarians

Over the next two months, we’re experimenting with providing increased support to librarians, many of whom serve as the on-campus resource for all things altmetrics.

If you’re a librarian with questions about Impactstory, altmetrics, or just about anything else related to measuring and demonstrating impact, you can message Stacy on Skype during the following times:

  • 7 pm to 9 pm on Mondays (Mountain time here in the US; better for folks east of the Central European Time zone–India, Japan, China, Australia, New Zealand)

  • 9 am to 12 pm on Fridays (Mountain time here in the US; better for folks west of CET–USA, Western & Central Europe)

To confirm when Stacy will be online in your time zone, we recommend checking out EveryTimeZone.com.

To connect with Stacy, open Skype and go to the Contacts drop-down menu, select “Add contact,” search for Stacy by her name or username (stacy.konkiel), then click the green “Add contact” button. Please include a short note about who you are and why you want to chat when sending the invitation to connect, as it helps keep the spammers away.

Stacy will be keeping these office hours starting Monday, January 12 through March 20, 2015. Talk to you soon!

What’s our impact? (Dec. 2014)

Back in August, we started sharing our outreach and growth statistics, warts and all. That’s because we’re committed to radical transparency, and had a hunch that our users–who are interested in quantitative measures of impact–would be curious to see the numbers.

After using this format to share our stats for five months, we’ve decided to move away from blogging them in favor of a centralized, easier-to-read format: a simple Google Spreadsheet, open to all.

Below, we share our numbers in blog format for the final time, and provide a link to the Google Spreadsheet where we’ll share our stats from here on out.

Here are our outreach numbers for December 2014*.

impactstory.org traffic

  • Visitors: 3,504 total; 2,189 unique
  • New Users: 195
  • Conversion rate: 8.9% (% of visitors who signed up for a trial account)

Blog stats

  • Unique visitors: 6,911
  • Clickthrough rate: 0.8% (% of people who visited Impactstory.org from the blog)
  • Conversion rate: 17.5% (% of visitors to Impactstory.org from blog who then signed up for a trial Impactstory account)
  • Percent of new user signups: 5.1%

Twitter stats

  • New followers: 262
  • Increase in followers from November: 5.3%
  • Mentions: 173 (We’re tracking this to answer the question, “How engaged are our followers?”)
  • Tweet reach: ~398,800 (We’re tracking this–the number of people who potentially saw a tweet mentioning Impactstory or our blog–to understand our brand awareness)
  • Clickthroughs: 131
  • Conversions: 14
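
For the curious, here’s how the reported rates fall out of the raw counts. This is a minimal Python sketch, and it assumes unique visitors is the denominator for the site conversion rate; that assumption is what reproduces the 8.9% above.

```python
# Minimal sanity check on the reported rates (assumptions noted inline).
unique_visitors = 2189
new_users = 195
conversion_rate = new_users / unique_visitors  # 0.0891 -> the "8.9%" above

blog_visitors = 6911
blog_clickthrough = 0.008                           # 0.8% of blog visitors
blog_referrals = blog_visitors * blog_clickthrough  # ~55 people
blog_signups = blog_referrals * 0.175               # 17.5% signed up, ~10
share_of_new_users = blog_signups / new_users       # ~0.050, i.e. the "5.1%"
                                                    # above within rounding

print(f"{conversion_rate:.1%}, {share_of_new_users:.1%}")  # 8.9%, 5.0%
```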

What does it all mean?

We’re seeing a familiar, seasonal dip in traffic for our blog, Twitter, and impactstory.org as the semester winds down and researchers take leave from the lab to celebrate the holidays. However, impactstory.org traffic is up 25% from the same period last year, our Twitter followers have more than doubled since late January, and we’ve seen steady growth in traffic to our blog over the past twelve months.

We’re pleased with our growth to date, and look forward to sharing our future growth here.

Thanks for joining us in this experiment in radical transparency! And Happy New Year!

* These numbers were recorded at 10 am MST on Dec. 31st, 2014.

Our 2014 predictions for altmetrics: what we nailed and what we missed

Back in February, we placed some bets about how the altmetrics landscape would evolve throughout 2014. As you might expect, we got some right, and some wrong.

Let’s take a look back at how altmetrics as a field fared over the last year, through the framework of our 2014 predictions.

More complex modelling

“We’ll see more network-awareness (who tweeted or cited your paper? how authoritative are they?), more context mining (is your work cited from methods or discussion sections?), more visualization (show me a picture of all my impacts this month), more digestion (are there three or four dimensions that can represent my “scientific personality?”), more composite indices (maybe high Mendeley plus low Facebook is likely to be cited later, but high on both not so much).”

Visualizations were indeed big this year: we debuted our new Maps feature, which tells you where in the world your work has been viewed, bookmarked, or tweeted about. We also added “New this week” indicators to the Metrics page on your profile.

And both PlumX and Altmetric.com added new visualizations, too: the cool-looking “Plum Print” and the Altmetric bar visualization.

Impactstory also launched a new, network-aware feature that shows you which Twitter users gave you the most exposure when they tweeted your work. We also debuted your profile’s Fans page, which tells you who’s talking about your work and how often, exactly what they’re saying, and how many followers they have.
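
The core of such a feature is simple to sketch. Here’s an illustrative Python version; the tweet data is invented, and the real Fans page certainly does more than this:

```python
# Minimal sketch of a "who gave you the most exposure" ranking,
# like the Fans page described above. Tweet data is invented.
tweets = [
    {"user": "@science_fan", "followers": 120, "text": "Great paper!"},
    {"user": "@big_account", "followers": 48000, "text": "Must-read."},
    {"user": "@colleague", "followers": 850, "text": "Our new results."},
]

# Rank fans by potential exposure: the follower count of each tweeter.
for t in sorted(tweets, key=lambda t: t["followers"], reverse=True):
    print(f"{t['user']} reached ~{t['followers']:,} followers: {t['text']}")
```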

And a step forward in context mining has come from the recently launched CRediT taxonomy. The taxonomy allows researchers to describe how co-authors on a paper have contributed–whether by creating the study’s methodology, cleaning and maintaining data, or in any of twelve other ways. The taxonomy will soon be piloted by publishers, funders, and other scholarly communication organizations like ORCID.
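
To make the idea concrete, here’s a sketch of what machine-readable contributions under CRediT might look like. The role names are real CRediT roles, but the record structure, author names, and DOI below are made up for illustration:

```python
# Illustrative only: one way a paper's author contributions could be
# recorded under CRediT. Role names are real CRediT roles; the record
# format, names, and DOI are invented for this sketch.
paper_credit = {
    "doi": "10.1234/example",  # hypothetical DOI
    "contributors": [
        {"name": "A. Author", "roles": ["Conceptualization", "Methodology"]},
        {"name": "B. Author", "roles": ["Data curation", "Software"]},
        {"name": "C. Author", "roles": ["Writing - original draft"]},
    ],
}

# Structured roles let you ask questions a flat author list can't
# answer, e.g. "who maintained the data behind this paper?"
curators = [
    c["name"]
    for c in paper_credit["contributors"]
    if "Data curation" in c["roles"]
]
print(curators)  # ['B. Author']
```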

As for other instances of network-awareness, context mining, digestion, and composite indices? Most of the progress in these areas came from altmetrics researchers. Here are some highlights:

  • This study on ‘semantometrics’ posits that more effective means of determining impact can be found by looking at the full text of documents, and by measuring the interdisciplinarity of the papers and the articles they cite.

  • A study on the size of research teams since 1900 found that a larger (and more diverse) number of collaborators generally leads to more impactful work (as measured in citations).

  • This preprint determined that around 9% of tweets about arXiv.org publications come from bots, not humans–which may have big implications for how scholars use and interpret altmetrics.

  • A study showed that papers tagged on F1000 as being “good for teaching” tend to have higher instances of Facebook and Twitter metrics–types of metrics long assumed to relate more to “public” impacts.

  • A study published in JASIST (green OA version here) found that mentions of articles on scholarly blogs correlate to later citations.

Growing interest from administrators and funders

“So in 2014, we’ll see several grant, hiring, and T&P guidelines suggest applicants include altmetrics when relevant.”

Several high-profile announcements from funding agencies confirmed that altmetrics was a hot topic in 2014. In June, the Autism Speaks charity announced that they’d begun using PlumX to track the scholarly and social impacts of the studies they fund. And in December, the Wellcome Trust published an article describing how they use altmetrics in a similar manner.

Are funders and institutions explicitly suggesting that researchers include altmetrics in their applications, when relevant? Not as often as we had hoped. But a positive step in this direction has come from the NIH, which released a new biosketch format that asks applicants to list their most important publications or non-publication research outputs. It also prompts scientists to articulate why they consider those outputs to be important.

The NIH has said that by moving to this new biosketch format, it “will help reviewers evaluate you not by where you’ve published or how many times, but instead by what you’ve accomplished.” We applaud this move, and hope that other funders adopt similar policies in 2015.

Empowered scientists

“As scientists use tools like Impactstory to gather, analyze, and share their own stories, comprehensive metrics become a way for them to articulate more textured, honest narratives of impact in decisive, authoritative terms. Altmetrics will give scientists growing opportunities to show they’re more than their h-indices.”

We’re happy to report that this prediction came true. This year, we’ve heard from more scientists and librarians than ever before, all of whom have used altmetrics data in their tenure dossiers, grant applications and reports, and in annual reviews. And in one-on-one conversations, early career researchers are telling us how important altmetrics are for showcasing the impacts of their research when applying for jobs.

We expect that as more scientists become familiar with altmetrics in the coming year, we’ll see even more empowered scientists using their altmetrics to advance their careers.

Openness

“Since metrics are qualitatively more valuable when we verify, share, remix, and build on them, we see continued progress toward making both traditional and novel metrics more open. But closedness still offers quick monetization, and so we’ll see continued tension here.”

This is one area where we weren’t exactly wrong, but we weren’t 100% correct, either. Everything stayed more or less the same with regard to openness in 2014: Impactstory continued to make our data available via open API, as did Altmetric.com.

We hope that our prediction will come true in 2015, as the increased drive towards open science and open access puts pressure on those metrics providers that haven’t yet “opened up.”

Acquisitions by the old guard

“In 2014 we’ll likely see more high-profile altmetrics acquisitions, as established megacorps attempt to hedge their bets against industry-destabilizing change.”

2014 didn’t see any acquisitions per se, but publishing behemoth Elsevier made three announcements that hint that the company may be positioning itself for such acquisitions soon: a call for altmetrics research proposals, the hiring of prominent bibliometrician (and co-author of the Altmetrics Manifesto) Paul Groth to be the Disruptive Technology Director of Elsevier Labs, and the launch of Axon, the company’s invitation-only startup network.

Where do you think altmetrics will go in 2015? Leave your predictions in the comments below.