Monday, September 23, 2013

Altmetrics: Present and Future (SIG/MET) > ASIS&T 2013 Annual Meeting Montréal, Québec, Canada > November 1-5, 2013 > 1:30 PM (EST)

  • Dr. Cassidy Sugimoto, Indiana University Bloomington
  • Judit Bar-Ilan, Bar-Ilan University
  • William Gunn, Mendeley
  • Stefanie Haustein, Université de Montréal
  • Stacy Konkiel, Indiana University Bloomington
  • Vincent Larivière, Université de Montréal
  • Jennifer Lin, Public Library of Science

Scholars are increasingly incorporating social media tools like blogs, Twitter, and Mendeley into their professional communications. Altmetrics tracks usage of these and similar tools to measure scholarly influence on the social web. Altmetrics researchers and practitioners have amassed a growing body of literature and working tools to gather and analyze these metrics, and interest in this emerging subfield of scientometrics continues to rise. Panelists will present results demonstrating the utility of alternative metrics for a variety of stakeholders: researchers, librarians, publishers, and those participating in academic social media sites.

Thanks to Jose Kruse

Source and Links Available At



SIG/MET is the Special Interest Group for the measurement of information production and use. It encourages the development and networking of all those interested in the measurement of information. It encompasses not only bibliometrics, scientometrics and informetrics, but also measurement of the Web and the Internet, applications running on these platforms, and metrics related to network analysis, visualization, scholarly communication and the design and operation of Information Retrieval Systems. SIG/MET will facilitate activities to encourage the promotion, research and application of metrics topics. Academicians, practitioners, commercial providers, government representatives, and any other interested persons are welcome. 


Sunday, September 22, 2013

A/V Available > NISO Webinar: Beyond Publish or Perish: Alternative Metrics for Scholarship

NISO How the information world CONNECTS
November 14, 2012 / 1:00 - 2:30 p.m. (Eastern Time)


About the Webinar

Increasingly, many aspects of scholarly communication—particularly publication, research data, and peer review—undergo scrutiny by researchers and scholars. Many of these practitioners are engaging in a variety of ways with Alternative Metrics (#altmetrics in the Twitterverse). Alternative Metrics take many forms but often focus on efforts to move beyond proprietary bibliometrics and traditional forms of peer referencing in assessing the quality and scholarly impact of published work. Join NISO for a webinar that will present several emerging aspects of Alternative Metrics.



Todd Carpenter, Executive Director at NISO


Article-Level Metrics at PLOS

Martin Fenner, Technical Lead, PLOS Article-Level Metrics project

Article-Level Metrics have become an exciting new opportunity for publishers, funders, universities, and researchers. The publisher Public Library of Science (PLOS) began collecting and displaying citations, usage data, and social web activity for all of its articles in 2009. The webinar will discuss the opportunities (and challenges) of Article-Level Metrics, from issues in collecting data to interesting results of data analysis.
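The article-level data described above (citations, usage data, and social web activity) typically arrives as per-source event counts that can then be rolled up for display. As a rough illustration only, the sketch below groups per-source counts into those three categories; the record shape and field names (`sources`, `group`, `metrics`, `total`) are assumptions invented for this example, not the actual PLOS ALM API schema.

```python
# Minimal sketch: rolling an ALM-style record up into per-category totals.
# All field names here are illustrative assumptions, not the PLOS schema.

def summarize_alm(record):
    """Group per-source event counts by category (citations, usage, social)."""
    totals = {}
    for source in record.get("sources", []):
        category = source.get("group", "other")
        count = source.get("metrics", {}).get("total", 0)
        totals[category] = totals.get(category, 0) + count
    return totals

# A made-up record shaped like the description above: citations, usage
# data, and social-web activity gathered for a single (hypothetical) article.
example = {
    "doi": "10.1371/journal.pone.0000000",  # hypothetical DOI
    "sources": [
        {"name": "crossref", "group": "citations", "metrics": {"total": 12}},
        {"name": "counter", "group": "usage", "metrics": {"total": 3400}},
        {"name": "twitter", "group": "social", "metrics": {"total": 25}},
        {"name": "mendeley", "group": "social", "metrics": {"total": 40}},
    ],
}

print(summarize_alm(example))
# prints {'citations': 12, 'usage': 3400, 'social': 65}
```

The point of the aggregation step is simply that a reader sees one number per category while the underlying record keeps each source's count auditable.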

Total-Impact and other altmetrics initiatives

Jason Priem, Ph.D. Student, Co-Principal Investigator, Impact Story

Altmetrics helps us track diverse scholarly impacts by looking in new places for evidence--public places like Wikipedia and Twitter, and scholarly environments like Mendeley and Faculty of 1000. Doing this lets us promote and reward new forms of Web-native scholarship in two ways. Broader measures of impact help us move

  • beyond the article:  we can value the increasingly important and powerful new genres of scholarly products like blog posts, software, and datasets, and
  • beyond the impact factor: we can value the observed impact of scholarly products themselves, across lots of different audiences and use types--rather than just rewarding the prestige of the venue where they're published.

That said, altmetrics can be tricky to gather and understand. We'll discuss tools and frameworks to help turn rich but dense altmetrics data into data-supported stories that can help inform important conversations about what it means to make a scholarly impact.

Unconventional Scholarly Communications

Aalam Wassef, Founder of Peer Evaluation

Participate in Aalam's survey on social networks at 

Scholars are blogging, microblogging, searching, sharing primary data, collaborating, discussing, rating, bookmarking articles in public folders, recommending links over public networks, offering live coverage of events, and receiving badges, views, likes, or mentions for all they do online and elsewhere. More than ever, scholars are communicating and getting credit for it, with no limitations as to style, format, or environment, enjoying high levels of engagement and responsiveness from their peers.

  • How are all other parties concerned (librarians, public funders, policy makers, publishers, universities, research centers) absorbing, supporting, or rejecting all of the above?
  • Could “unconventional” communications and alternative metrics eventually be as valued as peer-reviewed articles and proprietary bibliometrics? How many of these altmetrics are truly accessible and free of charge, and what would be the alternatives to potential limitations?
  • What is the current perception of direct publishing and open peer review, whether by individuals, groups or institutions? What are the risks and opportunities for the production of high quality research?

Event Q&A


Source and Links Available At:

NISO > Altmetrics Steering Group

  • Wed, 11 Sep 2013 > MP3 recording - Altmetrics Steering Group call - September 10, 2013 (6MB)
  • Thu, 05 Sep 2013 > MP3 recording - Altmetrics Steering Group call - September 4, 2013 (5MB)
  • Tue, 03 Sep 2013 > MP3 recording - Altmetrics Steering Group call - August 30, 2013 (7MB)
  • Thu, 29 Aug 2013 > Draft Agenda NISO Altmetrics Workshop 2013-10-09.docx (102K)

Source and Links Available At  

Information Standards Quarterly (ISQ) > Summer 2013 > Volume 25, no. 2 > Topic: Altmetrics

Table of Contents

Letter from the Guest Content Editor: Altmetrics Have Come of Age
by Martin Fenner


Consuming Article-Level Metrics: Observations and Lessons
by Scott Chamberlain

Institutional Altmetrics and Academic Libraries
by Robin Chin Roemer and Rachel Borchardt


Altmetrics in Evolution: Defining & Redefining the Ontology of Article-Level Metrics
by Jennifer Lin and Martin Fenner

Exploring the Boundaries: How Altmetrics Can Expand Our Vision of Scholarly Communication and Social Impact
by Mike Taylor

Social Signals Reflect Academic Impact: What it Means When a Scholar Adds a Paper to Mendeley
by William Gunn


Source and Links to Full Text Available At:


NISO to Develop Standards and Recommended Practices for Altmetrics


Grant from Sloan Foundation will fund community-informed effort to standardize collection and use of alternative metrics measuring research impact

Baltimore, MD - June 20, 2013 - The National Information Standards Organization (NISO) announces a new two-phase project to study, propose, and develop community-based standards or recommended practices in the field of alternative metrics. Assessment of scholarship is a critical component of the research process, impacting everything from which projects get funded to who gains promotion and tenure to which publications gain prominence. Since Eugene Garfield's pioneering work in the 1960s, much of the work on research assessment has been based upon citations, a valuable measure but one that has failed to keep pace with online reader behavior, network interactions with content, social media, and online content management. Exemplified by innovative new platforms like ImpactStory, a new movement is growing to develop more robust alternative metrics—called altmetrics—that complement traditional citation metrics. NISO will first hold several in-person and virtual meetings to identify critical areas where altmetrics standards or recommended practices are needed and then convene a working group to develop consensus standards and/or recommended practices. The project is funded through a $207,500 grant from the Alfred P. Sloan Foundation.

"Citation analysis lacks ways to measure the newer and more prevalent ways that articles generate impact such as through social networking tools like Twitter, Facebook, or blogs," explains Nettie Lagace, NISO's Associate Director for Programs. "Additionally, new forms of scholarly outputs, such as datasets, software tools, algorithms, or molecular structures are now commonplace, but they are not easily—if at all—assessed by traditional citation metrics. These are two among the many concerns the growing movement around altmetrics is trying to address."

"For altmetrics to move out of its current pilot and proof-of-concept phase, the community must begin coalescing around a suite of commonly understood definitions, calculations, and data sharing practices," states Todd Carpenter, NISO Executive Director. "Organizations and researchers wanting to apply these metrics need to adequately understand them, ensure their consistent application and meaning across the community, and have methods for auditing their accuracy. We must agree on what gets measured, what the criteria are for assessing the quality of the measures, at what granularity these metrics are compiled and analyzed, how long a period the altmetrics should cover, the role of social media in altmetrics, the technical infrastructure necessary to exchange this data, and which new altmetrics will prove most valuable. The creation of altmetrics standards and best practices will facilitate the community trust in altmetrics, which will be a requirement for any broad-based acceptance, and will ensure that these altmetrics can be accurately compared and exchanged across publishers and platforms."


Source and Full Text Available At


Saturday, September 21, 2013

JISC Report > Access to Citation Data: Cost-benefit and Risk Review and Forward Look

1 Introduction
1.1 Aim, scope and focus of the study


1.1.1 The overarching aim of the report is to explore and suggest practical directions and actions to move toward more cost-effective creation, dissemination and exploitation of citation data in the context of current and potential future usage scenarios. A further aim is to propose the roles that Jisc and others might play in this system in future.

Scope and focus

1.1.2 The general scope of this work is the creation and exploitation of citation data derived from peer-reviewed academic research articles. However, the citation of datasets is specifically excluded. While bibliographic metadata associated with the referencing and referenced outputs is clearly relevant, it is not the focus of the work. Approaches to exploitation of citation data are also not the focus of the review, except insofar as they might increase or change the demand for different types of citation data.

1.2 Study approach

1.2.1 The study approach has been informed by the JISC invitation to tender (JISC Executive, 2012) and was conducted in three phases. In the first phase, desk research and an intensive series of interviews were undertaken with key stakeholders, including publishers, citation data providers, citation data users, and research funders, to understand the strategic drivers. The information gathered was used to identify possible outline usage scenarios and develop a business model framework, together with an initial view of the pros and cons of each scenario.

1.2.2 In the second phase, an agreed set of usage scenarios and business models was developed and refined in consultation with users of citation data, publishers, and citation data providers. In addition, a DevCSI developers’ workshop or Hack Day was held on 27 September 2012 [6] to explore usage of citation data through short trials or pilots using real citation data. This brought together a group of domain experts, users, and developers to explore ideas related to potential real world uses of citation data and to prototype potential solutions. The group investigated aspects of citation data and its use, including the properties of sparse networks of data and new ways to visualise citation data. The event provided some interesting perspectives on the use of citation data, and in particular supported the design of the ‘open’ processes.

1.2.3 In the final phase, the results of the second phase have been used to develop options for a viable practical direction and a set of actions for taking the use of citation data forward. Feedback and agreement will be sought from stakeholders at a final stakeholder meeting.

Source and Full Text Available At:


Wednesday, September 18, 2013

S&TL > Introduction to Altmetrics for Science, Technology, Engineering, and Mathematics (STEM) Librarians

Science & Technology Libraries

Linda M. Galloway, Janet L. Pease & Anne E. Rauh
Published online: 12 Sep 2013 / DOI:10.1080/0194262X.2013.829762


Quantifying scholarly output via citation metrics is the time-honored method to gauge academic success. Altmetrics, or alternative citation metrics, provide researchers and scholars with new ways to track influence across evolving modes of scholarly communication. This article will give librarians an overview of new trends in measuring scholarly influence, introduce them to altmetrics tools, and encourage them to engage with researchers in discussion of these new metrics.


How one attends to and collects alternative metrics about research products will vary according to one's field and scholarly community. Authors should be encouraged to explore and engage with social media tools already in use in their disciplines and be mindful of emerging tools. Scholars are beginning to go “beyond the paper” and engage with their colleagues via Twitter, blogs, and reference managers (Priem 2013). These types of interactions will continue to increase, and those who remain unengaged will likely be left out of important discussions. Increasingly, it is important not only to read the newest journal article, but to follow the chatter about the research in social media platforms. Reluctant social media adopters may be encouraged to engage once they understand that it is perfectly acceptable to simply read or observe, rather than post or tweet.

Awareness of new metric tools and how they relate to social media is important knowledge for producers of scholarly output. These tools complement existing readership, promote work to new readers, and measure outputs in concert with traditional scholarly metrics. As a complement to traditional citation metrics, altmetrics can provide a more rapid assessment and arguably a more complete picture of an individual's scholarly influence. Altmetrics tools can also help illustrate the value of scholarly output beyond publications.

Tracking the relevance and significance of these research products requires knowledge of the practices within a discipline and the foresight to predict what may be important to track in the future. While altmetrics can help researchers by vetting, organizing, and adding value to information products retrieved, it is essential to contextualize these data. Information professionals, with knowledge of both traditional and emerging scholarly metrics, are able to bridge the divide between these forms of scholarly engagement.

>>> Thanks to Lorrie Pellack for the Heads-Up! <<<

Source and Full Text Available At


Open Access Version Not Currently Available [09-18-13] / Subscribers and Pay-Per-View Only

Tuesday, September 17, 2013

Article-Level Metrics Workshop 2013 / Thursday, October 10, 2013 at 8:30 AM - Friday, October 11, 2013 at 4:00 PM (PDT) San Francisco, CA

Event Details

As article-level metrics (ALM) come of age, the question is no longer whether we need them, but rather how we implement them. Building upon the successful ALM Workshop in November 2012, PLOS invites you to the second annual ALM Workshop 2013 on October 10-12, 2013 in San Francisco.

The preliminary program is [now] ... posted ... . Check out the lineup of speakers and presentations in store for the event, representing researchers, funders, academic administrators, and technology providers.


This year, we will move the community conversations beyond the basics to focus on success stories and challenges encountered, the latest analyses of the growing data corpus, as well as deep dives into the technical details behind it all. The first two days will consist of a series of talks and panel discussions, with ample mingling time to encourage in-depth sharing. On the third day of the workshop, we will organize a data challenge (data hackathon), giving participants the opportunity to do data analysis and data visualization on a variety of ALM datasets from different sources.

*** FREE Registration ***

Please register for the data challenge [at]

Source and Links Available At:


A/V Now Available > 09-19-13 > Science > Live Chat: Should We Ditch Journal Impact Factor? > September 19, 2013 > 3 PM (ET)

Chat Guests: Sandra Schmid is the Cecil H. Green Distinguished Chair in Cellular and Molecular Biology at the University of Texas Southwestern Medical Center. Heather Piwowar is a postdoctoral researcher at Duke University who works remotely from Vancouver, Canada, and who primarily studies the way bibliometric factors and credit attribution affect scientists. Mike Price is a staff writer for Science and the chat moderator.


The journal impact factor was designed to help librarians decide which journals to subscribe to and was never intended as a measuring stick for the value of a scientist’s research, as it is sometimes used today. Now, there has been a push to reexamine the importance that tenure committees and journal reviewers assign to journal impact factors.

Earlier this year, a group of concerned scientists and journal publishers signed an open letter known as the San Francisco Declaration on Research Assessment (DORA) to encourage review boards and tenure committees to “eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations,” and to encourage the development of alternative metrics (altmetrics) to measure a scientist’s research contributions.

Join Heather Piwowar of Duke University, an expert in bibliometric factors and credit attribution, and DORA signatory Sandra Schmid of the University of Texas Southwestern Medical Center on Thursday, 19 September, at 3 p.m. EDT ... .

Source, Link, and A/V Available At: