Monday, December 3, 2012

Awareness, Attitudes and Participation of Teaching Staff Towards the Open Content Movement in One University / Peter Reed



This research investigates the current awareness of, and participation in, the open content movement at one UK higher education institution. The open content movement and open educational resources can be seen as potential means of reducing the time and cost of technology-enhanced learning developments; however, the movement's sustainability and, to some degree, its success depend on critical mass and large-scale participation. Teaching staff were invited to respond to a questionnaire. Respondents (n=59) were open to the idea of sharing their own content and, in line with other studies, demonstrated existing practices of sharing resources locally amongst colleagues; however, there was little formal, large-scale sharing under suitable licenses. The data gathered concur with other research suggesting a lack of awareness of the Creative Commons licenses as well as a lack of participation in large open educational resource repositories.
Keywords: open educational resources; staff attitudes; sustainability

(Published: 22 October 2012)

Citation: Research in Learning Technology 2012, 20: 18520

Source and Links to Full Text Available At 


Saturday, December 1, 2012

Visualizing Tweets Linking to a Paper

Martin Fenner / Posted: July 14, 2012

DNA Barcoding the Native Flowering Plants and Conifers of Wales has been one of the most popular new PLoS ONE papers in June. In the paper Natasha de Vere et al. describe a DNA barcode resource that covers the 1143 native Welsh flowering plants and conifers.

My new job as technical lead for the PLoS Article Level Metrics (ALM) project involves thinking about how we can best display the ALM collected for this and other papers. We want these ALM to tell us something important and/or interesting, and it doesn’t hurt if the information is displayed in a visually appealing way. There are many different ways this can be done, but here I want to focus on Twitter and CiteULike, the only two data sources where PLoS is currently storing every single event (tweet or CiteULike bookmark) with a date. Usage data (HTML and XML views, PDF downloads) are aggregated on a monthly basis, and PLoS doesn’t store the publication dates of citations.

We know from the work of Gunter Eysenbach and others that most tweets linking to scholarly papers are written in the first few days after publication. It therefore makes sense to display this information on a timeline covering the first 30 days after publication, and the tweets about the de Vere paper follow the same pattern.
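The bucketing behind such a timeline is straightforward: count tweets by days since publication and keep only the first 30 days. A minimal sketch (the dates below are hypothetical, not the PLoS ALM code or the de Vere paper's actual data):

```python
from collections import Counter
from datetime import date

def tweets_per_day(publication_date, tweet_dates, window_days=30):
    """Bucket tweets by days-since-publication over the first `window_days` days."""
    counts = Counter()
    for d in tweet_dates:
        offset = (d - publication_date).days
        if 0 <= offset < window_days:
            counts[offset] += 1
    # Dense list: day N sits at index N, so a bar chart can be drawn directly.
    return [counts.get(day, 0) for day in range(window_days)]

# Hypothetical data: one tweet on publication day, two the day after,
# and one much later that falls outside the 30-day window.
pub = date(2012, 6, 6)
tweets = [date(2012, 6, 6), date(2012, 6, 7), date(2012, 6, 7), date(2012, 7, 30)]
timeline = tweets_per_day(pub, tweets)
```

Feeding that dense list to any charting library yields the front-loaded bar chart the Eysenbach pattern predicts.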


Source and Full Text Available At 


altmetrics12 > An ACM Web Science Conference 2012 Workshop

Evanston, IL • 21 June 2012

Keynotes (9:00-10:00)

  • Johan Bollen
  • Gregg Gordon
Coffee break (10:00-10:30)

Paper presentations (10:30-1:00)

Position and theory papers, 10min each (10:30-11:30)

  • Martin Fenner / Altmetrics will be taken personally at PLoS (presentation)
  • William Gunn and Jan Reichelt / Social metrics for research: quantity and quality (presentation)
  • Elizabeth Iorns / Reproducibility: an important altmetric
  • Britt Holbrook / Peer review, altmetrics, and ex ante broader impacts assessment – a proposal
  • Kelli Barr / The Role of altmetrics and Peer Review in the Democratization of Knowledge (chalkboard notes)

Empirical papers, 15min each (11:30-1:00)

  • Judit Bar-Ilan / JASIST@mendeley
  • Jasleen Kaur and Johan Bollen / Structural Patterns in Online Usage (presentation)
  • Vincent Larivière, Benoit Macaluso, Staša Milojević, Cassidy R. Sugimoto and Mike Thelwall / Of caterpillars and butterflies: the life and afterlife of an arXiv e-print
  • Jason Priem, Heather Piwowar and Bradley Hemminger / Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact (presentation)
  • Jennifer Lin / A Case Study in Anti-Gaming Mechanisms for Altmetrics: PLoS ALMs and DataTrust (presentation)
  • Richard Price / Altmetrics and
Lunch on your own (1:00-2:00p)

Demos (2:00-3:00p)

  • total-impact (Heather Piwowar)
  • (Euan Adie) (presentation)
  • PLoS ALM (Martin Fenner)
  • Ubiquity Press metrics (Brian Hole)
  • Plum Analytics (Andrea Michalek)
  • BioMed Central metrics (Ciaran O’Neill)
  • (Richard Price)
  • Knode (David Steinberg)
  • CASRAI (David Baker)
  • Mendeley and ReaderMeter (William Gunn)
Group discussion (3:00-4:30p)

We’ll split into small groups to discuss key altmetrics issues; topics may include:

  • Gaming: how might it happen, and how do we stop it?
  • Standards: We’ve got COUNTER for downloads; should there be standards for other altmetrics? What should they look like?
  • Visualization: There’s a lot of data. How should we display it?
  • Peer review: Could altmetrics replace traditional peer review? Should it? Can we build new publishing models around altmetrics?
  • CVs and “impact dashboards”: What does an altmetrics-informed CV look like? Who wants (and doesn’t want) one?
  • Publishers: What do publishers want from altmetrics services? How about readers and authors?
  • Normalization: How do we compare metrics from different fields or disciplines?
Group presentations and discussion (4:30-5:30p)

Summing up (5:30-6:00p)

Conclusion (Summarize key points from live and online discussion)

Open discussion: what’s the next year of altmetrics look like?


Source and Presentation Links Available At 


Slideshare > NISO Webinar: Beyond Publish or Perish: Alternative Metrics for Scholarship

November 14, 2012
1:00 - 2:30 p.m. (Eastern Time)


About the Webinar

Increasingly, many aspects of scholarly communication—particularly publication, research data, and peer review—undergo scrutiny by researchers and scholars. Many of these practitioners are engaging in a variety of ways with Alternative Metrics (#altmetrics in the Twitterverse). Alternative Metrics take many forms but often focus on efforts to move beyond proprietary bibliometrics and traditional forms of peer referencing in assessing the quality and scholarly impact of published work. Join NISO for a webinar that will present several emerging aspects of Alternative Metrics.

Source and Q&A and Slideshare Only Available At


Reusing, Revising, Remixing and Redistributing Research

An OA Week guest post by Daniel Mietchen

The initial purpose of Open Access is to enable researchers to make use of information already known to science as part of the published literature. One way to do that systematically is to publish scientific works under open licenses, in particular the Creative Commons Attribution License that is compatible with the stipulations of the Budapest Open Access Initiative and used by many Open Access journals. It allows for any form of sharing of the materials by anyone for any purpose, provided that the original source and the licensing terms are shared alongside. This opens the door for the incorporation of materials from Open Access sources into a multitude of contexts both within and outside traditional academic publishing, including blogs and wikis.

Amongst the most active reusers of Open Access content are Wikimedia projects like the more than 280 language editions of Wikipedia, Wikispecies and their shared media repository, Wikimedia Commons. In the following, a few examples of reusing, revising, remixing and redistributing Open Access materials in the context of Wikimedia projects are highlighted.


Source and Full Text Available At 


Slideshare > Current and Future Effects of Social Media-Based Metrics on Open Access and IRs

The Power of Post Publication Review: A Case Study

There are many discussions and examples of post-publication review as an alternative to the currently more common peer-review model. While this comes up fairly regularly in my Twitter stream, I don’t think I’ve done more than hint at it within the blogposts here. I’ve also been watching (but neglecting to mention here) the emergence of data journalists and data journalism as a field, or perhaps I should say co-emergence, since it seems to be tightly coupled with shifts in the field of science communication and communicating risk to the public. Obviously, these all tie in tightly with the ethical constructs of informed consent and shared decision-making in healthcare (the phrase from the 1980s), which is now more often called participatory medicine.

That is quite a lot of jargon stuffed into one small paragraph. I could stuff it equally densely with citations to sources on these topics, definitions, and debates. Instead, for today, I’d like to give a brief overview of a case I’ve been privileged to observe unfolding over the weekend. If you want to see it directly, you’ll have to join the email list where this took place.


Source and Full Text Available At 


Open Access and Its Impact on the Future of the University Librarian

We are shifting from content ownership by individual libraries to joint provision of services on a larger scale, says Stephen Barr

With the publication of the Finch report earlier this year and the UK government's announcement that it will commit £10m to help make research findings freely available, there has been a gear shift towards a more rapid movement into an open access world for the publishing of scholarly information.

While there has been a lot of discussion around what that shift means for academic publishers, and there is now a lively dialogue between researchers and scholars in different disciplines, there seems to have been less discussion of what this shift means for libraries and librarians. Yet the move towards open access is a profound change for the whole infrastructure of scholarly communication, and is bound to have impacts on the library as it does on other parts of the process.


Source and Full Text Available At 


YouTube > Article-Level Metrics

YouTube > Alt Metrics -- A Funder's Perspective


Duration =  ~14:00 Minutes 

From Bibliometrics to Altmetrics: A Changing Scholarly Landscape

C&RL News > 73 (10) > November 2012 > Robin Chin Roemer and Rachel Borchardt

When future Science Citation Index founder Eugene Garfield first came up with the idea of journal impact factor in 1955, it never occurred to him “that it would one day become the subject of widespread controversy.”

Today, techniques for measuring scholarly impact—traditionally known as bibliometrics—are well known for generating conflict and concern, particularly as tenure-track scholars reach beyond previously set boundaries of discipline, media, audience, and format. From the development of more nuanced academic specialties to the influence of blogs and social media, questions about the scope of scholarly impact abound, even as the pressure to measure such impact continues unabated or increases.

As faculty at universities around the world struggle to find new ways of providing evidence of their changing scholarly value, many librarians have stepped forward to help negotiate the landscape of both traditional impact metrics, such as h-index and journal impact factor, and emerging Web-based alternatives, sometimes called altmetrics, cybermetrics, or webometrics. With interest in online venues for scholarly communication on the rise, and the number of tools available for tracking online influence growing steadily, librarians are in a key position to take the lead in bolstering researchers’ knowledge of current trends—and concerns—in the new art and science of impact measurement.
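To make one of the traditional metrics mentioned above concrete: a researcher's h-index is the largest h such that h of their papers have each received at least h citations. A minimal sketch (the citation counts are hypothetical):

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# A hypothetical record of five papers with these citation counts.
example = h_index([10, 8, 5, 4, 3])  # → 4
```

The altmetrics discussed here aim to supplement exactly this kind of citation-only accounting.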


Source and Full Text Available At  


Scholarly Metrics with a Heart

I attended last week the PLOS workshop on Article Level Metrics (ALM). As a disclaimer, I am part of the PLOS ALM advisory Technical Working Group (not sure why :). Alternative article level metrics refer to any set of indicators that might be used to judge the value of a scientific work (or researcher or institution, etc). As a simple example, an article that is read more than average might correlate with scientific interest or popularity of the work. There are many interesting questions around ALMs, starting even with the simplest: do we need any metrics? The only clear observation is that more of the scientific process is captured online and measured, so we should at least explore the uses of this information.


Source and Full Text and Links Available At 


Open Post-Publication Peer Review


Beyond open access, which is generally considered desirable, the essential drawbacks of the current system of scientific publishing are all connected to the particular way that peer review is used to evaluate papers. In particular, the current system suffers from a lack of quality and transparency of the peer review process, a lack of availability of evaluative information about papers to the public, and excessive costs incurred by a system, in which private publishers are the administrators of peer review. These problems can all be addressed by open post-publication peer review.


Source and Links To Full Text and Brief and Full Arguments Available At


Beyond Open Access: Visions for Open Evaluation of Scientific Papers by Post-Publication Peer Review


This Research Topic in Frontiers in Computational Neuroscience collects visions for a future system of open evaluation. Because critical arguments about the current system abound, these papers will focus on constructive ideas and comprehensive designs for open evaluation systems. Design decisions include: Should the reviews and ratings be entirely transparent, or should some aspects be kept secret? Should other information, such as paper downloads, be included in the evaluation? How can scientific objectivity be strengthened and political motivations weakened in the future system? Should the system include signed and authenticated reviews and ratings? Should the evaluation be an ongoing process, such that promising papers are more deeply evaluated? How can we bring science and statistics to the evaluation process (e.g. should rating averages come with error bars)? How should the evaluative information about each paper (e.g. peer ratings) be combined to prioritize the literature? Should different individuals and organizations be able to define their own evaluation formulae (e.g. weighting ratings according to different criteria)? How can we efficiently transition toward the future system?
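One of those design questions, rating averages with error bars, reduces in its simplest form to a mean plus a standard error. A minimal sketch (the 1-10 rating scale and the reviewer scores are hypothetical, not from any of the collected papers):

```python
import math

def rating_summary(ratings):
    """Mean peer rating with a standard-error 'error bar'."""
    n = len(ratings)
    mean = sum(ratings) / n
    if n < 2:
        return mean, float("inf")  # one rating says nothing about spread
    # Sample variance, then standard error of the mean.
    variance = sum((r - mean) ** 2 for r in ratings) / (n - 1)
    return mean, math.sqrt(variance / n)

# Five hypothetical reviewer ratings on a 1-10 scale.
mean, stderr = rating_summary([7, 8, 6, 9, 7])
```

The error bar shrinks as more ratings accumulate, which is one argument for the ongoing-evaluation model the Research Topic describes.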


Source and Full Text and Articles Links Available At 




The Digital Scholar: How Educators Can Be Part of the Digital Transformation / Martin Weller

Publication Date: 2011 / Pages: 256 /DOI:

While industries such as music, newspapers, film and publishing have seen radical changes in their business models and practices as a direct result of new technologies, higher education has so far resisted the wholesale changes we have seen elsewhere. However, a gradual and fundamental shift in the practice of academics is taking place. Every aspect of scholarly practice is seeing changes effected by the adoption and possibilities of new technologies. This book will explore these changes, their implications for higher education, the possibilities for new forms of scholarly practice and what lessons can be drawn from other sectors.

Table of Contents

  • Acknowledgements
  • Digital, Networked and Open
  • Is the Revolution Justified?
  • Lessons from Other Sectors
  • The Nature of Scholarship
  • Researchers and New Technology
  • Interdisciplinarity and Permeable Boundaries
  • Public Engagement as Collateral Damage
  • A Pedagogy of Abundance
  • Openness in Education
  • Network Weather
  • Reward and Tenure
  • Publishing
  • The Medals of Our Defeats
  • Digital Resilience
  • References

Source and Full Text Available At 




Article-level Metrics: Which Service to Choose?

26 Oct, 12 /  Claire Bower, Digital Comms Manager

Article-level metrics (or ALMs) were a hot topic at this week’s HighWire publisher meeting in Washington. (HighWire hosts both the BMJ and its stable of 42 specialist journals). From SAGE to eLife, publishers seem sold on the benefits of displaying additional context to articles, thereby enabling readers to assess their impact. These statistics range from traditional indicators, such as usage statistics and citations, to alternative values (or altmetrics) like mentions on Twitter and in the mainstream media.

So, what services are available to bring this information together in one simple interface? There are quite a few contenders in this area, including Plum Analytics, PLoS Article-Level Metrics application, Science Card, CitedIn and ReaderMeter. One system in particular has received a good deal of attention in the past few weeks: ImpactStory, a relaunched version of total-impact. It’s a free, open-source webapp that’s been built with financial help from the Sloan Foundation (and others) “to help researchers uncover data-driven stories about their broader impacts”.


Source and Full Text Available At 


Scientists Seek New Credibility Outside of Established Journals

Altmetrics: An App Review > Stacy Konkiel

Keywords: altmetrics; bibliometrics


Date: 2012-10-07

Rights: Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)

Rights URL:

Type: Presentation


In a university culture increasingly influenced by metrics, academic libraries can use altmetrics to highlight scholarship’s hidden value. This session will cover the apps and services that can help faculty, administration, and librarians learn the full, true impact of research.


Presented at OCLC Innovation in Libraries post-conference event, LITA Forum 2012.

Source and Links Available At 


YouTube Video 


Duration = ~23:30

Thursday, November 22, 2012

COAR > Automated Downloading of Citation Data

Catalina Oyler, Five Colleges of Ohio Digital Initiatives Coordinator, developed, as part of an Andrew W. Mellon Foundation grant, a procedure for batch loading scholarly article citations (from Web of Science [etc.] via RefWorks) into a DSpace scholarly article repository. This allowed Oberlin College to efficiently load large numbers of faculty citations for 2010 and 2011 as a means of growing the IR.

OhioLink > Documentation‎ > ‎Batch Submission from RefWorks

This process modifies the batch submission process to start with metadata in the form of RefWorks citations instead of an Excel spreadsheet.

There are two different processes for going from RefWorks to the DRC. The Refworks2DC process uploads the RefWorks metadata without an associated bitstream. This process can be used to populate a collection with citations and links to DOIs, or to have bitstreams added later. The Refworks2DCbitstream process uploads metadata as well as primary object bitstreams. Each attachment includes an instruction guide as well as the files needed for the transformation.
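The core of such a transformation is mapping citation fields onto the dublin_core.xml file of a DSpace Simple Archive Format item. A rough sketch, assuming the RefWorks citation has already been parsed into a dict (the field names and DOI link are illustrative; this is not the actual Five Colleges script):

```python
import xml.etree.ElementTree as ET

def citation_to_dublin_core(citation):
    """Render one parsed citation as a DSpace dublin_core.xml document."""
    root = ET.Element("dublin_core", schema="dc")

    def add(element, value, qualifier="none"):
        node = ET.SubElement(root, "dcvalue", element=element, qualifier=qualifier)
        node.text = value

    add("title", citation["title"])
    for author in citation["authors"]:
        add("contributor", author, qualifier="author")
    add("date", citation["year"], qualifier="issued")
    if citation.get("doi"):  # link out when no bitstream is deposited
        add("identifier", "https://doi.org/" + citation["doi"], qualifier="uri")
    return ET.tostring(root, encoding="unicode")

xml = citation_to_dublin_core({
    "title": "An Example Article",
    "authors": ["Doe, Jane"],
    "year": "2011",
})
```

Writing one such file per item directory, alongside a `contents` file, is what the DSpace batch importer expects.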

Source and Links Available At 



Process for Batch Uploads to Production Instance

Wednesday, November 21, 2012

SUNScholar/Audit > Ingest of Research Digital Assets and Metadata

Section 6: Ingest

Ingest of research digital assets and metadata must be actively pursued and monitored using automatic and manual methods.

Source and Links Available


PDF Permissions Google Docs Script YouTube Video

A demo of the early development stages of a script that will automate PDF permissions lookup in Sherpa Romeo

Stephen X. Flynn / Emerging Technologies Librarian / The College of Wooster/ Wooster, OH
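A script like this presumably talks to the SHERPA/RoMEO API, which at the time returned XML describing a publisher's archiving policy. A hedged sketch of the lookup-and-parse step (the endpoint and element names follow my reading of the legacy v2.9 API, and the sample response is invented; check current SHERPA documentation before relying on them):

```python
import urllib.parse
import xml.etree.ElementTree as ET

ROMEO_API = "http://www.sherpa.ac.uk/romeo/api29.php"  # legacy v2.9 endpoint

def romeo_query_url(issn):
    """Build a RoMEO lookup URL for a journal ISSN."""
    return ROMEO_API + "?" + urllib.parse.urlencode({"issn": issn})

def parse_archiving_policy(response_xml):
    """Pull pre/post-print archiving flags out of a RoMEO XML response."""
    root = ET.fromstring(response_xml)

    def text(path):
        node = root.find(path)
        return node.text if node is not None else None

    return {
        "publisher": text(".//publisher/name"),
        "preprint": text(".//prearchiving"),
        "postprint": text(".//postarchiving"),
        "colour": text(".//romeocolour"),
    }

# Invented sample response for illustration only.
sample = """<romeoapi><publishers><publisher>
  <name>Example Press</name>
  <preprints><prearchiving>can</prearchiving></preprints>
  <postprints><postarchiving>can</postarchiving></postprints>
  <romeocolour>green</romeocolour>
</publisher></publishers></romeoapi>"""
policy = parse_archiving_policy(sample)
```

A spreadsheet script would call `romeo_query_url` per row, fetch the XML, and write the parsed flags back into the sheet.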

Deposit Strand

This project will seek to embed institutional deposit into the academic workflow of the researcher at almost no cost to the researcher. We will work with Mendeley and Symplectic to allow researchers to synchronise their personal research collections with institutional systems at no extra effort. We expect to significantly increase deposit rates as a result.

This strand builds on previous JISC programmes and other work in this area that have dealt with the issues around the deposit process and, as mentioned above, seeks to lower the barrier to deposit:

  • "Jisc Deposit" event that preceded the funding of these projects: the list of current deposit tools that have been built, and the themes/patterns beginning to emerge in these deposit situations.
  • There have been a range of other JISC projects that have worked on deposit solutions.
  • Open Access Repository Junction offers an API that supports redirect and deposit of research outputs into multiple repositories.
  • Open Access policies are listed by ROARMAP and Sherpa-Juliet, and these may suggest research communities where deposit might be a concern for researchers.
  • SWORD is a widely used application nationally and internationally.
  • Various "Shared Infrastructure Services" projects, such as Sherpa-RoMEO, openDOAR and Names offer functionality that can support deposit.
  • Text mining tools/services by organisations such as Yahoo's term extractor, Thomson Reuters's Open-Calais, Nactem's tools for researchers and other services also provide opportunities to enhance deposit.
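Since SWORD recurs throughout this list, a minimal sketch of what a metadata-only SWORDv2 create request looks like may help. The collection URI is hypothetical and no network call is made here; only the Atom entry body and headers are assembled:

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def sword_entry(title, author, abstract):
    """Build a minimal Atom entry body for a SWORD metadata-only deposit."""
    ET.register_namespace("", ATOM)  # serialize with Atom as default namespace
    entry = ET.Element("{%s}entry" % ATOM)
    ET.SubElement(entry, "{%s}title" % ATOM).text = title
    author_el = ET.SubElement(entry, "{%s}author" % ATOM)
    ET.SubElement(author_el, "{%s}name" % ATOM).text = author
    ET.SubElement(entry, "{%s}summary" % ATOM).text = abstract
    return ET.tostring(entry, encoding="unicode")

def deposit_request(collection_uri, body):
    """Assemble the pieces of a SWORDv2 create request (no network call)."""
    headers = {
        "Content-Type": "application/atom+xml;type=entry",
        "In-Progress": "true",  # keep the item open for a later file deposit
    }
    return {"method": "POST", "url": collection_uri, "headers": headers, "body": body}

body = sword_entry("An Example Title", "Doe, Jane", "A short abstract.")
req = deposit_request("https://repo.example.org/sword2/collection/1", body)
```

POSTing that entry to a repository's collection URI is the step the deposit tools above wrap in friendlier interfaces.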

DepositMO: Modus Operandi for Repository Deposits

The DepositMO project aims to develop an effective culture change mechanism that will embed a deposit culture into the everyday work of researchers and lecturers. The proposal will extend the capabilities of repositories to exploit the familiar desktop and authoring environments of its users. The objective is to turn the repository into an invaluable extension to the researcher’s desktop in which the deposit of research outputs becomes an everyday activity. The target desktop software suite is Microsoft Office, which is widely used across many disciplines, to maximise impact and benefit. Targeting both EPrints and DSpace, leveraging SWORD and ORE protocols, DepositMO outputs will support a large number of organisations. The ultimate goal is to change the Modus Operandi of researchers so that repository deposit becomes standard practice across a wide number of disciplines using familiar desktop tools.

DURA – Direct User Repository Access


See Also > Dura Project with Mendeley and Caret 

RePosit: Positing a New Kind of Deposit

The RePosit Project seeks to increase uptake of a web-based repository deposit tool embedded in a researcher-facing publications management system. Project work will include gathering feedback from users and administrators and evaluating the tool's effectiveness; developing general strategies for increasing uptake of embedded deposit tools; compiling a community commentary on the issues surrounding research management system integration; and producing open access training materials to help institutions enlighten their users and administrators regarding how embedded deposit tools are related  to the work of the library and the repository.

The intention is to use the reduction in deposit barriers offered by the tool to enhance open access content, creating more full-text objects available under stable URIs. This will be used to demonstrate that repositories can play a part in the researcher's daily activities, and that a deposit mandate is viable for the partner institutions. Success is measurable by an increase in the number of open access items which is greater than the expected increase without use of the deposit tool and the advocacy throughout this project. Other outputs will take the form of documentation available freely on the web.

Source and Links Available

Friday, November 16, 2012

Automated Deposit of Researcher Publications Into Repositories?


Are you aware of any effort in which metadata and the full text (and/or link) of e-journal articles (and/or other digital publications) are automatically harvested and "deposited" within a local *institutional* (and/or subject) repository?

It has occurred to me that the automation of publication deposition could quickly populate such repositories.
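The standard machinery for the harvesting half of this idea already exists: OAI-PMH. A sketch of building a ListRecords request and pulling identifier/title pairs out of the response (the sample response is invented; real responses also carry resumption tokens and richer metadata):

```python
import urllib.parse
import xml.etree.ElementTree as ET

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Build an OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    return base_url + "?" + urllib.parse.urlencode(params)

def extract_records(response_xml):
    """Pull (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(response_xml)
    out = []
    for record in root.findall(".//oai:record", NS):
        ident = record.findtext(".//oai:identifier", namespaces=NS)
        title = record.findtext(".//dc:title", namespaces=NS)
        out.append((ident, title))
    return out

# Invented sample response for illustration only.
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/"
  xmlns:dc="http://purl.org/dc/elements/1.1/">
 <ListRecords>
  <record>
   <header><identifier>oai:example.org:1</identifier></header>
   <metadata><dc:title>A Paper</dc:title></metadata>
  </record>
 </ListRecords>
</OAI-PMH>"""
records = extract_records(sample)
```

Each harvested record would still need to be matched to local authors and deposited, which is where the copyright question below comes in.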

As a number of publishers allow for deposit of a post-print, the question of copyright could / might / should / would not be an issue [?]

Thanks for considering ...

Please submit as comment / Thanks !


Monday, November 5, 2012

Mendeley Global Research Report

What Authors Want From Open Access Publishing: Wiley Author Survey 2012

Wiley conducted a survey of over 100,000 journal article authors to discover their opinions and behaviors with regard to open access publishing. The results are detailed in these slides.

Friday, October 26, 2012

Open Access Explained! < YouTube

>>> Duration = ~ 8:30 Minutes <<<

What is open access? Nick Shockey and Jonathan Eisen take us through the world of open access publishing and explain just what it's all about.

Thursday, October 25, 2012

Nature > Alternative Metrics

As the old 'publish or perish' adage is brought into question, additional research-impact indices, known as altmetrics, are offering new evaluation alternatives. But such metrics may need to adjust to the evolution of science publishing.

Today, a growing frustration among researchers is that the impact of their contribution to science is mostly assessed on the basis of out-of-date mechanisms including impact factor and citation measurements. This discontent occurs as we are reaching a turning point in science publishing history where the essence of the peer-review process has been called into question.

Indeed, the drive to find alternative metrics is a symptom of a community where research evaluation is not functioning well. A new movement called altmetrics — eloquently described through a manifesto published in 2010 and arguably a variation on the theme of what is referred to as webometrics or social media metrics — revisits the measurement of a scientist's worth. Rather than using peer-reviewed journal articles, alternative metrics range from other types of research output to a researcher's reputation made via their footprint on the social web.


Source and Full Text Available At

Monday, October 8, 2012

HowOpenIsIt? > Open Access Spectrum > Final Version Now Available

Not all Open Access is created equal. To move beyond the seemingly simple question of “Is it Open Access?” PLOS, SPARC and OASPA have collaborated to develop a resource called “HowOpenIsIt?” This resource identifies the core components of open access (OA) and how they are implemented across the spectrum between "Open Access" and "Closed Access". We recognize there are philosophical disagreements regarding OA and this resource will not resolve those differences. 

We are seeking input on the accuracy and completeness of how OA is defined in this guide. Download the above open review draft and provide feedback below in the comment form. In its final form, this guide will provide an easily understandable, comprehensive, and quantifiable resource to help authors make informed decisions on where to publish based on publisher policies. In addition, funders and other organizations will have a resource that indicates criteria for what level of OA is required for their policies and mandates.

This OA guide is aimed toward a wide audience of researchers, authors, and policy-makers. Your feedback will help us more precisely define OA across a number of categories. The goals of the guide are to:

• Move the conversation from “is it open access?” to “how open?” 

• Clarify the definition of OA  

• Standardize terminology 

• Illustrate a continuum of “more open” versus “less open” 

• Enable people to compare and contrast publications and policies 

• Broaden the understanding of OA to a wider audience 

In 2002, the Budapest Open Access Initiative articulated the basic tenets of OA for the first time. Since then, thousands of journals have adopted policies that embrace some or all of the open access core components related to: readership; reuse; copyright; posting; and machine readability.

Why now and why this resource?  

OA is gaining momentum and we are seeing a groundswell of support from authors and funders to colleges and governments. Despite this progress there is still confusion about OA. With this guide we aim to provide greater clarity regarding its definition and components. All suggestions will be considered and a final version will be released during Open Access Week (October 22 -28, 2012). 



Unfortunately > The comment period is now closed. 

Final Version Available Via (10-19-12)


Saturday, June 2, 2012

Scientific Utopia: I. Opening Scientific Communication

Brian A. Nosek, Yoav Bar-Anan / Submitted on 4 May 2012

Existing norms for scientific communication are rooted in anachronistic practices of bygone eras, making them needlessly inefficient. We outline a path that moves away from the existing model of scientific communication to improve the efficiency in meeting the purpose of public science - knowledge accumulation. We call for six changes: (1) full embrace of digital communication, (2) open access to all published research, (3) disentangling publication from evaluation, (4) breaking the "one article, one journal" model with a grading system for evaluation and diversified dissemination outlets, (5) publishing peer review, and (6) allowing open, continuous peer review. We address conceptual and practical barriers to change, and provide examples showing how the suggested practices are being used already. The critical barriers to change are not technical or financial; they are social. While scientists guard the status quo, they also have the power to change it.

Comments > Psychological Inquiry, 2012
Subjects > Physics and Society (physics.soc-ph); Digital Libraries (cs.DL)
Cite as > arXiv:1205.1055v1 [physics.soc-ph]
Submission history > From: Brian Nosek [view email]  [v1] Fri, 4 May 2012 17:27:17 GMT (555kb)

Source and Link to Fulltext Available At 


Colloquium on Rethinking the Future of Scientific Communication

SUMMARY: "In March 2012, an impressive roster of leaders in the technology and communication industries (including: Anurag Acharya, Google Scholar; Phil Bourne, PLoS Computational Biology; Paul Saffo, Foresight, Discern Analytics; among other accomplished editors, librarians, publishers and graduate students) participated in a colloquium on the Stanford University campus. Convened by Nader Rifai, Harvard Medical School; Michael Keller, Stanford University Libraries; and John Sack, HighWire Press, the discussions focused on identifying innovative ways to create continuity across the scientific community ecosystem in order to keep pace with the revolutionary transformation of search and discovery by the popular press. The executive summary provides more information on this progressive colloquium."

Links to Overview and Executive Summary Available At


Does Open Access Publishing Increase Citation or Download Rates?

Issue 28 - May 2012

Dr Henk Moed / May 2012

The effect of "Open Access" (OA) on the visibility or impact of scientific publications is one of the most important issues in the fields of bibliometrics and information science. During the past 10 years numerous empirical studies have been published that examine this issue using various methodologies and viewpoints. Comprehensive reviews and bibliographies are given amongst others by OPCIT ..., Davis and Walters ... and Craig et al. ... . The aim of this article is not to replicate or update these thorough reviews. Rather, it aims to present the two main methodologies applied in these OA-related studies and to discuss their potentialities and limitations. The first method is based on citation analyses; the second on usage analyses.


Source and Fulltext Available At  


UK RepositoryNet+

About RepositoryNet+



UK RepositoryNet+ (RepNet) is a socio-technical infrastructure supporting deposit, curation & exposure of Open Access research literature.  [snip]. ... [Its] general approach is to envision a mix of distributed and centrally delivered service components within pro-active management, operation, support and outcome. While this infrastructure will be designed to meet the needs of UK research, it is set and must operate effectively within a global context.

Open Access

An entry in Wikipedia defines Open Access (OA) to research literature as the practice of providing unrestricted access via the Internet to peer-reviewed scholarly journal articles and increasingly to theses, scholarly monographs and book chapters. It also notes that OA comes in two degrees: Gratis OA, no-cost online access; Libre OA, being Gratis OA plus some additional usage rights. [snip].

There are various forms of mixed and hybrid Open Access (OA), but the main distinction is between Green and Gold. [snip].


The aim of the RepNet project is to increase the cost effectiveness of repositories ... by offering a sustained and well-used suite of services that facilitate cost effective operation.  Specifically RepNet will:

  • Scope and deliver repository and curation services via a production environment that offers economies of scale and scope
  • Set up a production environment for repository shared services which works closely with the proposed innovation zone.
  • Provide market research/ intelligence, quality assurance, business case and sustainability planning to support the project.


As an infrastructure hub, RepNet will facilitate efficient service delivery and service support, e.g. through provision of a professional helpdesk, development of appropriate service level definitions against which service levels can be monitored, and liaison around the scope and need for improvement of service components.


Sustainability planning is a key outcome, operating at several levels. Institutional support of repositories ultimately requires that they meet institutional objectives ... . The central task for RepNet is to provide sustainable infrastructure with service-quality components that assist cost-effective ingest, quality improvement and continuity of access for repository content.

First steps and priorities

First steps in this project have been development of technical infrastructure and of a suite of shared services.  The project sought to fund integration of components that were production ready or close to it and were already in use by the community.  [snip]. In the second ‘wave’ more attention will be given to scoping the curation requirements – and identifying or specifying the components required to meet those requirements.


The RepNet Project Board meets monthly, and the Advisory Board meets on a quarterly basis. The RepNet team meets monthly, with minutes and notes from all meetings posted on the project wiki. Regular and ad hoc meetings of the SIPG group are also minuted and published on the project wiki.


Source Available Via


Friday, June 1, 2012

Two Architects of Library Discovery Tools Launch an Altmetrics Venture

Michael Kelley / May 31 2012

“Our beta customers will be helping us prioritize the next sources from which to harvest,” Buschman said.

Two prominent veterans of the library vendor world recently launched a startup company which aims to capitalize on the rapidly flowering field of altmetrics.

Andrea Michalek and Mike Buschman had been the director of technology and director of product management, respectively, for ProQuest’s Summon discovery service since its inception. But the pair left the company in November 2011 and in January founded Plum Analytics, deciding that altmetrics presented enough promise to justify surrendering such prominent positions.


After raising money from friends, family, and angel investors, the duo demoed the public alpha product on March 14 at the monthly Philly Tech Meetup (see video).

"Since that time, we have been talking to libraries interested in becoming beta customers to help build out the next level of the product, as well as take the opportunity to define the next generation of impact metrics," Buschman said. [snip].


Altmetrics (short for alternative metrics) provides a new way to measure the impact of scholarly communication. Rather than rely solely on the traditional and slow measure of citations in peer-reviewed articles (the impact factor), altmetrics provides a complementary, instant measurement window that includes all Web-based traces of research communication. It pulls together all the usage data about each individual output a researcher has produced.


Plum Analytics and similar ventures in the field aggregate metrics, collected via open APIs, from sources as varied as Twitter to blogs to open access repositories that publish article-level metrics (such as PLoS) as well as from data repositories, scholarly social bookmarking sites (e.g., Mendeley or CiteULike), code source repositories (GitHub), presentation sharing sites (SlideShare), grant funding data, link shortener metrics, and more.
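The aggregation model described above — rolling up per-source events harvested from many open APIs into a single profile for each research output — can be sketched in a few lines of Python. The source names, metric names, and counts below are purely illustrative assumptions, not Plum Analytics' actual schema or data:

```python
from collections import defaultdict

# Hypothetical per-source records for one research output, as might be
# harvested from various open APIs (all names and numbers are illustrative).
source_metrics = [
    {"source": "twitter", "metric": "tweets", "count": 50},
    {"source": "mendeley", "metric": "readers", "count": 120},
    {"source": "github", "metric": "forks", "count": 7},
    {"source": "slideshare", "metric": "views", "count": 430},
]

def aggregate(records):
    """Roll individual harvested records up into one (source, metric) profile."""
    profile = defaultdict(int)
    for rec in records:
        profile[(rec["source"], rec["metric"])] += rec["count"]
    return dict(profile)

print(aggregate(source_metrics))
```

The point of the sketch is only that each service in this space faces the same shape of problem: normalizing heterogeneous event streams into one per-output profile before any scoring or display happens.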

Plum Analytics is wading into an incipient but very active field, as this user group on Mendeley and the Twitter hashtag #altmetrics show. In addition to the article-level metrics application that PLoS has been developing, services similar to Plum Analytics, such as CitedIn, ReaderMeter, and Science Card, have also emerged.

One of the more prominent services is Total-Impact, which Jason Priem, a third-year doctoral student at the University of North Carolina at Chapel Hill’s School of Information and Library Science (SILS), and Heather Piwowar, a postdoctoral research associate at the National Evolutionary Synthesis Center (NESCent) in Durham, have developed. [snip]


In the case of Plum Analytics, the metrics are presently based on 13 main data sources, which Buschman said they are adding to (including sources more important to social sciences and humanities). Users will likely be allowed to weight the data as they choose (e.g., 50 Tweets equal one “like”).
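The user-configurable weighting Buschman describes — for example treating 50 tweets as equivalent to one "like" — might look like the following minimal sketch. The event names and weight values are assumptions for illustration, not Plum Analytics' product:

```python
# Illustrative weights: how many units of "engagement" one raw event of
# each type is worth (e.g., 50 tweets ~ 1 like, per the example above).
weights = {"tweet": 1 / 50, "like": 1.0, "bookmark": 2.0}

def weighted_score(counts, weights):
    """Combine raw event counts into a single user-weighted engagement score."""
    return sum(weights.get(event, 0.0) * n for event, n in counts.items())

counts = {"tweet": 100, "like": 3, "bookmark": 2}
print(weighted_score(counts, weights))  # 100/50 + 3*1 + 2*2 = 9.0
```

Letting users supply the weights dictionary themselves is what makes such a score tunable to disciplinary context, which matters given the varying attitudes toward altmetrics noted below.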


Wendy Pradt Lougee, the university librarian at the University of Minnesota, said the library there has a very close partnership with the university’s office of research in order to explore ways of revealing more data about researchers, including metrics beyond citations, and rolling out SciVal from Elsevier and the Harvard Profiles research networking software. But attitudes toward altmetrics can vary considerably depending on the disciplinary context.

“Faculty are very discerning in how they are represented and the reputational value of different publication venues and metrics,” Lougee said. “We have seen a growing interest in research networking systems and tools that help move beyond just citations and represent a faculty’s research repertoire more fully,  ... ."

Nevertheless, university librarians such as Lougee and Sarah Michalak at the University of North Carolina at Chapel Hill are keeping a close eye on developments, even if they are not yet ready to plunge headfirst into altmetrics.

“We need to interest ourselves in new ways of measuring the impact of scholarship and new and powerful kinds of information tools,” Michalak said, noting that librarians were among the first to see the possibilities in the school’s Reach NC project.

Buschman of Plum Analytics saw the library as a natural ally. The data becomes a tool that libraries can use to help researchers determine which forms of communication generate the most meaningful interaction with their research and also track forms of impact that are not contained in the citation record.


However, pulling together all the usage data about each individual output a researcher has produced presents a number of technological challenges. [snip].


Befitting their background as architects of Summon, Buschman and Michalek also are attempting to build a commercial-grade product that can scale up to the challenge of loading the research output from millions of researchers, and the data sources that report on them.

“The flexibility of data analysis at scale is the sweet spot of our solution,” Buschman said.  “We are building toolsets not just for collecting the article level metrics, but also for mapping the hierarchy of the institution and the affiliations of the researchers.”


“In the long term, this is a chance for libraries to lead the way into a web-native, post-journal world of scholarly communication, in which aggregated conversation and review by experts replaces the slow, closed, inefficient, atavistic system we’ve inherited from the last century,” said Priem of Total-Impact.

Source and Fulltext Available At 


Monday, May 28, 2012

BioMed Central Blog > Assessing Research Impact at the Article Level

Posted by Ciaran O'Neill / Friday May 25, 2012

The impact of academic research has long been measured using citations, often with the Journal Impact Factor being used to assess the individual publications within a journal. However, the Impact Factor is a journal-level - not an article-level - metric and, as academic publishing and the surrounding discussion move increasingly onto the web, novel opportunities to track and assess the impact of individual scientific publications have emerged.

These web-based approaches are starting to offer an article-level perspective on the way research is disseminated, discussed and integrated across the web. The hope is that a broader set of metrics to complement citations will eventually give a more comprehensive view of article impact, ... . One of a growing number of web-based tools taking a novel approach to the assessment of scholarly impact aggregates the mentions on Twitter and social media sites, and coverage in online reference managers, mainstream news sources and blogs, to present an overview of the interest a published article is receiving online. BioMed Central has today added this tool's 'donut' to the about page of published articles – the donut will display for articles receiving coverage which has been tracked by the tool, along with an article score ... .

The donut visualization shown on the 'about this article' page aims to convey information about the type of attention the article has received ... .

This summary supplements our existing article-level measures of impact – article accesses and citations are displayed on all 'about this article' pages, ... . [snip].

As more indicators of article performance, visibility and impact emerge, the hope is that authors, readers and funding institutions will be able to assess research impact in a way which is more informed than relying on Impact Factors alone. [snip]. We plan to keep adding to this range of metrics and indicators, as they continue to expose a fuller image of research impact.

Source and Fulltext Available At 


Saturday, May 19, 2012

Latest Developments in PLoS Article-Level Metrics

By Richard Cave
Posted: May 14, 2012

PLoS continues to expand and refine Article-Level Metrics (ALM). This suite of performance measures (including usage statistics, citations, trackbacks from blogs, bookmarks, social media coverage, and user comments and ratings) is available on every PLoS article so that authors and the scientific community can assess the impact of the research. We are also broadening our outreach activities to spread the word on ALM to more researchers, technical experts, other publishers, funders, and institutions.

A key part of the current effort is to convene scholarly metrics thought leaders to help spearhead the widespread adoption of ALM. By engaging leading authorities in metrics, and bringing them together in a working group, PLoS can better coordinate the development of ALM. The following experts serve on the ALM Technical Working Group in an advisory role to help steer the direction of PLoS ALM implementation:

  • Pedro Beltrao, University of California San Francisco
  • Phil Bourne, University of California San Diego
  • Bjoern Brembs, Freie Universität Berlin
  • Martin Fenner, PLoS
  • Duncan Hull, European Bioinformatics Institute
  • Cameron Neylon, Science and Technology Facilities Council Oxford
  • Heather Piwowar, NESCent, Duke University
  • Jason Priem, University of North Carolina at Chapel Hill
  • Dario Taraborelli, Wikimedia Foundation
  • Jevin West, University of Washington
  • Johan Bollen, Indiana University


Source and Fulltext Available

PLoS > Article-Level Metrics

PLoS Article-Level Metrics (ALM): A Research Impact Footprint

Article-Level Metrics consist of a transparent suite of established measures that offer a view into the overall performance and reach of a research article.

Source and Links Available At 

New Data Sources Added to the PLoS Article-Level Metrics Program > Feb 2012

Cameron Neylon > Article-level Metrics > 2010

Article-Level Metrics at PLoS - What Are They, and Why Should You Care > Nov 2009

Wednesday, May 16, 2012

Journal of Librarianship and Scholarly Communication

The Journal of Librarianship and Scholarly Communication seeks to share useful innovations, both in thought and in practice, with the aim of encouraging scholarly exchange and the subsequent benefits that are borne of scrutiny, experimentation and debate. As modes of scholarly communication, the technologies and economics of publishing and the roles of libraries evolve, it is our hope that the work shared in the journal will inform practices that strengthen librarianship and that increase access to the "common Stock of Knowledge."

Current Issue: Volume 1, Issue 1 (2012)
Video Credit: Defining Scholarly Communication, created/owned by the contributors and produced by Kathryn Pope and Vin Aliberto, Center for Digital Research and Scholarship, Columbia University Libraries/Information Services. CC-BY.


What is in a Name? Introducing the Journal of Librarianship and Scholarly Communication
Isaac Gilman and Marisa Ramirez

In Memoriam: Deborah Barreau / Gary Marchionini


The Movement to Change Scholarly Communication Has Come a Long Way – How Far Might It Go? / Joyce Ogburn

Coming in the Back Door: Leveraging Open Textbooks To Promote Scholarly Communications on Campus / Steven J. Bell

Point & Counterpoint: Is CC BY the Best Open Access License? / Klaus Graf and Sanford Thatcher

Research Articles

The Anatomy of a Data Citation: Discovery, Reuse, and Credit / Hailey Mooney and Mark P. Newton

The Accessibility Quotient: A New Measure of Open Access / Mathew A. Willmott, Katharine H. Dunn, and Ellen Finnie Duranceau

Does Tenure Matter? Factors Influencing Faculty Contributions to Institutional Repositories / Anne M. Casey

Practice Article

Innovation Fair Abstracts, SPARC 2012 Open Access Meeting / Abstract Authors

Theory Article

Open Access Publishing Practices in a Complex Environment: Conditions, Barriers, and Bases of Power / Thomas L. Reinsfelder

Brief Reviews of Books and Products

Developing Open Access Journals: A Practical Guide by David J. Solomon / Caitlin Bakker

Starting, Strengthening, and Managing Institutional Repositories by Jonathan A. Nabe / Therese F. Triumph

Embedding Repositories: A Guide and Self-Assessment Tool by JISC / Michele I. Wyngard

Source and Links Available At 

Tuesday, May 15, 2012

How Twitter Will Revolutionise Academic Research and Teaching

Monday 12 September 2011

Social media is becoming increasingly important in teaching and research work, but tutors must remember it's a conversation, not a lecture, says Ernesto Priego

In chapter two of Christian Vandendorpe's From Papyrus to Hypertext, titled "In the Beginning Was the Ear", Vandendorpe says it took millennia "for literature to free itself from primary orality, albeit not completely". In the beginning all reading was done out loud, and it was not until the 12th century that books were created for silent reading. Orthographic signs and the separation between words had appeared around the 7th century, but did not become common until the 9th century amongst the learned communities of monks. Walter J Ong, in his classic study of writing and orality, Orality and Literacy, defines the "technologizing of the word" as the process of developing a new relationship between language and thought.

Something similar is happening today in academia. Just as Augustine marveled, in the year 400, at the sight of Ambrose reading in silence, many members of academia marvel (or react with rejection) at the rapid changes in the production and dissemination of scholarly work and in the interaction between academics and those "outside" academic institutions. Thousands of scholars and higher education institutions are participating in social media (such as Twitter) as an important aspect of their research and teaching work.

There is still considerable resistance to embracing social media tools for educational purposes, but if you are reading this article you are probably willing to consider their positive effects. [snip].

Mobile phones and tablets enable the user to produce content, and to publish and disseminate it online. The microblogging platform Twitter is purposefully designed to exchange information and to facilitate reciprocal communication and attribution, [snip].

This is the key question where "the ear", our ability to listen and to place ourselves in a particular context at a particular time, comes to the fore. In traditional scholarly communications, academics produced a document ... . [snip]. Today, more and more academics are creating content and posting it online. [snip]. Therefore, the 21st century scholar has the tools not only to publish and disseminate, but also to facilitate the development of specialised audiences, and therefore of what is called "impact": people read, and in turn write about your work, which is in turn read by others.


"It's a conversation, not a lecture," is a well-known trope that is useful to remember in the scholarly web. This does not mean we should spend every waking hour chatting to strangers on social networks; it means that social media is not a uni-directional broadcasting tool. Those who "follow" us online are likely to be our students, colleagues, employers. They are not a passive audience.


Source and Fulltext Available At 

Sunday, May 13, 2012

Research Blogs and the Discussion of Scholarly Information

Shema H, Bar-Ilan J, Thelwall M (2012) Research Blogs and the Discussion of Scholarly Information. PLoS ONE 7(5): e35869. doi:10.1371/journal.pone.0035869


The research blog has become a popular mechanism for the quick discussion of scholarly information. However, unlike peer-reviewed journals, the characteristics of this form of scientific discourse are not well understood, for example in terms of the spread of blogger levels of education, gender and institutional affiliations. In this paper we fill this gap by analyzing a sample of blog posts discussing science via an aggregator called ResearchBlogging.org (RB). RB aggregates posts based on peer-reviewed research and allows bloggers to cite their sources in a scholarly manner. We studied the bloggers, blog posts and referenced journals of bloggers who posted at least 20 items. We found that RB bloggers show a preference for papers from high-impact journals and blog mostly about research in the life and behavioral sciences. The most frequently referenced journal sources in the sample were: Science, Nature, PNAS and PLoS One. Most of the bloggers in our sample had active Twitter accounts connected with their blogs, and at least 90% of these accounts connect to at least one other RB-related Twitter account. The average RB blogger in our sample is male, either a graduate student or has been awarded a PhD and blogs under his own name.

Source and Fulltext Available At


Sunday, May 6, 2012

Science and Truth: We’re All in It Together

Published: May 5, 2012

THE greatest bird news of our lifetime occurred at the height of the George W. Bush administration. In April 2005, amid a pageant of flags and cabinet ministers in Washington, John Fitzpatrick, the director of the Cornell Lab of Ornithology, announced that an ivory-billed woodpecker had been spotted for the first time in more than half a century in an Arkansas swamp.

President Bush pledged millions for habitat restoration. This and hundreds of other papers heralded the news. Public radio did one of those field reports in which you can hear the reporter’s canoe purling through swamp waters.

The news was exciting because the evidence of this new truth was overwhelming. There was an empirical article in the journal Science, an online video of the bird, audio clips reminiscent of its famous tinhorn squeak and seven sightings of the bird by credentialed experts.

Moreover, the ivory-bill is charismatic megafauna, regally beautiful and a natural mascot for fund-raising: a magnificent blast of snow in its trailing feathers, a jaunty red cap for a crown and a Harry Potteresque bit of white lightning down its neck. For it to appear after so many years was mythological, a message of forgiveness: maybe our environmental sins weren’t so bad. Not since the dove returned to Noah’s Ark has a bird’s appearance been so fraught.

Right away, though, there was controversy. Several academics, among them Richard Prum and Mark Robbins, questioned the evidence but held their criticisms when privately shown more and better data.

Then something new happened. Outsiders and other disbelievers kept on coming. A painter of birds, David Sibley (joined by several academics outside Cornell), dissected the video frame by frame and saw a common pileated woodpecker. Uh-oh. Then an amateur birder, Tom Nelson, began to gather the Internet commenters on his own blog. For the next several years, Nelson's blog was a watering hole where weekend bird enthusiasts, field guides and others produced reams of counter-evidence and arguments, and so completely dismantled each piece of ivory-bill evidence that few outside the thin-lipped professionals at Cornell still believed in the bird.

Almost any article worth reading these days generates some version of this long tail of commentary. Depending on whether they are moderated, these comments can range from blistering flameouts to smart factual corrections to full-on challenges to the very heart of an article’s argument.


Should this part of every contemporary article be curated and edited, almost like the piece itself? Should it have a name? Should it be formally linked to the original article or summarized at the top? By now, readers understand that the definitive “copy” of any article is no longer the one on paper but the online copy, precisely because it’s the version that’s been read and mauled and annotated by readers. [snip].

We call the fallout to any article the “comments,” but since they are often filled with solid arguments, smart corrections and new facts, the thing needs a nobler name. Maybe “gloss.” [snip]


Sure, there is still the authority that comes of being a scientist publishing a peer-reviewed paper, or a journalist who’s reported a story in depth, but both such publications are going to be crowd-reviewed, crowd-corrected and, in many cases, crowd-improved. (And sometimes, crowd-overturned.) Granted, it does require curating this discussion, since yahoos and obscenity mavens tend to congregate in comment sections.

Yet any good article that has provoked a real discussion typically comes with a small box of post-publication notes. And, since many magazines are naming the editor of the article as well as the author, the outing of the editor can come with a new duty: writing the bottom note that reviews the emendations to the article and perhaps, most importantly, summarizes the thrust of the discussion. If the writer gains the glory of the writing, the editor can win the credit for chaperoning the best and most provocative pieces.

Some scientists are already experimenting with variations of this idea within the stately world of peer review. New ways to encourage wider collaboration before an article is published — through sites like ResearchGate — are attempts to bring the modern world of crowd-improvement to empirical research.

Already, among scientists, there is pushback, fear that incorporating critiques outside of professional peer review will open the floodgates to cranks. Not necessarily. The popular rejection last year of the discovery of a microbe that can live on arsenic was mercifully swift precisely because it was executed by online outsiders. Not acknowledging that crowd-checking and amateur commentary have created a different world poses its own dangers.

Take the case of the ivory-bill. The article in Science has never been retracted. Cornell still stands by its video. [snip]

Some may fear that recognizing the commentary of every article will turn every subject into an endless postmodern discussion. But actually, the opposite is true. Recognizing the gloss allows us to pause in the seemingly unending back and forth of contemporary free speech and free inquiry ... .


Source and Fulltext Available At 


CfP > Peer Reviewing and Research Evaluation

Tuesday, May 1, 2012

Refereeing Academic Articles in the Information Age


The new technology (such as ScholarOne) used for submitting papers to academic journals ...  increases the possibilities for gathering, analysing and presenting summary data on stages in the refereeing process. Such data can be used to clarify the roles played by editors and publishers as well as referees—roles less widely discussed in the previous literature. I conclude, after a review of related issues, that refereeing should be “open” in this information age—i.e. correspondence between editors, referees and authors should be open and available, and not private. Some of the issues involved in achieving this are outlined and discussed.

Practitioners' Notes
  • What is already known about this topic
The importance and the value of refereeing articles submitted to journals are widely debated and are contentious issues.
  • What this paper adds
This paper discusses these concerns in the context of electronic submission processes in general and the British Journal of Educational Technology in particular.
  • Implications for practice and/or policy
Electronic submissions allow for the collection of much more data (both public and private) on authors, editors and referees. These data should not be hidden but used to inform research and practice. In particular, open review (where the names of the authors and the referees are known to all concerned in the refereeing process) is possible and achievable.


Conclusions: a personal view

In this article I have discussed some of the current practices used by editors, authors and referees when using electronic submission and publishing systems. Some of these practices are more open than others, but I believe, in this information age of WikiLeaks and Twitter, that little—if anything—should be hidden from different contributors to the total system. Thus, I feel that it is System 5 that we should be aiming for when it comes to refereeing. What little evidence there is (as opposed to opinion) suggests that, with open refereeing, there will be some improvement in the quality of the reports received and an increase in the number of reviewers recommending publication but that there will be a decrease in the number of reviewers willing to review. (This evidence can be found in the studies of Bingham et al, 1998; Smith, 1999; van Rooyen, Godlee, Evans, Black & Smith, 1999; Walsh, Rooney, Appleby & Wilkinson, 2000). However, no one to my knowledge has studied the effects of open peer reviewing and, because there is so little empirical evidence on these issues, it would be remiss of me to advocate strongly any one particular approach. It would be good, nonetheless, to obtain more evidence on these issues in the future.

Note: Source and Fulltext Available To Subscribers

Hartley, J. (2012), Refereeing academic articles in the information age. British Journal of Educational Technology, 43: 520–528. doi: 10.1111/j.1467-8535.2011.01211.x

Related Blog Post Available At