Monday, December 3, 2012

Awareness, Attitudes and Participation of Teaching Staff Towards the Open Content Movement in One University / Peter Reed

This research investigates current awareness of, and participation in, the open content movement at one UK higher education institution. The open content movement and open educational resources can be seen as potential methods for reducing the time and cost of technology-enhanced learning developments; however, their sustainability and, to some degree, their success depend on critical mass and large-scale participation. Teaching staff were invited to respond to a questionnaire. Respondents (n=59) were open to the idea of sharing their own content and, as in other studies, demonstrated existing practices of sharing resources locally amongst colleagues; however, there was little formal, large-scale sharing under suitable licenses. The data gathered concur with other research suggesting a lack of awareness of the Creative Commons licenses as well as a lack of participation in large open educational resource repositories.
Keywords: open educational resources; staff attitudes; sustainability

(Published: 22 October 2012)

Citation: Research in Learning Technology 2012, 20: 18520

Source and Links to Full Text Available At 


Saturday, December 1, 2012

Visualizing Tweets Linking to a Paper

Martin Fenner / Posted: July 14, 2012

DNA Barcoding the Native Flowering Plants and Conifers of Wales has been one of the most popular new PLoS ONE papers in June. In the paper Natasha de Vere et al. describe a DNA barcode resource that covers the 1143 native Welsh flowering plants and conifers.

My new job as technical lead for the PLoS Article Level Metrics (ALM) project involves thinking about how we can best display the ALM collected for this and other papers. We want these ALM to tell us something important and/or interesting, and it doesn’t hurt if the information is displayed in a visually appealing way. There are many different ways this can be done, but here I want to focus on Twitter and CiteULike, the only two data sources where PLoS is currently storing every single event (tweet or CiteULike bookmark) with a date. Usage data (HTML and XML views, PDF downloads) are aggregated on a monthly basis, and PLoS doesn’t store the publication dates of citations.

We know from the work of Gunter Eysenbach and others that most tweets linking to scholarly papers are written in the first few days after publication. It therefore makes sense to display this information on a timeline covering the first 30 days after publication, and the tweets about the de Vere paper follow the same pattern.
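A minimal sketch of the kind of binning such a timeline implies (a hypothetical illustration, not the actual PLoS ALM code): given a publication date and the date of each tweet, count events per day for the first 30 days.

```python
from datetime import date

def tweet_timeline(publication_date, tweet_dates, days=30):
    """Count tweets per day for the first `days` days after publication."""
    counts = [0] * days
    for d in tweet_dates:
        offset = (d - publication_date).days
        if 0 <= offset < days:
            counts[offset] += 1
    return counts

# Example: a burst of tweets right after publication, then a long tail.
pub = date(2012, 6, 6)
tweets = [date(2012, 6, 6), date(2012, 6, 6), date(2012, 6, 7), date(2012, 6, 20)]
print(tweet_timeline(pub, tweets)[:3])  # → [2, 1, 0]
```

The per-day counts can then be fed to any charting library; the dates used here are invented for illustration.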


Source and Full Text Available At 


altmetrics12 > An ACM Web Science Conference 2012 Workshop

Evanston, IL • 21 June 2012

Keynotes (9:00-10:00)

  • Johan Bollen
  • Gregg Gordon

Coffee break (10:00-10:30)

Paper presentations (10:30-1:00)

Position and theory papers, 10min each (10:30-11:30)

  • Martin Fenner / Altmetrics will be taken personally at PLoS (presentation)
  • William Gunn and Jan Reichelt / Social metrics for research: quantity and quality (presentation)
  • Elizabeth Iorns / Reproducibility: an important altmetric
  • Britt Holbrook / Peer review, altmetrics, and ex ante broader impacts assessment – a proposal
  • Kelli Barr / The Role of altmetrics and Peer Review in the Democratization of Knowledge (chalkboard notes)

Empirical papers, 15min each (11:30-1:00)

  • Judit Bar-Ilan / JASIST@mendeley
  • Jasleen Kaur and Johan Bollen / Structural Patterns in Online Usage (presentation)
  • Vincent Larivière, Benoit Macaluso, Staša Milojević, Cassidy R. Sugimoto and Mike Thelwall / Of caterpillars and butterflies: the life and afterlife of an arXiv e-print
  • Jason Priem, Heather Piwowar and Bradley Hemminger / Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact (presentation)
  • Jennifer Lin / A Case Study in Anti-Gaming Mechanisms for Altmetrics: PLoS ALMs and DataTrust (presentation)
  • Richard Price / Altmetrics and

Lunch on your own (1:00-2:00p)

Demos (2:00-3:00p)

  • total-impact (Heather Piwowar)
  • (Euan Adie) (presentation)
  • PLoS ALM (Martin Fenner)
  • Ubiquity Press metrics (Brian Hole)
  • Plum Analytics (Andrea Michalek)
  • BioMed Central metrics (Ciaran O’Neill)
  • (Richard Price)
  • Knode (David Steinberg)
  • CASRAI (David Baker)
  • Mendeley and ReaderMeter (William Gunn)

Group discussion (3:00-4:30p)

We’ll split into small groups to discuss key altmetrics issues; topics may include:

  • Gaming: how might it happen, and how do we stop it?
  • Standards: We’ve got COUNTER for downloads; should there be standards for other altmetrics? What should they look like?
  • Visualization: There’s a lot of data. How should we display it?
  • Peer review: Could altmetrics replace traditional peer review? Should it? Can we build new publishing models around altmetrics?
  • CVs and “impact dashboards”: What does an altmetrics-informed CV look like? Who wants (and doesn’t want) one?
  • Publishers: What do publishers want from altmetrics services? How about readers and authors?
  • Normalization: How do we compare metrics from different fields or disciplines?

Group presentations and discussion (4:30-5:30p)

Summing up (5:30-6:00p)

Conclusion (Summarize key points from live and online discussion)

Open discussion: what does the next year of altmetrics look like?


Source and Presentation Links Available At 


Slideshare > NISO Webinar: Beyond Publish or Perish: Alternative Metrics for Scholarship

November 14, 2012
1:00 - 2:30 p.m. (Eastern Time)


About the Webinar

Increasingly, many aspects of scholarly communication—particularly publication, research data, and peer review—undergo scrutiny by researchers and scholars. Many of these practitioners are engaging in a variety of ways with Alternative Metrics (#altmetrics in the Twitterverse). Alternative Metrics take many forms but often focus on efforts to move beyond proprietary bibliometrics and traditional forms of peer referencing in assessing the quality and scholarly impact of published work. Join NISO for a webinar that will present several emerging aspects of Alternative Metrics.

Source and Q&A and Slideshare Only Available At


Reusing, Revising, Remixing and Redistributing Research

An OA Week guest post by Daniel Mietchen

The initial purpose of Open Access is to enable researchers to make use of information already known to science as part of the published literature. One way to do that systematically is to publish scientific works under open licenses, in particular the Creative Commons Attribution License that is compatible with the stipulations of the Budapest Open Access Initiative and used by many Open Access journals. It allows for any form of sharing of the materials by anyone for any purpose, provided that the original source and the licensing terms are shared alongside. This opens the door for the incorporation of materials from Open Access sources into a multitude of contexts both within and outside traditional academic publishing, including blogs and wikis.

Amongst the most active reusers of Open Access content are Wikimedia projects like the more than 280 language editions of Wikipedia, Wikispecies and their shared media repository, Wikimedia Commons. In the following, a few examples of reusing, revising, remixing and redistributing Open Access materials in the context of Wikimedia projects shall be highlighted.


Source and Full Text Available At 


Slideshare > Current and Future Effects of Social Media-Based Metrics on Open Access and IRs

The Power of Post Publication Review: A Case Study

There are many discussions and examples of post-publication review as an alternative to the currently more common peer-review model. While this comes up fairly regularly in my Twitter stream, I don’t think I’ve done more than hint at it within the blogposts here. I’ve also been watching (but neglecting to mention here) the emergence of data journalists and data journalism as a field, or perhaps I should say co-emergence, since it seems to be tightly coupled with shifts in the field of science communication and communicating risk to the public. Obviously, these all tie in tightly with the ethical constructs of informed consent and shared decision-making in healthcare (the phrase from the 1980s), which is now more often called participatory medicine.

That is quite a lot of jargon stuffed into one small paragraph. I could stuff it equally densely with citations to sources on these topics, definitions, and debates. Instead, for today, I’d like to give a brief overview of a case I’ve been privileged to observe unfolding over the weekend. If you want to see it directly, you’ll have to join the email list where this took place.


Source and Full Text Available At 


Open Access and Its Impact on the Future of the University Librarian

We are shifting from content ownership by individual libraries to joint provision of services on a larger scale, says Stephen Barr

With the publication of the Finch report earlier this year and the UK government's announcement that it will commit £10m to help make research findings freely available, there has been a gear shift towards more rapid movement into an open access world for the publishing of scholarly information.

While there has been a lot of discussion around what that shift means for academic publishers, and there is now a lively dialogue between researchers and scholars in different disciplines, there seems to have been less discussion of what this shift means for libraries and librarians. Yet the move towards open access is a profound change for the whole infrastructure of scholarly communication, and is bound to have impacts on the library as it does on other parts of the process.


Source and Full Text Available At 


YouTube > Article-Level Metrics

YouTube > Alt Metrics -- A Funder's Perspective


Duration = ~14:00

From Bibliometrics to Altmetrics: A Changing Scholarly Landscape

C&RL News > 73 (10) > November 2012 > Robin Chin Roemer and Rachel Borchardt

When future Science Citation Index founder Eugene Garfield first came up with the idea of journal impact factor in 1955, it never occurred to him “that it would one day become the subject of widespread controversy.”

Today, techniques for measuring scholarly impact—traditionally known as bibliometrics—are well known for generating conflict and concern, particularly as tenure-track scholars reach beyond previously set boundaries of discipline, media, audience, and format. From the development of more nuanced academic specialties to the influence of blogs and social media, questions about the scope of scholarly impact abound, even as the pressure to measure such impact continues unabated or increases.

As faculty at universities around the world struggle to find new ways of providing evidence of their changing scholarly value, many librarians have stepped forward to help negotiate the landscape of both traditional impact metrics, such as the h-index and journal impact factor, and emerging Web-based alternatives, sometimes called altmetrics, cybermetrics, or webometrics. With interest in online venues for scholarly communication on the rise, and the number of tools available for tracking online influence growing steadily, librarians are in a key position to take the lead in bolstering researchers’ knowledge of current trends—and concerns—in the new art and science of impact measurement.


Source and Full Text Available At  


Scholarly Metrics with a Heart

Last week I attended the PLOS workshop on Article Level Metrics (ALM). As a disclaimer, I am part of the PLOS ALM advisory Technical Working Group (not sure why :). Alternative article-level metrics refer to any set of indicators that might be used to judge the value of a scientific work (or researcher, institution, etc.). As a simple example, an article that is read more than average might correlate with scientific interest in or popularity of the work. There are many interesting questions around ALMs, starting with the simplest: do we need any metrics at all? The only clear observation is that more of the scientific process is being captured online and measured, so we should at least explore the uses of this information.


Source and Full Text and Links Available At 


Open Post-Publication Peer Review


Beyond open access, which is generally considered desirable, the essential drawbacks of the current system of scientific publishing are all connected to the particular way that peer review is used to evaluate papers. In particular, the current system suffers from a lack of quality and transparency in the peer review process, a lack of availability of evaluative information about papers to the public, and excessive costs incurred by a system in which private publishers are the administrators of peer review. These problems can all be addressed by open post-publication peer review.


Source and Links To Full Text and Brief and Full Arguments Available At


Beyond Open Access: Visions for Open Evaluation of Scientific Papers by Post-Publication Peer Review


This Research Topic in Frontiers in Computational Neuroscience collects visions for a future system of open evaluation. Because critical arguments about the current system abound, these papers will focus on constructive ideas and comprehensive designs for open evaluation systems. Design decisions include:

  • Should the reviews and ratings be entirely transparent, or should some aspects be kept secret?
  • Should other information, such as paper downloads, be included in the evaluation?
  • How can scientific objectivity be strengthened and political motivations weakened in the future system?
  • Should the system include signed and authenticated reviews and ratings?
  • Should the evaluation be an ongoing process, such that promising papers are more deeply evaluated?
  • How can we bring science and statistics to the evaluation process (e.g. should rating averages come with error bars)?
  • How should the evaluative information about each paper (e.g. peer ratings) be combined to prioritize the literature?
  • Should different individuals and organizations be able to define their own evaluation formulae (e.g. weighting ratings according to different criteria)?
  • How can we efficiently transition toward the future system?
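The question of whether rating averages should come with error bars can be made concrete with a small sketch (my own illustration, not a design from the collected papers): report the mean rating together with its standard error, which narrows as more reviewers rate a paper.

```python
import math

def rating_summary(ratings):
    """Mean rating plus its standard error (a natural error-bar half-width)."""
    n = len(ratings)
    mean = sum(ratings) / n
    if n < 2:
        return mean, float("inf")  # a single rating says nothing about spread
    variance = sum((r - mean) ** 2 for r in ratings) / (n - 1)  # sample variance
    return mean, math.sqrt(variance / n)  # standard error of the mean

few = rating_summary([7, 9])                      # two reviewers: wide error bar
many = rating_summary([7, 9, 8, 8, 7, 9, 8, 8])   # eight reviewers: narrower
print(few, many)
```

Both lists of ratings are invented; the point is only that the same mean of 8 comes with a much smaller uncertainty once more reviewers have weighed in.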


Source and Full Text and Articles Links Available At 




The Digital Scholar: How Educators Can Be Part of the Digital Transformation / Martin Weller

Publication Date: 2011 / Pages: 256 /DOI:

While industries such as music, newspapers, film and publishing have seen radical changes in their business models and practices as a direct result of new technologies, higher education has so far resisted the wholesale changes we have seen elsewhere. However, a gradual and fundamental shift in the practice of academics is taking place. Every aspect of scholarly practice is seeing changes effected by the adoption and possibilities of new technologies. This book will explore these changes, their implications for higher education, the possibilities for new forms of scholarly practice and what lessons can be drawn from other sectors.

Table of Contents

  • Acknowledgements
  • Digital, Networked and Open
  • Is the Revolution Justified?
  • Lessons from Other Sectors
  • The Nature of Scholarship
  • Researchers and New Technology
  • Interdisciplinarity and Permeable Boundaries
  • Public Engagement as Collateral Damage
  • A Pedagogy of Abundance
  • Openness in Education
  • Network Weather
  • Reward and Tenure
  • Publishing
  • The Medals of Our Defeats
  • Digital Resilience
  • References

Source and Full Text Available At 




Article-level Metrics: Which Service to Choose?

26 Oct, 12 /  Claire Bower, Digital Comms Manager

Article-level metrics (or ALMs) were a hot topic at this week’s HighWire publisher meeting in Washington. (HighWire hosts both the BMJ and its stable of 42 specialist journals). From SAGE to eLife, publishers seem sold on the benefits of displaying additional context to articles, thereby enabling readers to assess their impact. These statistics range from traditional indicators, such as usage statistics and citations, to alternative values (or altmetrics) like mentions on Twitter and in the mainstream media.

So, what services are available to bring this information together in one simple interface? There are quite a few contenders in this area, including Plum Analytics, the PLoS Article-Level Metrics application, Science Card, CitedIn and ReaderMeter. One system in particular has received a good deal of attention in the past few weeks: ImpactStory, a relaunched version of total-impact. It’s a free, open-source webapp that’s been built with financial help from the Sloan Foundation (and others) “to help researchers uncover data-driven stories about their broader impacts”.


Source and Full Text Available At 


Scientists Seek New Credibility Outside of Established Journals

Altmetrics: An App Review > Stacy Konkiel

Keywords: altmetrics; bibliometrics


Date: 2012-10-07

Rights: Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)

Rights URL:

Type: Presentation


In a university culture increasingly influenced by metrics, academic libraries can use altmetrics to highlight scholarship’s hidden value. This session will cover the apps and services that can help faculty, administration, and librarians learn the full, true impact of research.


Presented at OCLC Innovation in Libraries post-conference event, LITA Forum 2012.

Source and Links Available At 


YouTube Video 


Duration = ~23:30