Saturday, June 2, 2012

Scientific Utopia: I. Opening Scientific Communication

Brian A. Nosek, Yoav Bar-Anan / Submitted on 4 May 2012

Existing norms for scientific communication are rooted in anachronistic practices of bygone eras, making them needlessly inefficient. We outline a path that moves away from the existing model of scientific communication to improve the efficiency in meeting the purpose of public science: knowledge accumulation. We call for six changes: (1) full embrace of digital communication, (2) open access to all published research, (3) disentangling publication from evaluation, (4) breaking the "one article, one journal" model with a grading system for evaluation and diversified dissemination outlets, (5) publishing peer review, and (6) allowing open, continuous peer review. We address conceptual and practical barriers to change, and provide examples showing how the suggested practices are being used already. The critical barriers to change are not technical or financial; they are social. While scientists guard the status quo, they also have the power to change it.

Comments > Psychological Inquiry, 2012
Subjects > Physics and Society (physics.soc-ph); Digital Libraries (cs.DL)
Cite as > arXiv:1205.1055v1 [physics.soc-ph]
Submission history > From: Brian Nosek [view email]  [v1] Fri, 4 May 2012 17:27:17 GMT (555kb)

Source and Link to Fulltext Available At 


Colloquium on Rethinking the Future of Scientific Communication

SUMMARY: "In March 2012, an impressive roster of leaders in the technology and communication industries (including Anurag Acharya, Google Scholar; Phil Bourne, PLoS Computational Biology; Paul Saffo, Foresight, Discern Analytics; among other accomplished editors, librarians, publishers and graduate students) participated in a colloquium on the Stanford University campus. Convened by Nader Rifai, Harvard Medical School; Michael Keller, Stanford University Libraries; and John Sack, HighWire Press, the discussions focused on identifying innovative ways to create continuity across the scientific community ecosystem in order to keep pace with the revolutionary transformation of search and discovery by the popular press. The executive summary provides more information on this progressive colloquium."

Links to Overview and Executive Summary Available At


Does Open Access Publishing Increase Citation or Download Rates?

Issue 28 - May 2012

Dr Henk Moed / May 2012

The effect of "Open Access" (OA) on the visibility or impact of scientific publications is one of the most important issues in the fields of bibliometrics and information science. During the past 10 years numerous empirical studies have been published that examine this issue using various methodologies and viewpoints. Comprehensive reviews and bibliographies are given, amongst others, by OPCIT ..., Davis and Walters ... and Craig et al. ... . The aim of this article is not to replicate nor update these thorough reviews. Rather, it aims to present the two main methodologies that were applied in these OA-related studies and to discuss their potentialities and limitations. The first method is based on citation analyses; the second on usage analyses.


Source and Fulltext Available At  


UK RepositoryNet+

About RepositoryNet+



UK RepositoryNet+ (RepNet) is a socio-technical infrastructure supporting deposit, curation and exposure of Open Access research literature.  [snip]. ... [Its] general approach is to envision a mix of distributed and centrally delivered service components under pro-active management, operation and support. While this infrastructure will be designed to meet the needs of UK research, it is set, and must operate effectively, within a global context.

Open Access

An entry in Wikipedia defines Open Access (OA) to research literature as the practice of providing unrestricted access via the Internet to peer-reviewed scholarly journal articles and, increasingly, to theses, scholarly monographs and book chapters. It also notes that OA comes in two degrees: Gratis OA, which is no-cost online access; and Libre OA, which is Gratis OA plus some additional usage rights. [snip].

There are various forms of mixed and hybrid Open Access (OA), but the main distinction is between Green and Gold. [snip].


The aim of the RepNet project is to increase the cost effectiveness of repositories ... by offering a sustained and well-used suite of services that facilitate cost effective operation.  Specifically RepNet will:

  • Scope and deliver repository and curation services via a production environment that offers economies of scale and scope
  • Set up a production environment for repository shared services which works closely with the proposed innovation zone.
  • Provide market research/ intelligence, quality assurance, business case and sustainability planning to support the project.


As an infrastructure hub, RepNet will facilitate efficient service delivery and service support, e.g. through provision of a professional helpdesk, development of appropriate service level definitions against which service levels can be monitored, and liaison around the scope and need for improvement of service components.


Sustainability planning is a key outcome, operating at several levels. Institutional support of repositories ultimately requires that they meet institutional objectives ... . The central task for RepNet is to provide sustainable infrastructure with service-quality components that assist cost-effective ingest, quality improvement and continuity of access for repository content.

First steps and priorities

First steps in this project have been the development of technical infrastructure and of a suite of shared services.  The project sought to fund integration of components that were production ready, or close to it, and were already in use by the community.  [snip]. In the second ‘wave’ more attention will be given to scoping the curation requirements, and to identifying or specifying the components required to meet those requirements.


The RepNet Project Board meets monthly, and the Advisory Board meets on a quarterly basis. The RepNet team meets monthly, with minutes and notes from all meetings posted on the project wiki. Regular and ad hoc meetings of the SIPG group are also minuted and published on the project wiki.


Source Available Via


Friday, June 1, 2012

Two Architects of Library Discovery Tools Launch an Altmetrics Venture

Michael Kelley / May 31 2012

“Our beta customers will be helping us prioritize the next sources from which to harvest,” Buschman said.

Two prominent veterans of the library vendor world recently launched a startup company which aims to capitalize on the rapidly flowering field of altmetrics.

Andrea Michalek and Mike Buschman had been the director of technology and director of product management, respectively, for ProQuest’s Summon discovery service since its inception. But the pair left the company in November 2011 and in January founded Plum Analytics, deciding that altmetrics presented enough promise to justify surrendering such prominent positions.


After raising money from friends, family, and angel investors, the duo demoed the public alpha product on March 14 at the monthly Philly Tech Meetup (see video).

“Since that time, we have been talking to libraries interested in becoming beta customers to help build out the next level of the product, as well as take the opportunity to define the next generation of impact metrics,” Buschman said. [snip].


Altmetrics (short for alternative metrics) provides a new way to measure the impact of scholarly communication. Rather than rely solely on the traditional and slow measure of citations in peer-reviewed articles (the impact factor), altmetrics provides a complementary, instant measurement window that includes all Web-based traces of research communication. It pulls together all the usage data about each individual output a researcher has produced.


Plum Analytics and similar ventures in the field aggregate metrics, collected via open APIs, from sources ranging from Twitter and blogs to open access repositories that publish article-level metrics (such as PLoS), as well as from data repositories, scholarly social bookmarking sites (e.g., Mendeley or CiteULike), code source repositories (GitHub), presentation sharing sites (SlideShare), grant funding data, link shortener metrics, and more.
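As a rough illustration of that aggregation step, the sketch below merges per-source metric records into one profile per research output, keyed by DOI. The source names, DOIs, and counts are hypothetical stand-ins for what a service would actually harvest from each provider's open API; this is not Plum Analytics' implementation.

```python
from collections import defaultdict

# Hypothetical per-source metric records (source, DOI, metrics). In a real
# service these would be harvested from each provider's open API.
SOURCE_RECORDS = [
    ("twitter",  "10.1371/journal.pone.0000001", {"tweets": 50}),
    ("mendeley", "10.1371/journal.pone.0000001", {"readers": 120}),
    ("github",   "10.1371/journal.pone.0000001", {"forks": 4}),
    ("twitter",  "10.1371/journal.pone.0000002", {"tweets": 3}),
]

def aggregate_by_output(records):
    """Merge metrics from all sources into one dict per research output (DOI)."""
    merged = defaultdict(dict)
    for source, doi, metrics in records:
        for name, count in metrics.items():
            key = f"{source}:{name}"  # namespace each metric by its source
            merged[doi][key] = merged[doi].get(key, 0) + count
    return dict(merged)

profile = aggregate_by_output(SOURCE_RECORDS)
```

Namespacing each metric by source (`twitter:tweets` vs. `mendeley:readers`) keeps counts from different providers distinct while still collecting them under a single output.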

Plum Analytics is wading into an incipient but very active field, as this user group on Mendeley and the Twitter hashtag #altmetrics show. In addition to the article-level metrics application that PLoS has been developing, services similar to Plum Analytics, such as CitedIn, ReaderMeter, and Science Card, have also emerged.

One of the more prominent services is Total-Impact, which Jason Priem, a third-year doctoral student at the University of North Carolina at Chapel Hill’s School of Information and Library Science (SILS), and Heather Piwowar, a postdoctoral research associate at the National Evolutionary Synthesis Center (NESCent) in Durham, have developed. [snip]


In the case of Plum Analytics, the metrics are presently based on 13 main data sources, which Buschman said they are adding to (including sources more important to social sciences and humanities). Users will likely be allowed to weight the data as they choose (e.g., 50 Tweets equal one “like”).
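A minimal sketch of that kind of user-chosen weighting follows. The weights and metric names are illustrative (taking the article's "50 tweets equal one like" example literally), not Plum Analytics' actual scheme.

```python
# Illustrative user-chosen weights: 50 tweets carry the same weight as one
# "like", so a single tweet is worth 1/50 of a like.
WEIGHTS = {"tweets": 1 / 50, "likes": 1.0, "citations": 5.0}

def weighted_score(metrics, weights):
    """Combine raw per-source counts into a single weighted impact score.

    Metrics with no configured weight contribute nothing to the score.
    """
    return sum(weights.get(name, 0.0) * count for name, count in metrics.items())

# 50 tweets (weight 1.0 total) + 2 likes + 1 citation (weight 5.0)
score = weighted_score({"tweets": 50, "likes": 2, "citations": 1}, WEIGHTS)
```

Letting users supply the weight table keeps the raw counts intact while allowing each discipline to decide how much a tweet, a bookmark, or a citation should matter.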


Wendy Pradt Lougee, the university librarian at the University of Minnesota, said the library there has a very close partnership with the university’s office of research, exploring ways of revealing more data about researchers, including metrics beyond citations, and rolling out SciVal from Elsevier and the Harvard Profiles research networking software. But attitudes toward altmetrics can vary considerably depending on the disciplinary context.

“Faculty are very discerning in how they are represented and the reputational value of different publication venues and metrics,” Lougee said. “We have seen a growing interest in research networking systems and tools that help move beyond just citations and represent a faculty’s research repertoire more fully ... .”

Nevertheless, university librarians such as Lougee and Sarah Michalak at the University of North Carolina at Chapel Hill are keeping a close eye on developments, even if they are not yet ready to plunge headfirst into altmetrics.

“We need to interest ourselves in new ways of measuring the impact of scholarship and new and powerful kinds of information tools,” Michalak said, noting that librarians were among the first to see the possibilities in the school’s Reach NC project.

Buschman of Plum Analytics saw the library as a natural ally. The data becomes a tool that libraries can use to help researchers determine which forms of communication generate the most meaningful interaction with their research and also track forms of impact that are not contained in the citation record.


However, pulling together all the usage data about each individual output a researcher has produced presents a number of technological challenges. [snip].


Befitting their background as architects of Summon, Buschman and Michalek also are attempting to build a commercial-grade product that can scale up to the challenge of loading the research output of millions of researchers, along with the data sources that report on it.

“The flexibility of data analysis at scale is the sweet spot of our solution,” Buschman said.  “We are building toolsets not just for collecting the article level metrics, but also for mapping the hierarchy of the institution and the affiliations of the researchers.”


“In the long term, this is a chance for libraries to lead the way into a web-native, post-journal world of scholarly communication, in which aggregated conversation and review by experts replaces the slow, closed, inefficient, atavistic system we’ve inherited from the last century,” said Priem of Total-Impact.

Source and Fulltext Available At