Tuesday, December 30, 2008

MESUR For Measure: MEtrics from Scholarly Usage of Resources

MESUR: MEtrics from Scholarly Usage of Resources


The project's major objective is to enrich the toolkit used to assess the impact of scholarly communication items, and hence of scholars, with metrics derived from usage data. The project has created a semantic model of the scholarly communication process, and an associated large-scale semantic store that relates a range of bibliographic, citation and usage data obtained from a variety of sources.

After mapping the structure of the scholarly community on the basis of the established reference data set, MESUR will conduct an investigation into the definition and validation of a range of usage-based metrics. The defined metrics will be cross-validated, resulting in the formulation of guidelines and recommendations.
MESUR Database

The MESUR database now contains 1B usage events (2002-2007) obtained from 6 significant publishers, 4 large institutional consortia and 4 significant aggregators! The collected usage data spans more than 100,000 serials (including newspapers, magazines, etc.) and is related to journal citation data that spans about 10,000 journals and nearly 10 years (1996-2006). In addition we have obtained significant publisher-provided COUNTER usage reports that span nearly 2000 institutions worldwide.

The data is being ingested into a combination of relational and semantic web databases, the latter of which is now estimated to result in nearly 10 billion semantic statements (triples). MESUR is now producing large-scale, longitudinal maps of the scholarly community and a survey of more than 60 different metrics of scholarly impact.
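The post does not show MESUR's actual schema, but the core idea of a triple store (relating usage events, articles, and journals as subject-predicate-object statements) can be sketched in miniature. The sketch below is mine; the entity and predicate names are hypothetical and do not reflect MESUR's real ontology.

```python
# Minimal triple-store sketch. All names below are hypothetical,
# NOT MESUR's actual ontology.

triples = set()

def add(subject, predicate, obj):
    triples.add((subject, predicate, obj))

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [
        t for t in triples
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]

# Usage events relate articles to journals and event types.
add("event:1", "type", "fulltext_download")
add("event:1", "article", "doi:10.1000/xyz123")
add("event:2", "type", "abstract_view")
add("event:2", "article", "doi:10.1000/xyz123")
add("doi:10.1000/xyz123", "publishedIn", "journal:J-Comp-Biol")

# How many usage events touch this article?
events = query(predicate="article", obj="doi:10.1000/xyz123")
print(len(events))  # 2
```

At MESUR's scale the same pattern holds, but the billions of statements live in dedicated relational and semantic web databases rather than an in-memory set.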

Quick Facts

Funding: The Andrew W. Mellon Foundation / Timeline: October 2006 - October 2008

Principal investigator: Johan Bollen / Institution: Los Alamos National Laboratory (LANL) / Team: Digital Library Research & Prototyping Team of the LANL Research Library

People: Johan Bollen is the Principal Investigator; Herbert Van de Sompel serves as an architectural consultant; and Aric Hagberg of the LANL Mathematical Modeling and Analysis group serves as modeling consultant.

Marko A. Rodriguez, a recent PhD graduate of the University of California, Santa Cruz, and now a post-doc at the LANL Center for Non-Linear Science, has supported the project's research and development. Ryan Chute of the LANL Research Library is now the project’s main developer and database manager.




Overview Papers / Lectures and Slides / MESUR Timeline / Metrics / MESUR Official Summary


MESUR Publications

Articles / Lectures and Slides






See Also

A Principal Component Analysis of 39 Scientific Impact Measures


Monday, December 29, 2008

Final Impact: What Factors Really Matter?

Columbia University Libraries. Information Services. Scholarly Communications Program / Fostering Innovation in Scholarly Communication /

Research Without Borders: The Changing World of Scholarly Communication

Final Impact: What Factors Really Matter?

October 30, 2008 / 3:00 pm to 5:00 pm EDT

Columbia University Medical Center / Hammer Health Sciences Center / 701 W. 168th Street / Room 401

A panel discussion on the debate about the best way to rank the importance and influence of scholarly publications.


  • Marian Hollingsworth, director of Publisher Relations at Thomson Reuters and former assistant director of the National Federation of Abstracting and Information Services [Start: 4:15]

  • Jevin West, an Achievement Awards for College Scientists Fellow at the University of Washington's Biology Department and head developer for Eigenfactor.org [Start: 20:10]

  • Johan Bollen, a staff researcher at Los Alamos National Laboratory and the principal investigator of the MESUR project [Start: 61:45]

Columbia University Librarian Jim Neal introduces the talk [Start: 00:00]

A/V Available at

[http://scholcomm.columbia.edu/content/final-impact-what-factors-really-matter] [Duration: 106:02 minutes]

>>PDF/PPT Slides NOW Available [01-05-09]<<



Also Available As A Podcast



"The Scholarly Communication Program is pleased to present a speaker series for the 2008-09 academic year on today's pivotal issues in scholarly communication. Six events will explore how scholars and researchers can take advantage of new and powerful ways of creating, sharing, reusing, and preserving knowledge."


Research Without Borders: The Changing World of Scholarly Communication Programs Also Available As A Podcast


Research Without Borders: The Changing World of Scholarly Communication Facebook Global Group


Friday, December 12, 2008

CHE: Bringing Tenure Into The Digital Age: Q&A With Christine L. Borgman

Bringing Tenure Into the Digital Age
New tools for analyzing information are arriving every day, but that doesn’t mean scholars who use them well are being rewarded, says Christine L. Borgman, a professor of information studies at the University of California at Los Angeles. She contends that the new “scholarly information infrastructure” must be shaped with collaborative, interdisciplinary research.

Q. In your recent book, “Scholarship in the Digital Age,” you contend that the tenure system needs to reward people for contributions to collaborative digital projects instead of recognizing only those who publish books and articles. Why?

A. Data is becoming a first-class object. In the days of completely paper publication, the article or book was the end of the line. And once the book was in libraries, the data were often thrown away or allowed to deteriorate.

Now we’re in a massive shift. Data become resources. They are no longer just a byproduct of research. And that changes the nature of publishing, how we think about what we do, and how we educate our graduate students. The accumulation of that data should be considered a scholarly act as well as the publication that comes out of it.


Q. Do you have any tips for the young scholar who feels deluged and overwhelmed?

A. Look for good data that have already been generated and are available. It’s the old saw about how an hour in the library can keep you from spending 60 hours in the lab. It’s similar in research nowadays. Finding good data that someone else has done, that you can build upon, is time well spent. [And] find partners that complement your expertise.

Q. What is your prescription when it comes to building infrastructure that makes all this information available?

A. We need a new conversation. We need to determine what we should be building, instead of just figuring if we build it, they will come. We’ve spent a lot of money on the technology without asking a lot of questions about the nature of scholarship.

When we do ask those questions, we will come up against entrenched interests, like the way we publish and get tenure. So we need to consider the policies and incentives for the reward system and for the use and reuse of information. These will need to change. —Lisa Guernsey


Monday, December 1, 2008

Workshop: Making The Web Work For Science

Making the Web Work for Science:

The Impact of e-Science and the Cyber-Infrastructure

A One-Day Workshop Co-sponsored by CENDI and NFAIS and Hosted by FLICC

Library of Congress, 101 Independence Ave, SE, Washington, DC 20540

Mumford Room / December 8, 2008 / 9:00am - 4:30pm

AGENDA (11-4-08)

8:30am - 9:00am: Registration/Coffee

9:00am - 9:15am:
Welcome / Opening Remarks Roberta Shaffer, Director of FLICC, Library of Congress

9:15am - 10:00am: Making the Web Work for Science: The Current Landscape

The opening keynote will provide an overview of how the Web is currently being utilized for the advancement of science and scholarly communication. Roberta Shaffer will introduce Dr. Christine Borgman, Professor & Presidential Chair in Information Studies, University of California, Los Angeles, and author of Scholarship in the Digital Age: Information, Infrastructure, and the Internet.

10:00am - 10:15am: Break and Networking Opportunity

10:15am - 11:45am: Making the Web Work for Science: The Content Providers’ Perspective

This session will focus on how innovative content providers, including Federal STI program leaders, librarians, and publishers, are leveraging current Web technologies in order to maximize global access to and use of scientific and scholarly information. The use of Web 2.0 features such as wikis, RSS feeds, and blogs will be discussed, as will plans for the future.

The panel participants are Dr. Walter Warnick, Director, Office of Scientific and Technical Information, Department of Energy; Dr. Sayeed Choudhury, Johns Hopkins University; and Howard Ratner, Executive Vice President and Chief Technology Officer, Nature Publishing Group. Karen Spence, DOE/OSTI, will moderate.

11:45am - 12:45pm: Lunch

12:45pm - 2:00pm: Making the Web Work for Science: What Scientists Really Need!

In this session, two practicing scientists will discuss their use of conventional and Web-based information tools for scientific research, what works and what does not, and what they believe the information community needs to provide in order to maximize the full potential of the Web as an effective and essential resource for scientific discovery.

The panel participants are Dr. Antony Williams, Founder, ChemSpider; and Dr. Alberto Conti, Astrophysicist, Space Telescope Science Institute. Jill O’Neill, NFAIS, will moderate.

2:00pm - 3:30pm: Making the Web Work for Science: Challenges to Implementation

In this session, three experts will discuss the technological, legal and cultural challenges that all organizations – libraries, publishing institutions, scientific laboratories, etc. – must overcome so that each can utilize the full potential of the Internet and the Web in the fulfillment of their common mission: to build the world’s knowledgebase by enabling research and managing the flow of scholarly communication.

The participants are Dr. Michael R. Nelson, Visiting Professor at Georgetown University; Fred Haber, Vice President and General Counsel, Copyright Clearance Center; Dr. Michael Nielsen, Physicist and Science Writer, Perimeter Institute for Theoretical Physics (Canada). Bonnie C. Carroll, Executive Director of the CENDI Secretariat, will moderate.

3:30pm - 3:45pm: Break

3:45pm - 4:30pm: Making the Web Work for Science: What the Future Holds

This final keynote will explore the future promise of the Web and the various ways in which the cyber-infrastructure can ultimately re-engineer not only how scientific research is conducted, but also how the resultant information is communicated, shared, verified, and built upon as scientists and scholars around the globe increasingly collaborate in building the world’s knowledgebase of scientific and scholarly information.

Ellen Herbst, NTIS Director, will introduce Dr. Christopher Greer, recently of the National Science Foundation’s Cyber-Infrastructure Office, and currently the Director of Networking and Information Technology Research and Development (NITRD) of the National Coordination Office.

4:30pm: Adjournment

PDF Version Available At




General Information / Registration / Etc.

There is a two-fee structure for this workshop to allow the sponsors’ and host’s members an opportunity to attend at a reduced cost. CENDI, NFAIS, and FLICC members will be charged $65.00; all others have a registration fee of $95.00.


Saturday, November 8, 2008

Defrosting The Digital Library: Bibliographic Tools For The Next Generation Web

Hull, D., S. R. Pettifer, and D. B. Kell / October 2008 / Defrosting The Digital Library: Bibliographic Tools For The Next Generation Web / PLoS Comput Biol 4 (10), e1000204+

Extensive Bibliography (210 Items)

Many scientists now manage the bulk of their bibliographic information electronically, thereby organizing their publications and citation material from digital libraries. However, a library has been described as “thought in cold storage,” and unfortunately many digital libraries can be cold, impersonal, isolated, and inaccessible places.

In this Review, we discuss the current chilly state of digital libraries for the computational biologist, including PubMed, IEEE Xplore, the ACM digital library, ISI Web of Knowledge, Scopus, Citeseer, arXiv, DBLP, and Google Scholar. We illustrate the current process of using these libraries with a typical workflow, and highlight problems with managing data and metadata using URIs.

We then examine a range of new applications such as Zotero, Mendeley, Mekentosj Papers, MyNCBI, CiteULike, Connotea, and HubMed that exploit the Web to make these digital libraries more personal, sociable, integrated, and accessible places.

We conclude with how these applications may begin to help achieve a digital defrost, and discuss some of the issues that will help or hinder this in terms of making libraries on the Web warmer places in the future, becoming resources that are considerably more useful to both humans and machines.

Source and Full Text

Friday, October 10, 2008

Current Biomedical Publication System: A Distorted View of the Reality of Scientific Data?

Why Current Publication Practices May Distort Science

Young NS, Ioannidis JPA, Al-Ubaydli O

PLoS Medicine Vol. 5, No. 10, e201 / October 7 2008



The current system of publication in biomedical research provides a distorted view of the reality of scientific data that are generated in the laboratory and clinic. This system can be studied by applying principles from the field of economics. The “winner's curse,” a more general statement of publication bias, suggests that the small proportion of results chosen for publication are unrepresentative of scientists' repeated samplings of the real world.
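The "winner's curse" claim can be illustrated with a toy simulation (my sketch, not the authors' economic model): many labs estimate the same small true effect with sampling noise, but only the largest, most impressive-looking estimates get published, so the published average overstates the truth.

```python
# Toy illustration of the "winner's curse" in publication.
# All parameter values are illustrative assumptions, not from the paper.
import random

random.seed(42)

TRUE_EFFECT = 0.2        # the real (modest) effect size
NOISE_SD = 1.0           # sampling noise in each lab's estimate
N_LABS = 10_000
PUBLISH_THRESHOLD = 1.0  # only impressive-looking estimates get published

estimates = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_LABS)]
published = [e for e in estimates if e > PUBLISH_THRESHOLD]

mean_all = sum(estimates) / len(estimates)
mean_published = sum(published) / len(published)

print(f"mean of all estimates:       {mean_all:.2f}")        # close to 0.2
print(f"mean of published estimates: {mean_published:.2f}")  # well above 0.2
```

Because publication selects on the upper tail of a noisy distribution, the published estimates are unrepresentative of the repeated samplings that produced them, which is exactly the distortion the essay describes.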

The self-correcting mechanism in science is retarded by the extreme imbalance between the abundance of supply (the output of basic science laboratories and clinical investigations) and the increasingly limited venues for publication (journals with sufficiently high impact). This system would be expected intrinsically to lead to the misallocation of resources. The scarcity of available outlets is artificial, based on the costs of printing in an electronic age and a belief that selectivity is equivalent to quality.

Science is subject to great uncertainty: we cannot be confident now which efforts will ultimately yield worthwhile achievements. However, the current system abdicates to a small number of intermediates an authoritative prescience to anticipate a highly unpredictable future. In considering society's expectations and our own goals as scientists, we believe that there is a moral imperative to reconsider how scientific data are judged and disseminated.

Full Text Available At:


Box 1. Potential Competing or Complementary Options and Solutions for Scientific Publication

  • Accept the current system as having evolved to be the optimal solution to complex and competing problems.
  • Promote rapid, digital publication of all articles that contain no flaws, irrespective of perceived “importance”.
  • Adopt preferred publication of negative over positive results; require very demanding reproducibility criteria before publishing positive results.
  • Select articles for publication in highly visible venues based on the quality of study methods, their rigorous implementation, and astute interpretation, irrespective of results.
  • Adopt formal post-publication downward adjustment of claims of papers published in prestigious journals.
  • Modify current practice to elevate and incorporate more expansive data to accompany print articles or to be accessible in attractive formats associated with high-quality journals: combine the “magazine” and “archive” roles of journals.
  • Promote critical reviews, digests, and summaries of the large amounts of biomedical data now generated.
  • Offer disincentives to herding and incentives for truly independent, novel, or heuristic scientific work.
  • Recognise explicitly and respond to the branding role of journal publication in career development and funding decisions.
  • Modulate publication practices based on empirical research, which might address correlates of long-term successful outcomes (such as reproducibility, applicability, opening new avenues) of published papers.

>>>Extended Version<<<

The Market for Exchange of Scientific Information: The Winner’s Curse, Artificial Scarcity, and Uncertainty in Biomedical Publication


Guest Blog

More Evidence on Why We Need Radical Reform of Science Publishing / Richard Smith

PLoS Medicine invited Richard Smith, former editor of the BMJ and current board member of PLoS, to discuss an essay published this week by Neal Young, John Ioannidis and Omar Al-Ubaydli that argues that the current system of publication in biomedical research provides a distorted view of the reality of scientific data.

"For me this paper simply adds to the growing evidence and argument that we need radical reform of how we publish science. I foresee rapid publication of studies that include full datasets and the software used to manipulate them without prepublication peer review onto a large open access database that can be searched and mined. Instead of a few studies receiving disproportionate attention we will depend more on the systematic reviews that will be updated rapidly (and perhaps automatically) as new results appear."


News Coverage

The Economist [10-09-08] : Scientific Journals: Publish and Be Wrong

"Dr Ioannidis made a splash three years ago by arguing, quite convincingly, that most published scientific research is wrong. Now, along with Neal Young of the National Institutes of Health in Maryland and Omar Al-Ubaydli, an economist at George Mason University in Fairfax, Virginia, he suggests why."

>>>With Comments<<<


Newsweek [10-06-08] : Don't Believe What You Read, Redux / Sharon Begley

"Bottom line: when it comes to 'the latest studies,' take what you read with a grain of salt."



Why Most Published Research Findings Are False / John P. A. Ioannidis / PLoS Med. 2005 August; 2(8): e124 / August 30 2005



There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance.

Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. In this essay, I discuss the implications of these problems for the conduct and interpretation of research.
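Ioannidis formalizes this with the positive predictive value (PPV) of a claimed finding: if R is the pre-study odds that a probed relationship is true, alpha the type I error rate, and beta the type II error rate, then (ignoring bias and multiple teams) PPV = (1 - beta) * R / (R - beta * R + alpha). A quick computation shows why low prior odds and low power make most claimed findings false; the example numbers below are mine, chosen only for illustration.

```python
def ppv(R, alpha=0.05, beta=0.2):
    """Positive predictive value of a claimed research finding
    (Ioannidis 2005, simplest case: no bias, single team).
    R: pre-study odds that a probed relationship is true
    alpha: type I error rate; beta: type II error rate (power = 1 - beta)."""
    return (1 - beta) * R / (R - beta * R + alpha)

# Well-powered study (power 0.8) in a field where 1 in 10
# probed relationships is real:
print(round(ppv(R=0.1, beta=0.2), 2))  # 0.62

# Underpowered study (power 0.2) in the same field:
print(round(ppv(R=0.1, beta=0.8), 2))  # 0.29 -- most findings false
```

With PPV below 0.5, a claimed finding is more likely false than true, which is the paper's central result.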


Sunday, September 14, 2008

The Wiki: An Environment For Scholarly Conversation and Publishing

The Wiki: An Environment For Scholarly Conversation and Publishing

Gerry McKiernan, Associate Professor and Science and Technology Librarian, Iowa State University Library
Ames Iowa USA
September 09 2008 / 18:00 – 18:20

"The Medium Is The Message ... The Audience Is The Content"

Marshall McLuhan. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964.

A "wiki is a ... collaborative space ... because of its total freedom, ease of access, and use, [and] simple and uniform navigational conventions ... ." "[It] ... is also a way to organize and cross-link knowledge ..." Ward Cunningham, Father of The Wiki (Leuf and Cunningham, 2001, 16). Most wikis provide the user with a set of navigation or utility tools such as the ability to create and edit a page, view recently changed pages, and rollback to previous page versions. In addition, many wikis include a discussion forum for proposed page changes.

Among its many perceived benefits are its potential for facilitating a more creative environment and an expanding knowledgebase, and a significant ability to harness the power of diverse points of view in creating collaborative works.

In this presentation, we will speculate on the Wiki as a digital environment that not only supports current scholarly practices, but more importantly, offers a framework for their enhancement and transformation.


A/V For Presentation Available
Windows Presentation, Windows Video File, Flash Presentation, MP3, PDF

==>I Recommend The Flash Presentation<==


BTW: May I Recommend A Most Excellent Bed & Breakfast

If/When You Visit Waterloo

[It's The Home of Research in Motion / BlackBerry]

Sugar Bush Guest House B&B



BTW-2: The Wiki: An Environment for Scholarly Conversation and Publishing is based upon

“Wikis: Disruptive Technologies for Dynamic Possibilities” PowerPoint presentation delivered at Digital Libraries à la Carte: Choices for the Future, Tilburg University, The Netherlands, August 23, 2005


(accessed 21 September 2008)

Friday, September 5, 2008

LiveScience: Era of Scientific Secrecy Near End

Era of Scientific Secrecy Near End / By Robin Lloyd, LiveScience Senior Editor / posted: 02 September 2008 11:30 am ET

Secrecy and competition to achieve breakthroughs have been part of scientific culture for centuries, but the latest Internet advances are forcing a tortured openness throughout the halls of science and raising questions about how research will be done in the future.

The openness at the technological and cultural heart of the Internet is fast becoming an irreplaceable tool for many scientists, especially biologists, chemists and physicists — allowing them to forgo the long wait to publish in a print journal and instead to blog about early findings and even post their data and lab notes online. The result: Science is moving way faster and more people are part of the dialogue.


Open Science

The open science approach forces researchers to grapple with the question of whether they can still get sufficient credit for their ideas, said physicist Sabine Hossenfelder, co-organizer of a conference on the topic set to begin Sept. 8 at the Perimeter Institute in Ontario, Canada.

[BTW: I Will Be Attending This Unique Conference
Science in the 21st Century: Science, Society, and Information Technology [http://tinyurl.com/6ll8fb] / Look For Conference-Related Postings on the _Scholarship 2.0_ Blog [http://scholarship20.blogspot.com/] within the next two weeks]


Open science is a shorthand for technological tools, many of which are Web-based, that help scientists communicate about their findings. At its most radical, the ethos could be described as "no insider information." Information available to researchers, as far as possible, is made available to absolutely everyone.

Beyond email, teleconferencing and search engines, there are many examples:

  • blogs where scientists can correspond casually about their work long before it is published in a journal

  • social networks that are scientist friendly, such as Laboratree and Ologeez

  • GoogleDocs and wikis, which make it easy for people to collaborate via the Web on single documents

  • a site called Connotea that allows scientists to share bookmarks for research papers

  • sites like Arxiv, where physicists post their "pre-print" research papers before they are published in a print journal

  • OpenWetWare, which allows scientists to post and share new innovations in lab techniques

  • the Journal of Visualized Experiments, an open-access site where you can see videos of how research teams do their work

  • GenBank, an online searchable database for DNA sequences

  • Science Commons, a non-profit project at MIT to make research more efficient via the Web, such as enabling easy online ordering of lab materials referenced in journal articles

  • virtual conferences

  • online open-access (and free) journals like the Public Library of Science (PLoS)

  • open-source software that can often be downloaded free off Web sites

[BTW: Several Of These Innovations Have Been Profiled In My SciTechNet(sm) Blog [http://scitechnet.blogspot.com/] and/or The Scholarship 2.0 Blog [http://scholarship20.blogspot.com/]]

The upshot: Science is no longer under lock and key, trickling out as it used to at the discretion of laconic professors and tense PR offices. For some scientists, secrets no longer serve them. But not everyone agrees.

Networked Cyborgs

Just a few decades ago, as a scientist, here is how you did your work: You toiled in obscurity and relative solitude.


However, today, more and more scientists, as well as researchers in the humanities, operate like transparent, networked cyborgs. Background research is mostly done online, not in the library. Some data and preliminary research might be posted online via a blog or open notebook. Early write-ups of the work might be announced to the public, or at least discussed online with peers. And these early write-ups might also be posted to an online publication that is not peer-reviewed in the strict sense.


"In areas like my own subfields of theoretical physics," said MIT physicist David Kaiser, "the only constraint [on how rapidly one generates research papers] is, 'Did you have more coffee that day?' We aren't usually held up trying to get an instrument to work, or slogging through complicated data analysis."

Most people think faster is better, but there are other issues.

Is It A Good Thing?

There is "no question" that all efforts to make science more open are positive for the progress of science, says open science proponent and chemist Jean-Claude Bradley at Drexel University in Philadelphia, who posts his lab notebook online and in 2005 started a blog called UsefulChemistry, where he and his colleagues regularly discuss chemistry problems as well as Web 2.0 tools and the technical and philosophical issues they raise.

His online notebook and blog definitely make it easier to communicate with colleagues, he said. Such sharing also makes it easier for others to "replicate" scientists' work — try it themselves and convince themselves that you are right. And this replication issue is one of the principles behind scientific research. Anyone who has written down a recipe for a friend knows that we all tend to spell things out more clearly when sharing them than we would if we were just taking notes for ourselves in our own shorthand.

Open science also has the potential to prevent discrimination in access to information. Arxiv, the site for posting pre-print physics papers, was started in 1991 by Cornell physicist Paul Ginsparg, then at Los Alamos National Laboratory, to help provide equal access to prepublication information to graduate students, postdocs and researchers in developing countries.

[BTW: Paul Ginsparg will be one of several Major Players attending/presenting at The Conference]


And open science benefits the public, Bradley said. He tries to keep his posts fairly accessible (although this is not the case for all open notebooks and open science blogs).


"It's not clear to me that professional scientists or people in academic institutions have a monopoly on good ideas," he said. "There are very smart people outside of academia, for example hobbyists or people in industry who could contribute, and having more contributors can only help. The same applies to interdisciplinary and cross-disciplinary approaches."


Drawbacks of Open Science

One of the biggest fears of nearly all researchers is that someone else hears what you're doing and beats you to publication. That means you wasted a lot of time (and most researchers work extremely long hours, so loss of productivity is especially painful and can also harm one's chances for getting a job or promotion or funding for the next research project). Once you publicly reveal your thoughts, data or experimental results, some say, you lose control over ownership of that information. This topic is covered by an area of law called intellectual property, as well as patent law, and there can be significant money to be fought over when it comes to patents.

Hossenfelder, the conference organizer, says she knows of several examples in which scientists have had an idea for something, talked about it openly and then somebody else has published the fleshed-out idea first without giving any credit beyond an acknowledgment to the original idea-holder. Acknowledgments don't advance careers.

However, there are solutions to this, she said. For instance, the prominent scientific journal Nature encourages authors to include brief summaries of which author contributed what to a project. Some say that online posts provide a time-stamped record of when an experiment was documented. Those stamps can easily be arbitrarily altered after the fact, but it might also be possible to "lock" posts at a certain date after which they could not be changed without some sign-off permission to break the lock, Hossenfelder said. [snip]
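One simple way to make such a time-stamped record tamper-evident is to chain each post's content hash with the previous record's hash, so silently altering an old post invalidates every later record. The sketch below illustrates that general idea; it is my own illustration, not a mechanism Hossenfelder describes.

```python
# Sketch of a hash-chained, append-only record of research posts.
# Illustrative only; not a system described in the article.
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first record

def make_record(content, timestamp, prev_hash):
    """Build a record whose hash covers the content, the timestamp,
    and the previous record's hash."""
    payload = json.dumps(
        {"content": content, "timestamp": timestamp, "prev": prev_hash},
        sort_keys=True,
    )
    return {
        "content": content,
        "timestamp": timestamp,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    }

def verify_chain(records):
    """Recompute every hash; any edit to an earlier post breaks the chain."""
    prev = GENESIS
    for r in records:
        expected = make_record(r["content"], r["timestamp"], prev)["hash"]
        if r["prev"] != prev or r["hash"] != expected:
            return False
        prev = r["hash"]
    return True

chain, prev = [], GENESIS
for content, ts in [("first result posted", "2008-09-01"),
                    ("follow-up data", "2008-09-05")]:
    rec = make_record(content, ts, prev)
    chain.append(rec)
    prev = rec["hash"]

print(verify_chain(chain))           # True
chain[0]["content"] = "edited later"
print(verify_chain(chain))           # False
```

In practice a trusted third party or public ledger would also need to witness the chain head, since whoever holds the records could otherwise recompute the whole chain; the sketch only shows why a chained hash makes quiet, after-the-fact edits detectable.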

Fear of Losing Peer Review

Another drawback of open science can be that results go public before they should. In science, experimental results are frequently proven wrong by subsequent work. Yet even peer review cannot ensure against this, nor can it prevent outright fraud, as proven by a 2005 case involving a South Korean scientist who claimed to have achieved the first cloning of a human embryo. A later examination of his work showed he had fabricated his results.


"The social system of science has become so complicated, unregulated and dispersed in terms of geography and disciplines, so peer review has been elevated to a principle that unifies a fragmented field," Biagioli said.


And today, Arxiv, one of the most frequently cited examples of open science, has no peer review for individual papers, but it has begun to add in some constraints on allowable authors. The site used to allow anyone with email addresses associated with academic institutions to post their papers. Now, authors of research papers who post in Arxiv are vetted before they can post for the first time. In some ways, things are tightening up when it comes to openness in physics, Kaiser said. In any case, the function of print journals, in physics at least, is changing.

"Ease of sharing everything prior to peer review is flourishing, and in my opinion very few physicists are reading journals for information these days," Kaiser said. "Journals have largely lost their information function."


For The Good Of Truth, Humanity, Economies?

Another argument in favor of open science is sort of a big picture issue for humanity, scientific truth and economies, Neylon said.

"Making things more open leads to more innovation and more economic activity, and so the technology that underlies the Web makes it possible to share in a way that was never really possible before, while at the same time it also means that the kinds of models and results generated are much more rich," he said.

This is the open source approach to software development, as opposed to commercial closed source approaches, Neylon said. The internals are protected by developers and lawyers, but the platform is available for the public to build on in very creative ways.

"Science was always about mashing up, taking one result and applying it to your [work] in a different way," Neylon said. "The question is 'Can we make that work as effectively for samples, data and analysis as it does for a map and a set of addresses for a coffee shop?' That is the vision."


Thanks to Sabine Hossenfelder For The HeadsUp !


Sunday, August 31, 2008

NeoNote: User Centered Design Suggestions for a Global Shared Scholarly Annotation System

NeoNote: User Centered Design Suggestions for a Global Shared Scholarly Annotation System (Abstract)

Brad Hemminger, UNC/CH School of Information and Library Science

Information Seeking Support Systems Workshop / An Invitational Workshop Sponsored by the National Science Foundation / June 26-27 2008 / Chapel Hill, NC USA


Significant changes are occurring in scholarly communication due to technological innovations, primarily the advent of computers and the internet. Compared to thirty years ago, scholars now use computers both to do their work and to write their papers. They still publish in journals, but articles are available and accessed more in digital format than in print. Scholars increasingly share their work with others by putting it on their websites, depositing it in institutional repositories, or emailing it to colleagues. They use collaborative tools for writing papers, or create shared content on wikis. Scholars are beginning to compile large digital collections of research papers instead of print collections. They are beginning to annotate these papers electronically, and to share these annotations. They save citation information digitally in citation managers and automatically incorporate this information when writing their papers. They start their searches more often with search engines than with traditional library catalog resources, and they spend most of their time searching for and looking at information in web browsers.

There has been a proliferation of tools developed to help support scholars in performing this myriad of activities, but in almost all cases, the tools are designed for only a specific task or environment. As a result, scholars are forced to utilize many different incompatible tools to perform their scholarly work and communication. This paper looks at the problem from the scholar’s perspective and proposes a user interface well suited to scholarly work practices. It also proposes a paradigm for how scholarly annotations could be captured, stored, searched and re-used on both a global scale, and at the level of an individual research laboratory or group. It is our hope that the paradigms proposed in this paper will provoke discussions in the digital library community about shared global designs for digital library content, including annotations.

As part of a user centered design process to develop a global shared annotation system, information from surveys [Hemminger ASIST 2007], our interviews with scientists at UNC, and feedback from users of scholarly annotation tools were analyzed to generate a list of features scholarly researchers required. Currently, there is no single comprehensive system designed to address all users' needs. There are, however, many individual applications that provide excellent support for one or more features needed by the scholarly researcher. This paper describes the desired features of a single comprehensive system, gives examples of current tools that support these features, and then describes the architecture of our resulting design.



NeoNote: A User Interface for "Memex" (Duration 8:11)



NeoNote: Suggestions for a Global Shared Scholarly Annotation System / Bradley Hemminger, University of North Carolina, Chapel Hill / D-Lib Magazine / May/June 2009 / Volume 15 Number 5/6 / doi:10.1045/may2009-hemminger


Friday, August 29, 2008

Mememoir: The Radical Scientific Wiki Engine

Scientific Wiki Solves The 'Who Wrote What' Problem

Next Generation Wiki [Engine] Links Every Word To Its Author

Reporting in Nature Genetics, scientist Robert Hoffmann introduces the first wiki where authorship really matters. Based on powerful authorship tracking technology, this next-generation wiki links every word to its corresponding author. This way, readers can always know their sources and authors receive due credit.

The history of a collaborative wiki article can become extremely complex within a few editing cycles. Someone creates a paragraph; someone else deletes a sentence, inserts a word here and there, and so forth. "How could the reader of such an article know who wrote what?" asks Dr. Robert Hoffmann, Society in Science fellow and visiting scientist at the Massachusetts Institute of Technology (MIT).

In first generation wikis, this information could theoretically be found in the archives, but in practice, it is impossible for a reader to reconstruct the authorship of specific texts from hundreds of previous versions. This has been the root cause of a lasting suspicion against wikis in academia and the business world, since the uncertainty as to the source of a single word can decrease the value of a collaborative text in its entirety.
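The article does not spell out the algorithm, but the core idea (each token keeps its original author until someone changes it) can be sketched with a standard diff. The names, tokenization, and function below are illustrative assumptions, not WikiGenes' actual implementation:

```python
import difflib

def attribute(old_tokens, old_authors, new_tokens, editor):
    """Propagate per-token authorship across one edit: tokens kept
    from the previous revision retain their original author, while
    inserted or replaced tokens are credited to the current editor."""
    authors = []
    matcher = difflib.SequenceMatcher(a=old_tokens, b=new_tokens)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "equal":
            authors.extend(old_authors[i1:i2])    # unchanged text keeps its author
        else:
            authors.extend([editor] * (j2 - j1))  # new/changed text goes to the editor
    return authors

# revision 1: alice writes a sentence
r1 = "wikis let anyone edit".split()
a1 = ["alice"] * len(r1)

# revision 2: bob inserts one word; only that word is attributed to him
r2 = "scientific wikis let anyone edit".split()
a2 = attribute(r1, a1, r2, "bob")  # ['bob', 'alice', 'alice', 'alice', 'alice']
```

Applied at every save, this kind of attribution never needs to trawl hundreds of archived versions: the authorship of the current text is always up to date.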

Apart from being an important guidance to the reader, authorship is often key to a successful academic and professional career. Authorship provides an important basis to establish priority of ideas and discoveries and to build a reputation among peers. "It is only fair to duly acknowledge authors, who invest time and knowledge in their contributions," Hoffmann says in his article.

Clear authorship attribution in this next-generation wiki also makes it possible for users to rate each other based on their contributions. For the first time, collaborative publishing can therefore be enhanced with the advantages of a reputation system. Hoffmann describes how a self-regulating reputation system can help settle editing conflicts, which were an important problem in first-generation wikis and used to depend on slow and contestable top-down decisions.

The scientific wiki project, introduced in the September issue of Nature Genetics and released online today, is the first of its kind and a milestone in the Mememoir project. "This release is an important proof of principle, but our ambitious aim with the Mememoir project is to revolutionize publishing in all of science," says Dr. Hoffmann, "with a knowledge base that is open access, interdisciplinary and combines the altruistic possibilities of wikis with explicit authorship."

The first scientific wiki system of the Mememoir project has been released online today at WikiGenes.

Source [http://www.mememoir.org/]

Robert Hoffmann / A Wiki for the Life Sciences Where Authorship Matters / Nature Genetics / Volume 40 / Number 9 / 1047-1051 / September 2008 / Published online 27 August 2008 / doi:10.1038/ng.f.217

Sample Text



WikiGenes is a collaborative knowledge resource for the life sciences, which is based on the general wiki idea but employs specifically developed technology to serve as a rigorous scientific tool. The rationale behind WikiGenes is to provide a platform for the scientific community to collect, communicate and evaluate knowledge about genes, chemicals, diseases and other biomedical concepts in a bottom-up process.


In WikiGenes, authorship tracking technology is used to link every contribution unambiguously to its author, creating the first hybrid of traditional, scientific and collaborative, dynamic publishing ... . This technical innovation in WikiGenes also supports the other central function of authorship as guidance for the reader. Authorship is essential to appraise origin, authority and reliability of information. This is especially important in the wiki model, with its dynamic content and large number of authors.


How could the reader of such an article know who wrote what? In first generation wikis, this information can theoretically be found in the archives and attempts have been made to establish reliability measures, but in practice, it is impossible for a user to reconstruct the authorship of specific text passages from hundreds of previous versions.

The uncertainty as to the source of specific texts is therefore an important problem in dynamic publications and decreases the value of articles in their entirety. In WikiGenes, on the contrary, new contributions are identified with every editing step and attributed to their authors. Thus readers can always know the corresponding author of any part of a WikiGenes article.


Future prospects

The technological innovation in WikiGenes is central to the attempt to turn the wiki model into a rigorous scientific tool. To this aim it is also important to provide a framework that supports the contribution of novel and original research. Clear authorship attribution facilitates this essentially, but the integrative and harmonizing forces in dynamic publications tend to work against original and novel views. In WikiGenes, authors are therefore provided with the option to create protected articles with a limited number of selected co-authors. These articles cannot be edited by others, but they can still be linked to the encyclopedic core and discussed and rated by everyone. This way, it would be possible in the near future to publish original research and establish priority of discoveries and theories.


[http://www.nature.com/ng/journal/v40/n9/full/ng.f.217.html] (Subscriber Access)

[http://www.nature.com/ng/journal/v40/n9/pdf/ng.f.217.pdf] (Subscriber Access)



WikiGenes Introduction & Tutorial


Sample 'Author' Contribution Page


Sample 'Author' Contribution


See Also


Thanks / Bernie Sloan / Sora Associates / Bloomington, Indiana / For The HeadsUp

Thursday, August 28, 2008

Lowering the Interactive Cost Of Tagging Systems: SparTag.us and Click2Tag

SparTag.us and Click2Tag: Lowering the Interactive Cost Of Tagging Systems

Tagging systems such as del.icio.us and Diigo have become important ways for users to organize information gathered from the Web. However, despite their popularity among early adopters, tagging still incurs a relatively high interaction cost for general users.

To understand the costs of tagging, for each of these systems, we performed a GOMS-like analysis of the interface and identified the overall number of steps involved in tagging. We count these steps to get a gross measure of the tagging costs ... [snip]


We introduce a new tagging system called SparTag.us, which uses an intuitive Click2Tag technique to provide in situ, low cost tagging of web content. In SparTag.us, we bring the tagging capability into the same browser window displaying the web page being read. When a user loads a web page in his browser, we augment the HTML page with AJAX code to make the paragraphs of the web pages as well as the words of the paragraphs live and clickable.

As users read a paragraph, they can simply click on any word in the paragraph to tag it. SparTag.us also lets users highlight text snippets and automatically collects tagged or highlighted paragraphs into a system-created notebook, which can later be browsed and searched. We are currently conducting an internal PARC beta test of this tool, and hope to release it for public use in the near future.
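PARC has not published the implementation, but the augmentation step described above can be approximated in a few lines: wrap every word inside each paragraph in a span that a click handler can later turn into a tag. This is a minimal server-side sketch; the class name and the regex-based HTML handling are assumptions, and SparTag.us itself does the augmentation in the browser with AJAX:

```python
import re

def augment(html):
    """Wrap each word inside <p>...</p> in a span, so client-side
    script can attach click-to-tag handlers to individual words."""
    def wrap_words(match):
        words = match.group(2).split()
        spans = " ".join(f'<span class="c2t">{w}</span>' for w in words)
        return f"{match.group(1)}{spans}{match.group(3)}"
    return re.sub(r"(<p>)(.*?)(</p>)", wrap_words, html, flags=re.S)

page = "<p>tagging lowers cost</p>"
augmented = augment(page)
# '<p><span class="c2t">tagging</span> <span class="c2t">lowers</span> <span class="c2t">cost</span></p>'
```

A real system would use a proper HTML parser rather than a regex, but the principle is the same: the page a user reads already contains the hooks needed for one-click tagging.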


Augmented Social Cognition / Augmented Social Cognition Research Group at Palo Alto Research Center (PARC).


Lichan Hong, Ed H. Chi, Raluca Budiu, Peter Pirolli, and Les Nelson. SparTag.us: Low Cost Tagging System for Foraging of Web Content. In Proceedings of the Advanced Visual Interface (AVI 2008), pp. 65-72. ACM Press, 2008.




PARC Forum Webcast

Enhancing the Social Web Through Augmented Social Cognition Research / Ed Chi / PARC Augmented Social Cognition Group

May 1, 2008 / 4:00 p.m. / George E. Pake Auditorium / Palo Alto, CA, USA


New Age Tagging = SharedTags(sm) | TagFont(sm) | TagSort(sm)


.... Tag Queries for a Thursday Early Afternoon ....

I. Are there implementations/technologies that can display the degree of association / co-occurrence of Tags within a corpus (and enable one to navigate in one way or another (e.g., tag cloud and/or other visualization))?

II. Are there implementations/technologies that allow one to designate the relative importance of a tag for a section and/or corpus of text? Major / Minor Importance Would Be A Good Beginning (e.g., Bold vs. Non-Bold)

BTW: Certainly The Ability To Indicate Relative Importance by Font Size / Style Would Be More Interesting [:-)

[And Let's Not Forget About Color [http://tinyurl.com/53vk36]]

III. Are there implementations/technologies that allow one to sort sub-document tagged text by select criteria? For example: the ability to (re)sort tagged text sections (e.g., paragraphs) such that the most relevant sections are displayed before those of less relevance (e.g., sections with tags that are more 'associated' or co-occurring)

BTW: Tagging Is Not Limited To 'Text' (Can Also Apply To Photos (Flickr), Videos (YouTube), Other Media).
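As a starting point for question I, the 'degree of association' between tags can be read off raw co-occurrence counts over the corpus; a tag cloud or graph visualization could then map the counts to font size or edge weight. A minimal sketch with made-up tags:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(tagged_items):
    """Count how often each pair of tags appears on the same item."""
    pairs = Counter()
    for tags in tagged_items:
        for a, b in combinations(sorted(set(tags)), 2):
            pairs[(a, b)] += 1  # pair stored in sorted order so (a, b) == (b, a)
    return pairs

items = [
    {"tagging", "folksonomy", "web2.0"},
    {"tagging", "web2.0"},
    {"folksonomy", "metadata"},
]
co = cooccurrence(items)
# co[("tagging", "web2.0")] == 2; co[("folksonomy", "tagging")] == 1
```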

Please Post Responses / Thoughts As Comment(s) On This Blog Entry



Wednesday, August 27, 2008

Tagging @ Sub-Document Level(s)?


Are you aware of any current applications/technologies that allow one to Tag a document at the **Paragraph** (or Chapter) level (and Not Only At The Document Level)?

I am also interested in Tag Cloud technologies that allow for the visualization of Tags associated with selected Paragraphs (or Book Chapters or Other Formal Parts Of A Work).



Saturday, August 16, 2008

T-As-In-Team: Management Through Collaboration

Management Through Collaboration: Teaming in a Networked World (Routledge publishers, 2010)

Charles Wankel / Author and Organizer / St. John’s University, New York, USA /

The idea is that this book will be produced using an immense network of coauthors. The chapters will present text, examples, and exercises using networking in a globalized world as a prism through which the key management functions are refracted in telling, useful and important ways. This introductory management textbook uses a new authoring structure to create a high-quality, cutting-edge, and well-researched book.

The coauthors of this breakthrough endeavor number almost a thousand management educators and researchers in about ninety nations. The twenty-first century global virtual community creating this work is itself an interesting constellation of management phenomena that provides a wide range of exciting management experiences for its members to use as examples in their teaching and writing. More importantly, being part of such a diverse, constantly self-creating, mob of innovators is immense fun! It is our hope that our contributions from Tonga to Peru, from Iceland to Botswana, from Hawaii to Tunisia, from China to Grenada, will reflect our diversity and yet our commonality in this increasingly connected world in ways that will engage and excite learners in all the nations of the world.



1: Managing the New Workplace: Collaborating in the organization
2: Historical Context of Contemporary Management: From Individual Stars to Winning Teams

3: Shaping Corporate Culture
4: Managing in a Global Environment
5: Ethics and Corporate Social Responsibility
6: Entrepreneurship and E-commerce

7: Organizational Planning and Goal Setting
8: Strategy Formulation and Implementation
9: Managerial Decision Making
10: Global Management

11: Organizing in a Networked World
12: Structures for Coordinating in a High Tech World
13: Change at All Levels and Speeds
14: Human Resource Management
15: Diversity in Multicultural Organizations

16: Attitudes, Perceptions, Learning and Stress
17: Leadership in Organizations
18: Motivation in Organizations
19: Communicating in Organizations
20: Teamwork in Organizations

21: The Importance of Control
22: Information Technology and E-Business
23: Operations and Service Management









Citation Style
Follow The Chicago Manual of Style, newest edition, for citation and other stylistic formats.

Microsoft Word

Textual material for the book should be submitted in Microsoft Word (Windows PC version).


When will I be assigned to a chapter team?

Currently we are registering authors into chapter wikis. As new colleagues join the project, they will be registered within a week of completing the authors' survey.

What is the general project timeline?

The draft of the main paper-form textbook is due on December 1st, 2008. However, the digital form and ancillaries can be worked on after that. The book comes out in January 2010.

See Also

"Management Professor Uses 'Crowdsourcing' to Write Textbook"


Sunday, August 10, 2008

Use And Misuse Of Bibliometric Indices In Evaluating Scholarly Performance

Ethics In Science And Environmental Politics / THEME SECTION / The Use And Misuse Of Bibliometric Indices In Evaluating Scholarly Performance

Editors: Howard I. Browman, Konstantinos I. Stergiou

Quantifying the relative performance of individual scholars, groups of scholars, departments, institutions, provinces/states/regions and countries has become an integral part of decision-making over research policy, funding allocations, awarding of grants, faculty hiring, and claims for promotion and tenure. Bibliometric indices (based mainly upon citation counts), such as the h-index and the journal impact factor, are heavily relied upon in such assessments. There is a growing consensus, and a deep concern, that these indices, more and more often used as a replacement for the informed judgement of peers, are misunderstood and are, therefore, often misinterpreted and misused. The articles in this ESEP Theme Section present a range of perspectives on these issues. Alternative approaches, tools and metrics that will hopefully lead to a more balanced role for these instruments are presented.
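Of the indices mentioned, the h-index at least has a simple operational definition: a scholar has index h if h of their papers have at least h citations each (Hirsch, 2005). The misuse the editors worry about lies not in the arithmetic, which is trivial, but in what the single number is taken to mean. A minimal sketch:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break
    return h

h_index([10, 8, 5, 4, 3])  # 4: four papers have at least 4 citations each
```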

TITLE PAGE [Preface]

Browman HI, Stergiou KI / INTRODUCTION: Factors and indices are one thing, deciding who is scholarly, why they are scholarly, and the relative value of their scholarship is something else entirely / ESEP 8:1-3

Campbell P / Escape from the impact factor / ESEP 8:5-7

Lawrence PA / Lost in publication: how measurement harms science / ESEP 8:9-11

Todd PA, Ladle RJ / Hidden dangers of a ‘citation culture’ / ESEP 8:13-16

Taylor M, Perakakis P, Trachana V / The siege of science / ESEP 8:17-40

Cheung WWL / The economics of post-doc publishing / ESEP 8:41-44

Tsikliras AC / Chasing after the high impact / ESEP 8:45-47

Zitt M, Bassecoulard E / Challenges for scientometric indicators: data demining, knowledge flows measurements and diversity issues / ESEP 8:49-60

Harzing AWK, van der Wal R / Google Scholar as a new source for citation analysis / ESEP 8:61-73

Pauly D, Stergiou KI / Re-interpretation of ‘influence weight’ as a citation-based Index of New Knowledge (INK) / ESEP 8:75-78

Giske J / Benefitting from bibliometry / ESEP 8:79-81

Butler L / Using a balanced approach to bibliometrics: quantitative performance measures in the Australian Research Quality Framework / ESEP 8:83-92 (Erratum)

Bornmann L, Mutz R, Neuhaus C, Daniel HD / Citation counts for research evaluation: standards of good practice for analyzing bibliometric data and presenting and interpreting results / ESEP 8:93-102

Harnad S / Validating research performance metrics against peer rankings / ESEP 8:103-107