Today’s networked information environment is creating new responsibilities for “curating” your online record—and new tools to help do just that.
The Web has opened up myriad opportunities for finding information, for international collaboration and for doing science. But the Web’s vast scale has also created new tasks related to managing your online persona. For researchers, these tasks go well beyond creating a LinkedIn profile or maintaining a Twitter feed. How do you ensure that your research output is findable, and tied to you alone online? How do you keep track of the impact of your overall scientific output on an increasingly fragmented Web?
In this column, we touch on a few of the many online services that are springing up to help scientists manage and assess their online reputations.
A scientist’s career is largely defined by his or her publication record; in an era of automated search and indexing, it makes sense to tie your publications unambiguously to yourself. A number of tools exist to create that tie.
One of the highest-profile tools for researcher identity management is the ORCID ID. ORCID is a nonprofit organization of researchers, publishers, academic institutions and others that offers the individual scholar a unique, persistent identifier and provides tools to link that identifier to the individual’s research output. Since fall 2012, when the service was opened to the public, more than 515,000 scholars have set up ORCID IDs and some 120 member organizations, including OSA, have joined ORCID.
Having an ORCID ID lets your research contributions (publications, data sets, patents, professional association activities and employment history) remain linked to you even if you change jobs, get a new email address, move to a new country or even change your name. It can also connect content identifiers from other indexing services like Scopus, CrossRef and ResearcherID to your record. ORCID does not collect personal details such as gender, address or phone number, and you can make any part of your ORCID record private.
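The identifier itself is designed to be machine-checkable: an ORCID ID is sixteen characters, and the last character is a check digit computed with the ISO 7064 MOD 11-2 algorithm (it can be "X", representing the value 10). A minimal sketch of that validation, for anyone wiring ORCID IDs into a submission or repository system:

```python
def valid_orcid(orcid: str) -> bool:
    """Validate an ORCID ID's final character against its
    ISO 7064 MOD 11-2 check digit."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for ch in digits[:-1]:          # first 15 characters must be digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    expected = (12 - total % 11) % 11
    check = "X" if expected == 10 else str(expected)
    return digits[-1] == check
```

A quick sanity check against ORCID's own documented sample ID, 0000-0002-1825-0097, confirms the arithmetic; a validator like this catches most transcription errors before a bad ID ever reaches a database.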
Registration with ORCID is free and open to researchers in academia or industry, at any level, from any discipline and from any country. The registry service is available in five languages: English, French, Spanish, traditional Chinese and simplified Chinese.
ORCID offers membership and subscription options to universities and research organizations, publishers, professional associations, repositories and funders, which can use ORCID data for diverse purposes such as conducting faculty searches, viewing a researcher's current and past funding and auto-populating fields in manuscript submission systems. ORCID actively encourages its members to integrate its persistent identifiers into their workflows (as OSA has done in its Prism peer review system), so that a researcher's ORCID record is updated as soon as new metadata is entered.
What sets ORCID apart from other contributor ID systems, such as PubMed IDs and Thomson Reuters’s ResearcherID? ORCID is different, says executive director Laure Haak, “because IDs are collected in research workflows so that they become a part of the document as it happens. This means better data accuracy and less time spent cleaning up databases.”
A profile on Google Scholar offers another way to keep track of your published content, as well as to find other articles that cite your work.
You can set up a profile from the Google Scholar home page, where you will be asked to create or log in to your Google account. You will then be walked through a series of steps to create your profile, including entering your affiliation and areas of interest. Entering your university email address (if you have one) is a good idea, because it makes your profile eligible for inclusion in Google Scholar search results. Finally, you will be asked to select your published works from a list of articles potentially related to you, based on the information you provided about yourself.
As new material is published under your name, Google Scholar gives you the opportunity to claim it. You can also make your profile private or public. If you make it public and someone searches for your name, your profile page will appear at the top of the search results.
Having a Google Scholar profile allows you to attribute journal articles, theses, books, abstracts and documents from professional societies, online repositories and universities directly to you. It is also a tool for networking with current or potential colleagues and exploring work across many disciplines.
One benefit of organizing all of your research output under a single identity, such as an ORCID ID, is that it offers a convenient window into the full scope and impact of your work online. But how is that impact measured?
Traditionally, the scientific community has assessed an article’s impact based on the journal in which it is published, and article citation data and journal impact factor remain the default evaluation criteria of tenure committees, funders and others. However, the manner in which people find, access and share research is changing, and the prominence of online search has resulted in a focus on the impact of individual articles as well as that of the journals they’re published in.
In light of that shift, some advocate an alternative set of “article-level” metrics. Known collectively as “altmetrics,” these include online usage statistics, social-media uptake (Twitter mentions, Facebook likes, etc.), blog postings and press releases, among others. These new methods of assessing influence can be applied not only to articles but to other research outputs like data sets, software or slide presentations.
Some publishers have picked up on this trend and are offering selected article-level statistics for their authors and readers. In 2012, for example, OSA added charts and graphs that reflect full-text download and abstract request numbers, along with citation data, to a “metrics” tab on the article abstract page in each of its journals.
A number of third-party firms provide this data and other social-media metrics as well. Altmetric.com, for example, offers publishers, librarians and repository managers, as well as researchers, a tool that links to an article's mentions and shares in social media and assigns the article a score based on these and other measures. Plum Analytics, an altmetrics provider acquired by EBSCO in early 2014, maintains a database called PlumX that gathers metrics across five categories: usage, mentions, captures, social media and citations. Another third-party tool, ImpactStory, is geared more toward individuals, and allows researchers to create profiles populated with altmetrics data on their body of work.
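At their core, dashboards like these roll raw event counts from many sources up into a handful of categories. A minimal sketch of that roll-up, using a hypothetical source-to-category map loosely modeled on PlumX's five groups (the actual providers use their own, much larger taxonomies and proprietary scoring):

```python
from collections import defaultdict

# Hypothetical mapping of raw event sources to summary categories,
# for illustration only; real providers maintain their own taxonomies.
CATEGORIES = {
    "downloads": "usage",
    "abstract_views": "usage",
    "tweets": "social_media",
    "facebook_likes": "social_media",
    "blog_posts": "mentions",
    "news_stories": "mentions",
    "bookmarks": "captures",
    "citations": "citations",
}

def summarize(events: dict) -> dict:
    """Roll per-source event counts up into category totals."""
    totals = defaultdict(int)
    for source, count in events.items():
        totals[CATEGORIES.get(source, "other")] += count
    return dict(totals)
```

For instance, an article with 120 downloads, 80 abstract views and 14 tweets would report 200 "usage" events and 14 "social media" events; the interesting (and contested) part, as the critics below note, is what those totals actually mean.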
Proponents of altmetrics argue that these measures provide a more complete and accurate picture of an article's importance, and provide it faster than citation counting and peer review. While citations remain a key part of judging impact, they suggest, altmetrics can capture the broader engagement and conversation surrounding research, as well as how many people are reading and reusing it.
However, what altmetrics actually mean for assessing an article’s impact is still unclear. Critics point out that there is potential for gaming of the numbers; likes and mentions can be bought from companies that specialize in artificially inflating one’s social-media presence. There are also questions about the merit of social-media uptake and download statistics when the sources of the numbers are unknown—do tweets about an article really matter if they don’t come from scholars in the field? Are people sharing an article because they read and understand it, or because it has a buzzword in the title? Not all factors that might contribute to an article’s Internet popularity are scientifically relevant.
Although the future of altmetrics in academia is uncertain, the subject is clearly generating a lot of debate. One indication: The National Information Standards Organization (NISO) is working on a set of standards and recommended practices, expected by late 2015, to help the community understand whether and how best to use these measures. It’s a conversation that could be well worth following.
Sarah Michaud is OPN’s associate editor. Hannah Bembia is OSA’s publishing and special projects manager.