Where’s the evidence? Obstacles to impact-gathering and how researchers might be better supported in future

Despite the increased importance of demonstrating impact, it remains a concept many academics feel ill-equipped to measure or evidence. Clare Wilkinson reveals how researchers from a broad range of disciplines think about evidencing impact, what obstacles might stand in their way, and how they might be further supported in future. Knowledge around research impact continues to exist in siloes, with […]

The Twitter Effect

The session started with a presentation from Brett Butliere, who asked: are more-tweeted papers and topics also more contradicted? Brett and his collaborators had conducted a study of over 160,000 tweets about research outputs to determine whether research that provokes controversy attracts more attention – a trend he noted could potentially be seen in the Altmetric Top 100 lists each year. Amongst the topics most often tweeted about were dinosaurs and obesity – certainly areas that many people have opinions on. The research team concluded that there was some statistical evidence supporting their hypothesis, but noted that more research would be needed and that different methods could also be explored.

Next up was Zohreh Zahedi, presenting her research on the imbalanced use of social media across different countries. Zohreh conducted a cross-country analysis of Twitter use to see whether tweeting about research was more prevalent in certain countries, and the extent to which people tweeted about authors with affiliations in the same country as the tweeter themselves. Zohreh found some fairly significant differences: although an increasing amount of research comes from China, Russia, South America and India, the majority of Twitter activity still takes place in the US and UK, and focuses on research from those countries. She summarized by highlighting that we need to be careful to bear these differences in mind when considering what constitutes ‘high’ vs. ‘low’ amounts of attention, and questioned whether we are in danger of creating an ‘altmetrics divide’.

Fereshteh Didegah then shared some really interesting results from a study of the quality of interactions and engagement around research articles on Twitter. Using 250 articles and their 8,000 associated tweets, Fereshteh and her team classified engagement into three categories: dissemination, consultation and evaluation.

 

The research team also explored issues of false popularity, including harassment and clone accounts, in detail – trying to understand some of their causes and resulting effects. Overall, the research found that a broad array of people are involved in discussions about science on Twitter – and that the majority of that activity focuses on simply sharing rather than adding meaningful commentary.

‘Making piracy fun’ became the (perhaps unintended) tag-line of the next talk, in which Tim Bowman discussed the results of analyzing the use of the #icanhazpdf hashtag. The study aimed to understand what percentage of tweets were requesting documents that were behind a paywall, and what other conversations were taking place around them. By pulling in and matching data from a variety of sources, Tim was able to identify some interesting interactions – including a relatively substantial number of people requesting papers that were in fact already open access. Further digging into the data revealed people using the hashtag to advertise their access to full-text content as a service to others, and many librarians seeking alternative ways to meet the needs of researchers. Use of the hashtag has become so widespread, Tim noted, that social ‘norms’ have emerged – such as deleting the request tweet as soon as the paper has been received. These norms provide a ‘frame’ for people to normalize their subversive sharing activity.

Last up was Rodrigo Costas, who has been doing some really exciting work to improve the way we identify scholars on Twitter. Rodrigo matched article records from Web of Science with attention data from Altmetric to identify researcher Twitter accounts (pointing out the limitations of this along the way – there are many researchers on Twitter who have never shared a paper and therefore do not appear in the Altmetric database), and then cross-referenced those accounts with ORCID records to validate the identity of each tweeter. Rodrigo found over 387,000 scholars with a Twitter account, and, based on the ORCID validation, was confident of 94% accuracy amongst those. What makes this research so exciting, Rodrigo posed, is not necessarily what has been done so far but the potential to expand from here: if we can better identify researchers in social spaces, then we can start to look in more depth not just at how they share papers, but at how they communicate and engage on social networks in general, and what that might mean for their field.
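
For the technically curious, the heart of this kind of study is record linkage across datasets. The sketch below is only a rough illustration of the joins involved, not Rodrigo's actual method – the file and column names are hypothetical, and the real matching and validation rules are certainly more involved:

```python
# A minimal sketch of the record linkage described above. All file and
# column names are hypothetical; the study's actual matching rules are
# not spelled out in this post.
import pandas as pd

papers = pd.read_csv("wos_papers.csv")            # doi, title, ...
mentions = pd.read_csv("altmetric_mentions.csv")  # doi, twitter_handle
orcid = pd.read_csv("orcid_twitter.csv")          # twitter_handle, orcid_id

# Step 1: Twitter accounts that have shared at least one tracked paper.
candidates = mentions.merge(papers[["doi"]], on="doi", how="inner")
handles = candidates["twitter_handle"].str.lower().drop_duplicates()

# Step 2: validate candidate handles against ORCID records.
orcid["twitter_handle"] = orcid["twitter_handle"].str.lower()
scholars = orcid[orcid["twitter_handle"].isin(handles)]

print(f"{len(scholars)} candidate scholar accounts validated via ORCID")
```

Note that, as Rodrigo points out, step 1 can only ever find researchers who have tweeted a paper tracked by Altmetric – anyone else is invisible to this approach.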

This was a brilliant session brimming with insights and ideas for further investigation – do get in touch with the authors if you have more to discuss (via Twitter, of course ;))

Weekend reads: The ‘Journal Grand Master,’ what drives online attention to studies; a song of replication

The week at Retraction Watch featured a story of unintended consequences and a broken relationship, and a retraction for a paper that had just about everything wrong with it. Here’s what was happening elsewhere: “Who is the ‘Journal Grand Master?’” An argument for using a journal ranking system based on sports rankings. (Robert Lehmann, Klaus […]


‘How do scholarly citations in Wikipedia appear?’ Dr Evan Goldstein introduces his project

On the 24th May we announced that Dr Evan Goldstein, Postdoctoral Associate in the Department of Geological Sciences at The University of North Carolina at Chapel Hill, was the winner of this year’s Altmetric Research Grant. Here’s Evan’s take on the project he hopes to complete with his awarded funds…

As a postdoctoral researcher in the Department of Geological Sciences at the University of North Carolina at Chapel Hill (USA), most of my work is centered around geomorphology (predicting changes on Earth’s surface). Recently I have also focused on how Earth Scientists communicate with each other and the public on social media and across the Web. Much of my preliminary work appears on my blog and formed the basis for my Altmetric research grant proposal — investigating how scholarly citations in Wikipedia appear. My proposal is motivated by my own personal experience. I have edited Wikipedia to include references to my own articles (in addition to references to other articles that I did not write), and I have wondered how citations to scholarly literature in Wikipedia are generated (i.e., are most mentions generated by journal article authors?).

To address this question I will be focusing my effort on the >33,000-article corpus from the high-impact, high-volume (>1,000 articles/year) Earth science journal ‘Geophysical Research Letters’ (GRL), a society journal of the American Geophysical Union. GRL is well represented on social media and across the Web (for an Earth science journal); for example, most articles are mentioned on Twitter, and many are mentioned multiple times. Earlier this year I looked at the Wikipedia mentions for GRL articles published through 2015, using the DOI records for GRL articles (downloaded from the Web of Science) and Wikipedia mentions from Altmetric retrieved using the rAltmetric package created by rOpenSci. In total there were more than 1,000 GRL mentions on Wikipedia. The figure below is a plot of the number of Wikipedia mentions for GRL articles published in a given year.
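
For readers who script in Python rather than R, a rough equivalent of the rAltmetric lookup against Altmetric's public API might look like the sketch below. The cited_by_wikipedia_count field name is an assumption about the shape of the v1 response, so check the API documentation before relying on it:

```python
# Look up the Wikipedia mention count Altmetric holds for a DOI.
# The cited_by_wikipedia_count field name is an assumption about the
# v1 response shape; verify against the Altmetric API documentation.
import requests

def wikipedia_mentions(doi):
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if resp.status_code == 404:
        return 0  # Altmetric has no recorded attention for this DOI
    resp.raise_for_status()
    return resp.json().get("cited_by_wikipedia_count", 0)

# Example with a hypothetical GRL-style DOI:
# print(wikipedia_mentions("10.1002/2015GL000000"))
```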

 

For my Altmetric research grant, I intend to use an expanded version of the data in this plot to answer three linked questions. First, I will investigate what percentage of Wikipedia citations of GRL articles can be traced to the GRL article authors. Second, I will determine when mentions in Wikipedia occur relative to article publication. Third, I intend to examine specific Wikipedia edits to determine the number of words added, and whether editors tend to add a single citation or multiple citations.

My enthusiasm for this research topic is also rooted in my belief that Wikipedia represents a mechanism for scientists to engage with a wide audience. Since Wikipedia is one of the largest websites in terms of global web traffic, it represents an opportunity for researchers to provide long-lasting outreach and engagement with both academic and non-academic audiences. As an example, page views for geomorphology articles on Wikipedia far exceed the page views for my own scientific articles. Yet for many of the critical Earth and Space science journals, papers are only rarely mentioned on Wikipedia, even with the growing call for academics to edit and engage with Wikipedia (e.g. Logan et al., 2007; Bateman and Logan, 2010; Bond, 2011; Mandler, 2016; Goldstein, 2017).

I look forward to pursuing this research and sharing the results over the coming months, and would like to thank Altmetric for the support.

Advocacy and Engagement: Guidance for Integrating Altmetric Data into the Research Lifecycle

The following guest post was written by Lily Troia, Engagement Manager at Altmetric.

Altmetrics offer researchers and organizations powerful insights into conversations happening around scholarship, yet achieving widespread adoption of altmetrics, especially at academic institutions, can often feel like a Sisyphean endeavor. Much like advocating for any new digital tool, approach, or culture shift, advocating for use of altmetrics requires a concerted yet reflexive approach, strategic communication and outreach, and above all, alignment with organizational mission and vision.


In my work as a digital scholarship librarian, I (somewhat organically) developed a “rubric” for advocacy in support of the research process. This rubric has come in handy when encouraging consistent research data management practices, advising (in a non-legal sense, of course) on copyright and content licensing, advancing open practices and publishing, and promoting diverse voices in curation and preservation. While each endeavour necessitates nuance and unique strategies, advocating for altmetrics similarly demands a deep engagement with researchers and other academic community members, and should seek to enlist all stakeholders as advocates for a more dynamic scholarly metrics paradigm.

Now as Engagement Manager with Altmetric, I am able to share my experiences and iterative framework with librarians and other Altmetric advocates, and champion their efforts to address outdated aspects of and inefficiencies in scholarly communication. Inclusion of altmetrics in the research lifecycle is one crucial development influencing the direction of scholarship and academia, and fueling an evolution sparked by the myriad capabilities inherent to a digital, networked environment.


The seeds of my advocacy rubric were sown prior to my librarian career: traceable to my earliest activist days campaigning for HIV/AIDS awareness in grammar school and organizing Take Back the Night rallies in college; present in previous positions as a social services employment counselor and as a human rights legal intern; and even in my ethnomusicology studies, where I struggled to balance the role of observer with a recognition that cultural analysis is never neutral. But in truth, my present methodology for advocacy is rooted in experience I garnered launching and successfully running my own music production and project management firm. My approach is almost embarrassingly simple and, by definition, customizable to different audiences and different needs, centering on thoughtful, adaptive engagement.

 

Putting the rubric into action

When tasked with rolling out Altmetric at an institution, it’s critical you design your launch with the goals and culture of your organization close in mind. Who will be using Altmetric data? What channels of attention are most important to those in your institution? How does your organization currently define and reward impact? What are your stakeholders’ expectations surrounding Altmetric data; and, what, if any, misconceptions might they have?

While these questions should direct the development of your Altmetric adoption and advocacy plan, the following ideas are intended to help guide your project management process, and support a sustainable program beyond the initial roll-out:

1. Be collaborative and attentive, and keep the researcher at the center of altmetrics conversations.

It may be that your Communications Department is eager to use Altmetric data, or that key administrators with purchasing power are – but if researchers are not engaged with Altmetric data, you will likely struggle to increase or sustain adoption, let alone encourage broader cultural shifts across and beyond your institution.

Listen to researchers! When advising on research data management practices, I always began by asking questions and listening to researchers describe their processes and share the concerns most important to them. In terms of altmetrics, these questions could include:

  • What audiences are they most interested in reaching?
  • Where do they go to find out about important research?
  • What sort of collaborations might be beneficial to their research?
  • What do they view as measures of impact in their field? Is this citations? Lives saved? Discoveries made? Policies influenced?

Find out why they chose a career path in research. Most researchers need to advance their careers, but were they originally compelled by a desire to advance knowledge and/or help humankind? The more we support researchers in connecting back to their fundamental scholarly mission, the greater the potential to shake up traditional frameworks for promotion and valuations of ‘influence.’


2. Prioritize adaptive, personalized relationship-building.

Advocacy seldom works as a one-size-fits-all package. If efforts are not having the desired effect, how can we adjust our tactics? Variation across disciplines and departments is vast – different data points will be valuable to different groups. Researchers might be interested in reference manager data or F1000 mentions. Directors may be focused on reputation management or Wikipedia citations.

Tailor messaging to the audience, and customize avenues for outreach that put information in the locations and sources they trust and refer to regularly. When working on a new APC pilot in a previous academic library position, our Scholarly Communications Committee decided to advertise the program at the monthly Student Journal Club. Is there a Graduate Student Research Symposium where you can table? A newsletter or external blog to which you can contribute? Reach out directly to specific departments or programs at your institution – Altmetric data can inform collection development, tenure and promotion, review and recruitment. Initiate conversations with different groups and document their varied approaches to impact analysis and research communication.

Target altmetrics supporters among well-respected faculty or established researchers, or even alumni–and actively engage with early-career researchers, lab support, and undergraduates. Enlist these parties as Altmetric advocates, and provide them with opportunities to establish community and expand conversation via listservs or digital discussion groups, and on-campus meet-ups or learning opportunities. In turn, Altmetric Explorer can unearth high profile mentions of your institution’s research and unique examples of impact: use this data as an opening to approach an author about becoming an advocate for altmetrics.

Always seek ways to partner with other groups and departments involved in the research process, or attuned to sharing scholarship outside of traditional publication forums and in non-traditional formats. Altmetric advocacy will benefit when allies represent an array of stakeholders, such as:

    • Liaison/subject librarians
    • Metadata librarians
    • Digital humanities scholars and centers
    • E-learning offices /extension schools/continuing education
    • Communications and marketing departments
    • Research services and grant support
    • Archives, special collections, digital libraries
    • Lab and research assistants, fellows and interns
    • University publishers/press, student groups producing, publishing, or archiving historical or learning materials

Of course, institutional support is paramount to the success of any program or project. Support requires engagement. The type and level of engagement you foster with department heads or provosts will differ from your interactions with researchers or other librarians, but providing actionable insights rather than static information is vital to enlisting support across your institution. As mentioned, debunk myths with clear messages. For example, Altmetric data reflects more than social media: show this in action by highlighting a news mention or policy citation in an internal email or report, or feature Altmetric info in press releases and other web content. Need help developing materials that speak to your constituencies and their concerns, and engage them as advocates? Check out the resources on our Solutions page, or reach out to me and we can brainstorm together!

It’s also useful to nurture relationships with the broader altmetrics community, and to explore partnerships with vendors and publishers. I meet and train different groups within the scholarly lifecycle – from society journal editors, to cancer research funders, to governmental think tanks – and many parties are interested in discussing and elevating the role of altmetrics and, specifically, engaging with authors and researchers in this arena. Altmetrics offers a climate ripe for collaboration.

3. Stay positive and proactive.

Look for opportunities to engage rather than waiting for inquiries to come your way. Altmetrics might traditionally be viewed as the realm of evaluation and assessment, but they also provide data useful at the outset of the research process. Altmetric data can help inform decisions around where to publish, with whom to collaborate, and uncover timely conversations that could potentially influence the direction of research itself. Encourage those at your institution eligible for Altmetric Explorer accounts to interact with Altmetric data directly, showing them how to customize and save searches and set alerts. Provide examples and templates for incorporating Altmetrics into CVs, grant proposals, and online scholarly profiles.

Researchers are undoubtedly busy, but respect for their time constraints shouldn’t create artificial barriers that keep researchers from understanding the ‘whys’ and ‘hows’ behind a nuanced relationship with altmetrics. While we want to diminish the challenges impeding Altmetric adoption, we also want to empower others rather than simply delivering them data. This approach can extend beyond the “lead a horse to water” adage: why not suggest a collaborative research project with a scholar that looks at altmetrics in their discipline? For various stakeholders to become advocates, they must experience the value (and limitations) of Altmetric data firsthand.

4. Foster inclusive, globally representative scholarly communication.

Altmetric data reminds us that scholarship lives well beyond the PDF. Today research takes many forms – datasets, clinical trial records, learning materials, annotations – and as librarians we have the opportunity to elevate non-traditional outputs and publications. In addition, Altmetric data can reveal important conversations happening far outside conventional academic ivory towers. Altmetrics gives us the opportunity to highlight impactful discussions, and to share influential research that might be lacking in citations but is being used in the classroom or cited in policy.

We also have a duty to analyze demographic data around research attention, and align our communication and dissemination planning with an academic vision that supports the furtherance of knowledge and equitable access to information. Scholarship flourishes when it is infused with diverse perspectives and marginalized voices, and embraces a critical lens with respect to its pedagogy and organizing structures. Advocates for altmetrics envision a reflexive, malleable research ecosystem that embraces the wide-reaching potential of dynamic digital media, and incentivizes innovation, collaboration, and global impact. This might be deemed a lofty vision, but if we can inspire broad engagement across the research community, it’s a future within reach.


Hopefully I’ve offered some food for thought, and tangible ideas you can take back to your institution. Please reach out with any questions or feedback – the altmetrics movement grows because of contributions from folks like you!

Digital Science and The New Scientometrics

It’s amazing what you can do with data these days. A couple of weeks ago, Digital Science held its third annual US Publisher day. This year, one of the themes that emerged was data, scientometrics and how we, as both an industry and as Digital Science, can use it to support publishers in strategic decision making.

The traditional data types for understanding the research landscape are citations and bibliometrics. While in recent years we’ve all come to view impact as being much broader than citations, it’s only very recently that other types of data and analyses have been used for strategic planning and business intelligence.

That is what the Digital Science Consultancy team does. We apply new types of data and new analysis techniques to help funders, institutions, and publishers make better decisions faster. During my talk at the Publisher Day, I broke down three aspects of how we go about doing that.

1) First, you need data

We have some pretty unique datasets at Digital Science. Most people are familiar with altmetrics and our portfolio company Altmetric.com. It was Altmetric that really helped define the discipline. We also have Uber Research, which created the only database of awarded research funding: Dimensions. Then there’s GRID, our open dataset of institutional identities.

Not only do we have data; often our customers have data too. For publishers, the most obvious sources are authors, affiliations, citations, subscription information and – the most underutilized data that publishers have – the full text of the articles that they publish.

2) Data is only useful if you have the right tools

Once you have data, you need the right tools to interpret it. In March 2017, Digital Science released a Digital Research Report in which we used the affiliation data from PLOS ONE articles published between 2006 and 2016. Affiliation data is traditionally a challenge to work with. It’s usually a free-text field in journal submission forms, which results in a non-uniform hotchpotch of variant spellings and word orders. Sometimes the affiliations are in more than one language.

The GRID suite of tools contains a matcher that allows us to discover and deduplicate author affiliations with a high level of accuracy. Once we know which GRID records are the right ones, we then have a plethora of other information available, including ISNI record numbers, Crossref identifiers, parent–child relationships between institutions and, importantly, geolocation data.
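
To give a flavour of the problem (GRID's actual matcher is far more sophisticated than this), a toy version of affiliation disambiguation might normalise the free text and fuzzy-match it against a canonical list of institution names:

```python
# Illustrative only: normalise messy free-text affiliations and
# fuzzy-match them to a canonical registry. GRID's real matcher uses
# much richer logic than simple string similarity.
import difflib
import re

CANONICAL = [
    "Harvard University",
    "University of North Carolina at Chapel Hill",
    "George Washington University",
]

def normalise(affiliation):
    # Lowercase, strip punctuation, collapse whitespace.
    text = re.sub(r"[^\w\s]", " ", affiliation.lower())
    return re.sub(r"\s+", " ", text).strip()

lookup = {normalise(name): name for name in CANONICAL}

def match_affiliation(free_text, cutoff=0.75):
    hits = difflib.get_close_matches(
        normalise(free_text), lookup.keys(), n=1, cutoff=cutoff)
    return lookup[hits[0]] if hits else None

print(match_affiliation("Univ. of North Carolina, Chapel Hill"))
# -> University of North Carolina at Chapel Hill
```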

In the report, we analyzed the global network of research collaborations, how collaboration is used strategically, and how it’s changing over time. The report is well worth a read if your business involves helping researchers communicate.

A graph of global research collaboration, colour coded by country (from the full Digital Research Report).

Text mining, natural language processing and topic modelling have come into their own in recent years. The field has moved so fast that many people, even in the information space, aren’t aware of just how powerful things have gotten.

As part of a long-standing relationship, Digital Science helped the Higher Education Funding Council for England (HEFCE) analyze the results of the last Research Excellence Framework (REF). As part of the REF, universities are asked to submit a series of impact case studies that detail how the institution’s activities impact society in ways not captured by bibliometric citations.

Those written accounts are read and scored by one of four panels (physical sciences, life sciences, social sciences, and arts and humanities). From a publisher perspective, this type of content is similar to a full text archive; it’s mostly words rather than numbers and is written for humans to read, rather than computers.

We used natural language processing to assess the similarity between case studies. Each one was plotted on a graph, colour coded by the panel that assessed it. The distance between the dots is inversely related to the similarity between the studies: the more similar two case studies are, the closer together they sit.

Despite the fact that we did not tell the computer anything about the structure of research in the UK, spontaneous clusters emerged from the dataset, enabling us to identify areas of excellence in UK research. The results were pretty remarkable, as you can see below.

A cluster analysis based on similarities between impact case studies submitted for the REF. The four panels are colour coded: red for physical sciences, green for life sciences, blue for social sciences and yellow for arts and humanities.

If we zoom in on a particular cluster (the nicely multi-coloured one on the bottom-left edge, detail shown below), we see that the cluster contains an interdisciplinary group of research on environmental management of waterways.

Zooming in on one particularly multidisciplinary cluster shows that it’s about environmental management of waterways.

You can play with the interactive visualization yourself, here.
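
The post doesn't spell out the exact pipeline, but a minimal sketch of this kind of document-similarity map – TF-IDF vectors projected down to two dimensions, here with scikit-learn and a tiny placeholder corpus – might look like this:

```python
# Illustrative sketch: vectorise documents with TF-IDF, then project
# to 2-D with t-SNE so that similar documents land close together.
# The placeholder corpus stands in for real REF impact case studies.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

case_studies = [
    "flood risk modelling of river catchments",
    "water quality monitoring in estuaries",
    "quantum dot solar cell efficiency",
    "perovskite photovoltaic materials",
    "shakespeare performance history archives",
    "early modern drama digital editions",
]
panels = ["social", "social", "physical", "physical", "arts", "arts"]

vectors = TfidfVectorizer(stop_words="english").fit_transform(case_studies)

coords = TSNE(n_components=2, metric="cosine", init="random",
              perplexity=2, random_state=0).fit_transform(vectors.toarray())

palette = {p: i for i, p in enumerate(sorted(set(panels)))}
plt.scatter(coords[:, 0], coords[:, 1], c=[palette[p] for p in panels],
            cmap="tab10")
plt.title("Case studies coloured by panel")
plt.show()
```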

The applications of these techniques for publishers are exciting, from discovery of emergent fields to consolidation of existing titles and everything in between. As I said during the publisher day:

Imagine what we could do together by combining your data with ours and applying our techniques.

3) The secret sauce: domain expertise

Accurate interpretation requires an understanding of the conditions that generated the data. This is an area where both Digital Science and our Consultancy come into their own.

Inside the Digital Science Consultancy, our expanding team contains, amongst others, a professor of bibliometrics, a world-leading data scientist, an expert in institutional research management and libraries, a health care and bibliometrics analyst, a very well-known bibliometrics leader and a former academic research scientist.

Digital Science more broadly has invested in companies, sold to customers, driven progress through outreach and research reports, and even found many of its employees and entrepreneurs at each stage in the scholarly supply chain. This experience and depth of knowledge give us a truly unique perspective across the entire landscape.

Digital Science has products, services, and expertise along the entire scholarly supply chain.

Supporting academic publishers

The purpose of the publisher day was partly to inform our customers of the developments that we’ve been working on at Digital Science, but it was also to learn from publishers about how we can help them.

We heard from publishers who want to know what topics are emerging in their fields based on funding data. Others wanted to look across the landscape with cluster analyses and either find new emergent fields, or opportunities to consolidate titles. There was also interest in identifying emergent geographies and patterns of collaboration, as well as a desire to find authors for special issues or reviews, editors, or just leaders and rising stars in a field.

With data and metadata analysis finally coming into its own to inform strategic decision making in publishing, these are exciting times. I’m personally looking forward to seeing not only how our capabilities at Digital Science continue to grow but also how others in the industry make use of data as business intelligence.


Interpreting altmetrics

Lots of researchers tell us they love seeing and exploring the altmetrics for their work, but aren’t always sure what to make of them, or what to do as a result of what they find. In this blog post we’ll provide some pointers for interpreting your altmetrics, and some suggestions for next steps:

1. Finding the altmetrics for your content

There are many ways you can find altmetrics for your publications, and not necessarily just the metrics provided by Altmetric.com. Many publishers now include the Altmetric badges on their article pages, and others sometimes include information pulled from elsewhere, such as their own media or social monitoring service (check out PLOS and Mendeley for examples of this).

Where you do see the Altmetric badges embedded on a page you can click on them to access the full record of attention – the Altmetric details page – so you can see not just how much your work was talked about, but who was doing the talking and what they were saying.

 

Using the free Altmetric bookmarklet will also give you access to these pages for items that have a DOI, or if your institution subscribes to the Altmetric Explorer you can explore all of the mentions for over 9 million outputs in one place!
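
If you're comfortable with a little scripting, the public Altmetric API offers another route: it returns summary counts plus the same details-page link the bookmarklet opens. The field names below reflect one reading of the v1 response, so treat them as assumptions and confirm against the API documentation:

```python
# Fetch a summary of the attention Altmetric holds for a DOI. Field
# names (score, details_url, cited_by_*) are assumptions based on the
# v1 API response; confirm against the documentation.
import requests

def altmetric_summary(doi):
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if resp.status_code == 404:
        return None  # no attention tracked for this DOI yet
    resp.raise_for_status()
    data = resp.json()
    return {
        "score": data.get("score"),
        "details_page": data.get("details_url"),
        "tweeters": data.get("cited_by_tweeters_count", 0),
        "news_outlets": data.get("cited_by_msm_count", 0),
    }

# print(altmetric_summary("10.1038/nature12373"))  # any DOI you like
```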

Whichever route you choose, the data will be updating in real-time, providing you with the latest insights on the conversations and engagement surrounding your work.

 

2. What is the data telling you?

Once you’ve found your altmetrics, the next thing to do is to critically evaluate the data. Which channels is your work receiving attention from? Has it been shared by an influential tweeter, resulting in a broader reach? Is it being used by students, or in syllabi? Have policy makers picked it up and referenced it in their advice, or has someone thought it relevant enough to include in a Wikipedia article?

Determining who is talking about your research, and which channels they’re using to do it, can help you understand why they are likely giving it that attention, even if they don’t explicitly state it.

Another thing to consider is when your work received the attention it has – was it as soon as it was published, or did something happen later that meant it surfaced again (a world event or topical news story, for example)? Has funding changed in that area at the same time? Did someone do something that meant it was suddenly noticed? You might be surprised by some of the attention that older work often gets, even if at the time it did not seem so notable.

One question that researchers often ask is whether or not their Altmetric Attention Score (the weighted count of attention an item has received) is any good. Although the score does not measure quality – of the research, the researcher, or the attention – it can be useful for measuring reach and visibility. A higher score = higher reach.
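
To make ‘weighted count’ concrete, here's a toy calculation. The weights below are purely illustrative – Altmetric's actual algorithm uses its own weightings and also adjusts for factors such as who is mentioning the work:

```python
# Toy illustration of a weighted attention count. These weights are
# invented for the example and are NOT Altmetric's actual algorithm.
ILLUSTRATIVE_WEIGHTS = {"news": 8, "blog": 5, "wikipedia": 3, "tweet": 1}

def toy_attention_score(mentions):
    return sum(ILLUSTRATIVE_WEIGHTS.get(source, 1) * count
               for source, count in mentions.items())

print(toy_attention_score({"news": 2, "blog": 1, "tweet": 30}))  # -> 51
```

In this toy scheme a single news story counts for as much as eight tweets – the point being that the score reflects weighted reach, not raw volume.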

To help you see how the score your item has received compares to other work published at the same time (or in the same journal), the Altmetric details pages include a ‘score in context’ section. Taking note of the information in this tab can help you see if there might be opportunities for your work to be made more visible, or if in fact it’s way outperforming other content in your field.

If you’re publishing Open Access it can be interesting to keep an eye on how that impacts the attention your work receives – preliminary research has found that OA items sometimes get shared more than non-OA – another great reason to make your research open and discoverable!

 

3. How can you determine what to do next?

So, great, you’ve got a good understanding of how people are talking about your research, why they’re likely doing so, and how that compares in terms of volume of engagement to other work in your field. BUT, what are people actually saying about that other work? If it’s had less attention, who is it from? Are other researchers being picked up less by a general audience but more by policy makers? If it’s had more attention, is it attention from audiences that you would like to reach?

Using the tools outlined in section 1 you can start to explore the altmetrics for other work in your field, as well as your own. Try browsing some recent issues of journals, or new books in your discipline, to see how the level of visibility differs between publishers and titles.

Based on what you uncover you can start to make a plan. Try answering these questions as a start:

  • Do you feel the amount of visibility and engagement your work got was appropriate to the content?
  • Who did you hope would see and benefit from your work? From the altmetrics you have, does that seem to have been achieved?
  • Compared with other research in your field, does the engagement your work has received seem high, average, or low? Are there areas where you might improve outreach?
  • What ‘impacts’ or other outcomes did you hope for as a result of doing and publishing your work? Does the data provide any evidence of the extent to which you’ve reached those goals?
  • Did you publish it open access, or would you perhaps want to consider doing so in future?

Altmetrics, like any metrics, are only valuable if you put them into context. Make sure you’ve got a good understanding of the qualitative data, and align what you find with what you want to achieve.

 

4. Take action!

Congratulations! You’ve published your work, taken a look at your early altmetrics, and given some consideration to whether or not you’ve got the result that you want. Now what? In some cases you could take immediate action: if you feel a past publication could’ve been more visible, why not share it now? Identify some key bloggers to reach out to (based on the altmetrics for other publications in your field) or even just take the opportunity to reshare the work from your own social accounts.

If you’ve had something accepted but are still waiting on a publication date then now is the ideal time to try something new! Connect with your research support librarian, scholarly comms office or the PR team at your publisher to discuss how you might maximise the engagement for your work, or even just give some thought to what you’d like to achieve and build your own small plan. It doesn’t need to be extensive, and tools like Kudos can help make it even easier.

If you’re yet to submit then what you’ve learnt can help you shape future strategies: which publishers will share your work most effectively? Are they present on the channels and amongst the audiences you want to reach? What support do they offer for helping you share the work, and tracking the engagement?

Another thing to consider might be what research you make available. Altmetrics aren’t just for articles – they also pick up the attention for datasets, posters, images and all sorts of other research hosted on publisher websites, institutional repositories, preprint servers and other platforms. Sharing more and gathering evidence to demonstrate what the results of that were can help tell the bigger story of who you are as a researcher.

 

5. Make it easy

We all have loads of demands on our time, and keeping an eye on every little mention won’t always be top of your to-do list. Aim to make integrating altmetrics into your workflows as easy as possible – the Altmetric details pages have a link to sign up for alerts to be notified when new mentions of your publications occur, and if you’re using the Altmetric Explorer you can set up daily, weekly or monthly updates for any search.

If you check in on your citations from time to time, why not also take a moment to take a look over your altmetrics?

 

We hope you find this post useful! If you’d like some top tips for simple ways to make all of your work more visible click here to download our handy guide – and share your own in the comments below or on Twitter @altmetric!

 

What is the impact of a research publication?

From EBMH:

An increasing number of metrics are used to measure the impact of research papers. Despite being the most commonly used, the 2-year impact factor is limited by a lack of generalisability and comparability, in part due to substantial variation within and between fields. Similar limitations apply to metrics such as citations per paper. New approaches compare a paper’s citation count to others in the research area, while others measure social and traditional media impact. However, none of these measures takes into account an individual author’s contribution to the paper or the number of authors, which we argue are key limitations. The UK’s 2014 Research Excellence Framework included a detailed bibliometric analysis comparing 15 selected metrics to a ‘gold standard’ evaluation of almost 150,000 papers by expert panels. We outline the main correlations between the most highly regarded papers by the expert panel in the Psychiatry, Clinical Psychology and Neurology unit and these metrics, most of which were weak to moderate. The strongest correlation was with the SCImago Journal Rank, a variant of the journal impact factor, while the amount of Twitter activity showed no correlation. We suggest that an aggregate measure combining journal metrics, field-standardised citation data and alternative metrics, including weighting or colour-coding of individual papers to account for author contribution, could provide more clarity.

 

Read full article

Altmetric and WeShareScience

Created in 2014, WeShareScience is an online platform for researchers to post videos showcasing their latest work. We spoke to site creator Ryan Watkins, Professor of Educational Leadership at George Washington University, to find out more about the site and what he’s aiming to achieve.

Could you tell us a bit about your site? How did it get started and what are you aiming to achieve?
Like many ideas, WeShareScience emerged out of frustration and curiosity. As a professor of educational technology, much of my work cuts across many fields and disciplines. For years, each month I would read research journals in education, psychology, sociology, personnel management, business, and other disciplines. The growing volume of research in each of these areas was, however, making it increasingly difficult to find the research that I could apply in my work. And yet, in most other parts of my life, social media platforms were making it easier to find and share information.

In 2013 I got increasingly curious about new ways of sharing research. At the time, Pinterest was a rapidly growing platform that had found a unique way to capture its users’ attention, and YouTube was pushing all of us to reimagine the role of video in our lives. While observing these trends I started to play around with ideas for how they could support the sharing of scientific research – and that curiosity brought about the first iterations of WeShareScience. The site is conceptually a mash-up of Pinterest, YouTube, and TED Talks, applied to the traditional notion of a science fair (or a poster session at a conference).

The aim of WeShareScience is to offer a unique platform where anyone can share their scientific research, making it more accessible to other researchers and the public.


 

What content can people find on the site?
Videos about research! These videos offer a new option that can complement peer-reviewed research articles and increase the availability of research amongst new audiences. Through this format, researchers and their research tend to be more approachable – after all, there is something personal about hearing and seeing the researcher – whereas traditional journal articles, written in the third person, intentionally remove this connection between the reader and the researcher.

Why do you think it’s so important to communicate research in this way?
Video is a very engaging medium, and with new technologies it is quickly becoming how we share ideas. In 2015, for example, YouTube announced that it had 8 billion video views each day, in more than 39 countries and over 54 languages. Science can’t ignore this change in how people choose to learn, share, and archive their experiences. There remains an important role for peer-reviewed research journal articles, but they are no longer the only tool that researchers should be using to share their science.

You’ve already added the Altmetric badges to your site, which is great to see! Why were you keen to have that information visible there?
For the first two years, WeShareScience was primarily a destination site. That is to say, it was a cul-de-sac: people came to watch videos about research, and then maybe left comments or shared a link to the video with colleagues – but there was nowhere to go.

As I developed the technical parts of the site this was a sufficient scope, but in order to grow the site to its potential I found that WeShareScience really had to be a hub that links visitors to other places where they can learn more. If people find the research in a video to be of interest or value, then WeShareScience has to be the starting place rather than the end destination. Altmetric badges are the ideal tool for quickly and easily guiding visitors to more information on a variety of other platforms (including Facebook, Twitter, Mendeley, etc.).

Have you noticed anything particularly interesting or that you weren’t expecting to see in the Altmetric data? What are you using it for at the moment?
At this point I have integrated the badges into several aspects of the site’s design. When visitors share their video abstracts on WeShareScience they have the option to provide more information, including the DOI of published resources, the URL for an open or closed access journal, or links to the researcher’s website. WeShareScience then uses this information to add an Altmetric badge just below the video.

What advice would you give to academics who want to communicate their research more broadly but aren’t sure where to start?
To begin, I would suggest that they reflect on what they are hoping to achieve through their research and their career as a researcher. For many researchers, expanding the audience for their work is not necessary in order to make the substantial contributions they are looking to make. For other researchers, the small readership of peer-reviewed journals in their discipline may be a barrier to the influence that their research can have on the work of other researchers, researchers in other countries, policy makers, or others.

For this second group, they should have specific goals and audiences in mind as they develop a communications plan for their research. Articles, conference presentations, invited talks, webinars, video abstracts, and social media should all be considered and integrated into a plan that achieves their goals. As researchers implement their communications plan, they must then remain aware of (and work to guard against) the often slippery slope of trying to gain attention for their research. Maintaining their role as a scientific expert is essential to their credibility, but in an era where dramatic headlines that capture “eyes” and “clicks” dominate, we must all make conscious efforts to be rigorous and trustworthy as we accurately communicate about our research.

Any top tips for creating videos in particular?
Many of the most popular video abstracts on WeShareScience are those that use the video as a tool for helping viewers connect with the researcher (and thereby the research). Some contributors still elect to record their voice over PowerPoint slides, but the videos that catch attention and most effectively communicate the research are those that show the researcher engaged in their work. On video, the passion of the researcher comes through much more clearly than in written articles. If you are a botanist, record footage of yourself in the environment you study. If you are a chemist, let the viewers see you working in the chemistry lab.

Video allows you to make a connection with the viewer, and it gives the viewer the opportunity to take a glimpse into the world of your science — use that to your advantage and then point them towards places where they can learn more.

Want to hear more about the innovative ways people are using Altmetric? Register here to receive our monthly newsletter! 

Help Altmetric further expand its coverage: tell us about your favourite blogs

Nominate your favourite arts, humanities and social science blogs for a chance to win a £25/$25 Amazon voucher! 

Altmetric already monitors over 11,000 blogs from a variety of disciplines on a real-time basis looking for mentions of published research – be it a book, article, dataset or other output type. All of the mentions found are displayed on the related details page for the item, which is easily accessible by clicking on the Altmetric badge where you see it on institutional repositories or publisher websites.

In doing so we’re hoping to help you, authors and researchers, receive feedback more quickly on work that you’ve published and make more informed decisions about what to read.

We’re keen to make our coverage as comprehensive as possible, but to do so we need your help!

Do you have a blog you read regularly to stay up to date with the latest research in your field? Are there research communicators out there who you think do a great job of disseminating, evaluating or critiquing scholarly work? You’re the experts and we’d love to hear from you! 

Tell us about your favourite blogs in the arts, humanities or social sciences and we’ll get to work to start tracking them – plus you’ll be entered into a prize draw to win a £25/$25 Amazon voucher.

Note that there are two important requirements for us to be able to track a blog:

  1. The blog must have a working RSS feed (see the quick check below)
  2. The blog must not be paywalled
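
If you'd like to check requirement 1 before nominating, here's a quick sketch using the feedparser library (the feed URL is just an example):

```python
# Quick RSS sanity check: feedparser sets the "bozo" flag when a feed
# is missing or malformed. Install with: pip install feedparser
import feedparser

def has_working_feed(url):
    feed = feedparser.parse(url)
    return not feed.bozo and len(feed.entries) > 0

print(has_working_feed("https://www.altmetric.com/blog/feed/"))  # example URL
```
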
Nominate your favourite blogs today!

Enter your suggestions by 28th April 2017 to be in with a chance of winning.
