research integrity – Digital Science https://www.digital-science.com/blog/tags/research-integrity/

Digital Science investigation shows millions of taxpayers’ money has been awarded to researchers associated with fictitious network https://www.digital-science.com/blog/2025/09/taxpayers-money-awarded-to-researchers-associated-with-fictitious-network/ Thu, 04 Sep 2025 13:00:44 +0000 Digital Science investigations show researchers associated with a fictitious research network and funding source have netted millions of taxpayers' dollars in funding.

Thursday 4 September 2025 – London, UK and Chicago, USA

Researchers associated with a fictitious research network and funding source have collectively netted millions of dollars of taxpayers’ money from the United States, Japan, Ireland, and other nations for their current studies. That’s according to investigations led by Digital Science’s VP of Research Integrity, Dr Leslie McIntosh.

The results of her investigations raise serious concerns about the lack of accountability for those involved in questionable research publications.

“This example illustrates how weaknesses in research and publishing systems can be systematically exploited, so that researchers can game the system for their own benefit,” Dr McIntosh says.

Dr McIntosh – one of the co-founders of the Forensic Scientometrics (FoSci) movement – has presented her analysis at this week’s 10th International Congress on Peer Review and Scientific Publication in Chicago, in a talk entitled: Manufactured Impact: How a Non-existent Research Network Manipulated Scholarly Publishing.

While not naming the individual researchers involved, Dr McIntosh’s presentation was centered on a group known as the Pharmakon Neuroscience Network, a non-existent body listed on more than 120 research publications from 2019–2022 until being exposed as fictitious. These publications involved 331 unique authors and were associated with 232 organizations and institutions across 40 countries.

Research network raised multiple red flags

The Pharmakon Neuroscience Network functioned as a loosely organized collaboration of predominantly early-career researchers, such as postdoctoral researchers and PhD students, whose publications included:

  • Funding acknowledgments citing unverifiable organizations
  • Questionable or unverifiable institutional affiliations
  • Suspiciously large citation counts accumulated in a short timeframe
  • Globally connected author networks despite a young publication age

“Despite clear concerns about the legitimacy of their work, only three papers have been formally retracted to date,” Dr McIntosh says.

Using Digital Science’s research solutions Dimensions and Altmetric, Dr McIntosh and colleagues have tracked the progress of the authors connected with this network.

“Once the Pharmakon Neuroscience Network was exposed as being fake in 2022, it no longer appeared on publications, but many of the researchers associated with it have continued to publish and attract significant funding for their work,” she says.

Millions in funding for current research

Of the initial 331 researchers associated with the Pharmakon Neuroscience Network’s publications, Dr McIntosh has established that more than 20 currently hold funding, either as a Principal Investigator or a Co-Principal Investigator, from grants that commenced in 2022 or later. During this time, those researchers have collectively been awarded the equivalent of at least US$6.5 million from six countries – the US, Japan, Ireland, France, Portugal, and Croatia – plus an undisclosed sum from a seventh, Russia.

One researcher with more than US$50 million in funding has authorship on one of the Pharmakon papers. It is not clear whether he knowingly participated in the network or was included through the activity of a former student.

“Many of the researchers had grants before and after Pharmakon. In most instances this is legitimate taxpayer money that is funding very unethical practices,” Dr McIntosh says.

“One aspect we need more time to vet is the possibility that a few of these researchers do not know they were authors on papers within this network. We are still completing this work.”

Of the funded researchers, five had never previously received funding for their research; following their involvement with the Pharmakon Neuroscience Network, they have since been awarded grants from the following sources (US$ equivalent):

  • Science Foundation Ireland – $649,891
  • Ministry of Science, Technology and Higher Education (Portugal) – $538,904 total
  • Croatian Science Foundation – $206,681
  • Russian Science Foundation – undisclosed sum

“Here we have evidence that some authors have secured legitimate funding, including large sums of taxpayers’ money, following their participation in questionable research and publication activity,” Dr McIntosh says.

“We can presume that their publication portfolio, no matter how it was obtained, helped in securing this funding from legitimate sources.”

Dr McIntosh says this case has implications across the research system and emphasizes the need for stronger verification, monitoring, and cooperation.

“Although most of these publications remain in circulation and have been cited widely, corrective actions have been limited. This highlights the challenge of addressing such networks once their work is embedded in the scholarly record,” she says.

Recommendations

Dr McIntosh recommends the following:

  • Reinforce oversight by requiring the use of verified institutional identifiers, such as GRID or ROR, in all publications to ensure affiliations are legitimate and traceable.
  • Mandate transparency through clearer author contribution statements and verified funding acknowledgments, creating a more reliable and accountable record of how research is conducted and supported.
  • Improve monitoring mechanisms by supporting the adoption of forensic scientometrics, which can detect unusual collaboration patterns or questionable authorship practices before they become systemic.
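The first of these recommendations, machine-verifiable affiliation identifiers, can be illustrated with a minimal sketch. The pattern below reflects ROR's documented 9-character ID shape (a leading 0, six Crockford base32 characters, a two-digit checksum); it is a format check only, and the sample strings are purely illustrative, not real registered IDs.

```python
import re

# Approximate ROR ID shape: a leading "0", six Crockford base32 characters
# (digits and lowercase letters excluding i, l, o, u), two checksum digits.
ROR_ID_RE = re.compile(r"^0[0-9abcdefghjkmnpqrstvwxyz]{6}[0-9]{2}$")

def looks_like_ror_id(candidate: str) -> bool:
    """Cheap format check; a real pipeline would also resolve the ID
    against the ROR registry to confirm it actually exists."""
    return bool(ROR_ID_RE.fullmatch(candidate))

# Illustrative strings only -- not claimed to be registered ROR IDs.
print(looks_like_ror_id("02mhbdp94"))  # True: matches the documented shape
print(looks_like_ror_id("12mhbdp94"))  # False: must start with 0
print(looks_like_ror_id("02mhbdpil"))  # False: no trailing digits, and i/l excluded
```

A format check like this is the cheap first gate; confirming an identifier is live, and actually matches the claimed institution, requires a registry lookup.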

“By addressing these gaps, governments, publishers and research institutions alike can help protect the integrity of the research system and ensure that trust in science is maintained,” Dr McIntosh says.

See further detail about this investigation in Dr McIntosh’s blog post: From Nefarious Networks to Legitimate Funding.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Emerald Publishing to safeguard research integrity with Dimensions Author Check https://www.digital-science.com/blog/2025/08/emerald-publishing-to-safeguard-research-integrity-with-dimensions-author-check/ Wed, 06 Aug 2025 09:23:12 +0000 Emerald Publishing has adopted Dimensions Author Check from Digital Science as part of Emerald’s ongoing commitment to research integrity.

Wednesday 6 August 2025

Digital Science is pleased to announce that Emerald Publishing has adopted Dimensions Author Check as part of Emerald’s ongoing commitment to research integrity.

Dimensions Author Check offers publishers a fast and reliable way to incorporate research integrity checks into their work, helping to support responsible and ethical publishing.

Built on Digital Science’s Dimensions – the world’s largest interconnected global research database – Dimensions Author Check offers unmatched transparency into authors’, editors’ and reviewers’ publishing and collaboration histories, accessible through an intuitive and visual dashboard.

Using the dashboard, publishers can thoroughly review the publishing history of a researcher and the people they’ve collaborated with, to spot any unusual activities, such as retractions, expressions of concern, or atypical collaboration patterns.

Sally Wilson, VP Publishing at Emerald, said: “The primary use case for Author Check is to support our due diligence processes when developing and reviewing new special issue proposals. It allows us to efficiently verify the academic credentials, publication history, and editorial experience of proposed guest editors and contributors, helping ensure they meet our editorial standards and ethical expectations.

“Additional use cases are to support our editor succession planning and commissioning activities, offering valuable insights into potential candidates’ research impact and professional networks.

“We hope by integrating Author Check into these workflows, we not only enhance the integrity and transparency of our editorial decision-making but also save time by streamlining what would otherwise be manual and time-consuming processes.”

Dr Leslie McIntosh, VP of Research Integrity at Digital Science, said: “We’re excited that Emerald has become the latest publisher to adopt Dimensions Author Check, further boosting Emerald’s commitment to supporting the integrity of the scholarly record.

“Dimensions Author Check empowers publishers to uphold trust and transparency in research by ensuring they have the best possible information at their fingertips – within seconds.”


About Emerald

Founded by management scholars in 1967, and now part of the Cambridge Information Group, Emerald Publishing provides a range of publishing services to help researchers tell their stories in a meaningful and timely way, providing innovative tools and services to build confidence and capability in impactful research. As a proud signatory of DORA, Emerald is committed to establishing new pathways to impact, making research more accessible, and helping communities make decisions that change their world for the better.

For over 55 years Emerald’s core purpose has been to champion fresh thinkers and help them make a difference so that little by little those in academia or in practice can unite to bring positive change in the real world. Emerald Publishing is proud to be a Times Top 50 Employer for Gender Equality 2025 – for the second year in a row – and one of the Top 50 Inspiring Workplaces in the UK and Ireland for 2025.

About Dimensions

Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, X and LinkedIn.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry, and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contacts

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Tom Shiels, Communications Manager, Emerald Publishing: tshiels@emerald.com

Digital Science to strengthen research integrity in publishing with new Dimensions Author Check API https://www.digital-science.com/blog/2025/07/strengthen-research-integrity-in-publishing-with-dimensions-author-check-api/ Wed, 16 Jul 2025 08:54:05 +0000 Scholarly publishers can now fully integrate research integrity checks into their editorial and submission workflows with Dimensions Author Check API.

Wednesday 16 July 2025

Scholarly publishers can now fully integrate research integrity checks into their editorial and submission workflows, thanks to Digital Science’s new Dimensions Author Check API, which launches today.

Built on Dimensions – the world’s largest interconnected global research database – Dimensions Author Check evaluates researchers’ publication and collaboration histories within seconds, delivering reliable, concise, structured insights.

For the first time, the new Dimensions Author Check API enables publishers to embed this functionality directly into their own workflows, without the need to switch to an outside platform.

Dr Leslie McIntosh, Vice President of Research Integrity at Digital Science, said Dimensions Author Check API is designed to support consistent and confident editorial decision-making.

“By highlighting key indicators of research integrity – such as retractions, tortured phrases, or unusual co-authorship patterns – the Dimensions Author Check API helps to rapidly identify potential issues for concern. These include continuously improving indicators that will identify paper mills and increase trust in science,” Dr McIntosh said.

“Importantly, the Author Check API can do this at scale, giving publishers the ability to screen multiple researchers per request. This makes it ideal for high-volume manuscript processing and broader editorial oversight.”

Key benefits of the new Dimensions Author Check API include:

  • Seamless integration: A standards-based RESTful API designed for easy deployment within publishers’ internal systems or third-party platforms.
  • Actionable insights: Clear summaries highlighting key aspects of researchers’ publication and collaboration histories.
  • Operational efficiency: Reducing editorial workload while enhancing the quality and consistency of integrity assessments.
  • Support for transparency and trust: Surfacing critical integrity information at key decision points, strengthening publishers’ ability to adhere to ethical standards.
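As a rough illustration of the batch-screening idea described above, the sketch below splits an author list into fixed-size request payloads. The payload shape, field name, and batch limit here are invented for illustration only; they are not the actual Dimensions Author Check API schema, which is defined in its own documentation.

```python
from typing import Iterator

# Hypothetical per-request limit -- the real API's limits are its own.
MAX_PER_REQUEST = 25

def batch_payloads(researcher_ids: list[str],
                   batch_size: int = MAX_PER_REQUEST) -> Iterator[dict]:
    """Split a manuscript's author list into one JSON-ready payload per
    API call, so many researchers can be screened per request."""
    for start in range(0, len(researcher_ids), batch_size):
        yield {"researchers": researcher_ids[start:start + batch_size]}

ids = [f"author-{n}" for n in range(60)]
payloads = list(batch_payloads(ids))
print(len(payloads))                     # 3 payloads for 60 ids at 25 per call
print(len(payloads[-1]["researchers"]))  # 10 ids in the final partial batch
```

Chunking like this keeps each request within whatever per-call limit a batch API enforces, while still letting a whole manuscript's author list be screened in a handful of calls.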

Note to editors: The Dimensions Author Check dashboard was originally announced in December 2024. This announcement is specific to the Dimensions Author Check API, which launches today.

About Dimensions

Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, X and LinkedIn.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Paris Declaration calls for data-driven forensics to spearhead the fight against fake science https://www.digital-science.com/blog/2024/12/paris-declaration-calls-for-data-driven-forensics-to-fight-fake-science/ Wed, 18 Dec 2024 09:50:53 +0000 Supporters of research integrity have signed the Paris Declaration, calling for data-driven forensics to spearhead the fight against fake science.


Research integrity champions say Forensic Scientometrics (FoSci) will decontaminate “polluted” science and scholarly literature

Wednesday 18 December 2024

Supporters of research integrity have signed a new declaration calling for data-driven forensics – known as Forensic Scientometrics (FoSci) – to lead the charge in detecting, exposing and even preventing fake science.

The Forensic Scientometrics (FoSci) Paris Declaration was drafted during an event in Paris last week organized and run by Digital Science’s VP of Research Integrity, Dr Leslie McIntosh. The event was hosted at Institut Universitaire de France (IUF) by Dr Guillaume Cabanac, research integrity “sleuth” and professor at the University of Toulouse, as part of his research chair titled “Decontamination of the scientific literature.”

The event involved researchers, experts, and professionals from around the world who are committed to upholding research integrity, many well-known sleuths among them. Attendees signed the declaration over the following weekend.

As the Declaration states, “Trustworthy science risks being obscured by a small but growing corpus of papers, people, organizations, and potentially governments polluting the integrity of research.”

And: “We care deeply about science, and we believe firmly in the ability of scientific study to decontaminate the scholarly literature. As a collective, we intend to do whatever we can to promote the consistency and reliability of scientific research output.”

“We want to dispel this pollution by flagging problematic papers, actors, and systems, mitigating the effects and disincentivizing such behavior in the future. Our goals are to prevent these errors from spreading, to promote better policies for scientific endeavours, and to safeguard the positive impact of science on society.”

FoSci is a forensic, data-driven initiative to uphold scientific integrity and public trust in science. It combines forensic investigation and scientometrics, which is the study of how research is shared and built upon. FoSci uncovers patterns that uphold or threaten the integrity of science itself.

The problems currently researched by forensic scientometricians include: author misrepresentation, data manipulation, fake conferences, image duplication, misconduct (including fabrication, falsification, and plagiarism), papermill operations, questionable research practices, sale of authorship and citations, sneaked references, stealth corrections, and tortured phrases.

The Declaration states that these problems have widespread and potentially damaging implications, through the citation of fraudulent research in patents, clinical guidelines, government policy, and more.

Dr McIntosh, one of the co-founders of the FoSci movement, said: “Forensic scientometrics is needed now more than ever. Scientific achievement is critical to our society’s health and wellbeing, to our economic and social prosperity, but we also live in a time when the community’s trust in science is constantly being eroded.

“What FoSci does is shine a light on questionable or deceitful practices in the world of science. Through collective action, we want to motivate those involved in producing and disseminating scientific research to produce consistent, valid, and high-quality work.”

Dr Cabanac said: “Our gathering of institutions, journalists, publishers, sleuths, and a leading scientometric data provider proved highly stimulating and productive. Meeting in person created a synergy that we, as a community, plan to sustain and put at the service of science.

“Unreliable bricks weaken the wall of knowledge that researchers have been building for centuries, one publication at a time. Collective action is required, both curative to prevent humans and AIs from learning from these, and preventive to design methods to stop new forms of misleading contents from entering the scientific record. This declaration is a call for action: join us.”

The FoSci Paris Declaration has made the following key commitments:

  • Advocate for transformation
    • Open a dialogue with policymakers to design disincentivizing strategies to tackle the mass production of problematic papers
    • Advocate for reform of institutions involved in scientific research based on the group’s findings
  • Develop expertise and share knowledge
    • Facilitate training for researchers and professionals exploring these questions
    • Share and provide research and data in the FoSci community
    • Establish a regular cycle of professional meetings
    • Improve the tools and methods of forensic scientometrics
  • Improve the group’s ability to communicate its findings
    • Inform editorial boards, publishers, research institutions, governments and all relevant involved parties about the group’s work
    • Participate in building software and tools to enable the reproducibility of their forensics findings
    • Establish points of contact between FoSci members and concerned organizations

The Paris event and its declaration are the culmination of a year of awareness-raising activities for Dr McIntosh, who has held workshops on FoSci in Athens, Los Angeles, Darwin and Sydney throughout 2024.

The signatories to the Paris Declaration hope that FoSci will become internationally recognized and taught at research institutions, particularly within research administration teams, but also among the academic community.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Scismic, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Sage first to adopt Dimensions Author Check, a new research integrity tool from Digital Science https://www.digital-science.com/blog/2024/12/sage-first-to-adopt-dimensions-author-check-for-research-integrity/ Mon, 02 Dec 2024 15:11:08 +0000 Sage has become the first publisher to adopt the new Dimensions Author Check – taking a significant step in advancing research integrity.


Monday 2 December 2024

Sage has taken a significant step in advancing research integrity by adopting Dimensions Author Check, an application from Digital Science that reviews researchers’ publication histories and networks to check for research integrity issues. This tool will help spot patterns in unethical scholarly behavior and is part of Sage’s ongoing effort to prevent low-quality research from being published and to preserve the integrity of the academic record.

Author Check works by using an extensive dataset to flag unusual activities that may require further investigation, such as indicators of paper mill involvement. It streamlines the historically labor-intensive and time-consuming author verification process, creating a more comprehensive view of an author’s research history.

“As an independent company, we have the freedom to think long-term and invest in meaningful solutions to address the pressing challenges in research integrity,” said Dr Adya Misra, Associate Director of Research Integrity at Sage. “With Author Check, we’ll be able to better channel our efforts alongside fair and rigorous investigation processes. I’m excited to use this tool and to be working with Digital Science to uphold trust in research.”

“Sage is harnessing cutting-edge technology developed by Digital Science, inspired by our innovative vision, to enhance the integrity of academic publishing by verifying authorship on research papers. This collaboration underscores our commitment to empowering publishers with tools that uphold trust and transparency in scholarly communication,” said Dr Leslie McIntosh, Vice President of Research Integrity at Digital Science. “Personally, seeing Dimensions Author Check evolve from an idea into a practical solution is thrilling. It’s incredibly rewarding to know that a respected company like Sage values the insight Author Check provides and will use it to strengthen trust in research.”

Sage is committed to publishing scholarship that is robust, accurate, and inclusive, reflecting the highest standards of research integrity. Read more about Sage’s efforts to support research integrity.

About Dimensions

Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. dimensions.ai. Follow @DSDimensions on X and LinkedIn.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Scismic, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

About Sage

Sage is a global academic publisher of books, journals, and library resources with a growing range of technologies to enable discovery, access, and engagement. Believing that research and education are critical in shaping society, 24-year-old Sara Miller McCune founded Sage in 1965. Today, we are controlled by a group of trustees charged with maintaining our independence and mission indefinitely.  

Our guaranteed independence means we’re free to: 

  • Do more – supporting an equitable academic future, furthering disciplines that drive social change, and helping social and behavioral science make an impact 
  • Work together – building lasting relationships, championing diverse perspectives, and co-creating resources to transform teaching and learning 
  • Think long-term – experimenting, taking risks, and investing in new ideas 

This press release was originally published on the Sage website.

TL;DR Shorts: Angela Saini on trust in research https://www.digital-science.com/blog/2024/10/tldr-shorts-angela-saini-on-trust-in-research/ Tue, 08 Oct 2024 15:42:27 +0000 Today’s episode of TL;DR Shorts features British science journalist, author, and broadcaster Angela Saini, who reflects on the role of science communication in the COVID-19 pandemic and how research information should be approached with sceptical caution, to avoid the spread of misinformation and disinformation.

Today’s episode of TL;DR Shorts once again features British science journalist, author, and broadcaster Angela Saini, who is best known for her work exploring the intersections of science, gender, race, and society. In this episode Angela reflects on the role of science communication in the COVID-19 pandemic and how accessing research information should be done with sceptical caution, to avoid the spread of misinformation and disinformation.

Angela Saini reflects on the COVID-19 pandemic and the role of research in our everyday lives. She also discusses how research information must be approached with healthy scepticism to combat misinformation and disinformation. Watch this video and others on the Digital Science YouTube channel.

Angela also reflects on the role of the internet in democratising access to research information. While the use of technology to share research is a positive development, Angela reminds us that any information we encounter must be approached with a healthy dose of scepticism, to avoid consuming and propagating inadvertent misinformation or deliberate disinformation.

Angela discusses her role as founder and chair of the Challenging Pseudoscience research group under the Royal Institution. Through their research, they have learned that belief in pseudoscience is not so much down to a lack of access to robust research as a challenge of confirmation bias. If someone reads an article without checking their own preconceptions or biases, they will cherry-pick elements of that research to suit their narrative. Reflecting on one of our previous TL;DR Shorts from Rodney Mullen, who advocates for a broader range of minds being involved in research, mitigating against confirmation bias is certainly one more factor to add to the list of reasons why we must diversify our research community.

If you’d like to contribute to that discussion, suggest future contributors for our series, or propose topics you’d like us to cover, drop Suze a message on one of our social media channels and use the hashtag #TLDRShorts. Subscribe now to be notified of each weekly release of the latest TL;DR Short, and catch up with the entire series here.

Mission-critical: Risks and repercussions presented by China’s research programs https://www.digital-science.com/blog/2024/10/mission-critical-risks-and-repercussions-presented-by-chinas-research-programs/ Mon, 07 Oct 2024 14:21:38 +0000 Is China strategically important to the west? How you answer that question might depend on how old you are.


Evaluating transparency and integrity risks in China’s research landscape

Is China strategically important to the west? How you answer that question might depend on how old you are. A recent survey showed that 36% of US residents under the age of 30 ranked China over the UK as America’s most valuable strategic partner, while just 4% of respondents over the age of 70 felt the same way. The disparity may be surprising, but the fact that the younger generation feels China is strategically more important to the US is perhaps not. But what does this mean for the world of research, with so many of these young people entering higher education and research programs in the US?

A new report published by the Center for Research Security & Integrity offers some sobering thoughts on problems faced by US researchers and others when engaging with their counterparts in China. The report – Transparency and Integrity Risks in China’s Research Ecosystem: A Primer and Call to Action – is a comprehensive analysis that identifies transparency and integrity risks posed by Chinese research programs and proposes ways to mitigate them, focusing on practices that contradict the norms of some liberal democracies.

The report, summarized below, delivers key findings and recommendations for countries such as the US in dealing with the potential threat these international research collaborations might pose. The findings around risk elements such as transparency and integrity ultimately show the importance of trusted partners like Digital Science in the research ecosystem, including its products Dimensions Research Security and Dimensions Research Integrity. Among the co-authors of the report is Digital Science’s VP of Research Integrity, Dr Leslie McIntosh.

Transparency risks

The report emphasizes a range of transparency issues present in China’s research ecosystem. These risks include:

  • Denial of Access: Institutions like the China Aerodynamics Research and Development Center (CARDC) obfuscate their ties to the People’s Liberation Army (PLA), while their websites are inaccessible to foreign viewers.
  • Website Discrepancies: The English-language websites of Chinese research institutions often omit critical details, such as departmental structures or research affiliations. For example, the Chinese Academy of Sciences Dalian Institute of Chemical Physics shows significant differences between its English and Chinese websites, with key research information missing from the English version.
  • Use of Alternative Names: Certain entities, such as the China Electronics Technology Group Corporation’s 13th Research Institute, operate under alternative names, potentially masking any defense affiliations.

Integrity concerns

The report also investigates integrity issues, which can be broader and more challenging to detect. China accounts for a significant portion of retracted scientific publications due to nefarious practices, including:

  • Deceptive Authorship: Researchers have been found to add foreign coauthors to publications to bolster credibility, while others use pseudonyms to avoid scrutiny.
  • Fraudulent Publications: The proliferation of “paper mills” that produce fake scientific papers is also regarded as a significant issue, particularly in medical and health sciences.

National security concerns

Many transparency and integrity concerns overlap with national security risks. The report shows:

  • Several Chinese research institutions have ties to the PLA and other defense entities, but these affiliations are often concealed in international collaborations.
  • Research institutions in liberal democracies may unwittingly collaborate with Chinese entities involved in military research.

Recommendations

Given these concerns, the report provides some timely ways forward in the following recommendations:

Collaboration on exposing transparency and integrity issues

Think tanks, NGOs and academic institutions in liberal democracies should work together to expand on the issues highlighted. The report calls for the creation of a China Transparency & Integrity Tracker, a tool to catalog PRC entities that violate transparency and integrity norms.

Government-led initiatives

Governments should take the lead in identifying and cataloging research misconduct. For instance, US government agencies should sponsor the development of large-scale monitoring systems for questionable publications, using techniques like forensic scientometrics. Governments should also share information on PRC entities and programs acting in bad faith, to assist research institutions in conducting due diligence.

Risk mitigation

Research institutions should develop policies that consider transparency and integrity when deciding whether to collaborate with PRC entities. This includes providing mechanisms for reporting problematic behaviors by Chinese partners. Institutions should also ensure that coauthors verify the integrity of the research they are involved in to prevent deceptive practices.

Public disclosure

Governments and research organizations should create public repositories of information on PRC institutions that have engaged in deceptive practices. These databases would allow informed decision-making when considering collaborations with Chinese research institutions.

Conclusion

Transparency and Integrity Risks in China’s Research Ecosystem: A Primer and Call to Action emphasizes the need for liberal democracies to take a more proactive approach in scrutinizing the transparency and integrity of research collaborations with China – actions that can be supported through the use of tools such as Dimensions Research Security. The risks posed by Chinese research institutions are substantial, and addressing these challenges requires collective action from governments, academic institutions, and civil society organizations. In other words, it’s time to act.

To learn more about how Digital Science and its Dimensions Research Security product can help your organization, arrange a call with one of our experts.

The post Mission-critical: Risks and repercussions presented by China’s research programs appeared first on Digital Science.

]]>
The TL;DR on… ERROR https://www.digital-science.com/blog/2024/09/tldr-error/ Wed, 25 Sep 2024 17:02:11 +0000 https://www.digital-science.com/?post_type=tldr_article&p=72358 We love a good deep dive into the awkward challenges and innovative solutions that are transforming the world of academia and industry. In this article and in the full video interview, we’re discussing an interesting new initiative that’s been making waves in the research community: ERROR.
Inspired by bug bounty programs in the tech industry, ERROR offers financial rewards to those who identify and report errors in academic research. ERROR has the potential to revolutionise how we approach, among other things, research integrity and open research by incentivising the thorough scrutiny of published research information and enhancing transparency.
Suze sat down with two other members of the TL;DR team, Leslie and Mark, to shed light on how ERROR can bolster trust and credibility in scientific findings, and explore how this initiative aligns with the principles of open research and how all these things can drive a culture of collaboration and accountability. They also discussed the impact that ERROR could have on the research community and beyond.

The post The TL;DR on… ERROR appeared first on Digital Science.

]]>
We love a good deep dive into the awkward challenges and innovative solutions transforming the world of academia and industry. In this article and in the full video interview, we’re discussing an interesting new initiative that’s been making waves in the research community: ERROR.

Inspired by bug bounty programs in the tech industry, ERROR offers financial rewards to those who identify and report errors in academic research. ERROR has the potential to revolutionize how we approach, among other things, research integrity and open research by incentivizing the thorough scrutiny of published research information and enhancing transparency.

I sat down with two other members of the TL;DR team, VP of Research Integrity Leslie McIntosh and VP of Open Research Mark Hahnel, to shed light on how ERROR can bolster trust and credibility in scientific findings, and explore how this initiative aligns with the principles of open research – and how all these things can drive a culture of collaboration and accountability. We also discussed the impact that ERROR could have on the research community and beyond.

ERROR is a brand new initiative created to tackle errors in research publications through incentivized checking. The TL;DR team sat down for a chat about what this means for the research community through the lenses of research integrity and open research.

Leslie’s perspective on ERROR

Leslie’s initial thoughts about ERROR were cautious, recognizing its potential to strengthen research integrity but also raising concerns about unintended consequences.

She noted that errors are an inherent part of the scientific process, and over-standardization might risk losing the exploratory nature of discovery. Drawing parallels to the food industry’s pursuit of efficiency leading to uniformity and loss of nutrients, Leslie suggested that aiming for perfection in science could overlook the value of learning from mistakes. She warned that emphasizing error correction too rigidly might diminish the broader mission of science – discovery and understanding.

Leslie: “Errors are part of science and part of the discovery… are we going so deep into science and saying that everything has to be perfect, that we’re losing the greater meaning of what it is to search for truth or discovery [or] understand that there’s learning in the errors that we have?”

Leslie also linked this discussion to open research. While open science encourages interpretation and influence from diverse participants, the public’s misunderstanding of scientific errors could weaponize these mistakes, undermining trust in research. She stressed that errors are an integral, even exciting, part of the scientific method and should be embraced rather than hidden.

Mark’s perspective on ERROR

Mark’s initial thoughts were more optimistic, especially within the context of open research.

Mark: “…one of the benefits of open research is we can move further faster and remove any barriers to building on top of the research that’s gone beforehand. And the most important thing you need is trust, [which] is more important than speed of publication, or how open it is, [or] the cost-effectiveness of the dissemination of that research.”

Mark also shared his excitement about innovation in the way we do research. He was particularly excited about ERROR’s approach to addressing the problem of peer review, as the initiative offers a new way of tackling longstanding issues in academia by bringing in more participants to scrutinize research.

He thought the introduction of financial incentives to encourage error reporting could lead to a more reliable research landscape.

“I think the payment for the work is the most interesting part for me, because when we look at academia and perverse incentives in general, I’m excited that academics who are often not paid for their work are being paid for their work in academic publishing.”

However, Mark’s optimism was not entirely without wariness. He shared Leslie’s caution about the incentives, warning of potential unintended outcomes. Financial rewards might encourage individuals to prioritize finding errors for profit rather than for the advancement of science, raising ethical concerns.

Ethical concerns with incentivization

Leslie expressed reservations about the terminology of “bounty hunters”, which she felt criminalizes those who make honest mistakes in science. She emphasized that errors are often unintentional.

Leslie: “It just makes me cringe… People who make honest errors are not criminals. That is part of science. So I really think that ethically when we are using a term like bounty hunters, it connotes a feeling of criminalization. And I think there are some ethical concerns there with doing that.”

Leslie’s ethical concerns extended to the global research ecosystem, noting that ERROR could disproportionately benefit well-funded researchers from the Global North, leaving under-resourced researchers at a disadvantage. She urged for more inclusive oversight and diversity in the initiative’s leadership to prevent inequities.

She also agreed with Mark about the importance of rewarding researchers for their contributions. Many researchers do unpaid labor in academia, and compensating them for their efforts could be a significant positive change.

Challenges of integrating ERROR with open research

ERROR is a promising initiative, but I wanted to hear about the challenges in integrating a system like this alongside existing open research practices, especially when open research itself is such a broad, global and culturally diverse endeavor.

Both Leslie and Mark emphasized the importance of ensuring that the system includes various research approaches from around the world.

Mark: “I for one think all peer review should be paid and that’s something that is relatively controversial in the conversations I have. What does it mean for financial incentivization in countries where the economics is so disparate?”

Mark extended this concept of inclusion to the application of artificial intelligence (AI), machine learning (ML) and large language models (LLMs) in research, noting that training these technologies requires access to diverse and accurate data. He warned that if certain research communities are excluded, their knowledge may not be reflected in the datasets used to build future AI research tools.

“What about the people who do not have access to this and therefore their content doesn’t get included in the large language models, and doesn’t go on to form new knowledge?”

He also expressed excitement about the potential for ERROR to enhance research integrity in AI and ML development. He highlighted the need for robust and diverse data, emphasizing that machines need both accurate and erroneous data to learn effectively. This approach could ultimately improve the quality of research content, making it more trustworthy for both human and machine use.

Improving research tools and integrity

Given the challenges within research and the current limitations of tools like ERROR, I asked Leslie what she would like to see in the development of these and other research tools, especially within the area of research integrity. She took the opportunity to reflect on the joy of errors and failure in science.

Leslie: “If you go back to Alexander Fleming’s paper on penicillin and read that, it is a story. It is a story of the errors that he had… And those errors were part of or are part of that seminal paper. It’s incredible, so why not celebrate the errors and put those as part of the paper, talk about [how] ‘we tried this, and you know what, the refrigerator went out during this time, and what we learned from the refrigerator going out is that the bug still grew’, or whatever it was.

“You need those errors in order to learn from the errors, meaning you need those captured, so that you can learn what is and what is not contributing to that overall goal and why it isn’t. So we actually need more of the information of how things went wrong.”

I also asked Mark what improvements he would like to see from tools like ERROR from the open research perspective. He emphasized the need for better metadata in research publishing, especially in the context of open data. Drawing parallels to the open-source software world, where detailed documentation helps others build on existing work, he suggested that improving how researchers describe their data could enhance collaboration.

Mark also feels that the development of a tool like ERROR highlights other challenges in the way we are currently publishing research, such as deeper issues with peer review, or incentives for scholarly publishing.

Mark: “…the incentive structure of only publishing novel research in certain journals builds into that idea that you’re not going to publish your null data, because it’s not novel and the incentive structure isn’t there. So as I said, could talk for hours about why I’m excited about it, but I think the ERROR review team have a lot of things to unpack.”

Future of research integrity and open research

What do Leslie and Mark want the research community to take away from this discussion on error reporting and its impact on research integrity and open research?

Leslie wants to shine a light on science communication and its role in helping the public to understand what ERROR represents, and how it fits into the scientific ecosystem.

Leslie: “…one of the ways in which science is being weaponized is to say peer review is dead. You start breaking apart one of the scaffolds of trust that we have within science… So I think that the science communicators here are very important in the narrative of what this is, what it isn’t, and what science is.”

Both Leslie and Mark agreed that while ERROR presents exciting possibilities, scaling the initiative remains a challenge. Mark raised questions about how ERROR could expand beyond its current scope, with only 250 papers reviewed over four years and each successful error detection earning a financial reward. Considering the millions of papers published annually, it is unclear how ERROR can be scaled globally and become a sustainable solution.

Mark: “…my biggest concern about this is, how does it scale? A thousand francs a pop, it’s 250 papers. There [were] two million papers [published] last year. Who’s going to pay for that? How do you make this global? How do you make this all-encompassing?”
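Mark’s scaling worry is easy to quantify. Here is a hypothetical back-of-envelope calculation – not from ERROR itself, just using the figures Mark cites (1,000 Swiss francs per paper, roughly two million papers published last year):

```python
# Illustrative only: what would ERROR's bounty model cost at global scale,
# using the figures cited in the interview?
bounty_chf = 1_000          # reward per reviewed paper
papers_per_year = 2_000_000 # approximate annual publication volume

annual_cost = bounty_chf * papers_per_year
print(f"CHF {annual_cost:,} per year")  # CHF 2,000,000,000 per year
```

At that rate, blanket coverage would run to billions of francs a year, which underlines why scaling the bounty model globally remains an open question.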

Conclusion

It is clear from our discussion that ERROR represents a significant experimental step toward enhancing both research integrity and open research through its incentivized bug-hunting system.

Leslie has highlighted how the initiative can act as a robust safeguard, ensuring that research findings are more thoroughly vetted and reliable, though she reminds us that the approach needs to be inclusive. Mark has also emphasized the potential of a tool like this to make publication processes more efficient – and even, finally, to reward researchers for all the additional work they do – but he wonders how it can scale up to foster a more transparent, collaborative research environment that aligns with the ethos of open research.

Leslie and Mark’s comments are certainly timely, given that the theme of Digital Science’s 2024 Catalyst Grant program is innovation for research integrity. You can find out more about how different segments of research can and should be contributing to this space by reading our TL;DR article on it here.

We look forward to exploring more innovations and initiatives that are going to shape – or shatter – the future of academia, so if you’d like to suggest a topic we should be discussing, please let us know.

The post The TL;DR on… ERROR appeared first on Digital Science.

]]>
Innovation and integrity across all research segments to safeguard the future of research https://www.digital-science.com/blog/2024/09/innovation-and-integrity-to-safeguard-the-future-of-research/ Wed, 18 Sep 2024 07:00:00 +0000 https://www.digital-science.com/?post_type=tldr_article&p=73293 Maintaining integrity and security is paramount in the ever-evolving landscape of research. While science, technology and medicine publishers have made significant strides in this area, the importance of innovative solutions in this space extends beyond publishing. It is within this environment that Digital Science is now seeking technology-driven ideas to safeguard research integrity and support trust in science. This is the focus of our Catalyst Grant round for 2024.

The post Innovation and integrity across all research segments to safeguard the future of research appeared first on Digital Science.

]]>

Innovation meets integrity: Digital Science’s call for tools that build trust in research

Maintaining integrity and security is paramount in the ever-evolving landscape of research. While science, technology and medicine (STM) publishers have made significant strides in this area, the importance of innovative solutions in this space extends beyond publishing.

It is within this environment that Digital Science is now seeking technology-driven ideas to safeguard research integrity and support trust in science. This is the focus of our Catalyst Grant round for 2024.

Digital Science is also one of the prime organizations behind the push for a new field of research integrity forensics, known as Forensic Scientometrics (FoSci).

Ensuring research integrity is a collective responsibility that benefits all segments of the research ecosystem, from individual researchers to governments to industrial organizations. Here are the key stakeholders and the opportunities for each to bolster research integrity and security.

Researchers

Innovative frameworks and tools help researchers maintain high ethical standards by preventing misconduct such as fabrication, falsification, and plagiarism. However, much more innovation is needed to uphold integrity in the research ecosystem, including solutions for academic institutions, governments, funders, and more. These innovations ensure the authenticity and reliability of research findings. Additionally, advanced training programs on research ethics and security protocols can equip researchers with the necessary knowledge and skills to conduct their work responsibly, fostering a culture of integrity from the outset.

Academic institutions

Strengthened policies and procedures are essential for ensuring compliance with ethical standards and security protocols, significantly reducing the risk of breaches and misconduct, safeguarding the institution’s integrity, and supporting research and researchers. A strong commitment to high standards of integrity and security enhances an institution’s reputation, attracting top talent and funding, and solidifying its standing in the academic community.

Investing in infrastructure and resources to support research integrity and security such as secure data storage systems and comprehensive training programs is crucial for fostering a culture of integrity. Additionally, improved frameworks for safe and ethical collaboration with external partners facilitate partnerships with other academic institutions and industry, ensuring that these collaborative efforts adhere to high standards of integrity and security.

Publishing

Adopting advanced technologies to enhance the peer review process is crucial for ensuring the integrity and quality of published research. These technologies help to maintain rigorous review standards, upholding the credibility of scientific literature. Additionally, sophisticated detection tools are vital for preventing the publication of unoriginal or unethical work, and safeguarding the originality and integrity of research publications.

Improved mechanisms and processes for transparently handling retractions and corrections are necessary to maintain the credibility of scientific literature, grants and patents, ensuring that errors are addressed promptly and openly. Ensuring the security of submitted manuscripts and associated data is also a top priority, as well as protecting intellectual property and sensitive information, maintaining the trust of authors and readers alike.

Governments and funders

Creating and enforcing robust policies and regulations is essential for promoting research integrity and security, ensuring public trust in funded research, and providing a clear framework for ethical conduct. Prioritizing funding for projects and institutions that demonstrate firm commitments to integrity and security ensures that resources are allocated to trustworthy and responsible research endeavors.

Enhanced mechanisms for monitoring and auditing funded research are crucial for ensuring accountability and transparency in public funds and building public confidence in the research process. Furthermore, establishing international standards and agreements promotes global research integrity and security, facilitating cross-border collaborations and driving scientific progress on a global scale.

Corporate industrial research organizations

Advanced methods for protecting intellectual property and proprietary data are crucial for maintaining a competitive advantage and ensuring compliance with legal requirements, safeguarding valuable research assets. Secure and ethical frameworks for collaborating with academic researchers ensure mutual benefits and adherence to integrity standards, driving innovation while upholding high ethical standards.

Balancing innovation with compliance is essential to ensure that cutting-edge research aligns with ethical and security standards, fostering a responsible and forward-thinking research environment. Developing comprehensive risk management strategies is also vital for mitigating potential breaches in research integrity and security, protecting the organization’s reputation and research investment.

Impact on the research community as a whole

Trust and credibility

Improved trust and credibility of research outputs benefit the entire research ecosystem. Enhanced public confidence in scientific findings drives support for further research and innovation.

Efficiency and productivity

Streamlined processes and tools for ensuring research integrity and security lead to more efficient and productive research environments. This efficiency accelerates scientific discovery and application, the impact of which can be felt by everyone.

Global collaboration

Harmonized standards and practices facilitate international research collaborations. These collaborations drive global scientific progress, and address pressing challenges that transcend borders.

Innovative solutions in research integrity are crucial for all segments of the research ecosystem. While STM publishers have pioneered efforts in this domain, the broader research community must continue to build on these foundations. By fostering a culture of integrity, we can ensure that research remains a trusted and vital force for advancing knowledge and improving lives worldwide.

The post Innovation and integrity across all research segments to safeguard the future of research appeared first on Digital Science.

]]>
Shining a light on conflict of interest statements https://www.digital-science.com/blog/2024/09/shining-a-light-on-conflict-of-interest-statements/ Thu, 05 Sep 2024 14:56:41 +0000 https://www.digital-science.com/?p=73188 A Digital Science study of conflict of interest statements highlights the need for a more careful appraisal of published research.

The post Shining a light on conflict of interest statements appeared first on Digital Science.

]]>
Understanding the complexities of conflict of interest disclosures in research

Authors either have a conflict of interest or not, right? Wrong. Research from Digital Science has uncovered a tangled web of missing statements, errors, and subterfuge, which highlights the need for a more careful appraisal of published research.

At this year’s World Conference on Research Integrity, a team of researchers from Digital Science led by Pritha Sarkar presented a poster with findings from their deep dive into conflict of interest (COI) statements. Entitled Conflict of Interest: A data driven approach to categorisation of COI statements, the work initially aimed to create a binary model that determines whether a COI statement is present in an article.

However, all was not as it seemed. While some articles had no COI statement and some had one, the statements present covered a number of different areas, leading the team to think COIs might represent a spectrum rather than a binary.

Gold standard

Conflict of interest is a crucial aspect of academic integrity. Properly declaring a COI is essential for other researchers to assess any potential bias in scholarly articles. However, even when a COI statement is present, researchers often find it inadequate or misleading in some way.

The Digital Science team – all working on research integrity with Dimensions – soon realized the data could be leveraged further to explore the nuances of COI statements. After further research and analysis, it became clear that COI statements could be categorized into six distinct types:

  1. None Declared
  2. Membership or Employment
  3. Funds Received
  4. Shareholder, Stakeholder or Ownership
  5. Personal Relationship
  6. Donation

This analysis involved manually annotating hundreds of COI statements, supported by Natural Language Processing (NLP) tools. The aim was to create a gold standard that could be used to categorize all other COI statements. However, despite the team’s diligence, a significant challenge persisted in the shape of ‘data skewness’ – an imbalance in the distribution of data within a dataset that can distort data processing and analytics.
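To make the six-way categorization concrete, here is a deliberately simple keyword-rule annotator – a hypothetical sketch, not the team’s actual NLP pipeline. The category names come from the poster; the cue phrases and the function itself are invented for illustration:

```python
# Hypothetical keyword-rule COI annotator. A real NLP model would handle
# negation, synonyms and context; this only illustrates the categorization idea.
CATEGORIES = {
    "Membership or Employment": ["employee", "employed by", "member of", "board member"],
    "Funds Received": ["received funding", "grant from", "honoraria", "consulting fees"],
    "Shareholder, Stakeholder or Ownership": ["shareholder", "stock", "equity"],
    "Personal Relationship": ["spouse", "family member", "personal relationship"],
    "Donation": ["donation", "donated"],
}

def categorize_coi(statement: str) -> list[str]:
    """Return every category whose cue phrases appear; default to 'None Declared'."""
    text = statement.lower()
    hits = [cat for cat, cues in CATEGORIES.items()
            if any(cue in text for cue in cues)]
    return hits or ["None Declared"]

print(categorize_coi("J.S. is an employee of Acme Pharma and holds stock options."))
print(categorize_coi("The authors declare no competing interests."))
```

Even this toy version shows why a single statement can map to several categories at once – one reason a binary present/absent model proved too coarse.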

Fatal flaw

One irresistible explanation for the data skewness was simple: authors weren’t truthfully reporting their conflicts of interest. But could this really be true?

The gold standard approach came from manually and expertly annotating COI statements to develop an auto-annotation process. However, despite the algorithm’s ability to auto-annotate 33,812 papers in just 15 minutes, the skewness that had initially been identified persisted, lending further weight to the theory that authors were falsely reporting (see Figure 1 of the COI Poster).

To test this hypothesis further, the team analyzed the Retraction Watch database, where the troubling trend – including the discrepancy between reported COI category and retraction reason – became even more apparent (see Figure 2 of the COI Poster).

Moreover, as the investigation continued, the team found 24,289 overlapping papers in Dimensions GBQ and Retraction Watch; among those, 393 had been retracted due to conflict of interest. Of those 393 papers, 134 had a COI statement – yet 119 of them declared there was no conflict of interest.

Conclusion

Underreporting and misreporting conflict of interest statements or types can undermine the integrity of scholarly work. Other research integrity issues – paper mills, plagiarism and predatory journals – have already damaged public trust in published research, so further problems with COIs can only worsen the situation. With this evidence in hand, it is clear that all stakeholders in the research publication process must adopt standard practices for reporting critical trust markers such as COI, to uphold transparency and honesty in scholarly endeavors.

To finish on a positive note, this research poster was awarded second place at the 2024 World Conference on Research Integrity, showing that the team’s work has already attracted considerable attention among those seeking to safeguard research integrity and trust in science.

You can find the poster on Figshare: https://doi.org/10.6084/m9.figshare.25901707.v2

Partial data and the code for this project are also available on Figshare.

For more on the topic of research integrity, see details of Digital Science’s Catalyst Grant award for 2024, which focuses on digital solutions around this topic.

The post Shining a light on conflict of interest statements appeared first on Digital Science.

]]>
Digital Science’s Catalyst Grant calls for innovations to safeguard research integrity https://www.digital-science.com/blog/2024/08/catalyst-grant-to-safeguard-research-integrity/ Mon, 19 Aug 2024 10:50:46 +0000 https://www.digital-science.com/?post_type=press-release&p=72934 Digital Science’s 2024 Catalyst Grant round is driven by a need to address one of the most pressing issues faced by all research stakeholders: research integrity.

The post Digital Science’s Catalyst Grant calls for innovations to safeguard research integrity appeared first on Digital Science.

]]>

Up to £25,000 to be won for tech ideas that support trust in science

Monday 19 August 2024

Digital Science is seeking innovative, technology-driven ideas to safeguard research integrity and support trust in science, as the focus of its Catalyst Grant round for 2024.

Up to £25,000 will be awarded to individuals or startups for innovative technology ideas.

Launched with the campaign We Believe in… Research Integrity, this year’s Catalyst Grant round is driven by the need to address one of the most pressing issues facing all research stakeholders, one that directly affects the community’s trust in science.

The application deadline is Monday 14 October 2024, 12:00pm BST / 7:00am EDT.

Join the conversation on social media with: #CatalystGrant #ResearchIntegrity

Steve Scott, Director of Portfolio Development at Digital Science, says: “Now in its 14th year, the Digital Science Catalyst Grant supports innovation, cultivating early-stage software ideas and enabling them to come to fruition.

“This year’s focus on Research Integrity recognizes the very real issues facing researchers, academic institutions, publishers, governments, and funding bodies, and the need for improved public trust in research and its benefits for society.

“Our 2024 Catalyst Grant round is now looking for the best and most innovative uses of technology to support Research Integrity and Trust in Science,” he says.

Dr Leslie McIntosh, VP of Research Integrity at Digital Science, co-founded the company Ripeta, a 2017 Catalyst Grant winner. Today, Ripeta’s ‘trust markers’ technology underpins Digital Science’s products Dimensions Research Integrity and Dimensions Research Security.

Dr McIntosh says: “Trust in research is the bedrock of healthy societies, and research integrity is a critical challenge in today’s research ecosystem. Safeguarding this integrity is the responsibility of everyone involved in research – policymakers, corporations, publishers, institutions, and researchers alike. While we face philosophical issues in society, we urgently need tangible solutions. We must strengthen and reimagine research integrity to uphold trust in the face of recent changes in open science and technological advances.

“As a past winner of the Catalyst Grant, I’m excited the grant might unearth another outstanding technology that will help safeguard the integrity of the scholarly record,” she says.

About Catalyst Grant

The Digital Science Catalyst Grant is an international initiative to support innovation in new software tools and technologies to advance research and create meaningful change.

The program supports and invests in early-stage ideas in the novel use of technology, with an award of up to £25,000 for the most promising ideas that aid research and further its impact on society.

Now in its 14th year, the Catalyst Grant will be awarded to innovative individuals or startups, without the need for a complete business or development plan. Several previous Catalyst Grant winners have developed important products and solutions within Digital Science itself.

Research Integrity – background

Public trust in scientific research has taken a downturn, accelerated by the pandemic. The UK Research Integrity Office (UKRIO) reports a 71% increase in formal requests regarding integrity issues since 2007 (including plagiarism, falsification, research ethics, publication ethics and authorship, financial mismanagement, and conflicts of interest), with a third from the health and biomedical sector. Yet, many people and small companies are innovating and implementing solutions to improve research.

Research Integrity – the Catalyst Grant Focus

For Catalyst Grant 2024, Digital Science is looking for novel applications of technology to support research integrity and security in areas such as:

  • Accountability and Transparency: Enhanced mechanisms for monitoring funded research, ensuring accountability and transparency in the use of public funds.
  • Ethical Standards: Improved frameworks and tools for maintaining high ethical standards, preventing misconduct (such as plagiarism, data fabrication, and falsification), and preventing or identifying scientific disinformation.
  • Efficiency and Productivity: Streamlined processes and tools for ensuring research integrity and security, leading to more efficient and productive research environments.
  • Global Collaboration: Harmonized standards and practices facilitating international research collaborations, driving global scientific progress.

Apply for the Digital Science Catalyst Grant

The Digital Science Catalyst Grant is now open for entries. Key details:

  • Visit the Digital Science Catalyst Grant website for full eligibility criteria and how to apply
  • Open globally to individuals and startups with early-stage software ideas
  • Focus on technologies that safeguard Research Integrity and build Trust in Science
  • Questions about the Catalyst Grant to be directed to: catalyst@digital-science.com
  • Deadline: Monday 14 October 2024, 12pm BST / 7am EDT.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, ReadCube, Symplectic, IFI CLAIMS Patent Services, Overleaf, Writefull, OntoChem, Scismic and metaphacts – we believe when we solve problems together, we drive progress for all. Visit www.digital-science.com and follow @digitalsci on X or on LinkedIn.

Media contacts

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

TL;DR Shorts: Mariette DiChristina on trust and civic science
https://www.digital-science.com/blog/2024/07/tldr-shorts-mariette-dichristina-on-trust-and-civic-science/
Tue, 16 Jul 2024

This TL;DR Tuesday we’re talking about civic science – the democratisation of knowledge production through engaging with impacted communities to co-produce scientific solutions – as a tool for building trust in research with Boston University’s Mariette DiChristina, Dean of the College of Communication.

The post TL;DR Shorts: Mariette DiChristina on trust and civic science appeared first on Digital Science.

This week’s TL;DR Shorts episode features Mariette DiChristina, Dean of the College of Communication at Boston University, and former Editor-in-Chief of Scientific American, talking about the role of civic science in building trust in research.

Mariette reminds us of the role communication plays in building trust in research, something she touched on in a previous TL;DR Shorts episode. Here she extends the idea from dissemination to true engagement: a two-way listening and learning process in which communities affected by science can themselves be involved in the creation and development of novel research.


Also known as public engagement or patient and public involvement (PPI), civic science takes an interdisciplinary approach, integrating scientific research with community engagement and participation to address societal challenges more effectively. By democratising knowledge production and involving affected communities in the scientific process, it ensures that research reflects the public’s actual needs and values. Civic science emphasises collaboration between scientists, policymakers, and the public to co-create solutions that are scientifically sound and socially relevant. This enhances the transparency, accountability, and impact of scientific research, while also fostering a more informed and engaged society.

If you’d like to suggest future contributors for our series or suggest some topics you’d like us to cover, drop Suze a message on one of our social media channels and use the hashtag #TLDRShorts. Subscribe now to be notified of each weekly release of the latest TL;DR Short, and catch up with the entire series here.
