Computer and Information Sciences

Research Organization Registry (ROR)

The Research Organization Registry (ROR) is a global, community-led registry of open persistent identifiers for research organizations.

Explainers · Computer and Information Sciences

Have you ever wondered exactly what happens once you request a new ROR record or suggest a change to an existing ROR record? In this blog post, we take you through all the steps involved in ROR's open, community-driven process for making sure that the information in the ROR registry is complete and accurate.

Case Studies · Computer and Information Sciences

In this interview with HighWire Press's Tony Alves, we learn that thanks to customer requests and a PID-aware development process, the publishing platform DigiCore Pro uses ROR in form lookups and automatic extraction processes for author affiliations, funder identification, peer reviewer affiliations, user disambiguation, and research integrity.
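
The interview does not share DigiCore Pro's internal code, but for readers wondering what a ROR-backed form lookup can involve, here is a minimal sketch against the public ROR API's v1 search endpoint (the function name and result limit are illustrative, not part of any platform's implementation):

```python
import requests

ROR_API = "https://api.ror.org/organizations"

def lookup_organizations(typed_name: str, limit: int = 5):
    """Return (ROR ID, name) candidates for a string typed into an affiliation field."""
    resp = requests.get(ROR_API, params={"query": typed_name}, timeout=10)
    resp.raise_for_status()
    items = resp.json().get("items", [])
    return [(org["id"], org["name"]) for org in items[:limit]]

if __name__ == "__main__":
    for ror_id, name in lookup_organizations("Howard Hughes Medical Institute"):
        print(ror_id, name)
```

A typeahead widget would call something like this on each (debounced) keystroke, let the author pick from the candidates, and store the ROR ID alongside the displayed name.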

Case Studies · Computer and Information Sciences

In this dual case study, we learn why the Howard Hughes Medical Institute (HHMI) relies on OA.Report and why OA.Report relies on ROR to help HHMI track compliance with its open access policy. “Even back then [in 2019], the best option was to lean on a big, community-owned solution. And it’s been great to see ROR effectively become the standard, the clear way forward for identifying organizations.” “We think ROR is terrific.”

Adoption News · Computer and Information Sciences

OpenAlex has added a new metadata matching strategy co-developed by ROR and Crossref to its affiliation matching processes. ROR is also investigating the prospect of incorporating this new matching strategy into the ROR API in 2025. If you’ve been reading our recent series of blog posts about metadata matching, you know that automatic metadata matching at scale is a topic dear to our hearts.
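
The new strategy itself is described in the linked post, but the ROR API already exposes affiliation matching today through its affiliation parameter. Here is a minimal sketch of calling that existing endpoint (the function name is illustrative; the response handling assumes the v1 items/chosen/score/organization fields):

```python
import requests

ROR_API = "https://api.ror.org/organizations"

def match_affiliation(raw_affiliation: str):
    """Match a raw affiliation string to a ROR record via the ROR API."""
    resp = requests.get(ROR_API, params={"affiliation": raw_affiliation}, timeout=10)
    resp.raise_for_status()
    items = resp.json().get("items", [])
    # The service flags at most one result as "chosen" when it is confident.
    for item in items:
        if item.get("chosen"):
            org = item["organization"]
            return org["id"], org["name"], item["score"]
    return None  # no confident match; a pipeline might fall back to manual review

if __name__ == "__main__":
    print(match_affiliation(
        "Department of Computer Science, University of Oxford, Oxford, UK"))
```

This reflects the endpoint as it exists now; whether and when the ROR/Crossref strategy mentioned above lands in the ROR API is, per the post, still under investigation.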

Explainers · Computer and Information Sciences

The fifth and final blog post about metadata matching by ROR’s Adam Buttrick and Crossref’s Dominika Tkaczyk outlines a set of pragmatic criteria for making decisions about metadata matching. Read all posts in the series on metadata matching. In our previous entry in this series, we explained that thorough evaluation is key to understanding a matching strategy’s performance.