Building tools to help fact checkers

    2018-10-02

    In early 2016, Google and schema.org, working with academia and the fact-checking community, helped define a new open structured-data standard, ClaimReview. With it, tech companies began to highlight high-quality fact-checking content, starting with Google through the fact check tag and rich snippets in Search. Bing, Facebook, and others soon followed.

    In May 2018, we introduced datacommons.org, an initiative for the open sharing of data, and released the first fact check corpus to help academia and practitioners study misinformation. This sample dataset of fact checks came from a small but diverse set of publishers. Even so, the release of this open dataset generated greater interest from researchers around the world in studying misinformation. We also received requests from academia to update the fact check corpus regularly and to let more publishers and non-technical users add ClaimReview markup.

    In our effort to continue bolstering high-quality information on the Web, today we are pleased to share two new tools from Google, starting in a limited beta with the academic and journalist communities: Fact Check Explorer and Fact Check Markup Tool.

    Both tools can be found on Google's newly created experimental Fact Check Tools site. The Fact Check Explorer (FCE) acts as a search engine for fact-checking content, letting users easily search for and find fact checks relevant to a topic of interest. Was a new species of trout discovered in Pennsylvania? Is the WhatsApp rumor about India’s government distributing free cycles to students true? What topics has a certain publisher fact checked? All of this can be found in the tool.

    The Fact Check Markup Tool complements this by letting journalists and fact checkers add ClaimReview markup to their articles without writing code or modifying their CMS. Here’s how it works: a journalist copies the URL of an article they want to mark up, pastes it into the tool, completes a form (e.g., who made the claim? What is the verdict?), and clicks "submit". Once submitted, the data is shared openly on datacommons.org for any interested company to access and programmatically incorporate into its products. Note that journalists can only create markup for pages from their own publications.
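    To make the form-to-markup mapping concrete, here is a minimal sketch of the kind of ClaimReview record the tool's fields correspond to, using the schema.org vocabulary. The URL, publisher names, and verdict below are hypothetical, and the helper function is our own illustration, not the tool's actual implementation.

```python
import json

def build_claim_review(article_url, claim, claimant, verdict, reviewer):
    """Assemble a minimal ClaimReview record (schema.org vocabulary).

    The argument values are illustrative; the real Markup Tool collects
    equivalent fields through its web form.
    """
    return {
        "@context": "https://schema.org",
        "@type": "ClaimReview",
        "url": article_url,              # the fact-checking article itself
        "claimReviewed": claim,          # the claim being assessed
        "itemReviewed": {
            "@type": "Claim",
            "author": {"@type": "Organization", "name": claimant},
        },
        "author": {"@type": "Organization", "name": reviewer},
        "reviewRating": {
            "@type": "Rating",
            "alternateName": verdict,    # e.g. "False", "Mostly true"
        },
    }

# Hypothetical example mirroring the trout question above.
markup = build_claim_review(
    article_url="https://example-factchecker.org/trout-check",
    claim="A new species of trout was discovered in Pennsylvania.",
    claimant="Social media users",
    verdict="False",
    reviewer="Example Fact Checker",
)
print(json.dumps(markup, indent=2))
```

    Publishers embedding this markup in a page would typically place it in a JSON-LD script tag, which is how search engines discover it.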

    [Screenshot of the Fact Check Explorer]
    [Screenshot of the Fact Check Markup Tool, now in beta.]

    These two tools are being made available on limited release to select journalists and fact-checking organizations. The International Fact Checking Network will determine those users, in line with their Code of Principles. Over time, access to these tools will be opened more broadly.

    In line with our commitment to continued sharing of data, all ClaimReview markup created via the Fact Check Markup Tool will be available via the Fact Check Markup Tool Data Feed in DataFeed format and will be updated on a frequent, regular basis. We are also releasing an updated version of the research Fact Check Corpus. The metadata in both datasets follows the open schema.org standard (ClaimReview), which has been adopted by most fact-checking organizations and is currently used by Google News, Google Search, Bing News, Bing Search, Facebook, and others. The "url" field points to the original fact-checking articles, whose content is not part of the released datasets and resides on the publishers' sites.
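    A consumer of the feed might process it roughly as follows. This is a sketch under the assumption that the feed follows the schema.org DataFeed shape, where a `dataFeedElement` list carries the items; the feed excerpt and URL below are made up for illustration and the real feed's exact layout may differ.

```python
import json

# Hypothetical feed excerpt in schema.org DataFeed shape.
feed_json = """
{
  "@context": "https://schema.org",
  "@type": "DataFeed",
  "dataFeedElement": [
    {
      "@type": "DataFeedItem",
      "item": {
        "@type": "ClaimReview",
        "url": "https://example-factchecker.org/trout-check",
        "claimReviewed": "A new species of trout was discovered in Pennsylvania.",
        "reviewRating": {"@type": "Rating", "alternateName": "False"}
      }
    }
  ]
}
"""

feed = json.loads(feed_json)

# Walk the feed and pull out each ClaimReview's claim and verdict.
for element in feed.get("dataFeedElement", []):
    review = element.get("item", {})
    if review.get("@type") == "ClaimReview":
        verdict = review.get("reviewRating", {}).get("alternateName")
        print(review["claimReviewed"], "->", verdict)
```

    Note that, as described above, only the metadata travels in the feed; the article body stays on the publisher's site behind the `url` field.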

    We would like to thank all the contributors to this launch, and especially the International Fact-Checking Network and the Duke Reporters' Lab for their continued partnership.

    ClaimReview was developed three years ago through a partnership of the Duke University Reporters’ Lab, Google, and schema.org. A pioneer of the digital fact-check movement, Duke’s Bill Adair is also announcing today, with the help of a grant from the Google News Initiative, a new global effort to help journalists and fact checkers adopt ClaimReview in their fact check operations. Read more about the effort here.



    Academia, Publishers and Tech Come Together to Open Up Fact Check Data

    2018-05-02

    The Internet began as an academic venture, intended to foster greater exchanges between members of different research communities. Since its inception, many of its major advancements have come from the academic world.

    Today, as the Internet and the web play an increasingly important role in our everyday lives, we all benefit from that work. This growth has not gone unchallenged: the Internet has faced threats including DNS spoofing, browser malware, and spam. In each case, academia and industry have joined forces to address the attacks.

    Today, the threat takes the form of misinformation. The Internet has empowered individuals to communicate and publish like never before, and malicious actors seek to exploit this to undermine the systems we have built.

    We wish to attract the energies of academics to fight this new threat. Achieving this first requires a better understanding of the phenomenon, which is best facilitated through the sharing of data. Today we launch datacommons.org, a schema.org-like initiative for the open sharing of data. We start with a dataset aimed at helping us understand the characteristics of misinformation.

    Fact checks offer a significant lens into the world of misinformation. In this release we are providing the metadata associated with a sample of fact checks from a number of different sources. The metadata follows the open schema.org standard (ClaimReview) that has already been adopted by most fact-checking organizations and is currently used by Google News, Google Search, Bing News, Bing Search, and others.

    This release is a small step toward making it easier for researchers around the world to study misinformation and publish findings based on a publicly available dataset. In the future, we are committed to continued sharing of data related to the study of misinformation through datacommons.org, within the bounds of privacy constraints. This effort is a partnership between the fact-checking community (International Fact-Checking Network), academic institutions (Carnegie Mellon University, Duke University), and industry (Bing, Google). Interested researchers can head directly to the Download Page.