Inspect Launches Open Platform for AI Safety Testing

The UK AI Safety Institute has launched Inspect, a new open platform for AI safety testing, available to anyone worldwide.

Inspect is a software library that allows a wide range of groups, from startups to government institutions, to evaluate AI models – from their core knowledge and reasoning to their autonomous capabilities. The platform is released under an open-source license, so anyone can use it.

Inspect includes three main components:

  • Datasets: provide the samples used in evaluation tests;
  • Solvers: carry out the tests;
  • Scorers: evaluate the solvers' work and aggregate scores into metrics.

The platform can also be extended with third-party packages written in Python, allowing its testing methods to be adapted and built upon.
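The three-component pipeline described above can be illustrated with a toy sketch. Note that this is only an analogy in plain Python; the function names and structure here are invented for illustration and are not Inspect's actual API.

```python
# Toy analogy of the datasets -> solvers -> scorers pipeline the
# article describes. All names below are invented for illustration;
# this is NOT Inspect's real API.

def dataset():
    """Datasets: provide labelled samples for the evaluation tests."""
    return [
        {"input": "What is 2 + 2?", "target": "4"},
        {"input": "Capital of France?", "target": "Paris"},
    ]

def solver(sample):
    """Solvers: carry out the test. Here a canned lookup stands in
    for a call to the model under evaluation."""
    canned = {"What is 2 + 2?": "4", "Capital of France?": "Paris"}
    return canned[sample["input"]]

def scorer(samples, outputs):
    """Scorers: evaluate the solver's outputs and aggregate the
    scores into a single metric (here, accuracy)."""
    correct = sum(out == s["target"] for s, out in zip(samples, outputs))
    return correct / len(samples)

samples = dataset()
outputs = [solver(s) for s in samples]
accuracy = scorer(samples, outputs)
print(accuracy)  # 1.0
```

Because each stage is a separate, swappable piece, a third-party package could replace any one of them – for example, supplying a new scorer while reusing existing datasets and solvers.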

The AI Safety Institute says it is inspired by leading open-source model projects such as GPT-NeoX, OLMo, and Pythia, which publicly release training data, training code, OSI-licensed evaluations, model weights, and partially trained checkpoints. According to the institute, Inspect is an attempt to contribute to that list.

In parallel with the launch of Inspect, the UK's Incubator for AI (i.AI) and the prime minister's office will join forces with leading AI developers to build new open-source safety tools. Such tools are easier to integrate into existing models, which should help improve both the understanding of those models and their safety.

Sources: reports, release notes, official announcements.