Chapter 4.2 Initiated: Framework for quality assurance

Background

Section 6.2 of the final report for the coordination project states the following recommendation:

  • that the work to develop a framework for quality assurance be continued. The framework is being developed on an ongoing basis and in dialogue with the sector, partly through the user panel, which both provides input on the deliverables and helps to prioritise themes to be addressed in subsequent steps.

Status

The work to draw up a report on the quality assurance of artificial intelligence in health and care services is continuing.

A user panel from the sector provides input on the report as a whole, as well as on specific chapters, during the process.

The report is aimed at those who will be using AI solutions in the health and care services in Norway. It points to challenges that must be given particular consideration in any procurement process for an AI system. The document is a supplement to existing guides.

The overall structure, inspired by A Buyer’s Guide to AI in health and care from the NHS in England, is aimed at decision-makers, buyers and health professionals [38].

The content is also partly based on relevant and recognised national and international sources, and references can be found throughout the text and on a separate page.

This version of the report is limited to fully developed products that are procured, and thus does not include a framework for the in-house development of AI solutions. Systems with continuous learning fall outside the scope of this report.

Plan going forward

  • Finalise the report on the quality assurance of artificial intelligence in the health and care services 
    Expected delivery: third quarter 2024
  • Gather experience, and maintain and further develop the guidance
  • Prioritise the next deliverables

Newly identified needs are to:

  • clarify when research, i.e. additional clinical trials, is needed before an AI product can be implemented in the health service
  • clarify what a CE mark signifies and what it does not
  • provide support to smaller stakeholders such as GPs, contract specialists and municipal health services
  • clarify how enterprises can assess and validate new versions of products that are already in use, including requirements for the data sets used in the validation
  • clarify how AI products that learn and/or are fine-tuned continuously can be validated
  • develop guidelines on how AI systems should be validated; validation should follow a pre-specified protocol and yield an estimate of the method’s accuracy (see the sketch after this list)
  • establish a common methodology (or standards) for validating various types of AI systems, including those involving large language models
  • establish a nationally organised resource system to support the validation of AI solutions
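
To illustrate the kind of validation described above, the following is a minimal sketch, assuming a simple classification task evaluated once against a locked, held-out data set defined in a pre-specified protocol. The function, the use of a Wilson score interval and the example numbers are illustrative assumptions and are not taken from the report.

    # Minimal sketch: point accuracy plus a 95% Wilson score interval,
    # computed on a locked evaluation set as a pre-specified protocol might require.
    # The counts below are hypothetical.
    import math

    def accuracy_with_ci(n_correct: int, n_total: int, z: float = 1.96):
        """Return point accuracy and the bounds of a 95% Wilson score interval."""
        if n_total == 0:
            raise ValueError("evaluation set is empty")
        p = n_correct / n_total
        denom = 1 + z ** 2 / n_total
        centre = (p + z ** 2 / (2 * n_total)) / denom
        half = z * math.sqrt(p * (1 - p) / n_total + z ** 2 / (4 * n_total ** 2)) / denom
        return p, max(0.0, centre - half), min(1.0, centre + half)

    # Hypothetical result: 437 correct predictions on a held-out set of 500 cases.
    acc, low, high = accuracy_with_ci(437, 500)
    print(f"Accuracy {acc:.3f} (95% CI {low:.3f} to {high:.3f})")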

Who is responsible?

The Norwegian Directorate of Health

Who collaborates?

The Norwegian Board of Health Supervision, South-Eastern Regional Health Authority and a broadly composed user panel. The Medical Products Agency quality-assures the work within its remit.

Last update: 18 February 2025