
Regulatory Partners Release Guidelines for PCCPs

The FDA and two other national regulatory agencies have collaborated on a set of guiding principles for predetermined change control plans, or PCCPs, which are applicable to machine learning-enabled devices. The other agencies involved in this effort are Health Canada and the U.K. Medicines and Healthcare products Regulatory Agency (MHRA), a roster that lends the recommendations significant international weight.

Troy Tazbaz, director of the FDA Digital Health Center of Excellence, stated that products that rely on artificial intelligence and machine learning (AI/ML) can improve the performance of medical device systems, but that they also present unique regulatory considerations. The rapid iteration of AI and ML products has created an unanticipated level of demand on regulatory agencies for multiple reviews of this category of software, a situation that prompted an earlier FDA draft guidance on PCCPs for AI and ML.

One of the five guiding principles released by these three agencies is that any future changes to AI/ML software be limited to the intended use declared in the PCCP. The PCCP should also characterize the steps the developer will take to reverse any changes that would create a new intended use. Another guiding principle calls for transparency of the data used to develop and modify the AI/ML software, which should reflect the intended use population. The PCCP should also be transparent as to the testing for planned changes, as well as for the methods that will be used to monitor and respond to changes in device performance.

Also included in the guiding principles is an emphasis on risk management across the life cycle of the product, which should govern device performance for both individual changes and the effects of cumulative changes. The list of principles also calls for evidence to be generated throughout the product life cycle to ensure that the product continues to operate safely and effectively, and to demonstrate that risks remain adequately controlled. These evidence generation methods should be scientifically and clinically justified, and should be proportionate to the risks incurred in device usage.

The March 2023 FDA draft guidance on the use of PCCPs for AI and ML covers devices that update both manually and automatically, provided the PCCP describes the modifications that would be incorporated. The agency maintains a list of devices that have been cleared or approved either as stand-alone AI software or as devices that incorporate AI software components, which totaled more than 500 as of March 2023. That list grew by more than 60 entries between the release of the draft guidance and July 27, signaling intense interest in the development of these advanced software products.

The FDA, MHRA and Health Canada had previously collaborated on principles applicable to AI and ML. In October 2021, the three agencies released a set of 10 principles for good machine learning practices (GMLP), which, like the PCCP guidelines, are not compulsory. The International Medical Device Regulators Forum has assembled a working group to develop a series of good machine learning practices, although that working group does not appear to have produced a draft or final GMLP document.

For additional resources, contact the Marketing department.

Phone: 888-633-6272

Medmarc is a member of ProAssurance Group, a family of specialty liability insurance companies. The product material is for informational purposes only. In the event any of the information presented conflicts with the terms and conditions of any policy of insurance offered from ProAssurance, its subsidiaries, and its affiliates, the terms and conditions of the actual policy will apply.

Copyright © 2024 - Medmarc
