The United Kingdom’s Medicines and Healthcare products Regulatory Agency (MHRA) has proposed an overhaul of its regulatory framework for medical devices that would span all medical technologies. The proposal is just one of several changes that may make their way into active U.K. medical device regulation: a consultation is also underway on a national strategy for artificial intelligence (AI), which carries potential privacy implications for the medical device industry.
The proposal to revise the U.K. regulatory system, which was open for consultation through Nov. 25, would update the existing Medical Devices Regulations 2002. Its scope covers the conduct of clinical investigations, premarket assessment of devices, and postmarket surveillance. The timing of the new framework is critical: the changes must be in place by July 2023, when U.K. acceptance of devices bearing a CE mark from the European Union (EU) ends.
MHRA said it wants to promote innovation without elevating the risks associated with new device technologies, in part by creating new access pathways to support innovation. One of the more significant proposals offsetting that emphasis on innovation is a more specific set of risk classification rules. MHRA would retain its four-class risk system (class I, IIa, IIb and III), but the rules governing which class a device falls into would be revised; surgical meshes, for example, would become class III products.
Device equivalence regulation may be amended
At present, device makers can use clinical data developed for similar, or “equivalent,” devices to claim equivalence, eliminating the need to conduct a full suite of clinical studies. That could change under the new framework, which would require that the new device be entirely equivalent, not just equivalent in part. To claim complete equivalence, a manufacturer would need a contract with the maker of the predicate device granting ongoing access to that device’s full technical documentation.
MHRA indicated that it may revise the risk classification rules for in vitro diagnostics (IVDs) to align with the approach used in the EU or by the International Medical Device Regulators Forum (IMDRF). Companion diagnostics are currently regulated in the lowest risk class, but that, too, could change: MHRA said it may impose a risk classification proportional to the hazards of an errant companion diagnostic test result.
The section on software as a medical device (SaMD) proposes a new definition of the term, aligned with the one used in the European Union, which defines software as “a set of instructions that processes input data and creates output data.” On the AI side, the U.K. national strategy is intended to guide investment and planning for the long-term needs of the AI ecosystem and to ensure that governance of AI is appropriate. The plan cuts across all industries and so is not specific to medical technologies, but it takes up questions such as ethics and transparency, two issues that do affect developers of medical software.
The U.K. national strategy for AI is intended as a 10-year plan, and its section on digital regulation includes several points regarding the accumulation, processing and portability of personal data. MHRA pointed to the EU’s General Data Protection Regulation (GDPR) as an example of how the debate over privacy and data security may unfold in the coming years. The GDPR’s privacy provisions may apply to device makers located outside the EU when they conduct trials in EU nations, a feature that, if replicated in other national privacy legislation, could create an unnavigable international patchwork of privacy regulation.
Device makers need to be increasingly vigilant about privacy considerations whenever they do business across national boundaries. The GDPR has been a source of considerable angst for any entity handling personal data, although the European Commission’s new set of guidelines for standard contractual clauses for data transfers has cleared up a large point of uncertainty. The picture in the U.S. is less settled: several states have passed their own legislation, suggesting that a patchwork of privacy law is emerging.
California, in particular, has passed two data privacy laws, including the California Consumer Privacy Act. That law had been in force only 10 months when the California Privacy Rights Act was passed, amplifying an already demanding privacy rights framework that digital health developers must navigate to stay out of legal trouble in the Golden State.