Data integrity plays an essential role in the final delivery of products in the bio/pharmaceutical and medtech industries, as the availability of complete, accurate, and reliable data is of paramount importance to ensure drug safety and quality. Data integrity refers to both manual (paper) and electronic data, and the availability of ever more sophisticated digital systems is making the related issues increasingly complex. Breaches of data integrity may result in warning letters from the regulatory authorities, in import alerts, or in penalties for the organisation.
Regulatory references
In the European Union, data integrity is regulated by Annex 11 to the good manufacturing practices (GMP), which applies to all computerised systems used for GMP activities. The IT infrastructure has to be qualified and all applications validated (see also the article by Jain Sanjay Kumar in The Pharma Innovation Journal).
In the US, the relevant regulation is contained in 21 CFR Part 11, which applies to any electronic record submitted to the Food and Drug Administration, whether or not specifically identified in Agency regulations. Maintaining data integrity is difficult: the FDA's April 2016 guidance Data Integrity and Compliance With CGMP: Guidance for Industry clearly reflects the increasing number of current GMP (CGMP) violations involving data integrity observed during inspections. Despite this difficulty, all decisions in the GMP manufacturing environment are based on the reliability of the generated data. Regulatory authorities worldwide are thus paying increasing attention to assessing data integrity when running inspections.
The main requirements listed in the FDA guidance include provisions to ensure that backup data are exact, complete, and secure from alteration, inadvertent erasure, or loss (§211.68 of the CGMP regulations); that data are stored properly to prevent deterioration or loss (§212.110(b)); that certain activities are documented at the time of performance and that laboratory controls are scientifically sound (§§211.100 and 211.160); and that records are retained as original records, true copies, or other accurate reproductions of the original records (§211.180). Complete information, complete data derived from all tests, complete records of all data, and complete records of all tests performed are also required (§§211.188, 211.194, and 212.60(g)).
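As a purely illustrative example of the "exact and complete" backup requirement, the minimal sketch below compares SHA-256 checksums of original files against their backup copies. The directory names (gmp_records, backup/gmp_records) are hypothetical assumptions, and this is only one possible way to detect alteration or loss, not a method prescribed by the FDA guidance.

```python
# Minimal sketch: checking that a backup copy is exact and complete by
# comparing SHA-256 checksums of original and backup files.
# Directory names are illustrative assumptions, not part of any regulation.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(original_dir: Path, backup_dir: Path) -> list[str]:
    """Return a list of files that are missing or altered in the backup."""
    problems = []
    for original in original_dir.rglob("*"):
        if not original.is_file():
            continue
        backup = backup_dir / original.relative_to(original_dir)
        if not backup.exists():
            problems.append(f"missing: {backup}")
        elif sha256_of(original) != sha256_of(backup):
            problems.append(f"altered: {backup}")
    return problems

if __name__ == "__main__":
    src, dst = Path("gmp_records"), Path("backup/gmp_records")  # hypothetical paths
    if src.is_dir() and dst.is_dir():
        issues = verify_backup(src, dst)
        print("backup verified" if not issues else "\n".join(issues))
```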
Focused stakeholder consultation on the revised draft PIC/S guidance
The Pharmaceutical Inspection Co-operation Scheme (PIC/S) launched at the end of November a stakeholder consultation on its revised “Draft guidance on good practices for data management and integrity in regulated GMP/GDP environments” (PI 041-1 (Draft 3)). The consultation is open until 28 February 2019; the template and all details on how to submit comments and suggestions are available on the dedicated page of the PIC/S website.
The draft guidance has been prepared by the PIC/S Working Group on data integrity, co-led by the Australian regulatory agency TGA and its UK counterpart, MHRA. The document describes the position an inspector might take during the inspection of GDP/GMP facilities, with the final objective of facilitating a harmonised approach to inspection, including reporting, with regard to data management and integrity. The current consultation is specifically intended to address questions relating to the proportionality, clarity and implementation of the guidance requirements.
In parallel with the consultation, the new draft will be applied by PIC/S participating authorities on a trial basis for a new implementation period, as was already done for the previous draft, published in 2016 (PI 041-1 (Draft 2)). The results of that first six-month trial period formed the basis for the Working Group's updating and expansion of the guidance.
A three-level system to prevent issues
According to Jain Sanjay Kumar, a three-level system may be established by pharmaceutical companies in order to prevent data integrity issues. A strong internal quality culture is the first, critical element needed to improve knowledge and best practices among operators. Leadership, engagement and empowerment of staff at all levels are the core principles of the company's quality culture.
Control by design is the second feature, central to the European approach described by Annex 11. Well-designed controls prevent the possibility of manipulating data or repeating tests to achieve a desired outcome. According to Annex 11, a validated computerised system for record management should provide at least the same degree of confidence as a paper-based system.
Risk management, together with the audit trail, should be used to routinely assess possible data integrity issues throughout the entire lifecycle of the computerised system, taking into account patient safety, data integrity and product quality. Particular attention should be paid to the validation of software acquired from external providers. Data migration, data storage and security, incident management, business continuity, electronic signatures and the management of printouts during weighing procedures in the lab are other items to be considered during risk assessment for data integrity. All these processes should be internally regulated by specific standard operating procedures (SOPs), in order to allow for a control-by-procedure approach to guarantee the integrity of data. This should be coupled with control-by-monitoring activities based on internal audits and the independent review of records.
Some tools and best practices to preserve data integrity
The preservation of data integrity has strong synergies and interconnections with data security, but the two concepts are clearly distinguished with regard to their final objective (see, for example, the article on the Varonis blog).
Many procedures needed to ensure data protection also find useful application for data integrity, e.g. access control or data backup. But the first, fundamental step in demonstrating the truthfulness of data is the validation of both inputs and data. Input validation means making sure the entered data are correct, whether they originate from a known or an unknown source; data validation means identifying the specifications and key attributes that demonstrate the data process has not been corrupted. A minimal sketch of both steps is given below.
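The sketch below illustrates the two validation steps under stated assumptions: the record fields, specification limits and expected checksum are hypothetical examples, not values from any regulation or from the articles cited above.

```python
# Minimal sketch of the two validation steps described above.
# Record fields, limits, and the expected hash are hypothetical examples.
import hashlib

def validate_input(record: dict) -> list[str]:
    """Input validation: check an entered record before accepting it."""
    errors = []
    if not record.get("batch_id", "").startswith("B-"):
        errors.append("batch_id must start with 'B-'")
    try:
        assay = float(record["assay_percent"])
        if not 90.0 <= assay <= 110.0:
            errors.append("assay_percent outside the 90-110% specification")
    except (KeyError, ValueError):
        errors.append("assay_percent missing or not a number")
    return errors

def validate_data(raw_bytes: bytes, expected_sha256: str) -> bool:
    """Data validation: verify a key attribute (here a SHA-256 checksum)
    to demonstrate the data was not corrupted during processing."""
    return hashlib.sha256(raw_bytes).hexdigest() == expected_sha256

record = {"batch_id": "B-2019-001", "assay_percent": "99.4"}
print(validate_input(record))  # [] -> record accepted

payload = b"assay_percent=99.4"
print(validate_data(payload, hashlib.sha256(payload).hexdigest()))  # True
```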
After validation, controls should be performed to remove duplicates and stray data that might be present in the IT system, possibly with the assistance of readily available applications to identify and remove undesired files from the hardware. Backup is critical to ensure nothing is lost in the case of a system crash or a ransomware attack, and it should be run as often as possible, with the data encrypted. Needless to say, the control of data access is another fundamental security measure to prevent any issue with data integrity; it may be implemented, for example, using a least-privilege model. According to Cindy Ng's article, physical access to sensitive servers should also be restricted. An automatically generated audit trail tracking all events related to data management (create, read, modify, delete) or breaches in the system is the final measure needed to identify the root cause of a problem. Audit trails also make it possible to know who accessed the data, and when; a simple sketch of such a trail follows.
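As an illustration of the audit trail idea, the minimal sketch below appends each data management event to a log together with who performed it and when; chaining each entry to the hash of the previous one makes later tampering detectable. The file name, users and event fields are hypothetical assumptions, not part of any cited guidance or product.

```python
# Minimal sketch of an automatically generated audit trail: each
# create/read/modify/delete event is appended to a log with user and
# timestamp. Each entry stores the hash of the previous entry, so any
# later alteration of the log becomes detectable.
import hashlib
import json
from datetime import datetime, timezone

LOG_FILE = "audit_trail.jsonl"  # hypothetical file name

def last_entry_hash() -> str:
    """Hash of the most recent log entry, or a fixed seed for an empty log."""
    try:
        with open(LOG_FILE, encoding="utf-8") as f:
            lines = f.read().splitlines()
        return hashlib.sha256(lines[-1].encode()).hexdigest() if lines else "0" * 64
    except FileNotFoundError:
        return "0" * 64

def log_event(user: str, action: str, record_id: str) -> None:
    """Append one create/read/modify/delete event to the audit trail."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,  # "create", "read", "modify" or "delete"
        "record_id": record_id,
        "prev_hash": last_entry_hash(),
    }
    with open(LOG_FILE, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

log_event("analyst01", "create", "batch-B-2019-001")
log_event("qa_lead", "read", "batch-B-2019-001")
```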
How to approach an audit
Possible strategies that might be used by inspectors during data integrity audits have been discussed by Joy McElroy in a paper published in Pharmaceutical Online. The audit should cover the entire industrial process and has to be carefully planned in order to identify problematic and high-risk areas and all data or metadata to be looked for, including deleted and reprocessed data, “data that is being misused, or data that isn’t receiving a final review during batch record disposition”. The planning should also include the identification of possible compliance issues or elements that might give rise to data breaches. Out-of-specification (OOS) results and highly restrictive stability systems are examples of potential inadequacies to focus on as high-risk starting points for the audit.
A second phase of activity should see the selection of an extensive dataset over a predetermined time frame, to be traced for all data generated along the entire production process. Professional behaviour and a comfortable environment for the audited party are important features to put in place while conducting the audit, according to Joy McElroy. Employees should communicate all relevant information, as managers might not be completely aware of the specific processes. Data can be collected over a wide and inclusive range, or attention can be paid to the processes most likely to affect the quality of products or results, always keeping a flexible approach to accommodate the outcomes of the audit. Potential trends can be identified by the review of summary reports, coupled with the review of raw data along the entire chain of GMP activities.
In the case of inconsistencies, Joy McElroy stresses the importance of gathering as much information as possible to identify the root cause, avoiding any form of judgement. Every objectionable condition observed during the audit should be adequately documented at the end of the exercise, including “any existing conditions that show potential opportunities for data integrity breaches due to inadequate controls or inappropriate motivation”. Particular attention should be paid to explaining any orphan data found and their possible origin, including the possible need for reporting through the alert systems and recall assessment.
At the completion of the audit, corrective and preventive action plans should also be developed, including interim controls and additional periodic reviews of datasets where needed. A data integrity improvement map may also be helpful to illustrate the progress of planned actions.