The universe of data: its integrity and necessity
The world around us provides us with an abundance of data at every moment of every day and we constantly analyze this data to make decisions. In this process, the attributability, accuracy, legibility, permanence, contemporaneousness, and originality of the data we receive are of utmost importance.
So, what does “data” mean? According to ISO/IEC 2382:2015, data is a reinterpretable representation of information in a formalized manner suitable for communication, interpretation, or processing. Data integrity, per the same standard, is the property of data whereby accuracy and consistency are preserved regardless of changes made.
Pharmaceutical companies rely heavily on data for their business operations. From pharmaceutical development and preclinical research to clinical trials, manufacturing, and the registration dossier, data sets play a crucial role in ensuring the quality, efficacy, and safety of medicines, as well as building trust in the brand. The Pharmaceutical Quality System (PQS) plays a vital role in ensuring data integrity.
A significant amount of data crucial to the production and certification of a batch is generated and assessed. It is therefore no surprise that data integrity has become a “hot topic” in the pharmaceutical industry.
Why is data integrity important?
Data integrity matters for several reasons. To illustrate its significance, consider a scenario. Suppose you are a Qualified Person responsible for certifying a batch. You have been provided with the batch manufacturing records for one hundred batches of a medicinal product, each with a batch size of 1,000,000 tablets. While reviewing the first dossier, you discover that all printouts from the balance, which are essential for quality control, are missing. The balance used in the quality control laboratory has no integrated computer software and therefore stores no electronic copies of the data. The same situation recurs in the subsequent dossiers. In such a case, can you be certain that the final data is correct and that the product’s quality truly meets the specified requirements? Is there a possibility of data manipulation? You would either have to reject the batch and dispose of it or repeat quality control testing in full, resulting in significant losses of time and money. Additionally, your obligations to the distributor and the risks to the company’s reputation would be extremely high.
While you might not be able to identify the absence of all primary data, such a situation may be detected by inspectors during a GMP inspection. Consequently, your company may receive a Warning Letter from the FDA or Non-Compliance Report from a European Regulatory body, addressing data integrity issues, as several companies have experienced.
The above example represents just one possible scenario in the lifecycle of a product but there are numerous others where data integrity is essential. This issue spans across the entire product lifecycle and data types.
A common problem faced by many pharmaceutical companies is that not everyone fully recognizes that data integrity requirements apply to every participant in the supply chain. Data is crucial throughout the product’s shelf life plus one year, from the production of the medicinal product to the storage of medicines in pharmacies. This data confirms the quality of the final medicinal product and must be readily available upon request.
Every activity (including transportation, outsourcing, and others) generates data, the integrity of which is vital to ensure the quality of the final medicinal product.
Although computerized systems have become integral to pharmaceutical manufacturers, paper-based data still plays a significant role, and both must meet regulatory requirements as outlined in the guidelines.
No matter what type of activity a pharmaceutical company performs – production, distribution, import, clinical trials, quality control, transportation, development of software solutions for pharmaceutical production, etc. – every action generates data. The integrity of this data is something the company’s PQS must ensure, monitor, and improve on an ongoing basis.
The requirements set by regulators for the Pharmaceutical Quality System today include the following main documents:
- FDA 21 CFR Part 11
- FDA Guidance for Industry: Data Integrity and Compliance with Drug CGMP – Questions and Answers
- EMA Questions and Answers: Good Manufacturing Practice (GMP) – Data Integrity
- MHRA GxP Data Integrity Guidance and Definitions
- WHO Guideline on Data integrity, Annex 4
- PIC/S Good practices for data management and integrity in regulated GMP/GDP environments
- GAMP Records and Data Integrity Guide – ISPE/GAMP, March 2017
- EU Good Manufacturing Practice, Volume 4, Annex 11
ALCOA+ has become the most widely used reference framework for data integrity. ALCOA is the acronym for Attributable, Legible, Contemporaneous, Original, and Accurate, and it encompasses the following quality attributes for data:
– A (attributable). Data should be attributable, thus being traceable to an individual and, where relevant, the measurement system. In paper records, this can be achieved through the use of initials, a full handwritten signature, or a controlled personal seal. In electronic records, unique user logons can be used to link the user to actions that create, modify, or delete data. Alternatively, unique electronic signatures, which can be biometric or non-biometric, can be employed. An audit trail should capture user identification (ID), date and time stamps, and the electronic signature should be securely and permanently linked to the signed record.
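As an illustration of the attributable principle, a tamper-evident audit trail can be sketched in a few lines: each entry records who did what and when, and a hash links it to the previous entry so that silent edits or deletions become detectable on re-verification. This is a minimal sketch, not a validated implementation; the field names and hashing scheme are illustrative assumptions.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_entry(trail, user_id, action, record_id):
    """Append an audit-trail entry hash-linked to the previous one."""
    prev_hash = trail[-1]["entry_hash"] if trail else "0" * 64
    entry = {
        "user_id": user_id,                                   # attributable: who
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when
        "action": action,                                     # create / modify / delete
        "record_id": record_id,                               # which record
        "prev_hash": prev_hash,                               # link to prior entry
    }
    # Hashing the entry content together with the previous hash makes
    # silent edits or deletions detectable when the chain is re-verified.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    trail.append(entry)
    return entry

def verify_chain(trail):
    """Return True only if no entry has been altered or removed."""
    prev = "0" * 64
    for e in trail:
        if e["prev_hash"] != prev:
            return False
        body = {k: v for k, v in e.items() if k != "entry_hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if recomputed != e["entry_hash"]:
            return False
        prev = e["entry_hash"]
    return True
```

A validated system would add access control and secure storage on top of such a mechanism; the point here is only that the signature is permanently linked to the signed record.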
– L (legible and permanent). Data and metadata should be readable throughout the data’s lifecycle. Electronic data can be made legible/readable through the original software application that created it.
– C (contemporaneous). Data should be generated and recorded in a timely manner, meaning it is documented at the time the activity is performed, not retrospectively.
– O (original record or “true copy”). The initial or source capture of data or information, along with all subsequent data necessary to fully reconstruct the GxP activity, should be available.
– A (accurate). For paper records, the process of capturing data should be clearly defined, and the data must be recorded accordingly. This includes specifying the expected record format (e.g., date) and precision (e.g., number of decimal places). The unambiguous identification of the data source should be clearly documented. In the case of data recorded using a computer system, the verification performed during the initial qualification and subsequently during changes and repair activities should ensure that the data is captured from the correct source and processed correctly (e.g., linearization, normalization, conversion, etc.).
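The format-and-precision expectations described above can be sketched as a small validation routine. The function name, the expected date format, and the four-decimal-place precision are hypothetical examples; in practice these expectations come from the approved record specification.

```python
from datetime import datetime
from decimal import Decimal

def validate_weight_record(date_str, weight_str,
                           date_format="%Y-%m-%d", decimal_places=4):
    """Check that a captured record matches the expected format and
    precision; returns a list of findings (empty means the record passes).
    The defaults are illustrative assumptions, not regulatory values."""
    errors = []
    try:
        datetime.strptime(date_str, date_format)
    except ValueError:
        errors.append(f"date {date_str!r} does not match format {date_format!r}")
    try:
        weight = Decimal(weight_str)
        # The exponent of a Decimal gives the recorded decimal places.
        if -weight.as_tuple().exponent != decimal_places:
            errors.append(
                f"weight {weight_str!r} must have exactly "
                f"{decimal_places} decimal places"
            )
    except ArithmeticError:
        errors.append(f"weight {weight_str!r} is not a number")
    return errors
```

Using `Decimal` rather than `float` preserves the number of recorded decimal places, which is exactly the precision attribute the check is meant to enforce.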
What does “+” mean? As regulatory expectations evolved, four more data attributes were added:
– Complete: The data should include relevant metadata, ensuring that all necessary information is captured and documented.
– Consistent: The date and time of an activity must be recorded in the correct chronological order, maintaining consistency and sequence.
– Enduring: The data should be persistent and retained during storage, ensuring its integrity and longevity.
– Available: The data should be easily accessible and available for viewing or verification upon request by authorized personnel.
It is important to note that the ALCOA+ principle applies to electronic data, paper records, and hybrid systems, encompassing various types of data management.
Are all ALCOA+ attributes sufficient to ensure data integrity?
Simply following the attributes described above is not enough. A present-day PQS requires that any changes made by staff to the data are documented and traceable, meaning that changes must go through the Change Management process. Evaluating every change initiative in this way, however, requires significant staff effort to maintain the PQS. This is where the risk management process comes into play.
A Data Integrity Risk Assessment (DIRA) should be conducted to identify and assess areas of risk. This assessment should cover systems and processes that generate data or where data is obtained, considering inherent risks. The Data Integrity Risk Assessment should encompass the entire data lifecycle and take into account the criticality of the data. It should address relevant computerized systems, personnel, staff training, outsourcing activities, and the overall quality assurance system. Data criticality can be determined by evaluating how the data influences decision-making.
The identified risks should be assessed and mitigated. The Data Integrity Risk Assessment should be documented and periodically reviewed to ensure its currency and the effectiveness of the identified control measures. A periodic risk review should be conducted throughout the document’s lifecycle and the associated data, with the frequency depending on the level of risk determined by the risk assessment process.
Where the risk assessment has identified areas requiring corrective and preventive actions, the prioritization of actions (including accepting an appropriate level of residual risk) and the prioritization of controls should be documented and communicated to management and staff. Staff training and periodic reminders of the company’s data integrity policy are crucial for compliance with requirements. If long-term preventive actions are identified, short-term risk mitigation measures should be implemented to ensure acceptable data management in the interim and to keep the issue visible until the long-term actions are complete.
Identified risk controls may include organizational, procedural, and technical measures such as procedures, processes, equipment, tools, and other systems to prevent and detect situations that could impact data integrity.
The PQS must ensure that systems (both computerized and paper-based) meet regulatory requirements to ensure data integrity. This brings us to the selection of a computerized systems/software vendor. The vendor qualification process should incorporate a risk assessment approach. Even previously installed and validated computerized systems should be periodically reassessed for compliance with current requirements. Appropriate preventive and detection controls should be identified and implemented based on the risk assessment.
The effectiveness of the implemented controls should be evaluated through various means, such as:
- Tracking and trending data
- Reviewing data, metadata, and audit logs
- Conducting routine audits and/or self-inspections, including assessments specifically focused on data integrity and computerized systems
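As a sketch of what an automated audit-log review might check, the routine below flags two common data-integrity signals: timestamps out of chronological order (possible backdating) and unusually long gaps between entries (possibly deleted records). The 24-hour gap threshold and the record layout are illustrative assumptions, not regulatory values.

```python
from datetime import datetime, timedelta

def review_audit_log(entries, max_gap_hours=24):
    """Scan an ordered audit log and return a list of findings.
    Each entry is assumed to carry an ISO-8601 'timestamp' field."""
    findings = []
    times = [datetime.fromisoformat(e["timestamp"]) for e in entries]
    for i in range(1, len(times)):
        if times[i] < times[i - 1]:
            # Entries should be chronological; a step backwards in time
            # can indicate backdated or re-inserted records.
            findings.append(f"entry {i}: timestamp earlier than previous entry")
        elif times[i] - times[i - 1] > timedelta(hours=max_gap_hours):
            # A long silence in a normally active log can indicate
            # deleted records or a disabled audit trail.
            findings.append(
                f"entry {i}: gap of {times[i] - times[i - 1]} since previous entry"
            )
    return findings
```

Findings from such a review feed back into the self-inspection and CAPA processes rather than replacing them.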
Any computerized system used in the GxP-relevant environment must undergo validation. When GxP systems are utilized for data acquisition, recording, transmission, storage, or processing, it is essential to identify the potential risks posed by the system and its users to data integrity.
The software of computerized systems used in conjunction with GxP instruments and equipment should be appropriately configured and validated. The validation process should encompass aspects such as the design, implementation, and maintenance of controls to ensure the integrity of manually and automatically generated data, the implementation of Good Documentation Practices, and the appropriate management of data integrity risks throughout the data lifecycle. Efforts should be made to reduce and eliminate the potential for unauthorized and adverse data manipulation throughout its lifecycle.
In cases where electronic instruments or systems without configurable software and electronic data retention are employed (such as certain pH meters, balances, and thermometers), controls should be established to prevent adverse data manipulation and repeated testing to achieve desired results.
While technical controls should be prioritized, additional procedural or administrative controls should be implemented to manage aspects of computerized system control in cases where technical controls are absent.
It is important to recognize that delaying the implementation of computerized systems does not necessarily reduce inconsistencies or provide greater protection against regulatory scrutiny regarding data integrity. Data integrity underlies every activity within the pharmaceutical system, including staff training, pharmaceutical development, preclinical and clinical trials, manufacturing, quality control, and logistics operations such as storage and transportation. As all these activities generate data, it is imperative to ensure the integrity of this data to guarantee the quality, safety, and efficacy of the final product received by the end user. Computerized systems play a vital role in eliminating potential blind spots that could serve as environments for data manipulation.
What additional steps can be taken to determine whether your PQS complies with the current data integrity guidelines?
Nowadays, it is crucial for every pharmaceutical company to assess its compliance with the current data integrity requirements. As a result, self-inspections have become a key activity for the Quality Assurance department. It is always beneficial to engage an external organization with experienced and qualified personnel to conduct a comprehensive inspection to identify gaps and, ultimately, to prevent critical issues related to data integrity.
PharmaLex is uniquely positioned to support your organization in addressing data integrity concerns, both in your existing well-established processes and in adapting to upcoming changes. If you would like our team to assess your compliance with the current data integrity requirements and work with you to prepare for the future, please contact us here.