Since the release of 21 CFR Part 11 in 1997, the validation and verification of electronic records have been at the forefront of concern for the Information Technology (IT), Quality Assurance (QA) and Regulatory departments of the medical device and pharmaceutical industries. From simple record control through electronic signature approval routing and automated records processing, validating these functions can pose a substantial challenge to IT or QA staff unfamiliar with computer systems validation activities. Chief among these challenges is the validation of electronic spreadsheets and forms.
The Validation Process
The FDA defines validation as “establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes.” Spreadsheets and automated forms fall within this definition.
Requirements
The first step in validating a spreadsheet or form is to outline its process requirements.
Key process requirements include:
• Determining the required outcome of completing the record
• Specifying the process paths a record may take in its completion or approval
• Determining how rejections of incorrect or incomplete data will be handled
• Identifying any functions, calculations, measurements or test data collection activities within the form or spreadsheet record that require testing
• Choosing the interface used to process the data record
* Is the interface a validated system or off-the-shelf software application?
* Does the form or spreadsheet interact with other applications or database systems?
Document these key identified requirements in a User Requirements Specification (URS).
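To make a URS entry concrete, the following is a minimal sketch in Python of how requirements might be captured as structured, traceable records; the field names and example requirements are illustrative assumptions, not a prescribed format.

    # Minimal sketch: URS entries as structured records (hypothetical format).
    from dataclasses import dataclass

    @dataclass
    class Requirement:
        req_id: str       # unique identifier, used to trace requirements to test steps
        description: str  # what the form or spreadsheet must do
        category: str     # e.g., calculation, approval routing, rejection handling

    urs = [
        Requirement("URS-001", "Spreadsheet computes the mean of results in B2:B6", "calculation"),
        Requirement("URS-002", "Completed record routes to QA for approval", "approval routing"),
        Requirement("URS-003", "Rejected records return to the originator with comments", "rejection handling"),
    ]

    for r in urs:
        print(f"{r.req_id} [{r.category}]: {r.description}")

Structuring requirements this way makes the later traceability between the URS and test scripts straightforward to check.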
Risk Management
After identifying key requirements, the next step is to identify the critical risk areas of the form or spreadsheet process.
Key risks to consider include:
• Determining the effects of a data collection failure for:
* Company business/reputation
* Customer and/or patient
* Regulatory or industry rule compliance
* Product and/or process quality
* Other company systems/processes associated up- or downstream
Rate each identified risk for its impact on both the overall company process operation and the validation project effort. Assign each risk an owner and track it through mitigation, closure or other resolution, and conduct periodic status reviews throughout the project. Document the management of these risks either in the overall project plan (validation plan) or in a dedicated risk management process.
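The process above does not mandate a particular rating scheme; as one common, illustrative approach, the sketch below rates each risk by multiplying assumed 1-3 severity and likelihood scores.

    # Illustrative risk rating: severity x likelihood on assumed 1-3 scales.
    def risk_priority(severity: int, likelihood: int) -> str:
        """Map two 1-3 scores to a qualitative rating."""
        score = severity * likelihood
        if score >= 6:
            return "high"
        if score >= 3:
            return "medium"
        return "low"

    # Example: a calculation failure affecting patient data (severity 3)
    # that is plausible but infrequent (likelihood 2) rates as high.
    print(risk_priority(3, 2))  # -> "high"

High-rated risks would then drive the depth of testing chosen in the planning step.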
Planning
The validation plan identifies the base methodology used to prove that the form or spreadsheet is suitable for the requirements/process. The key to planning is identifying the following items:
• The process or method that will be used to verify that the data record and its interface meet the defined requirements
* The level of testing considered appropriate
• Deliverables
• Testing process
• Acceptable outcome
• Items tested and items excluded from the scope
* Justification for the exclusions (a traceability sketch follows this list)
• Equipment and personnel requirements
• The project and activities schedule
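One way to check the tested/excluded decision is a simple traceability pass over the plan. The sketch below (hypothetical names and data) flags in-scope requirements that no planned test covers, which are exactly the items whose exclusion must be justified.

    # Illustrative traceability check between the URS and planned test scripts.
    planned_tests = {
        "TS-01": ["URS-001"],            # test script -> requirements it verifies
        "TS-02": ["URS-002", "URS-003"],
    }
    in_scope = {"URS-001", "URS-002", "URS-003", "URS-004"}

    covered = {req for reqs in planned_tests.values() for req in reqs}
    for req in sorted(in_scope - covered):
        print(f"{req}: no planned test; justify the exclusion in the validation plan")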
Test Script and Implementation
When developing or conducting a testing method for a form or spreadsheet, the “best practice” methodology is to test using a suitable data record (a calculation test sketch follows this list). Key aspects of form or spreadsheet function and usage testing include:
• Blank template retrieval
* Controlled storage / version control
* Preventing unauthorized changes to template that could compromise the process
• Creating data records
* Creating and storing form or spreadsheet data records or metadata
* Testing each path used to create a record within the process
* Preventing use of unauthorized templates
* Ensuring only authorized personnel can view completed data records
• Data record approval
* Ensuring only authorized personnel can view completed data records
* Preventing unauthorized template changes
* Recording of metadata and record approvals/rejections/rejection comments
* * Testing of data rejection/correction process
• Data record revision
* Creating and storing new revisions of form or spreadsheet data records and metadata
* Preventing unauthorized template changes
* Ensuring only authorized personnel can view completed data records
* Approval and replacement of old revision by new revision
* * Ensuring old revision data are still accessible in case of rollback issues
• Data record reporting and distribution
* Verifying separate reports for data accuracy by comparing the record entry to the form report
* Ensuring only authorized personnel can view completed data records
• Data backup and recovery
* Proven data recovery and backup methods must be used to ensure data integrity and continued process operation in the case of hardware-, software- or disaster-related failure.
* * This testing is normally conducted separately, as part of the testing of a company’s overall network and server infrastructure
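As an example of testing a calculation inside a spreadsheet, the sketch below independently recomputes a result and compares it with the value the spreadsheet stored. It is a minimal illustration under stated assumptions: the file name, sheet name and cell locations are hypothetical, and the openpyxl library is one possible way to read workbook values from Python.

    # Minimal sketch: verify a spreadsheet calculation against an
    # independent recomputation (assumed file, sheet and cells).
    from openpyxl import load_workbook

    # data_only=True returns the values last calculated and saved by Excel,
    # rather than the formula strings themselves.
    wb = load_workbook("qc_results.xlsx", data_only=True)
    ws = wb["Results"]

    # Inputs assumed in B2:B6; the spreadsheet's computed mean assumed in B8.
    inputs = [ws.cell(row=r, column=2).value for r in range(2, 7)]
    recorded_mean = ws["B8"].value

    expected_mean = sum(inputs) / len(inputs)  # independent recomputation

    # Compare with a tolerance to avoid floating-point false failures.
    status = "PASS" if abs(expected_mean - recorded_mean) < 1e-9 else "FAIL"
    print(f"expected={expected_mean} recorded={recorded_mean} -> {status}")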
Test Script Design
Design each test script to capture the overall action, expected result, actual result and test status for each testing step. Test steps should relate directly to the key requirements defined in the requirements definition step. For a test to be considered a “PASS” during execution, the actual result must meet the expected result. Objective evidence for testing may be collected via handwritten entries, screenshots, pictures, video captures or any other method that allows the testing record to be auditable and maintain proof of execution. Pre- and post-execution reviews are recommended to allow approval of both the method and the executed results.
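To illustrate the step structure described above, the sketch below models a test step as a record carrying the action, expected result, actual result and status, with PASS assigned only when the actual result meets the expected result; the class and field names are hypothetical.

    # Minimal sketch of a test script step (hypothetical names).
    from dataclasses import dataclass

    @dataclass
    class TestStep:
        step_no: int
        action: str            # what the tester does
        expected: str          # predetermined expected result
        requirement: str       # URS requirement this step traces to
        actual: str = ""       # recorded during execution
        status: str = "NOT RUN"

        def execute(self, actual: str) -> None:
            """Record the actual result; PASS only if it meets the expected result."""
            self.actual = actual
            self.status = "PASS" if actual == self.expected else "FAIL"

    step = TestStep(1, "Enter five results in B2:B6 and inspect B8",
                    "B8 shows the mean of the entered results", "URS-001")
    step.execute("B8 shows the mean of the entered results")
    print(step.status)  # -> "PASS"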
Summary Reporting
When testing is completed and the form, spreadsheet and associated processes are verified, document the results in an overall summary report. The summary report lists the outcome of the testing efforts and the project team’s recommendations for either corrective action or approval of the form/spreadsheet process. The summary report, along with the validation plan, URS, risk management documentation and completed testing documentation (with objective evidence), as well as any other deliverable defined in the validation plan, is retained as evidence that can be presented at a later date to the FDA as proof of function and usability.
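As a small illustration of how executed results might roll up into the report’s recommendation (the script names and statuses below are hypothetical):

    # Illustrative roll-up of executed test scripts for the summary report.
    results = {"TS-01": "PASS", "TS-02": "PASS", "TS-03": "FAIL"}

    failures = [name for name, status in results.items() if status == "FAIL"]
    print(f"{len(results) - len(failures)} of {len(results)} scripts passed")
    print("Recommendation:",
          "corrective action required" if failures else "approve for use")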
Going Live and Continuous Monitoring
When the validation activities are completed, the final items for consideration are the startup and maintenance activities that can affect the overall continued operation. Key considerations include:
• User training
* Form and spreadsheet process training
* Associated equipment or skill set training
• Administration training
* Form and spreadsheet process training
* Data record storage/maintenance/retention
• Process documentation
* Form and spreadsheet change control
* Corrective action (process failure recovery)
* Form and spreadsheet process
* * Procedure/work instruction
* * Data record storage/maintenance/retention
* Training method/materials/records
• Electronic equipment maintenance
* Data backup and recovery
* Disaster recovery
* Systems maintenance and trouble call support
* Validated system change control
• Continuous monitoring
* Form and spreadsheet change control
* Form and spreadsheet process change control
* Periodic systems assessments or internal audits (see the integrity-check sketch after this list)
* Revalidation process
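As one concrete monitoring technique (an illustrative assumption, not a method mandated by the process above), a periodic check can confirm that a controlled template has not changed outside of change control by comparing its cryptographic hash to a baseline recorded at validation.

    # Minimal sketch: detect unauthorized template changes by comparing a
    # SHA-256 digest to the baseline recorded at validation (assumed paths).
    import hashlib
    from pathlib import Path

    BASELINE = "placeholder-digest-recorded-at-validation"

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 hex digest of a file's contents."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    current = sha256_of(Path("templates/qc_form_v3.xlsx"))
    if current != BASELINE:
        print("ALERT: template differs from validated baseline; route to change control")
    else:
        print("Template matches validated baseline")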
Conclusion
Spreadsheet applications are widely used in the life sciences for data capture, manipulation and report generation. The FDA considers these applications software, and they must be validated when used in a regulated environment. Following these steps will help companies avoid compliance gaps and consistently meet FDA expectations.
For more information, visit www.mastercontrol.com.