Continuous manufacturing (CM) is gaining popularity in the pharmaceutical industry because it offers an improved approach to small molecule drug product development compared with traditional batch manufacturing. CM allows production volumes to flex with changing customer and patient demands. There is no need for scale-up in the traditional sense: a process can simply be run longer to increase output, rather than continually increasing batch size along with the time and raw material typically required for development.
Transitioning from traditional batch to continuous manufacturing provides greater throughput, but it requires near real-time data analysis to proactively address potential problems.
CM in pharma offers an opportunity for fully integrated equipment with in-line or at-line process analytical technology (PAT), process monitoring and data analytics. Incorporating these techniques with real-time release testing can lead to more efficient pharmaceutical manufacturing, driven by the ability to provide product quality assurance based on process data.
The footprint required for production is reduced because CM typically requires less equipment, for example, by eliminating the need for holding tanks often required in batch manufacturing. CM also typically requires more points of real-time measurement because the process must be continuously monitored and frequently adjusted to maintain quality and throughput. With more measurement comes more data, creating a need for advanced analytics.
In a user presentation at the recent OSIsoft PI World Online [1], the authors shared how Merck (NYSE:MRK) used the OSIsoft Enterprise Infrastructure and Seeq advanced analytics software to support CM process development. This article expands on the material presented at the event.
Maximizing value from data
At Merck, the continuous manufacturing process starts at the downflow booth, where raw materials are screened and charged into intermediate bulk containers (IBCs). These IBCs are wheeled into an adjacent room where they are staged above the continuous direct compression system. The loss-in-weight (LIW) feeders, blenders and tablet press are all housed in this small footprint, where raw materials are dispensed, mixed and compressed into tablets. Core tablets can either be conveyed to an automated tablet tester or to one of two parallel film coaters. After film coating, the finished product is discharged into drums via a distribution arm.
Approximately 1,600 sensors in the CM system generate data for collection and storage in PI every second, and advanced analytics are required to maximize the value derived from that data by enhancing process understanding. With CM, there is an opportunity to identify potential trends or shifts in the process, and then respond or segregate material so that it does not impact the rest of the lot.
Subject matter experts (SMEs) need to access, filter, contextualize and visualize the CM data in an efficient manner to characterize the key unit operations of the process. By understanding normal operations and performance across a range of operating conditions, potential shifts or trends can be proactively identified, well before any process warnings or alarms are triggered. To facilitate this process, teams need to easily access, analyze and visualize the data to encourage collaboration and improve knowledge management.
The following use cases demonstrate advanced analytics in action to create insights and improve operations.
Loss-in-weight feeders use case
Merck’s LIW feeders are gravimetric metering devices mounted on a weighing platform. These feeders accurately dose pharmaceutical raw materials at a predetermined feed rate through a screw or set of twin screws. The LIW feeder controller monitors the actual feed rate measured by the load cell readings over time, compares that process variable to the setpoint, and changes screw speed to maintain the target feed rate. To enable CM, it is necessary to periodically refill the hopper and continue feeding raw materials to the process. During these transient times when material is being added to the hopper, a LIW feeder must temporarily operate in volumetric mode via constant screw speed operation.
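The mode switch described above can be sketched in a few lines. This is an illustrative simplification, not the vendor's actual control algorithm: the function name, proportional gain and units are assumptions for the example.

```python
def next_screw_speed(mode, screw_speed_rpm, measured_rate, setpoint_rate, gain=0.5):
    """One step of a simplified LIW feeder control loop.

    During a hopper refill the load cell readings cannot be trusted, so the
    feeder holds the last screw speed (volumetric mode). Otherwise it nudges
    screw speed proportionally to the feed-rate error (gravimetric mode).
    Rates are in kg/h, speed in rpm; the gain is purely illustrative.
    """
    if mode == "volumetric":
        return screw_speed_rpm  # constant screw speed while refilling
    error = setpoint_rate - measured_rate
    return screw_speed_rpm + gain * error
```

A real feeder controller would typically use full PID action with tuning specific to the material, but the gravimetric/volumetric distinction is the essential behavior.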
LIW feeding is optimized around key parameters, such as the minimum refill level (the point at which a refill is triggered) and the refill amount, to ensure a consistent feed rate to the process. Figure 1 shows the stable operating region, highlighted in the green box.
This stable operating region is where the amount of material per revolution of the screw, sometimes referred to as feed factor, is consistent. It is also where the net weight in the feeder decreases as the raw material is fed to the process. These parameters are important for investigation to optimize and monitor feeder performance.
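Because the feed factor is mass dispensed per screw revolution, it can be derived from two signals the feeder already reports. A minimal sketch, with hypothetical units (kg/h and rpm):

```python
def feed_factor(feed_rate_kg_h, screw_speed_rpm):
    """Feed factor: mass dispensed per screw revolution (kg/rev).

    Derived from the measured feed rate and the screw speed; in the stable
    operating region this value should stay roughly constant as the net
    weight in the feeder decreases.
    """
    revs_per_hour = screw_speed_rpm * 60.0  # rpm -> revolutions per hour
    return feed_rate_kg_h / revs_per_hour
```

Trending this quantity against net weight is what produces the scatterplots described later in the article.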
Merck used PI ProcessBook dashboards to monitor the process in real-time. The company used Seeq to identify when the feeders were in gravimetric or volumetric mode, evaluate the consistency of feeder refills and develop tools to support troubleshooting. This characterization of the feeder unit operation is the foundation for creating alerts to identify potential deviations before process limits are exceeded.
System alarms require complementary data to give process context and help with early detection of potential process disruptions. Compiling the analyses for all of the feeders into a single dashboard enables support engineers — not all of whom are experts in feeder unit operation — to understand process performance.
Figure 2 shows the feeder dashboard with results based on input data from key development runs. Merck’s SMEs use Seeq to leverage historical data and quickly define the different modes of operation of each feeder, and to then create summary statistics describing feeder performance during development. In addition to these process parameters, scatterplots were created and are used to compare the feed factor to net weight of the material in the feeder.
The left side of this dashboard depicts scorecard metrics that summarize key statistics over each development run, including the count of refill events, average refill duration, time spent in volumetric and gravimetric modes, and the count of process alarms during the run.
Scatterplots, shown on the right side of the dashboard, compare feed factor to the net weight of material in the feeder, with earlier runs at one process throughput shown in gray and more recent runs at a different throughput shown in orange. Additional metrics highlight low refill events, each of which may require further investigation.
Collecting all these calculations into a single monitoring dashboard allows for quick review of each run, with results from all six feeders brought together for final viewing and analysis. The dashboard also has functionality to bring in new data, and to automatically update the calculations and views to include the latest information.
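The scorecard-style metrics behind such a dashboard reduce to simple aggregations over a mode trace. The sketch below is an illustrative stand-in for the Seeq calculations, assuming a per-second trace where "G" marks gravimetric feeding and "V" marks volumetric (refill) operation:

```python
def feeder_run_summary(modes, period_s=1.0):
    """Summarize one run from a per-second feeder mode trace.

    modes: sequence of "G" (gravimetric) / "V" (volumetric refill) samples.
    Returns refill count, average refill duration, and time in each mode.
    """
    refill_durations = []
    run_len = 0.0
    prev = "G"
    for mode in modes:
        if mode == "V":
            run_len += period_s          # accumulate the current refill
        elif prev == "V":
            refill_durations.append(run_len)  # refill just ended
            run_len = 0.0
        prev = mode
    if run_len:                          # refill still open at end of run
        refill_durations.append(run_len)
    vol_time = sum(refill_durations)
    return {
        "refill_count": len(refill_durations),
        "avg_refill_duration_s": vol_time / len(refill_durations)
        if refill_durations else 0.0,
        "time_volumetric_s": vol_time,
        "time_gravimetric_s": period_s * len(modes) - vol_time,
    }
```

Running the same summary per feeder and stacking the results side by side is essentially what the single monitoring dashboard provides, with the added benefit of automatic updates as new data arrives.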
Film coater analysis use case
The second use case concerns film coating, applied to a core tablet for a variety of reasons, such as taste masking, product differentiation, or modification of ingredient release. The film coater used by Merck is the ConsiGma Continuous Tablet Coater manufactured by GEA, and it operates in a few distinct phases.
First, core tablets are charged into a drum that rotates at a high speed to form a ring of tablets around the perimeter. Air knives are then turned on to create a cascade of tablets so that when the spray nozzle is turned on the tablets are presented to the spray zone while in free-fall to enable rapid drying. Once initiated, the spray phase continues until the target suspension has been applied (Figure 3).
Compared to traditional film coating, each tablet's passes through the spray zone are short, to meet CM throughput requirements, but frequent, to impart a uniform film coat. While the operating principle is somewhat unique, the data collected is largely the same as in traditional batch film coating.
This investigation focused on a few key process parameters: inlet airflow, inlet air temperature and suspension spray rate. Once again, this use case leveraged the PI ProcessBook dashboard to visualize the process and then further characterized the behavior with Seeq by narrowing the investigation's focus to when the coater was spraying the tablets, omitting other modes of operation when spray was not active. By evaluating multiple subparts, in this case film coating pans, to understand normal behavior, fluctuations or shifts could be identified before reaching the point of a warning or alarm.
Figure 4 shows an analysis designating the phases where spray is active for the development runs specified for this use case. By defining these phases of interest, the process parameter data can be cleansed to only include relevant data when aggregating run statistics.
In the top image (A), the times when the coater is spraying are represented by the green bars at the top of the analysis; these first and second coating phases are used for data cleansing. The blue signal represents the spray rate within the coating phases, and the pink signal shows the original spray rate data for the full run. Image (B) on the bottom shows the average spray rate over each run, calculated using only data from the coating phases of operation.
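The phase-based cleansing described here amounts to masking the signal to spray-active periods before aggregating. A minimal sketch, assuming the spray-active condition has already been derived as a boolean series aligned with the spray rate samples:

```python
def cleansed_average(spray_rate, spray_active):
    """Average spray rate over spray-active samples only.

    spray_rate: sequence of spray rate samples for a run.
    spray_active: parallel sequence of booleans marking coating phases.
    Samples outside the coating phases (idle, charging, discharge) are
    excluded so they cannot drag down the run average.
    """
    active = [r for r, on in zip(spray_rate, spray_active) if on]
    return sum(active) / len(active) if active else float("nan")
```

The same masking applies equally to inlet airflow and inlet air temperature; only the signal passed in changes.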
Operating metrics of these process parameters were calculated from data collected during the periods of interest while the coaters were actively spraying. These analyses were brought together to provide a full picture of coater performance in one report, with this information shared across the team and updated to compare data from the latest runs.
Where the feeder dashboards demonstrated real-time process monitoring by bringing the latest data from new lots into the dashboards, the coater analysis showed post-execution summaries and the potential for trending across runs. Both of these use cases were built on historical data to maximize learnings from test runs, and to enable continued monitoring for process shifts. This proactive process monitoring supported early detection and correction of potential changes in the process.
Modern analytics tools have made it easier to transform data and share it across partner groups, laying the groundwork for application to future runs, continued process verification and proactive process analysis. Combining efficient analysis of large data sets with knowledge capture and collaboration tools enabled harmonization of analyses and a deeper process understanding for continuous improvement.
These efforts can now also support alignment across different functions, providing insights into reliability and process optimization for management, engineers, and operators. Some ideas for future work at Merck include expanding the use of Seeq to support:
- Analyzing other products and processes.
- Bringing in new data sources and datasets such as those from vibration sensors, environmental monitoring, and electronic batch records.
- Identifying statistical process control limits based on historical data.
- Using condition-based maintenance to inform preventative plans.
- Using PI AF and Event Frame Notifications to communicate back to the operator.
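For the statistical process control idea in the list above, a common starting point is Shewhart-style individual limits estimated from historical run data. This sketch is one conventional approach, not Merck's specific method:

```python
import statistics

def control_limits(historical_values, k=3.0):
    """Estimate lower/upper control limits as mean +/- k * sigma.

    historical_values: parameter values from runs known to be in control.
    k: number of standard deviations (3 is the classic Shewhart choice).
    Returns (lower_limit, upper_limit).
    """
    mean = statistics.fmean(historical_values)
    sigma = statistics.stdev(historical_values)  # sample standard deviation
    return mean - k * sigma, mean + k * sigma
```

In practice, limits for individuals charts are often based on a moving-range estimate of sigma rather than the raw sample standard deviation, but the principle of deriving limits from historical in-control data is the same.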
Accessing large CM data sets for rapid process analysis is a new challenge requiring modern software, such as Seeq combined with the OSIsoft Enterprise Infrastructure. These software applications empower SMEs to characterize the process, understand normal process variability, calculate statistics, and generate reports to be shared across teams. These reports can be used both for proactive process monitoring by providing visual feedback during operation, as well as for post-execution analysis and trending to provide greater insight into process performance.
All figures courtesy of Merck and GEA.
About the authors
Laura Wareham is a senior scientist, engineering in the Merck manufacturing division within the Pharmaceutical Commercialization Technology group supporting pharmaceutical process development of oral solid dosage forms from Phase IIb through regulatory filings, process performance qualification, and into commercial launch. Laura has been involved in developing Merck’s first continuously manufactured drug product from equipment design through process development and control strategy evaluation.
Laura holds a B.S. degree in Chemical Engineering from Rensselaer Polytechnic Institute, and a M.Eng. degree in Chemical Engineering from Lehigh University.
Emily Johnston is a senior analytics engineer at Seeq and works with end users to analyze data for insights into process, equipment and quality. After graduating from Texas A&M University with a B.S. in biomedical engineering, Emily spent five years in the process automation industry configuring process control systems for manufacturing plants. At Seeq, she enjoys working across multiple industries to solve big data challenges using advanced analytics.
References
1. Wareham, L. and Anderson, E. (2020). Advances in Data Analytics for Continuous Manufacturing [PowerPoint presentation]. OSIsoft PI World, San Francisco. https://www.osisoft.com/presentations/advances-in-data-analytics-for-continuous-manufacturing/
2. Wareham, L. and Graham, L. (2019). Leveraging Data Analytics to Support Merck's Journey in Continuous Manufacturing [PowerPoint presentation]. OSIsoft PI World, San Francisco. https://www.osisoft.com/presentations/leveraging-data-analytics-to-support-merck-s-journey-in-continuous-manufacturing–merck-seeqx/
3. GEAPharma (2015, June 26). ConsiGma Coater animation [Video]. YouTube. https://www.youtube.com/watch?v=FXRjt1RZ3CQ
The opinions expressed in this blog post are the authors' alone and do not necessarily reflect those of Pharmaceutical Processing World or its employees.