Data sharing and collaboration between companies, laboratories and individuals are bringing exciting changes to the pharmaceutical industry, and the hope is that this will enable drugs to reach the market more quickly, new ideas to be implemented faster and costs to be cut.
Technology transfer is historically associated with the transfer of process knowledge from development to the manufacturing organization. The process of “how to manufacture” the drug product at scale is developed, and this knowledge is handed over to the manufacturing organization to implement. All of the knowledge generated during development is critical when the manufacturing organization scales up to make the product. The nuances of which variables affect product quality and purity are also explored in the development phase, so this tacit knowledge and insight is fundamental to the manufacturing process. As well as improving processes within the development stage, it is essential to create a platform through which the generated knowledge can be transferred easily.
In the pharma space, standards have been developed to help this digitized “technology transfer” take place more effectively. S88 (ANSI/ISA-88), for example, defines formats and information models designed to capture the critical elements of a process so that it can be recreated. Adoption of the standard has been slow, however, and many technology transfers still rely on documents, spreadsheets and presentations. This represents a huge problem, and a great opportunity for technology to help.
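To make the idea of a structured, exchangeable process description more concrete, the sketch below shows a minimal, hypothetical representation of an S88-style recipe hierarchy (recipe, unit procedures, operations and phases with their parameters). The class names, field names and example values are illustrative assumptions, not the standard’s formal schema.

```python
# Minimal, illustrative sketch of an S88-style recipe hierarchy.
# ISA-88 defines the concepts (recipe, unit procedure, operation, phase);
# this particular schema and the example values are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Phase:
    name: str                                                 # e.g. "Charge solvent"
    parameters: Dict[str, str] = field(default_factory=dict)  # e.g. {"volume": "250 L"}


@dataclass
class Operation:
    name: str                                    # e.g. "Reaction"
    phases: List[Phase] = field(default_factory=list)


@dataclass
class UnitProcedure:
    unit: str                                    # e.g. "Reactor R-101"
    operations: List[Operation] = field(default_factory=list)


@dataclass
class Recipe:
    product: str
    unit_procedures: List[UnitProcedure] = field(default_factory=list)


# A transferable recipe fragment: structured data rather than a slide deck.
recipe = Recipe(
    product="Example drug substance",
    unit_procedures=[
        UnitProcedure(
            unit="Reactor R-101",
            operations=[
                Operation(
                    name="Reaction",
                    phases=[
                        Phase("Charge solvent", {"volume": "250 L"}),
                        Phase("Heat", {"target_temperature": "60 C", "ramp": "1 C/min"}),
                    ],
                )
            ],
        )
    ],
)
```

Because every element is explicit and machine-readable, a receiving organization can validate, compare and scale such a description far more easily than a description buried in documents and presentations.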
Transferring knowledge between multiple organizations
In recent years, the idea of tech transfer has moved beyond the standard development-to-manufacturing handover within a single organization to a scenario in which technology transfers take place across different organizations, including CDMOs (contract development and manufacturing organizations), CROs (contract research organizations), external laboratories and academic departments. This externalized scenario places even greater demands on the sharing of good data and process knowledge, since the people involved no longer work in the same organization and are therefore neither incentivized nor driven to collaborate in the same way.
However, the commercial incentives at play in these externalized technology transfers and exchanges can also drive successful outcomes. They are still dogged by a common problem: how best to share data, knowledge and insights effectively and in an easily consumable form.
The same principles that apply to technology transfers within companies should also apply to those that occur externally. This is where cloud-based, externalized collaboration tools come into play. The process development phase can be long and requires the capture of a huge variety and volume of data from instruments, analytical results and process information. Various technology solutions are now available to help with this. The benefit of having these tools in the cloud is twofold: easy access for everyone who needs it across all organizations, and access to compute and storage infrastructure to make the best use of the data and information available.
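As a purely illustrative example of what “easily consumable” capture might look like, the sketch below records one analytical result together with the provenance metadata a receiving partner would need to interpret it. The field names, identifiers and storage location are assumptions for illustration, not any particular vendor’s schema.

```python
# Hypothetical sketch of a shareable analytical-result record carrying the
# provenance metadata (who, what, how, units, raw-data location) that a
# receiving organization needs. All identifiers here are placeholders.
import json
from datetime import datetime, timezone

result_record = {
    "record_id": "HPLC-2024-0001",              # illustrative identifier
    "organization": "Example CDMO partner",
    "instrument": "HPLC",                       # instrument type, not a vendor model
    "method": "assay/purity method v3",         # illustrative method reference
    "sample_id": "BATCH-042-S1",
    "analyte": "Example API",
    "value": 99.2,
    "units": "% area",
    "captured_at": datetime.now(timezone.utc).isoformat(),
    "raw_data_uri": "s3://example-bucket/hplc/run-0001/",  # placeholder location
}

# Serialized as JSON, the record can be exchanged between organizations or
# loaded into a shared cloud store alongside the raw files it points to.
print(json.dumps(result_record, indent=2))
```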
Externalized Research and Development
So far we have considered technology transfer only in the development-to-manufacturing sense, but the concept applies to any data and knowledge exchange. Pharma companies are externalizing their research and development more and more, with many setting board-level strategic targets to externalize large percentages (upwards of 25 percent) of their work. Some companies have even become what is termed virtual: all lab work is done by third parties, and the holding company stays lean because it carries no laboratory overheads.
Each of these cases is governed by the same principles as the development-to-manufacturing space: data, knowledge and insights are the foundations. The question then becomes how to ensure that the data, knowledge and insights being generated are the ones required, and of good enough quality to support business decision-making. The S88 format is only part of the puzzle here: it is good at documenting what has been done in an exchangeable manner, but it says nothing about the quality of the information or the underlying “raw data”.
Data Quality in Tech Transfer
Quality is measured in many different ways, but it is principally determined by whether the information and data are consumable, analyzable, broad enough to be statistically robust and, most importantly, repeatable. This is a common problem for the scientific community, and we are seeing new technologies applied to help assess the quality of data and information. This advanced analytics approach to assessing scientific data is where AI, in all its forms from machine learning to deep learning, comes in. Furthermore, to apply these AI technologies easily, the data and information must be cloud-based because of the cost implications of trying to do this at scale.
Quality can perhaps be defined as the combination of all the aspects that make scientific data good, so we are also seeing new technologies emerge that assess data quality not in isolation but in the context of the questions the data needs to answer. This sounds like the stuff of science fiction, but there are many examples of this approach being used in other industries such as financial services. Although the data structures and complexities are simpler in those areas, the relevance remains valid, and companies are now emerging that provide “data quality” as a service for pharma companies. The proposition is simple: they assess the quality and density of the data and information, giving a quantitative assessment and insights into gaps in the data.
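As a simplified illustration of what such a quantitative assessment might involve, the sketch below scores a set of replicate measurements for completeness and repeatability (via the coefficient of variation) and flags gaps. The thresholds, field names and example values are assumptions chosen for illustration, not any vendor’s actual scoring method.

```python
# Illustrative data-quality check: completeness and repeatability of
# replicate measurements. The thresholds are arbitrary examples, not a
# real data-quality service's scoring rules.
from statistics import mean, stdev
from typing import List, Optional


def completeness(values: List[Optional[float]]) -> float:
    """Fraction of expected measurements that are actually present."""
    if not values:
        return 0.0
    return sum(v is not None for v in values) / len(values)


def repeatability_cv(values: List[Optional[float]]) -> Optional[float]:
    """Coefficient of variation (%) across replicates, ignoring gaps."""
    present = [v for v in values if v is not None]
    if len(present) < 2 or mean(present) == 0:
        return None
    return 100.0 * stdev(present) / mean(present)


# Example: purity results for one batch, with one missing replicate.
replicates = [99.1, 99.3, None, 99.2, 98.9]

score_completeness = completeness(replicates)   # 0.8
score_cv = repeatability_cv(replicates)         # roughly 0.17 %

# Arbitrary example thresholds for flagging gaps in the dataset.
flags = []
if score_completeness < 0.9:
    flags.append("missing replicates")
if score_cv is not None and score_cv > 2.0:
    flags.append("poor repeatability")

print(f"completeness={score_completeness:.0%}, cv={score_cv:.2f}%, flags={flags}")
```

Real assessments are of course far richer, weighing whether the data can actually answer the scientific question at hand, but even simple scores like these make gaps visible and comparable across datasets and partners.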
Why is this data quality important? Because when we consider the wider definition of technology transfer, organizations want to know that what they are buying is of good quality, whether it is contracted work to develop a new process or a merger and acquisition project to ingest assets and knowledge. The last part of technology transfer at the macro scale is how to ingest and aggregate data from external parties alongside your own internal data and knowledge. This is a long-term problem, and one that will continue to drive innovation in data management and knowledge management. The cloud and “large data” handling technologies are providing a foundation for that innovation to take place, but this is still an emerging domain.
In conclusion, pharma organizations can manage the transfer of data and process information using existing standards like S88, but that is only the first step along the road to truly effective data and knowledge exchange. The cloud and associated “data interrogation and analysis” technologies are likely to provide more robust exchange and greater understanding of the inherent value of data and knowledge.