
Which Technologies you Need to Use, and for What, to be Successful with your Teams in Pharma 4.0

Fausto Artico, Global R&D Tech Head and Director of Innovation and Data Science, GSK

Kevin Harrigan, Director of Innovation and Engineering, GSK

We live in a volatile, uncertain, complex and ambiguous (VUCA) world. Things will continue to change faster and faster. The question is no longer if you and your company will be disrupted but when (and how). Therefore, you need to be continuously on the lookout for new, emerging technologies that could be used by you to generate competitive advantage and by your competitors to disrupt the market. Many such emerging technologies might not be mature enough, but you need to assess them anyway. In fact, to be ahead of your competitors, you and your teams need to understand these technologies and start to build internal capabilities (e.g., acquire talent, run experiments, create support ecosystems) now. This will allow you to quickly scale up their adoption inside your organisation as they enter their growth and maturity phases.

Introduction

Emerging technologies can help you automate processes in ways believed impossible just five years ago. Furthermore, many pharma companies are accelerating their digital transformation journey, pairing their use of emerging technologies with other methodologies like Agile, DevOps, Extreme Programming, etc. But what are some of the most promising emerging technologies you and your teams can use to accelerate your digital transformation journey? How do they work, and why are they so promising? In this short article, we provide a quick overview of the emerging technologies we think are most useful to you and your teams in pharma. You do not need to have a technical background or know how they work. We will make them easy to understand. In addition, we will provide some quick examples and explain where you can start to leverage them inside your company. By following this approach, you will create some quick wins that generate business value, unlock more funds for you and your teams, and increase your probability of career progression.

Interesting Emerging Technologies

3D PRINTING: It provides the ability to code what you want the machine to build for you. It is also called ‘additive manufacturing’ because the machine adds layer upon layer of material to create the physical objects you want. The degree of personalisation you can achieve with 3D printing is enormous. 3D printing is still far from being mature enough to allow you to create ad-hoc manufacturing components or personalised medicines. However, it is maturing, and some important governmental initiatives, especially in the USA, are creating synergies between universities and industry, especially in the manufacturing sector, to accelerate the capabilities it can offer. The most promising usage of this technology in the pharma sector is for creating highly personalised implants, so your bioengineering departments are good potential collaborators for experimenting with it.

ANALYTICS: It is so pervasive that you can use it everywhere. From executing simple statistical methods to visualising results in dashboards, your teams have an unlimited number of ways to use analytics to discover and show drifts, deviations, trends, etc. It can also be used to generate new insights on why adverse events happen (e.g., deviations). This is important because operators and users struggle to make sense of all the information available. Analytics can help them prioritise what to focus on, independently of which part of the organisation they work in (e.g., drug discovery, clinical trials, manufacturing, supply chain or commercialisation).
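A minimal sketch of the kind of drift detection described above: compare each new reading against a rolling baseline and flag values that deviate beyond a few standard deviations. The window size, threshold and readings here are hypothetical illustrations, not values from any real process.

```python
import statistics

def flag_drift(readings, window=20, threshold=3.0):
    """Flag readings that drift beyond `threshold` standard deviations
    from a rolling baseline built on the previous `window` values."""
    flags = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev and abs(readings[i] - mean) > threshold * stdev:
            flags.append(i)
    return flags

# A stable, cyclic process with one sudden deviation at index 30
data = [10.0 + 0.1 * (i % 5) for i in range(40)]
data[30] = 25.0
print(flag_drift(data))  # -> [30]
```

The same pattern scales from a dashboard alert on one KPI to batch monitoring across a whole manufacturing line; only the baseline-building step changes.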

ARTIFICIAL INTELLIGENCE (AI): AI can automate many processes, thereby reducing the need for human intervention and achieving scalable cost savings. AI is important as more and more systems become impossible to manage using human attention alone. AI is also important for quickly reacting to unexpected events that could take time for a human being to recognise but which could be easily stopped and/or taken care of by software systems. However, always remember to embed monitoring capabilities so that such systems can ask for human intervention if the situation drifts too much from a baseline and/or the systems recognise the situation to be too different from any of the ones used for their training.
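The automate-but-escalate pattern described above can be sketched as follows. The event names, actions and confidence threshold are hypothetical; the point is that anything unfamiliar or low-confidence is routed to a human rather than handled blindly.

```python
# Responses the system has been trained and validated to handle (hypothetical)
KNOWN_ACTIONS = {
    "temperature_spike": "reduce_heater_power",
    "pressure_drop": "close_inlet_valve",
}

def respond(event, confidence, min_confidence=0.8):
    """Automate the response to recognised, high-confidence events;
    escalate anything unfamiliar or uncertain to a human operator."""
    if event in KNOWN_ACTIONS and confidence >= min_confidence:
        return ("auto", KNOWN_ACTIONS[event])
    return ("human", "page_on_call_operator")

print(respond("temperature_spike", 0.95))  # -> ('auto', 'reduce_heater_power')
print(respond("unknown_vibration", 0.95))  # -> ('human', 'page_on_call_operator')
```

Note that a low-confidence classification of a known event also escalates, which is the monitoring safeguard the paragraph recommends.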

AUGMENTED REALITY (AR): AR overlays information on your vision field to revolutionise the way you execute root cause analyses. This is especially true in manufacturing where equipment has many mechanical components. For example, using smart glasses with AR allows people to quickly discover handles and other machine parts that seem to be in the wrong positions as well as to verify wear and tear of many small components that are difficult to analyse by the human eye. Because the precision and accuracy of such solutions is heavily dependent on Computer Vision capabilities, make sure to have at least a Deep Learning expert in your teams when you build or buy AR solutions.

BIG DATA (BD): The ability to ingest, clean, link, contextualise and harmonise many disparate data sources is becoming incredibly important. This will become easier as BD capabilities mature. Many tools and platforms in the marketplace now offer graphical user interfaces, so even non-technical people can simply drag, drop and connect boxes to create data pipelines that execute all the typical BD operations in the background. Thanks to BD, your workers and partners save huge amounts of time accessing all the data and have it organised using agreed-upon formats and schemas. You can use BD capabilities in any part of your company and on any project where it would be practically impossible to manually collate, check and update the deluge of information the existing systems are constantly generating.
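The drag-and-drop boxes mentioned above are, underneath, just composable pipeline steps. A minimal sketch, with hypothetical field names and records, of a clean-then-harmonise pipeline:

```python
def clean(records):
    """Drop incomplete rows (missing batch id or yield)."""
    return [r for r in records
            if r.get("batch_id") and r.get("yield_pct") is not None]

def harmonise(records):
    """Map disparate source fields onto one agreed schema."""
    return [{"batch": r["batch_id"], "yield": float(r["yield_pct"])}
            for r in records]

def pipeline(records, steps):
    """Run the records through each step in order, like connected boxes."""
    for step in steps:
        records = step(records)
    return records

raw = [
    {"batch_id": "B001", "yield_pct": "92.5"},
    {"batch_id": None, "yield_pct": "88.0"},   # incomplete -> dropped
    {"batch_id": "B002", "yield_pct": "90.1"},
]
print(pipeline(raw, [clean, harmonise]))
```

Each step stays independently testable, which is exactly what makes the graphical pipeline builders approachable for non-technical users.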

BLOCKCHAIN: Transparency is a critical feature for building trust. Blockchain will become very important in the supply chain because it allows you to trace, in a fully transparent way, the provenance of all the source materials. It will also become extremely important in the creation and execution of smart contracts. Smart contracts will save huge amounts of time for legal departments. Contracts that can be ‘assembled’ using pre-established clauses for different geographical zones will probably be the first ones to be created in automatic ways using Blockchain as the underlying foundational technology.

CYBERSECURITY: It is important to minimise the possibility for malicious people to execute actions that are against the integrity of systems and the wellbeing of users and patients. However, cybersecurity can go a step further and be used to design future systems leveraging security principles from day one. This is important because, in future partnership ecosystems, more components will be designed and coded by different companies. Having each component provide its own cybersecurity capabilities minimises the possibility, after penetration, for attackers to ‘spread’ to other components and execute unauthorised actions. You and your teams can get in touch with the perimeter and security teams and start to use cybersecurity for creating monitoring systems able to detect anomalous user behaviour.

DATA SHARING: The huge need to innovate in this field has created plenty of opportunities. This is true especially if the systems you design and implement need to work in geographical zones that have different legal systems and regulations. It will also be important to: 1) create systems that are able to share data but that can also mask, for privacy reasons, important information in automated ways (e.g., people in some geographical zones cannot access some data or people accessing clinical trial data cannot deanonymise patients); 2) allow users to grant, but also revoke at any moment, access to their data; and 3) make sure sensitive information does not get copied and is destroyed in a timely fashion after the end of contracts. You and your teams can start to build basic masking capabilities first (e.g., mask addresses and names). When you have become more experienced in understanding the difficulties and nuances of creating a privacy system for data sharing at the local level, you can move to more complex activities like very granular policy creation (e.g., policies based on attributes and not roles) and code systems that ensure their performant enforcement at scale.
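The basic masking capability suggested above can start as simply as replacing direct identifiers with a salted one-way hash, so records remain linkable across datasets without exposing the raw values. The record fields, salt and truncation length below are hypothetical choices for illustration.

```python
import hashlib

def mask_record(record, fields=("name", "address")):
    """Replace direct identifiers with a salted one-way hash so records
    stay linkable across datasets without exposing the raw values."""
    masked = dict(record)
    for field in fields:
        if field in masked:
            digest = hashlib.sha256(
                ("demo-salt:" + masked[field]).encode()).hexdigest()
            masked[field] = digest[:12]
    return masked

patient = {"name": "Jane Doe", "address": "1 Main St", "biomarker": 4.2}
print(mask_record(patient))
```

A production system would manage the salt as a secret and add the granular, attribute-based policies the paragraph describes, but this is the "mask addresses and names" starting point.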

MACHINE LEARNING: The ability of a machine to learn new things and generate insights using pre-codified rules is going to become a huge source of competitive advantage. A new algorithm is difficult to copy or understand if your competitors do not have access to its code. The more you create flexible rules providing strong statistical evidence that the models they create are reliable, resilient and robust, the more you can be successful in solving many different challenges in the whole Pharma pipeline. This is due to the generic nature of the rules and the fact that you can create them for generic abstractions of problems that seem different superficially (e.g., manufacturing problems vs drug discovery problems) but are similar from the mathematical point of view (i.e., symbols and the need to discover in automated ways the relationships and dynamics between them).

QUANTUM COMPUTING (QC): The most promising applications of QC in Pharma are probably those used to determine how proteins fold, thus aiding in the discovery of new drugs. There are still big limitations on the hardware side, but progress and workarounds are moving fast on the software side. Promising QC sub-fields in which we could see breakthroughs are the ones characterised by Quantum classifiers. Such classifiers could soon be able to scan huge amounts of genomic data in very short amounts of time and verify if a subject is affected by a specific disease or has a propensity to become affected. Compared to other emerging technologies, for QC you probably need a small team of people that could be difficult to find. To minimise cost, try to hire somebody with a Computer Science, Statistics, Mathematics and Physics background. If you cannot find anybody with all these backgrounds, then hire several people to cover all the bases.

REAL WORLD DATA: Government initiatives are making it easier to get data that can be used to design new clinical trial studies or accelerate drug discovery. The datasets made available through such initiatives are important because they are usually a ‘common denominator’ that companies can use in their collaborative efforts. The standards used to create such datasets are determined a priori. Such initiatives facilitate the adoption of such standards (e.g., schemas and formats) inside companies. This also means that if at a later stage companies decide to share their data, integration and sharing will happen quickly. You and your teams should leverage such rich datasets. This is true even if the reliability of the data sources cannot be easily checked and/or the replicability of the results is difficult to verify due to the lack of metadata that explains the protocols used to generate the data.

ROBOTIC PROCESS AUTOMATION: Contrary to what many people think, it is often a software capability. You install software on laptops and other devices, and it starts to track the actions people execute. Later, after collecting data for some time, the software finds commonalities and differences in how people execute actions and standardises them, creating push-button macros that spare people from manually repeating the common parts of those actions. This capability is especially important if some parts of some procedures must always be repeated in the same exact way for regulatory and compliance reasons. In addition, the macros save a great deal of people’s time, and such time is easily transformable into cost savings that senior executives can readily understand, and which scale up quickly as the adoption of the software increases.

SMART SENSORS: As the number and complexity of systems increase, the ability of sensors to ‘talk with’ and ‘observe’ each other is fundamental. Sensors are becoming capable of collaborating with each other thanks to their augmented capabilities to interconnect and work together (i.e., the Internet of Things). Working together, they can create baselines for the environment in which they operate, and multiple sub-groups in the ‘community’ they create can observe what each single sensor is doing, or claims to be doing, and help determine through voting methodologies whether it is healthy. Such a capability is important for maintenance people, especially in manufacturing, where machines can have thousands of sensors (e.g., bioreactors).
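One simple form of the voting methodology described above: peers that observe the same environment vote on whether a sensor's claimed reading is plausible, and a majority of close peers means it is considered healthy. The readings and tolerance below are hypothetical.

```python
def vote_on_health(peer_readings, claimed_reading, tolerance=2.0):
    """Each peer votes 'plausible' if the claimed reading is within
    `tolerance` of its own; a majority of plausible votes means healthy."""
    votes = [abs(r - claimed_reading) <= tolerance for r in peer_readings]
    return sum(votes) > len(votes) / 2

peers = [21.0, 21.3, 20.8, 21.1, 20.9]  # e.g., nearby temperature sensors
print(vote_on_health(peers, 21.2))   # consistent with peers -> True
print(vote_on_health(peers, 35.0))   # outlier claim -> False
```

Real deployments would weight votes by sensor reliability and handle correlated failures, but majority voting against a peer baseline is the core idea.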

VIRTUAL REALITY (VR): Whether for educational purposes or regulatory processes, you can use VR to train people on many things. For example, some young operators must be able to use and manage complex equipment for discovering new drugs in early-stage discovery or producing products in manufacturing. Such equipment can easily cost tens of millions of dollars. Human errors need to be minimised because they could easily cost your company tens of millions of dollars due to missed target submission deadlines and/or regulatory fines. Using VR to train people before introducing them to the real processes can greatly decrease the probability of misuse of such equipment. You can create digital twins, especially for highly mechanical systems for which the science is very well known, and have people, through VR, run what-if scenarios on how to use the equipment in different situations and conditions (even rare, anomalous ones).

WEARABLES: The new generation of wearable devices is easily configurable and can seamlessly and constantly stream a considerable number and variety of biomarkers. These new devices can do this using Bluetooth, the Internet or low-power wide-area networks (LoRaWAN). This opens unprecedented possibilities in clinical trials. Previously, doctors had to manually take and record biomarkers from patients at a very low sampling frequency. Thanks to the new generation of wearable devices, you can now monitor patients 24/7 and no longer have to wait to receive the data from your patients or doctors. Therefore, leveraging such devices, you can generate much richer datasets for each patient and take clinical trial actions much faster.
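The richer datasets mentioned above come from the jump in sampling frequency: a continuous stream can be summarised per window in ways a sparse clinic visit cannot. A minimal sketch, with simulated per-minute heart-rate values (hypothetical numbers):

```python
def summarise_stream(samples, window_size=60):
    """Collapse a continuous biomarker stream into per-window summaries
    (min / mean / max) -- detail a single clinic reading cannot capture."""
    summaries = []
    for start in range(0, len(samples), window_size):
        window = samples[start:start + window_size]
        summaries.append({
            "min": min(window),
            "mean": round(sum(window) / len(window), 1),
            "max": max(window),
        })
    return summaries

# One simulated hour of per-minute heart-rate samples (hypothetical values)
hour = [70 + (i % 10) for i in range(60)]
print(summarise_stream(hour))  # -> [{'min': 70, 'mean': 74.5, 'max': 79}]
```

The same aggregation, run continuously, is what lets trial teams act on a patient's data in hours rather than waiting for the next scheduled visit.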

Summary

In this VUCA world, it is important that you invest some time and money in exploring and assessing new emerging technologies on an ongoing basis. Doing so, by the time some of these technologies become mature, you and your team will have built the knowledge and confidence necessary to leverage them in larger and more complex projects. Such projects will be critical for you to generate a competitive, and in a considerable number of cases, sustainable, long-term advantage for your organisation.

--Issue 48--

Author Bio

Fausto Artico

Fausto has two PhDs. As a Physicist, Mathematician, Engineer, Computer Scientist, and High-Performance Computing (HPC) and Data Science expert, Fausto has worked on key projects at European and American government institutions and with key individuals, like Nobel Prize winner Michael J. Prather. After his time at NVIDIA corporation in Silicon Valley, Fausto worked at the IBM T J Watson Center in New York on Exascale Supercomputing Systems for the US government (e.g., Livermore and Oak Ridge Labs).

Kevin Harrigan

Kevin graduated with a BSAE in aerospace engineering from Pennsylvania State University. During his collegiate career he gained experience in a co-op position with Capital One Financial as a Data Analyst in Richmond, Virginia. It was this experience that afforded him the opportunity to find his passion in data munging, applied statistics, and programming. Following graduation, he accepted a full-time offer in their newly formed Digital Enterprise Organization, expanding his technical and analytical knowledge in areas such as distributed computing, clickstream analytics, multivariate testing, anomaly detection, and propensity modeling.
