Signal management is one of the most heavily audited pharmacovigilance processes, and it also generates some of the highest numbers of audit and inspection findings. The ability of Marketing Authorisation Holders (MAHs) to build a robust signal management system that is fully audit- and inspection-ready sometimes falls short of expectations. Happily, technology can be used to make the process more scientific and rigorous.
Technology in the signal management process can be divided into two categories. The first is the front end, i.e. the platform (.NET/Java) used to develop the system. The second is the back end, i.e. the programs/software (R, Python, SAS) used to process the data. In this blog, we will focus on the second category and discuss how R specifically can help improve the signal management process.
What is the Signal Management Process?
Signal management follows a series of steps, performed in the following order:
1. Signal Detection
2. Signal Validation
3. Signal Evaluation and Assessment
For the purpose of this discussion, we will divide signal management into two distinct aspects with different process requirements. The first is Signal Detection; the remaining two steps from the list above can be combined into a second category, Managing the Detected Signals. As an open-source language, R has many advantages and is highly conducive to exploratory data analysis, so it can be used very effectively both to detect and to manage signals.
Signal Detection using R
Signal detection is the process of detecting signals in various datasets, such as Individual Case Safety Reports (ICSRs), literature and social media. This is achieved by using statistical methods and thresholds to define outliers or trends in a "Drug-Event Combination (DEC)". DECs flagged in this way are called statistical signals, and detecting them across various sources requires a combination of statistical, programming and pharmacovigilance expertise. The size of the data set determines the statistical methods used: Bayesian techniques are typically used to tease out signals from large data sets, while frequentist methods are usually applied to smaller ones. However, irrespective of the statistical method or the size of the data set, R can be customized to run the statistics and generate signals for further processing.
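As an illustration of the frequentist side, the sketch below computes a Proportional Reporting Ratio (PRR) for a single DEC in base R. The counts, and the decision rule of PRR ≥ 2 with at least 3 co-reports (a commonly cited rule of thumb), are illustrative assumptions rather than any specific MAH's process:

```r
# Counts for one Drug-Event Combination (DEC) -- illustrative numbers only
n11 <- 25     # reports with the drug of interest AND the event
n10 <- 975    # reports with the drug of interest, other events
n01 <- 300    # reports with all other drugs AND the event
n00 <- 98700  # reports with all other drugs, other events

# Proportional Reporting Ratio: how disproportionately the event is
# reported with this drug compared with all other drugs
prr <- (n11 / (n11 + n10)) / (n01 / (n01 + n00))

# Approximate 95% confidence interval on the log scale
se <- sqrt(1/n11 - 1/(n11 + n10) + 1/n01 - 1/(n01 + n00))
ci <- exp(log(prr) + c(-1.96, 1.96) * se)

# Flag a statistical signal if the disproportionality rule is met
signal <- prr >= 2 && n11 >= 3
```

In practice the counts would be derived from the full ICSR database, and the same calculation would be vectorized across every DEC in the dataset before applying the threshold.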
Managing the Statistical Signals using R
Qualitative assessment, or medical judgment, is applied to statistical signals through signal validation to determine whether they warrant further action to verify them. This validation is performed using causality assessment.
Causality assessment is the assessment of the relationship between a drug treatment and the occurrence of a particular adverse event. A reporter who reports an event spontaneously has some grounds to suspect that the event was caused by the pharmaceutical product in question. Unfortunately, such spontaneously reported cases usually do not contain enough information for a robust causality assessment to be conducted. The R programming language can help rank or weight cases that have sufficient information and weed out those with limited information, saving time and effort for safety scientists and physicians.
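As a sketch of how such ranking might work, the snippet below scores a few mock ICSRs by the completeness of fields commonly considered in causality assessment (onset date, dechallenge, rechallenge, narrative). The field names, weights and cut-off are all illustrative assumptions, not a standard algorithm:

```r
# Mock ICSR data -- values invented for illustration
icsrs <- data.frame(
  case_id     = c("C001", "C002", "C003"),
  onset_date  = c("2020-01-10", NA, "2020-03-02"),
  dechallenge = c("positive", NA, NA),
  rechallenge = c(NA, NA, "positive"),
  narrative   = c("detailed text", "", "detailed text"),
  stringsAsFactors = FALSE
)

# Assumed weights reflecting how informative each field is for causality
weights <- c(onset_date = 0.3, dechallenge = 0.3,
             rechallenge = 0.2, narrative = 0.2)

# A field counts as present if it is neither missing nor an empty string
has_info <- function(x) !is.na(x) & nzchar(as.character(x))

# Weighted completeness score per case (columns selected in weight order)
icsrs$completeness <- as.vector(
  as.matrix(sapply(icsrs[names(weights)], has_info)) %*% weights
)

# Keep only cases informative enough to merit medical review
reviewable <- icsrs[icsrs$completeness >= 0.5, ]
```

Cases falling below the cut-off could be queued for follow-up rather than discarded, so the prioritisation supports, rather than replaces, medical judgment.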
Causality assessment in signal management is the aggregate analysis of spontaneously reported ICSRs. Such unsolicited ICSRs are one of the main sources of safety data on which analytics and causality assessment are performed for signal management and aggregate reports. This kind of data is large and "dirty"; that is the rule rather than the exception. Safety physicians and scientists often spend 60-70% of their time grappling with such "dirty" data, leaving them little time to perform the actual analysis and causality assessment. Verification of unsolicited data from the safety database is often done only at the analysis stage, and little effort is put into presenting the data visually, so data visualization in signal detection has not been given its due importance. R is one of the leading software tools used across industries today to produce visualizations for data interpretation, making it a natural choice to help bridge the gap that currently exists in the signal management process.
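As a sketch of that cleaning step, the snippet below standardizes a few mock spontaneous reports in base R: harmonizing drug and event terms, parsing mixed date formats, and counting reports per DEC. The resulting table is exactly the kind of input a trend visualization (for example, with ggplot2) would be built on; all column names and values here are invented for illustration:

```r
# Mock "dirty" spontaneous-report data -- values invented for illustration
raw <- data.frame(
  drug        = c(" aspirin", "ASPIRIN ", "Aspirin", "ibuprofen"),
  event       = c("nausea", "Nausea", "NAUSEA", "rash"),
  report_date = c("2021-01-05", "05/01/2021", "2021-02-11", "2021-02-20"),
  stringsAsFactors = FALSE
)

# Standardize case and whitespace so identical terms collapse together
raw$drug  <- tolower(trimws(raw$drug))
raw$event <- tolower(trimws(raw$event))

# Parse dates arriving in mixed formats (ISO first, then day/month/year)
parse_date <- function(x) {
  d <- as.Date(x, format = "%Y-%m-%d")
  d[is.na(d)] <- as.Date(x[is.na(d)], format = "%d/%m/%Y")
  d
}
raw$report_date <- parse_date(raw$report_date)

# Count reports per drug-event combination -- the input to a trend plot
dec_counts <- aggregate(report_date ~ drug + event, data = raw, FUN = length)
names(dec_counts)[3] <- "n_reports"
```

Without the standardization, "aspirin", "ASPIRIN " and "Aspirin" would be counted as three different drugs, silently diluting any disproportionality statistic computed downstream.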
In the current environment, where artificial intelligence and cognitive computing are hot topics, using software like R in the signal management process is the first logical step towards automation. The urgent requirement to detect signals in real time can be met more easily with such software. Signal management teams, and pharmacovigilance as a whole, should consider utilizing R in their processes to ensure higher efficiency, thus positively impacting patient safety.
Cytel's data science team applies advanced statistical techniques, including predictive modeling of biological processes and drug interactions, to unlock the potential of big data. To learn more about how we work with clients in this area, click the button below to download the brochure.
About the author
Dr Krishna Asvalayan is Associate Director of Services at Cytel. A pharmaceutical physician with more than 9 years of experience in the industry and 5 years of clinical experience, Krishna became passionate about bringing technology into the life sciences business as he developed insight into the domain. He previously worked for Novartis Oncology in India as Global Head, Signal Evaluation & Management, where he was responsible for global coordination of signal evaluation and management of oncology products using automated tools. Drawing on his training in biostatistics, he is currently working on novel statistical methods to evaluate safety data. His end-to-end exposure to clinical research activities, including operations and business development, across various products and therapeutic areas has given him deep insight into this dynamic field.