The Cytel blog keeps you up to speed with the latest developments in biostatistics and clinical biometrics.
Data management is an essential building block for successful Immuno-Oncology (I-O) trials. At the Immuno-Oncology Clinical Trials Operations meeting in New York earlier this year, Patti Arsenault, VP Quality Assurance at Cytel, discussed with Christopher Lamplugh, AVP, Clinical Data Management, Global Data Operations at Merck, the key challenges for data management in the space and what’s needed to overcome them.
As we prepare to close the door on 2017, we thought we would take a look back at the topics which have been most popular on the Cytel blog this year. It's an interesting insight into the pain points and opportunities that feature highly on our global biopharma audience's radar. Read on to learn which of our 2017 blogs have received the most interest from our audience so far.
Cytel offers a full range of clinical data management services, with a team of experts spread across the globe.
In this blog we talk to Makarand, who is based in India, to find out more about his career path, current role at Cytel and his interests outside of work.
The Society for Clinical Data Management (SCDM) conference landed in Orlando last week, providing insights and key trends for clinical data managers from around the world. In this blog we share an infographic of some of the agenda highlights, along with a more detailed overview of Cytel's Alla Muchnik's contribution to the session Study Medication Compliance: Data Collection Challenges.
As a recognized expert in adaptive trials, Cytel has extensive experience designing and managing trials with interim analyses. To ensure success in what are often complex studies, data management as well as statistical expertise is required. Cytel data managers are well versed in the various nuances and demands of managing the successful delivery of an interim analysis from a data collection point of view.
Success from the data management standpoint depends on three core elements: effective timeline management, thoughtful database design, and a proactive approach to data cleaning. In this blog, Patti Arsenault, our Global Head of Data Management, shares her thoughts.
A precise and thorough approach to planning is key for success in data management.
The Data Management Plan (DMP) is a critical document in any data management project. It outlines all of the data management work to be done, the timelines and milestones to be achieved, and the outputs to be produced. The DMP lets all of the stakeholders know what to expect, in what form, and when.
The Society for Clinical Data Management (SCDM)'s publication, Good Clinical Data Management Practices (GCDMP) (1), provides a complete chapter on Data Management Plans. (The GCDMP is available, even to non-members of the society, on their webpage.) It is important to note that while DMPs are not regulated documents, they are so commonly used across the industry that they have become auditable, and therefore close attention needs to be paid to getting them right.
We outline 4 key points to bear in mind when creating or reviewing a Data Management Plan.
Use a Standard Template for Consistency
To a great extent, the DMP can, and should, be standardized across projects for a consistent approach. When using a centralized biometrics model, where data services (data management, statistics, statistical programming) are conducted by a single provider, the development of such standard documents can bring efficiency to study set-up and also reduce the oversight burden for the sponsor. Indeed, for any trial project, a robust Data Management Plan template provides a solid starting point. One of the important challenges facing industry professionals today is the increasing complexity of clinical trials, so great care needs to be taken to ensure the DMP accurately documents what actions will be taken with the trial data. Having a highly experienced data management team working on your project, with a track record of implementing innovative and complex trial designs, therefore becomes increasingly important in this environment.
Adaptive designs have the potential to accelerate clinical development and improve the probability of trial success. While the principle is simple (to reduce the uncertainty in clinical development by obtaining additional information from the ongoing trial), the statistical methodologies can be complex, and expert support is often required to develop the clinical trial design. There's also complexity in the data collection itself, so knowledgeable data management support is needed to successfully execute an innovative trial design. In this blog, we take a look at 5 top considerations for successful adaptive trial data management.
Editor's note: this blog was refreshed in April 2018.
As CDISC-compliant submissions become increasingly expected, biopharmaceutical companies are considering how to approach the issue of data standards governance. Standards governance is a linchpin in the management of CDISC compliance and is important for promoting standards awareness within organizations. It’s also an acknowledged hot topic in the industry.
It has traditionally been common practice for biopharma companies to outsource the CDISC conversion of legacy data to expert CROs for the purposes of publications and submissions. While large biopharma organizations may have dedicated in-house teams deployed to manage standards governance, the dynamic nature of CDISC requirements means companies can struggle to find the resources to keep up to date and provide the best interpretation of the documentation. Outsourcing can be an option to ensure dedicated staff are available to manage and monitor these aspects and keep companies submission ready.
To close a clinical database right the first time, you have to begin at study start-up. Clearly, you can’t close a database if the data is not cleaned, and you can’t have clean data unless you know what is most important for analysis. It’s imperative that data management works closely with the statistics group during CRF/eCRF design to ensure data is being collected and data checks are being written in a meaningful fashion. But that’s still not enough. The data should be cleaned on a regular basis, and forms locked as soon as the data has been source data verified (SDV'd) and reviewed. Even then, it is important to have your statistics team run listings and tables early on to catch anything unexpected. If the data is cleaned and locked by the time the last patient visit comes around, then getting Principal Investigator sign-off and ultimately closing the database can run much more smoothly and quickly.
Database lock is a significant milestone in the clinical trial, upon which further data analysis and reporting timelines depend. The Clinical Data Manager is responsible for steering the data management process to ensure that the database is locked on time and correctly. In this blog we lay out the 6 steps to database lock success.