The Cytel blog keeps you up to speed with the latest developments in biostatistics and clinical biometrics.
As a recognized expert in adaptive trials, Cytel has extensive experience designing and managing trials with interim analyses. To ensure success in what are often complex studies, data management as well as statistical expertise is required. Cytel data managers are well versed in the various nuances and demands of managing the successful delivery of an interim analysis from a data collection point of view.
Success from the data management standpoint depends on three core elements: effective timeline management, thoughtful database design, and a proactive approach to data cleaning. In this blog, Patti Arsenault, our Global Head of Data Management, shares her thoughts.
A precise and thorough approach to planning is key for success in data management.
The Data Management Plan (DMP) is a critical document in any data management project. It outlines all of the data management work to be done, the timelines and milestones to be achieved, and the outputs to be produced. The DMP lets all stakeholders know what to expect, in what form, and when.
The Society for Clinical Data Management's (SCDM) publication, Good Clinical Data Management Practices (GCDMP) (1), provides a complete chapter on Data Management Plans. (The GCDMP is available, even to non-members of the society, on their website.) It is important to note that while DMPs are not regulated documents, they are so commonly used across the industry that they have become auditable, and therefore scrupulously close attention needs to be paid to getting them right.
We outline 4 key points to bear in mind when creating or reviewing a Data Management Plan.
Use a Standard Template for Consistency
To a great extent, the DMP can, and should, be standardized across projects for a consistent approach. When using a centralized biometrics model, where data services (data management, statistics, and statistical programming) are conducted by a single provider, the development of such standard documents can bring efficiency to study set-up and also reduce the oversight burden for the sponsor. Indeed, for any trial project, a robust Data Management Plan template provides a solid starting point. One of the important challenges facing industry professionals today is the increasing complexity of clinical trials, so great care needs to be taken to ensure the DMP accurately documents what actions will be taken with the trial data. Having a highly experienced data management team working on your project, with a track record of implementing innovative and complex trial designs, therefore becomes increasingly important in this environment.
Adaptive designs have the potential to accelerate clinical development and improve the probability of trial success. While the principle is simple, to reduce the uncertainty in clinical development by obtaining additional information from the ongoing trial, the statistical methodologies can be complex, and expert support is often required to develop the trial design. There is also complexity in the data collection itself, so knowledgeable data management support is needed to successfully execute an innovative trial design. In this blog, we take a look at 5 top considerations for successful adaptive trial data management.
Editor's note: this blog was refreshed in April 2018.
As CDISC compliant submissions become increasingly expected, biopharmaceutical companies are considering how to approach the issue of data standards governance. Standards governance is a lynchpin in the management of CDISC compliance and is important for promoting standards awareness within organizations. It’s also an acknowledged hot topic in the industry.
It has traditionally been common practice for biopharma companies to outsource the CDISC conversion of legacy data to expert CROs for the purposes of publications and submissions. While large biopharma organizations may have dedicated in-house teams deployed to the management of standards governance, the dynamic nature of CDISC requirements means companies can struggle to find the resources to keep up to date and provide the best interpretation of the documentation. Outsourcing can be an option to ensure dedicated staff are available to manage and monitor these aspects and keep companies submission ready.
To close a clinical database right the first time, you have to begin with study start-up. Clearly, you can't close a database if the data is not cleaned, and you can't have clean data unless you know what is most important for analysis. It's imperative that data management works closely with the statistics group during CRF/eCRF design to ensure data is being collected and data checks are being written in a meaningful fashion. But that's still not enough. The data should be cleaned on a regular basis, and forms locked as soon as the data has been source data verified (SDV'd) and reviewed. Even then, it is important to have your statistics team run listings and tables early on to catch anything unexpected. If the data is cleaned and locked by the time the last patient visit comes around, then getting Principal Investigator sign-off and ultimately closing the database can run much more smoothly and quickly.
Database lock is a significant milestone in the clinical trial, upon which further data analysis and reporting timelines depend. The Clinical Data Manager is responsible for steering the data management process to ensure that the database is locked on time, and correctly. In this blog we lay out the 6 steps to database lock success.
How do you go about selecting the best Electronic Data Capture (EDC) system for your study? There is now a vast amount of choice in the market, and many factors to take into account before making your decision. Different stakeholders within the business may also have different perspectives, so any decision-making process needs to balance these disparate needs.
In this blog we’ll highlight some unique challenges that are encountered from a Data Management perspective when working on early phase Oncology trials. We’ll also discuss approaches which can be employed to mitigate these issues.
During the course of any clinical trial, there are often data which, while collected electronically, are outside of the scope of the eCRF. These data include central lab results, ECGs, PK/PD data, and others. In this blog we'll take a look at some key considerations in handling electronic data transfers and any subsequent integration with the core EDC database.
It's critical for biostatistics and data management to be closely aligned and working effectively together. The consequences when these biometrics teams aren't integrated can be significant, impacting both efficiency and data quality. If data is collected and cleaned without the input of statistics, the assumptions that have been made may not be adequate, resulting in additional work and compromised timelines. So, let's take a closer look at 5 important interactions between the two functions during the course of a clinical trial.