
Educational Forum

Data management in clinical research: An overview


Binny Krishnankutty, Shantala Bellary, Naveen Kumar B.R., Latha S. Moodahadu

ABSTRACT
Clinical Data Management (CDM) is a critical phase in clinical research, which leads to generation of high-quality, reliable, and statistically sound data from clinical trials. This helps to produce a drastic reduction in time from drug development to marketing. Team members of CDM are actively involved in all stages of clinical trial right from inception to completion. They should have adequate process knowledge that helps maintain the quality standards of CDM processes. Various procedures in CDM including Case Report Form (CRF) designing, CRF annotation, database designing, data entry, data validation, discrepancy management, medical coding, data extraction, and database locking are assessed for quality at regular intervals during a trial. In the present scenario, there is an increased demand to improve the CDM standards to meet the regulatory requirements and stay ahead of the competition by means of faster commercialization of product. With the implementation of regulatory compliant data management tools, CDM team can meet these demands. Additionally, it is becoming mandatory for companies to submit the data electronically. CDM professionals should meet appropriate expectations and set standards for data quality and also have a drive to adapt to the rapidly changing technology. This article highlights the processes involved and provides the reader an overview of the tools and standards adopted as well as the roles and responsibilities in CDM.

Global Medical Affairs, Dr. Reddy's Laboratories Ltd., Ameerpet, Hyderabad, India
Received: 07-03-2011; Revised: 08-11-2011; Accepted: 01-01-2012
Correspondence to: Dr. Binny Krishnankutty, E-mail: [email protected]

KEY WORDS: Clinical data interchange standards consortium, clinical data management
systems, data management, e‑CRF, good clinical data management practices, validation

Introduction

A clinical trial is intended to find answers to the research question by means of generating data for proving or disproving a hypothesis. The quality of the data generated plays an important role in the outcome of the study. Often research students ask the question, "What is Clinical Data Management (CDM) and what is its significance?" Clinical data management is a relevant and important part of a clinical trial. All researchers try their hands at CDM activities during their research work, knowingly or unknowingly. Without identifying the technical phases, we undertake some of the processes involved in CDM during our research work. This article highlights the processes involved in CDM and gives the reader an overview of how data are managed in clinical trials.

CDM is the process of collection, cleaning, and management of subject data in compliance with regulatory standards. The primary objective of CDM processes is to provide high-quality data by keeping the number of errors and missing data as low as possible and to gather the maximum data for analysis.[1] To meet this objective, best practices are adopted to ensure that data are complete, reliable, and processed correctly. This has been facilitated by the use of software applications that maintain an audit trail and provide easy identification and resolution of data discrepancies. Sophisticated innovations[2] have enabled CDM to handle large trials and ensure the data quality even in complex trials.

How do we define 'high-quality' data? High-quality data should be absolutely accurate and suitable for statistical analysis. They should meet the protocol-specified parameters and comply with the protocol requirements. This implies that in case of a deviation not meeting the protocol specifications, we may think of excluding the patient from the final database. It should be borne in mind that in some situations regulatory authorities may be interested in looking at such data. Similarly, missing data are also a matter of concern for clinical researchers. High-quality data should have minimal or no misses.


But most importantly, high-quality data should possess only an arbitrarily 'acceptable level of variation' that would not affect the conclusion of the study on statistical analysis. The data should also meet the applicable regulatory requirements specified for data quality.

Tools for CDM

Many software tools are available for data management, and these are called Clinical Data Management Systems (CDMS). In multicentric trials, a CDMS has become essential to handle the huge amount of data. Most of the CDMS used in pharmaceutical companies are commercial, but a few open source tools are available as well. Commonly used CDM tools are ORACLE CLINICAL, CLINTRIAL, MACRO, RAVE, and eClinical Suite. In terms of functionality, these software tools are more or less similar and there is no significant advantage of one system over the other. These software tools are expensive and need sophisticated Information Technology infrastructure to function. Additionally, some multinational pharmaceutical giants use custom-made CDMS tools to suit their operational needs and procedures. Among the open source tools, the most prominent ones are OpenClinica, openCDMS, TrialDB, and PhOSCo. These CDM software packages are available free of cost, are as good as their commercial counterparts in terms of functionality, and can be downloaded from their respective websites.

In regulatory submission studies, maintaining an audit trail of data management activities is of paramount importance. These CDM tools ensure the audit trail and help in the management of discrepancies. According to the roles and responsibilities (explained later), multiple user IDs can be created with access limited to data entry, medical coding, database designing, or quality check. This ensures that each user can access only the respective functionalities allotted to that user ID and cannot make any other change in the database. For responsibilities where changes are permitted to be made in the data, the software will record the change made, the user ID that made the change, and the time and date of the change, for audit purposes (audit trail). During a regulatory audit, the auditors can verify the discrepancy management process and the changes made, and can confirm that no unauthorized or false changes were made.

Regulations, Guidelines, and Standards in CDM

Akin to other areas in clinical research, CDM has guidelines and standards that must be followed. Since the pharmaceutical industry relies on electronically captured data for the evaluation of medicines, there is a need to follow good practices in CDM and maintain standards in electronic data capture. These electronic records have to comply with a Code of Federal Regulations (CFR), 21 CFR Part 11. This regulation is applicable to records in electronic format that are created, modified, maintained, archived, retrieved, or transmitted. It demands the use of validated systems to ensure accuracy, reliability, and consistency of data, with the use of secure, computer-generated, time-stamped audit trails to independently record the date and time of operator entries and actions that create, modify, or delete electronic records.[3] Adequate procedures and controls should be put in place to ensure the integrity, authenticity, and confidentiality of data. If data have to be submitted to regulatory authorities, they should be entered and processed in 21 CFR Part 11-compliant systems. Most of the CDM systems available are like this, and pharmaceutical companies as well as contract research organizations ensure this compliance.

The Society for Clinical Data Management (SCDM) publishes the Good Clinical Data Management Practices (GCDMP) guidelines, a document providing the standards of good practice within CDM. GCDMP was initially published in September 2000 and has undergone several revisions thereafter. The July 2009 version is the currently followed GCDMP document. GCDMP provides guidance on the accepted practices in CDM that are consistent with regulatory practices. Addressed in 20 chapters, it covers the CDM process by highlighting the minimum standards and best practices.

The Clinical Data Interchange Standards Consortium (CDISC), a multidisciplinary non-profit organization, has developed standards to support the acquisition, exchange, submission, and archival of clinical research data and metadata. Metadata are the data about the data entered. This includes data about the individual who made the entry or a change in the clinical data, the date and time of the entry/change, and details of the changes that have been made. Among the standards, two important ones are the Study Data Tabulation Model Implementation Guide for Human Clinical Trials (SDTMIG) and the Clinical Data Acquisition Standards Harmonization (CDASH) standards, available free of cost from the CDISC website (www.cdisc.org). The SDTMIG standard[4] describes the details of the model and standard terminologies for the data and serves as a guide to the organization. CDASH v1.1[5] defines the basic standards for the collection of data in a clinical trial and enlists the basic data information needed from a clinical, regulatory, and scientific perspective.

The CDM Process

The CDM process, like a clinical trial, begins with the end in mind. This means that the whole process is designed keeping the deliverable in view. As a clinical trial is designed to answer the research question, the CDM process is designed to deliver an error-free, valid, and statistically sound database. To meet this objective, the CDM process starts early, even before the finalization of the study protocol.

Review and finalization of study documents

The protocol is reviewed from a database designing perspective, for clarity and consistency. During this review, the CDM personnel will identify the data items to be collected and the frequency of collection with respect to the visit schedule. A Case Report Form (CRF) is designed by the CDM team, as this is the first step in translating the protocol-specific activities into the data being generated. The data fields should be clearly defined and be consistent throughout. The type of data to be entered should be evident from the CRF. For example, if weight has to be captured to two decimal places, the data entry field should have two data boxes placed after the decimal, as shown in Figure 1. Similarly, the units in which measurements have to be made should also be mentioned next to the data field. The CRF should be concise, self-explanatory, and user-friendly (unless you are the one entering data into the CRF). Along with the CRF, the filling instructions (called CRF Completion Guidelines) should also be provided to study investigators for error-free data acquisition.
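
As a hedged illustration of the field definition described above (weight captured to two decimal places, with the measurement unit shown beside the field), the following Python sketch shows one way such a specification and a matching format check might look; the field name, the NNN.NN notation, and the helper function are assumptions for this example, not part of the article or of any CRF standard.

```python
# Illustrative CRF field specification: weight captured to two decimal places,
# with the unit stated next to the field. Names and notation are assumptions.

WEIGHT_FIELD = {
    "name": "WEIGHT",
    "label": "Body weight",
    "unit": "kg",          # units shown next to the data field on the CRF
    "format": "NNN.NN",    # three integer digit boxes, two decimal boxes after the point
}

def conforms_to_format(value: str, fmt: str = "NNN.NN") -> bool:
    """Check that an entered string fills the digit-box layout of the field."""
    if len(value) != len(fmt):
        return False
    return all(
        (f == "N" and c.isdigit()) or (f == "." and c == ".")
        for c, f in zip(value, fmt)
    )

print(conforms_to_format("072.50"))  # True: fits the two-decimal layout
print(conforms_to_format("72.5"))    # False: does not fill the defined boxes
```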


Figure 1: Annotated sample of a Case Report Form (CRF). Annotations are entered in coloured text in this figure to differentiate from the CRF questions. DCM = Data collection module, DVG = Discrete value group, YNNA [S1] = Yes, No, Not applicable [subset 1], C = Character, N = Numerical, DT = Date format. For example, BRTHDTC [DT] indicates date of birth in the date format.

Table 1: List of clinical data management activities
Data collection
CRF tracking
CRF annotation
Database design
Data entry
Medical coding
Data validation
Discrepancy management
Database lock
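
To make the annotation convention mentioned in the Figure 1 caption more concrete (for example, BRTHDTC [DT] for date of birth), here is a minimal Python sketch of how an annotated CRF might be represented inside a CDM tool; apart from BRTHDTC, the question texts, variable names, and coded options are hypothetical and are not taken from the article or the SDTMIG.

```python
# Hypothetical representation of CRF annotations: each question maps to a coded
# variable name and type, and discrete-value questions carry a coded option list.
annotated_crf = {
    "What is the subject's date of birth?": {
        "variable": "BRTHDTC", "type": "DT",         # date format, as in the Figure 1 caption
    },
    "Sex of the subject": {
        "variable": "SEX", "type": "C",              # character variable (assumed example)
        "options": {"M": "Male", "F": "Female"},     # all possible options coded appropriately
    },
}

for question, annotation in annotated_crf.items():
    print(f"{annotation['variable']} [{annotation['type']}]: {question}")
```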

CRF annotation is done wherein the variable is named according to the SDTMIG or the conventions followed internally. Annotations are coded terms used in CDM tools to indicate the variables in the study. An example of an annotated CRF is provided in Figure 1. In questions with discrete value options (like the variable gender having the values male and female as responses), all possible options will be coded appropriately.

Based on these, a Data Management Plan (DMP) is developed. The DMP document is a road map for handling the data under foreseeable circumstances and describes the CDM activities to be followed in the trial. A list of CDM activities is provided in Table 1. The DMP describes the database design, data entry and data tracking guidelines, quality control measures, SAE reconciliation guidelines, discrepancy management, data transfer/extraction, and database locking guidelines. Along with the DMP, a Data Validation Plan (DVP) containing all edit checks to be performed and the calculations for derived variables is also prepared. The edit check programs in the DVP help in cleaning up the data by identifying the discrepancies.

Database designing

Databases are the clinical software applications that are built to facilitate the CDM tasks of carrying out multiple studies.[6] Generally, these tools have built-in compliance with regulatory requirements and are easy to use. "System validation" is conducted to ensure data security, during which system specifications,[7] user requirements, and regulatory compliance are evaluated before implementation. Study details like objectives, intervals, visits, investigators, sites, and patients are defined in the database, and CRF layouts are designed for data entry. These entry screens are tested with dummy data before moving to real data capture.

Data collection

Data collection is done using the CRF, which may exist in the form of a paper or an electronic version. The traditional method is to employ paper CRFs to collect the data responses, which are translated to the database by means of data entry done in-house. These paper CRFs are filled up by the investigator according to the completion guidelines. In e-CRF-based CDM, the investigator or a designee logs into the CDM system and enters the data directly at the site. In the e-CRF method, the chances of errors are lower and the resolution of discrepancies happens faster. Since pharmaceutical companies try to reduce the time taken for drug development by enhancing the speed of the processes involved, many are opting for e-CRF options (also called remote data entry).

CRF tracking

The entries made in the CRF will be monitored by the Clinical Research Associate (CRA) for completeness, and filled-up CRFs are retrieved and handed over to the CDM team. The CDM team will track the retrieved CRFs and maintain their record. CRFs are tracked manually for missing pages and illegible data to assure that the data are not lost. In case of missing or illegible data, a clarification is obtained from the investigator and the issue is resolved.

Data entry

Data entry takes place according to the guidelines prepared along with the DMP. This is applicable only in the case of paper CRFs retrieved from the sites. Usually, double data entry is performed, wherein the data are entered by two operators separately.[8] The second-pass entry (the entry made by the second person) helps in verification and reconciliation by identifying transcription errors and discrepancies caused by illegible data. Moreover, double data entry helps in getting a cleaner database compared with single data entry. Earlier studies have shown that double data entry ensures better consistency with the paper CRF, as denoted by a lower error rate.[9]

Data validation

Data validation is the process of testing the validity of data in accordance with the protocol specifications. Edit check programs, which are embedded in the database, are written to identify discrepancies in the entered data and thereby ensure data validity. These programs are written according to the logic conditions mentioned in the DVP and are initially tested with dummy data containing discrepancies. A discrepancy is defined as a data point that fails to pass a validation check. A discrepancy may be due to inconsistent data, missing data, range check failures, or deviations from the protocol. In e-CRF-based studies, the data validation process is run frequently to identify discrepancies, which are then resolved by the investigators after logging into the system. Ongoing quality control of data processing is undertaken at regular intervals during the course of CDM. For example, if the inclusion criteria specify that the age of the patient should be between 18 and 65 years (both inclusive), an edit program will be written for two conditions, viz. age <18 and age >65. If for any patient the condition becomes TRUE, a discrepancy will be generated. These discrepancies will be highlighted in the system, and Data Clarification Forms (DCFs) can be generated. DCFs are documents containing queries pertaining to the discrepancies identified.
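
The age criterion above lends itself to a small worked example. The following Python sketch, using hypothetical record and field names rather than any real CDMS interface, implements the 18-65 range check on dummy data containing discrepancies and drafts a DCF-style query for each failure.

```python
# Minimal sketch of an edit check for the inclusion criterion
# "age between 18 and 65 years (both inclusive)".
# Record structure and field names are illustrative assumptions, not from a real CDMS.

def check_age(record, low=18, high=65):
    """Return a discrepancy description if the age fails the range check, else None."""
    age = record.get("AGE")
    if age is None:
        return {"subject": record["SUBJID"], "field": "AGE",
                "issue": "missing value"}
    if age < low or age > high:
        return {"subject": record["SUBJID"], "field": "AGE",
                "issue": f"value {age} outside protocol range {low}-{high}"}
    return None

def generate_dcfs(records):
    """Collect DCF-style queries for every record that fails the edit check."""
    return [d for d in (check_age(r) for r in records) if d is not None]

# Dummy data containing discrepancies, as used when testing edit check programs.
entries = [
    {"SUBJID": "001", "AGE": 34},
    {"SUBJID": "002", "AGE": 17},    # below the lower limit -> discrepancy
    {"SUBJID": "003", "AGE": None},  # missing data -> discrepancy
]

for dcf in generate_dcfs(entries):
    print(dcf)
```

In practice, the same logic condition would be defined in the DVP and embedded in the database as an edit check program, with the resulting queries routed to investigators as DCFs.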


Discrepancy management

This is also called query resolution. Discrepancy management includes reviewing discrepancies, investigating the reason, and resolving them with documentary proof or declaring them irresolvable. Discrepancy management helps in cleaning the data and gathers enough evidence for the deviations observed in the data. Almost all CDMS have a discrepancy database where all discrepancies are recorded and stored with an audit trail.

Based on the types identified, discrepancies are either flagged to the investigator for clarification or closed in-house by Self-Evident Corrections (SEC) without sending a DCF to the site. The most common SECs are obvious spelling errors. For discrepancies that require clarification from the investigator, DCFs will be sent to the site. The CDM tools help in the creation and printing of DCFs. Investigators will write the resolution or explain the circumstances that led to the discrepancy in the data. When a resolution is provided by the investigator, it will be updated in the database. In case of e-CRFs, the investigator can access the discrepancies flagged to him and will be able to provide the resolutions online. Figure 2 illustrates the flow of discrepancy management.

Figure 2: Discrepancy management (DCF = Data clarification form, CRA = Clinical Research Associate, SDV = Source document verification, SEC = Self-evident correction). [Flowchart labels recoverable from the original figure: data validation, discrepancy, DCF (yes/no), CRA, data manager, investigator, SEC, SDV, resolution, database.]

The CDM team reviews all discrepancies at regular intervals to ensure that they have been resolved. The resolved data discrepancies are recorded as 'closed'. This means that those validation failures are no longer considered active, and future data validation attempts on the same data will not create a discrepancy for the same data point. But closure of discrepancies is not always possible. In some cases, the investigator will not be able to provide a resolution for the discrepancy. Such discrepancies will be considered 'irresolvable' and will be updated in the discrepancy database.

Discrepancy management is the most critical activity in the CDM process. Being the vital activity in cleaning up the data, utmost attention must be paid while handling the discrepancies.

Medical coding

Medical coding helps in identifying and properly classifying the medical terminologies associated with the clinical trial. For the classification of events, medical dictionaries available online are used. Technically, this activity needs knowledge of medical terminology, understanding of disease entities and the drugs used, and a basic knowledge of the pathological processes involved. Functionally, it also requires knowledge about the structure of electronic medical dictionaries and the hierarchy of classifications available in them. Adverse events occurring during the study, prior and concomitantly administered medications, and pre- or co-existing illnesses are coded using the available medical dictionaries. Commonly, the Medical Dictionary for Regulatory Activities (MedDRA) is used for the coding of adverse events as well as other illnesses, and the World Health Organization–Drug Dictionary Enhanced (WHO-DDE) is used for coding the medications. These dictionaries contain the respective classifications of adverse events and drugs in proper classes. Other dictionaries are also available for use in data management (e.g., WHO-ART is a dictionary that deals with adverse reaction terminology). Some pharmaceutical companies utilize customized dictionaries to suit their needs and meet their standard operating procedures.

Medical coding helps in classifying reported medical terms on the CRF to standard dictionary terms in order to achieve data consistency and avoid unnecessary duplication. For example, the investigators may use different terms for the same adverse event, but it is important to code all of them to a single standard code and maintain uniformity in the process. The right coding and classification of adverse events and medication is crucial, as incorrect coding may mask safety issues or highlight the wrong safety concerns related to the drug.

Database locking

After a proper quality check and assurance, the final data validation is run. If there are no discrepancies, the SAS datasets are finalized in consultation with the statistician. All data management activities should have been completed prior to database lock. To ensure this, a pre-lock checklist is used and the completion of all activities is confirmed. This is done because the database cannot be changed in any manner after locking. Once the approval for locking is obtained from all stakeholders, the database is locked and clean data are extracted for statistical analysis. Generally, no modification in the database is possible. But in case of a critical issue or for other important operational reasons, privileged users can modify the data even after the database is locked. This, however, requires proper documentation, and an audit trail has to be maintained with sufficient justification for updating the locked database. Data extraction is done from the final database after locking. This is followed by its archival.
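
As a rough sketch of how completion of a pre-lock checklist might be confirmed before locking, the following Python example uses an assumed checklist and study-status structure; the item names and the locking step are illustrative only, not a prescribed procedure.

```python
# Sketch of a pre-lock checklist: every data management activity must be confirmed
# complete before the database can be locked. Structure and check names are assumed.

PRE_LOCK_CHECKLIST = [
    "all CRFs received and entered",
    "all discrepancies closed or declared irresolvable",
    "medical coding completed",
    "SAE reconciliation completed",
    "final data validation run with no open discrepancies",
]

def ready_to_lock(status: dict) -> bool:
    """Return True only if every checklist item has been signed off."""
    pending = [item for item in PRE_LOCK_CHECKLIST if not status.get(item, False)]
    for item in pending:
        print(f"Pending before lock: {item}")
    return not pending

study_status = {item: True for item in PRE_LOCK_CHECKLIST}
study_status["medical coding completed"] = False  # one outstanding activity blocks the lock

if ready_to_lock(study_status):
    print("Approval obtained from stakeholders: database locked.")
else:
    print("Database lock deferred until the checklist is complete.")
```

Any change needed after this point would go through the privileged, documented, and audit-trailed route described above.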


Roles and Responsibilities in CDM

In a CDM team, different roles and responsibilities are attributed to the team members. The minimum educational requirement for a team member in CDM should be graduation in life science and knowledge of computer applications. Ideally, medical coders should be medical graduates; however, in the industry, paramedical graduates are also recruited as medical coders. Some key roles are essential to all CDM teams. The list of roles given below can be considered the minimum requirements for a CDM team:
• Data Manager
• Database Programmer/Designer
• Medical Coder
• Clinical Data Coordinator
• Quality Control Associate
• Data Entry Associate

The data manager is responsible for supervising the entire CDM process. The data manager prepares the DMP and approves the CDM procedures and all internal documents related to CDM activities. Controlling and allocating database access to team members is also the responsibility of the data manager. The database programmer/designer performs the CRF annotation, creates the study database, and programs the edit checks for data validation. He/she is also responsible for designing the data entry screens in the database and validating the edit checks with dummy data. The medical coder does the coding for adverse events, medical history, co-illnesses, and concomitant medication administered during the study. The clinical data coordinator designs the CRF, prepares the CRF filling instructions, and is responsible for developing the DVP and discrepancy management. All other CDM-related documents, checklists, and guideline documents are prepared by the clinical data coordinator. The quality control associate checks the accuracy of data entry and conducts data audits.[10] Sometimes, there is a separate quality assurance person to conduct the audit on the data entered. Additionally, the quality control associate verifies the documentation pertaining to the procedures being followed. The data entry personnel track the receipt of CRF pages and perform the data entry into the database.

Conclusion

CDM has evolved in response to the ever-increasing demand from pharmaceutical companies to fast-track the drug development process and from the regulatory authorities to put quality systems in place to ensure the generation of high-quality data for accurate drug evaluation. To meet these expectations, there is a gradual shift from paper-based to electronic systems of data management. Developments on the technological front have positively impacted the CDM process and systems, thereby leading to encouraging results on the speed and quality of the data being generated. At the same time, CDM professionals should ensure the standards for improving data quality.[11] CDM, being a speciality in itself, should be evaluated by means of the systems and processes being implemented and the standards being followed. The biggest challenge from the regulatory perspective would be the standardization of the data management process across organizations and the development of regulations to define the procedures to be followed and the data standards. From the industry perspective, the biggest hurdle would be the planning and implementation of data management systems in a changing operational environment where the rapid pace of technology development outdates the existing infrastructure. In spite of these, CDM is evolving to become a standards-based clinical research entity, by striking a balance between the expectations from and constraints in the existing systems, driven by technological developments and business demands.

References

1. Gerritsen MG, Sartorius OE, vd Veen FM, Meester GT. Data management in multi-center clinical trials and the role of a nation-wide computer network. A 5 year evaluation. Proc Annu Symp Comput Appl Med Care 1993:659-62.
2. Lu Z, Su J. Clinical data management: Current status, challenges, and future directions from industry perspectives. Open Access J Clin Trials 2010;2:93-105.
3. CFR - Code of Federal Regulations Title 21 [Internet]. Maryland: Food and Drug Administration. Available from: http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?fr=11.10. [Updated 2010 Apr 4; Cited 2011 Mar 1].
4. Study Data Tabulation Model [Internet]. Texas: Clinical Data Interchange Standards Consortium; c2011. Available from: http://www.cdisc.org/sdtm. [Updated 2007 Jul; Cited 2011 Mar 1].
5. CDASH [Internet]. Texas: Clinical Data Interchange Standards Consortium; c2011. Available from: http://www.cdisc.org/cdash. [Updated 2011 Jan; Cited 2011 Mar 1].
6. Fegan GW, Lang TA. Could an open-source clinical trial data-management system be what we have all been looking for? PLoS Med 2008;5:e6.
7. Kuchinke W, Ohmann C, Yang Q, Salas N, Lauritsen J, Gueyffier F, et al. Heterogeneity prevails: The state of clinical trial data management in Europe - results of a survey of ECRIN centres. Trials 2010;11:79.
8. Cummings J, Masten J. Customized dual data entry for computerized data analysis. Qual Assur 1994;3:300-3.
9. Reynolds-Haertle RA, McBride R. Single vs. double data entry in CAST. Control Clin Trials 1992;13:487-94.
10. Ottevanger PB, Therasse P, van de Velde C, Bernier J, van Krieken H, Grol R, et al. Quality assurance in clinical trials. Crit Rev Oncol Hematol 2003;47:213-35.
11. Haux R, Knaup P, Leiner F. On educating about medical data management - the other side of the electronic health record. Methods Inf Med 2007;46:74-9.

Cite this article as: Krishnankutty B, Bellary S, Naveen Kumar BR, Moodahadu LS. Data management in clinical research: An overview. Indian J Pharmacol 2012;44:168-72.
Source of Support: Nil. Conflict of Interest: None declared.

