QC Important Concepts

The document discusses key concepts in laboratory quality including accuracy, precision, coefficient of variation, analytical sensitivity and specificity, matrix effects, imprecision, inaccuracy, and analytical measurement range. It defines these terms and describes their importance for ensuring quality in laboratory testing.


Quality in Laboratory

Dr. Muhammad Zubair
MBBS, FCPS
Consultant Chemical Pathologist
MIKD Multan
Basic Definitions
Accuracy

o Closeness of measured (test) value to the true value


o It is the ability of an analytical method to obtain the true or correct result after a number of replicate analyses are performed
o An accurate result is one that is the ‘true’ result

Precision
o Ability of a method to get the same (but not
necessarily ‘true’) result time after time
o Closeness of agreement between values obtained by
replicate measurements, under specified conditions
o Describes the reproducibility of a method
o Degree of fluctuation in the measurements is
indicative of the precision of the assay

Precision and Accuracy
• Precise but inaccurate → systematic error
• Imprecise and inaccurate → random error


Coefficient of Variation (CV)
(Relative Standard Deviation)
• Statistical measure of the dispersion of data points in a data series around the mean
• Represents the ratio of the standard deviation to the mean
• Expressed as a percentage
CV% = (standard deviation / mean) × 100
▪ The standard deviation is expressed in the units of the data; in contrast, the value of the CV is independent of the unit in which the measurement has been taken
▪ It is a unitless number
▪ One can use it instead of the standard deviation to compare the spread of data sets that
have different units or different means.

Coefficient of Variation (CV)
Analyte: FSH

Concentration   SD     CV (%)
1               0.09   9.0
5               0.25   5.0
10              0.40   4.0
25              1.20   4.8
100             3.80   3.8

• The smaller the CV, the more reproducible the results: more values
lie closer to the mean.
• Useful in comparing 2 or more analytical methods
• Ideally should be less than 5 %
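The CV calculation above can be sketched in a few lines of Python; the concentration/SD pairs are the illustrative FSH values from the table.

```python
# Coefficient of variation: CV% = (SD / mean) * 100 -- a unitless measure of spread.
def cv_percent(sd: float, mean: float) -> float:
    """Return the coefficient of variation as a percentage of the mean."""
    return sd / mean * 100

# Illustrative FSH concentration/SD pairs from the table above
fsh = [(1, 0.09), (5, 0.25), (10, 0.40), (25, 1.20), (100, 3.80)]
for mean, sd in fsh:
    print(f"mean = {mean:>3}, SD = {sd:.2f}, CV = {cv_percent(sd, mean):.1f}%")
```

Because the CV is unitless, the same < 5 % acceptability target can be applied across analytes reported in different units.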
Analytical Run

◦ An interval (in terms of time or a series of measurements) within which the accuracy and precision of the testing system are expected to be stable, but which cannot be greater than 24 hours (CLIA '88)
◦ It should ideally be between 8 and 24 hours

◦ Analytical run defines the interval (period of time or


number of specimens) between evaluations of control
results
Run & Bias

◦ Run = Batch

◦ Bias = difference between the true (target) value and the test (measured) value

Random Error
◦ An error that can be either positive or negative, can occur at any time, and whose direction and exact magnitude cannot be predicted
◦ The small fluctuations introduced in nearly all analyses
◦ Unpredictable variation in an experiment
◦ Error that affects the reproducibility of method
◦ Measure of imprecision
◦ Imprecision of the test system causing a scatter or spread of control
values around the mean
◦ Arises from unpredictable variations in influence quantities. These
random effects give rise to variations in repeated observations of the
measurand
Causes of Random Error
✓ Instability of instrument
✓ Variations in the temperature
✓ Variations in reagents & calibrators & calibration curve
stability
✓ Variability in handling techniques e.g. pipetting, mixing
& timing
✓ Variability in operators
✓ Fluctuation in Electric supply

Systematic Error
• Non-random error that measures the bias between the sample data
and the true population value

• An error that is always in one direction and is predictable


• A component of error that, in the course of a number of analyses of the same
measurand, remains constant or varies in a predictable way

• Systematic change in the test system resulting in a displacement of the mean from the original value

• Systematic error of an analytic system is predictable and causes shifts or trends on control charts that are consistently low or high
Causes of Systematic Error
• Change in reagent or calibrator lot numbers
• Wrong calibrator values
• Improperly prepared reagents
• Deterioration of reagents or calibrators
• Inappropriate storage of reagents or calibrators
• Variation in sample or reagent volumes due to pipetter
misalignments
• Variation in temperature or reaction chambers
• Deterioration of photometric light source
• Variation in procedure between technologists
Trend/Drift
• A trend is a continuous movement of values in one direction over
six or more analytical runs.

• Trends can start on one side of the mean and move across it or can
occur entirely on one side of the mean

• A gradual change over time in the test results obtained from control
material that is suggestive of a progressive problem with testing
system or control material
• Indicates a gradual loss of reliability in the test system
[Figure: Trend on a control chart]
Causes of Trend/Drift
• Deterioration of instrument light source
• Gradual accumulation of debris in sample /reagent tubing
• Gradual accumulation of debris on electrode surfaces
• Aging of reagents
• Gradual deterioration of control materials

• Gradual deterioration of incubation chamber


• Gradual deterioration of light filter integrity
Shift
• A shift is an abrupt and sustained change in control values

• A shift is a sudden change of values from one level of the control chart to another.

• Can start on one side of the mean and move across it

• There will be shift of mean also

• QC data results are distributed on one side of the mean for 6-7 consecutive days
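The shift and trend patterns above lend themselves to simple programmatic checks. The sketch below (illustrative, not from any QC package) flags a shift when six consecutive control results fall on one side of the established mean, and a trend when six consecutive results move in one direction:

```python
def detect_shift(values, mean, run_length=6):
    """Shift: run_length consecutive control results on one side of the mean."""
    streak, side = 0, 0
    for v in values:
        s = (v > mean) - (v < mean)   # +1 above, -1 below, 0 exactly on the mean
        streak = streak + 1 if s == side and s != 0 else 1
        side = s
        if s != 0 and streak >= run_length:
            return True
    return False

def detect_trend(values, run_length=6):
    """Trend: run_length consecutive results all moving in the same direction."""
    streak, direction = 1, 0
    for prev, cur in zip(values, values[1:]):
        d = (cur > prev) - (cur < prev)
        if d != 0 and d == direction:
            streak += 1              # run continues in the same direction
        elif d != 0:
            streak = 2               # new run: two points so far
        else:
            streak = 1               # flat step breaks the run
        direction = d
        if d != 0 and streak >= run_length:
            return True
    return False
```

Running both checks after each analytical run is one way to catch a developing systematic error before patient results are affected.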
[Figure: Shift on a control chart]
Causes of Shift
• Change in light source
• Change in reagent formulation
• Change in reagent lot
• Major instrument maintenance
• Sudden change in incubation temperature
• Change in room temp or humidity
• Failure in sampling system
• Failure in reagent dispense system
• Inaccurate calibration
[Figure: Shift vs Trend]
Diagnostic Sensitivity
◦ A term used to describe the probability that a laboratory test will be
positive in the presence of disease

If a person has a disease, how often will the test be positive (true positive
rate)?
Put another way, if a highly sensitive test gives a negative result, you can be
nearly certain that the patient does not have the disease.
A sensitive test helps rule out disease (when the result is
negative). Sensitivity = rule out, or "Snout"

Sensitivity = true positives / (true positives + false negatives)


Diagnostic Specificity
◦ A term used to describe the probability that a laboratory test will be
negative in the absence of disease
If a person does not have the disease, how often will the test be negative
(true negative rate)?

In other terms, if the result of a highly specific test is positive, you can
be nearly certain that the patient actually has the disease.

A very specific test rules in disease with a high degree of confidence.
Specificity = rule in, or "Spin".

Specificity = true negatives / (true negatives + false positives)
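The two formulas can be expressed directly in code; the 2×2 counts below are hypothetical, chosen only to illustrate the calculation:

```python
def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical 2x2 table: 90 diseased (80 test positive), 100 healthy (95 test negative)
tp, fn, tn, fp = 80, 10, 95, 5
print(f"Sensitivity = {sensitivity(tp, fn):.2%}")   # 80 / 90
print(f"Specificity = {specificity(tn, fp):.2%}")   # 95 / 100
```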
Matrix effects
• The combined effect of all components of the sample other than the analyte on the measurement of the measurand
• If a specific component can be identified as causing a matrix effect, then this is referred to as interference

Analytical Sensitivity & Specificity

Analytical Sensitivity
➢ It represents the smallest amount of substance in a
sample that can accurately be measured by an assay

Analytical Specificity
➢ It refers to the ability of an assay to measure a particular
organism or substance, rather than others, in a sample

Imprecision & Inaccuracy
Imprecision
• It is defined as amount or degree of random error in an
assay or in a set of replicate measurements
• Usually represented by standard deviation, coefficient of
variation or range.
• It can be between-laboratory, within-day or between-day
imprecision

Inaccuracy
• It is the systematic error estimated by the difference
between the mean of a set of data and the true value
Analytical Measurement Range:
(Reportable Range)
• Concentration range of a method over which the analytical performance has been determined and judged to meet medical application requirements; or

• Range of analyte values that a method can measure directly, without any dilution, concentration or other pretreatment that is not part of the usual assay process.

Cut-off value

Those limits above or below which the patient is considered abnormal or positive for a condition
Method Validation

• Method validations are performed for new methods that have just been developed or instituted, and for methods that require revision because they have been significantly changed
• When????
• New method is developed
• Modifications are being made in existing method
• A change in technology/instrument
• Significant changes in instrument parameters, reagents, time and
temperature

Method Verification

• Method “verification” is materially different: a little more limited, and not as robust or rigorous as a method “validation”
• Regardless, verification is needed in order to confirm that a lab is capable of performing an analytical method reliably and precisely for its intended use

Medical Decision Levels (Xc)

• A concentration of sample at which some medical action is indicated for proper patient care, e.g., 126 mg/dl for glucose
• There may be several medical decision levels for a given analyte
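As a minimal sketch, a decision level is just a threshold applied to a result; treating values at or above Xc as actionable is an assumption made here for illustration, using the glucose figure from the text:

```python
def exceeds_decision_level(result: float, xc: float) -> bool:
    """Return True when a result meets or exceeds the medical decision level Xc."""
    return result >= xc

# Hypothetical results checked against the 126 mg/dl glucose decision level
for glucose in (98.0, 126.0, 140.0):
    print(glucose, exceeds_decision_level(glucose, 126.0))
```

In practice an analyte may carry several such thresholds, each checked the same way.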

Limit of Blank (LOB)

• Highest apparent analyte concentration expected to be found when replicates of a sample containing no analyte are tested
• Note that although the samples tested to define LOB are devoid
of analyte, a blank (zero) sample can produce an analytical
signal that might otherwise be consistent with a low
concentration of analyte
• Analyzers may report all signal values below a certain fixed limit
as “zero concentration.”
• The LOB is used as a reasonable starting point for estimating the LOD

Limit of detection:
(Lower limit of detection - Sensitivity)
• Lowest amount of analyte in a sample that can be detected with (stated)
probability, although perhaps not quantified as an exact value

• Although reagent package inserts may state that an assay has a dynamic
range that extends from zero concentration to some upper limit, typically an
assay is simply not capable of accurately measuring analyte concentrations
down to zero.

• Sufficient analyte concentration must be present to produce an analytical signal that can reliably be distinguished from “analytical noise,” the signal produced in the absence of analyte

• LOD is the lowest analyte concentration likely to be reliably distinguished from the LOB and at which detection is feasible. It is therefore greater than the LOB.
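A commonly used parametric estimate (e.g., as described in CLSI EP17) is LOB = mean_blank + 1.645 × SD_blank, with LOD = LOB + 1.645 × SD of a low-concentration sample. The replicate data below are made up purely for illustration:

```python
import statistics

def limit_of_blank(blank_replicates):
    """LOB = mean_blank + 1.645 * SD_blank (parametric 95th-percentile estimate)."""
    return statistics.mean(blank_replicates) + 1.645 * statistics.stdev(blank_replicates)

def limit_of_detection(lob, low_sample_replicates):
    """LOD = LOB + 1.645 * SD of a low-concentration sample; always above the LOB."""
    return lob + 1.645 * statistics.stdev(low_sample_replicates)

# Hypothetical replicate measurements, in concentration units
blanks = [0.00, 0.02, 0.01, 0.03, 0.01, 0.02]
low    = [0.10, 0.12, 0.09, 0.11, 0.13, 0.10]
lob = limit_of_blank(blanks)
lod = limit_of_detection(lob, low)
print(f"LOB = {lob:.3f}, LOD = {lod:.3f}")  # LOD > LOB, as the text states
```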
Limit of Quantification:
(Lower limit of determination,
Lower end of measurement range)

• Lowest amount of analyte that can be quantitatively determined with stated acceptable precision and trueness

• LOQ is the lowest concentration at which the analyte can not only be
reliably detected but at which some predefined goals for bias and
imprecision are met.

• The LOQ may be equivalent to the LOD or it could be at a much higher concentration; it cannot be lower than the LOD. The LOQ provides an estimate of bias and imprecision at very low analyte concentrations.
Definitive Method
• An analytical method that

• has been subjected to thorough investigation and

• has been found to have no source of inaccuracy or ambiguity

• The value of an analyte obtained by a definitive method is taken as the true value

• As per the NBS, it is an analytical method that is capable of providing the highest accuracy among all methods for determining that analyte
Definitive Method
• Development and performance requires
• sophisticated instrumentation
• superior and well-defined reagents and
• highly skilled personnel

• Therefore such procedures are usually too expensive and technically demanding
• Depends in part on the availability of pure reference materials (e.g. those supplied by the National Bureau of Standards, called SRMs)
• Uses:
• Used in the verification of accuracy of Field Methods
• Used to validate reference methods and primary reference materials
Reference Method

• Analytical method with thoroughly documented accuracy, precision, and low susceptibility to interfering substances, sufficient to achieve the stated end purpose

• Should show appropriate linearity, sensitivity and specificity

• Accuracy and precision must have been demonstrated by direct comparison with a Definitive Method
Reference Method
Uses
• To evaluate Secondary Reference Materials and to assign values to
Control Material

• To compare the quality of routine or proposed methods

• To evaluate bias and interferences in a Field Method, and as an index for the acceptability of Field Methods that are under development
Field Method

• Any routinely used analytical method that has been shown to have
adequate precision, accuracy and specificity for its intended use

• Such a method should also have an appropriate analytical range and should be practical and economical

• Field Methods that have been endorsed by a qualified body, such as ISO, are termed “Recommended,” “Standard,” or “Selected” Methods
Field Method
Uses

• Field Methods are predominantly used in routine service laboratories for measuring analytes in biological fluids


Reference Materials
• As defined by the International Organization for Standardization (ISO): “a material or substance, one or more physical or chemical properties of which are sufficiently well established, to be used for:
• calibrating instruments
• validating methods
• assigning values to materials
• evaluating the comparability of results
• Reference Materials are of following types
• Primary
• Secondary
• Standard
• Certified
• Reference Materials trace back to the International Bureau of Weights and Measures (BIPM)
Primary Reference Materials
• Well-characterized, stable, homogeneous and highly purified
chemicals that are directly weighed or measured to produce a solution
whose concentration is exactly known.

• Purity of 99.98% as proposed by IUPAC

• These highly purified chemicals may be weighed out directly for preparation of solutions of selected concentration or for calibration of solutions of unknown strength

• Supplied with certificate of analysis for each lot

• Values assigned by analysis by Definitive Method


Secondary Reference Materials

• The concentration of Secondary Reference Materials is determined by
• analysis of an aliquot of solution by reference method,
• using a primary reference material to calibrate the method

• A solution of known concentration cannot be prepared simply by weighing the solute and dissolving a known amount into a volume of solution, as for a primary reference material
Standard Reference Materials
• SRMs are chemical compounds of certified purity, characterized sufficiently
well to ensure that if properly used to standardize an analytical method or
instrument, no errors in results can be attributed to the calibration of system
• Purity of SRMs is in effect such that none of the residual impurities are
present in sufficient quantity to affect materially the results obtained with a
system
• For clinical and molecular laboratories, they are available from NIST
• Purity of 99.98% as proposed by IUPAC
• These highly purified chemicals may be weighed out directly for preparation
of solutions of selected concentration or for calibration of solutions of
unknown strength
• Supplied with certificate of analysis for each lot
• Values assigned by analysis by Definitive Method
Standard Reference Materials
• Each has been well characterized for certain chemical and physical
properties, and is issued with a certificate that gives results of the
characterization

• May be used to characterize other materials

• Examples of SRMs for use in clinical & molecular diagnostic labs include:

• Pure crystalline standards


• Human-based standards
• Animal based standards
• Stds containing drugs of abuse in urine and human hair
• For DNA profiling/crime scene investigations
Certified Reference Materials
• Certified Reference Materials (CRMs) are ‘controls’ or standards used
to
• check the quality and metrological traceability of products
• validate analytical measurement methods
• calibrate instruments

• A certified reference material is a particular form of measurement standard
• CRMs for clinical and molecular laboratories are available from Institute
for Reference Materials and Measurements (IRMM), Belgium
