5th Unit
LO1 Students will be able to elaborate on the calibration and validation procedures. (BP606.6)
LO2 Students will be able to understand the scope of good warehousing practices. (BP606.6)
LO3 Students will be able to understand the cGMP aspects in a pharmaceutical industry. (BP606.6)
CALIBRATION
Calibration of an instrument is the process of determining its accuracy. The process involves
obtaining a reading from the instrument and measuring its variation from the reading obtained
from a standard instrument.
Calibration of an instrument also involves adjusting its precision and accuracy so that its
readings come in accordance with the established standard.
This is important for justifying the processes of Qualification and Validation.
An instrument or item of equipment of known accuracy is called a standard. All other
instruments are measured against this standard. It is important to know that standards
vary from one country to another depending upon the type of industry.
Calibration Achieves Two Main Objectives:
a. It checks the accuracy of an instrument
b. It determines the traceability of the measurement
Scope/Purpose of Calibration
Calibration is primarily done to achieve 5 main purposes which are:
1. To make sure that the readings of equipment or instruments are consistent with other
measurements and display the correct readings every single time
2. To determine the accuracy, precision, reliability and deviation of the measurements
produced by all the instruments.
3. To establish the reliability of the instrument being used and whether it can be trusted to
deliver repeatable results each time.
4. To map and document the 'drift'. Instruments have a tendency to produce inaccurate
measurements over a period of time, following repeated use.
5. Ensuring that the industry standards, quality assurance benchmarks such as current good
manufacturing practice (cGMP) and government regulations are adhered to.
What Is Instrument Calibration?
Instrument calibration can be defined as the process of comparing the measurements made by the
instrument to be calibrated against the known measurements of a standard, or of an instrument
whose accuracy and precision exceed those required of the instrument under test.
Usually, calibration laboratories prefer a standard with 10 times the accuracy; however, most
regulating organizations and authorities also accept a 3:1 accuracy ratio.
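To make the accuracy-ratio requirement concrete, the short Python sketch below computes a test accuracy ratio (TAR) for a hypothetical balance checked against a reference weight; the tolerance and uncertainty figures are illustrative assumptions, not values taken from this text.

# Minimal sketch: test accuracy ratio (TAR) between a unit under test (UUT)
# and its reference standard. All numbers are hypothetical examples.

def test_accuracy_ratio(uut_tolerance, standard_uncertainty):
    """Return the ratio of the UUT tolerance to the standard's uncertainty."""
    return uut_tolerance / standard_uncertainty

# Example: a balance with a +/-0.5 mg tolerance checked against a
# reference weight certified to +/-0.05 mg.
tar = test_accuracy_ratio(uut_tolerance=0.5, standard_uncertainty=0.05)
print(f"TAR = {tar:.1f}:1")              # 10.0:1 meets the preferred 10:1 ratio
print("Acceptable (at least 3:1)?", tar >= 3)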
Frequency of Instrument Calibration
How often you conduct instrument calibration mainly depends upon its tendency to drift from the
true measurement and how it impacts the quality of the end product. Examine each instrument
being used and study its behavior. Based on this information, you can design a calibration
schedule for each instrument.
The interval between calibrations can vary as:
Weekly
Monthly or bi-monthly
Quarterly, semi-annually or annually
After every heavy usage of the instrument
When Should The Measuring Instruments Be Calibrated?
The frequency of calibrating the measuring instruments depends on a number of different factors.
The following is a guide outlining when instruments need to be calibrated as a part of GMP:
As soon as you bring in a new instrument, you should calibrate it before you test it out.
Before and after you take critical measurements
After any instance of electrical or mechanical shock or a similar event that includes a fall,
bump, etc.
When you suspect that the accuracy of measurements being produced is questionable
If there were any repairs or re-qualifications of the instrument
As part of a defined calibration schedule
Depending on the task or process, as some require calibration to be conducted before
the work starts
According to the manufacturer's recommendation
Commonly Used Calibration Methods and Procedures:
There are different methods that can be used to calibrate an instrument. These methods are chosen
based on the desired results of the calibration and the requirements of regulatory authorities, such
as FDA guidelines. Let us look at three such procedures:
Standard Calibration: This method is mostly preferred for calibrating instruments that are non-
critical to quality or are not required for accreditation and licensing purposes. Use traceable
standards and document their performance.
Calibration with Data: Procedures for calibration with data are similar to those of accredited
calibration. The only exception is that these procedures are not accredited to the ISO standard;
moreover, they are not accompanied by data on measurement uncertainties.
ISO 17025 Accredited Calibration: This is the strictest method of calibration.
Generally, it requires a measurement report which gives the details of the measurements made
against a standard, both 'as found' (before calibration is started) and 'as left' (once the
calibration is completed). If the calibration is done by a calibration service provider, they must
issue a certificate to that effect.
Importance of Regular Calibration:
Calibration defines the accuracy and quality of the measurements recorded by an instrument.
When you start working with any instrument, it must be well calibrated, thus assuring you of
accurate results. However, over a period of time you will start observing a 'drift'. Calibration
minimizes such uncertainties by assuring the accuracy of the test equipment.
When you regularly calibrate your equipment, you can eliminate the drift at its budding stage
instead of allowing it to grow until it affects the measurements in significant ways.
Calibration helps in quantifying and controlling errors and uncertainties within various
measurement processes to an acceptable level.
Further, it helps in improving the accuracy of the measuring device, which in turn improves the
quality of the end product.
In short, regular calibration allows pharmaceutical companies to have confidence in their results
which they can record, monitor and control.
QUALIFICATION
It refers to activities undertaken to demonstrate that utilities and equipment are suitable for
their intended use and perform properly.
It is the action of proving that any equipment or process works correctly and consistently and
produces the expected results.
"It is the action of proving and documenting that equipment or ancillary systems are properly
installed, work correctly, and actually lead to the expected results."
Qualification is part of validation, but the individual qualification steps alone do not
constitute process validation.
Qualification of analytical instrumentation is essential for accurate and precise measurement
of analytical data. If the instrumentation is not qualified to ensure that the indicated results
are trustworthy, all other work based upon the use of that instrumentation is suspect.
Qualification of instruments is not a single, continuous process but instead results from many
discrete activities. For convenience, these activities have been grouped into 4 phases of
qualification. These phases are described below:
1. Design Qualification (DQ)
2. Installation Qualification (IQ)
3. Operational Qualification (OQ)
4. Performance Qualification (PQ)
Design Qualification (DQ):
It is the documented verification that the proposed design of the facilities, systems and
equipment is suitable for the intended purpose.
DQ should be performed when new equipment is being purchased, or when existing
equipment is being used for a new application. DQ serves as the precursor to defining the
equipment Installation Qualification (IQ) and OQ protocols.
The purpose is to ensure that all the requirements for the final systems have been clearly
defined at the start.
In other words, "Has it been designed and selected correctly?"
DQ check items:
GMPs and regulatory requirements
Performance criteria
Reliability and efficiency
Commissioning requirements
Constructability and installation of equipment
Safety and environment impact
Description of the intended use of the equipment
Performance Qualification (PQ):
The objective is to ensure that the instrument performs within specified limits. The PQ
represents the final qualification of the equipment or system.
It is used to establish and or confirm;
1. Definition of performance criteria and test procedures.
2. Selection of critical parameters, with predefined specifications.
3. Determination of the test intervals, e.g.
(a) - Everyday.
(b) - Every time the system is used.
(c) - Before, between and after a series of runs.
4. Define corrective actions on what to do if the system does not meet the established
criteria.
Re-Qualification:
Modification to, or relocation of equipment should follow satisfactory review and
authorization of the documented change proposal through the change control procedure. This
formal review should include consideration of re-qualification of the equipment.
Minor changes or changes having no direct impact on final or in- process product quality
should be handled through the documentation system of the preventive maintenance
program.
Scope of Performance Qualification.
According to regulatory documents, such as FDA guidelines, the scope of PQ is somewhat
limited. While equipment validation tests each piece of equipment individually, PQ verifies
the performance of equipment, systems and facilities as a whole.
It represents the final qualification, including any requalification of the system and
equipment that you use in your business.
Typically, the scope of PQ extends to include the following scenarios:
New systems being delivered and operated for the first time
Existing systems in use (as part of a regular maintenance schedule)
Systems that have been modified to any degree
Equipment/systems which have been used more than they normally would be
After a system has been expanded in order to increase its capacity
VALIDATION
Validation is an essential part of good manufacturing practices (GMP). It is, therefore, an element
of the quality assurance programme associated with a particular product or process. The basic
principles of quality assurance have as their goal the production of products that are fit for their
intended use. These principles are as follows:
Quality, safety and efficacy must be designed and built into the product.
Quality cannot be inspected or tested into the product.
Each critical step of the manufacturing process must be validated. Other steps in the process
must be under control to maximize the probability that the finished product consistently and
predictably meets all quality and design specifications.
Validation of processes and systems is fundamental to achieving these goals. It is by design and
validation that a manufacturer can establish confidence that the manufactured products will
consistently meet their product specifications.
Documentation associated with validation includes:
standard operating procedures (SOPs)
specifications
validation master plan (VMP)
qualification protocols and reports
validation protocols and reports.
The implementation of validation work requires considerable resources such as:
1. Time: generally validation work is subject to rigorous time schedules.
2. Financial: validation often requires the time of specialized personnel and expensive
technology.
3. Human: validation requires the collaboration of experts from various disciplines (e.g. a
multidisciplinary team, comprising quality assurance, engineering, manufacturing and
other disciplines, depending on the product and process to be validated).
These guidelines aim to give guidance to inspectors of pharmaceutical manufacturing facilities
and manufacturers of pharmaceutical products on the requirements for validation. The main part
covers the general principles of validation and qualification. In addition to the main part,
appendices on validation and qualification (e.g. cleaning, computer and computerized systems,
equipment, utilities and systems, and analytical methods) are included.
DEFINITIONS:
According to ISO:
"Validation is the confirmation by examination and the provision of objective evidence that the
particular requirements for a specific intended use are fulfilled."
According to the US Food and Drug Administration (FDA):
"The goal of validation is to establish documented evidence which provides a high degree of
assurance that a specific process will consistently produce a product meeting its predetermined
specifications and quality attributes."
According to the European Commission:
"Validation is the action of proving, in accordance with the principles of GMP, that any procedure,
process, equipment, material, activity or system actually leads to the expected results."
The definitions given below apply to the terms used in these guidelines. They may have different
meanings in other contexts.
Calibration
The set of operations that establish, under specified conditions, the relationship between values
indicated by an instrument or system for measuring (for example, weight, temperature and pH),
recording, and controlling, or the values represented by a material measure, and the
corresponding known values of a reference standard. Limits for acceptance of the results of
measuring should be established.
Cleaning validation
Documented evidence to establish that cleaning procedures are removing residues to
predetermined levels of acceptability, taking into consideration factors such as batch size,
dosing, toxicology and equipment size.
Commissioning
The setting up, adjustment and testing of equipment or a system to ensure that it meets all the
requirements, as specified in the user requirement specification, and capacities as specified by
the designer or developer. Commissioning is carried out before qualification and validation.
Computer validation
Documented evidence which provides a high degree of assurance that a computerized system
analyses, controls and records data correctly and that data processing complies with
predetermined specifications.
Qualification
Action of proving and documenting that any premises, systems and equipment are properly
installed, and/or work correctly and lead to the expected results. Qualification is often a part (the
initial stage) of validation, but the individual qualification steps alone do not constitute process
validation.
Standard operating procedure (SOP)
An authorized written procedure giving instructions for performing operations not necessarily
specific to a given product or material but of a more general nature (e.g. equipment operation,
maintenance and cleaning; validation; cleaning of premises and environmental control; sampling
and inspection). Certain SOPs may be used to supplement product-specific master batch
production documentation.
Validation
Action of proving and documenting that any process, procedure or method actually and
consistently leads to the expected results.
Validation master plan (VMP)
The VMP is a high-level document that establishes an umbrella validation plan for the entire
project and summarizes the manufacturer's overall philosophy and approach, to be used for
establishing performance adequacy. It provides information on the manufacturer's validation
work programme and defines details of and timescales for the validation work to be performed,
including a statement of the responsibilities of those implementing the plan.
Validation protocol (or plan) (VP)
A document describing the activities to be performed in a validation, including the acceptance
criteria for the approval of a manufacturing process—or a part thereof—for routine use.
Validation report (VR)
A document in which the records, results and evaluation of a completed validation programme
are assembled and summarized. It may also contain proposals for the improvement of processes
and/or equipment.
Verification
The application of methods, procedures, tests and other evaluations, in addition to monitoring, to
determine compliance with the GMP principles.
Worst case
A condition or set of conditions encompassing the upper and lower processing limits for
operating parameters and circumstances, within SOPs, which pose the greatest chance of product
or process failure when compared to ideal conditions. Such conditions do not necessarily include
product or process failure.
Relationship between validation and qualification
Validation and qualification are essentially components of the same concept. The term
qualification is normally used for equipment, utilities and systems, and validation for processes.
In this sense, qualification is part of validation.
SCOPE OF VALIDATION
Validation requires an appropriate and sufficient infrastructure including: – organization,
documentation, personnel and finances
Involvement of management and quality assurance personnel
Personnel with appropriate qualifications and experience
Extensive preparation and planning before validation is performed
Validation should be performed: –
for new premises, equipment, utilities and systems, and processes and procedures
at periodic intervals
When major changes have been made.
Validation in accordance with written protocols.
Validation over a period of time, e.g. at least three consecutive batches (full production scale)
to demonstrate consistency. (Worst case situations should be considered.)
Significant changes (facilities, equipment, processes) - should be validated
Risk assessment approach used to determine the scope and extent of validation needed
IMPORTANCE OF VALIDATION
1. Assurance of quality
2. Time bound
3. Process optimization
4. Reduction of quality cost.
5. Minimal batch failures, improved efficiency and productivity.
6. Reduction in rejections.
7. Increased output.
8. Fewer complaints about process related failures.
9. Reduced testing in process and in finished goods.
10. More rapid and reliable start-up of new equipment
11. Easier maintenance of equipment.
12. Improved employee awareness of processes.
13. More rapid automation.
14. Government regulation (Compliance with validation requirements is necessary for obtaining
approval to manufacture and to introduce new products)
TYPES OF VALIDATION
Prospective validation
It is defined as the established documented evidence that a system does what it purports to do
based on a pre-planned protocol.
This validation is usually carried out prior to distribution, either of a new product or of a
product made under a revised manufacturing process.
It is performed on at least three successive production-size (consecutive) batches.
The objective of the prospective validation is to prove or demonstrate that the process will
work in accordance with validation protocol prepared for the pilot production trials.
Prospective validation should normally be completed prior to the distribution and sale of the
medicinal product.
In Prospective Validation, the validation protocol is executed before the process is put into
commercial use.
Concurrent validation
It is a process where current production batches are used to monitor processing parameters.
Concurrent validation means establishing documented evidence that a process does what it is
supposed to do, based on data generated during actual implementation of the process.
It is important in such cases that the systems and equipment to be used have been fully
validated previously.
It is similar to prospective validation, except that the operating firm sells the product
manufactured during the qualification runs to the public at its market price; in this respect it
also resembles retrospective validation.
This validation involves in-process monitoring of critical processing steps and product
testing. This helps to generate documented evidence showing that the production process
is in a state of control.
Retrospective validation
It is defined as the established documented evidence that a system does what it purports to do,
based on the review and analysis of historical information. This type of validation of a process is
for a product already in distribution.
Retrospective validation is only acceptable for well-established processes and will be
inappropriate where there have been recent changes in the composition of the product,
operating procedures or equipment.
Validation of such processes should be based on historical data.
For retrospective validation, generally data from ten to thirty consecutive batches should be
examined to assess process consistency, but fewer batches may be examined if justified (a
simple worked example of such a review follows below).
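As an illustration of how such a historical review might be carried out (this is not part of the guideline text), the Python sketch below computes the mean, standard deviation and a simple process capability index (Cpk) for hypothetical assay results from ten batches against assumed specification limits.

# Minimal sketch: reviewing historical batch data for retrospective validation.
# The assay results and specification limits below are hypothetical.
import statistics

assay_results = [99.1, 100.4, 98.7, 99.9, 100.8, 99.5, 100.1,
                 98.9, 100.6, 99.7]      # % of label claim, 10 batches
lsl, usl = 95.0, 105.0                   # assumed specification limits

mean = statistics.mean(assay_results)
sd = statistics.stdev(assay_results)

# Cpk: distance of the mean from the nearer specification limit, in units of 3*SD
cpk = min(usl - mean, mean - lsl) / (3 * sd)
print(f"mean = {mean:.2f}%, SD = {sd:.2f}%, Cpk = {cpk:.2f}")
# A Cpk well above about 1.33 is often taken to indicate a capable, consistent process.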
Revalidation
Re-validation provides the evidence that changes in a process and /or the process
environment that are introduced do not adversely affect process characteristics and product
quality. Documentation requirements will be the same as for the initial validation of the
process.
Re-validation becomes necessary in certain situations. Some of the changes that require
revalidation are as follows:
Changes in raw materials (physical properties such as density, viscosity, particle size
distribution etc. that may affect the process or product).
Changes in the source of the active raw material manufacturer.
Changes in packaging material (primary container/closure system).
Changes in the process (e.g., mixing time, drying temperatures and batch size).
Changes in the equipment (e.g., addition of an automatic detection system).
Changes in the plant/facility.
CALIBRATION VERSUS VALIDATION
Calibration is a demonstration that a particular instrument or device produces results within
specified limits, by comparison with those produced by a reference or traceable standard over an
appropriate range of measurements. Validation is a documented programme that provides a high
degree of assurance that a specific process, equipment, method or system consistently produces a
result meeting predetermined acceptance criteria.
In calibration, the performance of an instrument or device is compared against a reference
standard. In validation, no such reference standards are used.
Calibration ensures that an instrument or measuring device produces accurate results. Validation
provides documented evidence that a process, equipment, method or system produces consistent
results (in other words, it ensures that uniform batches are produced).
Calibration shall be performed periodically, to identify the 'drift' of the measuring device or
equipment and keep it accurate. Validation has no such periodic requirement; it shall be performed
when changes or modifications are made to the existing system, or when the revalidation period is
reached.
Calibration shall be performed as per the calibration SOP; validation shall be performed as per the
validation protocol.
CALIBRATION OF pH METER
Preparing for Calibration:
Allow the buffer solutions to reach the same temperature as the pH meter, because pH readings
are temperature dependent. Pour your buffers into individual beakers for calibration.
Check with your pH meter manufacturer, or your current educational or professional
institution, about acquiring pH buffer solutions.
Buffers should be kept in a beaker for no longer than two hours.
Discard the buffer when you are finished. Do not return it to its original container.
Calibrating Your pH Meter
1. Place your electrode in the buffer with a pH value of 7 and begin reading. Press the
"measure" or "calibrate" button to begin reading the pH once your electrode is placed in the
buffer.
Allow the pH reading to stabilize by letting it sit for approximately 1-2 minutes.
2. Set the pH. Once you have a stable reading, set the pH meter to the value of the buffer's
pH by pressing the measure button a second time. Setting the pH meter once the reading
has stabilized will allow for more accurate and tuned readings.
Although not necessary, if you stir your buffer before measuring be sure to stir all other
buffers and samples in the same way.
3. Rinse your electrode with distilled water. Rinse and pat dry with a lint-free tissue, like
Kimwipes or Shurwipes, in between buffers.
4. Place your electrode in the appropriate buffer for your sample and begin
reading. Press the measure button to begin reading the pH once your electrode is placed
in the buffer.
5. Set the pH a second time. Once your reading has stabilized, set the pH meter to the
value of the buffer's pH by pressing the measure button.
6. Rinse your electrode. You can use distilled water to rinse. Use a lint-free tissue, like
Kimwipes or Shurwipes, in between buffers to dry the electrode.
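The two buffer-setting steps above amount to a two-point calibration. For meters that display the raw electrode potential, the underlying arithmetic can be sketched in Python as below; the millivolt readings, the 95-105% slope criterion and the helper function potential_to_ph are illustrative assumptions, not part of this procedure.

# Minimal sketch of two-point pH calibration arithmetic (hypothetical readings).
NERNST_SLOPE_25C = 59.16   # theoretical mV per pH unit at 25 degrees C

# (buffer pH, measured electrode potential in mV) - example values only
ph7_reading = (7.00, 3.5)
ph4_reading = (4.01, 172.0)

slope = (ph4_reading[1] - ph7_reading[1]) / (ph7_reading[0] - ph4_reading[0])
offset_mv = ph7_reading[1]                  # reading at pH 7 (ideally 0 mV)
slope_percent = 100 * slope / NERNST_SLOPE_25C

print(f"slope = {slope:.2f} mV/pH ({slope_percent:.1f}% of theoretical)")
print(f"offset at pH 7 = {offset_mv:.1f} mV")
# Many SOPs accept roughly 95-105% slope and a small offset; check your own SOP.

def potential_to_ph(mv):
    """Convert a measured potential to pH using the calibration just derived."""
    return 7.00 + (offset_mv - mv) / slope

print(f"sample at 100 mV -> pH {potential_to_ph(100.0):.2f}")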
Using Your pH Meter
1. Place your electrode in your sample and begin reading. Once your electrode is placed
in your sample, press the measure button and leave the electrode in your sample for
approximately 1-2 minutes.
2. Set your pH level. Once the reading has stabilized, press the measure button. This is the
pH level of your sample.
3. Clean your electrode after use. Rinse your electrode with distilled water and blot or dab
dry with a lint-free tissue. You may store your pH meter once clean and dry.
Consult your operation manual for optimal storage practices for your specific pH meter.
CALIBRATION OF UV-VISIBLE SPECTROPHOTOMETER
Checking of Cells (Cuvettes):
Measure the apparent absorbance of the cell against air at 240 nm for quartz cells and at 650 nm
for glass cells.
The apparent absorbance should not be greater than 0.093 for 1 cm quartz cells (UV region)
and 0.035 for 1 cm glass cells (visible region).
Then rotate the cell in its holder by 180° and measure the apparent absorbance again. Rotating
the cell should give an absorbance difference of not more than 0.005 from the initial reading.
Record the observations and attach the printout of the UV graphs to the calibration template.
Control of Wavelength:
Reagent Preparation
o 1.4 M Perchloric acid: 5.74 ml of perchloric acid (AR grade, 11.6 M) shall be taken in a
clean and dried 50 ml volumetric flask, the volume made up to 50 ml with distilled water,
and mixed well.
o Preparation of 4% w/v solution of holmium oxide:
0.4 g of holmium oxide (Ho2O3, AR grade) shall be taken in a clean and dried 10 ml
volumetric flask.
Then add 8 ml of 1.4 M perchloric acid.
Heat and sonicate the flask until dissolved.
Then make up the volume with 1.4 M perchloric acid and filter the solution through
Whatman No. 41 paper, or use a certified standard solution of 4.0% w/v holmium oxide.
Calibration Procedure:
o Record the UV spectrum of the 4% w/v holmium oxide in 1.4 M perchloric acid solution
from 200 nm to 600 nm against 1.4 M perchloric acid as the blank.
o The wavelength scale shall be checked by detecting the holmium oxide peaks at 241.15 nm,
287.15 nm, 361.5 nm, 486.0 nm and 536.3 nm.
o The permitted tolerance shall be ± 1 nm for the range 200 nm to 400 nm (UV range) and
± 3 nm for the range 400 nm to 800 nm (visible range); a worked tolerance check is
sketched below.
o Record the observations and attach the printout of the UV graphs to the calibration report.
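As an illustration of the tolerance check described above, the Python sketch below compares hypothetical observed peak wavelengths with the holmium oxide reference values, applying the ± 1 nm (UV) and ± 3 nm (visible) limits; the observed values are assumed examples.

# Minimal sketch: checking observed holmium oxide peak wavelengths against the
# reference values and tolerances given above. Observed values are hypothetical.
reference_nm = [241.15, 287.15, 361.5, 486.0, 536.3]
observed_nm  = [241.4, 287.0, 361.9, 485.2, 537.8]   # example readings

def tolerance(wavelength_nm):
    # +/-1 nm in the UV range (200-400 nm), +/-3 nm in the visible range (400-800 nm)
    return 1.0 if wavelength_nm <= 400 else 3.0

for ref, obs in zip(reference_nm, observed_nm):
    deviation = obs - ref
    status = "PASS" if abs(deviation) <= tolerance(ref) else "FAIL"
    print(f"{ref:7.2f} nm: observed {obs:7.2f} nm, deviation {deviation:+.2f} nm -> {status}")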
Control of Absorbance:
Reagent / Dilution Preparation
o 0.005 M Sulfuric acid: 0.54 ml of sulfuric acid (AR grade, 18.4 M) shall be taken in a clean
and dried 2000 ml volumetric flask containing at least 50 ml of distilled water.
Make up the final volume cautiously to the mark with distilled water and mix well.
o Potassium dichromate solution in 0.005 M sulfuric acid, stock solution for 430 nm
(600 ppm):
Use potassium dichromate previously dried to constant mass at 130 °C. About 60 mg
(57.0 mg to 63.0 mg) of potassium dichromate (AR grade) shall be taken in a clean
and dried 100 ml volumetric flask.
Dissolve and make up the volume with 0.005 M sulfuric acid and mix well, or
use a certified standard solution of potassium dichromate (600 ppm).
o Final solution (60 ppm):
10 ml of the above stock solution shall be taken in a clean and dried 100 ml volumetric
flask; make up the volume with 0.005 M sulfuric acid and mix well, or
use a certified standard solution of potassium dichromate (60 ppm).
Calibration Procedure:
o Record the spectrum of the potassium dichromate final solution between 200 nm and 400 nm,
using 0.005 M sulfuric acid as the blank.
o Measure the absorbance of the peaks detected at 350 nm and 257 nm and of the valleys
detected at 313 nm and 235 nm.
o The absorbance of the potassium dichromate stock solution shall be taken at 430 nm, using
0.005 M sulfuric acid as the blank in photometric mode. Calculate the specific absorbance
A(1%, 1 cm) and verify the results.
Calculation formulae (a worked example is sketched after this subsection):
Specific absorbance A(1%, 1 cm) = (Absorbance x 10000) / weight taken in mg
(for the 60 ppm final solution at 235, 257, 313 and 350 nm)
Specific absorbance A(1%, 1 cm) at 430 nm = (Absorbance x 1000) / weight taken in mg
(for the 600 ppm stock solution)
o Record the observations and attach the printout of the UV graphs to the calibration report.
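The calculation formulae above follow from the definition of specific absorbance, A(1%, 1 cm) = A / (c x l), with the concentration c in g per 100 ml and a 1 cm path length. The Python sketch below reproduces both calculations with hypothetical absorbance readings and sample weight; acceptance limits should be taken from the applicable pharmacopoeia or SOP.

# Minimal sketch: specific absorbance A(1%, 1 cm) for the potassium dichromate
# solutions described above. Absorbance readings and weight are hypothetical.

weight_mg = 60.2          # potassium dichromate weighed into 100 ml (stock, about 600 ppm)

# Final (60 ppm) solution: concentration in g/100 ml = weight_mg / 10000
absorbance_257nm = 0.870
a1pct_257 = absorbance_257nm * 10000 / weight_mg       # formula for 235/257/313/350 nm
print(f"A(1%, 1 cm) at 257 nm = {a1pct_257:.1f}")

# Stock (600 ppm) solution measured at 430 nm: concentration = weight_mg / 1000 g/100 ml
absorbance_430nm = 0.960
a1pct_430 = absorbance_430nm * 1000 / weight_mg
print(f"A(1%, 1 cm) at 430 nm = {a1pct_430:.1f}")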
Linearity Study:
Preparation of 0.005 M Sulfuric acid:
o 0.54 ml of sulfuric acid (AR grade, 18.4 M) shall be taken in a clean and dried 2000 ml
volumetric flask containing at least 50 ml of distilled water.
o Make up the final volume cautiously to the mark with distilled water and mix well.
Linearity Solution Preparation in 0.005 M Sulfuric Acid:
100 ppm Solution:
o Weigh 100 mg of potassium dichromate (AR grade, previously dried to constant mass at
130 °C).
o Transfer into a clean and dried 1000 ml volumetric flask.
o Dissolve and make up the volume with 0.005 M sulfuric acid and mix well, or
o use a certified standard solution of potassium dichromate (100 ppm).
80 ppm Solution:
o Take 20 ml of the 100 mg/L solution in a clean and dried 25 ml volumetric flask, dilute up
to the mark with 0.005 M sulfuric acid and mix well, or
o use a certified standard solution of potassium dichromate (80 ppm).
60 ppm Solution:
o Take 15 ml of the 100 mg/L solution in a clean and dried 25 ml volumetric flask, dilute up
to the mark with 0.005 M sulfuric acid and mix well, or
o use a certified standard solution of potassium dichromate (60 ppm), if available.
40 ppm Solution:
o Take 20 ml of the 100 mg/L solution in a clean and dried 50 ml volumetric flask, dilute up
to the mark with 0.005 M sulfuric acid and mix well, or
o use a certified standard solution of potassium dichromate (40 ppm). (A worked linearity
check is sketched below.)
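Once the absorbances of the 40, 60, 80 and 100 ppm solutions have been measured, linearity can be evaluated by least-squares regression of absorbance against concentration; a correlation coefficient close to 1 (for example r >= 0.999, a commonly used criterion that should be confirmed against the SOP) indicates acceptable linearity. A minimal Python sketch with hypothetical absorbance readings:

# Minimal sketch: least-squares linearity check for the potassium dichromate
# solutions prepared above. Absorbance values are hypothetical examples.
concentrations = [40.0, 60.0, 80.0, 100.0]        # ppm
absorbances    = [0.498, 0.748, 0.995, 1.243]     # measured at 235 nm, for example

n = len(concentrations)
mean_x = sum(concentrations) / n
mean_y = sum(absorbances) / n
sxx = sum((x - mean_x) ** 2 for x in concentrations)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(concentrations, absorbances))
syy = sum((y - mean_y) ** 2 for y in absorbances)

slope = sxy / sxx
intercept = mean_y - slope * mean_x
r = sxy / (sxx * syy) ** 0.5

print(f"slope = {slope:.5f} AU/ppm, intercept = {intercept:.4f} AU, r = {r:.4f}")
print("Linearity acceptable (r >= 0.999)?", r >= 0.999)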
Resolution Power (Second-Derivative) Check:
o Prepare a 0.02% v/v solution of toluene in methanol by appropriate dilution of toluene with
methanol, or use a certified standard solution of toluene in methanol (0.02% v/v).
o Record the second-derivative spectrum of the resulting solution in the range of 255 nm to 275 nm.
o A small negative extremum (or trough) located between two large negative extrema (or
troughs) at about 261 nm and 268 nm should be clearly visible.
o The ratio A/B, where A is the amplitude of the small negative extremum and B is the amplitude
of the large negative extremum, should be not less than 0.2.
Solutions for calibration of the UV spectrophotometer shall be prepared freshly and used within
24 hours. Record the readings.
If any part of the instrument is replaced during maintenance, record the activity in the
instrument history card and, if required, calibrate the instrument.
If the readings do not fall within the specified ranges, contact the service engineer.
Re-Calibration:
After maintenance (change of critical parts such as lamp, filter, mirrors, etc.), re-calibrate the
UV spectrophotometer.
Upon change of cuvette, perform cuvette qualification.
Out of Calibration:
In case of failure of any of the calibration parameters, follow the SOP on handling
out-of-calibration results for laboratory instruments.
Relocation of UV Spectrophotometer:
If the instrument is shifted for any purpose, perform the qualification.
After such maintenance, carry out the UV calibration; for other maintenance, decide the
calibration requirement on a case-by-case basis.
There should be evidence that the analysts who are responsible for certain tests are
appropriately qualified to perform those analyses ("analyst proficiency").
CHARACTERISTICS OF ANALYTICAL PROCEDURES
Characteristics that should be considered during validation of analytical methods include:
Specificity
Linearity
Range
Accuracy
Precision
Detection limit
Quantitation limit
Robustness
Accuracy is the degree of agreement of test results with the true value, or the closeness of the
results obtained by the procedure to the true value. It is normally established on samples of the
material to be examined that have been prepared to quantitative accuracy. Accuracy should be
established across the specified range of the analytical procedure.
Note: it is acceptable to use a "spiked" placebo, where a known quantity or concentration of a
reference material is used.
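As an illustration of the spiked-placebo approach, the Python sketch below computes the percentage recovery at three spiking levels; the amounts added and found are hypothetical, and the 98-102% limits mentioned in the comment are an assumed example to be confirmed against the method's acceptance criteria.

# Minimal sketch: percent recovery from spiked placebo samples (hypothetical data).
spiked = [
    # (amount added in mg, amount found in mg)
    (8.0, 7.92),
    (10.0, 10.06),
    (12.0, 11.89),
]

for added, found in spiked:
    recovery = 100.0 * found / added
    print(f"added {added:5.2f} mg, found {found:5.2f} mg -> recovery {recovery:6.2f}%")
# Acceptance limits (e.g. 98-102% recovery) depend on the method and the SOP.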
Precision is the degree of agreement among individual results. The complete procedure should
be applied repeatedly to separate, identical samples drawn from the same homogeneous batch of
material. It should be measured by the scatter of individual results from the mean (good
grouping) and expressed as the relative standard deviation (RSD).
Repeatability should be assessed using a minimum of nine determinations covering the
specified range for the procedure e.g. three concentrations/three replicates each, or a minimum of
six determinations at 100% of the test concentration.
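As an illustration, the Python sketch below computes the relative standard deviation for six hypothetical determinations at 100% of the test concentration, as described above; the RSD <= 2% criterion in the comment is an assumed example, not a requirement stated here.

# Minimal sketch: repeatability expressed as relative standard deviation (RSD)
# for six determinations at 100% of the test concentration (hypothetical data).
import statistics

results = [99.6, 100.2, 99.9, 100.4, 99.7, 100.1]   # % of label claim

mean = statistics.mean(results)
sd = statistics.stdev(results)          # sample standard deviation (n - 1)
rsd = 100.0 * sd / mean

print(f"mean = {mean:.2f}%, SD = {sd:.2f}%, RSD = {rsd:.2f}%")
# Typical repeatability criteria (e.g. RSD <= 2%) should be taken from the SOP.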
Intermediate precision expresses within-laboratory variations (usually on different days,
different analysts and different equipment). If reproducibility is assessed, a measure of
intermediate precision is not required.
Reproducibility expresses precision between laboratories.
Robustness (or ruggedness) is the ability of the procedure to provide analytical results of
acceptable accuracy and precision under a variety of conditions. The results from separate
samples should not be unduly affected by small, deliberate variations in the method parameters
or operating conditions.
GOOD WAREHOUSING PRACTICES
Warehouse management is complex, but done right it can reduce costs, improve customer
satisfaction and increase warehouse operational efficiency.
Introduction:
Maintaining proper storage conditions for pharmaceutical products and materials is vital to
ensure their quality, safety and efficacy. Factory stores will invariably be receiving duly
approved raw materials and packaging materials from third parties.
A suitable space is provided for the receipt, handling and storage of the raw and packaging
materials required for manufacturing, including the packaging of pharmaceuticals. This space is
known as the warehouse. It is a part of the pharmaceutical company.
For what purpose?
To enable the fastest and cheapest transport of drugs and medical equipment from suppliers to
beneficiaries.
There are mainly 3 stages:
1. Purchase of pharmaceutical products.
2. Storage of ordered products.
3. Distribution of stocked products.
VARIOUS AREAS OF WAREHOUSING
Receiving area: includes initial inspection, cleaning and weight checking.
Sampling area: with adequate facilities to prevent cross-contamination.
Storage area: including specific storage such as air-conditioned rooms, cold rooms and
hazardous chemical storage rooms.
Rejected materials area: for unsuitable materials to be destroyed or retested.
Dispensing area: with adequate facilities to preclude cross-contamination during dispensing.
Design:
Principle: Premises must be located, designed, constructed, adapted, and maintained to suit the
operations to be carried out.
General:
The layout and design of premises must aim to minimize the risk of errors and permit
effective cleaning and maintenance in order to avoid cross-contamination, build-up of dust or
dirt, and, in general, any adverse effect on the quality of products.
Where dust is generated (e.g. during sampling, weighing, mixing and processing operations,
or packaging of powders), measures should be taken to avoid cross-contamination and
facilitate cleaning.
Premises should be situated in an environment which presents minimal risk of contamination
of materials or products.
Premises used for the manufacture of finished products should be suitably designed and
constructed to facilitate good sanitation.
Premises should be carefully maintained, and it should be ensured that repair and
maintenance operations do not cause any hazard to the quality of products.
Premises should be cleaned and, where applicable, disinfected according to detailed written
procedures. Records should be maintained.
Electrical supply, lighting, temperature, humidity and ventilation should be appropriate and
such that they do not adversely affect, directly or indirectly, either the pharmaceutical
products during their manufacture and storage, or the accurate functioning of equipment.
Premises should be designed and equipped so as to afford maximum protection against the
entry of insects, birds or other animals. There should be a standard procedure for rodent and
pest control.
Premises should be designed to ensure the logical flow of materials and personnel.
General guidelines:
Materials should be received against specific supply documents.
QUESTION BANK
SHORT QUESTIONS (2 marks)
1. Define validation.
2. Name the elements of validation protocol.
3. Name the different types of validation in pharmaceutical industry.
4. What is validation protocol?
5. What is the advantage of good warehousing?
6. What is material management?
7. Write the basic needs of material management
8. Define calibration.
9. What are the elements of material management?
10. Why is blending a critical parameter in tablet manufacturing?
11. Define process validation with flow chart.
12. Define Operational qualification.