
E-MTOM100-1 18H – Human Factors: Module 2- Exam Assignment

Date: 28-Sep-2018
By: Sudarsan Prathipati
BRIEF DESCRIPTION OF THE INCIDENT

On 6 July 1988, one hundred and sixty-seven men lost their lives in the Piper Alpha oil
platform disaster. The incident has been shown in safety briefings over and over again, yet there
is always something new to learn from it. Three decades have passed since then, leaving behind
numerous safety regimes to reflect and act upon.

Lord Cullen of Whitekirk, who chaired the public inquiry into the Piper Alpha disaster, reflected
on major accidents at a conference held for the 30th anniversary of the disaster, convened to
consider themes for securing a safer future: “How frequently major accidents have been preceded
by signs indicating danger, but those signs were not recognized or at any rate effectively acted on
to prevent the accidents in question or at any rate to limit their extent.” (Video source: PSA, 2018)

“The Tragedy of Piper Alpha must be a constant reminder to us All”

– Deirdre Michie, Chief Executive, Oil & Gas UK
The Piper Alpha platform was built in 1976, operated with a crew of 226 men and produced
300,000 barrels of oil a day (Wonke, et al.). The main process consisted of oil separation and of
gas and condensate export. Piper was interconnected with two other remote oil platforms, which
were tied into the same oil and gas risers.

On the day of the accident, nothing unusual was noticed during normal working hours. According
to reports the incident happened around 10 PM (not before 10 PM), and the first explosion
occurred in the gas export compressor room.

Initiating Event

As noted in “Piper Alpha: Spiral to Disaster” (Appleton, 2017), fifteen minutes before the incident
there was a trip on condensate pump A in the gas compressor room. The night shift operations
supervisor and his colleague inspected the site and decided to bring pump B online. Pump B,
however, had been under annual overhaul/maintenance for quite some time and was electrically
isolated. Most of that job was finished, and the work permit for it had been returned for the day.
The team thoroughly checked the suction and discharge valves and the associated pipework, found
them satisfactory, and decided to bring pump B online rather than repair pump A, since that would
have meant production downtime.

What the night shift operations team did not know was that there was a separate permit on the
spare condensate pump B module to repair a relief valve near the export line. This valve was
located out of sight, elevated above the gas compressor room. The work was unfinished, and the
valve opening had been blanked with a temporary flange. The permit for replacing the relief valve
on the condensate pump B module had been returned for the day, with the intention of resuming
the job the next morning.

The contract supervisor responsible for the relief valve replacement (who had never acted as a
supervisor before) returned the permit to the permit office at 1800 hrs, but the status of the job
was not communicated to the duty supervisor/operator, because the whole operations team was
busy with the shift handover at that time. He chose to leave the permit on the desk and left the
permit office without informing anyone.
Since the night shift team was aware only of the overhaul permit, and assumed that clearing pump
B's electrical isolation would be the fastest way to resume production, they obtained clearance
from the control room to start the pump without cross-referencing whether any other work permit
was active on the module.

As soon as pump B started, the temporary flange fitted in place of the relief valve began to leak;
within a short time the leak ignited and produced a large explosion once the flange could no
longer hold the pressure created by the pump. The explosion might have been contained within
the gas compressor room if the module walls had been strong enough to withstand it. Instead, the
initial blast blew out the walls between the modules; the oil equipment caught fire and it too
exploded, in what survivors described as the second big blast.

Escalation

At this point the fire (referred to as the “oil fire”) should have lasted only until the inventory on
the platform burned out. However, Piper was connected to two other remote platforms, Tartan and
Claymore, which continued to produce even though they received repeated emergency calls from
Piper during the early moments of the incident. This allowed the oil fire to burn long enough to
weaken the insulation of the gas risers.

Eventually the two gas risers, operating at a pressure of 2,000 psi and running directly through
the middle of the oil fire, could not hold out any longer. Within a gap of about 25 minutes they
failed, producing the most devastating blast of the night and leaving behind the legacy of the
Piper Alpha platform: 167 men dead.

Emergency and Evacuation

The communication systems were active only for the first few minutes after the first blast. All
personnel on board mustered in the living quarters (LQ) and waited for further instructions; the
LQ was supposed to be protected from the process area by a firewall. The emergency procedures
required taking people to the lifeboats, but this was never possible because fire and smoke
engulfed the LQ. Some people chose to jump into the sea, whereas most of the crew waited for
instructions that never came and lost their lives. The few who jumped into the sea were picked
up by service vessels around the platform and survived.

“It was a terrible reminder of what can go wrong if we fail to manage risk properly in
this business”

– Magne Ognedal, head of the Petroleum Safety Authority (PSA)

Failure of Barriers

The fire water pumps were supposed to start automatically, but they had been switched to manual
mode to protect the divers who carried out work from time to time, on the assumption that the
pump intakes would be a danger to any diver working too close to them. It is also evident from
various sources that the deluge equipment was not operational because of rust; this had been
identified, yet the remedial action had been left pending for nearly four years.

The original platform design principles were breached when the platform was modified to receive
and process gas from other installations. Locating the gas compression area close to the control
room was a significant reason for the further escalation and for the early ingress of fumes and
smoke into the living quarters.

Activation of the emergency shutdown systems isolated and contained Piper's own supply of
hydrocarbons, but the fire was maintained and sustained because both Tartan and Claymore
continued to send gas to the blazing platform.

(Ognedal, 2013)
ERRORS THAT CONTRIBUTED TO THE INCIDENT

The series of events can be mapped onto the well-known Swiss cheese model, in which the failure
of defenses, barriers and safeguards opens up an accident trajectory (Reason, 2000); a simple
numerical sketch of this idea follows.
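As a purely illustrative aside (not part of the original analysis, and using invented failure probabilities), the Swiss cheese idea can be put into numbers: an initiating event only becomes an accident when the holes in every defensive layer happen to line up, so the likelihood falls sharply when independent barriers are healthy and rises sharply when latent conditions widen the holes.

```python
# Illustrative Swiss cheese calculation (hypothetical numbers, barriers assumed independent).
# An accident occurs only if every defensive layer fails at the same time.

def accident_probability(hole_probabilities):
    """Probability that an initiating event penetrates every barrier."""
    p = 1.0
    for hole in hole_probabilities:
        p *= hole
    return p

# Hypothetical barriers: permit cross-check, gas detection, automatic deluge, blast/fire walls.
healthy_barriers = [0.01, 0.05, 0.02, 0.10]
degraded_barriers = [0.50, 0.05, 1.00, 0.80]  # e.g. permits not cross-checked,
                                              # deluge in manual, walls not blast-rated

print(accident_probability(healthy_barriers))   # ~1e-06: holes rarely align
print(accident_probability(degraded_barriers))  # ~0.02: latent conditions widen the holes
```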

There may be numerous errors that contributed, but at the core of the incident I believe most of
them can be characterized as the following deficiencies, which led to the failure of defenses,
barriers and safeguards and resulted in the Piper Alpha accident:

• Deficiency in the work permit system
• Deficiency in information exchange
• Poor decision quality
• Lack of training
• Deficiency in hazard analysis
• Procedural violations

As we understand it, there was no cross-referencing between active work permits within the same
area or on the same equipment. Had proper cross-referencing been in place (a simple sketch of
such a check is given below), the night shift team would have known about the unfinished relief
valve job on the spare condensate pump.
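Purely as an illustration of what such a cross-check could look like (the register, equipment names and permits below are hypothetical, not a description of Piper Alpha's actual permit-to-work system), a permit register keyed by equipment or module makes every active permit visible before a start clearance is issued:

```python
# Hypothetical sketch of a permit register that cross-references active permits
# by equipment/module before a start clearance is granted.
from collections import defaultdict

class PermitRegister:
    def __init__(self):
        self._active = defaultdict(list)  # equipment id -> open permits

    def open_permit(self, equipment_id, description):
        self._active[equipment_id].append(description)

    def close_permit(self, equipment_id, description):
        self._active[equipment_id].remove(description)

    def clearance_to_start(self, equipment_id):
        """Refuse a start clearance while any permit is still open on the equipment."""
        open_permits = self._active[equipment_id]
        if open_permits:
            print(f"Start of {equipment_id} blocked by open permits: {open_permits}")
            return False
        return True

register = PermitRegister()
register.open_permit("condensate-pump-B", "annual overhaul / electrical isolation")
register.open_permit("condensate-pump-B", "relief valve replacement (temporary flange fitted)")

# Suppose the overhaul permit has been signed off and closed...
register.close_permit("condensate-pump-B", "annual overhaul / electrical isolation")

# ...the cross-check still blocks the start, because the relief valve job remains open.
print(register.clearance_to_start("condensate-pump-B"))  # False
```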

Lack of training on the work permit system and a lack of on-the-job training allowed faults to
accumulate within the system, which in turn led to deficiencies in information exchange. Appleton
mentioned in his technical assessment that operators routinely took equipment back from
maintenance without proper inspection or confirmation of its status before putting it back into
operation, and that this had become common practice.

The quality of the decision to keep the critical fire water pumps in manual mode for the safety of
divers has also been questioned in various studies of the incident. Automatic activation of the fire
water pumps might have delayed or entirely prevented the oil fire, and the death toll might then
have been limited to the two or three people who were actually present in the gas compressor room.

Piper was initially designed to produce oil, but it was later converted for gas processing when it
was connected to the two remote platforms Tartan and Claymore. Without a proper hazard
analysis, the hazards associated with a gas explosion were not considered. The firewalls were
originally designed to withstand the heat of an oil blaze rather than the overpressure of a gas
explosion. (Ognedal, 2013)

Procedural violations seeped into every layer of the safety barriers:

• Mishandling of the work permit system
• Bypassing safety systems – fire water pumps left in manual
• Design violations – process module located near the LQ facility
• Organizational or management violations – handling of the gas riser hazard
• Safety culture and behavioral violations – mustered men frequently opening the fire doors
to assess the situation outside, letting large amounts of smoke and fumes into the LQ

The maintenance error that eventually led to the initial leak was the result of inexperience, poor
maintenance procedures, and deficient learning mechanisms. (Redmill, et al., 1997)

Reason (2000) notes that the errors which enable the holes in the Swiss cheese model to line up,
representing the failure of safety barriers, arise for two reasons: active failures and latent
conditions.
IDENTIFY 'HUMAN ERRORS' OF THE INCIDENT AND CLASSIFY THEM INTO 'HUMAN ERROR CAUSATION PARADIGMS'

When considering the human factors in an accident, human errors are broadly classified into two
categories: active failures and latent conditions. Active failures are the acts of people directly
involved in the incident, and their consequences are immediate. Latent conditions, by contrast, sit
within the system across the various levels of the organization (e.g. safety culture, violations and
poor decision quality); they tend to remain hidden until triggered by an event, and several latent
conditions together can lead to latent failures. They tend to surface in unforeseen ways. (HSE UK)

The accident model illustrated here is taken from the HSE-defined model of human factors in
accident investigation. As it illustrates, unsafe acts and unsafe plant conditions result from both
active and latent errors. Active failures consist of slips, lapses and mistakes (Reason, 2000),
whereas latent failures are inherent in the system and become evident only under a particular
combination of events; they include poor design, poor quality of decisions, uncertainty in roles
and responsibilities, and so on.

In the Piper Alpha incident, the contract supervisor returning the work permit without informing
operations of the equipment status can be regarded as an active failure, while accepting the
equipment back without proper inspection, an ingrained habit (“the way we do things around
here”), is a latent failure.

But what causes human errors? Redmill et al. (1997) describe four perspectives on the role of
human failures in accident causation:

• The Engineering Error Paradigm – Focuses on the technical aspects of the system. A mismatch
between human expectancy and the man-machine interface in automated systems encourages
human unreliability: the operator is expected to act, while the automation designer has introduced
errors into the system during the design phase.

• The Individual Error Paradigm – Focuses on the person's motivation and safety attributions.
Reason (2000) argued that human error can be viewed in two ways, the person approach and the
system approach; this paradigm corresponds to the person approach, in which the majority of
accidents are attributed to unsafe acts rather than unsafe conditions (Heinrich, et al., 1980).

• The Cognitive Error Paradigm – Analyses human errors in relation to the information-processing
abilities of an individual and their interaction with tasks and situations. The cognitive variables of
most interest are not directly observable (Manchi, et al., 2013), for example memory failure,
attentional failure, information overload and decision-making failure.

• The Organizational Error Paradigm – Adopts a broader perspective, looking higher than
individual unsafe acts: a system approach instead of a person approach. Preconditions such as
poorly designed procedures, unclear allocation of responsibilities, lack of training and poor
equipment design may contribute indirectly to accidents through latent failures.

In the Piper Alpha event, both active failures and latent conditions certainly contributed to the
accident on that fateful night, which claimed 167 lives. The following table maps the errors that
contributed to the accident to their respective causation paradigms:
The Engineering error paradigm
• Information overload from alarms during the initial moments of the accident
• Limited redundancy in the communication systems

The Individual error paradigm
• Lapses in the handling of permits
  o Failure to collect and consolidate permits, as this was not a usual practice – a skill-based
    human error
• Individual safety behavior in the event of emergency
• Emergency preparedness – lack of proactive participation

The Cognitive error paradigm
• Poor decision quality
  o Decision to keep the fire water pumps in manual
  o Decision to take spare pump B into operation without cross-referencing whether any other
    work permit was active
• Ability to interact with the task – production supervisors on the remote platforms assumed
  that Piper would handle the situation and decided to continue production

The Organizational error paradigm
• Work permit system – training and orientation of new personnel towards the safety culture
  of the platform
• Routine violations
  o Operators accepting equipment back without proper inspection
• Procedural violations
  o Deluge valve maintenance kept pending for 4 years
  o Accepting a process module near the LQ area
  o Gas explosion hazard largely neglected during the design phase
• Unclear allocation of responsibilities – the contract supervisor who supervised the relief
  valve job was acting as a supervisor for the first time; the lack of mentorship and training
  allowed faults to accumulate further

DISCUSS WHAT SPECIFIC STEPS CAN BE TAKEN TO MITIGATE THE POTENTIAL FOR SIMILAR HUMAN ERRORS (TO
INCREASE HUMAN RELIABILITY UNDER SIMILAR CONDITIONS) IN FUTURE, TAKING INTO CONSIDERATION THE
CHANGES/CHARACTERISTICS OF THE MODERN INDUSTRIAL ENVIRONMENT.

The above discussion points more to management failures and latent conditions than to individual
and active failures. In the modern industrial environment, the system approach rather than the
person approach is gaining popularity, with organizations anticipating the worst and equipping
themselves to deal with it at all levels of the organization. (Reason, 2000)

It is worth noting that Human Reliability Assessment (HRA) is a discipline that provides tools for
estimating the risks attributable to unavoidable human errors. The three main stages of HRA, as
described by Kirwan (1994), are listed below, followed by a simple illustrative sketch of the
quantification step:

i. Human error identification (HEI) – what errors can occur
ii. Human error quantification (HEQ) – what is the probability that those errors will occur
iii. Human error reduction (HER) – how to reduce that probability of error
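The quantification step is easiest to see with numbers. The sketch below is only illustrative: the human error probabilities (HEPs) are invented for the example and the errors are treated as independent, whereas real HRA techniques refine such estimates with dependence and performance-shaping factors.

```python
# Illustrative HEQ calculation with invented human error probabilities (HEPs).
# Assumes the identified errors are independent of one another.

identified_errors = {  # HEI: what can go wrong (hypothetical task)
    "permit not cross-referenced": 0.03,
    "equipment status not verified before start-up": 0.02,
    "isolation reinstated incorrectly": 0.01,
}

def task_failure_probability(heps):
    """HEQ: probability that at least one identified error occurs during the task."""
    p_success = 1.0
    for hep in heps:
        p_success *= (1.0 - hep)
    return 1.0 - p_success

print(task_failure_probability(identified_errors.values()))  # ~0.059

# HER: a reduction measure (e.g. an independent cross-check of permits) that cuts
# the first HEP by a factor of ten lowers the overall failure probability.
identified_errors["permit not cross-referenced"] = 0.003
print(task_failure_probability(identified_errors.values()))  # ~0.033
```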

HRA is considered a hybrid discipline founded on both technical/engineering and psychological
perspectives; combining these perspectives helps us assess the total risk picture of a system and
determine which part of the system carries more risk (human or technical). Although HRA was
previously a purely quantitative approach, the focus has now shifted from pure quantification
towards understanding the behavior and complexity of human errors and their causes
(Massaiu, 2006).

The following table gives a qualitative view of the steps that can be taken to mitigate potential
human errors so that events like Piper Alpha can be avoided in the future:

Skill-based slips and lapses
Causation: familiarity with the job; confusion between similar tasks; distractions and
interruptions, etc.
Steps that can be taken:
• Make the workforce aware that slips and lapses cannot be avoided entirely
• Design work procedures so that the sequence of actions carried out cannot be misinterpreted
  or skipped
• Create a work environment with as few distractions as possible
• Checklists might be helpful

Rule-based and knowledge-based mistakes
Causation: doing the wrong thing believing it to be right.
Steps that can be taken:
• Avoid multitasking, especially on complex tasks (e.g. avoid running a compressor start
  sequence while routine F&G detector testing is under way; both are complex tasks)
• Develop a road map of where mistakes can occur and implement controls (training and
  equipment monitoring)
• Provide procedures and job aids such as drawings for the inexperienced workforce, together
  with proper mentorship

Violations
Causation: intentional failures – deliberately doing the wrong thing.
Steps that can be taken:
• Help the workforce understand why the rules were made in the first place –
  counselling/training/mock drills/workshops
• Make sure the rules are not so strict that people find them hard to follow, while also not
  being seen as unnecessary
• Adopt a safety culture in which taking shortcuts is forbidden
• Involve the workforce in changes to the rules and in their acceptance
• Create a better work environment and try to reduce time pressure
• Assess the reason for each violation and put remedial actions in place (e.g. is the PPE
  difficult or uncomfortable to use?)
Rebuilt from “Understanding Human Failure”, Leadership and Worker Involvement Toolkit, HSE UK.

These are some of the steps that help to reduce errors and violations for a better and safer working
environment. Leadership and management have the responsibility to provide the workforce with
the necessary tool kit, and at the same time individuals in the workforce need to adopt the safety
culture in their own behavior.

“As an organization, the pursuit of safety is not so much about preventing isolated failures, either
human or technical, as about making the system as robust as is practicable in the face of its human
and operational hazards”. (Reason, 2000)
Works Cited
Appleton, Brian (Technical Assessor, Cullen Inquiry). Piper Alpha: Spiral to Disaster. BBC, 2017.
Heinrich, H. W., Petersen, Dan and Roos, Nestor. Industrial Accident Prevention: A Safety
Management Approach. McGraw-Hill, 1980.
HSE UK. Core Topic 2: HF in Accident Investigations. 2018.
http://www.hse.gov.uk/humanfactors/topics/.
Kirwan, B. A Guide to Practical Human Reliability Assessment. London: Taylor & Francis, 1994.
Manchi, Ganapathi Bhat, Gowda, Sidde and Hanspal, Jaideep Singh. “Study on Cognitive
Approach to Human Error and Its Application to Reduce the Accidents at Workplace.”
International Journal of Engineering and Advanced Technology (IJEAT), Volume 2, Issue 6, 2013.
Massaiu, Salvatore. Human Error and Models of Behaviour. Report No. IFE/HR/E-2005/031.
Norges forskningsråd, 2006.
Ognedal, Magne. Piper Alpha: North Sea Nightmare. Petroleum Safety Authority Norway, 2013.
PSA Norway. The Legacy of Piper Alpha. PSA Norway, 2018.
Reason, James. “Human Error: Models and Management.” BMJ 2000;320:768.
Redmill, Felix and Rajan, Jane. Human Factors in Safety-Critical Systems.
Butterworth-Heinemann, 1997.
Wonke, Anthony (dir.). Fire in the Night. BBC UK.
