

ISO/IEC 15504 - EVOLUTION TO AN INTERNATIONAL
STANDARD

Terence P. Rout

Software Quality Institute

School of Computing and Information Technology

Griffith University

Queensland 4111

Australia

Contact Details:
Terence P. Rout
School of Computing and Information Technology
Nathan Campus
Griffith University
Queensland 4111
Australia

Email: [email protected]
Phone: +61 7 3875 5046
Fax: +61 7 3875 5207
* Accepted for publication in Software Process: Improvement and Practice.
Abstract
This paper describes the work currently being undertaken to progress ISO/IEC TR 15504 to
the status of a full International Standard, and outlines the changes in design that are to be
incorporated in the revision. It describes the inputs for the design decisions that were taken;
identifies the fundamental changes in the architecture of the Standard; and briefly describes
the current status of the development of the Standard.
Introduction
ISO/IEC 15504 [1] is the International Standard for Process Assessment. Its development,
with the parallel empirical studies of its use by the SPICE Project [22, 23], has spanned 10
years – the initial Study Group established by JTC1/SC7 to explore the needs and
requirements for the standard reported in 1992 [12].
The first version of the Standard was published in 1998 as a Technical Report (Type 2) [2].
This was a deliberate decision, recommended by the original Study Group report, and based
upon the JTC 1¹ Directives [3], which state: "When the subject in question is still under
technical development or where for any other reason there is the possibility of an agreement
at some time in the future, JTC 1 may decide that the publication of a TR would be more
appropriate." At the 1998 Plenary Meeting of ISO/IEC JTC1/SC7, Working Group 10
(WG10), responsible for standards in the domain of Process Assessment, and thus for the
development and ongoing maintenance of ISO/IEC TR 15504, resolved to initiate a revision
of the document set, with the goal of preparing a revised version for full International
Standard status within the three-year period allowed for the revision of Technical Reports.
JTC1/SC7 adopted the following resolution:
JTC1/SC7 authorizes its WG10 to develop Project Requirements and Schedule for the
revision of TR15504 (Software Process Assessment). The intent is to evolve the TR into
an IS, seeking in the process as wide a representation as possible with the user
community of TR 15504. Further, JTC1/SC7 instructs WG10 to liaise with WG7 and
WG13² for this work. [4]
This paper identifies the key inputs to this revision, and sets out the design of the solution
that was determined for this revision. It concludes with a brief report on the current schedule
for publication of the Standard, and describes recommendations for transition to the new
version.

User Views
A Web-based survey of user opinions on the revision commenced in 1998. The survey was
initiated by WG10, which approved the content of the questions. The survey was promoted
widely among relevant interest groups; responses were collected through a web-based
interface hosted by the Software Quality Institute at Griffith University. The survey provided
useful insights into user opinions on the usability and usefulness of the Technical Report; of
considerable interest was the divergence of opinion on some key issues of usability.
A total of 89 responses were received (by December 2000) and analysed. The respondents
to the survey covered a wide range of industry groups and occupations; Figures 1 and 2
show an analysis of the respondents.

¹ ISO (the International Organization for Standardization) and IEC (the International Electrotechnical Commission) form the specialized system for worldwide standardization. National bodies that are members of ISO or IEC participate in the development of International Standards through technical committees established by the respective organization to deal with particular fields of technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the work. In the field of information technology, ISO and IEC have established a joint technical committee, ISO/IEC JTC 1. Sub-Committee 7 (SC7) of JTC1 is responsible for standards in the domain of systems and software engineering.

² Within SC7, WG7 is responsible for standards relating to software and systems life cycle processes; WG13 was responsible for a standard for the Software Measurement Process.
Figure 1 - Analysis of survey respondents by industry

Figure 2 - Analysis of survey respondents by occupation


The respondents reported (generally) good knowledge of ISO/IEC TR 15504 and other
relevant standards - ISO 9000:1994 [5] and ISO/IEC 12207 [6]. For ISO 9000, this was
evidenced by regular use of the standard; the most common response for others was
"knowledge but limited experience". Those with experience of ISO/IEC TR 15504 were
divided almost evenly between SPICE Trials Participants [7] and others. Respondents were
familiar with a wide range of assessment approaches: the SW-CMM [8], Trillium [9], Bootstrap [10] and Process Professional [11]. Figure 3 shows the various assessment approaches used; data on ISO/IEC TR 15504 include both Trials participants and others. Respondents could record familiarity with more than one approach, so the total number of responses differs from that for other questions.
Figure 3 - Experience of respondents with different assessment approaches


Questions in the survey were designed to establish:
• opinions on the content and usability of the existing Technical Report;
• views on the role and content of a standard for process assessment;
• views on the requirements originally established for ISO/IEC TR 15504 (and documented in SC7 N944R [12]).
The survey content is set out in Table 1; the results of the survey (as at 1 December 2000)
are shown in Figures 4, 5 and 6. The questions in the survey were divided into the three
categories described above; Figures 4 - 6 show the distribution of responses to each
question.

Table 1 - Survey Questions


Usability of ISO/IEC TR 15504
1. ISO/IEC TR 15504 contains the right amount of information for its purpose
2. ISO/IEC TR 15504 is about the right size for the information needed
3. ISO/IEC TR 15504 presents information in an understandable form
4. ISO/IEC TR 15504 benefits from having an overview document (Part 1) relating the various parts
5. The requirements for performing assessments in ISO/IEC TR 15504-3 are easy to understand and follow
6. The process dimension of the reference model in ISO/IEC TR 15504-2 achieves its purpose
7. The scale for measurement of process capability in ISO/IEC TR 15504-2 is easy to understand and apply
8. The results of assessments performed using ISO/IEC TR 15504 are easy to understand
9. The results of assessments performed using ISO/IEC TR 15504 are easy to analyse, providing a basis for action
10. The guidance on performing assessments in ISO/IEC TR 15504-4 is useful and relevant
11. The guidance on achieving and validating assessor competency in ISO/IEC TR 15504-6 is useful and relevant
12. The guidance on process improvement in ISO/IEC TR 15504-7 is useful and relevant
13. The guidance on process capability determination in ISO/IEC TR 15504-8 is useful and relevant
14. The availability of a separate consolidated vocabulary in ISO/IEC TR 15504-9 is valuable
15. ISO/IEC TR 15504-5 is useful as a guide for developing and evaluating other assessment models
16. ISO/IEC TR 15504-5 is useful as a model for use in performing assessments

Role of the Standard for Process Assessment
1. It is important for a Process Assessment standard to harmonise different assessment methods
2. An assessment standard should define the only approach to assessment, replacing all other methods
3. It is important to be able to compare the results of assessments that used different methods
4. It is important to be able to select processes for assessment from a standard
5. It is important to be able to consistently add new organization-specific processes to an assessment standard
6. It is important to be able to categorise or classify work products

Requirements for the Standard
1. The results of a process assessment should reflect the organization’s ability to set and realize defined, achievable goals for productivity and/or development cycle time linked to business needs and project requirements
2. Process assessment is an important technique for an organization to achieve a repeatable software process
3. A standard for process assessment should contain guidance on using assessment results to achieve process improvement
4. A standard for process assessment should contain guidance on using assessment results in determining process capability for assessment of risks in acquisition and development
5. A standard for process assessment should be expected to lead to more reliable and consistent performance of assessment
6. A standard for process assessment should be usable in any organization involved in software development, regardless of application domain, business needs or size of organization
7. A standard for process assessment should reflect current best practice in software engineering
8. Process assessment is useful for assessing both projects and organizations
9. Process assessment is primarily concerned with evaluating adherence to defined procedures
10. The results of process assessment must inevitably be subjective views of the assessor
11. The results of process assessment should be able to be presented as process profiles allowing views at different levels of detail
12. The results of a process assessment should be able to be expressed as a single value for the organization as a whole
13. The standard for process assessment should include information on the competency expected of assessors
14. Assessment results should be available as defined process capabilities certified by a third party
15. Assessments should preferably be performed in-house with results that can be independently verified

The first component of the survey addressed the usability of the original Technical Report.
Responses were obtained from the 35 respondents who were familiar with ISO/IEC TR 15504.
The results are shown in Figure 4.
Figure 4 - Survey Responses: Usability of ISO/IEC TR 15504 (number of responses per question on a five-point scale from Strongly Disagree to Strongly Agree)


The respondents had a generally positive opinion of the existing Technical Report; the most
common response was "Agree", and the proportion of positive responses ("Agree" or
"Strongly agree") ranges from 54% to 74% of all responses.
The second component of the survey addressed the role of a Standard for Software Process
Assessment. In particular, this section addressed the relationship between the Standard and
other assessment approaches.

Figure 5 - Survey Responses - Role of the Standard for Process Assessment (number of responses per question on the same five-point scale)


The results, shown in Figure 5, generally confirmed the view that the Working Group has
taken over the course of the Project - that a Standard should not seek to be the "one and
only" approach to assessment, but should rather provide a mechanism for harmonising
different approaches. The importance of comparability of assessment results is endorsed;
the concept that it should be possible to scope each assessment to a limited set of
processes also found support.
The third section of the survey was used to validate key requirements defined for the
Standard in the original study, and documented in the Report of the Study Group [12] in
1992.
The results, shown in Figure 6, broadly supported the original defined requirements.
Responses to Question 12 showed significant support for the expression of organisational
maturity, an issue not addressed in these terms by ISO/IEC TR 15504. Another interesting
issue was the high degree of support for certification of assessment results (Question 14);
this was an area that the working group (for political rather than technical reasons) had
generally agreed should not be within the domain of the Standard.

Figure 6 - Survey Responses - Requirements for the Standard (number of responses per question on the same five-point scale)

Design Inputs
National bodies participating in WG10 were invited to submit proposals for the revision;
discussion papers were received from the USA, Japan, and Australia. There was a general
acknowledgement of the need to reduce the overall size of the Standard; there were
significant variations in the proposals for the extent and nature of any reduction, with one
view being for the removal of all material relating to guidance on use (process improvement
and capability determination). There was also a general acknowledgement of the need to
expand the scope of the Standard beyond the software life cycle processes, following on the
emergence of ISO/IEC 15288 - Systems Life Cycle Processes [14] and ISO 18529 - Human-
centred lifecycle process descriptions [15]. All submissions stressed the need to maintain compatibility with these current and emerging standards, and a desire to increase the flexibility of application of the Standard was evident.
At meetings held in October 1998 and March 1999, WG10 developed a detailed proposal for
the review of ISO/IEC TR 15504, including a complete statement of requirements and
documented strategy for the review [16]. The proposal was distributed throughout the
Software Engineering Standards community, and was subsequently approved by JTC1. The
defined requirements have been maintained throughout the development of the Standard; in
the latest version (29 October 1999) 27 functional and 12 non-functional requirements are
defined. Many of these re-affirm the original recommendations of the Study Group report;
there are, however, some significant changes of direction. The principal change is the decision
to remove the "process dimension" of the Reference Model of ISO/IEC TR 15504-2 from the
scope of the Standard, relying on external sources for definitions of processes to be
assessed. This was seen as critical to achieving the stated wish to make the standard more
flexible and extend its scope of application. In line with this goal, the title of the Standard has
been modified – from "Software Engineering – Software Process Assessment" to
"Information Technology – Process Assessment".

The Revised Framework


A high-level design for the revised document set was developed. The design envisages considerable simplification of the conceptual model for process assessment, with substantially more flexibility in the range of process areas included in the domain of the standard. The
experiences with the use of the initial document set, monitored through the SPICE Trials,
together with the ongoing maturity of understanding of process management in systems and
software engineering, led to a significant re-design of the assessment framework. The most
significant change is to remove the Process Dimension of the Reference Model (the current
Part 2 of the document set) from the scope of the Standard. Instead, requirements are
defined for Process Reference Models that can be met by current and emerging standards
within the domain of JTC1/SC7 - in particular, the Amendment to ISO/IEC 12207 - Software
Life Cycle Processes [13], and the standard for System Life Cycle Processes, ISO/IEC
15288 [14]. The new framework is shown in Figure 2.

Figure 2 - Revised View of the Assessment Process
(The figure shows a conformant Process Assessment Model drawing on a Process Reference Model, which defines process purpose and process outcomes, and on the Measurement Framework, which defines capability levels, process attributes and the rating scale; the assessment model adds scope, indicators, mapping and translation. The assessment process takes as initial input the sponsor identity, purpose, scope, constraints and assessment team; proceeds through planning, data collection, data validation, process attribute rating and reporting; and produces as output the identification of evidence, the process used and process profiles. Roles and responsibilities are defined for the sponsor, the competent assessor and the assessors.)
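
The structure summarised in the figure can be sketched informally as a small data model. The sketch below is the author's illustration only; the class and field names simply restate the inputs, activities, outputs and roles listed in the figure and are not definitions taken from the Standard.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: these names summarise the inputs, activities,
# outputs and roles shown in the figure, not text from ISO/IEC 15504.

@dataclass
class AssessmentInput:
    sponsor_identity: str
    purpose: str
    scope: List[str]          # the processes to be investigated
    constraints: List[str]
    assessment_team: List[str]

@dataclass
class AssessmentOutput:
    evidence: List[str]       # identification of the objective evidence gathered
    process_used: str         # the documented assessment process that was followed
    process_profiles: dict    # process attribute ratings for each assessed process

ACTIVITIES = ["Planning", "Data Collection", "Data Validation",
              "Process Attribute Rating", "Reporting"]

ROLES = ["Sponsor", "Competent Assessor", "Assessors"]
```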


The major benefit of this change is that it expands the available scope of the standard.
Instead of being constrained to the processes of the software life cycle, it will now define a
common assessment process that can be applied to any processes defined in the required
manner. Processes defined through an appropriate Process Reference Model are assessed
in relation to the measurement scale for capability set out in the Standard. The scale is
based upon the existing Capability Dimension of ISO/IEC TR 15504-2, but has been
modified and strengthened to address weaknesses identified through use of the Technical
Report. The issue of alignment to the new version of ISO 9001 [19] has also been
considered, and the revised Measurement Framework is strongly aligned to this Standard.
The Technical Report [2] was published with nine parts [23]:
Part 1 - Concepts and introductory guide
Part 2 - A reference model for processes and process capability
Part 3 - Performing an assessment
Part 4 - Guide to performing assessments
Part 5 - An assessment model and indicator guidance
Part 6 - Guide to competency of assessors
Part 7 - Guide for use in process improvement
Part 8 - Guide for use in determining supplier process capability
Part 9 - Vocabulary
The International Standard [1] is considerably reduced in size, and also in complexity, having
five parts. The overall structure of ISO/IEC 15504 is as follows:
Part 1 - Concepts and Vocabulary
Part 2 - Performing an Assessment
Part 3 - Guidance on Performing an Assessment
Part 4 - Guidance on Using Assessment Results
Part 5 - An Exemplar Process Assessment Model
In general, the new Part 1 contains material from Parts 1 and 9 of the Technical Report; Part
2 amalgamates issues from Parts 2 and 3; Part 3 draws upon Parts 4 and 6; and Part 4
combines the contents of the existing Parts 7 and 8. The existing level of guidance is maintained, but the detail in some aspects has been reduced and the advice made less verbose. Part 5 is not seen as “guidance” in the same sense as the other
parts, but as a usable exemplar of a key element in the overall framework.

Process Models in the Revision


Two different classes of process models are identified in the revised framework. These have
been implicitly recognised in the current version, but the decision to rely on external sources
for process descriptions has made the difference clearer and more explicit. The two classes
are:
1. Process Reference Models: The purpose of these models is to provide the
descriptions of the process entities to be evaluated - to define what is to be
measured. Process Reference Models are in a very real sense standards, in that
they provide a common terminology and description of scope for process
assessment.
2. Process Assessment Models: The purpose of these models is to support the
conduct of an assessment. Process Assessment Models may have significant
differences in structure and content, but can be referenced to a common source
(a Process Reference Model), providing a mechanism for harmonisation between
different approaches to assessment.
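
As an informal illustration of this distinction (the class and attribute names below are the author's own, not text from the Standard), the two kinds of model might be sketched as follows.

```python
from dataclasses import dataclass
from typing import Dict, List

# Informal illustration of the two classes of model; not normative text from ISO/IEC 15504.

@dataclass
class ProcessDescription:
    """A process in a Process Reference Model, defined only by purpose and outcomes."""
    name: str
    purpose: str
    outcomes: List[str]

@dataclass
class ProcessReferenceModel:
    """Defines WHAT is to be measured: common terminology and scope for assessment."""
    source: str                              # e.g. "ISO/IEC 12207 Amd 1"
    processes: Dict[str, ProcessDescription]

@dataclass
class ProcessAssessmentModel:
    """Supports the CONDUCT of an assessment; structure and content may vary."""
    reference_model: ProcessReferenceModel   # the common source it maps back to
    indicators: Dict[str, List[str]]         # assessment indicators for each process
```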
In ISO/IEC TR 15504 there is a requirement for the selected assessment model to be
compatible with the "reference model" contained in ISO/IEC TR 15504-2. The effect of this is
to limit the scope of ISO/IEC TR 15504-conformant assessments to processes associated
with the software life cycle. With the development of the new standard for system life cycle
processes (ISO/IEC 15288) and assessment approaches covering Systems Engineering,
such as the CMM Integration project [17], expansion of this scope was essential. This led to
a decision to rely on external sources for definitions of processes within the SC7 terms of
reference. Accordingly, there are two different classes of requirement: compliance
requirements for suitable Process Reference Models, and conformance requirements for
Process Assessment Models.
Conformity is fulfilment by a product, process or service of specified requirements [21].
Compliance, within the context of international standardisation, is defined as "adherence to
those requirements contained in standards and technical reports which specify requirements
to be fulfilled by other standards, technical reports or ISPs³ (e.g. reference models and methodologies)" [18]. To a significant extent, this mandates a degree of compatibility
between standards employed for process assessment, providing defined rules where
previously there was only a general agreement.

Figure 4 - ISO/IEC 15504 requirements and the Assessment Model
(The figure shows ISO/IEC 15504-2 defining the Measurement Framework together with requirements for conformity, which determine the applicability of Process Assessment Models produced by model/method developers, and requirements for compliance, which determine the suitability of Process Reference Models.)


As shown in Figure 4, a Process Reference Model provides the basis for conformance of any
Process Assessment Model. The reference model contributes the overall definitions of the
processes within the scope of the Process Assessment Model. Deriving requirements for the definition of processes is an essential element of the development of the framework. The requirements for compliance and conformance of reference models represent
the principal new material required for the standard.
Conventionally, in most relevant standards, a process is seen as a set of activities or tasks,
converting inputs into outputs – see for example, definitions in ISO 9000 and ISO/IEC 12207.
This definition, however, is not generally suited for the assessment of process capability;
there may be many different sets of distinct activities that achieve the same transformation.
For the purpose of assessing process capability, it is of more value to explore the purpose for
which the process is employed. Implementing a process results in the achievement of a
number of observable outcomes, which together demonstrate achievement of the process
purpose.
The purpose may be expressed in terms of the transformation of inputs into outputs; it may
involve elaboration, evaluation, or even some change of state of the inputs. This approach is
used to specify processes in a Process Reference Model; the model contains a set of processes described in terms of their purpose and the outcomes resulting from implementation.

³ International Standardized Profile. An ISP is an internationally agreed-to, harmonized document which identifies a standard or group of standards, together with options and parameters, necessary to accomplish a function or set of functions.
The requirement that processes are described in terms of Process Purpose and Process
Outcomes is the critical innovation in ISO/IEC 15504. It provides for a form of definition that
is independent of implementation concerns, and that focuses on the results of process
performance. The use of this approach has been endorsed by SC7 for general use across
all of its process-oriented standards; as a result, all of these Standards are effectively
harmonised with ISO/IEC 15504. In particular, the two central life-cycle standards – ISO/IEC
12207, Software Life Cycle Processes (Amd 1) [13], and ISO/IEC 15288, Systems Life Cycle
Process [14] – formally constitute Process Reference Models for the purposes of ISO/IEC
15504.
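
A hedged example may make this style of definition concrete. The wording below paraphrases the purpose-and-outcomes form of a typical software construction process; it is illustrative only and is not quoted from ISO/IEC 12207 or ISO/IEC 15288.

```python
# Illustrative only: the wording paraphrases the purpose/outcome style of
# definition and is NOT quoted from ISO/IEC 12207 or ISO/IEC 15288.
software_construction = {
    "purpose": "to produce executable software units that properly reflect the software design",
    "outcomes": [
        "verification criteria are defined for the software units",
        "software units defined by the design are produced",
        "consistency with the design and requirements is established",
        "results of unit verification are recorded",
    ],
}

# An assessor judges whether these outcomes are achieved; how the units are built
# (activities, tasks, tools) is left to the implementing organization.
for outcome in software_construction["outcomes"]:
    print("-", outcome)
```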
Other forms of process model may describe sets of activities or other elements that result in
achievement of the purpose; these are outside the concern of ISO/IEC 15504. A Process
Assessment Model describes processes in terms of the evidence that may be identified to demonstrate that the process has in fact been implemented; such models generally comprise sets of
practices and descriptions of work products that serve as indicators of process performance
and process capability.
As shown in Figure 5, there can be multiple Process Assessment Models for each accepted
Process Reference Model. The requirements for conformance of Process Assessment
Models in the revised Standard are based upon the existing requirements in ISO/IEC TR
15504-2, but have been re-worked in the light of experience. Conformance is on the basis of
relationships between the Process Assessment Model and both the external Process
Reference Model and the in-built measurement framework of ISO/IEC 15504.

Figure 5: Process Models in the revised framework
(Each Process Reference Model - for example ISO/IEC 12207, ISO/IEC 15288 or ISO 18529 - may be supported, subject to the compliance and conformity requirements, by one or more conformant Process Assessment Models.)


Assessment models - as in the existing framework - will be developed independently by
model and method developers, in conformity with the requirements of the Standard. The
generalisation of the framework, with the extension to cover additional Process Reference
Models, provides more opportunities for model developers, provided suitable Process Reference Models exist. With ISO/IEC 15504 providing the criteria for, but not defining, the
processes that may be assessed, the requirements for the resulting Process Assessment
Models will be sufficient to ensure that any process from an acceptable Process Reference
Model can be measured. As with the current Technical Report, the new ISO/IEC 15504
contains an exemplar Process Assessment Model, with a scope equivalent to that of the
Process Reference Model defined in ISO/IEC 12207, Amd 1.
There is a potential for significant multiplicity of different Process Reference Models if the
status of these key entities is not controlled in some manner. It should also be noted that the
requirements for conformance are of necessity more complex than in the Technical Report.
Clear distinction is made between the mapping requirements to the Process Reference
Model(s) and those to the measurement framework elements. The availability of different Process Reference Models also has implications for the comparability of assessment
results; in general, results are comparable only for processes within the same Process
Reference Model.
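
In practical terms, establishing conformance involves checking that every indicator in a Process Assessment Model maps either to outcomes of a process in the declared Process Reference Model or to elements of the measurement framework. The sketch below is the author's illustration of such a check, using invented identifiers; it is not the procedure defined in Part 2.

```python
from typing import Dict, List, Set

# Author's illustration of a conformance-style mapping check; not the procedure
# defined in ISO/IEC 15504-2.

def check_mapping(pam_indicators: Dict[str, List[str]],
                  prm_outcomes: Dict[str, Set[str]],
                  measurement_elements: Set[str]) -> List[str]:
    """Report indicators whose declared mappings point outside the Process
    Reference Model or the measurement framework."""
    problems = []
    for indicator, targets in pam_indicators.items():
        for target in targets:
            in_prm = any(target in outcomes for outcomes in prm_outcomes.values())
            in_mf = target in measurement_elements
            if not (in_prm or in_mf):
                problems.append(f"{indicator}: unmapped target '{target}'")
    return problems

# Toy data only - identifiers are invented for the example.
prm = {"ENG.1": {"outcome 1", "outcome 2"}}
mf = {"PA 2.1", "PA 2.2"}
pam = {"indicator A": ["outcome 1", "PA 2.1"], "indicator B": ["outcome 9"]}
print(check_mapping(pam, prm, mf))   # -> ["indicator B: unmapped target 'outcome 9'"]
```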

Measurement Framework
In the course of the revision of ISO/IEC 15504, there has been a detailed review and revision
of the definitions of the Process Attributes in the Capability Dimension. As shown in Figure 4
(above), the measurement framework remains a core element of ISO/IEC 15504, and is seen as applicable across multiple process domains. In the early drafts of the revision,
the whole intent and structure of the original Capability Dimension from the Technical Report
was retained, although there were several changes to increase clarity.
Following comments from the UK National Body and other sources in the course of balloting,
an Other Working Group with broad international participation was established to undertake
a full review of the Measurement Framework, with explicit attention being paid to issues of
harmonisation with ISO 9001: 2000 [19]. The general structure of the restructured
framework is given in Table 4, along with the original structure from the Technical Report. It
can be seen that the changes at the top level of the framework are limited: one of the
attributes at Level 3, and both of the attributes at Level 5, have been renamed. At a lower
level of detail, however, the changes are much more significant.
The major alterations in the details of the measurement framework are at Capability Levels 2
and 3. At Level 2, a much greater level of detail has been incorporated, and more formal
traceability to ISO 9001:2000 has been incorporated. At Level 3, although the overall
capabilities associated with the level have not changed, a different perspective on the
distribution of attributes has been adopted. The concepts for Level 4 and Level 5 capability
are generally unchanged, though the specifications have been modified to improve clarity
and understanding.

Table 4: Revised Measurement Framework


Level 1 – Performed Process
  Technical Report: PA 1.1 Process performance attribute
  International Standard: PA 1.1 Process performance attribute
Level 2 – Managed Process
  Technical Report: PA 2.1 Performance management attribute; PA 2.2 Work product management attribute
  International Standard: PA 2.1 Performance management attribute; PA 2.2 Work product management attribute
Level 3 – Established Process
  Technical Report: PA 3.1 Process definition attribute; PA 3.2 Process resource attribute
  International Standard: PA 3.1 Process definition attribute; PA 3.2 Process deployment attribute
Level 4 – Predictable Process
  Technical Report: PA 4.1 Process measurement attribute; PA 4.2 Process control attribute
  International Standard: PA 4.1 Process measurement attribute; PA 4.2 Process control attribute
Level 5 – Optimising Process
  Technical Report: PA 5.1 Process change attribute; PA 5.2 Continuous improvement attribute
  International Standard: PA 5.1 Process innovation attribute; PA 5.2 Process optimization attribute
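
The International Standard column of Table 4 can be captured compactly as a data structure, and a capability level derived from a set of attribute ratings. The sketch below assumes the N/P/L/F rating scale and the level-derivation rule familiar from the Technical Report (attributes at the level at least Largely achieved, all lower-level attributes Fully achieved); it is an illustration, not a restatement of the revised normative text.

```python
# Capability levels and process attributes of the revised Measurement Framework
# (the International Standard column of Table 4).
ATTRIBUTES = {
    1: ["PA 1.1 Process performance"],
    2: ["PA 2.1 Performance management", "PA 2.2 Work product management"],
    3: ["PA 3.1 Process definition", "PA 3.2 Process deployment"],
    4: ["PA 4.1 Process measurement", "PA 4.2 Process control"],
    5: ["PA 5.1 Process innovation", "PA 5.2 Process optimization"],
}

# Assumed for illustration: the N/P/L/F rating scale and level-derivation rule of
# the Technical Report (a level is achieved when its attributes are at least Largely
# achieved and all lower-level attributes are Fully achieved).
def capability_level(ratings: dict) -> int:
    level = 0
    for lvl in sorted(ATTRIBUTES):
        lower_fully = all(ratings.get(a) == "F"
                          for l in range(1, lvl) for a in ATTRIBUTES[l])
        here = [ratings.get(a, "N") for a in ATTRIBUTES[lvl]]
        if lower_fully and all(r in ("L", "F") for r in here):
            level = lvl
        else:
            break
    return level

print(capability_level({"PA 1.1 Process performance": "F",
                        "PA 2.1 Performance management": "L",
                        "PA 2.2 Work product management": "F"}))   # prints 2
```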

Table 5 shows the changes in the attributes at Level 2. It can be seen that the description of
Performance Management is significantly more detailed, with 6 distinct characteristics
identified, against 4 in the Technical Report. Characteristics for Work Product Management
have been clarified, removing explicit reference to "dependencies" and making terminology
consistent with usage in ISO 9001.

Table 5 – Characteristics for Capability Level 2


PA 2.1 Performance management
  Technical Report:
  a. the objectives for the performance of the process will be identified (e.g. quality, time-scale, cycle time and resource usage);
  b. the responsibility and authority for developing the work products of the process will be assigned;
  c. the performance of the process will be managed to produce work products that meet the defined objectives.
  International Standard:
  a. objectives for the performance of the process are identified;
  b. performance of the process is planned and monitored;
  c. performance of the process is adjusted to meet plans;
  d. responsibilities and authorities for performing the process are defined, assigned and communicated;
  e. resources and information necessary for performing the process are identified, made available, allocated and used;
  f. interfaces between the involved parties are managed to ensure both effective communication and also clear assignment of responsibility.

PA 2.2 Work product management
  Technical Report:
  1. the requirements (functional and non-functional) of the specified work products of the process will be defined;
  2. the requirements for the documentation and control of the work products will be defined;
  3. the dependencies among the controlled work products will be identified;
  4. work products will be appropriately identified and documented, and changes will be controlled;
  5. the work products will be verified and adjusted to meet the defined requirements.
  International Standard:
  1. requirements for the work products of the process are defined;
  2. requirements for the documentation and control of the work products are defined;
  3. work products are appropriately identified, documented, and controlled;
  4. work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.

Level 3 attributes are shown in Table 6. In the Technical Report, the distribution of
characteristics was on the basis that one attribute addressed the "process" elements
(procedures, etc) in relation to both the definition of a standard process and its deployment,
while the other addressed the definition and deployment of resources and infrastructure. In
the new draft, PA3.1 is associated with the existence and availability of a set of standard
process assets (including resources and infrastructure), while PA3.2 addresses the
deployment of these assets as a "defined process". The set of characteristics distributed across the two attributes is basically the same. Terminology has been clarified and made more consistent.

Table 6 – Characteristics for Capability Level 3


PA 3.1 Process definition
  Technical Report:
  1. a standard process including appropriate guidance on tailoring will be defined, that supports the execution of the managed process;
  2. performance of the process will be conducted in accordance with appropriately selected and/or tailored standard process documentation;
  3. historical process performance data will be gathered to establish and refine the understanding of the process behaviour (e.g. in order to estimate the process performance resource needs);
  4. experiences of using the defined process will be used to refine the standard process.
  International Standard:
  1. a standard process, including appropriate tailoring guidelines, is defined that describes the fundamental elements that must be incorporated into a defined process;
  2. the sequence and interaction of the standard process with other processes is determined;
  3. required competencies and roles for performing a process are identified as part of the standard process;
  4. required infrastructure and work environment for performing a process are identified as part of the standard process;
  5. suitable methods for monitoring the effectiveness and suitability of the process are determined.

PA 3.2 Process resource (Technical Report) / Process deployment (International Standard)
  Technical Report:
  1. roles, responsibilities and competencies required for performing the process will be identified and documented;
  2. the process infrastructure required for performing the process will be identified and documented;
  3. the required resources will be available, allocated and used to support the performance of the defined process.
  International Standard:
  1. a defined process is deployed based upon an appropriately selected and/or tailored standard process;
  2. required roles, responsibilities and authorities for performing the defined process are assigned and communicated;
  3. personnel performing the defined process are competent on the basis of appropriate education, training, and experience;
  4. required resources and information necessary for performing the defined process are made available, allocated and used;
  5. required infrastructure and work environment for performing the defined process are made available, managed and maintained;
  6. appropriate data are collected and analysed as a basis for understanding the behaviour of, and to demonstrate the suitability and effectiveness of, the process, and to evaluate where continuous improvement of the process can be made.

At higher levels of capability, the process characteristics described are essentially unchanged, although further clarification of terminology has been adopted. Tables 7 and 8 show the characteristics for Capability Levels 4 and 5 respectively. At Level 4, the description of PA 4.2 has been expanded significantly, but without change to its intent. The description of PA 4.1 has been clarified, and the links to concepts of statistical process control made more explicit.

Table 7 – Characteristics for Capability Level 4


PA 4.1 Measurement (Technical Report) / Process measurement (International Standard)
  Technical Report:
  a. product and process goals and measures will be identified in line with relevant business goals;
  b. product and process measures will be collected to monitor the extent to which the defined goals are met;
  c. process performance trends across the organization will be analyzed;
  d. process capability will be measured and maintained across the organization.
  International Standard:
  a. process information needs in support of relevant defined business goals are established;
  b. process measurement objectives are derived from process information needs;
  c. quantitative objectives for process performance in support of relevant business goals are established;
  d. measures and frequency of measurement are identified and defined in line with process measurement objectives and quantitative objectives for process performance;
  e. results of measurement are collected, analysed and reported in order to monitor the extent to which the quantitative objectives for process performance are met;
  f. measurement results are used to characterise process performance.

PA 4.2 Process control
  Technical Report:
  a. suitable analysis and control techniques will be identified;
  b. in-process product and process measures will be collected and analyzed to support control of process performance within defined limits;
  c. process performance will be managed quantitatively.
  International Standard:
  a. analysis and control techniques are determined and applied where applicable;
  b. control limits of variation are established for normal process performance;
  c. measurement data are analysed for special causes of variation;
  d. corrective actions are taken to address special causes of variation;
  e. control limits are re-established (as necessary) following corrective action.

At Capability Level 5, the characteristics described are essentially the same; however, the
order and names of the attributes have been changed. Again, concepts have been clarified
and the linkage to statistical process control concepts made more explicit.

Table 8 – Characteristics for Capability Level 5


PA 5.1 Process change (Technical Report) / Process innovation (International Standard)
  Technical Report:
  a. the impact of all proposed changes will be assessed against the defined product and process goals of the defined and standard processes;
  b. the implementation of all agreed changes will be managed to ensure that any disruption to the process performance is understood and acted upon;
  c. the effectiveness of process change on the basis of actual performance will be evaluated against the defined product and process goals and adjustments made as needed.
  International Standard:
  a. process improvement objectives for the process are defined that support the relevant business goals;
  b. appropriate data are analysed to identify common causes of variations in process performance;
  c. appropriate data are analysed to identify opportunities for best practice and innovation;
  d. improvement opportunities derived from new technologies and process concepts are identified;
  e. an implementation strategy is established to achieve the process improvement objectives.

PA 5.2 Continuous improvement (Technical Report) / Process optimization (International Standard)
  Technical Report:
  a. the process improvement goals for the process will be defined that support the relevant business goals of the organization;
  b. the sources of real and potential problems will be identified;
  c. improvement opportunities will be identified;
  d. an implementation strategy will be established and deployed to achieve the process improvement goals across the organization.
  International Standard:
  a. impact of all proposed changes is assessed against the objectives of the defined process and standard process;
  b. implementation of all agreed changes is managed to ensure that any disruption to the process performance is understood and acted upon;
  c. effectiveness of process change on the basis of actual performance is evaluated against the defined product requirements and process objectives to determine whether results are due to common or special causes.

The impact of the changes to the measurement framework should be a significant improvement in the clarity of the attribute descriptions, resulting in improved consistency of rating and better reliability of the assessment process. With the use of more consistent terminology, it should also be simpler to establish the conformance of new assessment models (for example, those developed by the CMMI Development Project [17]).

Transition from the Technical Report


The Working Group has considered the issue of transition from the existing Technical Report
to the new International Standard. At its meeting in March 2003, the following
recommendations were developed:
1. It is recommended that assessment approaches that support conformance with the requirements of ISO/IEC TR 15504 commence the transition to supporting conformance with the requirements of ISO/IEC FDIS 15504-2.
2. The transition should be implemented as soon as possible after approval for publication
of ISO/IEC 15504-2. The requirements include the use of the new measurement
framework for process capability.
3. Assessment approaches that are currently using the exemplar assessment model in
ISO/IEC TR 15504-5 may use the process dimension in ISO/IEC TR 15504-2 as the
basis for a Process Reference Model (PRM) and use ISO/IEC TR 15504-5 as the basis
for a Process Assessment Model (PAM) in relation to the PRM.
4. It is recommended that assessment approaches that are currently using the exemplar
assessment model in ISO/IEC TR 15504-5 commence the transition to using the exemplar Process Assessment Model in ISO/IEC 15504-5, in conjunction with the Process Reference Model in ISO/IEC 12207:1995 Amd 1, as soon as ISO/IEC 15504-5 is distributed for CD ballot.
5. It is noted that the exemplar Process Assessment Model (ISO/IEC 15504-5) is expected to be published early in 2005.

Summary and Conclusion


As of June 2003, the development of ISO/IEC 15504 is proceeding towards finalisation. Part
2 of the International Standard has been approved for publication, and two of the remaining
parts of the standard – Parts 3 and 4 - are at the stage of Final Draft International Standard.
Parts 1 and 5 were delayed in development because of dependencies on the other material;
both are currently registered as Committee Draft. The normative element of the Standard
(Part 2) will be published in 2003, while the current schedule indicates that the complete
document set will be published and available early in 2005.
The development of ISO/IEC 15504 has drawn together the best of international expertise in
process assessment, and through the synergy of these relationships it has led to significant
advances in the state of the art and in the theoretical underpinning for process assessment
[20]. New techniques for conducting assessments are emerging, and ISO/IEC 15504
provides a framework within which they can be validated and evaluated for their benefits to
the industry. The overall impact of the work has been a significant boost to the improvement
of software engineering practice internationally.

References
1. ISO/IEC 15504 – Information Technology – Process Assessment, Parts 1 – 5 (to be
published)
2. ISO/IEC TR 15504:1998 – Information Technology – Software Process Assessment,
Parts 1 – 9
3. ISO/IEC JTC1, Procedures for the technical work of ISO/IEC JTC 1 on Information
Technology, 2002.
4. ISO/IEC JTC1/SC7, N1939, 1998 Resolutions: Resolution 516. JTC1/SC7 N1939,
12 June 1998.
5. ISO 9000: 1994, Quality Management Systems - Guidelines for Selection and Use
6. ISO/IEC 12207: 1995, Information Technology - Software Life Cycle Processes.
7. F. Maclennan, G. Ostrolenk and M. Tobin, "Introduction to the SPICE Trials", in K. El
Emam, J.-N. Drouin and W. Melo (editors), SPICE - The Theory and Practice of Software
Process Improvement and Capability Determination, 1997, pp. 269-286.
8. M.C. Paulk, B. Curtis, M.B Chrissis, and C.V. Weber, Capability Maturity Model for
Software, Version 1.1. Report CMU/SEI-93-TR-24, Software Engineering Institute,
Pittsburgh, February 1993.
9. F. Coallier, and J.-N. Drouin, "Developing an Assessment Method for Telecom Software
System: an Experience Report", European Conference on Software Quality, 1992.
10. P. Kuvaja, J. Similä, L. Kranik, A. Bicego, S. Saukkonen, and G. Koch, Software Process
Assessment and Improvement: The Bootstrap Approach, Blackwell, 1994.
11. Compita Ltd, Process Professional Process Portfolio, Process Professional Library
Services, 1996.
12. ISO/IEC JTC1/SC7, The Need and Requirements for a Software Process Assessment
Standard, Study Report, Issue 2.0, JTC1/SC7 N944R, 11 June 1992.
13. ISO/IEC 12207: 1995, Information Technology - Software Life Cycle Processes,
Amd 1:2000.
14. ISO/IEC 15288: 2002, Information Technology - System Life Cycle Processes.
15. ISO TR 18529: 2000, Ergonomics -- Ergonomics of human-system interaction -- Human-
centred lifecycle process descriptions.
16. ISO/IEC JTC1, Proposed Modifications to the JTC 1/SC 7 Programme of Work (Revision
of ISO/IEC TR 15504: 1998), JTC1 N5848, 3 Aug 1999.
17. CMMI® Product Development Team, CMMI®-SE/SW, V1.0: Capability Maturity Model® – Integrated for Systems Engineering/Software Engineering, Version 1.0, SEI, 2000.
18. ISO/IEC, ISO/IEC Directives - Part 3: Rules for the structure and drafting of International
Standards, Third edition, 1997.
19. ISO 9001:2000, Quality management systems – Requirements.
20. Ho-Won Jung, Robin Hunter, Dennis R. Goldenson and Khaled El-Emam, "Findings from
Phase 2 of the SPICE Trials", Softw. Process Improve. Pract. 2001; 6: 205–242
21. ISO/IEC Guide 2, 1996, Standardization and related activities – General Vocabulary
22. Dorling, A. "SPICE: Software Process Improvement and Capability dEtermination".
Information and Software Technology, 35(6/7): 404-406, June/July 1993.
23. T.P. Rout, “The SPICE Project: Past, Present and Future”, invited keynote address,
Software Process ‘96, Brighton, December 1996.
24. Rout, T.P. and P.G. Simms, “Introduction to the SPICE Documents & Architecture”, in K.
El Emam, J-N Drouin and W. Melo: SPICE: The Theory and Practice of Software Process
Improvement and Capability Determination, IEEE Computer Society Press 1998.
