
NOVA: University of Newcastle Research Online
nova.newcastle.edu.au

Succar, B., Sher, W. D., & Williams, A. P. (2012). Measuring BIM performance: Five metrics. Originally published in Architectural Engineering and Design Management, 8(2), 120-142. Available from: http://dx.doi.org/10.1080/17452007.2012.659506

This is an electronic version of an article published in Architectural Engineering and Design Management, Vol. 8, Issue 2, pp. 120-142. Architectural Engineering and Design Management is available online at: http://www.tandfonline.com/openurl?genre=article&issn=1745-2007&volume=8&issue=2&spage=120

Accessed from: http://hdl.handle.net/1959.13/933931


Measuring BIM Performance: Five Metrics
Bilal Succar
School of Architecture and Built Environment,
University of Newcastle,
Callaghan Campus
NSW 2308, Australia.

Willy Sher
School of Architecture and Built Environment,
University of Newcastle,
Callaghan Campus
NSW 2308, Australia.

Anthony Williams
School of Architecture and Built Environment,
University of Newcastle,
Callaghan Campus
NSW 2308, Australia.

Correspondence should be addressed to:


Bilal Succar
School of Architecture and Built Environment,
University of Newcastle,
Callaghan Campus
NSW 2308, Australia.
Tel: + 61 412 556 671
Fax: + 61 39515 0031
Email: [email protected]

Abstract

The term Building Information Modelling (BIM) refers to an expansive knowledge domain within the Design, Construction and Operation (DCO) industry. The numerous possibilities attributed to BIM represent an array of challenges that can be met through a systematic research and delivery framework spawning a set of performance assessment and improvement metrics. This paper identifies five complementary components specifically developed to enable such assessment: [1] BIM Capability Stages, representing transformational milestones along the implementation continuum; [2] BIM Maturity Levels, representing the quality, predictability and variability within BIM Stages; [3] BIM Competencies, representing incremental progressions towards and improvements within BIM Stages; [4] Organisational Scales, representing the diversity of markets, disciplines and company sizes; and [5] Granularity Levels, enabling highly targeted yet flexible performance analyses ranging from informal self-assessment to high-detail, formal organisational audits. This paper explores these complementary components and positions them as a systematic method to understand BIM performance and to enable its assessment and improvement. Figure 1 provides a flowchart of the contents of this paper.

Keywords: Building Information Modelling, Performance Assessment and Improvement, Capability and Maturity Models

Insert Figure 1 here

1. A brief introduction to BIM

Building Information Modelling (BIM) is a term that is used by different authors in many different ways. The nuances between their definitions highlight the rapid growth the area has experienced, as well as the potential for confusion to arise when ill-defined terminology is used to communicate specific meanings. In the context of this paper, BIM refers to a set of interacting policies, processes and technologies (illustrated in Figure 2) that generate a “methodology to manage the essential building design and project data in digital format throughout the building’s life-cycle” (Penttilä, 2006). It is important to identify the knowledge structures, internal dynamics and implementation requirements of BIM if confusion and duplication of effort are to be avoided.

Insert Figure 2 here

1.1 Some indicators of the proliferation of BIM

There are many signs that the use of BIM tools and processes is reaching a tipping point in some markets (Keller, Gerjets, Scheiter, & Garsoffky, 2006; McGraw-Hill, 2009). For example, in the USA an increasing number of large institutional clients now require object-based 3D models to be provided as part of tender submissions (Alison, Eugene, & Garry, 1997). Furthermore, the UK Cabinet Office has recently published a construction strategy paper that requires the submission of a “fully collaborative 3D BIM (with all project and asset information, documentation and data being electronic) as a minimum by 2016” (BIS, 2011; UKCO, 2011 p. 14). Other signs include the abundance of BIM-specific software tools, books, new media tools and reports (M. J. Eppler & Platts, 2009).

1.2 Issues arising from the proliferation of BIM

Notwithstanding the much-touted benefits of BIM as a means of increasing productivity, there are currently few metrics that measure such improvements. Furthermore, little guidance is available for organisations wishing to generate new or enhance their existing BIM deliverables. Those wishing to adopt BIM or identify and/or prioritise their requirements are thus left to their own devices. The implementation of any new technology is fraught with challenges, and BIM is no exception. In addition, those implementing BIM frequently expect to realise significant benefits and productivity gains whilst they are still inexperienced users. Successful implementation of these systems requires an appreciation of how BIM resources (including hardware, software and the technical and management skills of staff) need to evolve in harmony with each other. The multiple and varied understandings that practitioners have of BIM further compound the difficulties they experience. When the unforeseen happens, the risks, costs and difficulties associated with implementing BIM increase. In such circumstances compromises are likely to be made, leading, in turn, to users’ expectations not being met.

1.3 The need for BIM performance metrics

BIM use needs to be assessable if the productivity improvements that result from its implementation are to be made apparent. Without such metrics, teams and organisations are unable to consistently measure their own successes and/or failures. Performance metrics enable teams and organisations to assess their own competencies in using BIM and, potentially, to benchmark their progress against that of other practitioners. Furthermore, robust sets of BIM metrics lay the foundations for formal certification systems, which could be used by those procuring construction projects to pre-select BIM service providers.

1.4 Developing BIM metrics and benchmarks

Whilst it is important to develop metrics and benchmarks for BIM performance assessment, it is equally important that these metrics are accurate and able to be adapted to different industry sectors and organisations. Considerable insight can be gained from the performance measurement tools developed for other industries, but it would be foolhardy to rely on any tool which is not designed for the specific requirements of the task in question. Tools intended to measure key BIM deliverables/requirements across the construction supply chain are no exception.

This paper describes a set of metrics purposefully developed to measure the specifics of BIM performance. To increase their reliability, adoptability and usability for different stakeholders, the first-named author identified the following performance criteria. The metrics should be:

Accurate: well-defined and able to measure performance at high levels of precision.
Applicable: able to be utilised by all stakeholders across all phases of a project’s lifecycle.
Attainable: achievable if defined actions are undertaken.
Consistent: yield the same results when conducted by different assessors.
Cumulative: set as logical progressions; deliverables from one act as prerequisites for another.
Flexible: able to be performed across markets, organisational scales and their subdivisions.
Informative: provide “feedback for improvement” and “guidance for next steps” (Nightingale & Mize, 2002 p. 19).
Neutral: not biased towards proprietary, non-proprietary, closed, open, free or commercial solutions or schemata.
Specific: serve the specific requirements of the construction industry.
Universal: apply equally across markets and geographies.
Usable: intuitive and able to be easily employed to assess BIM performance.

This paper describes the development of a set of BIM performance metrics based on these guiding principles. It introduces a set of complementary knowledge components which enable BIM performance assessment and facilitate its improvement.

2. Research design

The investigations described in this paper are part of a larger PhD study which addresses the question of how to represent BIM knowledge structures and provide models that facilitate the implementation of BIM in academic and industrial settings. It is grounded in a set of paradigms, theories, concepts and experiences which combine to form the view of the BIM domain reported here.

2.1 Conceptual Background

According to Maxwell (2005), the conceptual background underpinning a study such as this is typically based on several sources, including previous research and existing theories, the researcher’s own experiential knowledge, and thought experiments. Various theories, including systems theory (Ackoff, 1971; Chun, Sohn, & Granados, 2008), systems thinking (Chun, et al., 2008), diffusion of innovation theory (Fox & Hietanen, 2007; Mutai, 2009; Rogers, 1995), technology acceptance models (Davis, 1989; Venkatesh & Davis, 2000) and complexity theory (Froese, 2010; Homer-Dixon, 2001), assisted in analysing the BIM domain and enriched the study’s conceptual background. Constraints identified in these theories led to the development of a new theoretical framework based on an inductive approach “[more suitable for researchers who are more concerned about] the correspondence of their findings to the real world than their coherence with existing theories or laws” (Meredith, Raturi, Amoako-Gyampah, & Kaplan, 1989 p. 307).

2.2 Methodology and Validation

The five components of BIM performance measurement are some of the deliverables of the BIM Framework developed after assessing numerous publicly-available international guidelines (Succar, 2009). The Framework itself is composed of a number of high-level concepts which interact to generate a set of guides and tools necessary to [i] facilitate BIM implementations, [ii] conduct BIM performance assessments, and [iii] generate multi-tiered educational curricula.

The theoretical underpinnings of the BIM Framework have been generated through a process of inductive inference (Michalski, 1987), conceptual clustering (Michalski & Stepp, 1987) and reflective learning (Van der Heijden & Eden, 1998; Walker, Bourne, & Shelley, 2008). Framework components were then represented visually through a series of ‘knowledge models’ to reduce topic complexity (Tergan, 2003) and facilitate knowledge transfer to others (M. Eppler & Burkhard, 2005).

Many of the BIM Framework’s components – Fields, Stages, Lenses, Steps, Competencies and several visual knowledge models – have been subjected to a process of validation through a series of international focus groups employing a mixed-model approach (Tashakkori & Teddlie, 1998). The results from these focus groups and their impact on the development of the five components of BIM performance measurement will be published separately.

3. The five components of BIM performance measurement

The first-named author identified five BIM Framework components as those required to enable accurate and consistent BIM performance measurement (Succar, 2010b). These are BIM Capability Stages, BIM Maturity Levels, BIM Competency Sets, Organisational Scales and Granularity Levels.

The following sections provide brief introductions to each component. They are followed by a step-by-step workflow which allows BIM Capability and Maturity assessments to be conducted.

3.1 BIM Capability Stages

BIM Capability is defined here as the basic ability to perform a task or deliver a BIM service/product. BIM Capability Stages (or BIM Stages) define the minimum BIM requirements – the major milestones that need to be reached by teams or organisations as they implement BIM technologies and concepts. Three BIM Stages separate ‘pre-BIM’, a fixed starting point representing industry status before BIM implementation, from ‘post-BIM’, a variable end-point representing the continually evolving goal of employing virtually integrated Design, Construction and Operation (viDCO) tools and concepts. (The term viDCO is used in preference to Integrated Project Delivery (IPD), which also denotes the ultimate goal of implementing BIM (AIA, 2007), to prevent confusion with the latter term’s evolving contractual connotations within the United States.) The stages are:

BIM Stage 1: object-based modelling
BIM Stage 2: model-based collaboration
BIM Stage 3: network-based integration

BIM Stages are defined by their minimum requirements. For example, to be considered as having achieved BIM Capability Stage 1, an organisation needs to have deployed an object-based modelling software tool similar to ArchiCAD, Revit, Tekla or Vico. Similarly, for BIM Capability Stage 2, an organisation needs to be engaged in a multidisciplinary ‘model-based’ collaborative project. To be considered at BIM Capability Stage 3, an organisation needs to be using a network-based solution which links to external databases and shares object-based models with at least two other disciplines – a solution similar to a model server or BIMSaaS solution (BIMserver, 2011; Onuma, 2011; Wilkinson, 2008).
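
These minimum requirements lend themselves to a simple, cumulative rule set: each stage presupposes the one before it. The sketch below is illustrative only; the attribute names are hypothetical and are not BIM Framework terms, which assess such abilities through the Competency Sets of Section 3.3:

```python
# Illustrative sketch: attribute names are hypothetical, not BIM Framework terms.

def capability_stage(org: dict) -> int:
    """Return the highest BIM Capability Stage whose minimum
    requirements are met; 0 denotes pre-BIM status."""
    stage = 0
    # Stage 1: an object-based modelling tool has been deployed
    if org.get("object_based_modelling_tool"):
        stage = 1
    # Stage 2: engaged in a multidisciplinary model-based collaborative project
    if stage >= 1 and org.get("model_based_collaboration"):
        stage = 2
    # Stage 3: a network-based solution linking to external databases and
    # sharing object-based models with at least two other disciplines
    if stage >= 2 and org.get("network_based_integration") \
            and org.get("disciplines_shared_with", 0) >= 2:
        stage = 3
    return stage

# Example: a firm modelling and collaborating, but not yet integrating
print(capability_stage({"object_based_modelling_tool": True,
                        "model_based_collaboration": True}))  # -> 2
```

Note that the rules are deliberately cumulative, mirroring the ‘Cumulative’ criterion of Section 1.4: the deliverables of one stage act as prerequisites for the next.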

Each of these three Capability Stages may be further subdivided into Competency Steps. What differentiates stages from steps is that stages are transformational or radical changes, while steps are incremental ones (Henderson & Clark, 1990; Taylor & Levitt, 2005). The collection of steps involved in working towards or within a BIM Stage (i.e. across the continuum from pre-BIM to post-BIM) is driven by the different prerequisites for, challenges within and deliverables of each BIM Stage. In addition to their type (the Competency Set they belong to – refer to Section 3.3), the following BIM Steps can also be identified according to their location on the continuum shown in Figure 3:

A Steps: from pre-BIM status leading to BIM Stage 1
B Steps: from BIM Stage 1 leading towards BIM Stage 2
C Steps: from BIM Stage 2 leading towards BIM Stage 3
D Steps: from BIM Stage 3 leading towards post-BIM

Insert Figure 3 here

3.2 BIM Maturity Levels

The term ‘BIM Maturity’ refers to the quality, repeatability and degree of excellence within a BIM Capability. Whilst ‘capability’ denotes a minimum ability (refer to Section 3.1), ‘maturity’ denotes the extent of that ability in performing a task or delivering a BIM service/product. BIM Maturity’s benchmarks are performance improvement milestones (or levels) that teams and organisations aspire to or work towards. In general, the progression from lower to higher levels of maturity indicates (i) improved control, resulting from fewer variations between performance targets and actual results, (ii) enhanced predictability and forecasting of whether cost, time and performance objectives will be reached, and (iii) greater effectiveness in reaching defined goals and setting new, more ambitious ones (Lockamy III & McCormack, 2004; McCormack, Ladeira, & Oliveira, 2008).
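
To illustrate indicator (i) with arithmetic, the variation between performance targets and actual results can be summarised as a mean relative deviation, where lower values indicate greater control. This is a generic illustration with invented figures, not a scoring rule defined by the BIM Maturity Index:

```python
from statistics import mean

def target_deviation(targets, actuals):
    """Mean absolute deviation of actual results from their targets,
    expressed as a fraction of the targets (lower = greater control)."""
    return mean(abs(a - t) / t for t, a in zip(targets, actuals))

# Invented cost figures (in $1,000s) for four projects
targets = [250, 400, 180, 520]
low_maturity  = [310, 520, 150, 660]  # large, erratic variations
high_maturity = [255, 410, 178, 530]  # small, predictable variations

print(round(target_deviation(targets, low_maturity), 3))   # 0.244
print(round(target_deviation(targets, high_maturity), 3))  # 0.019
```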

The concept of BIM Maturity has been adopted from the Software Engineering Institute’s (SEI) Capability Maturity Model (CMM) (SEI, 2008a), a process improvement framework initially intended as a tool to evaluate the ability of government contractors to deliver software projects. CMM originated in the field of quality management (Crosby, 1979) and was later developed for the benefit of the US Department of Defense (Hutchinson & Finnemore, 1999). Its successor, the more comprehensive Capability Maturity Model Integration (CMMI) (SEI, 2006a, 2006b, 2008c), continues to be developed and extended by the Software Engineering Institute, Carnegie Mellon University.

Several CMM variants exist for other industries (Succar, 2010a) but they are all, in essence, specialised frameworks that assist stakeholders to improve their capabilities (Jaco, 2004) and benefit from process improvements. Example benefits include increased productivity and Return On Investment (ROI) as well as reduced costs and post-delivery defects (Hutchinson & Finnemore, 1999).

Maturity models are typically composed of multiple maturity levels, or process improvement ‘building blocks’ or ‘components’ (Paulk, Weber, Garcia, Chrissis, & Bush, 1993). When the requirements of each level are satisfied, implementers can then build on established components to attempt ‘higher’ maturity. Although CMMs are not without their detractors (for example, Bach, 1994; Jones, 1994; Weinberg, 1993), research conducted in other industries has already identified a correlation between improved process maturity and business performance (Lockamy III & McCormack, 2004).

The ‘original’ software industry CMM, however, is not applicable to the construction industry. It does not address supply chain issues, and its maturity levels do not account for the different phases of the lifecycle of a construction project (Sarshar et al., 2000). Although other efforts, derived from CMM, focus on the construction industry (refer to Table 1), there is no comprehensive maturity model/index that can be applied to BIM, its implementation stages, players, deliverables or its effect on project lifecycle phases.

Insert Table 1 here

The CMMs listed in Table 1 are similar in structure and objectives but differ in conceptual depth, industrial focus, terminology and target audience. A common theme is how CMMs employ simple, experience-based classifications and benchmarks to facilitate continuous improvement within organisations. When analysed for their suitability as the basis of a BIM-specific maturity index, most are broad in approach and can collectively form a basis for a range of BIM processes, technologies and policies. However, none easily accommodates the size of organisations being monitored. Also, from a terminology standpoint, there is insufficient differentiation between the notion of capability (an ability to perform a task) and that of maturity (the degree of excellence in performing a task). This differentiation is critical when catering for staged BIM implementation as it responds to the disruptive and expansive nature of BIM.

To address the aforementioned shortcomings, the BIM Maturity Index (BIMMI) has been developed by analysing and then integrating these and other maturity models used across different industries. The BIMMI has been customised to reflect the specifics of BIM capability, implementation requirements, performance targets and quality management. It has five distinct levels: (a) Initial/Ad-hoc, (b) Defined, (c) Managed, (d) Integrated and (e) Optimised (Figure 4). Level names were chosen to reflect the terminology used in many maturity models, to be easily understandable by DCO stakeholders and to reflect increasing BIM maturity from ad-hoc practices to continuous improvement (Table 2).

Insert Figure 4 here

Insert Table 2 here
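
Since the five BIMMI levels form an ordered scale (consistent with the ‘1 or a’ to ‘5 or e’ columns of Table 2), they can be represented as an ordinal type. This is a minimal sketch; the numbering is implied by Table 2 rather than prescribed by the index itself:

```python
from enum import IntEnum

class BIMMILevel(IntEnum):
    """The five BIM Maturity Index levels, ordered so that comparisons
    reflect increasing maturity (see Table 2)."""
    INITIAL_AD_HOC = 1
    DEFINED = 2
    MANAGED = 3
    INTEGRATED = 4
    OPTIMISED = 5

# Maturity comparisons follow the ordering of the scale
assert BIMMILevel.DEFINED < BIMMILevel.INTEGRATED
```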

3.3 BIM Competency Sets

A BIM Competency Set is a hierarchical collection of individual competencies identified for the purposes of implementing and assessing BIM. In this context, the term competency reflects a generic set of abilities suitable for implementing as well as assessing BIM Capability and/or Maturity. Figure 5 illustrates how the BIM Framework generates BIM Competency Sets out of multiple Fields, Stages and Lenses (Succar, 2009).

Insert Figure 5 here

BIM Competencies are a direct reflection of BIM Requirements and Deliverables and can be grouped into three sets, namely Technology, Process and Policy:

Technology sets cover software, hardware and networks. For example, the availability of a BIM tool allows the migration from a drafting-based to an object-based workflow (a requirement of BIM Stage 1).
Process sets cover leadership, infrastructure, human resources and products/services. For example, collaboration processes and database-sharing skills are necessary to allow model-based collaboration (BIM Stage 2).
Policy sets cover contracts, regulations and research/education. For example, alliance-based or risk-sharing contractual agreements are prerequisites for network-based integration (BIM Stage 3).
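
This three-set grouping can be pictured as a small hierarchy in which each set holds headings and, beneath them, individual competencies. The slice below is abridged and hypothetical, populated only with the three examples just given; the actual headings and Competency Areas are those of Figure 6:

```python
# Abridged, hypothetical slice of the Competency Sets; only the three
# examples from the text are shown, each tagged with the BIM Stage it serves.
COMPETENCY_SETS = {
    "Technology": {
        "Software": [("object-based modelling tool deployed", "Stage 1")],
        "Hardware": [],
        "Networks": [],
    },
    "Process": {
        "Human Resources": [("database-sharing skills", "Stage 2")],
        "Leadership": [], "Infrastructure": [], "Products/Services": [],
    },
    "Policy": {
        "Contracts": [("alliance-based / risk-sharing agreements", "Stage 3")],
        "Regulations": [], "Research/Education": [],
    },
}
```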

Figure 6 provides a partial mind-map of BIM Competency Sets shown at Granularity Level 2 (for an explanation of Granularity Levels, please refer to Section 3.5):

Insert Figure 6 here

3.4 BIM Organisational Scales

To allow BIM performance assessments to respect the diversity of markets, disciplines and company sizes, an Organisational Scale (OScale) has been developed. The Scale can be used to customise assessment efforts and is depicted in Table 3.

Insert Table 3 here

3.5 BIM Granularity Levels

Competency Sets include a large number of individual competencies grouped under numerous headings (shown in Figure 6). To enhance BIM Capability and Maturity assessments and to increase their flexibility, a Granularity ‘filter’ with four Granularity Levels (GLevels) has been developed. Progression from lower to higher levels of granularity indicates an increase in (i) assessment breadth, (ii) scoring detail, (iii) formality and (iv) assessor specialisation.

Using higher-granularity levels (GLevels 3 or 4) exposes more detailed Competency Areas than lower-granularity levels (GLevels 1 or 2). This variability enables the preparation of several BIM performance measurement tools, ranging from low-detail, informal and self-administered assessments to high-detail, formal and specialist-led appraisals. Table 4 provides more information about the four Granularity Levels:

Insert Table 4 here

Granularity Levels increase or decrease the number of Competency Areas used for performance assessment. For example, the mind-map provided in Figure 6 reveals ten Competency Areas at GLevel 1 and thirty-six Competency Areas at GLevel 2. Also, at GLevels 3 and 4, the number of Competency Areas available for performance assessment increases dramatically, as shown in Figure 7.

Insert Figure 7 here

The partial mind-map shown in Figure 7 reveals many additional Competency Areas under GLevel 3, such as Data Storage and Data Exchange. At GLevel 4, the map reveals even more detailed Competency Areas, including Structured and Unstructured Data, which in turn branch into computable and non-computable components (Kong et al., 2005; Mathes, 2004; Fallon & Palmer, 2007).
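
One way to picture the Granularity ‘filter’ is as a depth cut through this hierarchy: assessing at GLevel n exposes the Competency Areas n levels below the root. A minimal sketch follows, assuming the hierarchy is stored as nested mappings; the node names are taken from Figures 6 and 7, but the structure shown is abridged:

```python
# Abridged slice of the competency hierarchy (node names follow Figures 6-7).
HIERARCHY = {
    "Technology": {                                               # GLevel 1
        "Software": {                                             # GLevel 2
            "Data Storage": {                                     # GLevel 3
                "Structured Data": {}, "Unstructured Data": {}},  # GLevel 4
            "Data Exchange": {
                "Structured Data": {}, "Unstructured Data": {}},
        },
    },
}

def areas_at_glevel(tree, glevel):
    """Return the Competency Areas exposed at a given Granularity Level,
    i.e. the node names found at that depth below the root."""
    if glevel == 1:
        return list(tree)
    return [area
            for subtree in tree.values()
            for area in areas_at_glevel(subtree, glevel - 1)]

print(areas_at_glevel(HIERARCHY, 3))  # ['Data Storage', 'Data Exchange']
```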

4. Applying the five assessment components

The aforementioned five complementary BIM Framework components (capability stages, maturity levels, competency sets, organisational scales and granularity levels) allow performance assessments to be conducted using combinations of these components. The guiding principles discussed in Section 1.4 all apply. To manage all possible configurations, a simple assessment and reporting workflow has been developed (Figure 8):

Insert Figure 8 here

The workflow shown in Figure 8 identifies the five steps needed to conduct a BIM performance assessment. Starting with an extensive pool of generic BIM Competencies – applicable across DCO disciplines and organisational sizes – assessors can first filter out non-applicable Competency Sets, conduct a series of assessments based on the remaining Competencies, and then generate appropriate Assessment Reports, as sketched below.
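
The following sketch paraphrases that five-step workflow in code. It is an assumption-laden outline for illustration: the filtering keys, scoring scale and report shape are invented, while the actual steps are those defined in Figure 8:

```python
# Illustrative outline of the assessment workflow in Figure 8; the data
# structures, filtering keys and report format are assumptions.

def assess_bim_performance(competency_pool, oscale, glevel, assess):
    """Filter the generic competency pool, assess the remaining
    competencies, and compile an assessment report."""
    # Steps 1-2: discard Competency Sets that do not apply to the chosen
    # Organisational Scale or sit deeper than the chosen Granularity Level
    applicable = [c for c in competency_pool
                  if oscale in c["scales"] and c["glevel"] <= glevel]
    # Steps 3-4: conduct the assessments (self, peer or consultant-led)
    scores = {c["name"]: assess(c) for c in applicable}
    # Step 5: generate the Assessment Report
    return {"oscale": oscale, "glevel": glevel, "scores": scores}

# Invented usage: a self-administered GLevel 1 'Discovery' assessment
pool = [{"name": "Software", "glevel": 1, "scales": {"O", "P"}},
        {"name": "Contracts", "glevel": 1, "scales": {"P"}}]
print(assess_bim_performance(pool, "O", 1, assess=lambda c: 3))
# -> {'oscale': 'O', 'glevel': 1, 'scores': {'Software': 3}}
```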

5. A Final Note

The five BIM Framework components briefly discussed in this paper provide a range of opportunities for DCO stakeholders to measure and improve their BIM performance. The components complement each other and enable highly targeted yet flexible performance analyses to be conducted. These range from informal self-assessments to highly detailed and formal organisational audits. Such a system of assessment can be utilised to standardise BIM implementation and assessment efforts, enable a structured approach to BIM education and training, and establish a solid base for a formal BIM certification process.

After scrutiny of a significant part of the BIM Framework through peer-reviewed publications and a series of international focus groups, the five components and other related assessment metrics are currently being extended and field-tested. Sample online tools (focusing on selected disciplines, at different granularities) are currently being formulated. All these form part of an ongoing effort to promote the establishment of an independent BIM certification body responsible for assessing and accrediting individuals, organisations and collaborative project teams. Subject to additional field-testing and tool calibration, the five components may be well placed to consistently assess, and by extension improve, BIM performance.

Acknowledgement

This paper draws on the first-named author’s PhD research at the University of Newcastle, School of Architecture and Built Environment (Australia). The first-named author wishes to acknowledge his supervisors Willy Sher, Guillermo Aranda-Mena and Anthony Williams for their continuous support.

References
Ackoff, R. L. (1971). Towards a System of Systems Concepts. Management Science, 17(11), 661-671.

AIA. (2007). Integrated Project Delivery: A Guide: AIA California Council.

Alison, O., Eugene, A., & Garry, K. (1997). Is an illustration always worth ten thousand words?
Effects of prior knowledge, learning style and multimedia illustrations on text comprehension.
International Journal of Instructional Media, 24(3), 227.

Arif, M., Egbu, C., Alom, O., & Khalfan, M. M. A. (2009). Measuring knowledge retention: a case
study of a construction consultancy in the UAE. Engineering, Construction and Architectural
Management, 16(1), 92-108.

Bach, J. (1994). The Immaturity of the CMM. American Programmer, 7, 13-13.

Bew, M., Underwood, J., Wix, J., & Storer, G. (2008). Going BIM in a Commercial World. Paper
presented at the EWork and EBusiness in Architecture, Engineering and Construction: European
Conferences on Product and Process Modeling (ECCPM 2008).

BIMserver. (2011). Open Source Building Information Modelserver. Retrieved October 20, 2011,
from http://bimserver.org/

BIS. (2011). A Report for the Government Construction Client Group, Building Information
Modelling (BIM) Working Party Strategy: Department for Business Innovation & Skills (BIS).

Chun, M., Sohn, K., & Granados, P. (2008). Systems Theory and Knowledge Management Systems:
The Case of Pratt-Whitney Rocketdyne.

Crawford, J. K. (2006). The Project Management Maturity Model. Information Systems Management,
23(4), 50-58.

Crosby, P. B. (1979). Quality is free: The art of making quality certain. New York: New American
Library.

Davis, F. D. (1989). Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Quarterly, 13(3), 319-340.

Doss, D. A., Chen, I. C. L., & Holland, L. D. (2008). A proposed variation of the capability maturity
model framework among financial management settings. Paper presented at the Allied Academies
International Conference, Tunica.

Eppler, M., & Burkhard, R. A. (2005). Knowledge Visualization. In D. G. Schwartz (Ed.), Encyclopedia of Knowledge Management (pp. 551-560): Idea Group Reference.

Eppler, M. J., & Platts, K. W. (2009). Visual Strategizing: The Systematic Use of Visualization in the
Strategic-Planning Process. Long Range Planning, 42(1), 42-74.

Fallon, K. K., & Palmer, M. E. (2007). General Buildings Information Handover Guide: Principles,
Methodology and Case Studies: NIST.

Fox, S., & Hietanen, J. (2007). Interorganizational use of building information models: potential for
automational, informational and transformational effects. Construction Management and Economics,
25(3), 289 - 296.

Froese, T. M. (2010). The impact of emerging information technology on project management for
construction. Automation in Construction, 19(5), 531-538.

Gillies, A., & Howard, J. (2003). Managing change in process and people: combining a maturity
model with a competency-based approach. Total Quality Management & Business Excellence, 14(7),
779 - 787.

Hardgrave, B. C., & Armstrong, D. J. (2005). Software process improvement: it's a journey, not a
destination. Commun. ACM, 48(11), 93-96.

Henderson, R. M., & Clark, K. B. (1990). Architectural Innovation: The Reconfiguration of Existing
Product Technologies and the Failure of Established Firms. Administrative Science Quarterly, 35(1),
9.

Homer-Dixon, T. (2001). The Ingenuity Gap. Canada: Vintage.

Hutchinson, A., & Finnemore, M. (1999). Standardized process improvement for construction
enterprises. Total Quality Management, 10, 576-583.

IU. (2009a). The Indiana University Architect's Office - BIM Design & Construction Requirements,
Follow-Up Seminar (PowerPoint Presentation). 32. Retrieved from
http://www.indiana.edu/~uao/IU%20BIM%20Rollout%20Presentation%209-10-2009.pdf

IU. (2009b). The Indiana University Architect's Office - IU BIM Proficiency Matrix (Multi-tab Excel
Workbook). 9 tabs. Retrieved from
http://www.indiana.edu/~uao/IU%20BIM%20Proficiency%20Matrix.xls

Jaco, R. (2004). Developing an IS/ICT management capability maturity framework. Paper presented at
the Proceedings of the 2004 annual research conference of the South African institute of computer
scientists and information technologists on IT research in developing countries.

Jones, C. (1994). Assessment and control of software risks: Prentice-Hall, New Jersey.

Keller, T., Gerjets, P., Scheiter, K., & Garsoffky, B. (2006). Information visualizations for knowledge
acquisition: The impact of dimensionality and color coding. Computers in Human Behavior, 22(1), 43-
65.

Kong, S. C. W., Li, H., Liang, Y., Hung, T., Anumba, C., & Chen, Z. (2005). Web services enhanced
interoperable construction products catalogue. Automation in Construction, 14(3), 343-352.

Kwak, Y. H., & Ibbs, W. C. (2002). Project Management Process Maturity (PM)² Model. ASCE Journal of Management in Engineering, 18(3), 150-155.

Lainhart IV, J. W. (2000). COBIT™: A Methodology for Managing and Controlling Information and
Information Technology Risks and Vulnerabilities. Journal of Information Systems, 14(s-1), 21-25.

Lockamy III, A., & McCormack, K. (2004). The development of a supply chain management process
maturity model using the concepts of business process orientation. Supply Chain Management: An
International Journal, 9(4), 272-278.

Mathes, A. (2004). Folksonomies - Cooperative Classification and Communication Through Shared Metadata. Paper presented at Computer Mediated Communication, LIS590CMC (Doctoral Seminar), Graduate School of Library and Information Science. Retrieved from http://www.adammathes.com/academic/computer-mediated-communication/folksonomies.html

Maxwell, J. A. (2005). Qualitative Research Design: An Interactive Approach: Sage Publications, Inc.

McCormack, K. (2001). Supply Chain Maturity Assessment: A Roadmap for Building the Extended
Supply Chain. Supply Chain Practice, 3, 4-21.

McCormack, K., Ladeira, M. B., & Oliveira, M. P. V. d. (2008). Supply chain maturity and
performance in Brazil. Supply Chain Management: An International Journal, 13(4), 272-282.

McGraw-Hill. (2009). The Business Value of BIM: Getting Building Information Modeling to the
Bottom Line: McGraw-Hill Construction Analytics

Meredith, J. R., Raturi, A., Amoako-Gyampah, K., & Kaplan, B. (1989). Alternative research
paradigms in operations. Journal of Operations Management, 8(4), 297-326.

Michalski, R. S. (1987). Concept Learning. In S. S. Shapiro (Ed.), Encyclopedia of Artificial Intelligence (Vol. 1, pp. 185-194). New York: Wiley.

Michalski, R. S., & Stepp, R. E. (1987). Clustering. In S. S. Shapiro (Ed.), Encyclopedia of Artificial
Intelligence (Vol. 1, pp. 103-111). New York: Wiley.

Mutai, A. (2009). Factors Influencing the Use of Building Information Modeling (BIM) within
Leading Construction Firms in the United States of America. Unpublished Doctor of Philosophy,
Indiana State University, Terre Haute.

NIBS. (2007). National Institute for Building Sciences (NIBS) Facility Information Council (FIC) –
BIM Capability Maturity Model. Retrieved October 11, 2008, from
www.buildingsmartalliance.org/client/assets/files/bsa/BIM_CMM_v1.9.xls

Nightingale, D. J., & Mize, J. H. (2002). Development of a Lean Enterprise Transformation Maturity
Model. Information Knowledge Systems Management, 3(1), 15.

NIST. (2007). National Building Information Modeling Standard - Version 1.0 - Part 1: Overview,
principles and Methodologies: National Institute of Building Sciences.

OGC. (2008). Portfolio, Programme, and Project Management Maturity Model (P3M3): Office of
Government Commerce - England.

OGC. (2009). Information Technology Infrastructure Library (ITIL) - Office of Government Commerce. Retrieved February 13, 2009, from http://www.itil-officialsite.com/home/home.asp

Onuma. (2011). Onuma Model Server. Retrieved October 20, 2011, from
http://onuma.com/products/BimDataApi.php

Paulk, M. C., Weber, C. V., Garcia, S. M., Chrissis, M. B., & Bush, M. (1993). Key Practices of the
Capability Maturity Model - Version 1.1 (Technical Report): Software Engineering Institute, Carnegie
Mellon University.

Pederiva, A. (2003). The COBIT® Maturity Model in a Vendor Evaluation Case. Information Systems Control Journal, 3, 26-29.

Penttilä, H. (2006). Describing The Changes In Architectural Information Technology To Understand Design Complexity And Free-Form Architectural Expression. ITcon, 11(Special Issue: The Effects of CAD on Building Form and Design Quality), 395-408.

Rogers, E. M. (1995). Diffusion of Innovation. New York: Free Press.

Sahibudin, S., Sharifi, M., & Ayat, M. (2008). Combining ITIL, COBIT and ISO/IEC 27002 in Order
to Design a Comprehensive IT Framework in Organizations. Paper presented at the Modeling &
Simulation, 2008. AICMS 08. Second Asia International Conference

Sarshar, M., Haigh, R., Finnemore, M., Aouad, G., Barrett, P., Baldry, D., et al. (2000). SPICE: a
business process diagnostics tool for construction projects. Engineering Construction & Architectural
Management, 7(3), 241-250.

Sebastian, R., & Van Berlo, L. (2010). Tool for Benchmarking BIM Performance of Design,
Engineering and Construction Firms in the Netherlands. Architectural Engineering and Design
Management, Special Issue: Integrated Design and Delivery Solutions, 6, 254-263.

SEI. (2006a). Capability Maturity Model Integration for Development (CMMI-DEV), Improving processes for better products: Software Engineering Institute / Carnegie Mellon.

SEI. (2006b). Capability Maturity Model Integration Standard (CMMI) Appraisal Method for Process Improvement (SCAMPI) A, Version 1.2 - Method Definition Document: Software Engineering Institute / Carnegie Mellon.

SEI. (2006c). CMMI for Development, Improving processes for better products: Software Engineering Institute / Carnegie Mellon.

SEI. (2008a). Capability Maturity Model Integration - Software Engineering Institute / Carnegie Mellon. Retrieved October 11, 2008, from http://www.sei.cmu.edu/cmmi/index.html

SEI. (2008b). Capability Maturity Model Integration for Services (CMMI-SVC), Partner and Piloting Draft, V0.9c: Software Engineering Institute / Carnegie Mellon.

SEI. (2008c). CMMI for Services. Retrieved December 24, 2008, from http://www.sei.cmu.edu/cmmi/models/CMMI-Services-status.html

SEI. (2008d). People Capability Maturity Model - Version 2, Software Engineering Institute / Carnegie Mellon. Retrieved October 11, 2008, from http://www.sei.cmu.edu/cmm-p/version2/index.html

Stephens, S. (2001). Supply Chain Operations Reference Model Version 5.0: A New Tool to Improve
Supply Chain Efficiency and Achieve Best Practice. Information Systems Frontiers, 3(4), 471-476.

Succar, B. (2009). Building information modelling framework: A research and delivery foundation for
industry stakeholders. Automation in Construction, 18(3), 357-375.

Succar, B. (2010a). Building Information Modelling Maturity Matrix. In J. Underwood & U. Isikdag
(Eds.), Handbook of Research on Building Information Modelling and Construction Informatics:
Concepts and Technologies: Information Science Reference, IGI Publishing.

Succar, B. (2010b). The Five Components of BIM Performance Measurement. Paper presented at the
CIB World Congress.

Suermann, P. C., Issa, R. R. A., & McCuen, T. L. (2008). Validation of the U.S. National Building Information Modeling Standard Interactive Capability Maturity Model. Paper presented at the 12th International Conference on Computing in Civil and Building Engineering, October 16-18.

Taylor, J., & Levitt, R. E. (2005). Inter-organizational Knowledge Flow and Innovation Diffusion in
Project-based Industries. Paper presented at the 38th International Conference on System Sciences,
Hawaii, USA.

Tergan, S. O. (2003). Managing knowledge with computer-based mapping tools. Paper presented at the ED-Media 2003 World Conference on Educational Multimedia, Hypermedia & Telecommunication, Honolulu, HI: University of Honolulu.

TNO. (2010). BIM QuickScan - a TNO initiative (sample QuickScan Report - PDF). 3. Retrieved
from http://www.bimladder.nl/wp-content/uploads/2010/01/voorbeeld-quickscan-pdf.pdf

UKCO. (2011). Government Construction Strategy: United Kingdom Cabinet Office.

Vaidyanathan, K., & Howell, G. (2007). Construction Supply Chain Maturity Model - Conceptual
Framework. Paper presented at the International Group For Lean Construction (IGLC-15).

Van der Heijden, K., & Eden, C. (1998). The Theory and Praxis of Reflective Learning in Strategy
Making. In C. Eden & J.-C. Spender (Eds.), Managerial and Organizational Cognition: Theory,
Methods and Research (pp. 58-75). London: Sage.

Venkatesh, V., & Davis, F. D. (2000). A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies. Management Science, 46(2), 186-204.

Walker, D. H. T., Bourne, L. M., & Shelley, A. (2008). Influence, stakeholder mapping and
visualization. Construction Management and Economics, 26(6), 645 - 658.

Weinberg, G. M. (1993). Quality software management (Vol. 2): First-order measurement: Dorset
House Publishing Co., Inc. New York, NY, USA.

Widergren, S., Levinson, A., Mater, J., & Drummond, R. (2010, 25-29 July). Smart grid interoperability maturity model. Paper presented at the Power and Energy Society General Meeting, 2010 IEEE.

Wilkinson, P. (2008, July 12). SaaS-based BIM. Extranet Evolution - Construction Collaboration Technologies, from http://www.extranetevolution.com/extranet_evolution/2008/04/saas-based-bim.html

Figures

Figure 1 Flowchart of the contents of this paper

Figure 2 The interlocking fields of BIM activity (Succar, 2009)

Figure 3 Step Sets leading to or separating BIM Stages – v1.1

Figure 4 Building Information Modelling Maturity Levels at BIM Stage 1

Figure 5 Structure of BIM Competency Sets v1.0

Figure 6 BIM Competency Sets v1.1 – shown at Granularity Level 2

Figure 7 Technology Competency Areas at Granularity Level 4 – partial mind map v1.1

Figure 8 BIM Capability and Maturity Assessment and Reporting Workflow Diagram - v2.0

Tables
Table 1 Maturity Models influencing the BIM Maturity Index

BIM Proficiency Matrix – The Indiana University Architect's Office
The BIM Proficiency Matrix is “used to assess the proficiency of a respondent’s skill at working in a BIM environment”. The matrix is “adaptable to project needs” and intends to communicate “owner intent regarding BIM objectives” (IU, 2009a p. 15 & 16). The BIM Proficiency Matrix is a static, multi-worksheet MS Excel workbook (IU, 2009b) which includes 8 categories to be assessed. Upon assessment, a score ranging from 1 to 4 points is assigned against each category. Points for each category are then tallied and the total BIM Maturity Score is calculated. The matrix identifies five ‘BIM Standards’ which a project can achieve, should achieve or has already achieved depending on when the matrix is deployed. The 5 Proficiency Levels (or BIM Standards) are: ‘Working towards BIM’ (the lowest standard), ‘Certified BIM’, ‘Silver’, ‘Gold’ and ‘Ideal’ (the highest BIM Maturity Standard).
Sample representation: ‘Simplified Matrix’ – an Excel worksheet from the BIM Proficiency Matrix (IU, 2009b).

BIM QuickScan – TNO Built Environment and Geosciences
The BIM QuickScan tool aims to “serve as a standard BIM benchmarking instrument in the Netherlands”. The scan is intended to be performed “in a limited time of maximum one day” (Sebastian & Van Berlo, 2010 p. 255 & 258). The BIM QuickScan tool is organised around 4 chapters: Organization and Management; Mentality and Culture; Information Structure and Information Flow; and Tools and Applications. “Each chapter contains a number of KPIs in the form of a multiple-choice questionnaire… With each KPI, there are a number of possible answers. For each answer, a score is assigned. Each KPI also carries a certain weighting factor. The sum of all the partial scores after considering the weighting factors represents the total score of BIM performance of an organization” (Sebastian & Van Berlo, 2010 p. 258 & 259). KPIs are assessed against a percentile score while ‘chapters’, representing a collation of KPIs, are assessed against a 5-level system (0 to 4).
Sample representation: score representation (by category) from the sample BIM QuickScan report (TNO, 2010).

COBIT, Control Objects for Information and related Technology – Information Systems Audit and Control Association (ISACA) and the IT Governance Institute (ITGI)
The main objective of COBIT is to “enable the development of clear policy and good practice for IT control throughout organizations” (Lainhart IV, 2000 p. 22). The COBIT Maturity Model is “an IT governance tool used to measure how well developed the management processes are with respect to internal controls. The maturity model allows an organization to grade itself from non-existent (0) to optimized (5)” (Pederiva, 2003 p. 1). COBIT includes 6 Maturity Levels (Non-existent, Initial/Ad hoc, Repeatable but Intuitive, Defined Process, Managed and Measurable, and Optimised), 4 Domains and 34 Control Objectives. Note: there is some alignment between ITIL (OGC, 2009) and COBIT with respect to IT governance within organisations (Sahibudin, Sharifi, & Ayat, 2008) of value to BIM implementation efforts.
Sample representation: (Lainhart IV, 2000).

CMMI, Capability Maturity Model Integration – Software Engineering Institute / Carnegie Mellon
Capability Maturity Model® Integration (CMMI) is a process improvement approach that helps integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes (SEI, 2006b, 2006c, 2008a, 2008b, 2008c). CMMI has 5 Maturity Levels (for its Staged Representation; 6 Capability Levels for its Continuous Representation), 16 core Process Areas (22 for CMMI-DEV and 24 for CMMI-SVC) and 1 to 4 Goals for each Process Area. The 5 Maturity Levels are: Initial, Managed, Defined, Quantitatively Managed and Optimising.
Sample representation: NASA, Software Engineering Process Group, http://bit.ly/CMMI-NASA.

CSCMM, Construction Supply Chain Maturity Model
“Construction supply chain management (CSCM) refers to the management of information, flow, and money in the development of a construction project”, as mentioned in Vaidyanathan and Howell (2007 p. 170). CSCMM has 4 Maturity Stages: Ad-hoc, Defined, Managed and Controlled.
Sample representation: (Vaidyanathan & Howell, 2007).

iBIM – integrated Building Information Modelling
The iBIM maturity model, introduced in Bew, Underwood, Wix and Storer (2008), has been devised “to ensure clear articulation of the standards and guidance notes, their relationship to each other and how they can be applied to projects and contracts in industry” (BIS, 2011 p. 40). The iBIM model identifies specific capability targets (not performance milestones) for the UK construction industry covering technology, standards, guides, classifications and delivery (total number of topics not defined). Targets for each topic are organised under one or more loosely defined Maturity Levels (0-3).
Sample representation: (BIS, 2011).

I-CMM, Interactive Capability Maturity Model – National Institute for Building Sciences (NIBS) Facility Information Council (FIC)
The I-CMM is closely coupled with the NBIMS effort (Version 1, Part 1) and establishes “a tool to determine the level of maturity of an individual BIM as measured against a set of weighted criteria agreed to be desirable in a Building Information Model” (Suermann, et al., 2008 p. 2; NIST, 2007; NIBS, 2007). The I-CMM has 11 ‘Areas of Interest’ measured against 10 Maturity Levels.
Sample representation: (Suermann, Issa, & McCuen, 2008).

Knowledge Retention Maturity Levels
Arif, Egbu, Alom and Khalfan (2009) introduced 4 levels of knowledge retention maturity. Knowledge management is an integral part of BIM capability and subsequent maturity. The Matrix thus incorporates these levels: (1) knowledge is shared between employees, (2) shared knowledge is documented (transferred from tacit to explicit), (3) documented knowledge is stored and (4) stored knowledge is accessible and easily retrievable (Arif, et al., 2009).
Sample representation: (Arif, Egbu, Alom, & Khalfan, 2009).

LESAT, Lean Enterprise Self-Assessment Tool – Lean Aerospace Initiative (LAI) at the Massachusetts Institute of Technology (MIT)
LESAT is focused on “assessing the degree of maturity of an enterprise in its use of ‘lean’ principles and practices to achieve the best value for the enterprise and its stakeholders” (Nightingale & Mize, 2002 p. 17). LESAT has 54 Lean Practices organised within three Assessment Sections (Lean Transformation/Leadership, Life Cycle Processes and Enabling Infrastructure) and 5 Maturity Levels: Some Awareness/Sporadic, General Awareness/Informal, Systemic Approach, Ongoing Refinement and Exceptional/Innovative.
Sample representation: (Nightingale & Mize, 2002).

P3M3, Portfolio, Programme and Project Management Maturity Model – Office of Government Commerce
The P3M3 provides “a framework with which organizations can assess their current performance and put in place improvement plans with measurable outcomes based on industry best practice” (OGC, 2008 p. 8). The P3M3 has 5 Maturity Levels: Awareness, Repeatable, Defined, Managed and Optimised.
Sample representation: (OGC, 2008).

P-CMM®, People Capability Maturity Model v2 – Software Engineering Institute / Carnegie Mellon
P-CMM is an “organizational change model” and a “roadmap for implementing workforce practices that continuously improve the capability of an organization’s workforce” (SEI, 2008d p. 3 & 15). P-CMM has 5 Maturity Levels: Initial, Managed, Defined, Predictable and Optimising.
Sample representation: (SEI, 2008d).

(PM)², Project Management Process Maturity Model
The project management process maturity (PM)² model “determines and positions an organization’s relative project management level with other organizations”. It also aims to integrate PM “practices, processes, and maturity models to improve PM effectiveness in the organization” (Kwak & Ibbs, 2002 p. 150). (PM)² has 5 Maturity Levels: Initial, Planned, Managed at Project Level, Managed at Corporate Level and Continuous Learning.
Sample representation: (Kwak & Ibbs, 2002).

SPICE, Standardised Process Improvement for Construction Enterprises – Research Centre for the Built and Human Environment, The University of Salford
SPICE is a project which developed a framework for continuous process improvement for the construction industry. SPICE is an “evolutionary step-wise model utilizing experience from other sectors, such as manufacturing and IT” (Hutchinson & Finnemore, 1999 p. 576; Sarshar, et al., 2000). SPICE has 5 Stages: Initial/Chaotic, Planned & Tracked, Well Defined, Quantitatively Controlled, and Continuously Improving.
Sample representation: (Hutchinson & Finnemore, 1999).

Supply Chain Management Process Maturity Model and Business Process Orientation (BPO) Maturity Model
The model conceptualises the relation between process maturity and supply chain operations as based on the Supply-Chain Operations Reference Model (Stephens, 2001). The model’s maturity levels describe the “progression of activities toward effective SCM and process maturity. Each level contains characteristics associated with process maturity such as predictability, capability, control, effectiveness and efficiency” (Lockamy III & McCormack, 2004 p. 275; K. McCormack, 2001). The 5 Maturity Levels are: Ad-hoc, Defined, Linked, Integrated and Extended.
Sample representation: (Lockamy III & McCormack, 2004).

Other maturity models – or variations on the listed maturity models – include those on Software Process Improvement (Hardgrave & Armstrong, 2005), IS/ICT Management Capability (Jaco, 2004), Interoperability (Widergren, Levinson, Mater, & Drummond, 2010), Project Management (Crawford, 2006), Competency (Gillies & Howard, 2003) and Financial Management (Doss, Chen, & Holland, 2008).

Table 2 A non-exhaustive list of terminology used by CMMs to denote maturity levels, including those used by the BIM Maturity Index

(Levels are listed from lowest to highest; bracketed numbers correspond to the columns ‘0’ and ‘1 or a’ to ‘5 or e’ of the original table.)

BIM Maturity Index: [1] Initial/Ad-hoc, [2] Defined, [3] Managed, [4] Integrated, [5] Optimised
COBIT, Control Objects for Information and related Technology: [0] Non-existent, [1] Initial/Ad-hoc, [2] Repeatable but Intuitive, [3] Defined Process, [4] Managed & Measurable, [5] Optimised
CMMI, Capability Maturity Model Integration (Staged Representation): [1] Initial, [2] Managed, [3] Defined, [4] Quantitatively Managed, [5] Optimising
CMMI (Continuous Representation): [0] Incomplete, [1] Performed, [2] Managed, [3] Defined, [4] Quantitatively Managed, [5] Optimising
CSCMM, Construction Supply Chain Maturity Model: [1] Ad-hoc, [2] Defined, [3] Managed, [4] Controlled, [5] N/A
LESAT, Lean Enterprise Self-Assessment Tool: [1] Awareness/Sporadic, [2] General Awareness/Informal, [3] Systemic Approach, [4] Ongoing Refinement, [5] Exceptional/Innovative
P-CMM®, People Capability Maturity Model: [1] Initial, [2] Managed, [3] Defined, [4] Predictable, [5] Optimising
P3M3, Portfolio, Programme and Project Management Maturity Model: [1] Awareness, [2] Repeatable, [3] Defined, [4] Managed, [5] Optimised
(PM)², Project Management Process Maturity Model: [1] Ad-hoc, [2] Planned, [3] Managed at Project Level, [4] Managed at Corporate Level, [5] Continuous Learning
SPICE, Standardised Process Improvement for Construction Enterprises: [1] Initial/Chaotic, [2] Planned & Tracked, [3] Well Defined, [4] Quantitatively Controlled, [5] Continuously Improving
Supply Chain Management Process Maturity Model: [1] Ad-hoc, [2] Defined, [3] Linked, [4] Integrated, [5] Extended

Table 3 Organisational Scales

MACRO – Markets and Industries

M – Markets
(Macro M) M, Market: Markets are the “world of commercial activity where goods and services are bought and sold” (http://bit.ly/pjB3c).
(Meso M) Md, Defined Market: Defined Markets can be geographical, geopolitical or resultant from multi-party agreements similar to NAFTA or ASEAN.
(Micro M) Ms, Sub-Market: Sub-markets can be local or regional.

I – Industries
(Macro I) I, Industry: Industries are the organised action of making goods and services for sale. Industries can traverse markets and may be service-, product- or project-based; the AEC industry is mostly project-based (http://bit.ly/ielY3).
(Meso I) Is, Sector: A sector is a “distinct subset of a market, society, industry, or economy whose components share similar characteristics” (http://bit.ly/15UkZD).
(Micro I) Id, Discipline: Disciplines are industry sectors, “branches of knowledge, systems of rules of conduct or methods of practice” (http://bit.ly/7jT82).
Isp, Specialty: A specialty is a focus area of knowledge, expertise, production or service within a sub-discipline.

MESO – Projects and their teams

P – Project Teams
P, Project Team: Project Teams are temporary groupings of organisations with the aim of fulfilling the predefined objectives of a project – a planned endeavour, usually with a specific goal and accomplished in several steps or stages (http://bit.ly/dqMYg).

MICRO – Organisations, Units, their Groups and Members

O – Organisations
(Macro O) O, Organisation: An organisation is a “social arrangement which pursues collective goals, which controls its own performance, and which has a boundary separating it from its environment” (http://bit.ly/v7p9N).
(Meso O) Ou, Organisational Unit: Departments and Units are specialised divisions of an organisation. These can be co-located or distributed geographically.
Og, Organisational Group (or team): Organisational Groups consist of individual human resources assigned to perform an activity or deliver a set of assigned objectives. Groups (also referred to as organisational teams) can be physically co-located or formed across geographical or departmental lines.
(Micro O) Om, Organisational Member: Organisational members can be part of multiple Organisational Groups.
Table 4 BIM Competency Granularity Levels v2.1

GLevel 1 – Discovery: a low-detail assessment used for basic and semi-formal discovery of BIM Capability and Maturity. Discovery assessments yield a basic numerical score. OScale applicability: all scales. Assessment by: self. Report type: Discovery Notes. Guide: BIMC&M Discovery Guide.

GLevel 2 – Evaluation: a more detailed assessment of BIM Capability and Maturity. Evaluation assessments yield a detailed numerical score. OScale applicability: all scales. Assessment by: self and peer. Report type: Evaluation Sheets. Guide: BIMC&M Evaluation Guide.

GLevel 3 – Certification: a highly detailed appraisal of those Competency Areas applicable across disciplines, markets and sectors. Certification appraisal is used for Structured (Staged) Capability and Maturity and yields a formal, Named Maturity Level. OScale applicability: 8 and 9. Assessment by: external consultant. Report type: Certificate. Guide: BIMC&M Certification Guide.

GLevel 4 – Auditing: the most comprehensive appraisal type. In addition to the competencies covered under Certification, Auditing appraises detailed Competency Areas, including those specific to a market, discipline or sector. Audits are highly customisable, suitable for Non-structured (Continuous) Capability and Maturity, and yield a Named Maturity Level plus a Numerical Maturity Score for each Competency Area audited. OScale applicability: 8, 9, 10 & 11. Assessment by: self, peer and external consultant. Report type: Audit Report. Guide: BIMC&M Auditing Guide.
