Measuring BIM Performance: Five Metrics
Succar, B., Sher, W. D., & Williams, A. P. (2012). Measuring BIM performance: Five metrics. Originally published in Architectural Engineering and Design Management, 8, 120-142. Available from: http://dx.doi.org/10.1080/17452007.2012.659506
Willy Sher
School of Architecture and Built Environment,
University of Newcastle,
Callaghan Campus
NSW 2308, Australia.
Anthony Williams
School of Architecture and Built Environment,
University of Newcastle,
Callaghan Campus
NSW 2308, Australia.
Abstract
The term Building Information Modelling (BIM) refers to an expansive knowledge domain within the
Design, Construction and Operation (DCO) industry. The voluminous possibilities attributed to BIM
represent an array of challenges that can be met through a systematic research and delivery framework
spawning a set of performance assessment and improvement metrics. This paper identifies five
complementary components specifically developed to enable such assessment: [1] BIM Capability
Stages representing transformational milestones along the implementation continuum, [2] BIM
Maturity Levels representing the quality, predictability and variability within BIM Stages, [3] BIM
Competencies representing incremental progressions towards and improvements within BIM Stages,
[4] Organisational Scales representing the diversity of markets, disciplines and company sizes and [5]
Granularity Levels enabling highly-targeted yet flexible performance analyses ranging from informal
self-assessment to high-detail, formal organisational audits. This paper explores these complementary
components and positions them as a systematic method to understand BIM performance and to enable
its assessment and improvement. Figure 1 provides a flowchart of the contents of this paper.
1. A brief introduction to BIM
Building Information Modelling (BIM) is a term that is used by different authors in many different
ways. The nuances between their definitions highlight the rapid growth the area has experienced, as
well as the potential for confusion to arise when ill-defined terminology is used to communicate
specific meanings. In the context of this paper, BIM refers to a set of interacting policies, processes
and technologies (illustrated in Figure 2) that generate a “methodology to manage the essential
building design and project data in digital format throughout the building’s life-cycle” (Penttilä, 2006).
There are many signs that the use of BIM tools and processes is reaching a tipping-point in some
markets (Keller, Gerjets, Scheiter, & Garsoffky, 2006; McGraw-Hill, 2009). For example, in the USA
an increasing number of large institutional clients now require object-based 3D models to be provided
as part of tender submissions (Alison, Eugene, & Garry, 1997). Furthermore, the UK Cabinet Office
has recently published a construction strategy paper that requires the submission of a “fully
collaborative 3D BIM (with all project and asset information, documentation and data being
electronic) as a minimum by 2016” (BIS, 2011; UKCO, 2011 p. 14). Other signs include the
abundance of BIM-specific software tools, books, new media tools and reports (M. J. Eppler & Platts,
2009).
1.2 Issues arising from the proliferation of BIM
Notwithstanding the much-touted benefits of BIM as a means of increasing productivity, there are
currently few metrics that measure such improvements. Furthermore, little guidance is available for
organisations wishing to generate new or enhance their existing BIM deliverables. Those wishing to
adopt BIM or identify and / or prioritize their requirements are thus left to their own devices. The
implementation of any new technology is fraught with challenges and BIM is no exception. In
addition, those implementing BIM frequently expect to be able to realise significant benefits and
productivity gains whilst they are still inexperienced users. Successful implementation of these
systems requires an appreciation of how BIM resources (including hardware, software as well as the
technical and management skills of staff) need to evolve in harmony with each other. The multiple
and varied understandings that practitioners have of BIM further compounds the difficulties they
experience. When the unforeseen happens, the risks, costs and difficulties associated with
implementing BIM increase. In such circumstances compromises are likely to be made.
BIM use needs to be assessable if the productivity improvements that result from its implementation
are to be made apparent. Without such metrics, teams and organisations are unable to consistently
measure their own successes and / or failures. Performance metrics enable teams and organisations to
assess their own competencies in using BIM and, potentially, to benchmark their progress against that
of other practitioners. Furthermore, robust sets of BIM metrics lay the foundations for formal
certification systems, which could be used by those procuring construction projects to pre-select BIM
service providers.
Whilst it is important to develop metrics and benchmarks for BIM performance assessment, it is
equally important that these metrics are accurate and able to be adapted to different industry sectors
and organisations. Considerable insight can be gained from the performance measurement tools
developed for other industries but it would be foolhardy to rely on any tool which is not designed for
the specific requirements of the task in question.
This paper describes a set of metrics purposefully developed to measure the specifics of BIM
performance. To increase their reliability, adoptability and usability for different stakeholders, the
first-named author identified the following performance criteria. The metrics should be:
Applicable: able to be utilised by all stakeholders across all phases of a project’s lifecycle.
Cumulative: set as logical progressions; deliverables from one act as prerequisites for another.
Flexible: able to be performed across markets, organisational scales and their subdivisions.
Informative: provide “feedback for improvement” and “guidance for next steps” (Nightingale & Mize, 2002).
Neutral: not prejudicing proprietary, non-proprietary, closed, open, free or commercial solutions or schemata.
This paper describes the development of a set of BIM performance metrics based on these guiding principles.
2. Research design
The investigations described in this paper are part of a larger PhD study which addresses the question
of how to represent BIM knowledge structures and provide models that facilitate the implementation
of BIM in academic and industrial settings. It is grounded in a set of paradigms, theories, concepts
and experiences which combine to form the view of the BIM domain reported here.
According to Maxwell (2005), the conceptual background underpinning a study such as this is
typically based on several sources including previous research and existing theories, the researcher’s
own experiential knowledge and thought experiments. Various theories (including systems theory
(Ackoff, 1971; Chun, Sohn, & Granados, 2008), systems thinking (Chun, et al., 2008), diffusion of
innovation theory (Fox & Hietanen, 2007; Mutai, 2009; Rogers, 1995), technology acceptance models
(Davis, 1989; Venkatesh & Davis, 2000) and complexity theory (Froese, 2010; Homer-Dixon, 2001))
assisted in analysing the BIM domain and enriched the study’s conceptual background. Constraints
identified in these theories led to the development of a new theoretical framework based on an
inductive approach “[more suitable for researchers who are more concerned about] the correspondence
of their findings to the real world than their coherence with existing theories or laws” (Meredith et al., 1989).
The five components of BIM performance measurement are some of the deliverables of the BIM Framework
(Succar, 2009). The Framework itself is composed of a number of high-level concepts which interact to
generate a set of guides and tools necessary to [i] facilitate BIM implementations and [ii] conduct BIM
performance assessments.
The theoretical underpinnings of the BIM Framework have been generated through a process of
inductive inference (Michalski, 1987), conceptual clustering (Michalski & Stepp, 1987) and reflective
learning (Van der Heijden & Eden, 1998; Walker, Bourne, & Shelley, 2008). Framework components
were then represented visually through a series of ‘knowledge models’ to reduce topic complexity
(Tergan, 2003) and facilitate knowledge transfer to others (M. Eppler & Burkhard, 2005).
Many of the BIM Framework’s components – Fields, Stages, Lenses, Steps, Competencies and several
visual knowledge models – have been subjected to a process of validation through a series of
international focus groups employing a mixed-model approach (Tashakkori & Teddlie, 1998).
The results from these focus groups informed the development of the five components needed for
accurate and consistent BIM performance measurement (Succar, 2010b). These include BIM
Capability Stages, BIM Maturity Levels, BIM Competency Sets, Organisational Scales and
Granularity Levels.
The following sections provide brief introductions to each component. They are followed by a step-
by-step workflow which allows BIM Capability and Maturity assessments to be conducted.
3. The five components of BIM performance measurement
3.1 BIM Capability Stages
BIM Capability is defined here as the basic ability to perform a task or deliver a BIM service/product.
BIM Capability Stages (or BIM Stages) define the minimum BIM requirements - the major milestones
that need to be reached by teams or organisations as they implement BIM technologies and concepts.
Three BIM Stages separate ‘pre-BIM’, a fixed starting point representing industry status before BIM
implementation, from ‘post-BIM’, a variable end-point representing the continually evolving goal of
employing virtually integrated Design, Construction and Operation (viDCO) tools and concepts. (The
term viDCO is used in preference to Integrated Project Delivery (IPD) as representing the ultimate
goal of implementing BIM (AIA, 2007) to prevent any confusion with the term’s evolving contractual connotations.)
BIM Stages are defined by their minimum requirements. For example, to be considered as having
achieved BIM Capability Stage 1, an organisation needs to have deployed an object-based modelling
software tool similar to ArchiCAD, Revit, Tekla or Vico. Similarly, for BIM Capability Stage 2, an
organisation needs to have deployed a model-based collaboration platform which links to external
databases and shares object-based models with at least two other disciplines – a
solution similar to a model server or BIMSaaS solution (BIMserver, 2011; Onuma, 2011; Wilkinson,
2008).
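To illustrate how Stage attainment could be checked against such minimum requirements, the following Python sketch encodes the three Capability Stages as ordered milestones. The Stage names follow the BIM Framework; the requirement labels and the helper function are hypothetical and introduced here only for illustration.

```python
# Hypothetical sketch: BIM Capability Stages as ordered milestones with minimum
# requirements. Requirement labels are illustrative, not drawn verbatim from the paper.
from dataclasses import dataclass

@dataclass(frozen=True)
class CapabilityStage:
    number: int
    name: str
    minimum_requirements: frozenset  # labels an organisation must satisfy

STAGES = [
    CapabilityStage(1, "Object-based modelling",
                    frozenset({"object_based_modelling_tool"})),
    CapabilityStage(2, "Model-based collaboration",
                    frozenset({"object_based_modelling_tool", "multidisciplinary_model_sharing"})),
    CapabilityStage(3, "Network-based integration",
                    frozenset({"object_based_modelling_tool", "multidisciplinary_model_sharing",
                               "networked_model_server"})),
]

def highest_stage_reached(capabilities: set) -> int:
    """Return the highest Stage whose minimum requirements are all met (0 = pre-BIM)."""
    reached = 0
    for stage in STAGES:
        if stage.minimum_requirements <= capabilities:
            reached = stage.number
    return reached

# An organisation with a modelling tool that shares models with other disciplines,
# but without a model server, would sit at Stage 2.
print(highest_stage_reached({"object_based_modelling_tool", "multidisciplinary_model_sharing"}))
```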
Each of these three Capability Stages may be further subdivided into Competency Steps. What
differentiates stages from steps is that stages are transformational or radical changes, while steps are
incremental ones (Henderson & Clark, 1990; Taylor & Levitt, 2005). The collection of steps involved
in working towards or within a BIM Stage (i.e. across the continuum from pre-BIM to post-BIM) is
driven by different prerequisites for, challenges within and deliverables of each BIM Stage. In addition
to their type (the Competency Set they belong to – refer to Section 3.3), the following BIM Steps can be identified (refer to Figure 3).
Insert Figure 3 here
3.2 BIM Maturity Levels
The term ‘BIM Maturity’ refers to the quality, repeatability and degree of excellence within a BIM
Capability. Whilst ‘capability’ denotes a minimum ability (refer to Section 3.1), ‘maturity’ denotes the
extent of that ability in performing a task or delivering a BIM service/product. BIM Maturity’s
benchmarks are performance improvement milestones (or levels) that teams and organisations aspire
to or work towards. In general, the progression from lower to higher levels of maturity indicates (i)
improved control resulting from fewer variations between performance targets and actual results, (ii)
enhanced predictability and forecasting of reaching cost, time and performance objectives, and (iii)
greater effectiveness in reaching defined goals and setting new more ambitious ones (Lockamy III & McCormack, 2004).
The concept of BIM Maturity has been adopted from the Software Engineering Institute’s (SEI)
Capability Maturity Model (CMM) (SEI, 2008a), a process improvement framework initially intended
as a tool to evaluate the ability of government contractors to deliver software projects. CMM
originated in the field of quality management (Crosby, 1979) and was later developed for the benefit
of the US Department of Defence (Hutchinson & Finnemore, 1999). Its successor, the more
comprehensive Capability Maturity Model Integration (CMMI) (SEI, 2006a, 2006b, 2008c), continues
to be developed and extended by the Software Engineering Institute, Carnegie Mellon University.
Several CMM variants exist for other industries (Succar, 2010a) but they are all, in essence,
specialised frameworks that assist stakeholders to improve their capabilities (Jaco, 2004) and benefit
from process improvements. Example benefits include increased productivity and Return On
Investment (ROI) as well as reduced costs and post-delivery defects (Hutchinson & Finnemore, 1999).
Maturity models are typically composed of multiple maturity levels, or process improvement ‘building
blocks’ or ‘components’ (Paulk, Weber, Garcia, Chrissis, & Bush, 1993). When the requirements of
each level are satisfied, implementers can then build on established components to attempt ‘higher’
maturity. Although CMMs are not without their detractors (for example (Bach, 1994; Jones, 1994;
Weinberg, 1993)), research conducted in other industries has already identified a correlation between
improved process maturity and business performance (Lockamy III & McCormack, 2004).
The ‘original’ software industry CMM, however, is not applicable to the construction industry. It does
not address supply chain issues, and its maturity levels do not account for the different phases of the
lifecycle of a construction project (Sarshar et al., 2000). Although other efforts, derived from CMM,
focus on the construction industry (refer to Table 1), there is no comprehensive maturity model/index
that can be applied to BIM, its implementation stages, players, deliverables or its effect on project
lifecycle phases.
The CMMs listed in Table 1 are similar in structure and objectives but differ in conceptual depth,
industrial focus, terminology and target audience. A common theme is how CMMs employ simple, staged
maturity levels to benchmark the performance of teams and organisations. In analysing their suitability
for developing a BIM-specific maturity index, most are
broad in approach and can collectively form a basis for a range of BIM processes, technologies and
policies. However, none easily accommodates the size of organisations being monitored. Also, from a
terminology standpoint, there is insufficient differentiation between the notion of capability (an ability
to perform a task) and that of maturity (the degrees of excellence in performing a task). This
differentiation is critical when catering for staged BIM implementation as it responds to the disruptive nature of BIM adoption.
To address the aforementioned shortcomings, the BIM Maturity Index (BIMMI) has been developed
by analysing and then integrating these and other maturity models used across different industries. The
BIMMI has been customised to reflect the specifics of BIM capability, implementation requirements,
performance targets and quality management. It has five distinct levels: (a) Initial / Ad-hoc, (b)
Defined, (c) Managed, (d) Integrated and (e) Optimised (Figure 4). Level names were chosen to
reflect the terminology used in many maturity models, to be easily understandable by DCO
stakeholders and to reflect increasing BIM maturity from ad-hoc to continuous improvement (Table
2).
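As a simple illustration of how the five BIMMI levels form an ordered scale, the sketch below encodes them as an integer enumeration; the numeric values and the comparison helper are assumptions made for illustration only, not part of the published index.

```python
# Hypothetical sketch: the five BIM Maturity Index (BIMMI) levels as an ordered scale.
# Level names follow the paper; the numeric scores and helper are illustrative only.
from enum import IntEnum

class BIMMILevel(IntEnum):
    INITIAL_AD_HOC = 1  # (a) Initial / Ad-hoc
    DEFINED = 2         # (b) Defined
    MANAGED = 3         # (c) Managed
    INTEGRATED = 4      # (d) Integrated
    OPTIMISED = 5       # (e) Optimised

def meets_target(assessed: BIMMILevel, target: BIMMILevel) -> bool:
    """True when the assessed maturity is at or above the target level."""
    return assessed >= target

# An organisation assessed at 'Managed' does not yet meet an 'Integrated' target.
print(meets_target(BIMMILevel.MANAGED, BIMMILevel.INTEGRATED))  # False
```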
3.3 BIM Competency Sets
A BIM Competency Set is a hierarchical collection of individual competencies identified for the
purposes of implementing and assessing BIM. In this context, the term competency reflects a generic
set of abilities suitable for implementing as well as assessing BIM Capability and/or Maturity. Figure 5
illustrates how the BIM Framework generates BIM Competency Sets out of multiple Fields, Stages and Lenses.
BIM Competencies are a direct reflection of BIM Requirements and Deliverables and can be grouped into three sets:
Technology sets: competencies in software, hardware and networks. For example, the availability of a BIM tool
allows the migration from drafting-based to object-based workflow (a requirement of BIM Stage 1).
Process sets: competencies in leadership, infrastructure, human resources and products/services. For example,
collaboration processes and database-sharing skills are necessary to allow model-based collaboration (BIM Stage 2).
Policy sets: competencies in contracts, regulations and research/education. For example, alliance-based or risk-
sharing contractual agreements are pre-requisites for network-based integration (BIM Stage 3).
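A minimal sketch of how individual competencies might be tagged with their Competency Set and the Capability Stage they support is shown below; the field names and the specific competency entries are hypothetical and only echo the examples above.

```python
# Hypothetical sketch: competencies grouped under the Technology, Process and Policy
# Competency Sets and tagged with the BIM Capability Stage they support.
COMPETENCIES = [
    {"set": "Technology", "area": "Software",        "name": "Object-based modelling tool deployed",    "stage": 1},
    {"set": "Process",    "area": "Human Resources", "name": "Database-sharing / collaboration skills", "stage": 2},
    {"set": "Policy",     "area": "Contracts",       "name": "Alliance-based or risk-sharing agreement", "stage": 3},
]

def prerequisites_up_to_stage(stage: int) -> list:
    """List the competencies required at or before a given BIM Capability Stage."""
    return [c for c in COMPETENCIES if c["stage"] <= stage]

for c in prerequisites_up_to_stage(2):
    print(f'{c["set"]:<10} {c["name"]}')
```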
Figure 6 provides a partial mind-map of BIM Competency Sets shown at Granularity Level 2 (for an explanation of Granularity Levels, refer to Section 3.5).
3.4 BIM Organisational Scales
To allow BIM performance assessments to respect the diversity of markets, disciplines and company
sizes, an Organisational Scale (OScale) has been developed. The Scale, summarised in Table 3, can be used
to customise BIM performance assessments to suit the organisation being assessed.
3.5 Granularity Levels
Competency Sets include a large number of individual competencies grouped under numerous
headings (shown in Figure 6). To enhance BIM Capability and Maturity assessments and to increase
their flexibility, a Granularity ‘filter’ with four Granularity Levels (GLevels) has been developed.
Progression from lower to higher levels of granularity indicates an increase in assessment breadth, detail and formality.
Using higher-granularity levels (GLevels 3 or 4) exposes more detailed Competency Areas than
lower-granularity levels (GLevels 1 or 2). This variability enables the preparation of several BIM
performance measurement tools ranging from low-detail, informal and self-administered assessments
to high-detail, formal and specialist-led appraisals. Table 4 provides more information about the four
Granularity Levels.
Granularity Levels increase or decrease the number of Competency Areas used for performance
assessment. For example, the mind map provided in Figure 6 reveals ten Competency Areas at
GLevel 1 and thirty-six Competency Areas at GLevel 2. Also, at GLevels 3 and 4, the number of
Competency Areas available for performance assessment increases dramatically as shown in Figure 7.
The partial mind-map shown in Figure 7 reveals many additional Competency Areas under GLevel 3,
such as Data Storage and Data Exchange. At GLevel 4, the map reveals even more detailed
Competency Areas including Structured and Unstructured Data, which in turn branch into computable
and non-computable components (Kong et al., 2005; Mathes, 2004; Fallon & Palmer, 2007).
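To make the filtering effect of Granularity Levels concrete, the sketch below walks a small, hypothetical fragment of the Competency Area hierarchy and returns only the areas exposed at a chosen GLevel; the tree contents merely echo the examples drawn from Figures 6 and 7.

```python
# Hypothetical sketch: applying a Granularity 'filter' to a nested Competency Area tree.
# Lower GLevels expose only the top of the hierarchy; higher GLevels expose deeper areas.
COMPETENCY_TREE = {
    "Technology": {                     # visible at GLevel 1
        "Software": {                   # GLevel 2
            "Data Exchange": {          # GLevel 3
                "Structured Data": {},  # GLevel 4
                "Unstructured Data": {},
            },
            "Data Storage": {},
        },
    },
}

def areas_at_glevel(tree: dict, glevel: int, depth: int = 1) -> list:
    """Collect the Competency Area names exposed at or above the chosen Granularity Level."""
    areas = []
    for name, children in tree.items():
        areas.append(name)
        if depth < glevel:
            areas.extend(areas_at_glevel(children, glevel, depth + 1))
    return areas

print(areas_at_glevel(COMPETENCY_TREE, 2))       # ['Technology', 'Software']
print(len(areas_at_glevel(COMPETENCY_TREE, 4)))  # 6 areas exposed at GLevel 4
```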
4. Conducting BIM performance assessments
Together, the five components described above (capability stages, maturity levels, competency sets,
organisational scales and granularity levels) allow performance assessments to be configured in many
different ways; the performance criteria identified in Section 1.4 all apply. To manage all possible
configurations, a simple assessment and reporting workflow has been developed.
The workflow shown in Figure 8 identifies the five steps needed to conduct a BIM performance
assessment. Starting with an extensive pool of generic BIM Competencies - applicable across DCO
disciplines and organisational sizes – assessors can first filter out non-applicable Competency Sets,
conduct a series of assessments based on the Competencies remaining and then generate appropriate
Assessment Reports.
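A minimal sketch of this workflow is given below, assuming a flat list of competencies tagged with the organisational scales and Granularity Level they apply to; all field names, the 0-5 scoring and the unweighted average are illustrative assumptions rather than the published assessment procedure.

```python
# Hypothetical sketch of the Figure 8 workflow: filter the generic Competency pool for the
# organisation being assessed, score what remains, and produce a simple Assessment Report.
def run_assessment(competency_pool, organisational_scale, glevel, scores):
    # Steps 1-2: discard Competency Sets that do not apply to this organisation or GLevel
    applicable = [c for c in competency_pool
                  if organisational_scale in c["applies_to"] and c["glevel"] <= glevel]
    # Steps 3-4: attach the assessor's score (0-5) to each remaining competency
    assessed = [{**c, "score": scores.get(c["name"], 0)} for c in applicable]
    # Step 5: generate a report with an overall (unweighted) average score
    average = sum(c["score"] for c in assessed) / len(assessed) if assessed else 0.0
    return {"competencies": assessed, "average_score": round(average, 2)}

pool = [
    {"name": "Object-based modelling",      "glevel": 1, "applies_to": {"small", "large"}},
    {"name": "Model server administration", "glevel": 2, "applies_to": {"large"}},
]
report = run_assessment(pool, "small", glevel=2, scores={"Object-based modelling": 3})
print(report["average_score"])  # 3.0
```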
5. A Final Note
The five BIM Framework components, briefly discussed in this paper, provide a range of opportunities
for DCO stakeholders to measure and improve their BIM performance. The components complement
each other and enable highly targeted yet flexible performance analyses to be conducted. These range
from informal self-assessments to highly detailed and formal organisational audits. Such a system of
assessment can be utilised to standardize BIM implementation and assessment efforts, enable a
structured approach to BIM education and training as well as establish a solid base for a formal BIM
certification process.
After scrutiny of a significant part of the BIM Framework through peer-reviewed publications and a
series of international focus groups, the five components and other related assessment metrics are
currently being extended and field-tested. Sample online tools (focusing on selected disciplines, at
different granularities) are currently being formulated. All these form part of an ongoing effort to
promote the establishment of an independent BIM certification body responsible for assessing and
accrediting individuals, organisations and collaborative project teams. Subject to additional field-
testing and tool calibration, the five components may be well-placed to consistently assess, and by
15
Acknowledgement
This chapter draws on the first-named author’s PhD research at the University of Newcastle, School of
Architecture and Built Environment (Australia). The first-named author wishes to acknowledge his
supervisors Willy Sher, Guillermo Aranda-Mena and Anthony Williams for their continuous support.
References
Ackoff, R. L. (1971). Towards a System of Systems Concepts. Management Science, 17(11),
661-671.
Alison, O., Eugene, A., & Garry, K. (1997). Is an illustration always worth ten thousand words?
Effects of prior knowledge, learning style and multimedia illustrations on text comprehension.
International Journal of Instructional Media, 24(3), 227.
Arif, M., Egbu, C., Alom, O., & Khalfan, M. M. A. (2009). Measuring knowledge retention: a case
study of a construction consultancy in the UAE. Engineering, Construction and Architectural
Management, 16(1), 92-108.
Bew, M., Underwood, J., Wix, J., & Storer, G. (2008). Going BIM in a Commercial World. Paper
presented at the EWork and EBusiness in Architecture, Engineering and Construction: European
Conferences on Product and Process Modeling (ECCPM 2008).
BIMserver. (2011). Open Source Building Information Modelserver. Retrieved October 20, 2011,
from http://bimserver.org/
BIS. (2011). A Report for the Government Construction Client Group, Building Information
Modelling (BIM) Working Party Strategy: Department for Business Innovation & Skills (BIS).
Chun, M., Sohn, K., & Granados, P. (2008). Systems Theory and Knowledge Management Systems:
The Case of Pratt-Whitney Rocketdyne.
Crawford, J. K. (2006). The Project Management Maturity Model. Information Systems Management,
23(4), 50-58.
Crosby, P. B. (1979). Quality is free: The art of making quality certain. New York: New American
Library.
Davis, F. D. (1989). Perceived Usefulness, Perceived Ease of Use, and User Acceptance of
Information Technology. MIS Quarterly, 13(3), 319-340.
Doss, D. A., Chen, I. C. L., & Holland, L. D. (2008). A proposed variation of the capability maturity
model framework among financial management settings. Paper presented at the Allied Academies
International Conference, Tunica.
Eppler, M. J., & Platts, K. W. (2009). Visual Strategizing: The Systematic Use of Visualization in the
Strategic-Planning Process. Long Range Planning, 42(1), 42-74.
Fallon, K. K., & Palmer, M. E. (2007). General Buildings Information Handover Guide: Principles,
Methodology and Case Studies: NIST.
Fox, S., & Hietanen, J. (2007). Interorganizational use of building information models: potential for
automational, informational and transformational effects. Construction Management and Economics,
25(3), 289 - 296.
Froese, T. M. (2010). The impact of emerging information technology on project management for
construction. Automation in Construction, 19(5), 531-538.
Gillies, A., & Howard, J. (2003). Managing change in process and people: combining a maturity
model with a competency-based approach. Total Quality Management & Business Excellence, 14(7),
779 - 787.
Hardgrave, B. C., & Armstrong, D. J. (2005). Software process improvement: it's a journey, not a
destination. Commun. ACM, 48(11), 93-96.
Henderson, R. M., & Clark, K. B. (1990). Architectural Innovation: The Reconfiguration of Existing
Product Technologies and the Failure of Established Firms. Administrative Science Quarterly, 35(1),
9.
Hutchinson, A., & Finnemore, M. (1999). Standardized process improvement for construction
enterprises. Total Quality Management, 10, 576-583.
IU. (2009a). The Indiana University Architect's Office - BIM Design & Construction Requirements,
Follow-Up Seminar (PowerPoint Presentation). 32. Retrieved from
http://www.indiana.edu/~uao/IU%20BIM%20Rollout%20Presentation%209-10-2009.pdf
IU. (2009b). The Indiana University Architect's Office - IU BIM Proficiency Matrix (Multi-tab Excel
Workbook). 9 tabs. Retrieved from
http://www.indiana.edu/~uao/IU%20BIM%20Proficiency%20Matrix.xls
Jaco, R. (2004). Developing an IS/ICT management capability maturity framework. Paper presented at
the Proceedings of the 2004 annual research conference of the South African institute of computer
scientists and information technologists on IT research in developing countries.
Jones, C. (1994). Assessment and control of software risks: Prentice-Hall, New Jersey.
Keller, T., Gerjets, P., Scheiter, K., & Garsoffky, B. (2006). Information visualizations for knowledge
acquisition: The impact of dimensionality and color coding. Computers in Human Behavior, 22(1), 43-
65.
Kong, S. C. W., Li, H., Liang, Y., Hung, T., Anumba, C., & Chen, Z. (2005). Web services enhanced
interoperable construction products catalogue. Automation in Construction, 14(3), 343-352.
Kwak, Y. H., & Ibbs, W. C. (2002). Project Management Process Maturity (PM)2 Model. ASCE,
Journal of Management in Engineering, 18(3), 150-155.
Lainhart IV, J. W. (2000). COBIT™: A Methodology for Managing and Controlling Information and
Information Technology Risks and Vulnerabilities. Journal of Information Systems, 14(s-1), 21-25.
Lockamy III, A., & McCormack, K. (2004). The development of a supply chain management process
maturity model using the concepts of business process orientation. Supply Chain Management: An
International Journal, 9(4), 272-278.
Maxwell, J. A. (2005). Qualitative Research Design: An Interactive Approach: Sage Publications, Inc.
McCormack, K. (2001). Supply Chain Maturity Assessment: A Roadmap for Building the Extended
Supply Chain. Supply Chain Practice, 3, 4-21.
McCormack, K., Ladeira, M. B., & Oliveira, M. P. V. d. (2008). Supply chain maturity and
performance in Brazil. Supply Chain Management: An International Journal, 13(4), 272-282.
McGraw-Hill. (2009). The Business Value of BIM: Getting Building Information Modeling to the
Bottom Line: McGraw-Hill Construction Analytics
Meredith, J. R., Raturi, A., Amoako-Gyampah, K., & Kaplan, B. (1989). Alternative research
paradigms in operations. Journal of Operations Management, 8(4), 297-326.
Michalski, R. S., & Stepp, R. E. (1987). Clustering. In S. S. Shapiro (Ed.), Encyclopedia of Artificial
Intelligence (Vol. 1, pp. 103-111). New York: Wiley.
Mutai, A. (2009). Factors Influencing the Use of Building Information Modeling (BIM) within
Leading Construction Firms in the United States of America. Unpublished Doctor of Philosophy,
Indiana State University, Terre Haute.
NIBS. (2007). National Institute for Building Sciences (NIBS) Facility Information Council (FIC) –
BIM Capability Maturity Model. Retrieved October 11, 2008, from
www.buildingsmartalliance.org/client/assets/files/bsa/BIM_CMM_v1.9.xls
Nightingale, D. J., & Mize, J. H. (2002). Development of a Lean Enterprise Transformation Maturity
Model. Information Knowledge Systems Management, 3(1), 15.
NIST. (2007). National Building Information Modeling Standard - Version 1.0 - Part 1: Overview,
principles and Methodologies: National Institute of Building Sciences.
OGC. (2008). Portfolio, Programme, and Project Management Maturity Model (P3M3): Office of
Government Commerce - England.
OGC. (2009). Information Technology Infrastructure Library (ITIL) - Office of Government
Commerce. Retrieved February 13, 2009, from http://www.itil-officialsite.com/home/home.asp
Onuma. (2011). Onuma Model Server. Retrieved October 20, 2011, from
http://onuma.com/products/BimDataApi.php
Paulk, M. C., Weber, C. V., Garcia, S. M., Chrissis, M. B., & Bush, M. (1993). Key Practices of the
Capability Maturity Model - Version 1.1 (Technical Report): Software Engineering Institute, Carnegie
Mellon University.
Pederiva, A. (2003). The COBIT® Maturity Model in a Vendor Evaluation Case. Information Systems
Control Journal, 3, 26-29.
Sahibudin, S., Sharifi, M., & Ayat, M. (2008). Combining ITIL, COBIT and ISO/IEC 27002 in Order
to Design a Comprehensive IT Framework in Organizations. Paper presented at the Modeling &
Simulation, 2008. AICMS 08. Second Asia International Conference
Sarshar, M., Haigh, R., Finnemore, M., Aouad, G., Barrett, P., Baldry, D., et al. (2000). SPICE: a
business process diagnostics tool for construction projects. Engineering Construction & Architectural
Management, 7(3), 241-250.
Sebastian, R., & Van Berlo, L. (2010). Tool for Benchmarking BIM Performance of Design,
Engineering and Construction Firms in the Netherlands. Architectural Engineering and Design
Management, Special Issue: Integrated Design and Delivery Solutions, 6, 254-263.
SEI. (2006a). Capability Maturity Model Integration for Development (CMMI-DEV), Improving
processes for better products: Software Engineering Institute / Carnegie Mellon.
SEI. (2006b). Capability Maturity Model Integration Standard (CMMI) Appraisal Method for Process
Improvement (SCAMPI) A, Version 1.2 - Method Definition Document: Software Engineering Institute
/ Carnegie Mellon.
SEI. (2006c). CMMI for Development, Improving processes for better products: Software Engineering
Institute / Carnegie Mellon.
SEI. (2008a). Capability Maturity Model Integration - Software Engineering Institute / Carnegie
Mellon. Retrieved October 11, 2008, from http://www.sei.cmu.edu/cmmi/index.html
SEI. (2008b). Capability Maturity Model Integration for Services (CMMI-SVC), Partner and Piloting
Draft, V0.9c: Software Engineering Institute / Carnegie Mellon.
SEI. (2008c). CMMI for Services. Retrieved December 24, 2008, from
http://www.sei.cmu.edu/cmmi/models/CMMI-Services-status.html
SEI. (2008d). People Capability Maturity Model - Version 2, Software Engineering Institute /
Carnegie Mellon. Retrieved October 11, 2008, from http://www.sei.cmu.edu/cmm-p/version2/index.html
Stephens, S. (2001). Supply Chain Operations Reference Model Version 5.0: A New Tool to Improve
Supply Chain Efficiency and Achieve Best Practice. Information Systems Frontiers, 3(4), 471-476.
Succar, B. (2009). Building information modelling framework: A research and delivery foundation for
industry stakeholders. Automation in Construction, 18(3), 357-375.
Succar, B. (2010a). Building Information Modelling Maturity Matrix. In J. Underwood & U. Isikdag
(Eds.), Handbook of Research on Building Information Modelling and Construction Informatics:
Concepts and Technologies: Information Science Reference, IGI Publishing.
Succar, B. (2010b). The Five Components of BIM Performance Measurement. Paper presented at the
CIB World Congress.
Suermann, P. C., Issa, R. R. A., & McCuen, T. L. (2008). Validation of the U.S. National Building
Information Modeling Standard Interactive Capability Maturity Model. Paper presented at the 12th
International Conference on Computing In Civil and Building Engineering, October 16-18.
Taylor, J., & Levitt, R. E. (2005). Inter-organizational Knowledge Flow and Innovation Diffusion in
Project-based Industries. Paper presented at the 38th International Conference on System Sciences,
Hawaii, USA.
Tergan, S. O. (2003). Managing knowledge with computer-based mapping tools. Paper presented at the
ED-Media 2003 World Conference on Educational Multimedia, Hypermedia & Telecommunications,
Honolulu, HI.
TNO. (2010). BIM QuickScan - a TNO initiative (sample QuickScan Report - PDF). 3. Retrieved
from http://www.bimladder.nl/wp-content/uploads/2010/01/voorbeeld-quickscan-pdf.pdf
UKCO. (2011). Government Construction Strategy: United Kingdom Cabinet Office.
Vaidyanathan, K., & Howell, G. (2007). Construction Supply Chain Maturity Model - Conceptual
Framework. Paper presented at the International Group For Lean Construction (IGLC-15).
Van der Heijden, K., & Eden, C. (1998). The Theory and Praxis of Reflective Learning in Strategy
Making. In C. Eden & J.-C. Spender (Eds.), Managerial and Organizational Cognition: Theory,
Methods and Research (pp. 58-75). London: Sage.
Venkatesh, V., & Davis, F. D. (2000). A Theoretical Extension of the Technology Acceptance Model:
Four Longitudinal Field Studies. MANAGEMENT SCIENCE, 46(2), 186-204.
Walker, D. H. T., Bourne, L. M., & Shelley, A. (2008). Influence, stakeholder mapping and
visualization. Construction Management and Economics, 26(6), 645 - 658.
Weinberg, G. M. (1993). Quality software management (Vol. 2): First-order measurement: Dorset
House Publishing Co., Inc. New York, NY, USA.
Widergren, S., Levinson, A., Mater, J., & Drummond, R. (2010, 25-29 July 2010). Smart grid
interoperability maturity model. Paper presented at the Power and Energy Society General Meeting,
2010 IEEE.
Wilkinson, P. (2008). SaaS-based BIM. Extranet Evolution - Construction Collaboration Technologies.
Retrieved from http://www.extranetevolution.com/extranet_evolution/2008/04/saas-based-bim.html
Figures
Figure 2 The interlocking fields of BIM activity (Succar, 2009)
Figure 4 Building Information Modelling Maturity Levels at BIM Stage 1
Figure 6 BIM Competency Sets v1.1 – shown at Granularity Level 2
Figure 7 Technology Competency Areas at Granularity Level 4 – partial mind map v1.1
Figure 8 BIM Capability and Maturity Assessment and Reporting Workflow Diagram - v2.0
Tables
Table 1 Maturity Models influencing the BIM Maturity Index
BIM Proficiency Matrix – Indiana University (IU) Architect’s Office (IU, 2009a, 2009b)
The 5 Proficiency Levels (or BIM Standards) are: ‘Working towards BIM’ – the lowest standard,
‘Certified BIM’, ‘Silver’, ‘Gold’ and ‘Ideal’ – the highest BIM Maturity Standard.
BIM QuickScan – TNO Built Environment and Geosciences
The BIM QuickScan tool aims to “serve as a standard BIM benchmarking
instrument in the Netherlands”. The scan is intended to be performed “in a limited
time of maximum one day”(Sebastian & Van Berlo, 2010 p. 255 & 258).
The BIM QuickScan Tool is organized around 4 chapters: Organization and
Management, Mentality and Culture, Information Structure and Information Flow,
and Tools and Applications. “Each chapter contains a number of KPIs in the form
of a multiple-choice questionnaire…With each KPI, there are a number of possible
answers. For each answer, a score is assigned. Each KPI also carries a certain
weighting factor. The sum of all the partial scores after considering the weighting factors represents
the total score of BIM performance of an organization” (Sebastian & Van Berlo, 2010 p. 258 & 259).
Source: score representation (by category) from the sample BIM QuickScan report (TNO, 2010).
KPIs are assessed against a percentile score while ‘Chapters’, representing a
collation of KPIs, are assessed against a 5-level system (0 to 4).
COBIT, Control Objects for Information and related Technology –
Information Systems Audit and Control Association (ISACA) and the IT
Governance Institute (ITGI)
The main objective of COBIT is to “enable the development of clear policy and
good practice for IT control throughout organizations” (Lainhart IV, 2000 p. 22).
The COBIT Maturity Model is “an IT governance tool used to measure how well
developed the management processes are with respect to internal controls. The
maturity model allows an organization to grade itself from non-existent (0) to
optimized (5)” (Pederiva, 2003 p. 1). COBIT includes 6 Maturity Levels (Non-existent, Initial/ad hoc,
Repeatable but Intuitive, Defined Process, Managed and Measurable, and Optimised), 4 Domains and
34 Control Objectives.
(Lainhart IV, 2000)
Note: There is some alignment between ITIL (OGC, 2009) and COBIT with respect to IT governance
within organisations (Sahibudin, Sharifi, & Ayat, 2008), which may be of value to BIM implementation efforts.
CMMI, Capability Maturity Model Integration - Software Engineering Institute / Carnegie Mellon
Capability Maturity Model® Integration (CMMI) is a process improvement
approach that helps integrate traditionally separate organizational functions, set
process improvement goals and priorities, provide guidance for quality processes,
and provide a point of reference for appraising current processes (SEI, 2006b,
2006c, 2008a, 2008b, 2008c).
CMMI has 5 Maturity Levels (for Staged Representation, 6 Capability Levels for
Continuous Representation), 16 core Process Areas (22 for CMMI-DEV and 24 for
CMMI-SVC) and 1 to 4 Goals for each Process Area.
The 5 Maturity Levels are: Initial, Managed, Defined, Quantitatively Managed and Optimising.
Source: NASA, Software Engineering Process Group, http://bit.ly/CMMI-NASA
CSCMM, Construction Supply Chain Maturity Model
“Construction supply chain management (CSCM) refers to the management of
information, flow, and money in the development of a construction project” as
mentioned in (Vaidyanathan & Howell, 2007 p. 170).
iBIM – UK Government Construction Strategy BIM maturity model (BIS, 2011)
The iBIM model identifies specific capability targets (not performance milestones) for the UK
Construction Industry covering technology, standards, guides, classifications and delivery (total
number of topics not defined). Targets for each topic are organised under one or more loosely defined
Maturity Levels (0-3).
I-CMM, Interactive Capability Maturity Model - National Institute for
Building Sciences (NIBS) Facility Information Council (FIC)
This I-CMM is closely coupled with the NBIMS effort (Version 1, Part 1) and
establishes “a tool to determine the level of maturity of an individual BIM as
measured against a set of weighted criteria agreed to be desirable in a Building
Information Model” (Suermann, et al., 2008 p. 2) (NIST, 2007) (NIBS, 2007).
LESAT, Lean Enterprise Self-Assessment Tool
LESAT is focused on “assessing the degree of maturity of an enterprise in its use
of ‘lean’ principles and practices to achieve the best value for the enterprise and its
stakeholders” (Nightingale & Mize, 2002 p. 17).
LESAT has 54 Lean Practices organised within three Assessment Sections: Lean
Transformation/ Leadership, Life Cycle Processes and Enabling Infrastructure and
5 Maturity Levels: Some Awareness/Sporadic, General Awareness/Informal,
Systemic Approach, Ongoing Refinement and Exceptional/Innovative.
P3M3, Portfolio, Programme and Project Management Maturity Model – Office of Government Commerce (OGC, 2008)
The P3M3 has 5 Maturity Levels: Awareness, Repeatable, Defined, Managed and Optimised.
P-CMM®, People Capability Maturity Model – Software Engineering Institute / Carnegie Mellon (SEI, 2008d)
The P-CMM has 5 Maturity Levels: Initial, Managed, Defined, Predictable and Optimising.
(PM)², Project Management Process Maturity Model (Kwak & Ibbs, 2002)
The (PM)² has 5 Maturity Levels: Initial, Planned, Managed at Project Level, Managed at Corporate Level and Continuous Learning.
Supply Chain Management Process Maturity Model and Business Process
Orientation (BPO) Maturity Model
The model conceptualizes the relation between process maturity and supply chain
operations as based on the Supply-chain Operations Reference Model (Stephens,
2001). The model’s maturity describes the “progression of activities toward
effective SCM and process maturity. Each level contains characteristics associated
with process maturity such as predictability, capability, control, effectiveness and
efficiency" (Lockamy III & McCormack, 2004 p. 275; K. McCormack, 2001).
The 5 Maturity Levels are: Ad-hoc, Defined, Linked, Integrated and Extended.
(Lockamy III & McCormack, 2004)
Other maturity models – or variation on listed maturity models - include those on Software Process Improvement
(Hardgrave & Armstrong, 2005), IS/ICT Management Capability (Jaco, 2004), Interoperability (Widergren,
Levinson, Mater, & Drummond, 2010), Project Management (Crawford, 2006), Competency (Gillies & Howard, 2003)
and Financial Management (Doss, Chen, & Holland, 2008).
Table 2 A non-exhaustive list of terminology used by CMMs to denote maturity levels including those
used by the BIM Maturity Index
Maturity Model: Level 0 | Level 1 (a) | Level 2 (b) | Level 3 (c) | Level 4 (d) | Level 5 (e)
BIM Maturity Index: – | Initial/Ad-hoc | Defined | Managed | Integrated | Optimised
COBIT, Control Objects for Information and related Technology: Non-existent | Initial/Ad-hoc | Repeatable but Intuitive | Defined Process | Managed & Measurable | Optimised
CMMI, Capability Maturity Model Integration (Staged Representation): – | Initial | Managed | Defined | Quantitatively Managed | Optimising
CMMI (Continuous Representation): Incomplete | Performed | Managed | Defined | Quantitatively Managed | Optimising
CSCMM, Construction Supply Chain Maturity Model: – | Ad-hoc | Defined | Managed | Controlled | N/A
LESAT, Lean Enterprise Self-Assessment Tool: – | Awareness/Sporadic | General Awareness/Informal | Systemic Approach | Ongoing Refinement | Exceptional/Innovative
P-CMM®, People Capability Maturity Model: – | Initial | Managed | Defined | Predictable | Optimising
P3M3, Portfolio, Programme and Project Management Maturity Model: – | Awareness | Repeatable | Defined | Managed | Optimised
(PM)², Project Management Process Maturity Model: – | Ad-hoc | Planned | Managed at Project Level | Managed at Corporate Level | Continuous Learning
SPICE, Standardised Process Improvement for Construction Enterprises: – | Initial/Chaotic | Planned & Tracked | Well Defined | Quantitatively Controlled | Continuously Improving
Supply Chain Management Process Maturity Model: – | Ad-hoc | Defined | Linked | Integrated | Extended
Table 3 Organisational Scales
Table 4 BIM Competency Granularity Levels v2.1
GLevel 1 – Discovery: A low-detail assessment used for basic and semi-formal discovery of BIM
Capability and Maturity. Discovery assessments yield a basic numerical score. OScale applicability:
All Scales. Assessment by: Self. Report type: Discovery Notes. Guide: BIMC&M Discovery Guide.
GLevel 2 – Evaluation: A more detailed assessment of BIM Capability and Maturity. Evaluation
assessments yield a detailed numerical score. OScale applicability: All Scales. Assessment by: Self
and Peer. Report type: Evaluation Sheets. Guide: BIMC&M Evaluation Guide.
GLevel 3 – Certification: A highly-detailed appraisal of those Competency Areas applicable across
disciplines, markets and sectors. Certification appraisal is used for Structured (Staged) Capability and
Maturity and yields a formal, Named Maturity Level. OScale applicability: 8 and 9. Assessment by:
External Consultant. Report type: Certificate. Guide: BIMC&M Certification Guide.
GLevel 4 – Auditing: Auditing is the most comprehensive appraisal type. In addition to competencies
covered under Certification, Auditing appraises detailed Competency Areas including those specific to
a market, discipline or a sector. Audits are highly customisable, suitable for Non-structured
(Continuous) Capability and Maturity and yield a Named Maturity Level plus a Numerical Maturity
Score for each Competency Area audited. OScale applicability: 8, 9, 10 & 11. Assessment by: Self,
Peer and External Consultant. Report type: Audit Report. Guide: BIMC&M Auditing Guide.