Using an Idea Evaluation Canvas for Innovation Experimentation
Refreshed 13 July 2022, Published 19 March 2021 - ID G00734229 - 10 min read
FOUNDATIONAL This research is reviewed periodically for accuracy.
By Analyst(s): Owen Chen, Gunjita Mundeja, Peter Skyttegaard
Initiatives: CIO Leadership of Innovation, Disruptive Trends and Emerging Practices
A one-page idea evaluation canvas provides structure for the
evaluation stage of an innovation process. It emphasizes short,
focused experimentation CIOs can use to reduce the risks and
uncertainties associated with an idea or opportunity.
Overview
Key Findings
■ Investigating uncertainty and risk is a key part of the innovation process, yet many
evaluation efforts are unfocused and lead to wasted investment.
■ Too many innovation projects focus on investigating a technology, rather than on
understanding the potential business or user impact of an opportunity.
■ Once innovation projects start, it can be challenging to “kill” them in the absence of
prespecified success criteria.
Recommendations
CIOs and other leaders involved in innovation, disruptive trends and emerging practices should:
■ Promote the use of a one-page idea evaluation canvas by showing how it can
capture the risks and benefits of an idea or opportunity and highlight areas where
additional information or experimentation is needed.
■ Focus evaluation efforts on proving or disproving a hypothesis about the potential
business value of an innovation by funding a cycle of “hypothesize and test”
experimentation.
■ Target areas of highest risk or uncertainty first by asking questions relating to
desirability, feasibility and viability, and by focusing on the “minimum viable
experiment” that will obtain the information needed to make a decision.
Introduction
A one-page canvas with predefined sections is a useful and popular mechanism for summarizing an innovation or new business opportunity.¹ Such a canvas is valuable because it:
■ Provides an easily digestible summary of critical information relating to the idea or
opportunity.
■ Highlights areas where more information is needed to make a decision on whether to
progress the opportunity.
The Gartner Idea Evaluation Canvas (see Figure 1) adapts the canvas concept for evaluating any innovation (not just new products and services), such as internal processes, infrastructure or management practices. It also combines the canvas approach with the three key characteristics used by design company IDEO² in exploring ideas:
■ Desirability — Will people want this product, service or improvement?
■ Feasibility — Can we make this idea work in practice?
■ Viability — Is there business value in delivering this idea?
Download the Idea Evaluation Canvas Template
Figure 1: Idea Evaluation Canvas Template
Each box in the canvas represents a category of risk or uncertainty that can be used to
guide the evaluation phase of the innovation process. Rather than invest in a “prototype”
or a “proof of concept,” evaluation should proceed as a series of targeted experiments,
each of which addresses a specific category of risk or uncertainty and any of which could
result in a decision not to move forward.
The canvas in Figure 1 provides a starting point, with common categories of risk and
uncertainty. You may choose to replace some of the categories with others that better
reflect the nature of your ideas and opportunities. For example, Figure 1 includes a
technology performance category that could be replaced by a different risk factor if
technology doesn’t play a role in the solution. For additional examples of factors that
could form alternative categories in the canvas, see Assessing Emerging Technology
Readiness, Innovation Idea Selection — Choosing for Success, and The Art of the
Innovation Workshop.
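To make this structure concrete, the sketch below (in Python) shows one way the canvas categories could be represented so that notes and open risks are tracked per box. The category names mirror Figure 1 and the question groups later in this note; the class and field names are illustrative assumptions, not part of the Gartner template.

from dataclasses import dataclass, field

@dataclass
class CanvasCategory:
    """One box on the idea evaluation canvas."""
    name: str
    lens: str                 # "desirability", "feasibility" or "viability"
    notes: str = ""           # current state of knowledge for this category
    open_risks: list = field(default_factory=list)  # gaps needing experiments

def default_canvas() -> list:
    """Common categories from Figure 1; replace or rename categories to
    better reflect the nature of your own ideas and opportunities."""
    return [
        CanvasCategory("User/Customer Benefit and Impact", "desirability"),
        CanvasCategory("Organizational Benefit and Impact", "desirability"),
        CanvasCategory("Societal Benefit and Impact", "desirability"),
        CanvasCategory("Enablers and Accelerators", "feasibility"),
        CanvasCategory("Obstacles and Inhibitors", "feasibility"),
        CanvasCategory("Integration Impact", "feasibility"),
        CanvasCategory("Technology Performance", "feasibility"),
        CanvasCategory("Commercial Status and Partnerships", "feasibility"),
        CanvasCategory("Cost and Timing", "viability"),
        CanvasCategory("Success Metrics", "viability"),
    ]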
Analysis
Use the Canvas to Guide Hypothesize-and-Test Cycles for Areas of Highest
Risk
Create an idea evaluation canvas for a specific idea or innovation proposal. This can arise
from the work of an innovation or emerging technology team, or from open innovation
activities such as hackathons, innovation challenges, shark tanks or workshops. The
canvas may be used as an idea proposal form or introduced at the point where an idea
has been selected for further evaluation. The level of detail may range from a scratch pad
for current thinking to a repository of findings and results collected over time. For more
information on how a canvas can fit within the broader innovation process, see Executing
on Innovation: Design the Process From Idea to Value.
To use the idea evaluation canvas template, take the following steps:
1. Create a value hypothesis for the idea. Turn the idea or proposal into a hypothesis
relating to the value of the idea, as in the following examples:
■ Chatbots will allow us to automate common help desk questions without
adversely impacting caller experience.
■ This new approach to testing will allow us to shorten development cycles.
Note that, although the hypothesis emphasizes the potential value of the idea, it isn't the same as a full business case, which may be developed as part of the evaluation process.
2. Document your current state of knowledge. Complete each box in the canvas to
reflect the current state of knowledge about that category (e.g., user benefit, potential
obstacles or technology performance). At this stage, it is perfectly acceptable for the
canvas to contain only outline information in each box, as the purpose of the
canvas is to highlight the areas that need further investigation.
3. Identify the area of highest risk. Based on the information in the canvas, determine
the area of highest risk or uncertainty, and use this to guide the order of
experimentation. This is a key difference from traditional evaluation phasing, which
typically progresses by scaling up the level of effort and investment through
proposal, prototype, proof of concept and pilot with real users to scale-up and
delivery. The lean startup approach emphasizes tackling the highest risk or uncertainty first, then progressing to the next highest. In this way, a potential idea-killer is less likely to surface late in the game, after substantial investment has already been made. In particular, user or customer acceptance is often an area of high uncertainty that needs to be investigated much earlier than a full pilot installation would allow.
4. Create a “minimum viable experiment” for the risk area. Start with a hypothesis specific to the identified risk or uncertainty that includes a metric indicating success or failure, such as “80% of callers will be satisfied with an answer from a chatbot” (a simple check against such a metric is sketched after this list). Then, design an experiment to prove or disprove the hypothesis, expending as few resources as possible. In the chatbot example, a minimum viable experiment might involve a call center agent pretending to be a chatbot, with no actual chatbot technology involved. The experiment could measure callers’ satisfaction when they think they are talking to a chatbot, even though they are actually conversing with a human.
5. Assess the results of the experiment and decide whether to proceed. Update the
canvas with the experimental findings. If the experiment proves the hypothesis to be
true, progress to the next highest category of risk or uncertainty and design the next
minimum viable experiment. If the results disprove the hypothesis, examine and document what would have had to be different for the hypothesis to be proven. For
example, an ROI investigation may determine that the cost of a technology is too
high, but an important learning might be “when the price falls to x, we should
reexamine this idea.” Or, the experimental results may lead to a rethinking or
reformulation of the value hypothesis. For example, “If people were calling in to
complain, they didn’t like the chatbot, but if they were calling in for information, they
were fine. So, information access will be our use case.”
6. Keep experimenting until you can make a decision. While an idea will never be risk
free, once the main categories of risk and uncertainty have been investigated, the
organization will be in a stronger position to decide whether, when and how much to
invest in an idea. Sometimes, as with the minimum viable product approach, the
experiments migrate into the market or deployment phase, with continued learnings
and adjustments based on risks and uncertainties that couldn’t be tested cost-
effectively before launch.
7. Evaluate the effectiveness of the approach and the canvas categories. If you find that you are repeatedly replacing or modifying the same category when determining the risks and uncertainties for your innovations, create a new version of the canvas that more accurately reflects the experimentation you tend to need.
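Steps 4 and 5 amount to a pass/fail check of experimental results against the prespecified metric. The minimal sketch below, in Python, illustrates this for the chatbot example; the 80% threshold comes from the example in step 4, while the function name and sample data are hypothetical.

def evaluate_experiment(satisfaction_flags, threshold=0.80):
    """Decide whether the hypothesis '80% of callers will be satisfied with
    an answer from a chatbot' is supported by the experiment results."""
    if not satisfaction_flags:
        return "no data - rerun the experiment"
    satisfied_share = sum(satisfaction_flags) / len(satisfaction_flags)
    if satisfied_share >= threshold:
        # Hypothesis supported: move on to the next highest risk category.
        return f"proceed ({satisfied_share:.0%} satisfied)"
    # Hypothesis not supported: document what would have had to be different.
    return f"stop and capture learnings ({satisfied_share:.0%} satisfied)"

# Hypothetical result: 42 of 50 callers satisfied by the agent-as-chatbot test.
print(evaluate_experiment([True] * 42 + [False] * 8))  # proceed (84% satisfied)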
See Figure 2 for an example of a series of minimum viable experiments that prove a
hypothesis regarding chatbots in the call center.
Figure 2: Innovation as Hypothesize and Test
Ask Desirability, Feasibility and Viability Questions to Focus
Experimentation
To decide whether to commit to an idea, the following common questions typically need to be answered. We’ve organized the question categories into three groups: desirability, feasibility and viability. When surfacing an idea or
opportunity, you may be able to answer some questions immediately, while other answers
may be unknown or incomplete. Gaps and uncertainties represent areas that need to be
explored or risks that should be mitigated through Steps 2, 3, 4 and 5 of the
experimentation process described above. This list of questions is not exhaustive but
should be used as a starting point to identify the information that will be needed during
the evaluation and decision process.
Desirability Categories
User/Customer Benefit and Impact
■ What value does this deliver to the customer/user (e.g., speed, convenience, social
status, cost reduction, accessibility, personalization)?
■ What problem is being solved for the customer/user?
Organizational Benefit and Impact
■ What value does this deliver to the organization (e.g., productivity, efficiency, speed,
cost reduction, risk reduction, brand enhancement, new products or services, new
revenue streams, higher customer satisfaction)? See The Gartner Business Model
Innovation Framework: A Tool for Deciphering High-Impact Digital Initiatives.
■ What problem is being solved for the organization?
■ How broad will the impact be for the organization (specific business unit, whole
organization, industrywide)?
■ Is the potential benefit aligned with the strategic direction and goals of the
organization?
Societal Benefit and Impact
■ If this idea is successfully delivered and adopted, what positive social implications
(e.g., energy efficiency, privacy and security, environmental impact) will it have?
■ What are the potentially negative implications that are counter to organizational
values, or to the values of the target audiences?
See also The Gartner Digital Business Value Model: A Framework for Measuring Business
Performance for more detailed information on categories of benefit as shown in Figure 3.
Figure 3: The Gartner Digital Business Value Model
Feasibility Categories
Enablers and Accelerators
■ What factors will help this idea succeed (e.g., existing capabilities, executive support,
prior experience, alignment with strategy and goals)?
Obstacles and Inhibitors
■ What factors will inhibit adoption (e.g., lack of internal support, high levels of risk or
internal disruption, design complexity, incompatibility with existing approaches, lack
of skills, employee resistance)?
Integration Impact
■ How easily will the capability integrate with existing technology infrastructure, data
and networks?
■ How easily will the capability integrate with current processes?
■ Are there implications for security, compliance, standards or similar factors?
Technology Performance
■ Does the technology perform to the level needed to drive value (i.e., is it accurate enough, reliable enough and fast enough)?
■ Is the performance predictable and repeatable?
Commercial Status and Partnerships
■ How mature is the market for the technology or capability? Are there commercially
viable vendors?
■ Does the organization already have relationships with external partners that will
support or co-develop the idea?
■ Are there any intellectual property benefits or issues?
Viability Categories
Cost and Timing
■ How much talent, time and financial resource will be required to remove the
uncertainty around the idea and create a full business case?
■ What is the approximate projected investment required for development and
delivery?
■ What is the projected time frame until the idea delivers value (to the user/customer,
to the organization, and to society)?
■ Are the projected costs and time frame well-aligned to the projected value?
Success Metrics
■ What is the approximate projected value, using the normal organizational metrics
(e.g., return on investment, payback period)?
Note that cost, timing and success metrics will be very approximate at the early stages of an idea, and become better formed through targeted experimentation, investigation and analysis.
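As an illustration of the rough viability arithmetic these questions call for, the sketch below computes a simple return on investment and payback period. The figures and function names are hypothetical assumptions; real estimates will be refined through the experimentation described above.

def simple_roi(annual_benefit, total_cost):
    """First-year return on investment as (benefit - cost) / cost."""
    return (annual_benefit - total_cost) / total_cost

def payback_period_years(total_cost, annual_benefit):
    """Years until cumulative annual benefit covers the initial cost."""
    return total_cost / annual_benefit

# Hypothetical chatbot example: $200,000 to deliver, $120,000 saved per year.
print(f"First-year ROI: {simple_roi(120_000, 200_000):.0%}")                   # -40%
print(f"Payback period: {payback_period_years(200_000, 120_000):.1f} years")   # 1.7 years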
Evidence
¹ A. Osterwalder and Y. Pigneur, Business Model Generation, John Wiley & Sons, 2010. The Gartner Idea Evaluation Canvas is adapted from the Business Model Canvas presented by Osterwalder and Pigneur. Their Business Model Canvas contained nine sections for new product information, including value proposition, cost structure, revenue streams and key partners.

Why Lean Canvas vs Business Model Canvas?, Lean Stack. Ash Maurya created a Lean Canvas version of the Business Model Canvas as part of his work with lean startup principles, which emphasized areas of high risk or uncertainty for entrepreneurs.

² Three Lenses of Innovation, Isaac Jeffries. In this 2016 blog post, Isaac Jeffries noted that the sections of the original Business Model Canvas can be characterized under the three IDEO lenses of feasibility, desirability and viability.

How to Prototype a New Business, IDEO.
Recommended by the Authors
The Gartner Digital Business Value Model: A Framework for Measuring Business
Performance
Innovation Idea Selection — Choosing for Success
Three Questions About Innovation Every Leader Should Ask
Executing on Innovation: Design the Process From Idea to Value
To Innovate More, Define Failure, Not Success
The Art of the Innovation Workshop
Assessing Emerging Technology Adoption Readiness
Jump-Start Your Innovation Journey With a Customizable Innovation Framework
Create a Research Engagement Plan to Advance Your Innovation Programs, Processes
and Culture
Turn Post-COVID-19 Uncertainties Into Strategic Options