
Healthcare Analytics Workshop, Nov. 11, 2013

Rosemary Kennedy, PhD, RN, MBA, FAAN
Associate Professor and Associate Dean of Strategic Initiatives, Jefferson School of Nursing
25 years health IT implementation and development experience
Chief Nursing Informatics Officer for Siemens
Data representation within EHRs for clinical decision support
Quality measurement experience, National Quality Forum (NQF)
Care coordination research related to data and plans of care

Floyd Eisenberg, MD, MPH, FACP
President, iParsimony, LLC
Physician with specialty in Infectious Diseases; practiced in Norristown, PA
Data and measurement experience at Independence Blue Cross, Siemens Medical Solutions, and the National Quality Forum
Current work with:
CDC on immunization registries
NLM on data requirements for functional status
ONC for Meaningful Use Stage 3
Pharmaceutical company on clinical decision support in EHRs

Caterina E.M. Lasome, PhD, MBA, RN, CPHIMS
President & CEO, iON Informatics
Registered Nurse 25+ years
Army Nurse 23 years
Formally prepared informaticist (Master's and PhD) 19 years
COO, National Cancer Institute's Center for Biomedical Informatics and IT
Led major Department of Defense and NATO-level EHR development and deployment initiatives
Clinical leader, enterprise analytics

Dana Womack, RN, MSN
Registered Nurse 20 years
Master's-prepared informaticist 13 years
Led the launch of multiple health innovation initiatives
Led the development and design of analytics dashboards for clinical use

Healthcare Analytics Overview: 08:30-09:30
Building a Data Foundation for Analytics: 09:45-10:45
Roadmap to Analytics and Human Resource Considerations: 10:45-12:00
Hands-On Case Study: 1:00-2:00
Group Report Out with Shared Insights & Best Practices: 2:30-4:00

Understand analytics fundamentals
Identify challenges & best practices
Learn about others' successes
Understand key clinical & technical building blocks
Be prepared to apply knowledge in daily practice

This session will help you develop a foundation upon which to build a successful clinical analytics or business intelligence initiative. Participating in a real-world case study will help you internalize the steps involved in transforming data into actionable insight.

Section 1: Health Analytics Overview
Analytics Definitions & Importance
Environmental Drivers
Analytics Exemplars
Challenges
Common Success Characteristics


Definitions:

Use of data collection & analysis to optimize decisions that result in improved care delivery [1]

Process of reviewing large amounts of raw data to identify patterns or trends that will help organizations better understand behavior and outcomes [2]

1. Davenport, et al. Analytics at Work: Smarter Decisions, Better Results. Cambridge, MA: Harvard Business Review Press; 2010.
2. Murphy, L, Wilson, M, & Newhouse, R. Evidence and the Executive. JONA, Volume 43, Number 7/8, pp. 367-370.

Electronic Health Record → increased data availability → continuous quality improvement

Clinical analytics must become a pervasive activity to achieve the global vision of timely, effective, equitable & excellent care.

Limited EHR-based capabilities to organize and measure clinical, patient safety, cost and patient satisfaction outcomes, and otherwise generate the analytics required to improve care
Lack of ability to extract, aggregate & integrate demographic, clinical, financial, administrative, patient experience and other relevant data
Accuracy of data entered by humans
Fragmentation of data across different facilities: data may not be stored in the same place or manner; significant time investment required to map data from multiple sites and sources
Data can lose granularity as it is shared among different systems, losing the value of the information
Lack of formalized processes for translating narrative care guidelines into computable clinical decision support (CDS) rules
Lack of consistent definitions for clinical terms such as "premature labor"
Not yet able to easily share CDS across EHR systems or update CDS as easily as computer virus definitions

Business Intelligence*: encompasses the processes and technologies used to obtain timely & valuable insights into business & clinical data. BI levels:
Descriptive: can be accomplished via ad hoc reporting
Predictive and Prescriptive: require simulation, forecasting, predictive modeling, and visual representation

Secondary Use: uses of clinical data for purposes other than direct patient care

Clinical Decision Support**: provides clinicians, staff, patients or other individuals with knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health and health care.

* Adams, J, Klein, J. Business Intelligence and Analytics in Health Care: A Primer. The Advisory Board Company, 2011.
** HealthIT.gov: http://www.healthit.gov/policy-researchers-implementers/clinical-decision-support-cds

Data Model*: A representation of a real-world situation about which data is to be collected and stored in a database. A data model depicts the dataflow and logical interrelationships among different data elements.

Data Warehouse**: A database used for reporting and data analysis. It is a central repository of data created by integrating data from one or more disparate sources.

Data Mart**: A small data warehouse focused on a specific area of interest. Data warehouses can be subdivided into data marts for improved performance and ease of use within that area.

* http://www.businessdictionary.com/definition/data-model.html
** https://en.wikipedia.org/wiki/Data_warehouse
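To make the warehouse/mart distinction concrete, here is a minimal sketch in Python (pandas); the table and column names are hypothetical, not from any particular system.

```python
# Minimal sketch (pandas) of carving a focused data mart out of a broader
# warehouse extract. Table and column names are hypothetical.
import pandas as pd

# Pretend this frame was pulled from the enterprise data warehouse.
warehouse = pd.DataFrame({
    "encounter_id": [1, 2, 3, 4],
    "service_line": ["Cardiology", "Orthopedics", "Cardiology", "Oncology"],
    "charges": [12500.0, 9800.0, 15200.0, 22100.0],
    "readmitted_30d": [True, False, False, True],
})

# A "cardiology data mart": a smaller, subject-focused slice of the warehouse.
cardiology_mart = warehouse[warehouse["service_line"] == "Cardiology"]

# Summary-level reporting against the mart.
print(cardiology_mart.agg({"charges": "mean", "readmitted_30d": "mean"}))
```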

Analytics can create actionable insight that can improve care quality, safety, efficiency & cost.
Analysis of massive amounts of data should support:
Earlier detection and more effective treatments
Better targeted clinical decision support
Real-time biosurveillance
Streamlined operations
Actual & predictive clinical & financial performance/outcomes

A survey of 40 hospitals and 30 insurers found that respondents' top goals for analytics were:
Identifying at-risk patients (66%)
Tracking clinical outcomes (64%)
Performance measurement and management (64%)
Clinical decision making at the point of care (57%)

Note: percentages represent the respondents' aspirations rather than current capabilities.
Source: Terry, K. Healthcare Organizations Go Big for Analytics. InformationWeek, March 19, 2013.

The analytics cycle: set analytics goals & strategy → collect data → analyze data → make the data actionable → measure outcomes → evaluate strategy, set new goals.

Parallel improvement cycles: Quality (Plan → Do → Check → Act) and Care Delivery (Assess → Plan/Diagnose → Intervene → Evaluate).

Area: Clinical Analytics
Examples: readmission risk; personalized healthcare; clinical protocol adherence
Point of Care Impact: Analysis of clinical data enables detection of phenomena of interest (readmission risk, early sepsis), which can subsequently trigger clinical decision support based on the relevant evidence base created through research (see the sketch after this table).

Area: Operational Analytics
Examples: tracking surgery start times; tracking ER wait and holding times; staffing based on patient demand & staff qualifications
Point of Care Impact: Monitoring key performance indicators (KPIs) over time supports care improvement efforts; data-driven staffing recommendations can improve safety.

Area: Financial Analytics
Examples: profitability by case and service line; determining capacity utilization; justification of equipment investments
Point of Care Impact: Financial viability of the organization is critical to adequate infrastructure, staffing and organizational health.

Area: Research
Examples: evidence generation for evidence-based clinical practice
Point of Care Impact: Research is made actionable at the bedside via development of care guidelines, quality measures, and clinical decision support.
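As a toy illustration of the clinical-analytics row above, the sketch below flags readmission risk with a simple made-up rule; the fields and thresholds are assumptions, not a validated risk model.

```python
# Illustrative sketch only: a toy rule-based readmission-risk flag of the kind
# that could feed clinical decision support. Thresholds and fields are
# hypothetical, not a validated risk model.
def readmission_risk_flag(patient):
    score = 0
    if patient.get("prior_admissions_12mo", 0) >= 2:
        score += 2
    if patient.get("active_diagnoses", 0) >= 5:
        score += 1
    if not patient.get("has_follow_up_appointment", False):
        score += 1
    return "HIGH" if score >= 3 else "ROUTINE"

print(readmission_risk_flag(
    {"prior_admissions_12mo": 3, "active_diagnoses": 6,
     "has_follow_up_appointment": False}))  # -> HIGH
```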

The DELTA-Powered Analytics Assessment™ measures organizations' effectiveness in using data to shape strategic decision making.[1]
Distribution of 850 representative healthcare organizations in 2011[2]:
Level 4: 3-5%
Level 3: 15-20%
Level 2: 25-30%
Level 1: 50%

1. https://www.himssanalytics.org/emram/delta.aspx
2. http://healthcare-executive-insight.advanceweb.com/Features/Articles/The-Healthcare-Analytics-Maturity-Framework.aspx

[Chart: increased spending and an increasing percentage of elderly; increased spending does not ensure outcomes]

Modified from: Miller, H. From Volume to Value: Better Ways to Pay for Health Care. Health Affairs, September/October 2009.


External Drivers
Meaningful Use
Pay for Performance
Patient Centered Medical Home
New Payment Factors and Bundled Episodes
Accountable Care Organizations
Health Information Exchanges

Internal Drivers
Cost Reduction Initiatives
  Most hospitals in NJ and PA are running at 3% below the national average
  1/5 of PA hospitals are running negative margins
Patient Safety and Quality Improvement Programs
Competitive Positioning

Same common denominator: Healthcare Analytics
Report data (both internally and externally)
Analyze data for performance improvement
Use the data to adjust workflow based on trends

Data-driven analysis:
Post-hospital discharge information is combined with a predictive analytics tool to identify priority patients for case management enrollment.

Results:
15%-18% decrease in admissions
22% decrease in readmissions
72% of patients perceive improved quality of care
ED visits remain flat while unmanaged patients' ED visits increase

Data-driven analysis:
Analytics employed to identify top causes of harm overall and at individual points on the continuum of care.

Results:
In the first 6 months (Jan-June 2008):
overall harm dropped 32%,
infection-related harm dropped 41%,
medication-related harm dropped 43%,
falls dropped 13%,
pressure ulcer harm dropped 21%.
From 2009 to 2010, HFHS calculated that total costs for harm were reduced from $38.6M to $34.5M, a drop of >$4M, or $85 per patient.

Data-driven analysis:
Brigham and Women's implemented a Balanced Scorecard that enables them to track changes in performance on a number of indicators. The scorecard takes data feeds from approximately 80 different data sources.

Results:
Press Ganey survey scores rose dramatically, with several departments moving from the 50th to the 90th percentile.
The scorecard has evolved from performance measurement to performance management to strategic management.
Many operational metrics are monitored, and with shifting payment models, they analyze margins by case and by payer.

Data-driven analysis:
Children's of Omaha has fostered a significant cultural revolution and has deployed dashboards to infuse clinical, financial and operational activities with a passion for data-driven performance improvement.

Results:
Reduced medication & prescribing errors, increased screenings, reduced ED wait times
5% reduction in the organization's overall budget

Hagland, M. The 2011 Healthcare Informatics Innovator Awards: Honoring Leaders in Healthcare IT Innovation. Healthcare Informatics. Feb. 4, 2011. http://www.healthcare-informatics.com/article/2011-healthcare-informatics-innovator-awards-honoring-leaders-healthcare-it-innovation. Accessed Sept. 2013.

Leaders have taken personal risks to propel their organizations forward toward a set of clearly defined strategic goals that span quality, safety, efficiency, and effectiveness
Multidisciplinary teams drive organized, measurable improvements in care delivery and operations
Clinicians are engaged & activated via actionable data that enables direct bottom-up improvement
Data is integrated from multiple sources for aggregate, electronic analysis
Analysis occurs in near-real-time based on naturally occurring data rather than manual chart reviews

Morning Break

Start 0930 End 0945

Section 2: Building a Data Foundation for Analytics
Data Architecture and Standards
Data Governance

Clinical Analytics and Business Intelligence Goal:
To aggregate the most granular level of data up into summary-level data
To improve the ability to use the data for business management, reporting, and performance improvement

Data come from multiple sources, not just the EHR.
Data have attributes (metadata) that differ across the different systems.

Systems that generate data:
Electronic Health Record
Pharmacy
Radiology PACS
Billing
Quality tracking systems
Time & Attendance
And more

A warehouse that normalizes data allows you to query the data to answer questions such as:
What medications and dosages are most frequently given in error?
  With significant adverse reaction
  Without significant adverse reaction
How many hours into a staff person's shift do errors most frequently occur?
Do units with high error rates have low scores on other quality measures?
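A minimal Python (pandas) sketch of how such questions might be answered once the data are normalized in a warehouse; the med_errors table and its columns are hypothetical.

```python
# A minimal sketch of the kinds of queries a normalized warehouse supports.
# The med_errors table and its columns are hypothetical.
import pandas as pd

med_errors = pd.DataFrame({
    "medication": ["heparin", "insulin", "heparin", "warfarin", "insulin"],
    "dose": ["5000 units", "10 units", "5000 units", "5 mg", "4 units"],
    "adverse_reaction": [True, False, False, True, False],
    "hours_into_shift": [9, 2, 10, 11, 3],
})

# Which medications/dosages are most frequently given in error?
print(med_errors.groupby(["medication", "dose"]).size().sort_values(ascending=False))

# How many hours into a shift do errors most frequently occur?
print(med_errors["hours_into_shift"].value_counts().head())
```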

Typical data flow: Data Sources (EHR, LAB, Finance) → ODS (Operational Data Store) → ETL (Extract, Transform, Load) → Enterprise Data Warehouse → Analytic Sandbox and Data Marts → BI Presentation Layer
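A toy extract-transform-load (ETL) pass in Python, assuming two hypothetical source extracts that record the same measurement with different units; this is a sketch of the normalization idea, not any product's ETL.

```python
# A toy extract-transform-load (ETL) pass, assuming two hypothetical source
# extracts with different conventions for the same data.
import pandas as pd

# Extract: lab system reports glucose in mg/dL, EHR flowsheet in mmol/L.
lab = pd.DataFrame({"mrn": ["A1"], "glucose": [180.0], "unit": ["mg/dL"]})
ehr = pd.DataFrame({"mrn": ["B2"], "glucose": [10.0], "unit": ["mmol/L"]})

# Transform: normalize everything to mg/dL before loading to the warehouse.
def to_mg_dl(row):
    return row["glucose"] * 18.0 if row["unit"] == "mmol/L" else row["glucose"]

staged = pd.concat([lab, ehr], ignore_index=True)
staged["glucose_mg_dl"] = staged.apply(to_mg_dl, axis=1)

# Load: in practice this would be written to warehouse tables; here we print.
print(staged[["mrn", "glucose_mg_dl"]])
```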

Requirements:
1. Data Model
2. Data Structure
3. Data Governance (Quality and Integrity)

Data Model: a description of the objects (concepts) represented within systems, showing the order of their relationships.

Levels of data structure:
1. Free Text
2. Semi-Structured
3. Structured
4. Coded

Structured text is coded using a standardized coding system such as SNOMED; unstructured text is also known as free (narrative) text.
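A small contrast of the same fact captured as free text versus structured, coded data; the code value shown is a placeholder, not an actual SNOMED CT code.

```python
# Contrast of the same clinical fact captured as free text vs. as structured,
# coded data. The code value is a placeholder, not an actual SNOMED CT code.
free_text_note = "Pt reports smoking about a pack a day since her 20s."

coded_entry = {
    "element": "smoking_status",
    "code_system": "SNOMED CT",
    "code": "<look up in terminology>",  # placeholder; real code comes from SNOMED CT
    "display": "Current every-day smoker",
    "recorded": "2013-11-11T08:30:00",
}

# Coded data can be counted and compared directly; free text generally cannot
# without natural-language processing.
print(coded_entry["display"])
```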

Role of a good data structure: it underpins point-of-care documentation, system requirements, the data model, the data warehouse, performance improvement, and research.

Definition
Data Governance is the exercise of decision-making and authority for data-related matters to maintain a consistent model of meaning across the organization. It refers to the organizational bodies, rules, decision rights, and accountabilities of people and information systems as they perform information-related processes.

Adapted from: The Data Governance Institute. The DGI Data Governance Framework. Available at: www.DataGovernance.com.

[Sample record: an adult patient documented as an active smoker; a child documented as a non-smoker]

Assure Smoking Status is Recorded (Model of Use)
Meaningful Use Stage 1
Objective: Record smoking status for patients 13 years old or older.
Measure: More than 50 percent of all unique patients 13 years old or older seen by the eligible professional have smoking status recorded as structured data.
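A minimal sketch of how that measure could be computed; the patient records and field names are hypothetical.

```python
# A minimal sketch of computing the Stage 1 measure: the share of unique
# patients aged 13+ with smoking status recorded as structured data.
# Field names are hypothetical.
patients = [
    {"id": "P1", "age": 45, "smoking_status_coded": True},
    {"id": "P2", "age": 30, "smoking_status_coded": False},
    {"id": "P3", "age": 12, "smoking_status_coded": False},  # excluded: under 13
    {"id": "P4", "age": 67, "smoking_status_coded": True},
]

denominator = [p for p in patients if p["age"] >= 13]
numerator = [p for p in denominator if p["smoking_status_coded"]]
rate = len(numerator) / len(denominator)

print(f"{rate:.0%} recorded")                          # 67% in this toy data
print("Measure met" if rate > 0.5 else "Measure not met")
```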

True comparisons require common, computable definitions.

[Illustration: the same thing described differently: "feels rough and hairy" vs. "feels smooth and sharp"]

Current Smoker: used tobacco product(s) within the past 30 days (The Joint Commission)

Current Smoker: has smoked >= 100 cigarettes in their lifetime and currently smokes every day or some days (CDC)

Model of Use: represents the underlying meaning in a way that is determined by a limited set of use cases. The information captured is not necessarily re-usable for other purposes.

Model of Meaning: provides a common understanding for initial use and re-use. It represents the underlying meaning in a way that is common to, and reusable between, different use cases.

Modified from: International Health Terminology Standards Development Organization (IHTSDO) Glossary, January 2012 International Release. Available at: http://www.ihtsdo.org/fileadmin/user_upload/doc/tig/glsct/glsct_ss_ModelOfUse.html#_c0cc3aca-4e72-40baaf25-116e04a36fad, accessed 25 April 2012.

Value:
Achieve clarity
Ensure value from efforts
Create a clear mission
Maintain scope and focus
Establish accountabilities
Define measurable successes

Benchmark:
The most common objective of Data Governance programs is to standardize data definitions across an enterprise.

Adapted from: The Data Governance Institute. The DGI Data Governance Framework. Available at: www.DataGovernance.com.

Metadata: information about the data, e.g.:
o Date and time of origin
o Actions taken (e.g., medications ordered, canceled, dispensed, administered, refused)
o History, or provenance: the original source, e.g., blood glucose measured by a specific glucometer device

Metadata may be present in the system of origin only.
Metadata enhances comparability of data for trending and analysis.
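A small illustration of metadata traveling with a data element; the field names are hypothetical.

```python
# A small illustration of metadata traveling with a data element; field names
# are hypothetical.
observation = {
    "element": "blood_glucose", "value": 142, "unit": "mg/dL",
    "metadata": {
        "recorded_at": "2013-11-11T06:45:00",        # date and time of origin
        "action": "resulted",                        # e.g., ordered, canceled, resulted
        "provenance": "bedside glucometer, unit 4W", # original source/device
        "source_system": "LAB",
    },
}

# Provenance may exist only in the system of origin; carrying it into the
# warehouse improves comparability for trending and analysis.
print(observation["metadata"]["provenance"])
```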

Data governance maturity model (escalating return on data as governance moves from IT-driven and fragmented to business-driven and holistic):
0: Unaware (no activity)
1: Initial (ad hoc)
2: Repeatable (pilot)
3: Defined (project)
4: Managed (program)
5: Optimized (function)
Corresponding value progression: IT efficiency and compliance → risk reduction, cost controls, and business efficiencies → greater compliance, efficiency, and support for revenue growth → strategic differentiation.

Informatica. 2012. Holistic Data Governance: A Framework for Competitive Advantage.

Accenture. A Framework for Building a Data Governance Strategy. Available at: http://www.accenture.com/us-en/Pages/insight-data-governance-strategy.aspx
IBM. The IBM Data Governance Council Maturity Model: Building a roadmap for effective data governance. Available at: http://www-935.ibm.com/services/us/cio/pdf/leverage_wp_data_gov_council_maturity_model.pdf
Informatica. Holistic Data Governance: A Framework for Competitive Advantage. Available at: http://www.informatica.com/Images/02297_holistic-data-governance-framework_wp_en-US.pdf
Collibra. Data Governance Center. Available at: http://www.collibra.com/products/data-governance-center
UK DOH. Research Governance Framework for Health and Social Care, 2nd Edition, 2005. Available at: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/139565/dh_4122427.pdf
The Data Governance Institute. The DGI Data Governance Framework. Available at: http://www.datagovernance.com/fw_the_DGI_data_governance_framework.html

Data governance stakeholders: Accreditation Organizations, Doctors, Nurses, Board of Directors, Government Regulators, Patients, Consultants, Chief Medical / Nursing Officer, Chief Information Officer, Finance

Think Differently

Section 3: Roadmap to Analytics and Human Resource Considerations
Roadmap to Healthcare Analytics
Human Resource Considerations

1. Define strategy, set analytics goals
2. Assemble the right team
3. Define dashboard content around key areas
4. Identify data sources across all electronic systems
5. Standardize data using structured and coded fields
6. Create business rules and generate dashboards (see the sketch after this list)
7. Validate with users, test
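As a sketch of step 6, a toy dashboard business rule that flags a KPI when it crosses a target; the metric names and thresholds are assumptions.

```python
# Sketch of a simple dashboard business rule: flag a KPI when it crosses a
# target. Metric names and thresholds are hypothetical.
kpi_targets = {"or_turnaround_min": 40, "elective_cancellation_pct": 5.0}

def evaluate(kpi_name, value):
    target = kpi_targets[kpi_name]
    status = "RED" if value > target else "GREEN"
    return {"kpi": kpi_name, "value": value, "target": target, "status": status}

print(evaluate("or_turnaround_min", 52))           # -> RED
print(evaluate("elective_cancellation_pct", 4.2))  # -> GREEN
```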

The analytics team:
C-Suite Leader(s)
Health Analytics Leader(s)
Clinical Governance Body
IT Leader(s)
Data Analysis Teams
Quality/Performance Improvement Teams

Responsibilities by role:
C-Suite Leader(s): set the analytics vision; provide clinical leadership; resource the effort; consume insights
Health Analytics Leader(s): establish analytics strategy; coordinate with the Clinical Governance Body and IT; engage clinicians; produce analytic insights
IT Leader(s): establish the compute infrastructure; coordinate activities; maintain IT systems
Clinical Governance Body: set analytics priorities; conduct evidence reviews; set common definitions of clinical terms
Data Analysis Teams: aggregate & normalize data; define data queries; create reports; configure decision support
Quality/Performance Improvement Teams: coordinate analyses; translate findings; make findings actionable

Analytics team structure (original online survey, n=87, 2011; participants recruited via four informatics list servs; source: Bria, Kennedy & Womack, JHIM, Vol 26(3), pp. 46-51, 2012):
Permanent team: 62.1%
Individual analysts: 28.7%
Temporary team: 4.6%
Do not have: 4.6%

If permanent team, length of existence:
5+ years: 66.7%
4 years: 9.3%
3 years: 11.1%
2 years: 11.1%
1 year: 1.9%

[Chart, same survey: CQO or Quality Leader 40.7%; Other 20.4%; CMO 16.7%; CMIO 14.8%; CIO 7.4%]

[Chart, same survey: same analysts or teams 69%; different analysts/teams 31%]

Insight Creation
C-Suite Leader (s)

Health Analytics
Leader(s)

Clinical Governance
Body

IT Leader(s)

Data Analysis Teams

Quality/Performance
Improvement Teams

Insight Consumption

Administrative Leaders
Clinical Leaders
Quality & Process Improvement Teams
Providers
Direct Caregivers
Patients
Regulatory bodies and other agencies

1. Create a scope and business case that recognizes that creating an analytics foundation not only addresses MU, but facilitates performance and outcomes management for accountable care.
2. Success requires corporate commitment, accountability, and a willingness to engage clinicians in a continuous data-driven improvement process.
3. Commit to the creation of data integration capabilities so that data can be stored, retrieved, and manipulated in near real-time.
4. Prepare to build a broad infrastructure (beyond MU) that can support performance reporting and dashboards.
5. Utilize naturally occurring data to avoid special data collection.
Adapted from: Murphy, L, Wilson, M, & Newhouse, R. Evidence and the Executive. JONA, Volume 43, Number 7/8, pp. 367-370.

Develop an action plan for launching an analytics initiative to address your case study challenge.
Identify objectives for the project, build the business case, identify how you would fund and resource the project and how you would assemble the team, and identify metrics and measure outcomes.
Identify what data are needed, where the data reside, what calculations would need to be performed, and how results would be converted into actionable insight.
Articulate key elements of a robust organizational analytics plan.

Divide into assigned groups (8-10 people)


Assign roles/perspectives: Clinical, Operational, Financial, Safety & Quality,
Patient Satisfaction & Engagement, IT, Analytics

Over the past 6 months, your facility has had to cancel 12% of elective surgeries, cases routinely run late, and the average OR turnaround time is 52 minutes.
You have been challenged to improve OR throughput, including development of a dashboard that continually monitors performance.
Draft a plan for an analytics initiative by completing an analytics plan (provided).
A group presenter will be chosen at random to report your group's plan to the larger audience.
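One way a group might compute two of the KPIs named above for the dashboard, sketched in Python with hypothetical scheduling records.

```python
# A minimal sketch of computing two OR throughput KPIs for the dashboard from
# hypothetical scheduling records (field names are assumptions).
from datetime import datetime

cases = [
    {"scheduled": True, "canceled": False,
     "prev_case_end": datetime(2013, 11, 11, 9, 0),
     "next_case_start": datetime(2013, 11, 11, 9, 55)},
    {"scheduled": True, "canceled": True,
     "prev_case_end": None, "next_case_start": None},
]

cancel_rate = sum(c["canceled"] for c in cases) / len(cases)

turnarounds = [
    (c["next_case_start"] - c["prev_case_end"]).total_seconds() / 60
    for c in cases if not c["canceled"]
]
avg_turnaround = sum(turnarounds) / len(turnarounds)

print(f"Elective cancellation rate: {cancel_rate:.0%}")    # 50% in this toy data
print(f"Average OR turnaround: {avg_turnaround:.0f} min")  # 55 min in this toy data
```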

Stakeholders
Scope & objectives
Team
Data Analysis
Workflow Considerations
Performance Metrics
Application of analytics
Anticipated Challenges

Elements to Consider
o Define terms:
  OR Prep Criteria Met
  Equipment (General)
  Equipment (Case-specific)
  Surgeon
  Assistant
  OR Nurse, Circulator
  Nurse Workflow
  Anesthesiologist
  CRNA
  Transport
  Patient Factors
  Elective surgery
  OR Start Time
  OR End Time
  Etc.
o Current turnaround
o Target

Your hospital was recently identified as having among the highest rates of reported sepsis-related mortality in your state for the most recent year.
You have been challenged to improve the sepsis-related mortality rate, instantiate clinical decision support to alert clinicians to possible impending sepsis, and develop a near real-time infection rate monitoring dashboard.

3-Hour Sepsis Bundle
Measure Lactate Level
Obtain Blood Cultures Prior to Administration of Antibiotics
Administer Broad Spectrum Antibiotics
Administer 30 mL/kg Crystalloid for Hypotension or Lactate >= 4 mmol/L

6-Hour Sepsis Bundle
Apply Vasopressors (for Hypotension That Does Not Respond to Initial Fluid Resuscitation) to Maintain a Mean Arterial Pressure (MAP) >= 65 mm Hg
In the Event of Persistent Arterial Hypotension Despite Volume Resuscitation (Septic Shock) or Initial Lactate >= 4 mmol/L (36 mg/dL):
o Measure Central Venous Pressure (CVP)
o Measure Central Venous Oxygen Saturation (ScvO2)
Remeasure Lactate If Initial Lactate Was Elevated

The intention in applying the bundle is to perform all indicated tasks 100 percent of the time within the first 3 and 6 hours of identification of severe sepsis.
Institute for Healthcare Improvement. Severe Sepsis Bundles. June 3, 2013. Available at: http://www.ihi.org/knowledge/Pages/SevereSepsisBundles.aspx. Accessed 3 November 2013.
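Purely as an illustration of the case study's CDS element, a toy screening rule keyed to the bundle thresholds above; the systolic blood pressure cutoff is an assumption and this is not a validated sepsis algorithm.

```python
# Illustrative sketch only: a toy screening rule of the sort that could alert
# clinicians to possible severe sepsis for bundle initiation. Not a validated
# sepsis algorithm; lactate threshold follows the bundle text above.
def sepsis_bundle_alert(lactate_mmol_l, systolic_bp, suspected_infection):
    hypotension = systolic_bp is not None and systolic_bp < 90  # assumed cutoff
    high_lactate = lactate_mmol_l is not None and lactate_mmol_l >= 4.0
    if suspected_infection and (hypotension or high_lactate):
        return "ALERT: evaluate for severe sepsis; consider starting the 3-hour bundle"
    return "No alert"

print(sepsis_bundle_alert(lactate_mmol_l=4.6, systolic_bp=88, suspected_infection=True))
```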

Stakeholders
Scope & objectives
Team
Data Analysis
Workflow Considerations
Performance Metrics
Application of analytics
Anticipated Challenges

Split into groups of 8-10


Select a group leader
Get your topic assignment

Start 1200 End 1300

Section 4: Hands-On Case Study
Teams will develop an analytics plan to address an assigned case study
Report out to the larger group

Stakeholders: Who needs the analysis? Who will use the results?
Scope & objectives: What are the scope & objectives of the analytics project?
Team: What roles & skillsets are required? How will the project be funded?
Data Analysis: How will you identify root causes of problems? What data will you analyze? Where do the data reside? What analyses need to be performed?
Workflow Considerations: What workflow changes will be required to ensure valid and reliable data capture?
Performance Metrics: What metrics will you measure? How will you deliver analytical results to stakeholders? How will you measure the effectiveness of the overall analytics initiative?
Application of analytics: How will you translate your data analysis into real-world improvements?
Challenges: What challenges do you anticipate? What proactive steps could you take to minimize or avert challenges?

Afternoon Break
Start 1400 End 1430

Section 5: Group Report Out with Shared Insights & Best Practices
Team Report Outs
Discussion
Insights & Best Practices

Stakeholders
Scope & objectives
Team
Data Analysis
Workflow Considerations
Performance Metrics
Application of analytics
Anticipated Challenges

Summary: applying principles to practice


Completion of evaluation forms

Rosemary Kennedy, Ph.D., RN: [email protected]
Floyd Eisenberg, MD: [email protected]
Caterina Lasome, Ph.D., RN: [email protected]
Dana Womack, MS, RN: [email protected]

We are happy to help you!
