Improving Quality of Care and Patient Outcomes Through Transparency and Data Management of Hospital-Acquired Conditions
Karen Lawler, MPS, RHIA, CHPS, FABC, Director of Health Information Management and Privacy
Carolyn Hoffman-Kaminski, MSW, RN, MS, CPHQ, CPHRM, Director of Quality and Performance Improvement
Key Learning Objectives
• Challenge existing strategy to improve data validity and data confidence
• Demonstrate a controlled process and feedback mechanism for performance improvement
• Identify key stakeholders, responsibilities and accountability
• Draw a direct link between HIM and patient outcomes
Background
• Shift from pay-for-reporting to pay-for-performance
• Deficit Reduction Act (DRA) enacted in 2005
• Increased focus on prevention and standardized guidelines
• On Oct 1, 2008, CMS identified 8 HACs for which Medicare would no longer reimburse hospitals at a higher rate
Establish a Framework for All Publicly Reported Processes
Through a controlled methodology we can apply the process to:
• Quality measures and patient safety reporting
• ACOs (Accountable Care Organizations), whose goal is to improve quality and efficiency and reduce costs
• Meaningful Use – Continuity of Care Document
• ICD-10
• Data that informs consumer decision making
Definition of Publicly Reported Data
• Quality or pay-for-performance – HEDIS, Leapfrog, Healthgrades, JCAHO, Hospital Compare, AHRQ
• Utilization data – MedPAR, individual state measures
• Clinical condition data – CDC
Why is Publicly Reported Data Important?
• POA (present on admission) status drives reimbursement and the risk-adjustment methodology; for the patient, it may affect healthcare coverage
• Healthcare consumer choice
• CMS reimbursement based on quality metrics beginning in 2015
Key Terms
• Data Quality Management – the business processes that ensure the integrity of an organization's data during collection, application, warehousing and analysis
• Data Quality Measurement – a quality measure is a mechanism to assign a quantity to a quality of care by comparison to a criterion. Quality measurements typically focus on structures or processes of care that have a demonstrated relationship to positive health outcomes and are under the control of the healthcare system.
• Think: completeness, accuracy, granularity (clinical specificity)
Key Focus Areas Where We Applied the Collaborative Methodology
• Mortality rate – process improvement for identifying hospice patients
• Decubitus ulcer – documentation strategy beginning in the Emergency Department
• CLABSI – understanding the definitional difference between infectious disease and coding
• DVT – improvement strategies through templates and mandatory documentation
Methodology
• A business philosophy that aligns business practices with customers, clinical/non-clinical employees and patient needs
• A methodology for improving key processes and analyzing variation, while focusing on continuous improvement
• A "tool box" of quality and management tools for problem resolution that is data driven
• An organized process to reduce Defects Per Million Opportunities (DPMO)
• Aligning work for one process (identifying HACs) to benefit many
Defect: Hospital Acquired Condition
Villanova University
Six Sigma
6σ = 99.9997% accuracy
• Validate measurement systems
• Collect data on current performance and defects
• Evaluate and analyze data for special causes
• Minimize variation
• Calculate the Six Sigma level (DPMO)

Calculator: A Six Sigma defect is defined as anything outside of customer specifications. A Six Sigma opportunity is the total quantity of chances for a defect. First we calculate Defects Per Million Opportunities (DPMO), and from that a sigma level is read off a predefined table:

DPMO = (Number of defects / (Number of units x Number of opportunities)) x 1,000,000

Unknown Source
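A minimal Python sketch of the DPMO formula above, with an abbreviated version of the predefined sigma lookup table (assuming the conventional 1.5σ shift); the defect and discharge counts are hypothetical, not from the presentation:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects Per Million Opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Map a DPMO value to a process sigma using an abbreviated version
    of the standard long-term table (includes the 1.5-sigma shift)."""
    table = [(3.4, 6.0), (233, 5.0), (6_210, 4.0),
             (66_807, 3.0), (308_537, 2.0), (691_462, 1.0)]
    for threshold, sigma in table:
        if dpmo_value <= threshold:
            return sigma
    return 0.0

# Hypothetical example: 12 HACs (defects) across 4,800 discharges,
# counting one opportunity per discharge.
d = dpmo(defects=12, units=4_800, opportunities_per_unit=1)
print(f"DPMO = {d:,.0f}, sigma level ≈ {sigma_level(d)}")
```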
Toolbox
• Process Flow Diagram
• Cause and Effect Diagram – fishbone categories (Procedures, Environments, Material, Measurement, People, Equipment) feeding the GOAL
• Control Charts
• Failure Modes and Effects Analysis (FMEA)

RISK PRIORITY NUMBER (RPN) = SEVERITY (SEV) x OCCURRENCE (OCC) x ESCAPED DETECTION (DET)

FMEA scoring scale:
Score  Severity (SEV)  Occurrence (OCC)  Escaped Detection (DET)
5      Severe          Very High         Very High
4      High            High              High
3      Moderate        Moderate          Moderate
2      Minor           Low               Low
1      Negligible      Very Low          Very Low

HAHV
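The RPN arithmetic from the FMEA above can be sketched in a few lines of Python; the failure modes and scores below are hypothetical illustrations, not the presenters' actual FMEA:

```python
# RPN = Severity (SEV) x Occurrence (OCC) x Escaped Detection (DET),
# each scored 1 (Negligible / Very Low) to 5 (Severe / Very High).
failure_modes = [
    # (description, SEV, OCC, DET) -- hypothetical scores
    ("POA indicator not documented in ED", 4, 3, 4),
    ("Skin assessment missing on admission", 5, 2, 3),
    ("Coder / infection-control definition mismatch", 3, 4, 2),
]

# Rank failure modes by RPN, highest risk first.
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, sev, occ, det in ranked:
    print(f"RPN {sev * occ * det:3d}  {desc}")
```

The team would work the highest-RPN items first, then rescore after improvements.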
Six Sigma Project Tasks
• Task 1: Project Charter
• Task 2: D – Define
• Task 3: M – Measure
• Task 4: A – Analyze
• Task 5: I – Improve
• Task 6: C – Control
Elements of a Project Charter
• Business case
• Problem statement
• Scope statement
• Team members and roles
• Cost estimates
• Return on Investment (ROI)
• A communication plan for deliverables
• Authorization of the project

Sample charter template (project types: DMAIC, Lean, Basic Flow, 5S, Visual Control):
• Project Information – project name, department, project type
• Project Management Team – Executive Sponsor, OCEG Owner, Process Owner, Belt
• Project Definition (Step 1) – problem statement (paragraph describing the current state: when, what, where and how much); supporting facts/voice of customer (what data do we have to prove the problem exists); link to the strategic plan; goal statement; project scope (what's our target audience/area); process steps (high-level process map showing the current process)
• Key Metrics/Goal (Step 2) – metrics specific to the problem, measurable, and quality- or speed-related, each with a name and an operational definition (clarify meaning; start and end points); what is in and out of scope; opportunity/goal stated as "from ... to ..."; metrics classified as primary ("most important"), secondary ("also important") and consequential ("watch out")
• Business Impact (Step 3) – a single primary strategic focus area (Patient Satisfaction, Revenue, Quality/Safety, Corporate Responsibility, Innovation Enabling, Simplification, Other); perceived benefits with key assumptions; budget requirements (anticipated capital and anticipated expense)
• Project Planning (Step 4) – what functions should be communicated with regularly; stakeholders (MBB or Sponsor to help identify); team members and their departments (identified by Champion and Belt) with percent of time; milestones as mm/yy targets for Start, D, M, A, I and C
Define
• Team selection and name – think high-level (Publicly Reported Safety Measures Committee)
• Team training
• Define project objectives and plan
• Review existing process, tools and documentation
• Define and map the "as is" process
• Clearly identify the problem
• Present objectives and plan to management
• Review and redefine the problem, if necessary
Publicly Reported Measures Team
• VP of Quality/CQO
• Director, Hospitalist Service
• Hospitalist Designee(s)
• Director of HIM
• OCEG Designee
• Director, Case Management
• HIM Coding Manager
• RN Manager of IC
• Director of Quality
• Sr. System Analyst
• Systems Analyst
• Nursing Director of Clinical Operations and Quality
Measure
• Data collection plan
• Validate measurement systems
• Collect data on current performance and defects
• Evaluate and analyze data for special causes
Analyze
• Determine sources or "root cause(s)" of the defect
• Prepare baseline graphs on subtasks
• Analyze inefficiencies with detailed process maps
• Analyze time, value and risk
• Benchmark other hospitals/healthcare sites
• Consolidate analysis and findings
Improve
• Develop potential solutions to eliminate root causes; think systematically
• Prioritize solutions and conduct a feasibility assessment (e.g., accurate documentation)
• Obtain necessary approvals
• Prepare for the improved-process pilot
• Test the improved process/run the pilot
• Analyze the pilot and its results
• Develop and implement a plan for a system-wide rollout
Control
• Define control metrics
• Develop an ongoing metrics collection tool
• Roll out control metrics
• Control the process by measuring and evaluating results
• Transparency of data – PSQC Scorecard
• Respond promptly when defects occur
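One way to picture "control the process by measuring and evaluating results" is a simple p-chart over monthly HAC rates; this Python sketch uses hypothetical counts (not Stamford Hospital data) and flags any month outside the 3-sigma control limits:

```python
import math

# Hypothetical monthly (HAC count, discharges) pairs for the control phase.
months = [(5, 400), (3, 380), (7, 410), (2, 395), (15, 405), (4, 390)]

total_defects = sum(d for d, _ in months)
total_units = sum(n for _, n in months)
p_bar = total_defects / total_units  # centerline: overall HAC rate

for i, (d, n) in enumerate(months, start=1):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)
    p = d / n
    flag = "  <-- investigate" if not (lcl <= p <= ucl) else ""
    print(f"Month {i}: p={p:.4f} (LCL={lcl:.4f}, UCL={ucl:.4f}){flag}")
```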
Six Sigma Road Map
D – Define: Develop a vision; understand customer needs; team selection; team training; review existing processes; map "as is" process; identify problem clearly; present objectives and plan to management; review and redefine problem, if necessary
M – Measure: Data collection plan; validate measurement systems; collect baseline data on defects and possible causes; plot data over time and analyze for special causes; stratify frequency plots and do Pareto analysis (80/20); calculate starting sigma level (DPMO)
A – Analyze: Determine sources or "root causes" for the defect; use statistical methods to quantify cause-and-effect relationships; analyze impacts to determine greatest inefficiencies; analyze time, value and risk; benchmark other hospitals/healthcare sites; consolidate analysis and findings
I – Improve: Develop recommendations and solutions for root causes; conduct a feasibility analysis; obtain approval; prepare, pilot and analyze the improved process; implement the plan system-wide; train staff
C – Control: Standardize practices; define control metrics; develop an ongoing metrics collection tool; roll out control metrics; monitor the process by measuring and evaluating results; summarize and communicate results; recommend future plans
Unknown Source
Stamford Hospital Hospital-Acquired Condition (HAC) Review Process - Post Discharge
Stamford Hospital Hospital-Acquired Condition (HAC) Concurrent Qualification Process
Reduction in HACs
[Bar chart: annual HAC counts for 2011, 2012 and 2013 on a 0-60 scale]
Applied collaborative methodology with solid end results
What's Next?
• Questions
• Hospital-Acquired Condition Reduction Program
2013 Hospital Acquired Conditions
• Stage III/IV Pressure Ulcers
• Air Embolism
• Blood Incompatibility
• Foreign Object Retained After Surgery
• Iatrogenic Pneumothorax with Venous Catheterization
• Manifestations of Poor Glycemic Control
• Catheter-Associated Urinary Tract Infections
• Vascular Catheter-Associated Infections
• Surgical Site Infections (CABG, Bariatric, certain Orthopedic cases and CIED)
• Falls and Trauma
• DVT/PE (certain Orthopedic cases)
Plan Your Methodology
• Review your data now
  – Concurrent
  – Historical
• Review documentation
  – ICD-10 opportunity
• Identify risk
  – What needs to change?
• Map out strategy
Program Overview
• Hospitals with poor performance on HACs receive a monetary penalty
• Based on AHRQ PSI-90 (Domain 1, 35%) and CDC NHSN measures (Domain 2, 65%)
• Domain 1 + Domain 2 = Total HAC Score
• 1% Medicare payment penalty beginning with FY 2015 payment for hospitals in the worst quartile based on total HAC score
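As a rough illustration of the scoring arithmetic on this slide, the sketch below combines hypothetical 1-10 domain scores with the 35%/65% weights; the function name and example scores are ours, not CMS's:

```python
def total_hac_score(domain1_score, domain2_score, w1=0.35, w2=0.65):
    """Weighted Total HAC Score: Domain 1 (AHRQ PSI-90) at 35%,
    Domain 2 (CDC NHSN measures) at 65%."""
    return w1 * domain1_score + w2 * domain2_score

# Hypothetical 1-10 domain scores (higher = worse performance).
score = total_hac_score(domain1_score=7, domain2_score=4)
print(f"Total HAC Score = {score:.2f}")
# Hospitals whose total score lands in the worst quartile receive the
# 1% Medicare payment penalty beginning FY 2015.
```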
Program Overview

Domain 1: AHRQ PSI 90 (composite of 8 measures)
Performance period: 7/1/11-6/30/13
Data source: Medicare FFS claims data
Version 4.5 weights (use POA = 0):
• PSI 3: Pressure Ulcer Rate – 0.3438
• PSI 6: Iatrogenic Pneumothorax Rate – 0.0203
• PSI 7: Central Venous Catheter-Related Blood Stream Infection Rate – 0.0210
• PSI 8: Postoperative Hip Fracture Rate – 0.0037
• PSI 12: Postoperative Pulmonary Embolism (PE) or Deep Vein Thrombosis (DVT) Rate – 0.3839
• PSI 13: Postoperative Sepsis Rate – 0.0900
• PSI 14: Postoperative Wound Dehiscence Rate – 0.0144
• PSI 15: Accidental Puncture and Laceration Rate – 0.1229

Domain 2: CDC NHSN measures (new), beginning FY 2015
Performance period: 1/1/12-12/31/13
Data source: chart-abstracted data
• CAUTI – FY 2015, FY 2016, FY 2017
• CLABSI – FY 2015, FY 2016, FY 2017
• SSI (colon, abdominal hysterectomy) – FY 2016, FY 2017
• C. difficile – FY 2017
• MRSA – FY 2017

GNYHA
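To make the composite concrete, here is a minimal Python sketch of how the Version 4.5 weights above combine the eight component indicators; the observed/expected ratios are hypothetical placeholders, not actual hospital data:

```python
# PSI-90 Version 4.5 component weights (sum to 1.0), from the table above.
PSI_90_WEIGHTS = {
    "PSI 3": 0.3438, "PSI 6": 0.0203, "PSI 7": 0.0210, "PSI 8": 0.0037,
    "PSI 12": 0.3839, "PSI 13": 0.0900, "PSI 14": 0.0144, "PSI 15": 0.1229,
}

def psi90_composite(ratios):
    """Weighted composite of the eight component PSIs, where each value in
    `ratios` is a hospital's risk-adjusted observed/expected ratio."""
    return sum(PSI_90_WEIGHTS[psi] * ratios[psi] for psi in PSI_90_WEIGHTS)

# Hypothetical ratios: 1.0 = performing as expected; > 1.0 = worse.
ratios = {"PSI 3": 0.9, "PSI 6": 1.1, "PSI 7": 1.0, "PSI 8": 1.0,
          "PSI 12": 1.2, "PSI 13": 0.8, "PSI 14": 1.0, "PSI 15": 0.95}
print(f"PSI-90 composite = {psi90_composite(ratios):.4f}")
```

Note how heavily PSI 3 (pressure ulcers) and PSI 12 (postoperative PE/DVT) weigh on the composite, which is why they were key focus areas earlier in the deck.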
Applying Principles
• Develop actionable information with a continuous feedback loop and communication
• Leverage the data warehouse to identify metrics of measure and validate data through the data quality governance structure
• Develop an oversight task force for data analysis and a deep dive into the data for improvement and identification of additional metrics – continuous improvement
• Utilize lessons learned for documentation improvement, standardization of templates, education, and content management of the EHR
Call to HIM for Action
• Collaborative process
• 2014-2017 AHIMA Environmental Scan
  – Data integrity
  – Reimbursement – data quality
  – Big data – think ACOs or other organizations like them
  – Healthcare reform
Questions