OBIGGS II Improvement Project Team

2007 ASQ World Conference for Quality and Improvement

INTRODUCTION

OBIGGS II Improvement Project Team

Good afternoon, my name is Don Snow. I am the lead engineer and system architect of the Boeing C-17 OBIGGS II improvement project.

International Team Excellence Award Competition – 30 April, 2007


OBIGGS II Improvement Project INTRODUCTION

(US Air Force Photo)

The C-17 is an amazing military cargo plane. It carries an enormous payload over long distances, yet can land on short, unprepared runways. The C-17 delivers cargo and troops directly to the battlefield, so the fuel tanks are protected by an "OBIGGS", which stands for On-Board Inert Gas Generating System. The OBIGGS prevents the tanks from exploding if hit by enemy gunfire by injecting inert nitrogen gas into the space above the fuel.



(US Air Force Photo)

The first 141 C-17s were delivered with an OBIGGS (which we call OBIGGS 1) that successfully protected the fuel tanks but required frequent maintenance. Our presentation tells the story of how our team replaced OBIGGS 1 with a completely new design, called OBIGGS II. OBIGGS is now one of the strongest systems on the C-17, instead of the weakest.



Top Row: Brent Theodore, John Watson, Dan Ehlers
Bottom Row: Rick Morey, Don Snow, Ben Canfield

The six of us will represent the …



… more than 200 Boeing, 150 supplier, and 50 US Air Force team members who took this project from concept to reality.


1A.a  Types of Data and Quality Tools Used to Select the Project and Why They Were Used

[Chart: Mean Manhours to Repair, by system – mean man-hours per repair (0–30 scale, lower is better) plotted for each C-17 system code; OBIGGS 1 is highlighted, requiring more repair time than every system except the engines.]

Section 1A.a describes the data and quality tools we used to select the project. The C-17 is very reliable compared to other military transports. Even so, we continuously look for ways to improve the design and make the airplane more reliable for the men and women who fly and maintain it. Boeing created a tool to capture the time required to maintain each of the aircraft systems from the Air Force maintenance records. We have a process to review the output from that tool every month. A typical example is shown in this chart. The engines were the only system that required more repair time than OBIGGS I. We observed that improving OBIGGS reliability would have more impact on the airplane reliability than almost any other system.
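The monthly roll-up described above can be sketched as a small script. Everything here (system names, hours, field layout) is invented for illustration and is not the actual Boeing tool or real C-17 data:

```python
# Hypothetical sketch of the monthly review tool: roll Air Force maintenance
# records up into mean man-hours to repair per aircraft system, worst first.
from collections import defaultdict

def mean_manhours_by_system(records):
    """records: iterable of (system_name, manhours) tuples."""
    totals = defaultdict(lambda: [0.0, 0])      # system -> [sum_hours, count]
    for system, hours in records:
        totals[system][0] += hours
        totals[system][1] += 1
    # Mean man-hours per repair action, highest (worst) system first
    return sorted(((s, h / n) for s, (h, n) in totals.items()),
                  key=lambda kv: kv[1], reverse=True)

# Invented sample records, shaped to echo the chart's story: engines worst,
# OBIGGS close behind.
records = [("OBIGGS", 24.0), ("OBIGGS", 28.0), ("Engines", 30.0),
           ("Avionics", 4.0), ("Landing Gear", 6.0)]
ranking = mean_manhours_by_system(records)
print(ranking[0])   # ('Engines', 30.0) -- the only system above OBIGGS
```

A review board could scan such a ranking each month, exactly the cadence the team describes.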



Method / Tool | How It Was Used | Who Used It | Why It Was Used
Air Force Maintenance Data | Collect maintenance activity of OBIGGS | Reliability Engineer | Best source of field failure data
FRACAS (Boeing database) | Store data from Air Force for Boeing analysis | Reliability Engineer | Boeing C-17's closed-loop system for tracking corrective actions
GOLD (Boeing database) | Collect data on component repairs | Reliability & Design Engineers | C-17 source of supplier repair induction data
Tracking Charts & In-Service Evaluations | Weekly representation of field activity | Reliability Engineer | To track performance of OBIGGS
Step-by-Step Detailed Analysis | Analyze each and every piece of data | Reliability & Design Engineers | Determine root causes of individual failures
Pareto Analysis | Ranking of components and failure modes | Reliability, Design Engineers & Suppliers | To identify failure drivers within the system
Brainstorming | Free flow of ideas | All stakeholders | Formulate solutions

We have other tools that track the reliability of each subsystem and component on the airplane. These tools will be discussed in more detail in section 2, but the output confirmed the low reliability of the OBIGGS I components. We knew it wouldn’t be easy to improve the OBIGGS I reliability, because we had already identified the root causes of the most frequent failures and had tried to upgrade those components.


1A.b  Reasons Why the Project Was Selected

[Chart: OBIGGS Improvement Projection vs. Actual Reliability – hours (0–580, higher is good) by month, Mar-99 through Apr-02, showing the system objective, goal, actual reliability, and the improvement projected after incremental design-change implementation.]

Section 1A.b. We discovered we had been generally successful in fixing the original root causes of the component failures. Unfortunately, we also found that when the parts lasted a little longer, new failure modes appeared and prevented the breakthrough reliability improvement we had expected. Because this attempt to improve system reliability by upgrading the OBIGGS I components had fallen short, the OBIGGS II Improvement project was selected to determine whether a different and simpler method of inerting the fuel tanks was feasible.



OBIGGS 1 Problems:
– High repair costs
– High labor hours
– Airplanes not mission capable

Customer Ranked OBIGGS 1 Reliability Improvement as No. 1 C-17 Priority

Meanwhile, our stakeholders were dealing with the effects of an unreliable system ... … The asset managers were spending millions to repair failed OBIGGS 1 parts. The technicians were constantly troubleshooting and replacing failed components. The mission planners at headquarters couldn’t schedule missions for C-17s that were unavailable while OBIGGS maintenance was going on. All of this prompted the Air Force Council that sets funding priorities to rank OBIGGS reliability improvement as the number one priority for future C-17 funding.


1A.c  Involvement of Potential Stakeholders in Project Selection

• Pilots and maintenance personnel helped quantify low OBIGGS 1 reliability
• Team was expanded to include representatives of the following stakeholders:
  – Pilots
  – Maintainers
  – Engineering
  – Production
  – Field engineers
  – Support Systems
  – Customer Engineering
  – Supplier Management

Section 1A.c. Stakeholders were critically involved in project selection. Our customer stakeholders helped us quantify the low reliability of OBIGGS I. The measured reliability is not public information, but was used by the team to select the project. The customer also validated the need for the project by ranking OBIGGS reliability improvement as their top priority. Identifying the stakeholder universe was simple, because all of our customers work for the Air Force and have well-defined roles and responsibilities. We followed a company process to make sure we identified all affected stakeholders. The most important way stakeholders were involved in project selection….



Customer Approved and Funded OBIGGS II Improvement Project

… was that our project was customer-funded. We knew we had their buy-in, since they deferred other priorities to fund our project.


1B.a  Affected Organizational Goals / Performance Measures and Strategies

(US Air Force Photo)

Improved Reliability

Section 1B.a. We established three performance measures at project kick-off. The first was improved reliability and was determined by our Air Force customer. That metric became our top priority, since it was the reason for their investment.



(US Air Force Photo)

Reduced Initialization Time

The second performance measure was to reduce initialization time, or the time to inert the tanks on start-up. This was also selected by our customer. Quantified targets for both metrics were established and the projections for each were updated and reported throughout the project.



Increased Revenue

We picked a third performance measure, which was to achieve Excellent award fee ratings from our customer. The Air Force evaluates each project they fund semi-annually and those ratings determine an incentive payment to Boeing.



[Strategy map: Stakeholder Requirements & Expectations (Customer, Work Force, Suppliers, Community, Shareholders) drive Value Creation over time through four strategies:
• Profitably Expand Markets – capture additional C-17 business (C-17, BC-17X, International); launch C-17A+; capture Performance Improvement contracts; expand alliances and partnerships
• Run Healthy Business (Operational Efficiency) – achieve aggressive, sustainable improvements to safety, quality, schedule and cost; strengthen stakeholder relationships; relentlessly improve and integrate processes
• Leverage to Emerging Opportunities – create Next Generation Customer Solutions; aggressively pursue a sustainable competitive advantage
• Create New Frontiers – create Agile Logistics Mobility and Systems Solutions; create Network-Centric Airlift/Support Capability Integration; accelerate Technology Integration
Our Vision: People Working Together to Provide the World's First Choice for Global Airlift and Mobility Solutions]

Now I’ll cover how those performance measures fit into our company goals and strategies. These are the company-level objectives for the C-17 program.



The OBIGGS II project supported all three aspects of the organizational strategy to Run a Healthy Business: to improve safety, quality, schedule, and cost; to strengthen stakeholder relationships; and to improve and integrate processes.



Organizational Strategies
• Achieve aggressive improvements in safety, quality, schedule, and cost
• Strengthen stakeholder relationships
• Relentlessly improve & integrate processes

Organizational Goals
• Improve Satisfaction Index
• Improve Mission Capable Rate
• Capture Incentive Award Fee

Project Performance Measures
• Improve reliability
• Reduce initialization time
• Receive EXCELLENT award fee ratings from customer

Project Strategies
• Enhance customer satisfaction by developing a simpler, more reliable OBIGGS
• Develop and implement innovative methods and processes to maximize return on investment

Our three performance measures directly support organizational goals which support the three organizational strategies I just highlighted.


1B.b  Types of Impact the Project Will Have on Each Goal/Performance Measure

(US Air Force Photo)

Improved Reliability
• Reduce repair cost
• Reduce maintenance labor
• Improve mission capable rate

For 1B.b, the project would have the following types of impact: Improving the OBIGGS reliability would reduce repair cost, reduce maintenance labor, and improve aircraft mission capable rate.



(US Air Force Photo)

Reduced Initialization Time
• Improve aircraft availability

Reducing the fuel tank initialization time would make the airplane more available for the customer.



Increased Revenue
• Excellent performance captures potential incentive award fee
• Customer confidence for future projects

Managing the project to meet the performance, schedule, and cost targets would result in greater incentive award fees to Boeing. The potential award fees were large, since they were a percentage of total project cost.


1B.c  Degree of Impact on Each Goal/Performance Measure and How Determined

(US Air Force Photo)

Improved Reliability
• Projected 1100% increase
• Determined by system reliability analysis

Section 1B.c. We estimated the degree of impact to each of the three performance measures: To determine the reliability improvement, we performed a system reliability analysis that conservatively projected OBIGGS II parts would last 11 times longer than OBIGGS I.
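As a hedged illustration of how a component-level life projection carries through to a system-level one (the component values below are invented, not the team's actual analysis): for components in series with constant failure rates, system MTBF is the reciprocal of the summed failure rates, so if every component lasts 11 times longer, the series system does too.

```python
# Illustrative series-system reliability roll-up (not actual C-17 data).
# MTBF_sys = 1 / sum(lambda_i) for components in series, lambda_i = 1/MTBF_i.

def system_mtbf(component_mtbfs):
    return 1.0 / sum(1.0 / m for m in component_mtbfs)

old = [500.0, 800.0, 1000.0]      # hypothetical OBIGGS I component MTBFs (hours)
new = [m * 11 for m in old]       # the cited projection: parts last 11x longer
ratio = system_mtbf(new) / system_mtbf(old)
print(round(ratio, 1))            # 11.0 -- the component gain carries through
```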



(US Air Force Photo)

Reduced Initialization Time
• Projected to initialize five times faster
• Determined by detailed component analysis and test

To project the OBIGGS II initialization time, we tested prototype hardware in a temperature chamber and created computer simulations of the nitrogen distribution in the fuel tanks. The increased capacity of OBIGGS II allows it to initialize the fuel tanks at least five times faster than OBIGGS I.
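A common first-order model of inerting time (not necessarily the simulation the team used) treats the ullage as perfectly mixed, so oxygen concentration decays exponentially toward the nitrogen-enriched air (NEA) concentration. All numbers below are hypothetical:

```python
import math

# Back-of-envelope inerting model, purely illustrative (no actual C-17 values).
# Perfect mixing gives c(t) = c_nea + (c0 - c_nea) * exp(-q*t/V); solving for
# the time to reach a target O2 concentration:

def time_to_inert(v_ullage, q_nea, c0=20.9, c_nea=2.0, c_target=9.0):
    """Minutes of NEA flow q_nea (volumes/min) to drive O2 from c0 to c_target."""
    return (v_ullage / q_nea) * math.log((c0 - c_nea) / (c_target - c_nea))

t_old = time_to_inert(v_ullage=100.0, q_nea=2.0)    # hypothetical OBIGGS I flow
t_new = time_to_inert(v_ullage=100.0, q_nea=10.0)   # five times the NEA flow
print(t_old / t_new)    # flow up 5x -> initialization time down 5x
```

In this simple model the initialization time scales inversely with NEA flow, which is consistent with the claim that the higher-capacity OBIGGS II initializes the tanks at least five times faster.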



Increased Revenue
• Projected to capture 90% of available project incentive award fee
• Determined by best performance on prior large-scale integration projects

Our goal to achieve EXCELLENT ratings from the Air Force customer would qualify us to receive greater than 90 percent of the available award fee. This goal would be a stretch for a project of this complexity, but we had studied the lessons learned from past projects and were confident we could do it.


1C.a  Affected Internal and External Stakeholders and How They Were Identified

Stakeholders
• Internal: Engineering, Production, Supplier Management, Support Systems, Training, Field Services, Flight Test
• External: Pilots, Maintainers, Customer Engineering, Suppliers

How Affected Stakeholders Were Identified
• Internal stakeholders identified via project management process at kick-off meeting
• External customer stakeholders identified by Boeing Field Services and USAF engineering customers
• External supplier stakeholders identified through competitive bid process

For section 1C.a, I’ll discuss the affected stakeholders. The internal stakeholders were self-identified per our Boeing project management process at a project kick-off meeting. Our Boeing field services organization and the customer engineers helped identify specific representatives of each customer group who could help us. We arranged a visit to Air Force headquarters to brief our project and ensure we had representation from all affected customer groups. Boeing supplier management helped identify and select the external suppliers who would participate on the project team through the formal Boeing competitive bid process.


1C.b  Types of Impact on Stakeholders and How These Were Determined

Stakeholders | Types of Impact
Internal:
Engineering | Create 750 new drawings for system and support equipment
Production | Plan, install, and test new system components
Supplier Management | Procure 1400 new parts
Support Systems | Create tech manuals and provision spares
Training | Create new training course
Field Services | Prepare to assist USAF maintenance
Flight Test | Install instrumentation and verify new system performance
External:
Pilots | Understand display changes and reduced initialization time
Maintainers | Use new maintenance procedures
Customer Engineering | Monitor project performance / verify specification compliance
Suppliers | Design and deliver new system components

How Types of Stakeholder Impact Were Determined
• Project plans were briefed to internal stakeholders and they estimated the technical and cost impact
• Boeing developed specifications in coordination with potential suppliers, then requested formal proposals. Suppliers then determined their impact and Boeing selected the most favorable proposals.
• Customer stakeholders were invited to design reviews and technical meetings
• Team traveled to eight Air Force Bases to explain system impacts

1C.b. The different types of stakeholder impact are shown here. The internal stakeholder impacts were determined by the stakeholders themselves as part of our formal change process. Supplier stakeholder impacts were determined during the bid process. Our team determined the customer impacts and got concurrence we had adequately assessed them at the recurring technical meetings. We also traveled to eight different Air Force bases to explain system impact and ensure we had customer stakeholder support.


1C.c  Degree of Potential Impact on Stakeholders and How These Were Determined

Stakeholders | Degree of Impact
Internal:
Engineering | High
Production | High
Supplier Management | High
Support Systems | High
Training | Low
Field Services | High
Flight Test | Moderate
External:
Pilots | Low
Maintainers | High
Customer Engineering | Moderate
Suppliers | High

How Degree of Stakeholder Impact Was Determined
• Project plans were briefed to internal stakeholders and they estimated the technical and cost impact
• Boeing developed specifications in coordination with potential suppliers, then requested formal proposals. Suppliers then determined their impact and Boeing selected the most favorable proposals.
• Customer stakeholders were invited to design reviews and technical meetings
• Team traveled to eight Air Force Bases to explain system impacts

For section 1C.c, the degree of stakeholder impact is shown in the table. We determined the degree of stakeholder impact the same way we determined the type of impact; I've repeated that information on this slide. We didn't expect much stakeholder resistance, beyond the normal resistance to change, because none of the stakeholders were negatively impacted. That completes the story of how and why the project was selected. Now I'll introduce John Watson, who will discuss the Current Situation Analysis when we started the project.


2  Current Situation Analysis

Thank you, Don. I'm the Lead Reliability Engineer for the C-17. I'll describe the methods, tools, and analysis we used to determine the root causes of the OBIGGS 1 problems.


2A.a  Methods and Tools Used to Identify Possible Root Causes

Method / Tool | How It Was Used | Who Used It | Why It Was Used
Air Force Maintenance Data | Collect maintenance activity of OBIGGS | Reliability Engineer | Best source of field failure data
FRACAS (Boeing database) | Store data from Air Force for Boeing analysis | Reliability Engineer | Boeing C-17's closed-loop system for tracking corrective actions
GOLD (Boeing database) | Collect data on component repairs | Reliability & Design Engineers | C-17 source of supplier repair induction data
Tracking Charts & In-Service Evaluations | Weekly representation of field activity | Reliability Engineer | To track performance of OBIGGS
Step-by-Step Detailed Analysis | Analyze each and every piece of data | Reliability & Design Engineers | Determine root causes of individual failures
Pareto Analysis | Ranking of components and failure modes | Reliability, Design Engineers & Suppliers | To identify failure drivers within the system
Brainstorming | Free flow of ideas | All stakeholders | Formulate solutions

For section 2A.a, we used a number of methods and tools to determine the possible root causes.


Our main tool was the Air Force database that contains the C-17 maintenance records. This was the best source of data available for identifying OBIGGS 1 component failures, because the records were generated by the pilots & maintenance crew at the time of failure.


Next, our Boeing Failure Reporting, Analysis, and Corrective Action System (FRACAS) was used to correct, sort, analyze, and store the data from the Air Force records. This tool follows our company procedure for a closed loop corrective action system.
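A closed-loop corrective-action record of the kind FRACAS implements can be sketched as a small state machine. The states, fields, and sample data below are invented for illustration and are not the actual FRACAS schema:

```python
# Minimal sketch of a closed-loop corrective-action record: a failure report
# can only be closed after analysis, correction, and verification, which is
# what makes the loop "closed".
from dataclasses import dataclass, field

STATES = ("REPORTED", "ANALYZED", "CORRECTED", "VERIFIED", "CLOSED")

@dataclass
class FailureReport:
    part: str
    failure_mode: str
    state: str = "REPORTED"
    history: list = field(default_factory=list)

    def advance(self, note):
        """Move to the next state, logging the step that completed."""
        i = STATES.index(self.state)
        if i == len(STATES) - 1:
            raise ValueError("record already closed")
        self.history.append((self.state, note))
        self.state = STATES[i + 1]

r = FailureReport("O2 sensor", "drift out of tolerance")
for note in ("root cause isolated", "seal redesigned",
             "fix confirmed in service", "loop closed"):
    r.advance(note)
print(r.state)   # CLOSED
```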


Using another tool called GOLD, we tracked each component returned to the supplier for repair.


We also used several tracking charts and In-Service evaluations to monitor the performance of OBIGGS as we implemented fixes to the system’s components.


We used a detailed step-by-step approach for analyzing each failure which occurred on the system. This degree of analysis is standard for every maintenance action that takes place on the C-17.


Performing Pareto Analyses of all of the failures helped us focus our efforts on the driving components for maximum benefit.


34

2A.a

Methods and Tools Used to Identify Possible Root Causes


And finally, we used Brainstorming methods with our stakeholders and subject matter experts to help identify root causes.


35

Team Analysis of Data to Identify Possible Root Causes

2A.b

Detailed Step-by-step analysis

(Slide graphic: a detailed flight-log spreadsheet for aircraft 90-0532 covering 10-29 Nov 1998, tracking each sortie's location, takeoff time, and flight hours, plus OBIGGS faults and resets, fault codes, Job Control Numbers, maintenance actions, and analyst comments on each discrepancy. Weekly rollups show counted flight hours, MTBMc, and OMS; Week 1, for example, totals 32.8 flight hours with MTBMc = 8.20 and OMS = 100.00.)

Maintenance Data

Match up the pieces of data:
• What failed
• When it failed
• Where in the world it failed
• On which aircraft
• Under what conditions
• How the aircraft was repaired
• How long it took to repair the aircraft
• How long the aircraft was out of service
• What parts were turned in for repair
• How the suppliers fixed the part

For section 2A.b, we analyzed all of the data we had collected using our various tools. We identified all of the components in the system that failed and were removed. We studied what, when, and where they failed, on which aircraft, and what parts were turned in for repair. Our research revealed that not only were the system's components failing far too often, but system initialization also took far too long and added unnecessary stress on other systems. Through this analysis, we quantified the maintenance burden imposed by OBIGGS 1. While the specific numbers cannot be released publicly, I can tell you the system's drain on maintenance, in both time and money, was significant.
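The weekly rollups in the flight-log spreadsheet carry a tracking metric, MTBMc (mean time between corrective maintenance). As a minimal sketch, assuming MTBMc is simply the period's counted flight hours divided by its corrective maintenance actions (consistent with the Week 1 figures in the log excerpt), it could be computed like this; the function name is our own:

```python
def mtbm(flight_hours, maintenance_events):
    """Mean flight hours between corrective maintenance events.

    A period with no events returns infinity, which the original
    spreadsheet rendered as "#DIV/0!".
    """
    if maintenance_events == 0:
        return float("inf")
    return flight_hours / maintenance_events

# Week 1 of the log excerpt: 32.8 counted flight hours and 4 corrective
# maintenance actions give MTBMc = 8.20, matching the spreadsheet rollup.
print(round(mtbm(32.8, 4), 2))
```

Tracking this number week over week is what let the team see whether each component fix actually moved system-level reliability.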


36

Team Analysis of Data to Identify Possible Root Causes

2A.b

Pareto Analysis

(Slide graphic: Pareto chart of OBIGGS component removals across the 28 system components, y-axis 0 to 300 removals; the top 4 components account for 79.3% of all removals.)

4 main problem components were the focus of initial improvement attempts

The results of the Pareto analysis showed that 4 components of the system accounted for almost 80% of the removals. However, many other components also contributed significantly to the system's problems.
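The ranking behind a chart like this is straightforward to reproduce. The sketch below uses hypothetical component names and removal counts (the real 28-component data is not public); it sorts components by removals and accumulates each component's share of the total, so the drivers up to the roughly 80% knee fall out directly:

```python
def pareto_rank(removals):
    """Sort components by removal count, descending, and attach each
    component's cumulative percentage of total removals."""
    ranked = sorted(removals.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(removals.values())
    running = 0
    out = []
    for name, count in ranked:
        running += count
        out.append((name, count, round(100.0 * running / total, 1)))
    return out

# Hypothetical removal counts for eight of the system's components.
counts = {
    "compressor": 260, "controller": 150, "shutoff valve": 120,
    "pressure transmitter": 110, "regulator": 40, "check valve": 30,
    "plumbing": 20, "misc": 70,
}
ranked = pareto_rank(counts)
# The failure drivers: components up to the ~80% cumulative share.
drivers = [name for name, _, cum in ranked if cum <= 80.0]
print(drivers)
```

With these illustrative counts, four components cover 80% of removals, mirroring the shape of the actual chart.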


37

2A.b

Team Analysis of Data to Identify Possible Root Causes

(Slide graphic: recap thumbnails of the detailed flight-log maintenance data and the OBIGGS component-removal Pareto chart from the preceding slides.)

Brainstorming
– Inherent design weakness
– Maintenance malpractice
– Poor quality
– Inadequate troubleshooting manuals and procedures

The brainstorming exercises established a list of possible root causes:
• Some components had inherent design weaknesses.
• Maintenance malpractice occurred where an easy-to-remove component was repeatedly replaced to clear a problem, which only masked the real cause.
• Some components failed again shortly after being repaired, indicating poor repair quality.
• Some of the troubleshooting procedures were inadequate.


38

2A.c

Stakeholder Involvement in Identifying Root Causes

Stakeholder | Involvement | Roles
Reliability & Maintainability Engineer | Collected & analyzed data | Performed the main portion of analysis and reported findings
Design Engineers | Performed detailed failure analysis | Identified Root Causes of component failures
Support Systems | Collected & provided data | Provided Field Reports & spares quantities
Air Force Customers | Participated in Reliability Evaluation | Oversight & Concurrence
Pilots | Operated OBIGGS 1 | Reported system failures
Maintainers | Repaired aircraft system | Recorded maintenance
Engineers | Evaluated data | Concurred with analysis

Turning to section 2A.c, we involved some of our stakeholders in identifying the root causes by having them participate in the analysis. Our Engineering Groups, both Reliability and Design, conducted most of the analysis to determine the possible root causes. Our Support Systems groups assisted by collecting detailed data, writing Field Reports, and reporting spares consumption. And …


39

Stakeholder Involvement in Identifying Root Causes

2A.c

(US Air Force Photo)

US Air Force Customer participated in OBIGGS 1 evaluations and analysis

… our Air Force customers were involved by participating in multiple reliability evaluations with the upgraded OBIGGS 1 components installed. Pilots reported failures, maintenance personnel recorded their repairs, and engineers helped evaluate the data and concurred with our findings.


40

2B.a

Methods and Tools Used to Identify Final Root Causes


Section 2B.a. The methods and tools used to identify the final root causes included all those mentioned earlier, plus several new ones. We gathered more detailed data so we could perform additional Pareto analyses.


41

2B.a

Methods and Tools Used to Identify Final Root Causes


Customer (field user) Interviews | Collect actual experiences | Reliability, Design & Field Engineers | To get the real story of what was going on

We added interviews with the Air Force Pilots and Maintainers to fully understand what was really happening with the system. This verified that we were on the right track.


42

2B.a

Methods and Tools Used to Identify Final Root Causes


Supplier repair databases | Collect & Analyze repair records | Suppliers, Boeing Reliability & Design | Supplement FRACAS with more detail

We supplemented the repair data by contacting the suppliers directly to obtain detailed information about the specific cause of each failure.


43

2B.a

Methods and Tools Used to Identify Final Root Causes


Failure Modes & Effects Analysis (FMEA)

Detailed study of all failure modes

Reliability & Design Engineers

Search for final Root Cause

And we used a Failure Modes and Effects Analysis during the search for a final root cause to identify failure modes which had not yet occurred.

2B.b

Team Analysis of Data to Select the Final Root Causes

Team Analysis

Section 2B.b. Our team of stakeholders, which now included our suppliers, analyzed all of the detailed data to determine the final root causes. We started with our list of possible root causes and then dove deeper into our data.

2B.b

Team Analysis of Data to Select the Final Root Causes

Suppliers’ analysis of removed parts

Our suppliers performed detailed analysis of what failed on each of their returned components and formulated ideas for solutions. They, in turn, involved their sub-tier suppliers for even more detailed analysis of how piece parts were failing and had them conduct further testing.

2B.b

Team Analysis of Data to Select the Final Root Causes

Expanded Pareto analysis

[Pareto chart: OBIGGS Compressor Failure Modes — component failure counts by failure mode]

Pareto results for just one of the driving components show multiple issues

With this added detail, we expanded our Pareto analysis down into problems within the individual components. The Pareto results shown here represent just one of the top 4 driving components, and illustrate the complexity of the system. You can see the multiple ways this single component was failing. Each of the other components had a similar list of issues.
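The ranking step the team describes can be sketched in a few lines of Python. The failure-mode names and counts below are hypothetical illustrations, not the actual C-17 compressor data.

```python
from collections import Counter

def pareto_rank(failures):
    """Rank failure modes by count and report each mode's cumulative share."""
    counts = Counter(failures)
    total = sum(counts.values())
    ranked, cumulative = [], 0
    for mode, n in counts.most_common():
        cumulative += n
        ranked.append((mode, n, round(100 * cumulative / total, 1)))
    return ranked

# Illustrative compressor failure records (hypothetical)
records = (["bearing wear"] * 60 + ["seal leak"] * 25 +
           ["motor burnout"] * 10 + ["sensor fault"] * 5)
for mode, n, cum_pct in pareto_rank(records):
    print(f"{mode:15s} {n:3d}  {cum_pct:5.1f}%")
```

The cumulative-percent column is what makes the "vital few" drivers jump out at the top of the chart.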

2B.b

Team Analysis of Data to Select the Final Root Causes

[Chart: OBIGGS Improvement Projection vs. Actual Reliability — hours vs. month, Mar-99 through Apr-02, plotting the goal, actual reliability, projected improvement after design change implementation, and the system objective]

Plotted performance of the system showed less than desired results from changes made

Our tracking tools continued to show that, even after implementing multiple component design changes, we were not achieving the system-level reliability improvement we expected. We also realized that because the system was so inherently complex, the reliability goal we were shooting for would always be perceived as too low, and the Air Force would remain unhappy with its performance.

2B.b

Team Analysis of Data to Select the Final Root Causes

Failure Modes & Effects Analysis (FMEA)

Identified many failure modes for the OBIGGS 1 components

We conducted a Failure Modes and Effects Analysis of the entire OBIGGS 1 using the results of our detailed analysis reviews. We concluded from this analysis that there were far too many failure modes.
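An FMEA of this kind typically scores each failure mode for severity, occurrence, and detection and ranks by the resulting Risk Priority Number. The sketch below follows that standard scheme; the modes and 1-10 ratings are hypothetical, not the team's actual worksheet.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of the 1-10 ratings for
    severity, occurrence, and detection."""
    return severity * occurrence * detection

# Hypothetical OBIGGS-style failure modes with 1-10 ratings
modes = [
    ("ASM shutoff valve fails closed", 8, 6, 4),
    ("Compressor bearing wear",        7, 8, 3),
    ("O2 sensor drift",                5, 4, 7),
]
ranked = sorted(modes, key=lambda m: rpn(*m[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {name}")
```

A long list of high-RPN modes, as the team found here, is itself a signal that the design, not any single part, is the problem.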

2B.c

Identification of Root Causes and How the Team Validated the Final Root Cause

Final Root Cause: The original design was inherently too complex and too time-consuming to fix to the desired levels

For section 2B.c, the final root cause was identified. The entire OBIGGS 1 was inherently too complex to fix. And even if we could fix the reliability problem, we could not reduce the time it took to initialize the system due to its design methodology. We saw an improvement opportunity: completely redesign the system.

2B.c

Identification of Root Causes and How the Team Validated the Final Root Cause

[Chart: OBIGGS Improvement Projection vs. Actual Reliability, repeated — actual reliability tracked against the goal and the post-change projection]

Actual performance tracking validated that incremental improvements would not result in acceptable performance

Since the Air Force was an active participant on our team, they saw we were not obtaining the desired reliability improvements after numerous design changes. They now understood the operational burdens inherent to the design and concurred with our findings. The customer's willingness to fund our project was evidence of their validation that we had determined the correct root cause. This concludes my portion. Now, I'd like to introduce Brent Theodore, who will present our Solution Development.

3

Solution Development

I was the Systems Engineer on the OBIGGS II project. My part of the story is to describe how we developed the solution.

3A.a

Methods and Tools Used to Develop Possible Solutions

Fault Tree Analysis
[Fault tree excerpt — branches include: ASM check valve failure; ASM shutoff valve diaphragm disbond; ASM shutoff valve fails closed; ASM pressure regulator mis-regulates; ASM filter plugged; low air pressure out of OBIGGS heat exchanger; NEA inlet filter blocked; tubing leak or blockage between ASM and compressor; low NEA pressure out of ASM; low NEA pressure into compressor (cont. on page 3)]

Brainstorming

Benchmark Suppliers

Possible Solutions

Section 3A.a. One method we used to identify lessons learned from OBIGGS 1 was fault tree analysis. We used brainstorming throughout the process. To stimulate the brainstorming, we traced the OBIGGS 1 design back to the original requirements; in some cases, we found those requirements were based on overly conservative assumptions. We also considered technology that was immature during the initial design but was now proven. In addition, we visited multiple suppliers during the early phase to gain input on our preliminary concepts, learning what contributions they could make and what we could do at the system level to simplify their designs and lower risk. Using these methods, we successfully consolidated the ideas for improvement into four different concepts that could be more reliable than OBIGGS 1. The consensus of the team verified the four solutions were viable.

3A.b

Team Analysis of Data to Develop Possible Solutions

Performance

For section 3A.b, the team analyzed the four possible solutions and defined the architecture and required performance for each. This effort resulted in a set of components for each solution that would be used in further detailed analysis. We also created analytical tools to determine how much nitrogen would be needed to inert the tanks for each architecture.
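The nitrogen-sizing analysis the team mentions can be approximated with a well-mixed washout model: NEA displacing ullage gas drives the oxygen concentration exponentially toward the NEA's own oxygen level. This is a generic textbook sketch, not the team's tool; the 180 cfm flow, the 5% NEA oxygen content, and the 9% inerting target are assumed illustration values (only the 5110 cu ft tank volume comes from the slides).

```python
import math

def ullage_o2(t_min, v_ullage_ft3, nea_flow_cfm, o2_start=20.9, o2_nea=5.0):
    """Well-mixed washout model: ullage %O2 after t minutes of NEA injection."""
    exchanges = nea_flow_cfm * t_min / v_ullage_ft3
    return o2_nea + (o2_start - o2_nea) * math.exp(-exchanges)

def time_to_inert(v_ullage_ft3, nea_flow_cfm, o2_target=9.0,
                  o2_start=20.9, o2_nea=5.0):
    """Minutes of NEA flow needed to pull ullage oxygen below the target."""
    exchanges = math.log((o2_start - o2_nea) / (o2_target - o2_nea))
    return exchanges * v_ullage_ft3 / nea_flow_cfm

# Illustrative: 5110 cu ft tank volume with a hypothetical 180 cfm NEA flow
print(round(time_to_inert(5110, 180), 1), "minutes to reach 9% O2")
```

Running candidate architectures through a model like this is how a required NEA flow can be traded against an initialization-time requirement.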

3A.b

Team Analysis of Data to Develop Possible Solutions

Sizing

Performance

Component size and weight were analyzed to meet baseline performance for each option. Then, component data was totaled for each system to use in comparing the options.

3A.b

Team Analysis of Data to Develop Possible Solutions

Sizing

Performance

Reliability

We analyzed the reliability of each component by combining supplier data with our aircraft operation experience. Then, we computed a system-level reliability for each solution.
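Rolling component reliabilities up to a system figure is, for a series system, a sum of quantity-weighted failure rates. The sketch below shows that standard roll-up; the part names, quantities, and MTBM values are hypothetical, not the program's predictions.

```python
def system_mtbm(components):
    """Series-system roll-up: failure rates add, so the system MTBM is the
    reciprocal of the summed, quantity-weighted component failure rates."""
    total_rate = sum(qty / mtbm for _, qty, mtbm in components)
    return 1.0 / total_rate

# Hypothetical components: (name, quantity per aircraft, MTBM in flight hours)
parts = [
    ("shutoff valve",         2, 20_000),
    ("boost compressor",      2,  8_000),
    ("air separation module", 8, 40_000),
    ("oxygen sensor",         2, 15_000),
]
print(round(system_mtbm(parts)))
```

Note how the system figure lands far below the best component's MTBM: with many parts in series, every added component drags the system number down, which is why reducing part count was so central to the redesign.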

3A.b

Team Analysis of Data to Develop Possible Solutions

Sizing

Performance

Reliability

Cost

Component costs were computed in the same way and then totaled for each solution. With solid estimates of the performance, sizing, reliability, and cost, we were ready to rate how well each option satisfied the selection criteria received from the customer.

3A.c

Criteria the Team Decided to Use in Selecting the Final Solution

Design Requirements

Design Criterion | Score 5 | Score 3 | Score 1
1. Supports tank volume of 5110 cu ft | Supports > 5110 | — | Supports < 5110
2. Maintain tank and vent system inert through all mission profiles | Tanks and vent inert through all profiles | Tanks inert through all profiles, vents most | Tanks and vents inert through most profiles
3. Total engine flow within limits | < 12 % | — | > 12 %
4. Initialization time < 40 min. | t < 30 min. | 30 min. ≤ t < 180 min. | 180 min. ≤ t
5. Mean Time Between Maintenance, corrective | MTBMc > 100 hrs | 52.5 hrs ≤ MTBMc ≤ 100 hrs | MTBMc < 52.5 hrs
6. Life Cycle Costs | LCC ≤ 90% of current | 90% of current < LCC < current | LCC ≥ current
7. No increase in pilot workload | Decrease in workload | Same workload | Slight increase in workload
10. Qualified components | Qualified | Partially qualified | Not qualified
11. Fuel tank pressures | Meets pressure settings | — | Doesn't meet pressure settings
12. Single ASM failure does not limit mission capability | All missions possible | 95% of missions still possible | 90% of missions still possible
13. Detect individual LRU failures | LRUs identified and isolated by BIT | Failures identified, but fault tree required for isolation | Periodic ops checks and isolation required
14. Capable of inerting 2000 fpm descent with any single failure | 2000 fpm possible with all single failure types | 2000 fpm possible with all except 2 failure types | 2000 fpm possible with all except > 2 failure types
15. No two failures cause critical structural failure or prevent recovery | No critical double failures | — | Critical double failures exist
16. No Real Hazard Index > 11 | All RHIs < 8 | 8 ≤ RHIs < 11 | Some RHIs ≥ 11
17. Current cockpit philosophy | Integrated | Pseudo-integrated | Not integrated
18. Capability of retrofit | Easy retrofit | Hard to retrofit | Can't retrofit
20. General design practices | Design standards followed in all areas | Design standards followed in most areas | Design standards followed in some areas
21. Production Cost Savings | CS > $300K | $150K < CS ≤ $300K | CS ≤ $150K

Note: Sensitive data blocked out (shown as —)

Section 3A.c shows the design criteria used to evaluate each of the possible solutions. We surveyed the customers to ensure we had captured and ranked the critical system-level requirements. A Quality Function Deployment (QFD) analysis then defined the relationship between each design criterion and those system requirements. The design criteria were then weighted to correlate with the system-requirement weighting factors provided by the customer. The highest weighting was applied to criteria 4 and 5, initialization time and reliability, to support our company strategy to Run A Healthy Business.
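The QFD step can be sketched as a matrix product: each design criterion's weight is the customer requirement weights pushed through a relationship matrix. This is a generic illustration of the technique, not the program's matrix; the 5/3/2 requirement weights and 9/3/1/0 relationship strengths below are assumed values.

```python
def qfd_weights(req_weights, relationships):
    """Derive design-criteria weights from customer requirement weights via a
    QFD relationship matrix (rows: requirements, columns: criteria; entries
    use the common 9/3/1/0 strength scale). Returns normalized weights."""
    n_criteria = len(relationships[0])
    raw = [sum(req_weights[i] * relationships[i][j]
               for i in range(len(req_weights)))
           for j in range(n_criteria)]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical: 3 customer requirements weighted 5/3/2, mapped to 3 criteria
req_w = [5, 3, 2]
rel = [
    [9, 3, 0],   # requirement 1 relates strongly to criterion 1
    [3, 9, 1],
    [0, 1, 9],
]
print([round(w, 3) for w in qfd_weights(req_w, rel)])
```

The resulting normalized weights are what feed directly into the trade-study scoring described in section 3B.b.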

3B.a

Methods and Tools Used by the Team to Select the Final Solutions

[Flow: Possible Solutions → Assembled Stakeholder Team → Presented Analysis (sizing, performance, reliability, cost, and the design criteria from 3A.c) → Performed Trade Study → Final Solution]

Section 3B.a. We conducted an extensive trade study to select the final solution following our standard Boeing Systems Engineering practice for optimizing a balanced trade-off of requirements among various engineering design alternatives. First, we expanded our team to include representatives of all of the stakeholders. We presented a description of each possible solution and the projected performance, sizing, reliability, and cost of each to the expanded team. This review gave an opportunity for the different stakeholders to suggest new ideas for consideration. Finally, the stakeholder team scored each option in the trade study before selecting the final solution.

3B.b

Team Analysis of Data to Select the Final Solution

[Flow: Establish Criteria / Identify Constraints (from System Requirements) → Weighting → Determine Options (Options 1-4) → Score Options → Compare Scores → FINAL SOLUTION]

In Section 3B.b, the four design options and the QFD analysis developed previously were combined in our detailed trade study analysis. The QFD defined the weighting factors for the design criteria, correlated with the customer-weighted system requirements; this ensured higher priority was given to the designs that best met the customer's needs. The team then scored each possible solution against each design criterion. After the scoring was complete, we compared the results to ensure they were objective and consistent. Then the weighting factors were applied and the results were totaled. The option with the highest score became our final solution.
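The score-weight-total step described here is a weighted-sum decision matrix. The sketch below shows the mechanics; the weights and the 5/3/1 scores for the four options are invented for illustration and do not reflect the actual study results.

```python
def trade_study(weights, scores):
    """Weighted-sum trade study: multiply each option's criterion scores by
    the criterion weights, total them, and pick the highest-scoring option."""
    totals = {opt: sum(w * s for w, s in zip(weights, vals))
              for opt, vals in scores.items()}
    winner = max(totals, key=totals.get)
    return winner, totals

# Hypothetical weights and 5/3/1 scores against three criteria
weights = [0.5, 0.3, 0.2]
scores = {
    "Option 1": [3, 3, 5],
    "Option 2": [5, 3, 3],
    "Option 3": [1, 5, 5],
    "Option 4": [5, 5, 3],
}
winner, totals = trade_study(weights, scores)
print(winner, totals[winner])
```

Comparing the raw (unweighted) and weighted rankings is also a quick consistency check: if the winner flips, the weighting, not the scoring, is deciding the outcome.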

3B.c

Involvement of Stakeholders in the Selection of the Final Solution

[Diagram: TRADE STUDY at the center, surrounded by the stakeholders — Flight Crews, Operators, Business Operations, Suppliers, Air Force, Air Force Engineering, Production, Logistics Support, Maintainers, Engineering]

3B.c. All stakeholders evaluated each option against the criteria and reached consensus on a score.

3B.c

Customer engineers and pilots had provided the requirements that were inputs to the trade study and could clarify them during the scoring process.

3B.c

Boeing stakeholders used their expertise to estimate the performance of the options.

3B.c

Each member participated in the trade study that led to the final solution.

3C.a

Final Solution and How the Team Validated the Final Solution

Final Solution – Complete System Redesign
• Continuous flow
• Permeable membrane air separation
• Boost compressor for rapid descents
• Bleed air supply from environmental control system
• Open architecture control
• No fuel scrubbing

[Diagram: OBIGGS 2 system, left-hand side — bleed air inlet, air filter, OBIGGS heat exchanger, ECS pack, ASM compressor, ASM supply, NEA supply and manifold, NEA vent and sweep valves, tank 2 rear spar penetration, bilge crawler O2 sensor, ram air scoop, air crossover to tank 1, overboard exhaust]

Now for section 3C.a, the final solution was to completely redesign the OBIGGS with the characteristics listed on this slide. This solution offered the largest potential return on investment, even though the development cost was high.

3C.a

Final Solution and How the Team Validated the Final Solution

Cost As An Independent Variable
[Chart: performance vs. cost for OBIGGS I and Options 1-4]

System Performance Modeling
[Chart: ullage %O2 by tank and altitude vs. time over an 8-hour mission]

System Lab Test

After completing the trade study, a Cost-As-An-Independent-Variable analysis was performed to validate the selected solution. Additional validation efforts included detailed system performance modeling and the assembly of an entire system for laboratory testing.

3C.b

Tangible and Intangible Benefits Expected by Implementing the Team's Solution

Tangible Benefits

[Chart: OBIGGS II vs. OBIGGS I reliability demo results — hours vs. weeks; OBIGGS II actual mean time between removal exceeded both the OBIGGS II projected mean time between removal and the OBIGGS 1 actual; 7,376% improvement realized]

1100% Increase in system reliability

(US Air Force Photo)

Reduce Initialization Time by a factor of 5

(US Air Force Photo)

Reduce weight by 475 lbs to allow for increased cargo capability

20% system and 3:1 life cycle cost savings

Some of the tangible benefits we expected to realize for section 3C.b were:
• Significantly improved reliability and reduced initialization time, to make the airplane more available to fly equipment to the front lines
• Reduced system weight, to increase cargo capability
• Reduced production and logistics costs

3C.b

Tangible and Intangible Benefits Expected by Implementing the Team’s Solution

Intangible Benefits

Customer Satisfaction

Industry Leader

Open System Architecture

[Diagram: OBIGGS II system control/impacts — a new Remote Interface Unit (RIU) handles sensor and valve-position inputs and valve relay drive outputs over the existing WACS and Mission buses; only switch hardware changes to the MMP and GRU; no software or hardware changes required to the other avionics boxes]

Intangible benefits we expected were:
• To improve customer satisfaction
• To be the industry leader in inerting system design
• To incorporate an open architecture design that would reduce the cost of future improvements
All benefits aligned with our organizational strategies.

3C.c

How the Team Used Data to Justify Implementation of the Team's Solution

[Table: Final reliability analysis — predicted mean time between removal (MTBR, shipset flight hours) at PDR, CDR, and HST-CDR for each OBIGGS II component (shutoff, crossfeed, pressure-regulator, bypass, low-flow, vent-supply, and sweep valves; boost compressor assembly; heat exchangers; ducting; filter assembly; air separation modules; oxygen sensors; remote interface unit). The OBIGGS II system total prediction rose from 556 hrs at PDR to 693 hrs at CDR and 761 hrs at HST-CDR, against a PTP-111 threshold of 500 hrs and an objective of 600 hrs (improvements of 975%, 1050%, and 1100%).]

Final Reliability Analysis

Computational Fluid Dynamics Analysis

(US Air Force Photo)

3C.c. The team used data from various analyses to justify the selection of OBIGGS II. The final reliability analysis used detailed inputs from actual supplier experience to predict the reliability of each component. The Computational Fluid Dynamics analysis proved the oxygen in the tanks would be evenly distributed and our initialization time goal would be met.

3C.c

How the Team Used Data to Justify Implementation of the Team's Solution

[Final reliability analysis table repeated from the previous slide]

Final Reliability Analysis

Computational Fluid Dynamics Analysis

[Chart: ullage %O2 per tank and altitude vs. time over an 8-hour mission profile]

OBIGGS Mission Analysis Program

(US Air Force Photo)

Weight Analysis

Life Cycle Cost Analysis

The weight analysis was a summation of the added components and structural changes, less the weight of equipment removed. The Life Cycle Cost Analysis showed the total cost benefit over time. The OBIGGS Mission Analysis Program was a computer tool developed to simulate the performance of the entire OBIGGS; this simulation confirmed the tanks would remain inert through 28 different mission profiles. In all cases, the results confirmed the earlier estimates used during the trade study. Now I'd like to introduce Rick Morey, who will present project implementation and results.
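The weight roll-up described here is simple bookkeeping: additions minus removals. In the sketch below, the individual component weights are hypothetical and chosen only so the net matches the 475-lb reduction cited on the benefits slide; they are not the actual C-17 figures.

```python
def net_weight_change(added, removed):
    """Net installed-weight delta: total of added items minus total removed."""
    return sum(added.values()) - sum(removed.values())

# Hypothetical component weights in pounds (not the actual C-17 figures)
added = {"ASMs": 320, "boost compressor": 150, "new ducting": 90, "RIU": 15}
removed = {"OBIGGS 1 equipment": 1050}
print(net_weight_change(added, removed))  # negative means weight saved
```

Keeping the ledger itemized this way also makes it easy to re-total the delta whenever a component's weight estimate changes during detailed design.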

4

Project Implementation and Results

Hello, I was the OBIGGS II project manager and will talk about the project implementation and results.

4A.a

Types of Internal and External Stakeholder Involvement in Implementation

FORMAL DESIGN REVIEWS
• System Requirements Review
• System Design Review
• Preliminary Design Reviews (Supplier and Customer)
• Critical Design Reviews (Supplier and Customer)

DESIGN FOR MANUFACTURING AND ASSEMBLY
• Assembly Simulations
• Prototype Fit Checks on Aircraft
• Document Quality Inspections

PRODUCTION SUPPORT
• Proactive Issue Resolution
• First Article Inspections

VALIDATION / VERIFICATION
• Combined Validation/Verification Component Reviews
• Flight Test
• In-Service Evaluation

Teamwork

Communication

For section 4A.a, we had four general types of internal and external stakeholder involvement on our project. Formal design reviews were conducted to present the design requirements, concepts, and status to all stakeholders. The project team worked closely with manufacturing personnel to ensure a seamless implementation into the production line. Engineers co-located with manufacturing personnel supported production during first-article assembly. Issues were documented and status was provided daily. All stakeholders were involved in the validation and verification of the final product.

4A.b

How Various Types of Resistance Were Identified and Addressed

Type | How Identified | How Addressed
Customer reluctance to fund project due to high cost | Customer feedback during negotiations | Detailed estimates, competitive pricing & life cycle cost analysis
Supplier not willing to control interfaces to requested tolerances | Interface Key Characteristic reviews | Negotiated compromise during weekly supplier coordination meetings
Production schedule impact from late parts | Feedback from production stakeholder on team | Established agreed-to lead times for parts
Production schedule impact from learning curve | Feedback from production stakeholder on team | Fit checks, dedicated engineering support
Production concern about part damage on installation | Feedback from production stakeholder on team | Assembly simulation and created protective covers
Cluttered production work space | Lean initiatives coordination meetings with Production | Created point-of-use carts to transport selected parts
Flight test airplane out of service too long | Customer feedback during flight test planning | Installed instrumentation in production
Resistance to Model Based Definition from QA | QA feedback at first article inspection | Generated 2D inspection sheets from 3D models

Section 4A.b. The table shows the various types of resistance identified during implementation. These issues surfaced through coordination with stakeholder representatives. Each was logged as an action item and worked by the team until the affected stakeholder's concern was resolved.

4A.b

As an example, our Air Force customer was concerned the flight test airplane would be out of service too long. This concern was expressed during an early design review and assigned as an action item. The team coordinated with production to install the flight test instrumentation during aircraft assembly, instead of after delivery. This plan reduced the flight test schedule by 6 weeks and resolved the customer concern.

4A.c How Stakeholder Buy-in Was Ensured

Stakeholders | Plan to Ensure Buy-in | Validated By
Engineering | Developing own implementation plans; reported progress to them regularly | Dedicated support to the project; commitment to plan evident during regular status reviews
Production | Early involvement in development of installation plans; collocated engineers on first assembly; full-scale mockups of large parts | Requests for manufacturing features on designs; strong participation in mockup trial installations; positive feedback during first installations
Supplier Management | Early, close coordination with engineering; participation in drawing release reviews | Strong participation; provided part-by-part status weekly; aggressive resolution of issues
Support Systems | Development of own performance metrics and reporting progress to stakeholders | Enthusiastic participation in design reviews; early coordination of validation impacts with customer
Training | Early coordination with engineering aided course development | Early development of plan; communication with project team and customer
Field Services | Early visibility from design reviews aided planning of future customer support | Initiative in learning the system prior to first delivery
Flight Test | Full-time interaction with design team, from development through test flights | Outstanding management of instrumentation installation in production; close coordination with engineering when developing test plans
Pilots | Dramatic potential improvement of inerting system | Affirmation during base visits
Maintainers | Design reviews at bases prior to implementation; participation in mockup installation | Enthusiastic participation at bases during reviews, mockup installation, and follow-up communication
Customer Engineering | Involvement in project selection; frequent, regular communication; full-system lab test | Strong support for project; teamwork in decisions addressing challenges; regular communication
Suppliers | Frequent communication and design reviews; they were team members | Strong participation in developing design solutions; commitment to schedule needs

Section 4A.c. All stakeholders were involved early in the project. They determined their impacts and gave input to help shape certain decisions. They also developed their own implementation plans and performance metrics and reported status regularly. This table lists the different ways we ensured we had stakeholder buy-in.

4A.c

As an example, we ensured buy-in by production mechanics and maintainers by creating full scale mock-ups of large parts to demonstrate their installation.

4A.c How Stakeholder Buy-in Was Ensured

Stakeholder participation in design development

The mockups were installed on a trial basis during the design phase by the mechanics who would do the work in the future. They enthusiastically participated in this opportunity to validate the design at an early stage and gave their feedback and buy-in.

4A.c

Another example that helped ensure customer buy-in was the assembly of an entire functioning system …

4A.c How Stakeholder Buy-in Was Ensured

Validation of system performance provided confidence in design

… in a lab to simulate operational performance during all phases of flight. The test proved the system would meet requirements, which reduced risk for the customer: any needed adjustments could have been made before a large percentage of the project budget was spent. This test inspired customer confidence and validated their investment.

4B.a Plan Developed by the Team to Implement its Solution

Team Plan: Project Task Plan, Integrated Master Schedule, Integrated Master Plan, Risk Mitigation Plans

Stakeholders | Types of Impact
Internal
Engineering | Create 750 new drawings for system and support equipment
Production | Plan, install, and test new system components
Supplier Management | Procure 1400 new parts
Support Systems | Create tech manuals and provision spares
Training | Create new training course
Field Services | Prepare to assist USAF maintenance
Flight Test | Install instrumentation and verify new system performance
External
Pilots | Understand display changes and reduced initialization time
Maintainers | Use new maintenance procedures
Customer Engineering | Monitor project performance / verify specification compliance
Suppliers | Design and deliver new system components

4B.a. A contractual document called a Project Task Plan (PTP) was developed by all of the stakeholders and approved by the Air Force customer. The PTP included a technical overview of the project, describing aircraft system, structural and avionics changes, and a high-level team plan. The team plan identified key project milestones with target completion dates. Entry and exit criteria were identified for each milestone. The plan was further developed in the Integrated Master Schedule and the Integrated Master Plan. Mitigation plans were developed and implemented for each risk identified during the project life cycle.

4B.b

Procedure, System or Other Changes Made to Implement the Solution and Sustain the Results

Procedure/System Change | How Evaluated | Evidence of Sustainment
Drawing Quality Inspection (all drawings reviewed by stakeholders before release) | Resulted in a drawing quality metric 33% better than any previous large project | Procedure has been adopted by other projects
Project Drawing Report (online spreadsheet with real-time status) | Source for drawing status statistics and the drawing quality metric | Procedure has been adopted by other projects
Project Program Directives (documented approaches to technical and project management subjects) | Affected stakeholders concurred that each directive would ensure desired results | Directives followed throughout the project and adopted for the OBIGGS II retrofit project
Production Tag-Up Database (spreadsheet on the project server to track production issues, with links to artifacts) | Valuable communication tool for daily meetings with production while the first ship was assembled | Used by all departments as the first ship progressed through the assembly line; action items quickly resulted in permanent producibility improvements
Reliability Evaluation Plan (evaluation of OBIGGS before and after the project to assess technical effectiveness) | Customer concurrence that the plan would verify effectiveness of the design change | Reliability continues to be monitored following the evaluation format; issues are quickly identified and addressed

Section 4B.b. Several effective procedure and system changes were developed to implement OBIGGS II. They were sustained throughout the project and some have been adopted by other projects.

4B.b

One example was the creation of project program directives. They defined strategies to be followed during the project for various subjects, like drawing requirements or communication management.

4B.b

Procedure, System or Other Changes Made to Implement the Solution and Sustain the Results

Program Directives Were Developed for OBIGGS II

Each directive was developed by the affected stakeholders to ensure that it would meet the desired results. The directives were followed throughout the project and have been adopted for the OBIGGS II retrofit projects as well.

4B.c

Creation and Installation of a System for Measuring and Sustaining Results

Drawing Status

Established drawing completion status to monitor on-time release

(Chart: OBIGGS II Aircraft Systems IPT, Propulsion & Environmental Control Systems COR burndown, July 2003 through February 2005; planned vs. actual drawing counts tracked monthly, data current as of 19 May 2004. Actual burndown tracked the plan closely, with only small monthly variances.)

4B.c. The OBIGGS II teams used both existing and new systems to measure and sustain the project results. Project-specific reports and metrics were developed to measure parameters such as:
• Engineering drawing creation
• Technical manual creation
• Part procurement
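As an illustration of the drawing burndown tracking shown in the chart above, the variance-to-plan metric can be sketched in a few lines of Python. The remaining-drawing counts below are loosely adapted from the chart for illustration and are not the exact COR data.

```python
# Hedged sketch of a drawing-burndown variance metric; counts are illustrative.

def burndown_variance(planned_remaining, actual_remaining):
    """Per-period variance to plan: positive means more drawings remain than planned."""
    return [actual - plan for plan, actual in zip(planned_remaining, actual_remaining)]

plan   = [318, 311, 299, 282, 237, 185, 134, 90]
actual = [318, 316, 296, 279, 227, 179, 133, 90]
print(burndown_variance(plan, actual))
# prints [0, 5, -3, -3, -10, -6, -1, 0]
```

Periods with positive variance are the ones a project lead would flag for corrective action.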

4B.c

Creation and Installation of a System for Measuring and Sustaining Results

First team to utilize a combined schedule and performance tool (IPAS)

EVMS performance input weekly

Note: Sensitive data blocked out

Performance and schedule were integrated into one common tool. It was updated weekly by the stakeholders. This data was used to generate performance metrics to manage the project and to report results to executive leadership and the customer. Project tasks that were not progressing to the plan were easily identified for corrective action.
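Although the slide does not show IPAS internals, the kind of weekly earned-value metrics such a combined schedule and performance tool derives can be sketched as follows. The function names and dollar figures are illustrative assumptions, not the tool's actual calculations.

```python
# Illustrative earned-value (EVMS) calculations; sample $K values are assumed.

def cost_performance_index(earned_value, actual_cost):
    return earned_value / actual_cost        # > 1.0: under cost

def schedule_performance_index(earned_value, planned_value):
    return earned_value / planned_value      # > 1.0: ahead of schedule

ev, ac, pv = 480.0, 500.0, 450.0             # one reporting week, in $K
print(round(cost_performance_index(ev, ac), 2),
      round(schedule_performance_index(ev, pv), 2))
# prints 0.96 1.07
```

An index drifting below 1.0 is exactly the kind of signal that made lagging tasks "easily identified for corrective action."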

4B.c Creation and Installation of a System for Measuring and Sustaining Results

(Chart: OBIGGS II Reliability Evaluation, Mean Time Between Removal (MTBR) tracking by week, 17 April through 14 August 2006. Plots projected and actual cumulative demonstration flight hours against the MTBR threshold, with actual OBIGGS II MTBR above the threshold.)

System reliability was demonstrated during the project and continues to be monitored

After several ships were delivered with OBIGGS II, a new reliability evaluation was conducted to verify that reliability targets were met. The team reviewed the actual reliability data for the in-service airplanes weekly. Even though the team completed the reliability verification requirement when the evaluation ended, we have continued to monitor the system reliability. Through this monitoring, one potential issue has been identified, and a solution has been developed.
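The weekly reliability figure itself is simple to compute. This hedged sketch shows the MTBR arithmetic; the sample hours, removal count, and the zero-removal convention are illustrative assumptions, not the demonstration data.

```python
# Hedged sketch of weekly MTBR tracking; all values are illustrative.

def mtbr(cumulative_flight_hours, removals):
    """Mean time between removal; with no removals yet, report hours flown so far."""
    if removals == 0:
        return cumulative_flight_hours
    return cumulative_flight_hours / removals

print(mtbr(cumulative_flight_hours=3000.0, removals=2))   # prints 1500.0
```

Comparing each week's value against a fixed threshold, as the chart does, turns the raw fleet data into a simple pass/fail trend.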

4C.a Types of Tangible and Intangible Results That Were Realized

Tangible Benefits

(Chart: OBIGGS II vs. OBIGGS I reliability demonstration results over 16 weeks. OBIGGS II actual mean time between removal far exceeded both the OBIGGS II projection and OBIGGS I actual performance: a 7,376% improvement was realized.)

Achieved 7400% Increase in system reliability vs. 1100%

(US Air Force Photo)

Reduced Initialization Time by a factor of 11 vs. 5

(US Air Force Photo)

Reduced weight by 517 lbs. vs. 475 lbs. allowing for increased cargo capability

20% system and 3:1 life cycle cost savings as predicted

Section 4C.a shows the tangible results we achieved during the project. They greatly exceeded our expectations. The measured system reliability for OBIGGS II is 74 times better than for OBIGGS I. The initialization time was reduced by a factor of 11. The cost and weight savings were as good or better than predicted.

4C.a Types of Tangible and Intangible Results That Were Realized

Intangible Benefits

(U.S. Air Force photo by Airman 1st Class Samantha Willner)

Customer Satisfaction

(Diagram: OBIGGS II system control/impacts, dated 8/26/03. A new RIU box and new bleed manifold integrate with existing avionics (WCC, MMP, LFCP, MFDC, APDMC, EEC, ECS, GRU, CIP) over the WACS and Mission buses; most existing units required no software or hardware change, or switch modifications only.)

Industry Leader

Open System Architecture

We achieved the intangible benefits we expected. Our customer is delighted with the performance of OBIGGS II and the open architecture design. Our team has been recognized as an industry leader and recruited to assist in other inerting system development projects within Boeing.

4C.b How Results Link with Organization Goals, Performance Measures and Strategies

(Flowchart: proposed First Article Inspection process for new projects, spanning Design Engineering, Material and Process Engineering, Electrical Bonding Cognizant Engineer, Quality Assurance, Planning, and Production. First Article Inspection requirements are added to drawings and to AOs/AAOs per TA-PD-233; designated inspection points are inspected, results are documented, and engineering or installation errors are corrected before completion.)

Process Improvements
• Four different processes

Reduced Initialization Time
• Improved by a factor of 11

Improved Reliability
• Improved by a factor of 74

Increased Revenue
• Captured excellent rating for every award fee period throughout the project

(Strategy map: stakeholder requirements and expectations (customer, work force, suppliers, community, shareholders) drive the value-creation strategies Run Healthy Business, Profitably Expand Markets, Create New Frontiers, and Leverage to Emerging Opportunities, including aggressive, sustainable improvements to safety, quality, schedule and cost; strengthened stakeholder relationships; relentless process improvement and integration; additional C-17 business (C-17, BC-17X, International); launch of the C-17A+; Performance Improvement contracts; expanded alliances and partnerships; agile logistics; network-centric airlift/support capability integration; and accelerated technology integration. Vision: People Working Together to Provide the World's First Choice for Global Airlift and Mobility Solutions.)

Section 4C.b. The project's results directly supported the C-17 20-Year Strategy as planned. We achieved aggressive financial improvement by reducing logistics and production costs and by earning excellent ratings during all award fee periods. Stakeholder relationships were strengthened by the reduced initialization time and improved reliability. The process for stakeholder drawing review was improved: all drawings were reviewed by production, support systems, supplier management, and other engineering disciplines before release. As a result, our drawing quality metric was 33% better than any previous large project.

4C.c

How Results Were Shared with Stakeholders

Communication Vehicle | Frequency
Project Team Stand-Up Meeting | Daily
Action Item call with customer | Weekly
Status Meeting | Weekly
Technical Interchange Meetings | Bi-Monthly
Video Conference Reviews | Bi-Monthly
Internal Project Reviews | Bi-Monthly
Design Reviews | Various, early in project
Flight Test Report | After Flight Test
Reliability Evaluation Report | After Evaluation

Internal stakeholders: Engineering, Production, Supplier Management, Support Systems, Training, Field Services, Flight Test, Boeing Exec Leadership
External stakeholders: Pilots, Maintainers, Customer Engineering, Suppliers, Cust Exec Leadership

For section 4C.c, the team communicated results with the stakeholders regularly, following our program directive for Communication Management. This table shows the different ways they were communicated to the stakeholder groups.

4C.c

How Results Were Shared with Stakeholders

Communication Vehicle | Effectiveness | Effectiveness Indication
Project Team Stand-Up Meeting | High | High participation. Interaction of engineering and internal stakeholder leads ensured required attention to action items.
Action Item call with customer | High | Technical and project issues documented as action items. Status was provided and items closed with concurrence from stakeholders.
Status Meeting | High | Fostered a collaborative environment. All production drawings released by baseline date. No parts late to assembly start dates.
Technical Interchange Meetings | High | High interest for all TIMs, in Long Beach and at bases. Fostered internal-external teamwork. New relationships continue through today.
Video Conference Reviews | High | Key project management information exchanged with Boeing and customer leadership. Project health led to excellent award fee ratings.
Internal Project Reviews | Med | Communicated project technical, schedule and cost status with internal stakeholders. Action items created to address issues.
Design Reviews | High | Highly attended. Provided key Boeing and supplier design information. Action items documented, worked, and statused at follow-up meetings.
Flight Test Report | Med | Documented that tests verified the system met its performance requirements. Resulted in release of OBIGGS II capability in the fleet.
Reliability Evaluation Report | Med | Documented that the system met reliability requirements during evaluation. Customer concurred that key project milestone closure criteria were met.

This communication was very effective. It established an environment that fostered teamwork among all stakeholders.

4C.c How Results Were Shared with Stakeholders

For example, when delivery of the first ship was imminent, meetings were held with maintainers to communicate the latest results. They enthusiastically participated and asked insightful questions. These meetings established contacts between the project team and the end users that continue today. This has contributed to efficient maintenance of the system in the field. Now, Ben Canfield will be our concluding speaker.

International Team Excellence Award Competition – 30 April, 2007

92

5 Team Management

Good afternoon, I’m representing Program Management and will cover how the team managed the project. This was the largest design change for the C-17, so team selection and management were key to its success.


5A

How the Team Members were Selected and Involved Throughout the Project

Identified functional impacts within each department
– Work Breakdown Structure created
– Detailed Statement of Work created

For section 5A we will review how impacts to the functional engineering groups were determined, how key representatives were selected, and how we maintained a high level of performance through ownership. With the PTP in hand, a detailed Work Breakdown Structure (WBS) was created to cover the entire scope, schedule, and budget of the project. Statements of work were created for all tasks to identify which functional groups were impacted and to what degree. The WBS inputs were traced back to the PTP to help each group manage its effort. This trace was also used by project management to ensure all tasks supported the C-17 master schedule.
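The deck does not show the team's actual traceability tooling, but the WBS-to-PTP trace described above can be sketched as a simple consistency check: every WBS task carries a reference to the PTP element it supports, and any task without a valid reference is flagged. All identifiers below are hypothetical, not the project's real data.

```python
# Minimal sketch of WBS-to-PTP traceability. Task ids, owners, and
# PTP element ids are illustrative assumptions only.

# Each WBS task records the PTP element it traces back to.
wbs_tasks = {
    "WBS-1.1": {"owner": "Aircraft Systems", "ptp_ref": "PTP-03"},
    "WBS-1.2": {"owner": "Airframe",         "ptp_ref": "PTP-07"},
    "WBS-2.1": {"owner": "Test",             "ptp_ref": None},  # untraced
}

ptp_elements = {"PTP-03", "PTP-07"}

def find_untraced(tasks, ptp):
    """Return WBS task ids whose ptp_ref is not a known PTP element."""
    return sorted(tid for tid, task in tasks.items()
                  if task["ptp_ref"] not in ptp)

print(find_untraced(wbs_tasks, ptp_elements))  # -> ['WBS-2.1']
```

A check like this is what lets project management confirm that every task supports the master schedule: any orphaned task surfaces immediately instead of at a review.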


5A

How the Team Members were Selected and Involved Throughout the Project

Representatives identified within each organization
– Internal customers
– Suppliers
– Air Force customer

Representatives were selected from the organizations covering all major functional groups. Production, tooling, release, and suppliers all had individual points of contact. Air Force customers also played a role in meeting the project milestones through various contract and report approvals.


5A

How the Team Members were Selected and Involved Throughout the Project

Involvement was maintained by establishing ownership from each team member and matching skills with needs
– Supplier partnerships
– Agreed to team plans
– Control account responsibility

Team members were committed to working toward one goal, not individual agendas. Suppliers were not just contracted to build a spec part; rather, they signed on as partners developing a system that would benefit all stakeholders. Control accounts were managed by the team members responsible for the work itself, rather than by someone without ownership of the task.


5A

How the Team Members were Selected and Involved Throughout the Project

Organization | Responsibility
Aircraft Systems | OBIGGS design
Avionics | Aircraft integration
Airframe | Structural analysis and design
Test | Conduct lab and flight tests
Business Operations | Maintain control accounts
Schedules | Develop and integrate schedules
Systems Engineering | Requirements maintenance
Production | System assembly
Supplier Management | Supplier contract adherence
Support Systems | USAF logistics support

Each organization identified individuals to oversee its department's responsibility throughout the project. Team members were selected, with management concurrence, as experts in their respective fields. They were assigned to the project full time and provided with all the resources needed to meet the project goals and performance targets.


5B

How the Team was Prepared to Work Together in Addressing the Project

(Organization chart: Executive Leadership → OBIGGS II Director → Engineering, Production, Supplier Management, Support Systems, Training, Field Services, Flight Test; team co-located facilities; dedicated personnel)

Now I’ll address section 5B on how we prepared the team to work together. C-17 executives created a separate organization that could perform as a single unit without competing priorities. The new Integrated Product Team was led by a director who reported to executive leadership. New facilities were constructed to house up to 80 full-time staff, with state-of-the-art computer equipment and conferencing amenities. Core functional organizations provided dedicated staff to work in a collaborative environment. Co-locating the staff encouraged teamwork and freed them from the everyday interruptions they would have faced within their functional departments.


5B

How the Team was Prepared to Work Together in Addressing the Project

Training Class | Benefit
System Engineering Workshop | Requirements definition
Model Based Definition | Eliminate 2-D drawings
Earned Value Management | Performance and cost control
Integrated Performance and Scheduling | Schedule adherence
Employee Involvement | Address barriers as a team
Accelerated Improvement Workshops | Tool use for root cause analysis

Project leadership provided training to the team members to enhance the system development. The team attended various classes throughout the project life cycle. The training assisted the team across several disciplines and improved individual skill sets.


5B

How the Team was Prepared to Work Together in Addressing the Project

Review | Occurrence | Attendees
Project Team Stand-Up | Daily | Internal – Supplier Management, Systems Engineering, Project Management
Action item review | Weekly | Customer, Project Management
Program review | Weekly | Internal stakeholders
Technical Interchange | Bi-monthly in person | Customer, Project Management
Internal project review | Bi-monthly | Boeing executive leadership
Program review | Bi-monthly video conference | Boeing and customer executive leadership

Open communication was emphasized and key to project success!

The team set up a communication plan to ensure both internal and external stakeholders were informed of the project status and results at all times. Open communication was emphasized and was key to the project's success.


5C

How the Team Managed its Performance to Ensure it was Effective as a Team

SUCCESSFUL PROJECT PERFORMANCE RESULTS FROM EFFECTIVE TEAM MANAGEMENT AND ACTION TO RESOLVE ISSUES EARLY

Note: Sensitive data blocked out

INDIVIDUAL ACCOUNTS MONITORED WEEKLY FOR COST AND SCHEDULE PERFORMANCE

For 5C, we established and monitored project metrics to manage performance and to ensure we were effective as a team. Because responsibility for reporting progress was delegated to the team members themselves, they were continuously aware of the team's performance to plan. As a result, the team met all cost, schedule, and performance targets. For example, our drawing quality metric was 33% better than that of any previous large project, and the total project cost came in 0.5% under budget.
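The weekly control-account monitoring described above follows standard earned value practice: from earned value (EV), planned value (PV), and actual cost (AC), the cost and schedule performance indices show at a glance whether an account is on plan. This is a generic sketch of those formulas, not the team's actual tooling, and the dollar figures are purely illustrative.

```python
# Standard earned value indices, sketched for one control account.
# CPI = EV / AC (cost efficiency); SPI = EV / PV (schedule efficiency).
# Values >= 1.0 mean at/under cost or on/ahead of schedule.

def evm_indices(ev: float, pv: float, ac: float) -> tuple[float, float]:
    """Return (CPI, SPI) for one control account."""
    return ev / ac, ev / pv

# Illustrative weekly snapshot (not project data):
cpi, spi = evm_indices(ev=100_000, pv=98_000, ac=99_500)
print(f"CPI={cpi:.3f} SPI={spi:.3f}")  # both above 1.0: under cost, ahead of plan
```

Having each account owner report these numbers weekly is what made the team "continuously aware of performance to plan": a dip below 1.0 in either index flags the account for attention before variances grow.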


5C

How the Team Managed its Performance to Ensure it was Effective as a Team

The team identified 66 risks during the project life cycle. Each risk was assessed for technical, cost, and schedule impacts; the team developed a mitigation plan and assigned an owner for each risk, then monitored it until closure. The status of the design, estimates of the project performance measures, risks, and cost performance were reported regularly per the communication plan.
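A risk register like the one described can be modeled as a simple record per risk: impact scores for each dimension, an owner, a mitigation plan, and a closure flag. This is an illustrative sketch only; the field names and example risks are assumptions, not the team's 66 actual risks.

```python
# Illustrative risk-register sketch: each risk is assessed for technical,
# cost, and schedule impact, assigned an owner and a mitigation plan,
# and tracked to closure. All entries are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Risk:
    rid: str
    description: str
    impacts: dict      # e.g. {"technical": 2, "cost": 1, "schedule": 3}
    owner: str
    mitigation: str
    closed: bool = False

register = [
    Risk("R-001", "Supplier part qualification slips",
         {"technical": 2, "cost": 1, "schedule": 3},
         owner="Supplier Management",
         mitigation="Qualify a second source in parallel"),
    Risk("R-002", "Flight test window lost to weather",
         {"technical": 1, "cost": 1, "schedule": 2},
         owner="Test",
         mitigation="Hold schedule margin in the test plan",
         closed=True),
]

# Open risks are what gets reviewed at each status meeting.
open_risks = [r.rid for r in register if not r.closed]
print(open_risks)  # -> ['R-001']
```

Monitoring "until closure" then reduces to reviewing `open_risks` at each status meeting and flipping `closed` once the mitigation criteria are met.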


5C

How the Team Managed its Performance to Ensure it was Effective as a Team

Team unity was in place

Communication

Commitment

Ownership

Common Goal

Team member communication, commitment, and ownership fostered the common goal to deliver an OBIGGS to the customer that would meet or exceed the required performance measures. The result was a spirit of unity and teamwork that enabled the team to set new benchmarks for project success.



Conclusion

Thank You!

Mission Accomplished!

(US Air Force Photo)

Our customer is extremely pleased with the results, rated the OBIGGS II project as EXCELLENT in every semi-annual award fee period for all four years, and is making plans to retrofit the 141 OBIGGS 1 aircraft. Each member is proud to have been part of the OBIGGS II improvement team. Thank you for the opportunity to share our story.
