The Taxonomy of Systems Engineering Competency for the New Millennium

E. R. Widmann Hughes Space & Communications Co. PO Box 92919 Los Angeles, CA 90009

G. J. Hudak, PE Stevens Institute of Technology Castle on the Hudson Hoboken, NJ 07030

G. E. Anderson Honeywell International PO Box 52199 Phoenix, AZ 85072

T. A. Hudak Stevens Institute of Technology Castle on the Hudson Hoboken, NJ 07030

Abstract. This paper provides notable results of the continued development of The Taxonomy Of Systems Engineering Competency, which is aligned with the EIA/IS 731.1 Systems Engineering Capability Model (SECM). The INCOSE Corporate Advisory Board has set a high priority on the development of this Taxonomy, which complements the use of the standard. For an organization to grow in systems engineering capability when implementing the EIA/IS 731.1 SECM, individuals within that organization must also grow in increasing levels of competency. Key features of the EIA/IS 731.1 SECM, to which the taxonomy is to be aligned, are summarized. A foundation is presented for the development of the taxonomy's architecture, which is comprised of three highly aligned models. Each model is described in terms of its purpose, functions, inputs, and outputs. The basis of the Systems Engineering Competency Model is described in detail. The Taxonomy is linked closely to elements of the EIA/IS 731.1 SECM. An example demonstrates the application of the taxonomy to the development of a set of enabling skills for the conduct of quality management per the EIA/IS 731.1 SECM. The paper concludes with a description of future activities and a clarification of the relationship of this Taxonomy to the Systems Engineering Body of Knowledge being developed by INCOSE. An appendix at the end of the paper provides a completed example of a set of enabling skills for the conduct of risk management per the EIA/IS 731.1 SECM.

INTRODUCTION

Background. At the INCOSE International Workshop in January 1998, the INCOSE Corporate Advisory Board (CAB) decided that the development of a skills and knowledge taxonomy aligned with the Electronic Industries Alliance Interim Standard (EIA/IS) 731.1 Systems Engineering Capability Model (SECM) should be a high priority for INCOSE. The EIA/IS 731.1 SECM is a new, industry-balloted, formally released standard for assessing and improving systems engineering capability in both industry and Government as we enter the 21st Century. The standard was developed under the auspices of the EIA with participation from INCOSE. A special team was formed in June 1998 within the INCOSE Technical Board, reporting directly to the Chair of the Board, to develop this needed skills and knowledge taxonomy for systems engineering aligned with the EIA/IS 731.1 SECM. After project planning, the core author team developed an architecture for the taxonomy. Because the architecture transcended the lower level of cognitive skills and knowledge to the higher level of competency, the final work product was designated The Taxonomy Of Systems Engineering Competency, with the subtitle Enabling Growth In Systems Engineering Capability Aligned With The EIA/IS 731.1 Systems Engineering Capability Model. As used here, competency is: "The quality or state of having marked or sufficient aptitude, knowledge, judgement, strength or skill needed to perform an individual action, or of possessing a range of ability or capability." [1] Results of the initial effort were presented to the INCOSE CAB at the International Workshop in January 1999. In June 1999, a paper [2] was published that provided a preliminary look at the taxonomy and served as a basis for soliciting feedback on the approach. Based upon positive feedback, the team moved forward with development.


Purpose. This paper provides notable results of the continued development of The Taxonomy Of Systems Engineering Competency. This Taxonomy is intended to complement the EIA/IS 731.1 Systems Engineering Capability Model by enabling growth in systems engineering capability as defined by the standard. The INCOSE CAB considers the development of this Taxonomy to be a high priority. When completed, the Taxonomy will provide a foundation for organizations to improve their systems engineering capability as measured by the EIA/IS 731 SECM. For an organization to grow in systems engineering capability when implementing the EIA/IS 731.1 SECM, individuals within that organization must also grow in increasing levels of competency. The Taxonomy could be used by organizations for the professional development of systems engineering competencies that support organizational goals and program needs. It would serve as an objective basis for developing standard training, education modules, and a curriculum of courses specifically targeted for individuals or groups assigned specific roles or tasks. Figure 1 presents the organization of the paper.

• Introduction
• Task And Product Descriptions
• Key Features of the EIA/IS 731.1 SECM
• Architecture Development
• Strategic Enterprise Model
• Competency Needs Model
• Systems Engineering Competency Model
• Developing The Systems Engineering Competency Model
• Application of the Taxonomy
• Future Activities
• References
• Biographies
• Appendix: Manage Risk Focus Area

Figure 1. Organization of Paper

TASK AND PRODUCT DESCRIPTIONS

Task. Develop a knowledge and skills taxonomy and framework that may be employed by systems engineering managers and technical practitioners to guide professional development.
• The knowledge and skill elements should be aligned with the practices in the SECM.
• The taxonomy should provide an objective basis for developing standard training, education modules, and a curriculum of courses that could be specifically targeted for individuals or groups assigned specified roles or tasks.

Products. Two products are to be developed:
• A knowledge and skill taxonomy and framework that relates individual knowledge and skills to standard practices or processes specified in EIA/IS 731. The resulting product should include a process description for applying the taxonomy at either a group (e.g., organization) or individual level.
• A survey of available systems engineering training courses and a mapping of these to the knowledge and skill levels of the taxonomy.

KEY FEATURES OF THE EIA/IS 731.1 SECM

General Description And Benefits. The EIA/IS 731.1 SECM [3] is a model, to be applied to an organization using the EIA/IS 731-2 SECM Appraisal Method [4], for measuring and improving systems engineering capability. It references recognized standards, e.g., EIA 632 Processes for Engineering a System and IEEE 1220-1994 IEEE Trial-Use Standard for Application and Management of the Systems Engineering Process, as the appropriate processes for performing systems engineering. The reader may refer to [5], [6], and [7] for a more complete understanding of the concepts presented herein, as well as a description of some of the relationships between the two source models and the EIA/IS 731.1 SECM. EIA/IS 731 can be used by all product development organizations independent of size, product complexity, product maturity, and product technology. By implementing EIA/IS 731, organizations can expect to obtain the benefits presented in Figure 2. [3]

a) Reduce cycle time from concept to deployed system products;
b) Improve the match of deployed system product capability with stakeholder requirements;
c) Reduce total ownership costs of system products;
d) Reduce the number of engineering changes;
e) Improve system quality;
f) Improve communications among personnel involved in the engineering of a system;
g) Improve ability to sustain and upgrade system products after deployment; and
h) Reduce development risks.

Figure 2. Benefits of Using EIA/IS 731

Organization. The EIA/IS 731.1 SECM is organized into 19 Focus Areas (FAs), which are grouped into three Categories as presented in Figure 3. The names of the 19 FAs and the 3 Categories were chosen to be consistent with the terminology in EIA 632, Processes for Engineering a System.

EIA/IS 731.1 SECM

1.0 Systems Engineering Technical Category
FA 1.1 Define Stakeholder & System Level Requirements
FA 1.2 Define Technical Problem
FA 1.3 Define Solution
FA 1.4 Assess and Select
FA 1.5 Integrate System
FA 1.6 Verify System
FA 1.7 Validate System

2.0 Systems Engineering Management Category
FA 2.1 Plan and Organize
FA 2.2 Monitor and Control
FA 2.3 Integrate Disciplines
FA 2.4 Coordinate with Suppliers
FA 2.5 Manage Risk
FA 2.6 Manage Data
FA 2.7 Manage Configurations
FA 2.8 Ensure Quality

3.0 Systems Engineering Environment Category
FA 3.1 Define & Improve the Sys. Engineering Process
FA 3.2 Manage Competency
FA 3.3 Manage Technology
FA 3.4 Manage Sys. Engineering Support Environment

Figure 3. Categories and Focus Areas
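The grouping in Figure 3 lends itself to a simple lookup structure. The sketch below (Python; the FA names are transcribed from the list above, but the dictionary layout itself is only illustrative) could serve as a starting point for tooling that tallies or cross-references Focus Areas:

```python
# The 19 Focus Areas of EIA/IS 731.1, grouped into the three Categories
# of Figure 3 (illustrative transcription, keyed by FA number).
SECM_FOCUS_AREAS = {
    "1.0 Systems Engineering Technical": {
        "FA 1.1": "Define Stakeholder & System Level Requirements",
        "FA 1.2": "Define Technical Problem",
        "FA 1.3": "Define Solution",
        "FA 1.4": "Assess and Select",
        "FA 1.5": "Integrate System",
        "FA 1.6": "Verify System",
        "FA 1.7": "Validate System",
    },
    "2.0 Systems Engineering Management": {
        "FA 2.1": "Plan and Organize",
        "FA 2.2": "Monitor and Control",
        "FA 2.3": "Integrate Disciplines",
        "FA 2.4": "Coordinate with Suppliers",
        "FA 2.5": "Manage Risk",
        "FA 2.6": "Manage Data",
        "FA 2.7": "Manage Configurations",
        "FA 2.8": "Ensure Quality",
    },
    "3.0 Systems Engineering Environment": {
        "FA 3.1": "Define & Improve the Sys. Engineering Process",
        "FA 3.2": "Manage Competency",
        "FA 3.3": "Manage Technology",
        "FA 3.4": "Manage Sys. Engineering Support Environment",
    },
}

# Sanity check: 7 + 8 + 4 Focus Areas across the 3 Categories.
total = sum(len(fas) for fas in SECM_FOCUS_AREAS.values())
print(total)  # 19
```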

Capability Levels. The EIA/IS 731.1 SECM has six ascending Capability Levels, Levels 0 through 5, with Level 0 being the default level. Capability levels are plateaus of performance that organizations achieve as they improve both their business and technical processes. The levels can be used as organizational goals in structuring improvement plans. The capability levels are synopsized in Figure 4, with complete information available in [3].

Optimizing (Level 5): The organization establishes quantitative performance goals for both process effectiveness and efficiency based on its business goals. The organization continuously improves both its standard and tailored processes by gathering quantitative data. The activities are effectively balanced, and work products effectively provide their intended utility.

Measured (Level 4): Metrics are defined for the organization, programs, and projects, and mechanisms are in place to track each quantitatively and, when necessary, take corrective action based on these measures. Performance of the tailored process is predictable. The activities are measurably effective, and work products are of measurably significant utility.

Defined (Level 3): Activities are planned, performed, and managed according to well-defined processes using an approved and tailored version of a standard documented process. The organization's standard process is improved qualitatively using customer (both internal and external) feedback. The activities are significantly effective, and work products are of significant utility.

Managed (Level 2): Practices are performed according to approved plans and processes. Activities are planned, tracked, measured, and verified. The activities are adequately effective, and work products are of adequate utility.

Performed (Level 1): Specific practices are performed, but on an informal basis. There are no defined processes, with successful tasks performed by "heroes." The activities are marginally effective, and work products are of marginal utility.

Initial (Level 0): Activities are not performed or are performed poorly. Activities have little effectiveness, and work products have little value.

Figure 4. Capability Level Descriptions
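A pattern runs through Figure 4's level descriptions: each level pairs a named plateau with graded adjectives for activity effectiveness and work-product utility. The sketch below tabulates those adjectives as a lookup table; the table layout and function name are illustrative, not part of the standard:

```python
# Capability Levels of EIA/IS 731.1 (Figure 4), with the generic-attribute
# adjectives the model attaches to activities and to work products.
CAPABILITY_LEVELS = {
    0: ("Initial",    "of little effectiveness", "little value"),
    1: ("Performed",  "marginally effective",    "marginal utility"),
    2: ("Managed",    "adequately effective",    "adequate utility"),
    3: ("Defined",    "significantly effective", "significant utility"),
    4: ("Measured",   "measurably effective",    "measurably significant utility"),
    5: ("Optimizing", "effectively balanced",    "intended utility"),
}

def describe(level: int) -> str:
    """One-line synopsis of a capability level (illustrative helper)."""
    name, activities, products = CAPABILITY_LEVELS[level]
    return (f"Level {level} ({name}): activities {activities}; "
            f"work products of {products}")

print(describe(4))
```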

Elements Of Capability. The capability structure in the EIA/IS 731.1 SECM is composed of process elements, called "practices," and non-process elements, called "attributes." The capability structure of each FA contains three classes of capability elements, and potentially a fourth class in future releases:
• Specific Practices (SPs) are process elements unique to a particular FA and, along with GPs, describe how well the process is defined, institutionalized, and followed.
• Generic Practices (GPs) are process elements common to all FAs.
• Generic Attributes (GAs) are non-process elements common to all FAs that describe the appropriateness of the process, e.g., how effective the process is and how valuable the products of the process are. GAs provide a "sanity check" on the level of capability indicated by the process elements. For instance, it would not make much sense for an organization to claim a high level of process capability if it produced mediocre products.
• A fourth class of capability elements, comprised of non-process elements specific to a FA, was present in the initial review version of the EIA/IS 731.1 SECM (draft Version 0.5), but was removed prior to the final release of Version 1.0. The capability elements were initially structured to incorporate the full range of capability as measured by both source models. Using the nomenclature above, members of this class would be called Specific Attributes (SAs). They are mentioned here for completeness, in anticipation of their possible re-incorporation in a future EIA/IS 731.1 SECM release.

Population Of Capability Levels And Themes. Each Capability Level, i.e., Levels 1 through 5, can potentially contain elements representative of all classes of capability elements. As a minimum, Levels 2 through 5 contain generic practices (GPs), and Levels 1 through 5 contain generic attributes (GAs). Specific practices (SPs) can generally be found in Levels 1 through 5. Level 0 is a default level that contains no practices or attributes. Within each FA, the SPs are placed into structures called "themes." As presented pictorially in Figure 5, SPs may be present within a theme in one of two ways:
• As one or more stand-alone practices at a particular capability level, as illustrated at Level 3 in Figure 5.
• As a set of related practices present in two or more capability levels. The set of related practices represents a core, or kernel, idea that shows a different manifestation depending upon the capability level, hence forming a "vertical theme" spanning multiple capability levels. This is illustrated in Figure 5 by the vertical arrow.
Vertical themes spanning multiple capability levels are quite common; most FA specific practices reside within these vertical themes.

Figure 5. Themes Within EIA/IS 731.1 SECM
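The theme structure described above can be sketched with a few hypothetical types. The class names, the example practices, and the two-level risk theme below are invented for illustration and are not drawn from EIA/IS 731.1:

```python
from dataclasses import dataclass, field

@dataclass
class SpecificPractice:
    description: str
    capability_level: int  # 1-5; Level 0 contains no practices

@dataclass
class Theme:
    """A theme within a Focus Area; SPs sit at one or more levels."""
    name: str
    practices: list = field(default_factory=list)

    def is_vertical(self) -> bool:
        # A "vertical theme" has related practices at two or more
        # capability levels (the kernel idea manifests differently
        # at each level).
        return len({p.capability_level for p in self.practices}) >= 2

# Hypothetical example: one kernel idea manifested at Levels 2 and 3.
risk_theme = Theme("Risk identification", [
    SpecificPractice("Identify risks informally", 2),
    SpecificPractice("Identify risks per a defined, tailored process", 3),
])
print(risk_theme.is_vertical())  # True
```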

The generic elements of the model can also be thought of as forming themes, though they are not specifically addressed as such. For instance, the two GAs are each present in the form of a theme spanning Capability Levels 1 through 5. Among the eleven GPs, several themes span multiple Capability Levels.

ISO SPICE BPG. The GPs are aligned with the International Organization for Standardization (ISO) Software Process Improvement and Capability dEtermination (SPICE) Base Practices Guide (BPG). However, the GPs have been streamlined by significantly reducing their number from those found in both of the source models. The other elements of capability of the EIA/IS 731.1 SECM were incorporated into the capability levels that are aligned with the ISO SPICE BPG.

ARCHITECTURE DEVELOPMENT

Initial Considerations. As suggested by the INCOSE CAB's task description, Bloom's Taxonomy of Cognitive Skills was initially examined as the basis for The Taxonomy Of Systems Engineering Competency aligned with the EIA/IS 731.1 SECM. This examination concluded that, while Bloom's Taxonomy was essential, it had to be considered as part of a more robust architecture in order to be fully aligned with the EIA/IS 731.1 SECM.

Architecture Requirements. For The Taxonomy Of Systems Engineering Competency to be aligned with the EIA/IS 731.1 SECM, it must address both individual and organizational (company, functional, and program) knowledge and skills needs in a product development environment. Bloom's Taxonomy of Cognitive Skills very effectively addresses the needs of individuals in an academic learning environment, but it cannot adequately serve as the sole basis for an architecture that must also include organizational needs. A more encompassing architecture was needed to guide the professional development of systems engineering competency for organizations seeking to achieve the benefits offered by the EIA/IS 731.1 SECM (see Figure 2). In a product development environment, systems engineering takes place within the larger context of an organization, even though individuals apply their expertise on programs. Capabilities associated with the practice of systems engineering result from aligning an organization's strategy with its environments, the organization's decisions on how to implement systems engineering expertise on programs, and the use of a systematic approach for developing systems engineers' competencies.

Selected Architecture. The robust, integrated architecture presented in Figure 6 was developed for The Taxonomy of Systems Engineering Competency. This architecture, which is comprised of three well-aligned models, addresses both organizational and individual competency development needs in alignment with the EIA/IS 731.1 SECM. The hierarchical relationship of the three models requires the downward flow of sufficient needs information to assure that individual systems engineers' competencies support the enterprise goals. Each model's processes, inputs, and outputs are described below.


Strategic Enterprise Model
• A systems model.
• External environment drives the enterprise strategy to enable adaptability.
• Strategy, including mission and goals, drives enterprise work including processes, human capability development, and technology.
• Enterprise work creates results which enable goals to be met.
• Enterprise characteristics include a shared understanding across the enterprise regarding the enterprise direction, organization structure, measurements, forums for decision-making and problem-solving, and an existing process for participant growth and development.

Competency Needs Model
• Program and project competency needs are determined by enterprise strategy.
• Competency needs are customized by each program/project.
• A needs analysis is performed to determine the level of participants' competencies relative to the program or project needs.
• If a gap exists between needed program/project competencies and participants' competencies, a development program is instituted.

Systems Engineering Competency Model
• An inclusive hierarchy of progressive learning based on Bloom's Taxonomy, used to identify the abilities of systems engineers in accordance with EIA/IS 731. The classes are: 1. Knowledge, 2. Comprehension, 3. Application, 4. Assessment, 5. Synthesis.
• SE abilities are measured using a Competency Level (E-A) linked to the above five classes of cognitive learning.
• A mix of individual SE Competency Levels must adequately support the organization's desired EIA/IS 731 capability level.
• The Taxonomy of Systems Engineering Competency defines the SE practitioner's abilities needed to perform the practices and attributes at each capability level of EIA/IS 731.

Figure 6. Taxonomy Architecture Consists of Three Aligned Models

STRATEGIC ENTERPRISE MODEL

Purpose. The Strategic Enterprise Model is a conceptual framework for understanding the strategy, environments, and alignment among the major parts of the organization. This theoretical construct has several generic characteristics that can be supported in detail by many currently available business models. The Taxonomy of Systems Engineering Competency will not endorse or encourage the use of any specific business model; instead, it is up to the organization using the Taxonomy to decide upon the business model most appropriate for its own application.

Functions. The generic characteristics of the Strategic Enterprise Model are presented in Figure 6. Decisions are made using the Strategic Enterprise Model about the specific processes, technology, and human skills that will be needed, i.e., the organization's work that is addressed by the FAs of the EIA/IS 731.1 SECM. After that, the Competency Needs Model can be used to define the curriculum for developing the skills needed to support the necessary organizational processes and technology. For example, an organization may have as its overall strategy the need to gain more market share for a particular product by reducing costs, and the goals to support that strategy may be to reduce cycle time from concept to deployed system products and to improve system quality. Since the practice of systems engineering should support those goals, it can be said that the work of systems engineering supports, and is aligned with, the strategy. The practice of systems engineering should also lead to the goals being attained in the results area.

Inputs. The inputs to the Strategic Enterprise Model include a description of external environmental forces relevant to the enterprise, including the needs and demands of consumers, regulations, the impact of competitors, and the economy. These external environmental forces form the base from which enterprise managers create a set of assumptions regarding how work will be done to meet the strategic mission and goals of the enterprise. That is, assumptions are made about how the enterprise will function within the external environment, taking into consideration the resources contained in the internal environment: wealth, human capability, and technological capability.

Outputs. Since the Strategic Enterprise Model is a "systems" model addressing the organization and its programs as a whole, the various outputs from this model address all aspects of the organization's strategies and needs, of which developing competency is one of many.
The top-level outputs of the Strategic Enterprise Model are (1) the results attained from the work accomplished by the programs within the enterprise and (2) an assessment of the results to determine if the organization’s goals have been attained. The practice of systems engineering, which is resident within each program, is an integral part of this assessment to determine its contribution to the results and the need for redirection and enhancement, including the development of competency. If the results of the programs lead to the accomplishment of enterprise level goals, then generally the enterprise’s strategy can remain the same as long as the environment doesn’t change. However, if program results do not attain the desired enterprise-level goals, then either the strategy must change or the way work is accomplished on the programs must change. If the external environment changes, or if the internal environment changes, the strategy will need to be re-examined.


Part of this change process and re-examination will address competency needs. The outputs of the Strategic Enterprise Model that serve as inputs to the Competency Needs Model are a statement of the enterprise's mission and goals, and a description of management's assumptions, made explicit, concerning resources, human and otherwise. The overall output of the Strategic Enterprise Model is derived through information gained in the Competency Needs Model: whether the work done as a result of matching competencies to programs and projects was effective.

COMPETENCY NEEDS MODEL

Purpose. The Competency Needs Model is a conceptual framework for assessing and developing systems engineering competency, and for developing a curriculum to provide the skills needed to support the practice of systems engineering identified by the Strategic Enterprise Model. The Taxonomy of Systems Engineering Competency will not endorse or encourage the use of any specific competency needs model; instead, it is up to the organization using the Taxonomy to decide upon the competency needs model most appropriate for its own application.

Functions. The generic characteristics of the Competency Needs Model are presented in Figure 6. The first action taken in applying the Competency Needs Model is the identification and understanding of the organization's strategic goals and business objectives as key information for the identification, analysis, assessment, and planning of associates' performance improvement needs. This can be provided by input from the Strategic Enterprise Model's assessment of organizational alignment with the practice of systems engineering. In the Competency Needs Model, an organization's strategic goals and business objectives drive the needs for organizational and employee competence and performance improvement. A front-end needs analysis is done, with associates' learning needs identified at the macro rather than the micro level of analysis. Here an in-depth analysis is performed of the characteristics of the learners with respect to the demands of the practice of systems engineering within the organization and the project or program. This analysis may include an assessment of the learner's prior and present work experiences, formal academic knowledge, and skill levels. A listing is made of the competencies to be included in learning interventions, and the sequence in which they should be attained. Performance objectives are created, with assessment activities planned to evaluate the learner's performance following instruction.

Inputs. The inputs of the Competency Needs Model include the mission and goals of the enterprise, and assumptions from enterprise management concerning resources contained in the internal environment. Sub-goals for individual programs and projects may also constitute input.

Outputs. The outputs of the Competency Needs Model are an assessment of the human capabilities contained in the enterprise relative to the capabilities required to meet the enterprise's goals, including any sub-goals for individual programs or projects. Assumptions made by leaders in the enterprise model are tested here. When capabilities match needs, individuals are assigned to appropriate programs and projects. When capabilities do not match needs, a developmental program is instituted.

SYSTEMS ENGINEERING COMPETENCY MODEL

Purpose. The Systems Engineering Competency Model is used for developing the abilities of systems engineers to the levels of individual competency required by the program or product for organizational performance at the enterprise capability level. The Model is based upon a unique relationship between the individual learning classifications of Bloom's Taxonomy of Cognitive Skills and the abilities required to perform the specific practices, generic practices, and generic attributes of the EIA/IS 731.1 SECM.

Functions. The process of the Systems Engineering Competency Model is first to analyze the abilities of each engineer allocated to the project against the Competency Tables in the Taxonomy of Systems Engineering Competency. A determination is made of the minimum skill enhancements that can be economically acquired by each engineer to provide an adequate mix of engineers at the appropriate Systems Engineering Competency Levels (SECLs) to support the project. The process also analyzes and determines which abilities cannot be made available from within the organization; this determines how many new engineers, at what SECL, must be hired.

Inputs.
There are two inputs to the Systems Engineering Competency Model: (1) the Competency Development Requirements created by the Competency Needs Model, and (2) the Taxonomy of Systems Engineering Competency. The first defines the project's systems engineering needs, existing abilities within the organization, and those abilities that must be acquired. The second is the solution tool. The input elements include:
• Current EIA/IS 731 assessed capability level.
• Desired/required EIA/IS 731 capability level.
• Estimated labor hours, by Practice and Attribute, to perform all SE tasks through the desired capability level.
• Number of systems engineers needed at each SECL.
• Systems engineers available, listed by their SECL.
• Taxonomy criteria for a systems engineer's abilities to perform each Practice and Attribute.
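The staffing analysis sketched below illustrates the kind of gap determination the model performs: comparing the engineers needed at each SECL against those available, with any shortfall to be closed by skill development or hiring. The numbers and the simple counting logic are hypothetical; in practice the determination is driven by the Taxonomy's Competency Tables:

```python
from collections import Counter

# Hypothetical project need: engineers required at each Systems
# Engineering Competency Level (SECL), A (highest) through E.
needed = {"A": 1, "B": 2, "C": 4, "D": 3}

# Engineers currently available, listed by their assessed SECL.
available = Counter(["B", "C", "C", "D", "D", "D", "E", "E"])

# The gap at each SECL is what must be closed by skill development
# within the organization or by new hires.
gap = {secl: max(0, count - available.get(secl, 0))
       for secl, count in needed.items()}
print(gap)  # {'A': 1, 'B': 1, 'C': 2, 'D': 0}
```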


Outputs. Outputs of the Systems Engineering Competency Model will vary considerably depending on the nature of the organization's needs. The following is a list of typical products:
• Project (or organization) training plan.
• Individual engineers' training development plans.
• Recruitment strategy and plan.
• Requirements for a new hire's resume abilities.
• Training needs assessment and plan.
• Requirements for a course curriculum or class development.
• Assessment evidence for FA 3.2, Manage Competency.
• Input to an organization's training budget planning.
• Input to an organization's product capability marketing.

DEVELOPING THE SYSTEMS ENGINEERING COMPETENCY MODEL

Bloom's Taxonomy. A method of classifying individual learning was developed over a period of years by a group of college examiners. Their work, Taxonomy of Educational Objectives [8], was edited by one of the developers, Benjamin S. Bloom, and published in 1956. It became, and remains, a recognized authority on how people learn in the cognitive domain. Their research determined that learning is hierarchical, i.e., mastery of lower-level cognitive skills is necessary before the next higher-level skills can be fully mastered. Bloom's Taxonomy, where ability = knowledge + skill [8], is applicable to the systems engineer's hierarchical abilities to perform the Specific Practices, Generic Practices, and Generic Attributes of the five increasing capability levels of EIA/IS 731. Figure 7 presents the six ascending, hierarchical classifications of cognitive learning in Bloom's Taxonomy, with "Evaluation" being the highest.

Comparing EIA/IS 731 And Bloom's Taxonomy. When comparing the five increasing capability levels of EIA/IS 731 to the six progressive ability classifications of Bloom's Taxonomy, it is observed that an individual must increase their competency in some relationship to a maturing organization's increasing capability. Analysis does not reveal a clear mapping or correlation between the two. However, more recent research in the educational community provided the key to a solution.

Creating The Learning Relationship. Although the educational community agrees with the hierarchical premise of Bloom, there are ongoing studies and debates about the number of learning classifications, and their content and sequence. For instance, in the Quellmalz Framework of Thinking Skills [9], Bloom's Knowledge and Comprehension classifications are combined into a single classification, Recall. Similarly, a portion of Bloom's Synthesis is merged into the highest classification, Evaluation.
Evaluation: Judgments about the value of material and methods for given purposes. Quantitative and qualitative judgments about the extent to which material and methods satisfy criteria and standards.

Synthesis: The putting together of elements and parts to form a whole. Arranging and combining pieces, parts, elements, etc., in such a way as to constitute a pattern or structure not clearly there before.

Analysis: The breakdown of information into its parts such that the ideas are clear and organized, and the relationships between ideas are made explicit. Able to understand hypotheses, relate assumptions, see patterns, and recognize hidden elements.

Application: The use of abstractions, information, methods, and skills to solve problems in particular and concrete situations. Abstractions may be general ideas, rules of procedure, technical principles, and theories that must be remembered and applied.

Comprehension: The lowest level of understanding. The ability to translate, interpret, and extrapolate from a knowledge base without necessarily relating it to other material or seeing its fullest implications.

Knowledge: The recall of specifics and universals, the recall of methods and processes, or the recall of a pattern, structure, or setting. Usually a low level of abstraction, and little more than bringing to mind and reorganizing material from previously remembered experiences. Although some alteration of material may be required, it is a small part of the task.

Figure 7. Six Classes of Bloom’s Taxonomy

After examining Bloom's two analytical classifications, Analysis and Evaluation, it can be seen that they closely correspond to the abilities needed to perform the Practices of SECM Level 4, Measured. By adopting Synthesis as the highest cognitive level and combining Analysis and Evaluation into a new classification, Assessment, five learning classifications result that contextually relate one for one to the five capability levels of the EIA/IS 731 SECM. The resulting relationship is illustrated in Figure 8a.

[Figure 8a, diagram omitted: Bloom's six classifications (1 Knowledge, 2 Comprehension, 3 Application, 4 Analysis, 5 Synthesis, 6 Evaluation) shown alongside the revised set, in which classifications 4 and 6 are combined into Assessment and Synthesis becomes the highest: Knowledge, Comprehension, Application, Assessment, Synthesis.]
Figure 8a. Relationship Between Bloom's Taxonomy and EIA/IS 731.1 SECM

Defining the Competency Relationship. If the EIA/IS 731 capability levels are placed on a vertical axis and this new set of learning classifications is placed on a horizontal axis, plotting the intersections creates the simple relational map shown in Figure 8b.

[Figure 8b, diagram omitted: a grid with EIA/IS 731 Level 1 Performed through Level 5 Optimizing on the vertical axis and the five classifications (Knowledge, Comprehension, Application, Assessment, Synthesis) on the horizontal axis, labeled "Increasing Systems Engineering Ability".]
Figure 8b. Relationship Between Organizational Capability and a System Engineer's Ability

By examining the FA Practices in EIA/IS 731 at each capability level and the corresponding abilities defined for each learning classification, it becomes clear that these are not merely points of intersection but rather a continuum of areas along the diagonal. Each area represents a level of individual systems engineering competency. This concept is illustrated in Figure 8c as the graphical representation of the Systems Engineering Competency Model, showing the five Systems Engineering Competency Levels (SECL), E through A. The letter names were chosen in the academic context, where a grade of A is the highest individual achievement.

[Figure 8c, diagram omitted: a diagonal band labeled "Increasing SE Competency" across the capability/ability grid, marking SECLs E through A. SE Competency Levels E through A represent cumulative abilities, based on Bloom's Taxonomy, that enable the EIA/IS 731 Practices; the horizontal axis is labeled "Increasing SE Ability".]
Figure 8c. Illustrated View of the Systems Engineering Competency Model
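The merge described above, which collapses Bloom's six classifications into five that align one for one with the EIA/IS 731 capability levels, can be sketched as a small lookup. The names come from the paper; the data structure itself is only an illustration, not part of the Taxonomy.

```python
# Illustrative sketch (not from the paper) of the Figure 8a merge.

# Bloom's original hierarchy, lowest to highest.
BLOOM = ["Knowledge", "Comprehension", "Application",
         "Analysis", "Synthesis", "Evaluation"]

# Analysis and Evaluation are combined into "Assessment", and Synthesis
# becomes the highest cognitive level, yielding five classifications.
MERGED = BLOOM[:3] + ["Assessment", "Synthesis"]

EIA_LEVELS = ["1 Performed", "2 Managed", "3 Defined",
              "4 Measured", "5 Optimizing"]

# The one-for-one contextual relationship of Figure 8a:
CLASSIFICATION_TO_LEVEL = dict(zip(MERGED, EIA_LEVELS))

print(CLASSIFICATION_TO_LEVEL["Assessment"])  # → 4 Measured
```

The pairing reflects the paper's observation that Analysis and Evaluation together correspond to the abilities needed at SECM Level 4, Measured.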

For a practitioner to adequately function in an organization at a given EIA/IS 731 capability level, he/she must possess a minimum set of knowledge and skills (abilities) to perform the Specific Practices within the Focus Areas (FA) at that capability level along with the Generic Practices and Attributes. Their competency to perform at a given EIA/IS 731 capability level, defined as their Systems Engineering Competency Level (SECL), depends on their learned abilities and experiences. A practitioner’s current ability (shown on the horizontal axis) defines him/her to be at a given SECL.

If the SECL's corresponding EIA/IS 731 capability level (vertical axis) is below the organization's assessed (or desired) EIA/IS 731 capability level, the need for skills training is indicated. Note that learned abilities (horizontal axis) are inclusive: all knowledge and skills to the left of a given SECL are prerequisites for the additional abilities to be performed at that SECL.

The Systems Engineering Competency Model Supports the Architecture of EIA/IS 731. Figure 9 is an adaptation of Figure 2, Model Architecture, in EIA/IS 731 [3], illustrating how Systems Engineering Competency Levels relate to Categories, Focus Areas, Themes, Practices, and Attributes. For each Focus Area Theme there is a set of Specific Practices that must be performed at each of the five capability levels of EIA/IS 731. Performance of these practices requires corresponding abilities at the five levels of systems engineering competency, E through A, as shown in the bottom row of Figure 9.
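The training-need rule just described, comparing a practitioner's SECL against the organization's capability level with cumulative prerequisite abilities, can be sketched as a short Python check. The function and variable names are our illustration, not part of the Taxonomy.

```python
# Illustrative sketch of the SECL/capability-level comparison described
# in the text; the paper defines the concepts, not this code.

# SECLs E (lowest) through A map one for one to EIA/IS 731 Levels 1-5.
SECL_ORDER = ["E", "D", "C", "B", "A"]

def capability_level(secl: str) -> int:
    """EIA/IS 731 capability level (1-5) corresponding to a SECL."""
    return SECL_ORDER.index(secl) + 1

def training_needed(practitioner_secl: str, org_capability_level: int) -> bool:
    """Training is indicated when the practitioner's corresponding
    capability level falls below the organization's assessed level."""
    return capability_level(practitioner_secl) < org_capability_level

def prerequisite_secls(secl: str) -> list:
    """Abilities are inclusive: every SECL to the 'left' is a prerequisite."""
    return SECL_ORDER[: SECL_ORDER.index(secl)]

# A level-D engineer in a Level 3 (Defined) organization needs training:
assert training_needed("D", 3) is True
# ...and must already hold the level-E abilities:
assert prerequisite_secls("D") == ["E"]
```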

[Figure 9, diagram omitted: an adaptation of the EIA/IS 731 model architecture. The Capability Domain contains the Generic Practices and Generic Attributes; the Systems Engineering Domain of the EIA 731 Systems Engineering Capability Model contains Categories ("N"), Focus Areas ("N.N"), and Themes ("N.N"), each Theme having Specific Practices. The Level 1 through Level 5 Practices of each Theme are paired with Enabling E through A Skills, respectively.]
Figure 9. Systems Engineering Competency Related to Architecture of EIA/IS 731

The System Engineer Shall Be Able To … To effectively function at a given SECL, the systems engineer must be skilled in performing the applicable Specific Practices at the corresponding EIA/IS 731 organizational capability level. This may not be the same as the currently assessed level of the organization; it may be higher or lower. Because all lower-level Specific Practices up to and including the current assessed level must be performed, an engineer at a lower SECL may effectively perform those lower-level tasks. Likewise, due to process improvement, some higher-level Specific Practices may need to be executed by a higher-SECL engineer.

One output of the Systems Engineering Competency Model is a training plan that uses a set of training verbs applied to the skills needed to perform each task in every Focus Area of EIA/IS 731. A needed skill for an individual SE may be characterized with a statement beginning, "The systems engineer shall be able to [training verb]…" A sample list of training verbs for each SECL is given in Figure 10. The training verbs articulate the needs prescribed by the organization capabilities and SE abilities at each SECL, as shown to the left of the verbs in Figure 10. The Taxonomy for Systems Engineering Competency will contain the full population of Specific Practices for each Focus Area linked to one or more required skills, each with a training verb statement. An example of this is given in Figures 11 and 12 for the Ensure Quality Focus Area.

Level 5 Optimizing (practices performed for continuous improvement to meet changing business goals). SECL A, Synthesis: the ability to put together separate ideas to form a new whole with new relationships. Training verbs: adapt, anticipate, formulate, invent, plan, create, design, hypothesize, modify, collaborate, initiate, negotiate, combine.

Level 4 Measured (practices performed using defined metrics at all organizational levels against measurable criteria). SECL B, Assessment: the ability to separate information, show relationships, and judge worth against standard criteria. Training verbs: correlate, diagram, analyze, distinguish, categorize, compare, evaluate, critique, judge, appraise, rank, contrast.

Level 3 Defined (practices performed to tailored procedures that are improved via experience). SECL C, Application: the ability to use learned information and apply it to new situations. Training verbs: apply, compute, use, instruct, predict, record, solve, demonstrate, show, experiment, utilize, discover, relate.

Level 2 Managed (practices performed to specific procedures with work products of adequate utility). SECL D, Comprehension: the ability to grasp the meaning of information and explain ideas. Training verbs: classify, confirm, convert, describe, discuss, estimate, explain, illustrate, paraphrase, cite, translate, interpret.

Level 1 Performed (practices performed informally by learned "heroes" with marginal work product utility). SECL E, Knowledge: the ability to recall learned information and reapply it to the same or similar situations. Training verbs: define, enumerate, list, describe, draw, identify, label, match, repeat, quote, recognize, select, write, view, recite, memorize, state.

Figure 10. Capabilities and Abilities of the Model Define SE Training Verbs at Each SECL
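The "The systems engineer shall be able to…" construction lends itself to simple statement generation from per-SECL verb lists. The verb subsets below are taken from Figure 10, but the function and data layout are our illustration only.

```python
# Illustrative only: a few training verbs per SECL, subsets of Figure 10.
TRAINING_VERBS = {
    "A": ["adapt", "formulate", "design", "create", "negotiate"],
    "B": ["correlate", "analyze", "evaluate", "appraise", "contrast"],
    "C": ["apply", "compute", "demonstrate", "predict", "solve"],
    "D": ["classify", "describe", "explain", "estimate", "interpret"],
    "E": ["define", "enumerate", "identify", "list", "recognize"],
}

def training_statement(secl: str, verb: str, skill: str) -> str:
    """Build a training-objective statement in the paper's template."""
    if verb not in TRAINING_VERBS[secl]:
        raise ValueError(f"{verb!r} is not a level-{secl} training verb")
    return f"The systems engineer shall be able to {verb} {skill}."

print(training_statement(
    "B", "correlate",
    "data for magnitude and trends against established criteria"))
```

The example output matches the style of the enabling skills in the appendix, e.g., the level-B skill for risk analysis.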

Ensure Quality Example. Figure 11 shows how the three Taxonomy Models are applied to the Ensure Quality Focus Area. Figure 12 presents a detailed illustrative example of enabling skills for the SPs keyed to Figure 11.

[Figure 11, diagram omitted; its content follows.]

Enterprise Level. Goal: All projects will improve System Quality per benefit "e" of EIA/IS 731.

Project or Program Level. Goals: Manage Quality on Project X per FA 2.8; Manage Quality on Project Y per FA 2.8; Manage Quality on Project Z per FA 2.8.

Individual or Skill Level.
2.8-1 Leadership/Involvement: Communicate management roles; Assign responsibilities; Create quality environment (see enabling skills in Figure 12).
2.8-2 Quality Process: Evaluate work products; Establish corrective action process; Do other SPs, GPs, GAs (see enabling skills in Figure 12).
2.8-3 Tools & Techniques: Use tools to reduce defects; Provide just-in-time training; Do other GPs and GAs (see enabling skills in Figure 12).

Figure 11. Models Applied to Ensure Quality

FA 2.8 Ensure Quality

Th 2.8-1 Leadership and Involvement
SECL A: None. SECL B: None. SECL C: None.
SP 2.8-1-2b (SECL D). Practice: Create an environment that encourages employee participation in identifying, reporting, and solving quality issues. Enabling skill: Explain the problem solving process for quality issues.
SP 2.8-1-2a (SECL D). Practice: Assign responsibility for product quality activities and improvements to the program team. Enabling skill: Describe roles associated with product quality activities and improvements and explain the selection of associates for each role.
SP 2.8-1-1 (SECL E). Practice: Communicate management's role in quality improvement activities. Enabling skill: Enumerate quality improvement activities and management's role associated with each activity.

Th 2.8-2 Quality Process
SECL A: None.
SP 2.8-2-4 (SECL B). Practice: Feed back lessons learned into processes for robustness of future designs. Enabling skill: Analyze lessons learned to influence future process designs.
SP 2.8-2-3b (SECL C). Practice: Perform in-progress or incremental evaluations of work products and system elements against requirements. Enabling skill: Demonstrate conformance of work products and system elements against requirements.
SP 2.8-2-3a (SECL C). Practice: Evaluate processes for adherence to standards and policies throughout the system life cycle. Enabling skill: Record standards and policies used throughout the system life cycle.
SP 2.8-2-2 (SECL D). Practice: Establish a process to detect the need for corrective actions to products and processes. Enabling skill: Describe problems requiring corrective actions to products and processes.
SP 2.8-2-1 (SECL E). Practice: Evaluate work products and system elements against requirements. Enabling skill: List requirements for products and system elements.

Th 2.8-3 Tools and Techniques
SECL A: None. SECL B: None. SECL C: None.
SP 2.8-3-2 (SECL D). Practice: Provide readily available, just-in-time training on the use of advanced quality improvement tools. Enabling skill: Describe readily available training for the use of advanced quality tools.
SP 2.8-3-1 (SECL E). Practice: Use quality improvement tools in a disciplined manner to reduce defects and improve productivity. Enabling skill: Select quality tools in the appropriate sequence to initially determine potential root-cause problems affecting defects and productivity.

Figure 12. Ensure Quality Focus Area

At the Enterprise Level, a goal from the Strategic Enterprise Model of practicing quality management is established based upon the "improve system quality" benefit of EIA/IS 731 (see item "e" in Figure 2). Since this is a corporate goal, every program is obligated to incorporate quality management at some level. Depending upon the type of program and life cycle phase, e.g., concept exploration or full-scale development, different strategies and levels of quality management are developed. The program would then assess, using the Competency Needs Model through a needs analysis, whether the skills exist or whether they need to be acquired at the individual level. If the skills exist, then no further training needs to be accomplished. However, if the skills need to be acquired through training, courses, or hiring, then using the Systems Engineering Competency Model, the objectives are developed and the skills acquired.

FUTURE ACTIVITIES
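The Focus Area / Theme / Specific Practice / enabling-skill linkage illustrated by the Ensure Quality example can be modeled as simple records. The identifiers follow EIA/IS 731 numbering and the two entries are transcribed from the example, but the data structure itself is our illustration, not part of the standard or the Taxonomy.

```python
from dataclasses import dataclass

@dataclass
class TaxonomyEntry:
    """One row of the Taxonomy: a Specific Practice linked to the
    competency level and enabling skill needed to perform it."""
    sp_id: str           # Specific Practice identifier (EIA/IS 731 numbering)
    secl: str            # Systems Engineering Competency Level, E (lowest) to A
    practice: str
    enabling_skill: str

# Two entries transcribed from the Ensure Quality example:
ENSURE_QUALITY = [
    TaxonomyEntry("SP 2.8-1-1", "E",
                  "Communicate management's role in quality improvement "
                  "activities",
                  "Enumerate quality improvement activities and management's "
                  "role associated with each activity"),
    TaxonomyEntry("SP 2.8-2-2", "D",
                  "Establish a process to detect the need for corrective "
                  "actions to products and processes",
                  "Describe problems requiring corrective actions to "
                  "products and processes"),
]

def skills_for_level(entries, secl):
    """Enabling skills required at one competency level."""
    return [e.enabling_skill for e in entries if e.secl == secl]

print(skills_for_level(ENSURE_QUALITY, "D"))
```

A full Taxonomy would hold one such record per Specific Practice in every Focus Area, which is the "full population" the paper says the document will contain.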

Near Term Activities. A detailed outline of The Taxonomy of Systems Engineering Competency will be finalized. The team plans to complete and prototype several sections of The Taxonomy of Systems Engineering Competency. These prototype sections will be given to key reviewers to solicit additional feedback and concurrence.

Mid-Term Activities. An author's guide will be generated and the core team will be expanded; additional sections of The Taxonomy of Systems Engineering Competency will be completed. At appropriate times, key reviewers will be asked to provide further feedback and concurrence. The document will be completed using this iterative "generate, review, incorporate comments" process. After the products of the CMMI effort are released, the concepts presented in this paper will be examined for applicability to the CMMI products.

Completion Activities. When The Taxonomy of Systems Engineering Competency is close to completion, per discussions in the INCOSE Technical Board, the author team anticipates receiving a survey of systems engineering courses from the INCOSE Education and Training Technical Committee. This survey will be augmented as needed and mapped to The Taxonomy of Systems Engineering Competency. Both of these products requested by the INCOSE CAB will be provided to the INCOSE Technical Board and general INCOSE membership for review, incorporation of comments, and subsequent Technical Board approval.

The Taxonomy of Systems Engineering Competency should not be misconstrued as the Systems Engineering Body of Knowledge (SEBOK) being developed by INCOSE. However, the Taxonomy has the potential of being a significant citation by the SEBOK as a valuable part of the body of knowledge for systems engineering. This would especially be the case if the SEBOK were aligned with the EIA/IS 731 SECM, as is The Taxonomy of Systems Engineering Competency.
REFERENCES
[1] Merriam Co.; Webster's Third New International Dictionary, Unabridged; Encyclopedia Britannica; Illinois; 1976.
[2] E. R. Widmann, G. J. Hudak, T. A. Hudak, and G. E. Anderson; "Development of a Skills and Knowledge Taxonomy Aligned with the EIA/IS 731.1 Systems Engineering Capability Model"; Proceedings of the Ninth Annual International Symposium of the International Council on Systems Engineering; Brighton, United Kingdom; June 6-11, 1999.
[3] EIA Interim Standard 731, Systems Engineering Capability, EIA 731.1, Part 1: Model, Version 1.0; Electronic Industries Alliance; Arlington, VA; December 1998.
[4] EIA Interim Standard 731, Systems Engineering Capability, EIA 731-2, Part 2: Appraisal Method, Version 1.0; Electronic Industries Alliance; Arlington, VA; December 1998.
[5] G. J. Hudak; "Using the Capability Maturity Model to Improve Organizations"; Proceedings of the 5th Conference on Concurrent Engineering and Integrated Product Design; Hanover, NJ; September 1998.
[6] E. R. Widmann; "Key Features of the 'Merged' EIA/IS 731.1 Systems Engineering Capability Model"; Proceedings of the Eighth Annual International Symposium of the International Council on Systems Engineering; pages 597-604; Vancouver, British Columbia, Canada; July 26-30, 1998.
[7] "Section 7: Measuring Systems Engineering Capability"; Systems Engineering Handbook; International Council on Systems Engineering; Seattle, WA; January 1998.
[8] Bloom, Benjamin S.; Taxonomy of Educational Objectives; Longman; New York; 1956, 1984.
[9] R. J. Stiggins; Measuring Thinking Skills in the Classroom; National Education Association; 1988.

BIOGRAPHIES

E. R. Widmann is in the Satellite Systems Engineering Operation of Hughes Space and Communications Company (HSC). He is currently involved with an effort to improve the management of requirements across several product lines of commercial communications and government satellites in order to achieve increased efficiencies. Previously, at Hughes Aircraft Company (HAC), now Raytheon Systems Company since December 1997, he served as a systems engineering manager on a number of electro-optical product development efforts for both land and airborne applications. He was also a key participant in systems engineering process activities, serving as the: (1) HAC-wide systems engineering representative for improving the HAC-developed, common Product Development Process (PDP) and (2) HAC-wide systems engineering lead responsible for deployment of a common systems engineering process, which was an integral part of the PDP. He also supported process-related activities associated with the transition of Hughes Aircraft Company to the Raytheon Systems Company. Prior to joining HAC, he was an officer in the US Army and spent four years in the Office of the Secretary of the Air Force, Office of Special Projects, and two years at the US Army Satellite Communications Agency. Mr. Widmann is currently a Member of the Technical Board of INCOSE, where he is Chair of a special Technical Board Team to develop The Taxonomy of Systems Engineering Competency. He previously served as a Member of the INCOSE Technical Board and Chair of the INCOSE Measurement Technical Committee from 1995 to 1998. He was the Chair of the INCOSE Capability Assessment Working Group from 1992 to 1995, where he organized and significantly contributed to the development of the INCOSE Systems Engineering Capability Assessment Model. He was the Associate Chair (US) of the INCOSE Standards Task Force from 1997 to 1998. He was the lead INCOSE representative on the EIA SECM Working Group and a co-author of the EIA/IS 731 Systems Engineering Capability. He is currently the HSC representative to the EIA G-47 (Systems Engineering) Committee and the HSC alternate representative to the INCOSE Corporate Advisory Board. Mr. Widmann has published extensively on systems engineering subjects. He has an M.S. degree in Electrical Engineering from Purdue University and both a B.S. degree in Electrical Engineering and a B.A. degree with a concentration in Physics from Rutgers University.

G. E. Anderson is in the Space Systems Division of Honeywell International as the Technical Director for Space Shuttle Multiplexers/Demultiplexers. He chairs Honeywell's corporate Systems Engineering Council and is responsible for developing and deploying systems engineering processes, tools, training, and certification guidelines. His 30 years of industry experience span consumer electronics, urban transportation systems, military cockpit systems, and human space vehicle systems. Mr. Anderson has received many awards for his systems management and technical contributions to the industry. His applied technology experience includes advanced research and patents in color television, development of systems for the Washington DC Metro, advanced cockpit display systems on major military aircraft, an airborne real-time video map optical memory system, and modernized data control and glass cockpit systems aboard the Space Shuttle. He is a recognized mentor and coaches both individuals and groups in developing their domain and process skills. Mr. Anderson represents Honeywell International on the INCOSE Corporate Advisory Board, served as an active member of Working Groups in the Measurement Technical Committee, and presently participates on the Educational Measurement Working Group. He is also on a special Technical Board Team developing The Taxonomy of Systems Engineering Competency, and has served as a Director of the Central Arizona Chapter. He has a BSEE degree from the J. B. Speed Engineering School with advanced studies at Purdue University and Arizona State University.

G. J. Hudak joined the faculty at Stevens Institute of Technology in Hoboken, NJ as a Distinguished Professor and is Director of the Integrated Product Development (IPD) Program. Previously, he was a Senior Member of the Technical Staff at AT&T Laboratories and served as an internal consultant in Project Management. In addition, he was involved in process optimizations at several manufacturing facilities prior to trivestiture at AT&T. At Stevens Institute of Technology, Professor Hudak is a member of the Mechanical Engineering Department and teaches graduate courses in the Mechanical Engineering and Concurrent Engineering Programs. He is a registered professional engineer, a Senior Member of the American Society for Quality, a member of the International Council on Systems Engineering, and a Senior Member of IEEE. He was one of the co-authors of EIA/IS 731 Systems Engineering Capability and one of the participants in the development of the INCOSE Systems Engineering Capability Assessment Model. He is also on a special Technical Board Team developing The Taxonomy of Systems Engineering Competency. He has a BSEE degree from Fairleigh Dickinson University, an MS degree in Technology Management from Stevens Institute of Technology, and is currently in a doctoral program at Stevens Institute of Technology.

T. A. Hudak is an adjunct professor in the School of Management of Stevens Institute of Technology, where she teaches Organization Behavior to graduate students. Previously she was a member of the Organization Development and Effectiveness Group of Bell Laboratories Advanced Technologies at Lucent Technologies. She is a past Chair of the Princeton Section of the American Society for Quality and a member of the Association for the Management of Organization Design. She is also on the special INCOSE Technical Board Team developing The Taxonomy of Systems Engineering Competency. She has an MS degree in Organization Development from Mankato State University and ABD in Adult Learning from Rutgers University.

APPENDIX: MANAGE RISK FOCUS AREA In [2] the authors provided an illustrative example consisting of only three themes for the Manage Risk Focus Area. Based upon the positive feedback received and to provide enhanced continuity with the concepts described in this paper and [2], a completed example for all themes within the Manage Risk Focus Area is provided.

FA 2.5 Manage Risk

Th 2.5-1 Risk Management (RM) Plan
SECL A: None. SECL B: None.
SP 2.5-1-3 (SECL C). Practice: Implement RM for key processes within the program: design, test, manufacturing, etc. Enabling skill: Predict risk event impact upon processes.
SP 2.5-1-2 (SECL D). Practice: Provide an approved RM plan containing risk levels and expected management responses for each level. Enabling skill: Classify risk events to expected management responses.
SP 2.5-1-1 (SECL E). Practice: Plan RM activities. Enabling skill: Know the elements of a Risk Management Plan.

Th 2.5-2 Identification of Performance, Cost, and Schedule Risks
SECL A: None. SECL B: None.
SP 2.5-2-3 (SECL C). Practice: Review all elements of the work breakdown structure as part of the risk identification process in order to help ensure that all program aspects have been considered. Enabling skill: Relate each work breakdown structure element to all other elements to identify all interactions and any high-leverage sensitivities.
SP 2.5-2-2 (SECL D). Practice: Identify cost and schedule risks. Enabling skill: Estimate parametric costs and schedules.
SP 2.5-2-1 (SECL E). Practice: Identify performance risks. Enabling skill: List performance parameters and tolerances.

Th 2.5-3 Risk Quantification
SECL A: None. SECL B: None. SECL C: None.
SP 2.5-3-2 (SECL D). Practice: Assess each risk and determine the probability of occurrence and quantified consequence of impact for the program. Enabling skill: Estimate the likelihood of the risk occurring and communicate impact.
SP 2.5-3-1 (SECL E). Practice: Assess risks qualitatively. Enabling skill: Describe the impact in terms of cost, schedule, and performance to the program.

Th 2.5-4 Risk Analysis
SP 2.5-4-5 (SECL A): None.
SP 2.5-4-4 (SECL B). Practice: Use collected metrics regarding identified risks and examine them in light of previous risk analyses, and when established thresholds are exceeded, initiate corrective action. Enabling skill: Correlate and analyze data for magnitude and trends against established criteria, and distinguish data threshold values from "noise" in the data.
SP 2.5-4-3a (SECL C). Practice: Review the analysis of risks for adequacy and completeness. Enabling skill: Apply standard criteria using a checklist.
SP 2.5-4-3b (SECL C). Practice: For each risk, establish cause and effect relationships. Enabling skill: Relate events and outcomes using a decision tool.
SP 2.5-4-3c (SECL C). Practice: Analyze each risk for potential coupling to all other identified risks. Enabling skill: Show the relationship between each risk and all other risks using sensitivity analysis.
SP 2.5-4-3d (SECL C). Practice: Develop alternative courses of action, workarounds, and fall-back positions with a recommended course of action for each risk. Enabling skill: Apply a decision tree method and impact analysis using a planning tool to minimize negative consequences.
SP 2.5-4-2 (SECL D): None.
SP 2.5-4-1 (SECL E): None.

Th 2.5-5 Development of a Risk Mitigation Strategy
SP 2.5-5-5 (SECL A): None.
SP 2.5-5-4 (SECL B): None.
SP 2.5-5-3a (SECL C). Practice: Document risk reduction profiles and review them for appropriateness. Enabling skill: Relate risk profile attributes to determine changes and risk triggers.
SP 2.5-5-3b (SECL C). Practice: Review risk mitigation (handling), including the risk reduction profile, for adequacy and completeness. Enabling skill: Record data and apply it to risk mitigation and risk profiles.
SP 2.5-5-2 (SECL D). Practice: Categorize risks into those that can be avoided, controlled, or accepted. Enabling skill: Classify risks into a risk rating schema.
SP 2.5-5-1 (SECL E): None.

Th 2.5-6 Implementation of the Risk Mitigation Strategy
SP 2.5-6-5 (SECL A): None.
SP 2.5-6-4 (SECL B): None.
SP 2.5-6-3 (SECL C). Practice: Document risk analysis results and mitigation plans. Enabling skill: Record data and plans in the prescribed format and location.
SP 2.5-6-2 (SECL D). Practice: Implement the risk mitigation strategy for the program. Enabling skill: Convert strategic plan wording into implementing actions.
SP 2.5-6-1 (SECL E): None.

Th 2.5-7 Monitoring of Risk Mitigation Action
SP 2.5-7-5 (SECL A): None.
SP 2.5-7-4 (SECL B). Practice: Use collected metrics regarding identified risks and examine them in light of previous risk analyses, and when established thresholds are exceeded, initiate corrective action. Enabling skill: Distinguish new risks based on data analysis and categorize them for needed action.
SP 2.5-7-3a (SECL C). Practice: Monitor and re-evaluate risks at appropriate milestones. Enabling skill: Record and relate actual vs. predicted outcomes.
SP 2.5-7-3b (SECL C). Practice: Provide the results of risk monitoring activities to affected personnel and disciplines. Enabling skill: Apply data sorting, data retrieval, and publication methods based on flag criteria.
SP 2.5-7-3c (SECL C). Practice: Provide a mechanism for monitoring corrective actions taken and tracking open risk items to closure. Enabling skill: Utilize a data capture, data organization, data retrieval, and publication process, procedure, and tool.
SP 2.5-7-2 (SECL D): None.
SP 2.5-7-1 (SECL E): None.

Th 2.5-8 Communication and Coordination of Risk Status and Risk Mitigation Efforts Across Affected Groups
SP 2.5-8-5 (SECL A): None.
SP 2.5-8-4 (SECL B): None.
SP 2.5-8-3 (SECL C). Practice: Include risk management as a part of program formal reviews. Enabling skill: Show the status of risk elements and relate their significance.
SP 2.5-8-2a (SECL D). Practice: Involve a multi-functional group for risk management that spans both technical and business specialties. Enabling skill: Describe and illustrate inter-relationships between multi-functional groups.
SP 2.5-8-2b (SECL D). Practice: Integrate risk management both vertically and horizontally across the program. Enabling skill: Describe and illustrate inter-relationships between all program elements.
SP 2.5-8-1 (SECL E). Practice: Establish a communication path between the risk management team and the program management team. Enabling skill: Define a tailored process and describe a step-wise procedure for the format, content, distribution, and archiving of risk data.
