AC 2007-112: A QC-SYSTEMS APPROACH TO IE PROGRAM OUTCOMES ASSESSMENT

Robert Batson, University of Alabama

Robert G. Batson is Professor and Head of Industrial Engineering at The University of Alabama, where he teaches and performs research in statistical quality control, quality engineering, risk assessment, and reliability. In 22 years at Alabama, he has published over 45 refereed journal articles and has held research contracts and grants worth over two million dollars with organizations such as BellSouth, Mercedes-Benz, the FAA, and NASA. Prior to joining UA, he worked for five years as a systems engineer with Lockheed Corporation. He received an M.S. in Mathematics from Florida State in 1974, and two degrees from Alabama in 1979: a Ph.D. in Mathematics and an M.S. in Industrial Engineering. Bob is a Registered Professional Engineer in California, and was elected Fellow of the American Society for Quality Control in 1996. He is past president of the Central Alabama Chapter 9 of IIE, and serves as an IE Program Evaluator for ABET. He is also past president of the Southeastern Section of ASEE.

© American Society for Engineering Education, 2007

A QC-Systems Approach to IE Program Outcomes Assessment

Abstract

An effective approach to the current expectations for ABET Outcomes Assessment is to view the associated indicators, measurements, and corrective action as a quality control (QC) system. This paper is a case study of how an IE program planned, designed, and implemented such a system consisting of 91 performance indicators, with measurements obtained via six distinct instruments. Measurements of these 91 indicators are captured at time intervals varying from semester-to-semester to every three years, and entered in archival spreadsheets programmed to present the cumulative data in the form of tables and line graphs. These informative graphs are reviewed annually by an Outcomes Assessment Committee, which rates each indicator with a status of red, yellow, or green. Red indicators call for immediate action, yellow indicators are to raise awareness in the department, and green indicators signal "no problems" at this time. Assessment Memos are written to the Curriculum Committee, Department Head, and individual faculty members who are responsible for a particular course, topic, or skill. The Assessment Memos therefore serve a role analogous to Corrective Action Requests in industrial QC. The 91 performance indicators are linked to 17 outcomes adopted by the department. These 17 outcomes include the eleven ABET Criterion 3 a-k statements1, as well as six outcomes specific to the BSIE program at this institution. Each outcome has at least two indicators, and the average number of indicators per outcome is five. At least one indicator must be a direct measure, and another must be indirect. These diverse indicators enable "triangulation" as the faculty view the performance of the program in a given outcome from multiple perspectives. In turn, outcomes map back to one or more program objectives, so that the department can evaluate its performance in meeting its stated program objectives. An innovative approach to obtaining "after graduation" evaluations of how well objectives are met by recent program graduates is also described. We explain how the QC-System Approach is fully compatible with ABET's "two-loop" assessment and evaluation process1, which the department adopted in May 1999. The approach described above has been in operation for over seven years. It was fully demonstrated in our 2000-2001 Self-Study, was reviewed by our program visitor for the Fall 2001 Accreditation Visit, and continues to produce useful information in a consistent and highly efficient way. Only ten hours per year of faculty/staff time are spent on data collection and entry, and five hours per year are spent on assessment. Student involvement is limited to the time students devote to completing the senior exit interview and taking the Fundamentals of Engineering (FE) Exam. We have recently modified the performance indicators based on changes in the BSIE curriculum and the IE portion of the FE exam, and are preparing to present the results of the past six years of system utilization in another general review. We conclude by recommending such an approach as natural for any IE program, and certainly feasible for any engineering program.
Introduction

The Accreditation Board for Engineering and Technology (ABET) has adopted as its motto "Quality Assurance in Engineering, Computing, and Technology Education." ABET's Engineering Criteria 2000 (EC 2000)1 was used voluntarily in accreditation visits starting with the Fall 1998 visit cycle, with full implementation in Fall 2001. The Department of Industrial Engineering (IE) at The University of Alabama (UA) was one of the programs visited in Fall 2001, though curriculum and assessment changes here began literally immediately after the previous ABET visit in October 1995. This paper focuses on the development, implementation, and results of using a formal EC 2000 Outcomes Assessment System for the BSIE program at Alabama.

Sarin2 has observed that "EC 2000 is a focused attempt to bring quality assurance to the field of engineering education in a very formal and direct manner." He goes on to explore similarities and differences between EC 2000 and the well-known international quality system standard ISO 9000, concluding that there is a strong resemblance. Sarin's paper2 had the objective "to establish EC 2000 as a vehicle for quality assurance in engineering education," which he observed as requiring a significant shift from merely documenting program objectives to establishing a system of outcomes, performance indicators, measurements, evaluation, and feedback linked to program objectives. Our paper extends Sarin's conceptual insights into practice by describing a real quality control (QC) system, in use for EC 2000 outcomes assessment within the UA IE Department since August 1999.

An effective approach to EC 2000 Outcomes Assessment is to view the associated indicators, measurements, and corrective action as a quality control (QC) system. This paper is a case study of how an IE program planned, designed, and implemented such a system consisting of 91 performance indicators, with measurements obtained via six distinct instruments. Measurements of these 91 indicators are captured at time intervals varying from semester-to-semester to every three years, and entered in archival spreadsheets. Excel macros present the cumulative data in three forms: tables, bar graphs, and line graphs. These informative graphs are reviewed annually by an Outcomes Assessment Committee, which rates each indicator with a status of red, yellow, or green. Red indicators call for immediate action, yellow indicators are to raise awareness in the department, and green indicators signal "no problems" at this time. Assessment Memos are written to the Curriculum Committee, Department Head, and individual faculty members who are responsible for a particular course, topic, or skill. The Assessment Memos therefore serve the role of Corrective Action Requests in industry.

The 91 performance indicators are linked to 17 outcomes adopted by the department several years ago. These 17 outcomes include EC 2000 Criterion 3 a-k, as well as six outcomes specific to the BSIE program at Alabama. Each outcome has at least two indicators, and the average number of indicators per outcome is five. This enables "triangulation" as we view the performance of the program in a given outcome from multiple perspectives. Outcomes in turn map back to one or more program objectives, so that on an annual basis the department can evaluate its performance in meeting its stated program objectives. We will explain that the QC-System Approach is fully compatible with ABET's "two-loop" assessment and evaluation process (see Figure 1), which the Department adopted in May 1999.

The approach described above has been in operation since August 1999. We captured data from departmental archives, then continued entering new data throughout 1999-2000. In Fall 2000, we used the system to produce six Assessment Memos, which were then acted upon by the responsible faculty member or the Curriculum Committee. This approach was fully demonstrated in our 2000-2001 Self-Study, was reviewed by our program visitor for the Fall 2001 Accreditation Visit, and continues to produce useful information in a consistent and efficient way.

Development of IE Department's EC 2000 Outcomes Assessment System

To prepare to develop the department's Outcomes Assessment System, it was clear that at least one faculty member needed training. The department head and departmental quality engineering expert, Robert G. Batson, attended the following two training sessions:

• ABET Regional Faculty Workshop on EC 2000, December 4-6, 1998, Atlanta, GA
• EC 2000 Program Evaluator Training, May 21, 1999, Phoenix, AZ

The most useful result of the December 1998 workshop was the construction of an eight-step plan for system development (Table 1), which was briefed to the IE faculty in January 1999. Essentially, we had already completed steps 1-3 and most of step 4 in this plan in the course of a curriculum revision completed in January 1999. The last bullet of step 4, "detail performance criteria for each outcome," is where new system development had to begin. We adopted the terminology "performance indicator" for a specific performance criterion linked to an outcome. The rest of the plan, steps 5-8, provided detail for what we knew we must do to complete the cycle around the ABET two-loop process shown in Figure 1. In parallel to the final stages of curriculum revision, we began to design a formal assessment system. In Table 2, we list the entire process to establish and begin data entry for the system.

Figure 1. Evaluation and Assessment Cycle adopted by the IE Department. [Two-loop process diagram; its numbered steps include: determine educational objectives; input from constituencies; determine outcomes required to achieve objectives; determine how outcomes will be achieved (formal instruction, student activities); determine how outcomes will be assessed; establish indicators that objectives are being achieved; and evaluate/assess.]

Table 1. "Steps in Plan"3

1. Identify constituents and survey needs for future graduates of the program
2. Locate and review mission statements
   • University
   • College
   • Department (consider revision if necessary)
3. Define Objectives
   • consistent with mission(s)
   • consistent with constituent needs
   • document linkages
4. Define Outcomes
   • document linkages to objectives
   • consistent with ABET Criteria (a) - (k)
   • consistent with institutional resources
   • detail performance criteria for each outcome
5. Identify "instruments" to generate the assessment data
   • measurement scales, units
   • for which performance criteria?
   • frequency of measurement?
   • data storage and analysis media?
   • who does the evaluations? (e.g., committee, individual faculty, student advisor, dept. head)
6. Corrective action
   • responsibility
   • documentation
7. Faculty training and build support
8. Examine each (current) course
   • How it meets outcomes and objectives
   • What sorts of assessments are performed now (to assign grades)
   • Map these to performance measures and/or outcomes

The most useful result of the May 1999 Program Evaluator Training was confirmation that we were on the right track: to begin formal data collection on performance indicators, using well-defined instruments on a consistent basis, and to store these data in electronic form for ease of analysis. In Table 3, we show the Educational Objectives and Program Outcomes that were determined for that purpose at the August 1997 faculty meeting mentioned in Table 2 (Assessment System Development). These Objectives and Outcomes were the starting place for both curriculum revision and outcomes assessment system development. Returning to Table 2, by May 16, 1999 a total of 91 outcome indicators had been identified for the 17 student outcomes; some outcomes had as many as 12 indicators, others as few as two. Each of these indicators is now measured within one of the six "IE Outcome Indicator Measurement Instruments" listed in Table 4. The implemented EC 2000 Outcomes Assessment System is best viewed as a combination of:

• A Measurement Subsystem, consisting of:
  – 17 program outcomes
  – 91 performance indicators
  – A unit of measurement for each indicator
  – A frequency of measurement for each indicator
  – An instrument that provides each measurement

• An Assessment Subsystem, consisting of:
  – Data storage and analysis by Excel applications (six instruments)
  – An Outcomes Assessment Committee to review the assessment data annually
  – A series of Assessment Memos published by the Assessment Committee to specific committees and/or individuals in the department, always including the department head as a recipient.
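To make the structure of the Measurement Subsystem concrete, the following sketch models outcomes, indicators, and instruments as simple records. It is a minimal illustration in Python, not the department's actual Excel implementation; the class and field names, and the example indicator values shown, are our own assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One of the 91 performance indicators tied to a program outcome."""
    code: str            # e.g., "g.1"
    description: str
    unit: str            # e.g., "1-5 rating", "percentage", "0-4 GPA"
    frequency: str       # e.g., "every semester", "every three years"
    instrument: str      # which of the six measurement instruments supplies the data
    measurements: dict = field(default_factory=dict)  # period label -> group-average value

@dataclass
class Outcome:
    """One of the 17 program outcomes, each measured by at least two indicators."""
    letter: str          # "a" through "q"
    statement: str
    indicators: list = field(default_factory=list)

# Hypothetical example: outcome (g) with one of its seven indicators.
g = Outcome("g", "an ability to communicate effectively")
g.indicators.append(Indicator(
    code="g.1",
    description="Oral communication skills",
    unit="1-5 rating",
    frequency="every semester",
    instrument="Senior Exit Interview",
))
g.indicators[0].measurements["Fall 2000"] = 4.2  # illustrative value only
```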

Table 2. Process to Establish IE Department EC 2000 Outcomes Assessment System

Step 1. Identify any objectives of the BSIE that are "local" in nature and differ from the seven objectives (already agreed to) derived from the "ABET 11" criteria. Result: Comprehensive list of educational objectives for UA BSIE recipients.
   Planned completion: January 1996. Actual completion: August 1997.

Step 2. Translate objectives into measurable outcomes we want to demonstrate for individual or groups of BSIE recipients, courses, or instructors. Result: Comprehensive list of outcome measures, each linked to one or more objectives.
   Planned completion: August 1996. Actual completion: August 1997.

Step 3. Identify the sensor or instrument that will be used to measure each outcome, frequency of measurement, scale, etc. Result: Comprehensive list of sensors and instruments to be applied in the BSIE assessment effort, with frequency and scale/units of measure.
   Planned completion: Dec. 15, 1998. Actual completion: May 15, 1999.

Step 4. Develop a data collection and storage system specification. Result: A system specification for the databases that will hold all measurements described in Step 3, including frequency of data updates, data labels and fields, and storage medium (e.g., Excel file, ASCII file, paper folder, videotape, audiotape).
   Planned completion: Feb. 1, 1999. Actual completion: June 1, 1999.

Step 5. Develop the logic for a data analysis system and the design of standard reports/screens plus special data sorting/analysis options. Result: A system specification for the data query, data analysis, and information presentation element of the IE Outcomes Assessment System.
   Planned completion: March 1, 1999. Actual completion: June 15, 1999.

Step 6. Implement the system on a selected IE PC.
   Planned completion: May 1, 1999. Actual completion: July 1, 1999.

Step 7. Test the system with fictitious data.
   Planned completion: May 15, 1999. Actual completion: July 15, 1999.

Step 8. Populate the system with real data.
   Planned completion: July 1, 1999. Actual completion: August 1, 1999.

Step 9. Train staff to perform data entry and use the system.
   Planned completion: August 1, 1999. Actual completion: August 15, 1999.

Step 10. Start formal use.
   Planned completion: August 15, 1999. Actual completion: August 15, 1999.

Table 3. UA B.S. Industrial Engineering Program Educational Objectives with Linkage to Program Outcomes

Graduates should demonstrate/possess:

1. An understanding of the mathematical and scientific foundations of industrial engineering as well as the ability to apply this foundation material to engineering problems. (a), (e), (k), (l), (m)
2. The ability to apply an engineering design methodology to unstructured problems and to evaluate alternative solutions in the broader context of an organization or society. (c), (e), (h), (j), (n)
3. The ability to plan and conduct analytical and experimental studies that incorporate statistical, computer, and other appropriate techniques. (b), (e), (k), (m), (o)
4. The ability to communicate effectively for presentation and persuasion using oral, written, and electronic media. (g), (p), (q)
5. The ability to organize, lead, coordinate, and participate in industrial engineering and multi-disciplinary teams. (d), (l), (n)
6. An appreciation of the humanities, social sciences, and contemporary issues for the general education of the individual and as resources for engineering studies and professional behavior. (h), (j)
7. An appreciation of the ethical and professional responsibilities of Industrial Engineers and the benefits of a commitment to life-long learning. (f), (i)

The eleven letters (a) - (k) refer to ABET 2000 Criterion 3, Program Outcomes and Assessment. To be accredited, engineering programs must demonstrate that, minimally, their graduates have:

(a) an ability to apply knowledge of mathematics, science, and engineering
(b) an ability to design and conduct experiments, as well as to analyze and interpret data
(c) an ability to design a system, component, or process to meet desired needs
(d) an ability to function on multi-disciplinary teams
(e) an ability to identify, formulate, and solve engineering problems
(f) an understanding of professional and ethical responsibility
(g) an ability to communicate effectively
(h) the broad education necessary to understand the impact of engineering solutions in a global and societal context
(i) a recognition of the need for, and an ability to engage in, life-long learning
(j) a knowledge of contemporary issues
(k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.

The six letters (l) - (q) refer to the following six outcomes, which are unique abilities to be found in every graduate of the University of Alabama B.S.I.E. program:

(l) an ability to understand the human components of a system
(m) an ability to apply statistical process control and continuous improvement tools
(n) an understanding of production and information systems
(o) an ability to construct conceptual and mathematical models of operational and economic decision situations
(p) an ability to prepare and present a professional, computer-assisted briefing
(q) an ability to prepare a professional-style engineering report

Table 4. IE Outcomes Indicator Measurement Instruments

Data Source: IE Alumni Surveys
Database Description: "IE Alumni Survey Database", 9 Performance Indicators (1-5 scale, strongly disagree to strongly agree). Spring 1996 and every three years thereafter:
   • Total BSIE response
   • "Last decade" graduates response
   • Questions keyed to EC 2000 Criteria 3d, f, h, i, j, k

Data Source: IE Senior Exit Interviews
Database Description: "IE Graduating Senior Exit Interview Database", 12 Performance Indicators (1-5 scale, poor to excellent). Spring 1999 and every semester thereafter:
   • Total senior class response
   • Questions keyed to EC 2000 Criteria a-k

Data Source: FE Exam Scores
Database Description: "IE Seniors' FE Exam Scores Database", 20 Performance Indicators (percentage correct scale). Spring 1997 and twice per year thereafter:
   • Morning: Math, Science, Eng. Science, Ethics
   • Afternoon: 16 individual IE Topics

Data Source: Student Curriculum Worksheets (in each student's file)
Database Description: "IE Graduating Seniors' Average GPA Database", 18 Performance Indicators (4.0 scale). Spring 1999 and every semester thereafter:
   Grouped courses:
   • Math (MATH 125, 126, 227, 238) (4)
   • Natural Science (PH 105, PH 106, CH 131) (3)
   • Non-IE Eng. Sciences (ESM 201, ESM 250/264, ME 215, ECE 320, MTE 271) (5)
   • EC 110/111 and HU/L/FA electives (5)
   • PY 101, IE 253, IE 351 (work science courses) (3)
   • IE 321, IE 473 (467) (production/manufacturing courses) (2)
   Individual courses:
   • GES 126, GES 255, GES 257, DR 125, IE 203, IE 363, IE 364, IE 431 (461), IE 464, IE 475, EH 319

Note: All course titles for the above listed courses are provided in the Appendix.

Table 4. (continued)

Data Source: IE Faculty Course Gradebooks
Database Description: "IE Course Requirements Database", 23 Performance Indicators (percentage). Fall 1999 and every semester thereafter:
   Course average lab score for:
   • IE 253
   • IE 351
   • IE 321
   Course writing skills average score for:
   • IE 351
   • IE 431 (461)
   • IE 463 Project I
   • IE 463 Project II
   • IE 485 Project
   Course oral presentation average score for:
   • IE 464
   • IE 463 Project II (client)
   • IE 485 (faculty jury)
   • IE 485 (client)
   • IE 485 (instructor)
   IE 463/485 requirements average score:
   • IE 463 Ethics Challenge Game completion
   • IE 463 ABET Ethics Code homework
   • IE 463 Contemporary Issues homework
   • IE 463 Final Exam (comprehensive IE topics from sophomore and junior year)
   • IE 463 Project I overall (by instructor)
   • IE 463 Project II overall (by instructor)
   • IE 485 Project overall (by instructor)
   • IE 485 Project overall (by client)
   • IE 485 Teaming Skills (by instructor)
   • IE 485 Teaming Skills (by student team members)

Data Source: Student Evaluation of Courses
Database Description: "Testing Services' Database of Student Course Evaluations", 9 Performance Indicators (1-5 scale, poor to excellent). For each section taught:
   • Average student response to "assign this course a grade"

Note: All course titles for the above listed courses are provided in the Appendix.
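Each instrument in Table 4 feeds an archival spreadsheet to which a new group-average measurement is appended each time the instrument is administered. The short sketch below illustrates that bookkeeping step in Python with pandas; the file name, sheet layout, and column names are assumptions made for illustration, not the department's actual workbook structure.

```python
import pandas as pd

def record_measurement(path: str, indicator: str, period: str, value: float) -> pd.DataFrame:
    """Append one group-average measurement to an instrument's archival spreadsheet.

    Assumed layout: a single sheet with columns 'indicator', 'period', 'value'.
    """
    try:
        history = pd.read_excel(path)            # existing archive, if any
    except FileNotFoundError:
        history = pd.DataFrame(columns=["indicator", "period", "value"])

    new_row = pd.DataFrame([{"indicator": indicator, "period": period, "value": value}])
    history = pd.concat([history, new_row], ignore_index=True)
    history.to_excel(path, index=False)          # rewrite the archive with the new row
    return history

# Hypothetical usage: the Fall 2000 senior-class average for indicator g.1.
# record_measurement("exit_interview_archive.xlsx", "g.1", "Fall 2000", 4.2)
```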

Implementation of the BSIE Assessment System

The Department of Industrial Engineering has established seventeen (17) Program Outcomes (see Table 3) based on seven (7) Program Educational Objectives. The first eleven Program Outcomes (lettered a-k) are identical in wording to the eleven criteria called out in Criterion 3 of Engineering Criteria 2000. The departmental faculty reviewed the wording and intent of outcomes (a)-(k), and voted to adopt them verbatim as valid outcomes for our students. The outcomes (l)-(q) were carefully crafted by the IE faculty in August 1997 to express what would be unique about a UA BSIE graduate, as compared to other UA undergraduate engineering majors or BSIE recipients at other institutions. Note that each of the seven Objectives is linked to at least two Program Outcomes. For example, Educational Objective 5 is "The ability to communicate effectively for presentation and persuasion using oral, written, and electronic media." This Objective is linked to one "ABET 11" outcome and two "Unique BSIE" outcomes:

(g) an ability to communicate effectively
(p) an ability to prepare and present a professional, computer-assisted briefing
(q) an ability to prepare a professional-style engineering report.

Each outcome is measured on a regular continuing basis by at least two of the Performance Indicators. For example, considering outcome (g) above, the following seven Performance Indicators are used to determine the performance of our students in "an ability to communicate effectively":

g.1 "Oral communication skills" (1-5 rating, Senior Exit Interview)
g.2 "Written communication skills" (1-5 rating, Senior Exit Interview)
g.3 "Your BSIE curriculum helped you develop effective oral, written, and graphical communication skills" (1-5 rating, Alumni Survey)
g.4 "EH 319 Technical English" (0-4 GPA, Graduating Seniors' Worksheets)
g.5 "IE 351 Human Factors Writing Skills" (percentage, IE Course Requirements)
g.6 "IE 464 Project Oral Presentation" (percentage, IE Course Requirements)
g.7 "IE 431 Systems Simulation Writing Skills" (percentage, IE Course Requirements)

The Department of Industrial Engineering maintains a large Excel spreadsheet that documents the Performance Indicators used to measure and assure each group of students has achieved the Program Outcomes specified by the faculty. The listing g.1 through g.7 above is an example of a small portion of this matrix. The position of the department is that by measuring these 91 Performance Indicators on a regular continuing basis, comparing individual data points and collective trends to faculty-defined minimal standards, and preparing written interpretations of performance versus standard, the Program Outcomes are assessed and quality of the program is assured. We assure that graduates have achieved the Program Outcomes in two ways. As individuals, each student must enroll in and progress through the specified BSIE curriculum. A “C or better

in all prerequisites" rule, a "D or better in all courses" rule, and a 2.0 GPA requirement in all coursework and in all professional courses assure that the individual student has met the Program Outcomes. This can best be seen by studying a table which summarizes how courses required in the BSIE curriculum support the 17 Program Outcomes (a) - (q). Students also are assessed in groups to determine the extent to which they collectively achieve the Program Outcomes. In the case of each of the 91 Performance Indicators, an average measure of performance for a group (e.g., all graduating seniors in Fall 2000) is captured, entered into a spreadsheet, and stored for future evaluation. Current data are always compared with past, stored values of the same indicator for purposes of trend analysis. Minimal standards for these group measurements have been established by the IE faculty and are applied with each subsequent assessment cycle. Current standards, determined by consensus of the IE faculty, are:

• IE Alumni Survey Database: 4.0 or better on a 5-point scale
• IE Graduating Seniors' Exit Interview Database: 4.0 or better on a 5-point scale
• IE Seniors' FE Exam Score Database: UA IE graduates' average score should exceed the national IE graduates' average score
• IE Graduating Seniors' GPA Database: 3.0 or better on a 4-point scale
• IE Course Requirements Database: 80% or higher
• Database of Student Course Evaluations: 4.0 or better on a 5-point scale
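The evaluation step applies these standards to each new group-average measurement and assigns a green, yellow, or red status. The paper does not state a numeric rule separating yellow from red, so the tolerance band in the sketch below is an assumed illustration only; the standards themselves come from the list above.

```python
def rate_indicator(value: float, standard: float, higher_is_better: bool = True,
                   yellow_band: float = 0.05) -> str:
    """Assign a red/yellow/green status by comparing a group-average value to its standard.

    The 5% 'yellow_band' below the standard is an assumed tolerance for illustration;
    the faculty's actual judgment of yellow versus red is qualitative.
    """
    shortfall = (standard - value) if higher_is_better else (value - standard)
    if shortfall <= 0:
        return "green"                      # standard met or exceeded
    if shortfall <= yellow_band * standard:
        return "yellow"                     # close to standard: watch, no action required
    return "red"                            # well below standard: corrective action needed

# Illustrative use with the exit-interview standard of 4.0 on a 5-point scale:
print(rate_indicator(4.2, 4.0))   # green
print(rate_indicator(3.9, 4.0))   # yellow
print(rate_indicator(3.0, 4.0))   # red
```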

The Department has created a permanent BSIE Outcomes Assessment Committee with the following three annual responsibilities:

• Assure that all BSIE Program Outcomes data are captured and entered in the appropriate spreadsheet-based Measurement Instrument (see Table 4).

• Each Fall semester, review tabular and graphical output of each of the six Measurement Instruments, draw conclusions from these data, and write formal "Assessment Memos" containing Assessment Committee conclusions addressed to: 1) individual faculty members who may need to know a specific finding; 2) the BSIE Curriculum Committee; 3) the Head of Industrial Engineering.

• Each Spring semester, determine whether the IE Outcomes Assessment System needs modification, consult with affected faculty and staff, and implement acceptable changes.

Conclusions are presented in the form of a statement of the department standard, what was observed, and a rating of the observation as green, yellow, or red. As a reminder, yellow indicates the need for caution and future observation, but does not require anyone to take action. A red rating carries with it the expectation that the Curriculum Committee and/or the responsible faculty member(s) will plan and implement appropriate corrective action, with the expectation that the evaluation will improve the next time measurement is performed.

Of course, certain corrective action will require discussion and approval by the full IE faculty before it may be implemented.

Results of Use of the BSIE Assessment System

In this section, we describe the use of assessment system results to evaluate program objectives and to guide corrective action to improve courses and the curriculum.

a. Use of assessment results to confirm EOs are achieved

One form of proof that EOs are being met comes from the assessment of outcomes. Since each objective maps to at least two outcomes, and each outcome is measured by at least two outcome indicators, it is ultimately the collective evaluation of each indicator linked to a specific objective that measures how well that objective is being satisfied. To illustrate how objectives may be evaluated for achievement through their respective indicators, consider the case of BSIE Educational Objective 7: "An appreciation of the ethical and professional responsibilities of Industrial Engineers and the benefits of commitment to life-long learning." This objective is linked to two outcomes in the BSIE Outcomes Assessment System: Outcome f, which has five indicators, and Outcome i, which has two indicators. Table 5 shows how EO7 is being well achieved in the current BSIE program, because six of the seven outcome indicators measured in Fall 2000 are "green." The one that is "yellow" is explained in the table. Even if an objective had one or two of its multiple indicators in a "red" state, it would not necessarily mean that the objective is not being met, only that some improvement is possible. However, even one "red" indicator is a clear signal from the Assessment Committee to the Curriculum Committee and/or course coordinator that immediate investigation and corrective action are necessary. A table for each BSIE Educational Objective, similar to Table 5, was available for review during the Fall 2001 visit. In all cases these tables showed, through Fall 2000 outcomes indicator data, that each EO is being well achieved by the curriculum and the key processes that are in place and operating.

More recently (e.g., in the 2004 PEV training manual), ABET has started interpreting educational objectives to be "statements that describe the expected accomplishments of graduates during the first several years following graduation from the program." Furthermore, there must be "a process based on needs of constituencies in which objectives are determined and periodically evaluated." The use of outcomes assessment results, and linkages of outcomes to objectives, may not be enough to satisfy accreditation teams that achievement of EOs is being measured and evaluated in a meaningful manner. Two popular approaches to evaluating recent graduates on the job are: 1) ask the corporate representatives who come to campus to formally rate how well each objective is met in new hires from your program; 2) ask all of your recent alumni to place a simple questionnaire in the hands of their direct supervisor, to be returned anonymously by mail, fax, or Internet.
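Since each objective is evaluated through the collective status of the indicators linked to its outcomes, the roll-up can be expressed compactly. The sketch below aggregates indicator ratings to the outcome and objective level; the function name and the rule that any red indicator flags the objective for immediate attention are our own illustrative choices, consistent with, but not copied from, the committee's practice.

```python
from collections import Counter

# Hypothetical Fall 2000 snapshot for Educational Objective 7 (outcomes f and i),
# mirroring the structure of Table 5; values are illustrative.
objective_7 = {
    "f": {"f.1": "green", "f.2": "green", "f.3": "green", "f.4": "green", "f.5": "green"},
    "i": {"i.1": "yellow", "i.2": "green"},
}

def summarize_objective(outcomes: dict) -> dict:
    """Count indicator statuses across all outcomes linked to one objective."""
    counts = Counter()
    for indicators in outcomes.values():
        counts.update(indicators.values())
    return {
        "counts": dict(counts),
        # Any red indicator triggers immediate investigation (assumed decision rule);
        # otherwise yellows are monitored and an all-green objective needs no action.
        "action": "investigate" if counts["red"] else ("monitor" if counts["yellow"] else "none"),
    }

print(summarize_objective(objective_7))
# {'counts': {'green': 6, 'yellow': 1}, 'action': 'monitor'}
```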

Table 5. Evaluation of Educational Objective 7 via Outcomes Indicators, Fall 2000 Outcomes Assessment

Objective 7: An appreciation of the ethical and professional responsibilities of Industrial Engineers and the benefits of commitment to life-long learning.

Outcome f: An understanding of professional and ethical responsibility
Indicators:
   f.1 Combined senior opinion of how well developed: Green
   f.2 Combined alumni opinion of how well developed: Green
   f.3 IE 463 students completing "The Ethics Challenge": Green
   f.4 IE 463 Homework assignment on ABET Canons of Ethics: Green
   f.5 Average score on ethics portion of FE Exam: Green*

Outcome i: A recognition of a need for and an ability to engage in life-long learning
Indicators:
   i.1 Combined alumni opinion of how well developed: Yellow**
   i.2 Percentage of alumni with advanced degrees: Green

* Each of these red-yellow-green evaluations is from a Fall 2000 Outcomes Assessment Memo; for example, in f.5, our students consistently score above the national average for all IE students on the ethics portion of the FE exam.

** The Fall 1999 Alumni Survey showed recent alumni (past decade) had a lower commitment to life-long learning than the same group in the Fall 1996 Survey.

b. Use of assessment results to improve required courses

Assessment Memorandum 2000.1 cited one indicator as red and six indicators as yellow, based on IE student performance on the FE Exam. The one red indicator, n.3, concerned the average score of our seniors on the Manufacturing Processes section of the FE Exam. Exposure to manufacturing processes is a part of the four-hour course IE 321 Manufacturing Systems. While our students were scoring well on the Manufacturing Systems portion of the FE Exam, it was up to the IE 321 Course Coordinator and the 2000-2001 Curriculum Committee to decide how to remedy this "red" situation in Manufacturing Processes. The IE 321 Course Coordinator decided that within the four hours devoted to so many topics (manufacturing processes, manufacturing systems, automation, material handling, and facility layout), there was little he could do to increase content or retention of the 25% of the content devoted to manufacturing processes; what was needed was a separate course. He reported this to the Curriculum Committee, which subsequently took action.

Each of the FE Exam areas cited as "yellow" is referred to the course coordinator for notice and future observation, though no action is required. In the case of m.2, Total Quality Management, because the course IE 425 Statistical Quality Control was being taught that semester, the instructor was able to increase the emphasis on TQM by devoting one class session each to the ISO 9000 Quality System Criteria and the Malcolm Baldrige Performance Excellence Award Criteria. These topics are now a permanent part of IE 425 each semester it is taught. In the case of n.1, Facility Design and Location, refer to the next subsection for a major curriculum change proposed to help move this yellow to green. In the case of l.1, Industrial Ergonomics, the IE 351 Course Coordinator requested and received information from the Assessment Committee on the topics included in the Industrial Ergonomics portion of the FE, sample exam questions, and the booklet of formulas/tables provided to each examinee. This insight enabled him to emphasize certain topics in the Fall 2000 section of IE 351, and will influence future preparation for the course, although the course outline was not changed.

Because the tri-annual Alumni Survey had been completed in Spring 1999 and the faculty had reviewed the quantitative and qualitative information provided by our alumni, the Assessment Committee was not surprised to report in Assessment Memorandum 2000.2 three yellow findings from review of the IE Alumni Survey. Rated yellow based on alumni responses were the indicators:

i.1) Commitment to life-long learning
j.2) Knowledge of contemporary issues in technology and society
h.2) An understanding of the impact of engineering in U.S. and global society.

As stated earlier, the assignment of a "yellow" rating does not require action on behalf of the Curriculum Committee or individual faculty. In this case, however, the Fall 2000 IE 463 instructor was alerted to these alumni responses and took action to permanently increase the number and variety of required-reading outside articles related to "technology and society" and the "impact of engineering on U.S. and global society" in the course. A homework assignment related to these readings provides a performance indicator that is measured annually, rather than every three years. We expect alumni responses in Spring 2002 to questions j.2 and h.2 to show improvement and move these indicators from yellow to green. To address the downward trend in recent graduates' "commitment to life-long learning," the Spring 2001 IE 485 instructor created a permanent one-hour session devoted to discussion of the following topics:

• Responsibility of the employee to seek out skills training and educational opportunities
• Possible advanced degree paths for BSIEs
• What are certifications and why pursue them?
• Appreciation of engineering knowledge obsolescence (5-year half-life)
• Role of professional societies such as IIE, INFORMS, SME, APICS, and ASQ in maintaining one's professional currency.

Assessment Memorandum 2000.3 is interesting in that two items highlighted as "yellow" concerns by the tri-annual alumni survey also showed up in the senior exit interviews as non-green indicators:

h.1) Understand impact of engineering on U.S. and global society: rated "yellow" because 2 of 9 responses were only "3" (note the average response for 1999-2000 was 4.0, which is just at the standard).

j.1) Knowledge of contemporary issues in technology and society: rated "red" because the mean was 3.7, 5 of 9 students rated this a three, and two semesters had only a 3.0 average.

We noted above the actions taken by the Fall 2000 IE 463 instructor and the Spring 2001 IE 485 instructor to permanently modify their course content in order to address the Assessment Committee ratings on these two indicators and their respective outcomes, BSIE Outcomes h and j.

c. Use of assessment results to improve BSIE curriculum

In Subsection b above, actions taken by individual faculty members to modify their respective courses in reaction to assessment results were described. An example is the increased emphasis on TQM first incorporated into IE 425 in Fall 2000. In other instances, the faculty member may take those corrective actions he deems feasible, but at the same time the BSIE Curriculum Committee decides to work on a permanent, more effective change based on discussions with the course coordinator and its independent review of all assessment data available to it. As shown in Figure 2, the Curriculum Committee then develops a proposal, presents it to the faculty and the department head, and based on these discussions either modifies the proposal or proceeds directly to revise the curriculum. The faculty review the proposal from a pedagogical perspective, considering how well it "fits" within the part of the curriculum that is not changing and what the overall educational impacts are likely to be. The department head reviews the changes from the perspective of the timing of the change, communicating it to constituencies (especially those students affected), and resource implications. Note in Figure 2 that whether the corrective actions were implemented as course changes by the course coordinator, or as curriculum changes by the faculty and department head, the success of these actions is determined by on-going annual monitoring of outcomes indicators.

Figure 2. Process used to react to a "red" indicator score. [Flowchart: a "red" outcome indicator triggers Course Coordinator actions, which are reviewed at the next annual outcomes assessment. If those actions were enough to return the indicator to green, the indicators are monitored for the benefits expected; if not, the Curriculum Committee develops a proposal, obtains faculty and department head approval, revises the curriculum, and then monitors the indicators for the benefits expected.]
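The monitoring step in Figure 2, and the time graphs mentioned later in the paper, amount to plotting each indicator's stored values against its faculty-defined standard. The sketch below does this with matplotlib as a stand-in for the department's Excel macros; the data points and file handling shown are illustrative assumptions, not actual departmental data.

```python
import matplotlib.pyplot as plt

def plot_indicator_trend(periods, values, standard, title):
    """Plot an indicator's group-average history against its standard line."""
    fig, ax = plt.subplots()
    ax.plot(periods, values, marker="o", label="group average")
    ax.axhline(standard, linestyle="--", label=f"standard = {standard}")
    ax.set_title(title)
    ax.set_ylabel("average rating")
    ax.legend()
    fig.savefig(title.replace(" ", "_") + ".png")  # archive the chart alongside the data
    plt.close(fig)

# Hypothetical history for indicator j.1 (senior exit interview, 1-5 scale, standard 4.0).
plot_indicator_trend(
    periods=["Sp99", "Fa99", "Sp00", "Fa00"],
    values=[3.0, 3.8, 3.0, 3.9],     # illustrative values only
    standard=4.0,
    title="j.1 Knowledge of contemporary issues",
)
```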

In the AY 2000-2001 Assessment Memoranda, only three of the 91 indicators were assessed in a "red" condition, requiring immediate action. They were:

n.3 Manufacturing Processes: Our seniors who took the FE exam scored below the national average on 4 of the 5 exams reported, and about 20 points below the national average the past two Springs.

j.1 Knowledge of contemporary issues in technology and society: Senior exit interviews since May 1999 produced an average rating of 3.7, below the 4.0 standard. In two semesters the average was only 3.0, which means "adequate."

g.7 Average score on the writing portion of IE 461 Systems Simulation (a core curriculum "W" course): The average score reported in Spring 2000 was only 68%, far below the departmental standard of 80%.

The Curriculum Committee, in a memorandum to the faculty on February 5, 2001, recommended the following courses of action be considered for implementation in Fall 2001 in direct response to these red indicators:

n.3 Adopt one of the following two alternatives: (a) Require IE students to take Materials Processing, MTE 343, as a "fixed" IE elective, and keep IE 321 as a manufacturing systems/facilities planning course, or (b) Transform IE 321 into a manufacturing systems/processes course, and make IE 460 Facility Planning and Design a required course in the curriculum.

j.1 Adopt both of the following: (a) The IE 463 instructor should immediately increase the readings, discussions, and reports on "contemporary issues in technology and society," and (b) start a combined effort with College of Arts and Sciences faculty to find or develop a course, or course sequence, covering important contemporary issues such as professional ethics, organizational behavior, occupational regulations, etc., that our students can take as required humanities.

g.7 Replace Technical English (EH 319) with a new Engineering College course (GES XXX) on professional writing and presentation techniques. This course would become a prerequisite to IE 461, where the writing difficulties were detected. It should focus on skills, to be developed through practice, evaluation, and revision.

The Curriculum Committee noted the modification made by the IE 463 course coordinator to include more sessions on "contemporary issues in technology and society" in the Fall 2000 section. If an appropriate text could be found, even more emphasis on this topic was expected in the Fall 2001 section. However, the Curriculum Committee also explored the possibility of an Arts and Sciences humanities course that would address "technology and society" and could become a requirement for all BSIE majors at some time in the future. As for the g.7 red indicator on the writing portion of IE 461 Systems Simulation, the faculty noted that even though this indicator was below the 80% standard, technical report writing in the curriculum continues into the senior year with two individual design project reports in IE 463 (with corresponding oral presentations to the client) and one team design project report in IE 485 (with corresponding oral presentations to the client). It was noted that the overall performance of students in writing technical reports and preparing/delivering oral presentations improves throughout the senior year, and that the writing-related indicators from the senior year (q.1, q.2, and q.3) and the oral presentation indicators from the senior year (g.7, p.1, p.2, p.3, p.4) did not show cause for concern.

On-going Use of the UA IE Outcomes Assessment System

The annual cycle of assessment and evaluation of outcomes has continued for seven years. As described above, annual Outcomes Assessment Memoranda have been formally prepared by the IE Outcomes Assessment Committee and forwarded to the appropriate committees, individual faculty members, and the department head. Actions taken by any of these recipients based on the memoranda were documented, and their impacts monitored and commented upon in subsequent assessment memoranda. Applying the red-yellow-green rating system within the Assessment Committee has enabled the department to focus on red indicators for immediate corrective action, and on yellow indicators for preventive action. A separate Excel-based system to visually display the red-yellow-green assessments within each outcome, within each objective, and across time was developed and implemented in 2005. This system was used to provide annual status reports to faculty, and will be used with the IE program visitor in Fall 2007. A major modification of one set of indicators (the FE exam results for the afternoon IE topics) was required when topics on that exam were combined, which meant reducing and renaming indicators and essentially starting over except for the engineering economy scores. A minor modification of certain indicators was made that same year based on a revision in the curriculum that combined our work measurement and human factors three-hour courses into one four-hour course. We believe the indicators we chose are a good mix of direct and indirect measures, and they have served our need to triangulate quite well. Because we require the FE exam of all seniors, we discontinued the departmental exam.

Conclusion

This paper demonstrates how to apply the industrial QC concepts of measurement, evaluation, and corrective action to create a system that responds to ABET EC 2000 Criterion 3. By linking indicators to outcomes, and outcomes to objectives, a variety of evaluations are possible. The case study demonstrates the value of choosing measurement instruments that generate data appropriate for the outcomes chosen, and of archiving those data electronically. Simple Excel macros can be used to produce time graphs which show indicator data trends and performance against faculty-defined standards. The use of an Outcomes Assessment Committee (or Officer), independent of those who must take corrective action, is an innovation we recommend and one which parallels industrial QC practice.

References

1. ABET, Inc., Criteria for Accrediting Engineering Programs, E1 2/9/06, ABET Engineering Accreditation Commission, Baltimore, MD, February 9, 2006.
2. Sarin, Sanjiv, "Quality Assurance in Engineering Education: A Comparison of EC-2000 and ISO-9000," Journal of Engineering Education, October 2000.
3. Team B Report (unpublished), ABET Workshop (sponsored by ASEE & NSF), Atlanta, GA, December 4-6, 1998.

Appendix: Course Titles for Courses Referenced in Table 4

MATH 125  Calculus I
MATH 126  Calculus II
MATH 227  Calculus III
MATH 238  Differential Equations
PH 105    Physics I
PH 106    Physics II
CH 131    Chemistry for Engineers I
ESM 201   Statics
ESM 250   Mechanics of Materials
ESM 264   Dynamics
ME 215    Thermodynamics
ECE 320   Electrical Circuits
MTE 271   Engineering Materials
EC 110    Microeconomics
EC 111    Macroeconomics
PY 101    Introduction to Psychology
IE 253    Work Measurement and Design
IE 351    Human Factors Engineering
IE 321    Manufacturing Processes
IE 473    Production Planning and Control
GES 126   Computer Programming
GES 255   Engineering Statistics I
GES 257   Engineering Statistics II
DR 125    Engineering Graphics
IE 203    Engineering Economy
IE 363    Operations Research I
IE 364    Operations Research II
IE 431    Systems Simulation
IE 464    Information Systems Design
IE 475    Statistical Quality Control
EH 319    Technical Writing
IE 463    Systems Design I (individual projects)
IE 485    Systems Design II (team projects)
