Regression-Based Statistical Process Control


ABSTRACT

An algorithmic technique for using product measurements to control processes is described, along with the results of simulations that indicate its superiority over commonly used alternatives. The procedures can also provide meaningful and diagnostic alarms in situations for which conventional control limits would have little or no utility.

Glenn J. Battaglia
AMP Incorporated

INTRODUCTION

It is common practice to take samples of products periodically from industrial processes that can be adjusted by manipulating process variables. An electroplating strip line, where the plating thickness is measured periodically and can be controlled by changes in plating current, is a good example of such a process. After measuring a product, operators of such processes must decide whether and how much to adjust a process variable and whether they should look for and correct an "assignable cause" that may be disturbing the process. Conventional statistical process control (SPC) advice is to take no action until a control limit is exceeded, at which time the assignable cause should be found and corrected. When a process is not naturally centered on target or does not consistently operate in statistical control, this conventional approach can be quite suboptimal; in such cases, operators need better algorithms for knowing (1) whether to adjust process variables, (2) how much control action to take when it is called for, and (3) when the process is operating atypically and should be investigated immediately. Often, as in the case of gold plating, these decisions are important and somewhat complex.

There are many reasons for processes failing to operate in statistical control. Using electroplating as an example, several obvious causes include:

• Concentrations in baths, some of which cannot be measured continuously on-line, drift with time.
• Guides which position parts in plating cells may be jarred slightly, may be set improperly, or may wear with time.
• When a run of a given product is started after the equipment has been used on many other products, conditions differ somewhat from the last run of the product.
• Dimensions and composition of incoming stock vary.

Although all of these problems can theoretically be prevented, practical corrective measures may not be known, particularly when new, state-of-the-art products and processing technology are being introduced regularly to achieve greater cost-effectiveness. Because of the stigma placed upon processes that are not in statistical control, the need for such a technique is probably greater than many plants care to admit. Furthermore, it may be quite unclear whether a process is in statistical control and centered on target. A process that fails to meet those two criteria may run within conventional control limits for some time; and when a limit violation does occur, it may be interpreted as an incident in which an unidentified assignable cause upset the process or an improbable event occurred, rather than being taken as an indication that the process is badly centered or really does not conform to Shewhart's simple model.

MATHEMATICAL FRAMEWORK AND PROCESS MODEL

Before describing the recommended control and alarm procedures, it is necessary to explain the model on which they are based. Let

x = a product measurement or sample average,
z_j = the jth process variable to be included in a formula for predicting x, and
y = a formula prediction of x based upon measurable process variables, so that y = f(z_1, z_2, . . .).

The formula predictions will, of course, usually contain some error, caused either by systematic formula error or by superimposed noise that originates in variables not compensated by the formula. With a given formula, the best control action will depend upon the predictability of the next formula error, and that is automatically assessed by the proposed technique without needing to know the sources of the error. Consequently, the combined effect of the two general causes is considered as a single error variable, e, which is positive when y is high and is defined as

e_i = y_i − x_i

for the ith product measurement in a series of evenly spaced observations. Immediately after sample i has been taken, the variable e_i is the most recent formula prediction error. Discussion of the procedures is simplified if we define an additional variable, f, which is the error at the following sample and which will be observable only after sample (i + 1) has been taken. That is,

f_i = e_{i+1}
RECOMMENDED PROCEDURES

The Formula

Write a formula that relates the product variable of concern to pertinent measurable process variables, at least one of which can be adjusted. The formula may be very simple, or as complex as required. For example, in dealing with a machine that cuts lumber to width and that has a calibrated dial, the formula might be just: y = dial setting. In some other applications, the formula might be a complicated equation involving theoretical and empirical knowledge about chemical reactions. Although the technique compensates for consistent formula bias, if the formula is oversimplified some of its inaccuracies may appear as uncontrollable time-independent noise. Conversely, unnecessary complication is a disadvantage per se and may also increase both measurement and theoretical error.

Tuning Calculations

Tune an exponentially weighted moving average (EWMA) of the prediction errors, e_i. The EWMA is calculated as

w_i = L·e_i + (1 − L)·w_{i−1}

where w_i = the EWMA, which is considered to be a filter or an estimator of an underlying general error level at time i, L = a constant which is selected to minimize the mean square error (MSE) of hypothetical control actions using a procedure which is described later, and 0 < L ≤ 1. The above equation is somewhat different from the usual expression for an EWMA because it was written to emphasize that w_i is an estimator of an underlying formula error level at time i, not a predictor of the next formula error (at time i + 1). For a discussion of the EWMA, see Hunter, Reference 1.

Perform a linear regression of f upon w, using historic data. The regression will produce a line that gives optimum forecasts of the next formula error in the sense that it minimizes the squares of the errors of the forecasts for the data used in the regression. The regression can be updated quickly after every sample by remembering six running totals which are used in regression calculations.

Determining Control Action

When a new measurement is taken, update w. Then calculate F as

F = f̄ + b·(w_i − w̄)

where F = the regression estimate of f, the next formula error, b = the slope of the regression line, and f̄ and w̄ = the averages of the values of f and w, respectively, used in the regression. Then, given that there is no reason to believe the procedure is inapplicable at this point in time (this possibility is discussed below), adjust the manipulated variables so that the formula prediction of the next product measurement becomes

y_{i+1} = T + F

where T = the target value for the product variable. If the next error, at time i + 1, does in fact equal F, then

x_{i+1} = y_{i+1} − e_{i+1} = (T + F) − F = T

That is, the next observation will equal the target if the next error does equal the regression prediction of that error. Figures 1 and 2 may make it easier to visualize the mechanism by which the technique adjusts control action to the process. Both figures show regressions of f upon e,


which are the same as regressions of f upon w when L is set to 1.0.

Figure 1. Regression of f upon e for a random walk.

Figure 2. Regression of f upon e for statistical control.

The best strategy for the random walk of Figure 1 is to assume that the next error will equal the last one. The best strategy for the statistical control depicted in Figure 2 is to assume that the next error is independent of the last error. The technique reaches these conclusions empirically, with no attempt to construct a mathematical process model. Theoretically, y can be adjusted through any combination of the z's, but there may be compelling practical reasons for the selection of certain process variables for manipulation.

Detecting Abnormal Conditions

If there has been a fundamental change in the process, the formula may no longer be applicable and/or the historic regression of f upon w may no longer apply. In either case, adjusting y to equal T + F may not lead to the desired value of the product variable and there is a need to learn the cause, so it is better to investigate than to adjust blindly. There are two readily available sources of warnings that process behavior may have changed. The first source is the process itself. An operator may observe a malfunction, like the burning out of a heater, or a control system may signal that a limit on a process variable has been exceeded.
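The behavior shown in Figures 1 and 2 is easy to reproduce numerically. The sketch below (an illustrative simulation; the function and variable names are mine, not from the paper) regresses f upon e for a simulated random walk and for white noise, recovering a slope near 1.0 in the first case and near 0 in the second:

```python
import random

def slope_f_on_e(e):
    """Least-squares slope of f_i = e_{i+1} regressed upon e_i."""
    w, f = e[:-1], e[1:]
    n = len(w)
    sw, sf = sum(w), sum(f)
    swf = sum(wi * fi for wi, fi in zip(w, f))
    sww = sum(wi * wi for wi in w)
    return (n * swf - sw * sf) / (n * sww - sw * sw)

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(2000)]

# Random walk: each error is the previous error plus a fresh disturbance
walk, level = [], 0.0
for d in noise:
    level += d
    walk.append(level)

print(slope_f_on_e(walk))   # close to 1: assume the next error equals the last
print(slope_f_on_e(noise))  # close to 0: assume the next error is independent
```

The fitted slope is exactly the quantity the technique uses to decide how much of the last error to carry forward into the next control action.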


The second source of warnings about process change is the statistics from the control procedure, which can provide three useful sets of control limits. Discriminating computer-generated alarms are becoming increasingly necessary in highly automated plants, where humans must be summoned automatically.

Control limits should be set on the process variable or variables that are manipulated to adjust the product variable y to the desired level. For example, in the electroplating of gold, increased current can be used to compensate for decreased gold concentration in a plating bath, but this practice will ultimately lead to an unsatisfactory deposit. A control chart on the thickness will not reveal that unacceptable plating quality is imminent, but control limits on amperage can quickly detect that possibility.

Control limits should also be maintained on the formula prediction error. If such limits are violated: 1) the formula may be inadequate, in which case it should be revised, and/or 2) one or more process variables that are not included in the formula may have changed, so an investigation should be instituted to find and correct the problem. The limit violation might simply be an unfortuitous combination of many small errors, but if the limits are set at four standard deviations it is more likely that a major change has occurred in only one or a few variables. Note that conventional SPC does not specify control limits for manipulated variables or for formula error.

Control charts on the product variable itself can also be effective when this technique is used, even if conventional SPC would be quite marginal without application of the recommended procedures. If a process is not in statistical control, traditional control limits can be violated so frequently that they are virtually meaningless. Measurements of a product variable from a process that is not in statistical control are likely to form an autocorrelated series when no control action is taken. When the regression-based control technique was applied to a wide variety of autoregressive integrated moving average (ARIMA) processes in simulations, it often removed the autocorrelation between samples, leaving product measurements that were random in time.
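As a sketch of how the four-sigma limits on the formula error might be computed (the numbers and function names here are hypothetical; the paper describes the limits but not an implementation):

```python
import statistics

# Historic formula prediction errors (made-up values for illustration)
history = [0.12, -0.05, 0.08, -0.11, 0.03, 0.07, -0.09, 0.02, -0.04, 0.06]

sigma = statistics.stdev(history)        # sample standard deviation of e
upper, lower = 4 * sigma, -4 * sigma     # four-sigma limits on the error

def check_error(e):
    """Return True if the formula error violates its control limits."""
    return not (lower <= e <= upper)

print(check_error(0.10))   # False: within limits, no alarm
print(check_error(0.80))   # True: far outside, investigate formula or process
```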

Control limits on the product variable should be set at the conventional 3 standard deviations from target; but the standard deviation should be estimated directly from the subgroup averages, not based upon the variance within subgroups. When the limits for the product variable are violated, there is reason to believe that control is not as good as it could be and that the process merits investigation. Other common indicators, such as the occurrence of eight consecutive values of the product variable above target, can also be useful with a series that is usually random in time.

Tuning the EWMA

The recommended EWMA tuning procedure optimizes the predictions of a regression based upon the EWMA and is not the usual method for tuning an EWMA, which optimizes only the predictions of the EWMA itself. Suppose a hypothetical adjustment is made to historic data so that y′_{i+1} (where the prime indicates a hypothetical value) equals T + F′_i, while e′_{i+1} remains at the value it had in the historic data. The hypothetically resulting value x′_{i+1} should then equal T + F′_i − e_{i+1}, and its deviation from target should equal F′_i − e_{i+1}. Tune the EWMA to minimize the sum of the squares of these hypothetical deviations from target, that is, to minimize Σ(F′_i − e_{i+1})².

More specifically, calculate e_i and f_i for all available data. Then perform the following calculations for all values of L from 0.2 to 1.0 by 0.1 increments:

Find all w_i corresponding to the available e_i.
Regress f upon the w.
Find Σ(F′_i − e_{i+1})², where e_{i+1} is taken from the original series and F′_i is found from the regression at this L.

Finally, select the value of L which gives the lowest value of Σ(F′_i − e_{i+1})², but beware of using low values of L unless many observations have been made. If the available data come from more than one run, go through the data run by run, finding Σ(F′_i − e_{i+1})² for each run, then select the value of L which produces the lowest sum of Σ(F′_i − e_{i+1})² over all the runs. This approach avoids the nonsensical procedure of extending an EWMA series from one run into another run that started months later.

Summary of Routine Algorithm

The basic processing that is routinely required after every sample is summarized below.

Obtain: the product variable (x_i) and the pertinent process variables (z_{i,1}, z_{i,2}, . . .).

Calculate: the formula prediction as y_i = f(z_{i,1}, z_{i,2}, . . .), the formula error as e_i = y_i − x_i, the EWMA as w_i = L·e_i + (1 − L)·w_{i−1}, and the regression forecast of the next formula error as F = f̄ + b·(w_i − w̄).

Compare: the product variable (x_i), the formula prediction error (e_i), and pertinent process variables (z_{i,j}) with limits and issue alarms if necessary.

Adjust: the manipulated variable(s) so that y_{i+1} = T + F, where T is the target value for x.

Update: the running totals used in regression calculations.

Update: f̄, w̄, and b in the regression equation.
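The routine just summarized can be expressed compactly. The following is an illustrative implementation based on the description in the text, not the AMP software; the class and method names are mine, and the regression of f upon w is maintained through the six running totals mentioned earlier:

```python
class RegressionSPC:
    """Sketch of the regression-based control loop described in the text."""

    def __init__(self, L=0.5, target=0.0):
        self.L = L               # EWMA tuning constant, 0 < L <= 1
        self.T = target          # target value for the product variable
        self.w = 0.0             # EWMA of the formula errors
        self.started = False
        # Six running totals for the regression of f upon w
        self.n = 0
        self.sw = self.sf = 0.0
        self.sww = self.sff = self.swf = 0.0   # sff supports a t statistic on b
        self.prev_w = None       # w_i waiting for its f_i = e_{i+1}

    def sample(self, y, x):
        """Process one sample; return the adjusted prediction y_{i+1} = T + F."""
        e = y - x                              # formula error e_i = y_i - x_i
        if self.prev_w is not None:            # the previous w now has its f
            f = e
            self.n += 1
            self.sw += self.prev_w; self.sf += f
            self.sww += self.prev_w ** 2; self.sff += f ** 2
            self.swf += self.prev_w * f
        # EWMA update: w_i = L*e_i + (1 - L)*w_{i-1}
        self.w = e if not self.started else self.L * e + (1 - self.L) * self.w
        self.started = True
        self.prev_w = self.w
        # Regression forecast of the next error: F = f_bar + b*(w_i - w_bar)
        if self.n >= 2 and self.n * self.sww - self.sw ** 2 > 0:
            b = (self.n * self.swf - self.sw * self.sf) / (self.n * self.sww - self.sw ** 2)
            F = self.sf / self.n + b * (self.w - self.sw / self.n)
        else:
            F = 0.0                            # no history yet: take no action
        return self.T + F                      # next formula prediction
```

With L = 1 the EWMA reduces to the last error, so on a steadily trending error series the forecast converges to "next error = last error plus the trend," while on random errors b stays near zero and the controller leaves the process alone.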

SIMULATIONS

The proposed technique was tested extensively in Monte Carlo simulations using ARIMA processes, which are mathematical process models that are used in time-series analysis and are thought to be suitable for modeling virtually any real-world process. For detailed information on ARIMA processes, see Box and Jenkins, Reference 2. Such simulations have several advantages. They can compare competing control strategies under identical conditions. They provide repeatable and statistically meaningful comparisons with large numbers of observations over an extremely wide variety of process types. They allow process variation and measurement error to be separated clearly so that the impact of each can be investigated and control policies can be evaluated on the basis of the actual, rather than the measured, product values. Finally, they allow deductions to be made about the suitability of the various strategies for various types of ARIMA processes.


In each simulation, values of e_i were generated as process noise by a specified ARIMA process and superimposed upon y_i to create values of x_i. All ARIMA processes were driven by disturbances that were normally distributed.

A normally distributed time-independent random variable could be added to the process noise to represent measurement error. The magnitude of this simulated measurement error was fixed by estimating the standard deviation of the process in a preliminary tuning run and by specifying the ratio of the standard deviation of the measurement error to the process standard deviation estimated from the tuning run. It was also possible to add to the process noise series a bias that was equal to a specified fraction of the range observed in the preliminary tuning run.

Three policies were employed for adjusting y. The mean square errors (MSEs) of the three policies relative to target were calculated and compared, using the "true" values of the product variable, rather than the values of x, to which measurement error had been added. The three policies compared were:

EC: Change y by an amount equal to the deviation of x from target.

MS: Modified Shewhart. Calculate S, an estimate of the standard deviation of the subgroup averages, directly from the subgroup averages and set limits at T ± 3S. Make no changes until out of control limits, then change y in the manner used for the EC policy. All subgroups consisted of just one observation, so the traditional method of setting limits was impossible. For a process in statistical control, however, the expected value of the limits is the same for both methods.

REG: The proposed regression-based system. Adjust y directly, which amounts to using a tacit formula y = z, and minimizes execution time. The introduction of a more complicated formula would have made no difference in the results.

Note that it was assumed that a theoretical formula was available for use by all three policies, whereas such a formula often is not employed in plants where a practice approximating either the EC or MS policy is used. Simply introducing a sound formula and providing automatic error-free calculations might provide substantial improvement in control, but that possibility is in no way reflected in the study results.

Study Results

Many such comparisons have been made, but only a major study, which involved considerable replication and which gave results consistent with the other simulations, will be detailed here. Comparative Monte Carlo "production" runs consisted of 250 consecutive samples, but they were preceded by "tuning" runs of 25 samples which generated the data for initial tuning of the EWMA and an initial regression. During the subsequent comparative production run, the regression was updated after every sample and the EWMA was retuned every 20 samples for the first 100 samples and less frequently thereafter.

Fourteen combinations of process, measurement noise, and bias were investigated. Results from 10 comparative runs of each type were averaged and are presented in Table 1, in which the headings have the following meanings:

M/P = the ratio of the standard deviation of simulated measurement error to the standard deviation of the process noise
Bias = the bias of the process noise, expressed as a fraction of the range of a preliminary tuning run
p = the autoregressive order of the ARIMA series
d = the differencing order of the ARIMA series
q = the moving average order of the ARIMA series
φ_i = the ith autoregressive coefficient
θ_1 = the first-order moving average coefficient
REG MSE = the MSE of regression-based control
MS MSE = the MSE of the MS policy
EC MSE = the MSE of the EC policy

Table 1. Results of Monte Carlo simulations.

A repetition of the above simulations with tuning runs of 150 observations gave similar results. Note from Table 1, in which the lowest MSE for each process type has been underlined, that the REG policy was often much better than the other policies and that it was never more than slightly worse. It is also significant that the regression-based policy often gave nearly optimum control. The disturbances in the ARIMA processes had a variance of 1, so the lowest MSE that could be expected was 1.

The first four processes in Table 1 are in statistical control in the sense of being completely random in time. The superior performance of the regression-based policy in the face of a superimposed measurement error is due primarily to the fact that the regression coefficient, b, stayed at or very close to zero, so that little or no action was ever taken, which is the optimum strategy for statistical control. The MS policy would occasionally violate a control limit, take a control action, and become lost; this tendency was particularly noticeable in the face of measurement noise or a biased process noise series. The EC policy always took action based on the random noise and exhibited the expected resulting increases in MSE.

Runs 5 and 6 were based on moderately autoregressive processes.

Run 7 was a random walk, for which the EC policy is the theoretical optimum; but one must know that the process is a random walk with no measurement error to take advantage of this fact. When measurement error was added to a random walk for run 8, control suffered under all strategies, but the REG policy became clearly superior.

Runs 9 through 12 are a mixture of more complex stationary processes, always with no measurement error. The EC policy did fairly well on these runs but was always worse than the REG policy, probably because the EC policy overcontrolled.

Run 13 was generated by a process which is sometimes called an IMA(1,1) process and in which the level of the process may be viewed as an EWMA of past observations. (See Nelson, Reference 3, pp. 62-63.) The EC policy did very well on this run and on run 7, both of which were nonstationary processes (no fixed mean or variance). Run 14 was a quite unstable process, showing that the EC policy does not always control a nonstationary process satisfactorily. Note how well the REG policy coped with this extreme process.

Numerous other runs were made with different ARIMA processes and different assumptions about measurement error and the bias of the noise series. As with the study reported above, the regression-based technique was never much worse than the other policies and was often much better. Even odd processes like negative autoregression, which completely defeated the EC and MS strategies, were handled well by the REG policy. Autocorrelation of the controlled variable produced by the REG policy was checked in a number of cases, using autocorrelation functions. It often appeared to be removed altogether and was substantially reduced in all the cases tested.

Control of New Processes

To address the problem of controlling a new process, a simulation program was written which allowed control to be started on a process in the absence of the historic data needed for a regression. If data exist for similar processes, experience with those processes can be used to estimate the variances of w and f and the regression slope, b. The program uses those values to set starting values of the six running totals used by the regression update routines. The EWMA tuning constant can also be estimated from similar processes. These capabilities have been incorporated into software written for use at AMP.

Even if data from similar processes are unavailable, it is not unreasonable to use rough starting estimates, such as setting b at the intermediate value of 0.5. If measurement error is known to be low, it may also be satisfactory to start with the EC procedure, which seldom does a really bad job unless measurement error is high.

In any event, the regression is automatically improved as the process runs. The rate at which the regression is modified as actual data are acquired can be controlled by assigning a hypothetical number of observations to the initial estimates. The larger the number assigned, the more slowly will initial estimates be modified.
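A miniature version of these comparisons can be run in a few lines. The sketch below (an illustrative reconstruction, not the study's program; names and the minimum-history threshold are mine) pits the EC policy against a minimal REG policy on an in-control process: EC roughly doubles the mean square error by reacting to pure noise, while REG learns a near-zero slope and largely leaves the process alone:

```python
import random

def run_policies(n=2000, seed=7):
    """Compare the EC policy with a minimal REG policy on in-control noise."""
    random.seed(seed)
    noise = [random.gauss(0, 1) for _ in range(n)]
    T = 0.0

    # EC: move the prediction by the full deviation of x from target
    y, sse_ec = T, 0.0
    for eps in noise:
        x = y - eps                 # so that e_i = y_i - x_i = eps
        sse_ec += (x - T) ** 2
        y -= (x - T)                # overreacts to pure noise

    # REG (L = 1, so w = e): forecast the next error from f regressed upon w
    y, sse_reg = T, 0.0
    nreg, sw, sf, sww, swf = 0, 0.0, 0.0, 0.0, 0.0
    prev_w = None
    for eps in noise:
        x = y - eps
        sse_reg += (x - T) ** 2
        e = y - x                   # observed formula error
        if prev_w is not None:
            nreg += 1; sw += prev_w; sf += e
            sww += prev_w ** 2; swf += prev_w * e
        w = e
        F = 0.0
        if nreg >= 10 and nreg * sww - sw ** 2 > 0:   # wait for some history
            b = (nreg * swf - sw * sf) / (nreg * sww - sw ** 2)
            F = sf / nreg + b * (w - sw / nreg)
        y = T + F                   # b stays near 0, so little action is taken
        prev_w = w
    return sse_ec / n, sse_reg / n

ec_mse, reg_mse = run_policies()
print(ec_mse, reg_mse)   # EC near 2, REG near 1 for in-control noise
```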

AMP Journal of Technology Vol. 3 November, 1993

The recommended technique synergistically utilizes the following existing knowledge:

1. theoretical and empirical process knowledge embodied in a formula relating process and product variables,
2. historic data from the process, and
3. statistical inference.

Simply introducing a sound formula and automatic calculation may provide an improvement for a number of processes. The procedures automatically distinguish between those processes which should be treated as if they are in statistical control and those which should not. This distinction is not always made accurately by conventional control charts, and conventional SPC prescribes no method for controlling processes which do not regularly operate in statistical control.

The recommended procedures are almost completely algorithmic and transparent to process operators and can be carried out by computer. They use the conventional subgroups (or single observations) which are now taken for SPC in many plants. The only additional effort that they require from operators is the input of pertinent process variables when automatic data acquisition is not employed.

The recommended procedures fill a gap between Shewhart SPC, which is useful only for processes in statistical control, and engineering process control, which is not directed toward process improvement through removal of causes of variation and which requires supporting personnel who are unavailable in many plants. In contrast, the proposed methodology is based on simple linear regression, which should be familiar to any qualified quality engineer.


LIMITATIONS AND CAVEATS

A clear correspondence that is not obscured by time delays must exist between process variables and product variables for the methodology to function properly. If, for example, the power to an electrical heater were used as a process variable in a formula, a change in the power might take effect on the product variable slowly as the affected process temperature increased toward a new equilibrium. The regression-based technique could perform well under such circumstances only if the new equilibrium temperature were reached quickly enough for its impact upon the product to be reflected fully in the next sample. The procedures are not a substitute for PID (proportional/integral/derivative) algorithms and other methodologies that are needed to handle process dynamics. Samples should be taken at approximately regular intervals.

The procedures have limited usefulness if the regression of f upon w is not statistically significant, whether that condition results from a chaotic process that has no underlying consistency or from a process in statistical control. However, the procedures do appropriately set regression coefficients at or near 0 under such circumstances, so that they discourage harmful tweaking. Furthermore, in a study of five electroplating processes, all showed considerable autocorrelation (prior to control action) and had regression coefficients between 0.57 and 0.72. All five gave high t statistics for the regression coefficient, indicating a high degree of process consistency.

Unless data on similar processes are applicable, control may be marginal until some history is available, e.g., for 20 subgroups, but this number is no greater than is required to set conventional control limits. Computer programs are required to perform all the calculation and data storage economically and conveniently. The programs are not overly difficult, and a prototype of such a system has been written for testing at AMP.
Successful application requires some precautions, most of which can be incorporated into or prompted by the computer system. In this regard, implementers should be aware that:

• Only statistically significant regressions should be used. In the simulations, b was kept at zero if the t statistic for b was between 2.0 and 1.5.
• The EWMA tuning constant should be set at low values only if a substantial number of observations are available, and should never be set below 0.2. Tuning constants of 1.0 never degraded control disastrously in the simulations.
• Both the tuning of the EWMA and the regressions should be updated automatically as more data become available and should periodically be recalculated using data from which very old data have been purged. Remember, processes do change!
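The first precaution can be computed from quantities the regression already maintains. This sketch is illustrative: a symmetric cutoff of |t| < 2.0 is used for simplicity rather than the exact thresholds of the paper's simulations, and the function name is mine. It returns the slope of f upon w, or zero when the slope is not statistically significant:

```python
import math

def significant_slope(w, f, t_cut=2.0):
    """Return the regression slope of f upon w, or 0.0 if the magnitude
    of its t statistic is below t_cut (not statistically significant)."""
    n = len(w)
    if n < 3:
        return 0.0
    w_bar = sum(w) / n
    f_bar = sum(f) / n
    sxx = sum((wi - w_bar) ** 2 for wi in w)
    sxy = sum((wi - w_bar) * (fi - f_bar) for wi, fi in zip(w, f))
    if sxx == 0:
        return 0.0
    b = sxy / sxx
    # Residual sum of squares and the standard error of b
    sse = sum((fi - f_bar - b * (wi - w_bar)) ** 2 for wi, fi in zip(w, f))
    se_b = math.sqrt(sse / (n - 2) / sxx) if sse > 0 else 0.0
    if se_b > 0 and abs(b / se_b) < t_cut:
        return 0.0          # treat the process as in statistical control
    return b
```

A perfectly linear relation passes the test and returns its slope, while an alternating series uncorrelated with w is zeroed out, which is exactly the "no harmful tweaking" behavior described above.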


• Data from atypical periods should not be used in the statistical calculations.
• Process variable measurements must be properly associated with product measurements. By the time an off-line product measurement is completed, the process variables may have changed substantially from those which affected the product sample.
• The formula probably should be revised if the control limits for e are violated more often than occasionally.
• Operators can observe many factors which are not available to or meaningful to a control algorithm and should either retain final judgment or be able to override the system.
• Relying exclusively upon the continual updating of the regression for adjusting to changes in the underlying relationship between process and product variables can result in unnecessarily slow adaptation to certain sudden changes. For example, when a given product is run again after many months, the old formula may be less accurate than it was. When an operator adds gold to a gold plating bath, the plating thickness will increase suddenly if other factors are held constant. Worthwhile improvement in control can be expected from modifications to the control strategy that assess and allow for changes in the bias of the formula due to periods of interruption and due to identifiable events for which the operator can supply estimates of impact.

Regression-based control gives "optimum" results in a limited sense. Control can be further improved by reducing underlying noise (the ultimate goal of SPC), by improving the formula, by increasing the sampling frequency, and possibly by using more complex techniques like curvilinear regressions and PID. The Monte Carlo simulations that were described earlier simply controlled a superimposed noise signal and did not try to cope with inaccuracies in the formula or interactions between noise and the formula.
The technique can adjust to a consistent formula bias, but it has no means of adjusting for formula errors whose magnitude depends upon the existing values of the process variables which are included in the formula. Consequently, it is worthwhile to develop an accurate formula. The EC and MS policies, of course, also would suffer from formula inaccuracies. The variables f and w do not conform to the assumptions used in conventional regression theory, but the familiar regression calculations give a least-squares fit to the data, which is all that is needed for good results.

SUMMARY

Many production processes offer an opportunity to control a product variable through manipulation of process variables. However, it is often unclear how much to adjust process variables on the basis of product measurements, or even whether such adjustments should be made at all. It is a related problem that Shewhart control charts reliably indicate when a process requires attention only if the process typically runs in strict statistical control. A regression-based control technique has been developed to address these two problems.

The proposed policy gave clearly superior control when it was compared to two common alternatives in extensive Monte Carlo simulations. The procedures also give meaningful alarms for autocorrelated processes and offer three sets of control limits, each of which points to different underlying problems.

The recommended procedures are automatic, algorithmic, self-tuning, and computerizable. They utilize sampling of the type that is now used for conventional statistical process control (SPC). The only additional effort that they require from operators is the input of pertinent process variable measurements when that data acquisition has not been automated.

The technique is based upon a user-supplied formula that relates a product variable of interest to selected process variables and which may be quite simple. Historic formula prediction errors are recorded. A linear regression is made of each formula prediction error upon the previous formula prediction error, or upon an exponentially weighted moving average (EWMA) of the errors. After each sample the regression is used to predict the formula error of the next sample and then process variables are adjusted accordingly. The method is applicable only to situations in which a change in any of the process variables used in the formula takes essentially complete effect before the next sample.

ABBREVIATIONS AND SYMBOLS

ARIMA: autoregressive integrated moving average
EWMA: exponentially weighted moving average
EC: a control policy in which process variables are modified enough to change a formula prediction by an amount equal to the most recent deviation of the product variable from its target
MS: a control policy in which the control action described for the EC policy is taken only when control limits are exceeded
M/P: the ratio of the standard deviation of simulated measurement error to the standard deviation of the process noise
PID: the proportional/integral/derivative control technique used in many analog controllers and in digital control patterned on that approach
REG: a control policy employing the procedures recommended in this paper
SPC: statistical process control

b = the slope of a regression line
d = the differencing order of an ARIMA series
e_i = the formula prediction error at time i, defined as e_i = y_i − x_i
e′_i = a hypothetical value of e_i used in tuning an EWMA to be used with the recommended control technique


f_i = the formula prediction error for sample (i + 1), defined as f_i = e_{i+1}
f̄ = the average of the values of f used in the regression of f upon w
F = the value of the formula prediction error, f, forecasted by the regression for the next sample
F′ = a hypothetical value of F used in tuning an EWMA
L = an EWMA tuning constant, often denoted by the Greek letter lambda
p = the autoregressive order of an ARIMA series
q = the moving average order of an ARIMA series
S = an estimate of a standard deviation based on a sample
T = the target level for a product variable
w_i = the EWMA of the formula prediction error at time i
w̄ = the average of the values of w used in a regression of f upon w
x_i = a product measurement, or a subgroup average of the measurements, resulting from product sample i
x′_i = a hypothetical value of x_i used in tuning an EWMA
y_i = a formula prediction of x_i at sample i, based upon measurable process variables (z) which affect x_i
y′_i = a hypothetical value of y_i used in tuning an EWMA
z_j = the jth process variable to be included in a formula for predicting x
φ_i = the ith autoregressive coefficient
θ_1 = the first-order moving average coefficient

REFERENCES

1. J.S. Hunter, "The Exponentially Weighted Moving Average," Journal of Quality Technology 18 (4), 203-210 (1986).
2. G.E.P. Box and G.M. Jenkins, Time Series Analysis: Forecasting and Control, revised ed. (Holden-Day, Inc., Oakland, CA, 1976).
3. C.R. Nelson, Applied Time Series Analysis for Managerial Forecasting (Holden-Day, Inc., San Francisco, CA, 1973).

Glenn J. Battaglia is a Project Manager in the Process Control Technology Department at AMP Incorporated in Harrisburg, Pennsylvania. Mr. Battaglia holds an S.B. in Chemical Engineering from the Massachusetts Institute of Technology and an M.B.A. from Harvard University. His industrial career, most of which has been spent at the Polaroid Corporation and at AMP Incorporated, includes work in research, manufacturing, and computer systems. Since joining AMP, his primary activity has been the application of statistics to control of processes.
