Continuous Process Performance Monitoring Using Nelson Rules

Rustom Mody, Executive Vice President, Science & Technology, Intas Biopharmaceuticals Ltd., India

Shaligram Rane, Sr. General Manager, Quality

Hitesh Patel, Asst. General Manager, Quality Assurance

Process monitoring is the method of collecting process data and statistically evaluating critical process parameters to identify unplanned and unforeseen process changes and shifts/drifts, in order to verify and demonstrate that the process is operating in a state of control. Process data are statistically evaluated using control charts by applying the Nelson rules. These rules are used as a tool to determine whether a measured variable is out of control, i.e. unpredictable rather than consistent, and to detect ‘out-of-control’ or ‘non-random’ conditions.

Regulatory agencies expect a system of process monitoring to detect unplanned departures from the process during manufacture and to identify future process improvement opportunities, in order to provide continual assurance that the process remains in a state of control (a validated state) during manufacture and thereby ensure the desired product quality.

Process monitoring is the method of collecting process data and statistically evaluating critical process parameters to identify unplanned and unforeseen process changes and shifts/drifts, in order to verify and demonstrate that the process is operating in a state of control. Process monitoring also helps to identify areas for continual improvement.

The following guidance related to process monitoring has been issued by various agencies.

The FDA process validation guidance (Stage 3) [1] states that:

An ongoing program to collect and analyse product and process data that relate to product quality must be established (§ 211.180(e)). The data collected should include relevant process trends and quality of incoming materials or components, in-process material, and finished products. The data should be statistically trended and reviewed by trained personnel. The information collected should verify that the quality attributes are being appropriately controlled throughout the process.

21 CFR 211.110(a) [2] states that:

To assure batch uniformity and integrity of drug products, written procedures shall be established and followed that describe the in-process controls, and tests, or examinations to be conducted on appropriate samples of in-process materials of each batch. Such control procedures shall be established to monitor the output and to validate the performance of those manufacturing processes that may be responsible for causing variability in the characteristics of in-process material and the drug product.

The GMP regulations at 21 CFR 211.110(b) also require the use of ‘suitable’ statistical procedures to evaluate drug variability.

The ICH Q8 guidance (2.3) [3] states that:

In order to provide flexibility for future process improvement, when describing the development of the manufacturing process, it is useful to describe measurement systems that allow monitoring of critical attributes or process end-points.

Collection of process monitoring data during the development of the manufacturing process can provide useful information to enhance process understanding.

The ICH Q10 guidance (3.2.1) [4] identifies monitoring as a key element of the pharmaceutical quality system and states that:

Pharmaceutical companies should plan and execute a system for the monitoring of process performance and product quality to ensure a state of control is maintained. An effective monitoring system provides assurance of the continued capability of processes and controls to meet product quality and to identify areas for continual improvement.

Continuous process monitoring

Good process design and development anticipates significant sources of variability in critical process parameters and establishes appropriate detection, control, and / or mitigation strategies, as well as appropriate alert and action limits. However, a process is likely to encounter sources of variation that were not previously detected or to which the process was not previously exposed.

There are two different approaches used in monitoring biopharmaceutical manufacturing processes. The first approach focuses on data from a single lot: the data associated with a particular lot are reviewed before that lot is released. A result beyond a specification limit, referred to as out of specification (OOS), would cause a non-conformance that would have to be accounted for as part of the lot disposition process. The second monitoring approach analyses process performance data across lots, looking for trends, i.e. to identify any out-of-trend (OOT) departure from the normal process.

Intas Biopharmaceuticals Limited (IBPL) has an ongoing programme to collect and analyse product and process data that relate to product and process quality, as required by various regulatory bodies. The data collected by IBPL include relevant process trends, the quality of incoming materials or components, in-process materials, and finished products. These data are statistically trended on an ongoing basis (live) and also annually in the form of the Annual Product Quality Review (APQR). The trends are reviewed by trained personnel to ensure that quality attributes are being appropriately controlled throughout the process, to identify variability in the process that may signal potential problems, and to identify areas for process improvement, as expected by regulatory agencies.

IBPL uses many tools and techniques to detect variation, characterise it, and determine the root cause. Review includes scrutiny of critical process parameters (as defined in the batch manufacturing record) as well as analysis of inter-batch variation as a part of a comprehensive continuous process verification program.


Data are statistically analysed using control charts. A control chart is a plot of a process parameter or product characteristic over time, which includes upper and lower control limits (UCL and LCL, respectively) and a centerline calculated from the process data. These control limits are derived statistically and provide bounds for the natural variability of the process. They are typically established at three standard deviations above and below an established process mean; individual points are evaluated against these limits using the Nelson rules.
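As a minimal sketch (using hypothetical yield figures, not actual process data), the centerline and control limits described above can be computed as follows:

```python
# Illustrative sketch: deriving control-chart limits from batch data.
# The yield values below are hypothetical, not actual IBPL process data.
import statistics

yields = [66.2, 68.1, 67.4, 65.9, 69.0, 66.8, 67.7, 68.3, 66.5, 67.1]

center = statistics.mean(yields)   # centerline (process mean)
sigma = statistics.stdev(yields)   # sample standard deviation
ucl = center + 3 * sigma           # upper control limit
lcl = center - 3 * sigma           # lower control limit

print(f"centerline = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
```

Each new batch result would then be plotted against these limits before the Nelson rules are applied.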

The Nelson rules were first published in the October 1984 issue of the Journal of Quality Technology in an article by Lloyd S. Nelson [5, 6].

These rules are used as a tool to determine whether a measured variable is out of control, i.e. unpredictable rather than consistent, and to detect ‘out-of-control’ or ‘non-random’ conditions.

The Nelson rules are applied to a control chart on which the magnitude of some variable is plotted against time. The rules are based around the mean value and the standard deviation of the samples.

The Nelson rules are based on dividing the control chart into eight segments using one, two, and three standard deviations above and below the centerline. These rules are very useful in process monitoring, particularly Nelson rules 1 through 4.
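Rules 1 through 4 can be sketched compactly, assuming the centerline and standard deviation are estimated from the plotted data itself (the function and variable names below are illustrative, not from a standard library):

```python
# Minimal sketch of Nelson rules 1-4. The centerline and sigma are
# estimated from the data itself; names here are illustrative only.
import statistics

def nelson_1_4(values):
    """Return {rule: [indices of points that complete a violation]}."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    hits = {1: [], 2: [], 3: [], 4: []}
    for i, v in enumerate(values):
        # Rule 1: one point beyond 3 standard deviations from the mean.
        if abs(v - mean) > 3 * sd:
            hits[1].append(i)
        # Rule 2: nine points in a row on the same side of the mean.
        if i >= 8:
            w = values[i - 8:i + 1]
            if all(x > mean for x in w) or all(x < mean for x in w):
                hits[2].append(i)
        # Rule 3: six points in a row steadily increasing or decreasing.
        if i >= 5:
            w = values[i - 5:i + 1]
            d = [b - a for a, b in zip(w, w[1:])]
            if all(x > 0 for x in d) or all(x < 0 for x in d):
                hits[3].append(i)
        # Rule 4: fourteen points in a row alternating up and down.
        if i >= 13:
            w = values[i - 13:i + 1]
            d = [b - a for a, b in zip(w, w[1:])]
            if all(d1 * d2 < 0 for d1, d2 in zip(d, d[1:])):
                hits[4].append(i)
    return hits

# Example: a steady upward drift trips rule 3 once six rising points accrue.
print(nelson_1_4([50, 51, 49, 50, 52, 55, 58, 61, 64, 67, 70])[3])
```

In routine monitoring, each rule hit would prompt an investigation of the flagged batches rather than an automatic rejection, since the rules detect non-random patterns, not specification failures.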

Control charts and the Nelson rules assist in determining whether observed variability is likely due to “chance” (an isolated incident) or the result of some (perhaps as yet unidentified) process shift or change.

Variation in the process could be due to common or special causes of variation. Common causes refer to the many unknown sources of variation that go into producing a natural variation that is predictable within limits. Special causes (also known as assignable cause) refer to the sources of variation that are not part of the natural variation and usually can be traced to an assignable cause.

Use of Nelson rules in monitoring process performance

The following are examples in which the Nelson rules helped in monitoring and identifying common or special causes of variation in the manufacturing process.


Example 1

While monitoring the step yield of a pegylation process for one of the products, a shift in the trend from batch no. 017 onwards was observed, as shown in Figure 1. The shift was traced to a change in the manufacturing lot of two critical raw materials used in the pegylation process. Both materials had been released by Quality Control, since their test values were within the defined release specification. The yield increased again as soon as fresh materials from different manufacturing lots were used (one material lot was changed from batch no. 40 and the second from batch no. 43). The cumulative impact on pegylation yield (%) was observed after batch no. 43.

As a CAPA, two additional tests were introduced for one of the critical materials, thereby achieving better control of the pegylation process.

The pegylation yield (%) of all batches remained above the set acceptance criterion of NLT 65%.

Example 2

Another example involved the step yield of an anion exchange chromatography step for one of the products.

In the above graph, batch no. 86 clearly breached Nelson rule 1, indicating that the process was grossly out of control. On investigation it was found that the column had been packed with new resin after batch no. 85.

The control chart clearly showed the impact of a change made in the process. The initial batches (72 to 85) had a lower % step yield (average 36.21%), which then increased significantly (average 51.31%) from batch no. 86 onwards due to the change of resin. This indicates that the column resin plays an important role in the consistent performance of the chromatography step. To better control the process, it was decided to define the frequency of resin replacement after observing the trend across a larger number of batches.

When interpreting the Nelson rules online, first with a limited number of batches and later with more batches (generally at the time of the annual product review), it is possible to observe different rules being breached for the same batch at different time points. This happens because the centre line, and its corresponding ±3 standard deviation limits, shift: with more batches the mean (centre line) changes, and with it the applicable Nelson rule. This is depicted in the example below.
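This effect can be sketched numerically. In the hypothetical data below, the same batch breaches rule 1 when judged against limits derived from a small early baseline, but not when the limits are recomputed from the full data set:

```python
# Hypothetical sketch of how added batches shift the centerline and
# change whether a Nelson rule is breached for the same batch.
import statistics

def breaches_rule_1(value, reference):
    """Rule 1: is `value` beyond 3 standard deviations of the reference mean?"""
    mean = statistics.mean(reference)
    sd = statistics.stdev(reference)
    return abs(value - mean) > 3 * sd

batches = [50.1, 50.0, 49.9, 50.2, 50.0,        # early baseline
           53.0, 52.5, 53.2, 52.8, 53.1, 52.6]  # process settles higher

# Online review: batch 6 judged against limits from the first five batches.
print(breaches_rule_1(batches[5], batches[:5]))  # rule 1 breached

# APQR review: the same batch judged against limits from all batches.
print(breaches_rule_1(batches[5], batches))      # no breach
```

The converse can also occur: a point unremarkable against wide early limits can violate a rule once additional batches narrow the estimated variability.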

Example 3

When interpreted with a limited number of batches, the chart shows a breach of Nelson rule 1; at the time of the APQR (with more data points), however, it shows no violation of the Nelson rules.

In another case the converse was observed: there was no violation during online monitoring (with a limited number of batches), but rule 1 was violated when the trend from more batches was plotted.

Summary

The Nelson rules are useful in detecting process-related problems. Except in the case of rule 1, the number of samples required before a rule is found to be breached ranges from 2 to 15.

Table-1: Nelson rules and statistically valid number of samples required.

Rule 1: one point beyond 3 standard deviations from the centre line (1 sample)
Rule 2: nine or more points in a row on the same side of the centre line (9 samples)
Rule 3: six points in a row steadily increasing or decreasing (6 samples)
Rule 4: fourteen points in a row alternating up and down (14 samples)
Rule 5: two out of three points in a row beyond 2 standard deviations, on the same side (2 to 3 samples)
Rule 6: four out of five points in a row beyond 1 standard deviation, on the same side (4 to 5 samples)
Rule 7: fifteen points in a row within 1 standard deviation of the centre line (15 samples)
Rule 8: eight points in a row beyond 1 standard deviation of the centre line, in either direction (8 samples)
Thus, by the time a problem is detected (after the statistically required number of observations has been made, as shown in Table-1), product from these batches may already have been released to the market, because the rule is not breached until the required number of manufacturing lots has been produced. Nevertheless, the rules help in identifying a problem, and with appropriate CAPA greater process consistency can be achieved.

The Nelson rules are very useful in monitoring and controlling a process and in providing opportunities for process improvement.

The Quality department at IBPL continuously trends process performance and provides feedback to the Production department based on the ongoing (live) trending. Both departments together evaluate the data, discuss possible trends or undesirable process variation, and decide on the CAPA that will help control the process, thereby ensuring the desired product quality.

Author Bio

Rustom Mody
Shaligram Rane
Hitesh Patel