
Biorelevant Dissolution: Methodology and Application in Drug Development

Dissolution testing plays an important role in several areas of drug-product development: as a quality-control tool to monitor batch-to-batch consistency of drug release from a dosage form, and as an in vitro surrogate for in vivo performance that can guide formulation development and ascertain the need for bioequivalence tests. The possibility of substituting dissolution tests for clinical studies has been opened up by the development of the Biopharmaceutics Classification System, and dissolution tests that can predict the in vivo performance of drug products (usually called "biorelevant" dissolution tests) could serve this purpose (1, 2). In terms of media and hydrodynamics, biorelevant dissolution testing should provide a baseline for drug and dosage-form performance; it can be used to guide formulation development, to identify food effects on the dissolution and bioavailability of orally administered drugs, and to identify solubility limitations and stability issues. The importance of developing predictive dissolution testing is heightened by the fact that the majority of drugs currently in development are poorly soluble and by the challenges posed by new dosage-form approaches.

For the complete article, click here

Film Coating Technology: An Overview

Abstract

Tablet coating is perhaps one of the oldest pharmaceutical processes still in existence. Sugar coating was a skilled manipulative operation that could last as long as five days and demanded a highly skilled operator. In the last 25 years tablet coating has undergone several fundamental changes: many modifications were advocated to improve the basic process, and film coating was chosen in place of sugar coating. The composition of the coating solution may affect the quality of the final coated tablets, and several coating process parameters also affect it, so it is necessary to optimize the coating process parameters for the particular equipment and the particular film former. Optimization of the composition of the film coating solution is also required.

Keywords: Film coating, Coating process parameters, Coating composition

Introduction

Every drug has its own characteristics: some drugs are bitter or have an unpleasant odour, some are sensitive to light or oxidation, and some are hygroscopic.1,2,3 Tablet coating is therefore the option of choice to solve such problems in conventional dosage forms.

In the past, sugar coating was mostly borrowed from the confectionery industry. Nowadays it has been replaced by film coating, because the sugar-coating process was a skilled manipulative operation that could last as long as five days and required a highly skilled operator. Hence film coating is preferred over sugar coating.

Tablet film coating is performed in two ways: aqueous film coating (generally with water as the solvent) and non-aqueous film coating (generally with organic solvents). Non-aqueous film coating raises problems such as employee safety (the solvents are dangerous, smell, and are not good to breathe) and atmospheric pollution, but the key problem is approval by the regulatory authority.4 A high-quality aqueous film coat must be smooth and uniform, adhere satisfactorily to the tablet surface, and ensure the chemical stability of the drug.
“Why is Tablet Coating Required?”1,2,3

A number of reasons can be suggested:

§ The core contains a material which has a bitter taste in the mouth or has an unpleasant odour.

§ Coating will protect the drug from the surroundings, improving its stability.

§ Coating will increase the ease by which a tablet can be ingested by the patient.

§ Coating improves mechanical integrity; coated products are more resistant to mishandling (abrasion, attrition, etc.).

§ The core contains a substance which is unstable in the presence of light or subject to atmospheric oxidation; a coating is added to improve stability.

§ The core alone is inelegant.

§ The active substance is coloured and migrates easily to stain hands and clothes.

§ Coated tablets are packed on high-speed packaging machines; coating reduces friction and increases the packaging rate.

§ Coating can modify the drug release profile, e.g., enteric coating, osmotic pump, pulsatile delivery.

Introduction to Film Coating Materials

A film coating is a thin polymer-based coat applied to a solid dosage form such as a tablet. The thickness of such a coating is usually between 20 and 100 µm. Under close inspection, the film structure is relatively non-homogeneous and quite distinct in appearance from a film formed by casting a polymer solution on a flat surface.5, 6

Film coating formulations usually contain the following components:


Polymer,

Plasticizer,

Colourants / Opacifiers,

Solvent / Vehicle.

Polymers


The vast majority of polymers used in film coating are cellulose derivatives or acrylic polymers and copolymers.5, 6, 7

Non-enteric polymers8


·Hypromellose

·Hydroxyethyl cellulose

·Hydroxyethylmethyl cellulose

·Carboxymethylcellulose sodium

·Hydroxypropyl cellulose

·Polyethylene glycol

·Ethylcellulose

Enteric polymers

Some examples of enteric coating polymers

·Hypromellose phthalate

·Polyvinyl acetate phthalate

·Cellulose acetate phthalate

·Polymethacrylates

·Shellac

Plasticizers


Plasticizers are relatively low-molecular-weight materials that can alter the physical properties of a polymer to render it more useful as a film-coating material.7, 8, 9 The generally accepted mechanism is that plasticizer molecules interpose themselves between individual polymer strands, breaking down polymer-polymer interactions and thus converting the polymer into a more pliable material. Plasticizers are classified into three groups: polyols, which include glycerol, propylene glycol, and PEG (polyethylene glycol); organic esters, which include phthalate esters, dibutyl sebacate, citrate esters, and triacetin; and oils/glycerides, which include castor oil, acetylated monoglycerides, and fractionated coconut oil.

Solvents/Vehicles


The key function of a solvent system is to dissolve or disperse the polymers and other additives. All major manufacturers of coating polymers provide basic physicochemical data on their polymers, and these data are usually helpful to the formulator.6

The major classes of solvents in use are:


·Water

·Alcohols

·Ketones

·Esters

·Chlorinated hydrocarbons

Because of environmental and economic considerations, water is the solvent of choice; however, organic-solvent coating cannot be avoided entirely.

Colourants / Opacifiers


These materials are generally used as ingredients in film-coating formulations to contribute to the visual appeal of the product, but they also improve the product in other ways7,8,9:

- Identification of the product by the manufacturer, acting as an aid to existing GMP procedures.

- Reinforcement of brand imaging and reduction in product counterfeiting.

- Easier identification of the product by patients.

Colourants for film coating also act, to a greater or lesser extent, as opacifiers and so protect the active ingredients from light. Colourants are classified into three groups: organic dyes and their lakes (e.g., sunset yellow, tartrazine, erythrosine), inorganic colours (e.g., iron oxide yellow, red, and black; titanium dioxide; talc), and natural colours (e.g., anthocyanins, riboflavin, carmine).

Miscellaneous coating solution components


To provide a dosage form with a particular characteristic, special materials may be incorporated into the coating solution.6

Flavours and sweeteners are added to mask unpleasant odours or to create the desired taste, for example aspartame, various fruit spirits (organic solvents), and water-soluble pineapple flavour (aqueous solvents).

Surfactants are added to solubilize immiscible or insoluble ingredients in the coating, for example Spans and Tweens.

Antioxidants are incorporated to stabilize a dye system against oxidation and colour change, for example oximes and phenols.

Antimicrobials are added to prevent microbial growth in the coating composition. Some aqueous cellulosic coating solutions are particularly prone to microbial growth, so long-term storage of the coating composition should be avoided. Examples include alkylisothiazolinones, carbamates, and benzothiazoles.

Coating Process


Film coating of tablets is a multivariate process in which many factors, such as the coating equipment, the coating liquid, and the process parameters, affect the pharmaceutical quality of the final product.10-13

Coating equipment14


Until a few years ago, different types of coating pan were used, such as conventional coating pans, the Manesty Accelacota, the Driam (Driacoater), and butterfly coaters. Nowadays the side-vented, perforated pan coater is the most commonly used tablet-coating device. Equipment variables such as the spray nozzle type, the number of spray nozzles, and the pan size may also affect the quality of the final product. The air flow through a perforated pan ensures rapid and continuous drying conditions, and the low evaporation capacity of water demands high drying efficiency from aqueous film-coating equipment.

Coating liquid


The coating liquid may affect the final quality of the tablets. Different film formers have different chemical natures and characteristics. Viscosity affects the spreading of the coating liquid across the substrate surface, surface tension affects the wetting of the surface, and the percentage of solids generally affects the tablet surface and the coating efficiency.15

Process parameters


Spray rate


The spray rate is a significant parameter since it affects the moisture content of the forming coating and, subsequently, the quality and uniformity of the film.16,17,18 A low spray rate causes incomplete coalescence of the polymer due to insufficient wetting, which can result in brittle films.16 A high spray rate may overwet the tablet surface and cause problems such as picking and sticking.16,17 If the spray rate is high and the tablet surface temperature is low, films form not during spraying but during the post-drying phase, and rapid drying often produces cracks in the films.16

Atomizing air pressure


In general, increasing the spraying air pressure decreases the surface roughness of coated tablets and produces denser, thinner films.19,20,21 If the spraying air pressure is excessive, the spray loss is great and the droplets formed are very fine and may spray-dry before reaching the tablet bed, resulting in inadequate droplet spreading and coalescence.21 If the spraying air pressure is inadequate, the film thickness and thickness variation are greater, possibly owing to changes in film density and lower spray loss. In addition, at low spraying air pressure, large droplets can locally overwet the tablet surface and cause tablets to stick to each other.

Inlet air temperature


The inlet air temperature affects the drying efficiency (i.e., water evaporation) of the coating pan and the uniformity of the coatings.19 A high inlet air temperature increases the drying efficiency of the aqueous film coating process, and the resulting decrease in water penetration into the tablet core reduces the core porosity, tensile strength, and residual moisture content of the coated tablets.19,22 An excessive inlet air temperature causes premature drying of the spray during application and consequently decreases the coating efficiency.18,23 Measuring the pan air temperature helps to maintain optimum conditions during the coating process and thus to predict possible drying or overwetting problems, which may result in poor film appearance or have unfavorable effects on moisture- and heat-sensitive tablet cores.24

Rotating speed of pan


It is well documented that increasing the rotating speed of the pan improves the mixing of the tablets.23-27 The pan speed affects the time the tablets spend in the spraying zone and, subsequently, the homogeneous distribution of the coating solution over the surface of each tablet throughout the batch. Increasing the pan speed decreases the thickness variation and increases the uniformity of the coatings.16,23,26 An excessive pan speed causes unnecessary attrition and breakage of the tablets.

Conclusion


Film coating technology is nowadays very important in pharmacy, particularly in formulation development. Process parameters and coating composition play an important role in tablet coating, so to obtain a good-quality coated tablet it is necessary to optimize these parameters.

References


1. Cole G. Pharmaceutical Coating Technology, Taylor and Francis Ltd, 1998; 1-5.

2. Porter C. Coating of Pharmaceutical Solid-dosage Forms, Pharm. Tech., 1980, 4(3), 66.

3. Lieberman H, Lachman L. Pharmaceutical Dosage Forms: Tablets, Vol. I to III, Marcel Dekker Inc., N.Y., 85-143.

4. Hinkes T. Solvent Film Coating: Aqueous vs. Organic, Wisconsin Alumni Research Foundation, Madison, Wisconsin.

5. Nyamweya N, Mehta K. Film Coating with Aqueous Latex Dispersions, Pharmaceutical Technology Yearbook 2001.

6. Hogan J. Pharmaceutical Coating Technology, Taylor and Francis Ltd, 1998; 6-52.

7. Rowe R, Sheskey P, Owen S. Handbook of Pharmaceutical Excipients.

8. Martini L, Ford J, Roberts M. The use of hypromellose in oral drug delivery, J. Pharm. Pharmacol. 2005, 57, 533-546.

9. Porter S. Coating of Pharmaceutical Dosage Forms, Remington: The Science and Practice of Pharmacy, Vol. 1, Ch. 46; 894-902.

10. Rowe R. The effect of some formulation and process variables on the surface roughness of film-coated tablets. J. Pharm. Pharmacol. 1978, 30, 669-672.

11. Elaine S, Celine V, Dawn Z, Xiaohua L, Anthony J, Paul W. Study of Coat Quality of Tablets Coated by an On-line Supercell Coater, AAPS PharmSciTech 2007, 8(3), Article 63.

12. Tobiska S, Kleinebudde P. Coating uniformity and coating efficiency in a Bohle Lab-Coater using oval tablets, European Journal of Pharmaceutics and Biopharmaceutics 2003, 56, 3-9.

13. Philip A, Rowe R, York P, Doherty C. The effect of experimental design on the modeling of a tablet coating formulation using artificial neural networks, European Journal of Pharmaceutical Sciences 2002, 16, 281-288.

14. Heinamaki J, Ruotsalainen M, Lehtola V, Antikainen O, Yliruusi J. Optimization of Aqueous-Based Film Coating of Tablets Performed by a Side-Vented Pan-Coating System. Pharmaceutical Development and Technology 1997, 2(4), 357-364.

15. Optimal Coating Process Parameters for a New, Fully-Formulated, Acrylic-based, Enteric, Film-Coating System, Poster Reprint, American Association of Pharmaceutical Scientists, November 2000.

16. Obara S, McGinity J. Influence of processing variables on the properties of free films prepared from aqueous polymeric dispersions by a spray technique. Int. J. Pharm. 1995, 126, 1-10.

17. Franz R, Doonan G. Measuring the surface temperature of tablet beds using infrared thermometry. Pharm. Technol. 1983, 7, 55-67.

18. Porter S, Verseput R, Cunningham C. Process optimization using design of experiments. Pharm. Technol. 1997, 21, 60-70.

19. Twitchell A, Hogan J, Aulton M. The behaviour of film coating droplets on impingement onto uncoated and coated tablets. S.T.P. Pharm. Sci. 1995, 5, 190-195.

20. Twitchell A, Hogan J, Aulton M. Assessment of the thickness variation and surface roughness of aqueous film coated tablets using a light-section microscope. Drug Dev. Ind. Pharm. 1995, 21, 1611-1619.

21. Tobiska S, Kleinebudde P. Coating Uniformity: Influence of atomizing air pressure. Pharm. Dev. Tech. 2003, 8, 39-46.

22. Poukavoos N, Peck G. Effect of aqueous film coating conditions on water removal efficiency and physical properties of coated tablet cores containing superdisintegrants. Drug Dev. Ind. Pharm. 1994, 20, 1535-1554.

23. Rege B, Gawel J, Kou H. Identification of critical process variables for coating actives onto tablets via statistically designed experiments, International Journal of Pharmaceutics 2002, 237, 87-94.

24. Okutgen E, Jordan M, Hogan J, Aulton M. Effects of tablet core dimensional instability on the generation of internal stresses within film coats. Part II: Temperature and relative humidity variation within a tablet bed during aqueous film coating in an Accela-Cota. Drug Dev. Ind. Pharm. 1991, 17, 1191-1199.

25. Tobiska S, Kleinebudde P. A simple method for evaluating the mixing efficiency of a new type of pan coater. Int. J. Pharm. 2001, 224, 141-149.

26. Wilson K, Crossman E. The influence of tablet shape and pan speed on intra-tablet film coating uniformity. Drug Dev. Ind. Pharm. 1997, 23, 1239-1243.

27. Skultety P, Rivera D, Dunleavy J, Lin C. Quantification of the amount and uniformity of aqueous film coating applied to tablets in a 24" Accela-Cota. Drug Dev. Ind. Pharm. 1988, 14, 617-631.

Authors:

Anand Shah, Navin Sheth, Sunny Shah, Ashok Suthar, Sanjay Patel, and Arvind Desai

Dry Granulation and Compression of Spray-Dried Plant Extracts

The purpose of this research was to evaluate the influence of dry granulation parameters on granule and tablet properties of spray-dried extract (SDE) from Maytenus ilicifolia, which is widely used in Brazil in the treatment of gastric disorders. The compressional behavior of the SDE and granules of the SDE was characterized by Heckel plots. The tablet properties of powders, granules, and formulations containing a high extract dose were compared. The SDE was blended with 2% magnesium stearate and 1% colloidal silicon dioxide and compacted to produce granules after slugging or roll compaction. The influences of the granulation process and the roll compaction force on the technological properties of the granules were studied. The flowability and density of spray-dried particles were improved after granulation. Tablets produced by direct compression of granules showed lower crushing strength than those obtained from nongranulated material. The compressional analysis by Heckel plots revealed that the SDE undergoes plastic deformation with a very low tendency to rearrangement at an early stage of compression. On the other hand, the granules showed an intensive rearrangement as a consequence of fragmentation and rebounding. However, when the compaction pressure was increased, the granules showed plastic deformation. The mean yield pressure values showed that both granulation techniques and the roll compaction force were able to reduce the material’s ability to undergo plastic deformation. Finally, the tablet containing a high dose of granules showed a close dependence between crushing strength and the densification degree of the granules (i.e., roll compaction force).

View Full Article

Author(s):
Luiz Alberto Lira Soares, George González Ortega, Pedro Ros Petrovick, Peter Christian Schmidt
Journal:
American Association of Pharmaceutical Scientists.
Copyright:
© All Rights Reserved. ISSN 1550-7416

Process Validation: How Much to Do and When to Do It

The trick to process validation, these industry experts argue, is to understand that it is a process that stretches through the whole product life cycle. Some secrets of success: take a team approach; focus on the timing of the various stages of validation; avoid some common mistakes; and build your documentation as you go.

For the full article, click here

Author(s):
Anurag S. Rathore, Joseph F. Noferi, and Edward R. Arling from Pharmacia Corporation; Gail Sofer, Bioreliance; Peter Watler, Amgen, Inc.; and Rhona O'Leary, Genentech, Inc.
Journal:
BioPharm International, October 2002
Copyright:
BioPharm International

Methods and Tools for Process Validation

ABSTRACT
by : Dr. Wayne A. Taylor

There are many statistical tools that can be used as part of validation. Control charts, capability studies, designed experiments, tolerance analysis, robust design methods, failure modes and effects analysis, sampling plans, and mistake proofing are but a few. Each of these tools will be summarized and their application in validation described.



1. INTRODUCTION

Validation requires documented evidence that a process consistently conforms to requirements. It requires that you first obtain a process that can consistently conform to requirements and then that you run studies demonstrating that this is the case. Statistical tools can aid in both tasks.



2. USES OF THE TOOLS

This section describes the many contributions that statistical tools can make to validation. Each tool appearing in bold is further described in Section 4.

One tool that is particularly useful in organizing the overall validation effort is a failure modes and effects analysis (FMEA) or the closely related fault tree analysis (FTA). An FMEA involves listing the potential problems or failure modes and evaluating their risk in terms of severity, likelihood of occurrence, and ease of detection. Where potential risks exist, the FMEA can be used to document which failure modes have been addressed and which still need to be addressed. As each failure mode is addressed, the controls established are documented. The end result is a control plan. Addressing the individual failure modes will require the use of many different statistical tools.

Failures or nonconformities occur because of errors made and because of excessive variation. Obtaining a process that consistently conforms to requirements requires a balanced approach using both mistake proofing and variation reduction tools. When a nonconformance occurs because of an error, mistake proofing methods should be used. Mistake proofing attempts to make it impossible for the error to occur or at least to go undetected.

However, many nonconformities are not the result of errors; instead, they are the result of excessive variation and off-target processes. Reducing variation and properly targeting a process require identifying the key input variables and establishing controls on these inputs to ensure that the outputs conform to requirements. Strategies and tools for reducing variation and optimizing the process average are described in Section 3.

The end result is a control plan. The final phase of validation requires demonstrating that this control plan works, i.e., that it results in a process that can consistently conform to requirements. One key tool here is a capability study. A capability study measures the ability of the process to consistently meet the specifications. It is appropriate for measurable characteristics where nonconformities are due to variation and off-target conditions. Testing should be performed not only at nominal, but also under worst-case conditions. When pass/fail data is involved, acceptance sampling plans can be used to demonstrate conformance to specifications. Finally, in the event of potential errors, challenge tests should be performed to demonstrate that mistake proofing methods designed to detect or prevent such errors are working.

Depending on circumstances, not all tools need be used, other tools could be used instead, and the application of the tools can vary.



3. STRATEGIES AND TOOLS FOR REDUCING VARIATION AND OPTIMIZATION

Each unit of product differs to some small degree from all other units of product. These differences, no matter how small, are referred to as variation. Variation can be characterized by measuring a sample of the product and drawing a histogram. For example, one operation involves cutting wire into 100 cm lengths. The tolerance is 100 ± 5 cm. A sample of 12 wires is selected at random and the following results obtained:

98.7 99.3 100.4 97.6 101.4 102.0

100.2 96.4 103.4 102.0 98.0 100.5

A histogram of this data follows. The width of the histogram represents the variation.


Histogram of Lengths

Of special interest is whether the histogram is properly centered and whether the histogram is narrow enough to easily fit within the specification limits. The center of the histogram is estimated by calculating the average of the 12 readings. The average is 99.99. The width of the histogram is estimated by calculating either the range or standard deviation. The range of the above readings is 7.0 cm. The standard deviation is 2.06 cm. The standard deviation represents the typical distance a unit is from the average. Approximately half of the units are within ± 1 standard deviation of the average and about half of the units are more than one standard deviation away from the average. On the other hand, the range represents an interval containing all the units. The range is typically 3 to 6 times the standard deviation, depending on the sample size.
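The three summary statistics quoted above can be reproduced with a few lines of Python using only the standard library:

```python
# Descriptive statistics for the 12 wire-length readings given in the text.
import statistics

lengths = [98.7, 99.3, 100.4, 97.6, 101.4, 102.0,
           100.2, 96.4, 103.4, 102.0, 98.0, 100.5]

average = statistics.mean(lengths)        # center of the histogram
rng = max(lengths) - min(lengths)         # width of the histogram: range
stdev = statistics.stdev(lengths)         # width: sample standard deviation

print(f"average = {average:.2f} cm")      # 99.99 cm
print(f"range   = {rng:.1f} cm")          # 7.0 cm
print(f"std dev = {stdev:.2f} cm")        # 2.06 cm
```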

Frequently, histograms take on a bell-shaped appearance referred to as the normal curve, as shown below. For the normal curve, 99.73% of the units fall within ± 3 standard deviations of the average.
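The 99.73% figure follows directly from the normal distribution and can be checked with the standard library:

```python
# Fraction of a normal population within +/- 3 standard deviations of the mean.
from statistics import NormalDist

frac = NormalDist().cdf(3) - NormalDist().cdf(-3)
print(f"{frac:.2%}")   # 99.73%
```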

For measurable characteristics like wire length, fill volume, and seal strength, the goal is to optimize the average and reduce the variation. Optimization of the average may mean to center the process as in the case of fill volumes, to maximize the average as is the case with seal strengths, or to minimize the average as is the case with harmful emissions. In all cases, variation reduction is also required to ensure all units are within specifications. Reducing variation requires the achievement of stable and capable processes. The figure below shows an unstable process. The process is constantly changing. The average shifts up and down. The variation increases and decreases. The total variation increases due to the shifting.

Instead, stable processes are desired as shown below. Stable processes produce a consistent level of performance. The total variation is reduced. The process is more predictable.


However, stability is not the only thing required. Once a consistent performance has been achieved, the remaining variation must be made to safely fit within the specification limits. Such a process is said to be stable and capable. Such a process can be relied on to consistently produce good product.


A capability study is used to determine whether a process is stable and capable. It involves collecting samples over a period of time. The average and standard deviation of each time period is estimated and these estimates plotted in the form of a control chart. These control charts are used to determine if the process is stable. If it is, the data can be combined into a single histogram to determine its capability. To help determine if the process is capable, several capability indices are used to measure how well the histogram fits within the specification limits. One index, called Cp, is used to evaluate the variation. Another index, Cpk, is used to also evaluate the centering of the process. Together these two indices are used to decide whether the process passes. The values required to pass depend on the severity of the defect (major, minor, critical).
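As a concrete illustration, the two indices can be computed for the wire-cutting example above (specification 100 ± 5 cm). The sample average and standard deviation are used here as stand-ins for the long-term process parameters; the pass/fail thresholds are omitted because, as the text notes, they depend on defect severity:

```python
# Capability indices for the wire-cutting example: specification 100 +/- 5 cm.
# mean and sigma are the sample estimates from the 12 readings in the text.
LSL, USL = 95.0, 105.0
mean, sigma = 99.99, 2.06

cp = (USL - LSL) / (6 * sigma)                    # evaluates variation only
cpk = min(USL - mean, mean - LSL) / (3 * sigma)   # also evaluates centering

print(f"Cp  = {cp:.2f}")    # 0.81
print(f"Cpk = {cpk:.2f}")   # 0.81
```

Cp and Cpk are nearly equal here because the process is almost perfectly centered; an off-center process would show Cpk well below Cp.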

While capability studies evaluate the ability of a process to consistently produce good product, they do little to help achieve such processes. Reducing variation and achieving stable processes require the use of numerous variation reduction tools. Variation of the output is caused by variation of the inputs. Consider a pump whose output is flow rate. Suppose the pump uses a piston to draw solution into a chamber through one opening and then pushes it back out another opening, with valves keeping the solution moving in the right direction. Flow rate will be affected by piston radius, stroke length, motor speed and valve backflow, to name a few. Flow rate varies because piston radius, stroke length, etc. vary. Variation of the inputs is transmitted to the output as shown below.
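The transmission of input variation to output variation can be sketched with a small Monte Carlo simulation. The pump model below, and every number in it, is a hypothetical illustration of the idea, not data from the text:

```python
# Monte Carlo sketch of variation transmission from inputs to output.
# The pump model and all parameter values are hypothetical illustrations.
import math
import random

random.seed(0)  # reproducible run

def flow_rate(radius, stroke, speed, backflow):
    # flow per minute ~ piston cross-section * stroke length * strokes/min,
    # reduced by valve backflow
    return math.pi * radius ** 2 * stroke * speed - backflow

flows = []
for _ in range(10_000):
    r = random.gauss(1.00, 0.01)   # piston radius, cm
    s = random.gauss(5.00, 0.05)   # stroke length, cm
    v = random.gauss(60.0, 0.5)    # motor speed, strokes/min
    b = random.gauss(2.0, 0.2)     # valve backflow, cm^3/min
    flows.append(flow_rate(r, s, v, b))

mean = sum(flows) / len(flows)
sd = (sum((f - mean) ** 2 for f in flows) / (len(flows) - 1)) ** 0.5
print(f"flow rate: mean = {mean:.0f} cm^3/min, sd = {sd:.1f} cm^3/min")
```

Running this shows an output standard deviation of roughly 2% of the mean even though each input varies by only about 1%, because the radius enters the model squared; that is exactly the kind of sensitivity that screening experiments and tolerance analysis are meant to uncover.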

Reducing variation requires identifying the key input variables affecting the outputs and then establishing controls on these inputs to ensure that the outputs conform to their established specifications. In general, one must identify the key input variables, understand the effect of these inputs on the output, understand how the inputs behave and finally, use this information to establish targets (nominals) and tolerances (windows) for the inputs. One type of designed experiment called a screening experiment can be used to identify the key inputs. Another type of designed experiment called a response surface study can be used to obtain a detailed understanding of the effects of the key inputs on the outputs. Capability studies can be used to understand the behavior of the key inputs. Armed with this knowledge, robust design methods can be used to identify optimal targets for the inputs and tolerance analysis can be used to establish operating windows or control schemes that ensure the output consistently conforms to requirements.

The obvious approach to reducing variation is to tighten tolerances on the inputs. This improves quality but generally drives up costs. The robust design methods provide an alternative. Robust design works by selecting targets for the inputs that make the outputs less sensitive (more robust) to the variation of the inputs as shown below. The result is less variation and higher quality but without the added costs. Several approaches to robust design exist including Taguchi methods, dual response approach and robust tolerance analysis.

Another important tool is a control chart. A control chart can be used to help determine whether any key input has been missed and if so to help identify them. Many other tools also exist for identifying key inputs and sources of variation including component swapping studies, multi-vari charts, analysis of means (ANOM), variance components analysis, and analysis of variance (ANOVA).

When studying variation, good measurements are required. Many times an evaluation of the measurement system should be performed using a gage R&R or similar study.



4. DESCRIPTIONS OF THE TOOLS

A brief description of each of the cited tools follows:

1. Acceptance Sampling Plan – An acceptance sampling plan takes a sample of product and uses this sample to make an accept or reject decision. Acceptance sampling plans are commonly used in manufacturing to decide whether to accept (release) or to reject (hold) lots of product. However, they can also be used during validation to accept (pass) or to reject (fail) the process. Following the acceptance by a sampling plan, one can make a confidence statement such as: "With 95% confidence, the defect rate is below 1% defective."
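The confidence statement quoted above can be derived with a standard binomial calculation for a zero-failure plan (textbook sampling theory, not a method specific to this article):

```python
# Zero-failure acceptance sampling: if n units are inspected and none are
# defective, confidence that the true defect rate is below p is 1 - (1 - p)**n.
def confidence(n, p):
    return 1 - (1 - p) ** n

# Smallest n supporting "with 95% confidence, the defect rate is below 1%":
n = 1
while confidence(n, 0.01) < 0.95:
    n += 1

print(n)                              # 299
print(f"{confidence(n, 0.01):.3f}")   # 0.951
```

So 299 consecutive defect-free units are enough to make the 95%/1% claim; a smaller sample supports only a weaker statement.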

2. Analysis of Means (ANOM) – Statistical study for determining whether significant differences exist between cavities, instruments, etc. It has many uses, including determining whether a measurement device is reproducible with respect to operators and whether differences exist between fill heads. It is a simpler and more graphical alternative to Analysis of Variance (ANOVA).

3. Analysis of Variance (ANOVA) – Statistical study for determining whether significant differences exist between cavities, instruments, etc. Alternative to Analysis of Means (ANOM).

4. Capability Study – Capability studies are performed to evaluate the ability of a process to consistently meet a specification. A capability study is performed by selecting a small number of units periodically over time. Each period of time is called a subgroup. For each subgroup, the average and range are calculated. The averages and ranges are plotted over time on a control chart to determine whether the process is stable or consistent over time. If so, the samples are then combined to determine whether the process is adequately centered and the variation is sufficiently small. This is accomplished by calculating capability indices. The most commonly used capability indices are Cp and Cpk. If acceptable values are obtained, the process consistently produces product that meets the specification limits. Capability studies are frequently performed towards the end of validation to demonstrate that the outputs consistently meet the specifications. However, they can also be used to study the behavior of the inputs in order to perform a tolerance analysis.

5. Challenge Test – A challenge test is a test or check performed to demonstrate that a feature or function is working. For example, to demonstrate that the power backup is functioning, power could be cut to the process. To demonstrate that a sensor designed to detect bubbles in a line works, bubbles could be purposely introduced.

6. Component Swapping Study – Study to isolate the cause of a difference between two units of product or two pieces of equipment. Requires the ability to disassemble units and swap components in order to determine if the difference remains with original units or goes with the swapped components.

7. Control Chart – Control charts are used to detect changes in the process. A sample, typically consisting of 5 units, is selected periodically. The average and range of each sample are calculated and plotted. The plot of the averages is used to determine if the process average changes. The plot of the ranges is used to determine if the process variation changes. To aid in determining if a change has occurred, control limits are calculated and added to the plots. The control limits represent the maximum amount that the average or range should vary if the process does not change. A point outside the control limits indicates that the process has changed. When a change is identified by the control chart, an investigation should be made into the cause of the change. Control charts help to identify key input variables causing the process to shift and aid in the reduction of the variation. Control charts are also used as part of a capability study to demonstrate that the process is stable or consistent.
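The control limits described above are conventionally computed from the subgroup averages and ranges using tabulated Shewhart constants. A sketch for subgroups of size 5, with hypothetical data (A2 = 0.577, D3 = 0, D4 = 2.114 are the standard constants for n = 5):

```python
# X-bar / R control chart limits for subgroups of 5 units each.
# Subgroup data are hypothetical, for illustration only.
subgroups = [
    [10.1, 9.8, 10.0, 10.2, 9.9],
    [10.0, 10.3, 9.7, 10.1, 10.0],
    [9.9, 10.0, 10.2, 9.8, 10.1],
    [10.2, 10.0, 9.9, 10.1, 10.0],
]

# Standard Shewhart control chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

xbars = [sum(s) / len(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]

xbar_bar = sum(xbars) / len(xbars)   # centre line of the X-bar chart
r_bar = sum(ranges) / len(ranges)    # centre line of the R chart

# Limits: a point outside these signals a process change
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

print(f"X-bar chart: CL={xbar_bar:.3f}, LCL={lcl_x:.3f}, UCL={ucl_x:.3f}")
print(f"R chart:     CL={r_bar:.3f}, LCL={lcl_r:.3f}, UCL={ucl_r:.3f}")
```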

8. Designed Experiment – The term designed experiment is a general term that encompasses screening experiments, response surface studies, and analysis of variance. In general, a designed experiment involves purposely changing one or more inputs and measuring the resulting effect on one or more outputs.

9. Dual Response Approach to Robust Design – One of three approaches to robust design. Involves running response surface studies to model the average and variation of the outputs separately. The results are then used to select targets for the inputs that minimize the variation while centering the average on the target. Requires that the variation during the study be representative of long term manufacturing. Alternatives are Taguchi methods and robust tolerance analysis.

10. Failure Modes and Effects Analysis (FMEA) – An FMEA is a systematic analysis of the potential failure modes. It includes the identification of possible failure modes, determination of their potential causes and consequences, and an analysis of the associated risk. It also includes a record of corrective actions or controls implemented, resulting in a detailed control plan. FMEAs can be performed on both the product and the process. Typically an FMEA is performed at the component level, starting with potential failures and then tracing up to the consequences. This is a bottom-up approach. A variation is a Fault Tree Analysis, which starts with possible consequences and traces down to the potential causes. This is the top-down approach. An FMEA tends to be more detailed and better at identifying potential problems. However, a fault tree analysis can be performed earlier in the design process, before the design has been resolved down to individual components.

11. Fault Tree Analysis (FTA) – A variation of a FMEA. See FMEA for a comparison.

12. Gauge R&R Study – Study for evaluating the precision and accuracy of a measurement device and the reproducibility of the device with respect to operators. Alternatives are to perform capability studies and analysis of means on the measurement device.

13. Mistake Proofing Methods – Mistake proofing refers to the broad array of methods used to either make the occurrence of a defect impossible or to ensure that the defect does not pass undetected. The Japanese refer to mistake proofing as Poka-Yoke. The general strategy is to first attempt to make it impossible for the defect to occur. For example, to make it impossible for a part to be assembled backwards, make the ends of the part different sizes or shapes so that the part only fits one way. If this is not possible, attempt to ensure the defect is detected. This might involve mounting a bar above a chute that will stop any parts that are too high from continuing down the line. Other possibilities include mitigating the effect of a defect (seat belts in cars) and lessening the chance of human errors by implementing self-checks.

14. Multi-Vari Chart – Graphical procedure for isolating the largest source of variation so that further efforts concentrate on that source.

15. Response Surface Study – A response surface study is a special type of designed experiment whose purpose is to model the relationship between the key input variables and the outputs. Performing a response surface study involves running the process at different settings for the inputs, called trials, and measuring the resulting outputs. An equation can then be fit to the data to model the effects of the inputs on the outputs. This equation can then be used to find optimal targets using robust design methods and to establish targets or operating windows using a tolerance analysis. The number of trials required by a response surface study increases exponentially with the number of inputs. It is desirable to keep the number of inputs studied to a minimum. However, failure to include a key input can compromise the results. To ensure that only the key input variables are included in the study, a screening experiment is frequently performed first.
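As a one-input illustration of fitting and using such an equation, the sketch below fits a quadratic through three hypothetical trials at coded input levels −1, 0, +1 and locates the stationary point. Real response surface studies involve several inputs and least-squares fitting, but the logic — fit an equation, then optimize it — is the same.

```python
# Fit y = b0 + b1*x + b2*x^2 through three trials of a single input,
# then locate the input value that maximizes the output.
# Trial data (coded input level, measured response) are hypothetical.
trials = [(-1.0, 82.0), (0.0, 90.0), (1.0, 86.0)]

(x0, y0), (x1, y1), (x2, y2) = trials

# With x = -1, 0, +1 the quadratic coefficients have a closed form:
b0 = y1                          # intercept = response at the centre point
b1 = (y2 - y0) / 2.0             # linear effect
b2 = (y2 - 2.0 * y1 + y0) / 2.0  # curvature

# Stationary point of the fitted parabola: dy/dx = b1 + 2*b2*x = 0
x_opt = -b1 / (2.0 * b2)
y_opt = b0 + b1 * x_opt + b2 * x_opt ** 2

print(f"model: y = {b0} + {b1}*x + {b2}*x^2, optimum at x = {x_opt:.3f}")
```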

16. Robust Design Methods – Robust design methods refers collectively to the different methods of selecting optimal targets for the inputs. Generally, when one thinks of reducing variation, tightening tolerances comes to mind. However, as demonstrated by Taguchi, variation can also be reduced by the careful selection of targets. When nonlinear relationships exist between the inputs and the outputs, one can select targets for the inputs that make the outputs less sensitive to the inputs. The result is that while the inputs continue to vary, less of this variation is transmitted to the output, causing the output to vary less. Reducing variation by adjusting targets is called robust design. In robust design, the objective is to select targets for the inputs that result in on-target performance with minimum variation. Several methods of obtaining robust designs exist, including robust tolerance analysis, the dual response approach and Taguchi methods.

17. Robust Tolerance Analysis – One of three approaches to robust design. Involves running a designed experiment to model the output’s average and then using the statistical approach to tolerance analysis to predict the output’s variation. Requires estimates of the amounts that the inputs will vary during long-term manufacturing. Alternatives are Taguchi methods and the dual response approach.

18. Screening Experiment – A screening experiment is a special type of designed experiment whose primary purpose is to identify the key input variables. Screening experiments are also referred to as fractional factorial experiments or Taguchi L-arrays. Performing a screening experiment involves running the process at different settings for the inputs, called trials, and measuring the resulting outputs. From this, it can be determined which inputs affect the outputs. Screening experiments typically require twice as many trials as input variables. For example, 8 variables can be studied in 16 trials. This makes it possible to study a large number of inputs in a reasonable amount of time. Starting with a larger number of variables reduces the chances of missing an important variable. Frequently a response surface study is performed following a screening experiment to gain further understanding of the effects of the key input variables on the outputs.
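The effect estimates produced by a two-level design are just differences of averages. A sketch using a hypothetical 2^3 full factorial (a fractional design is analyzed the same way; all responses are invented for illustration):

```python
# Main-effect estimation for a two-level factorial screening design.
# Design matrix: each row is (A, B, C) at low (-1) or high (+1) level.
design = [
    (-1, -1, -1), (+1, -1, -1), (-1, +1, -1), (+1, +1, -1),
    (-1, -1, +1), (+1, -1, +1), (-1, +1, +1), (+1, +1, +1),
]
# Measured output for each trial, in the same order (hypothetical)
y = [60.0, 72.0, 61.0, 73.0, 64.0, 76.0, 65.0, 77.0]

# Main effect of each input = mean(y at high level) - mean(y at low level)
effects = {}
for j, name in enumerate("ABC"):
    high = [yi for row, yi in zip(design, y) if row[j] == +1]
    low = [yi for row, yi in zip(design, y) if row[j] == -1]
    effects[name] = sum(high) / len(high) - sum(low) / len(low)

print(effects)  # the largest effects identify the key input variables
```

In this invented data set, input A clearly dominates, so a follow-up response surface study would concentrate on it.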

19. Taguchi Methods – One of three approaches to robust design. Involves running a designed experiment to get a rough understanding of the effects of the input targets on the average and variation. The results are then used to select targets for the inputs that minimize the variation while centering the average on the target. Similar to the dual response approach except that while the study is being performed, the inputs are purposely adjusted by small amounts to mimic long-term manufacturing variation. Alternatives are the dual response approach and robust tolerance analysis.

20. Tolerance Analysis – Using tolerance analysis, operating windows can be set for the inputs that ensure the outputs will conform to requirements. Performing a tolerance analysis requires an equation describing the effects of the inputs on the output. If such an equation is not available, a response surface study can be performed to obtain one. To help ensure manufacturability, tolerances for the inputs should initially be based on the plants' and suppliers' ability to control them. Capability studies can be used to estimate the ranges over which the inputs currently vary. If this does not result in an acceptable range for the output, the tolerance of at least one input must be tightened. However, tightening a tolerance beyond the current capability of the plant or supplier requires that improvements be made or that a new plant or supplier be selected. Before tightening any tolerances, robust design methods should be considered.
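For an output that is simply the sum of its inputs, the statistical approach to tolerance analysis reduces to a root-sum-of-squares stack-up. A sketch with hypothetical nominals and tolerances:

```python
# Statistical (root-sum-of-squares) vs. worst-case tolerance stack-up for
# an additive output, e.g. total thickness of a stacked assembly.
# Nominals and tolerances below are hypothetical, for illustration only.
import math

# Each input: (nominal, tolerance); tolerances treated as independent
inputs = [(5.00, 0.06), (3.00, 0.04), (2.00, 0.03)]

nominal_total = sum(n for n, _ in inputs)

# Worst-case stack-up: tolerances simply add
worst_case = sum(t for _, t in inputs)

# Statistical stack-up: independent variations add in quadrature
statistical = math.sqrt(sum(t ** 2 for _, t in inputs))

print(f"output = {nominal_total:.2f} ± {statistical:.3f} "
      f"(worst case ± {worst_case:.2f})")
```

The statistical limit is tighter than the worst case because independent input variations rarely all reach their extremes at once; this is why the statistical approach permits wider input tolerances for the same output requirement.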

21. Variance Components Analysis – Statistical study used to estimate the relative contributions of several sources of variation. For example, variation on a multi-head filler could be the result of shifting of the process average over time, filling-head differences, and short-term variation within a fill head. A variance components analysis can be used to estimate the amount of variation contributed by each source.
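For the multi-head filler example, balanced one-way variance components estimates follow directly from the ANOVA mean squares. A sketch with hypothetical fill volumes from three heads:

```python
# Variance components for a multi-head filler: how much of the total
# variation is head-to-head vs. within-head? Fill volumes are hypothetical.
from statistics import mean

heads = [
    [10.0, 10.2, 9.9, 10.1],   # head 1
    [10.5, 10.6, 10.4, 10.5],  # head 2
    [9.8, 9.9, 9.7, 10.0],     # head 3
]
n = len(heads[0])   # measurements per head
k = len(heads)      # number of heads

grand = mean(x for h in heads for x in h)
ms_between = n * sum((mean(h) - grand) ** 2 for h in heads) / (k - 1)
ms_within = sum((x - mean(h)) ** 2 for h in heads for x in h) / (k * (n - 1))

var_within = ms_within                                # within-head variance
var_between = max(0.0, (ms_between - ms_within) / n)  # head-to-head variance

total = var_within + var_between
print(f"within-head: {var_within / total:.0%}, "
      f"between-head: {var_between / total:.0%}")
```

In this invented data set most of the variation is head-to-head, so improvement effort would focus on aligning the heads rather than on short-term noise.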

Tamiflu Not Suitable for Healthy Adults

0 comments
The flu drugs Tamiflu and Relenza may not be suitable for treating seasonal influenza in healthy adults, according to British researchers.

"Recommending the use of antiviral drugs for the treatment of people presenting with symptoms is unlikely to be the most appropriate course of action," wrote Jane Burch of the University of York and colleagues.

Their study, published in The Lancet Infectious Diseases, supports advice from the World Health Organization (WHO), which states that otherwise healthy patients who contract H1N1 swine flu without complications do not need antiviral treatment.

Tamiflu, made by the Swiss company Roche under license from Gilead Sciences Inc., is a pill that can treat and prevent all types of influenza A virus.

Zanamivir, made by GlaxoSmithKline under license from the Australian company Biota and sold under the brand name Relenza, is an inhaled drug in the same class.

The WHO strongly recommends both drugs for pregnant women, patients with underlying medical conditions, and children under the age of five, because they face an increased risk of more severe illness.

Burch's team reviewed the various published studies on Tamiflu and Relenza. "We present the results for healthy adults and people at risk of influenza-related complications," they wrote.

They found that, on average, both drugs shaved half a day off the time patients were sick. Influenza typically affects people for about a week.

The drugs performed slightly better in people at risk of complications, such as patients with diabetes or asthma, with Relenza shortening the illness by almost a day and Tamiflu by three-quarters of a day.

This suggests the drugs should be given to the people who need them most, the researchers said.

Many countries have stockpiled both drugs. H1N1 swine flu has been declared a pandemic and has spread worldwide. US health officials said on Friday that the disease was still worsening in Japan, improving in Britain, and still active in the United States.

Flu rarely strikes in all three of those countries in August.

Global manufacturers do not expect to be able to supply the vaccine until late September or October. (*)

Adapted from antaranews

What does the "c" in front of GMP (cGMP) mean?

1 comments
You are probably not the only one curious about the question above: what does the letter "c" in front of the abbreviation GMP (Good Manufacturing Practice) stand for?

The "c" in cGMP stands for "current", i.e. up to date.

Good Manufacturing Practice, better known in our country as CPOB (Cara Pembuatan Obat yang Baik, good manufacturing practice for medicines), does not stay the same from year to year. Its regulations and practical implementation guidelines are continually refined to protect consumers of medicines. Up-to-date procedures and the latest methods are continually developed to provide ever-improving quality assurance. The pharmaceutical industry should therefore keep monitoring and reassessing whether the implementation of CPOB/GMP at its facilities still complies with the latest regulations.

The importance of validation in the pharmaceutical world

1 comments
The pharmaceutical industry, as we know, is one of the industries with a long list of requirements that must be met in its production processes. It is required to produce products of uniform quality that are safe for patients, efficacious, and effective.
Achieving this requires the support of a consistent set of processes and systems. Validation is the system that underpins that achievement.

In the pharmaceutical world, validation is defined as a documented activity proving that a procedure, process, piece of equipment, or system will repeatedly conform to a predetermined requirement.

So what makes validation important in the pharmaceutical industry?
1. Quality assurance
As an illustration, a mixing process will not produce uniform results from batch to batch if the mixing machine does not operate consistently. A machine, as one of the key components in the pharmaceutical industry, must therefore first be proven to operate consistently and as specified before it is used.

2. Regulatory compliance
Validation is required by Cara Pembuatan Obat yang Baik (CPOB), also known as Good Manufacturing Practice (GMP). The pharmaceutical industry must meet this requirement.

3. Cost reduction
Experience has shown that a validated production process is more efficient and delivers consistently repeatable quality. As a result, rework (reprocessing), rejects, and undetected process waste can be minimized; in other words, costs outside the normal process can be reduced.