Understanding Tolerable Upper Intake Levels

Toxicology of Micronutrients: Adverse Effects and Uncertainty(1,2)

A. G. Renwick(3)
School of Medicine, University of Southampton, Southampton SO16 7PX, United Kingdom

ABSTRACT The establishment of safe upper intake levels for micronutrients must consider the intake-response relations for both deficiency and toxicity. Limited data are available on the toxicities of most micronutrients, and few studies that meet the criteria considered essential for the risk assessment of other chemicals in food, such as pesticides and food additives, have been performed. In some cases, the application of large uncertainty factors, which are used to establish the amount of a chemical that would be safe for daily intake throughout life, could result in nutritionally inadequate intakes of micronutrients. As a consequence, lower than normal uncertainty factors have been applied to determine safe or tolerable intakes of many micronutrients. There is no clear scientific rationale, on the basis of the metabolism and elimination of micronutrients or the nature of the adverse effects reported for high intakes, for the use of reduced uncertainty factors for micronutrient toxicity. A review of recent evaluations of selected vitamins and minerals shows little consistency in the application of uncertainty factors by different advisory groups, such as the Institute of Medicine in the United States and the Scientific Committee on Foods in the European Union. It is apparent that, in some cases, the uncertainty factor applied was selected largely to give a result that is compatible with nutritional requirements; therefore, the uncertainty factor represented part of risk management rather than hazard characterization. The usual risk assessment procedures for chemicals in food should be revised for micronutrients, so that the risks associated with intakes that are too low and too high are considered equally as part of a risk-benefit analysis. J. Nutr. 136: 493S–501S, 2006.

KEY WORDS: upper intake levels • micronutrient toxicity • risk management

Establishing upper safe or tolerable levels for vitamins and essential trace elements is the subject of considerable interest and activity at present. Because micronutrients are essential for a normal and healthy life, public perception seems to be that there is no risk associated with their intake at any dose, and proposals to restrict availability (1) can produce a highly critical public reaction. However, even the briefest of reviews of the available literature shows that, at very high doses, vitamins and minerals can be as toxic as other compounds present in the food and the environment. This is not unexpected, and to paraphrase Paracelsus (1493–1541), "In all things there is a poison and there is nothing without a poison. It depends only upon the dose whether a poison is a poison or not." A number of evaluations of the safety of high doses of vitamins and minerals have recently been conducted by both national and international bodies, with comprehensive reviews being undertaken in the United States (2–5), the European Union (6), and the United Kingdom (7). In addition, the safety

of vitamins and minerals has been considered by the Nordic Council (8), the International Program on Chemical Safety (IPCS)4 (9) and French Ministries (10), as well as by associations of the manufacturers and producers of vitamins and vitamin supplements, such as the Council for Responsible Nutrition (11) and the European Federation of Health Product Manufacturers Association (12), and consumer groups, for example, Consumers for Health Choice (13). Assessment of the safety of non-nutrient chemicals present in foods is undertaken by identifying the adverse effect(s) produced at high intakes, defining the dose-response relationship for the effect(s), and then selecting an appropriate safety margin (such as a safety factor of 100) to establish levels of intake that can be consumed daily throughout life by humans without significant adverse health effects (14). In contrast, in the case of micronutrients 2 dose-response relations need to be considered (Fig. 1). Adverse effects would be present at very low intakes because of a deficiency condition, which would decrease in severity with an increase in intake, and at high intakes because of toxicity, which would increase in severity with an increase in the dose. The relative positions of these 2 curves may vary widely between different vitamins and minerals. In most cases, the adverse effects associated with deficiency and toxicity are unrelated, and different adverse effects with different health implications are produced at very low and very high intakes.

1 Published in a supplement to The Journal of Nutrition. Presented as part of the Workshop on Understanding Tolerable Upper Intake Levels held in Washington, DC, April 23 and 24, 2003. This workshop and publication were supported by the Project Committee of the North American Branch of the International Life Sciences Institute. Guest editors for the supplement publication were Johanna Dwyer, Louise A. Berner, and Ian C. Munro. Guest Editor Disclosure: J. Dwyer, no relationships to disclose; L. A. Berner, no relationships to disclose; I. C. Munro, no relationships to disclose. 2 Author Disclosure: No relationships to disclose. 3 To whom correspondence should be addressed: E-mail: a.g.renwick@soton.ac.uk.

4 Abbreviations used: IOM, Institute of Medicine; IPCS, International Program on Chemical Safety; LOAEL, lowest observed adverse effect level; NOAEL, no observed adverse effect level; SCF, Scientific Committee on Foods.

0022-3166/06 $8.00 © 2006 American Society for Nutrition.

FIGURE 1 Dose-response relationships that need to be considered in establishing the safe upper level or tolerable upper intake level for a vitamin or mineral. The acceptable range of intake should take into account any uncertainties inherent in the use of the dose-response data for toxicity in setting a safe upper level of intake for humans.

A major potential problem in establishing a safe upper level for vitamins and minerals is that, in many cases, the large uncertainty or safety factors that would normally be applied to toxic effects in experimental animals or humans cannot be used, because this could result in an intake below the dietary reference intake, that is, the intake necessary to avoid a deficiency syndrome. Therefore, safe upper intake levels that avoid both toxicity and deficiency must be established, and this may involve the application of a lower than normal margin of safety between exposure and the intakes that produce toxicity in humans or animals. Nevertheless, in principle there is no reason why the safety margin available to the public in relation to the intake of food supplements should be any different from that which would apply to other ingested compounds, provided that the maximum recommended intake was nutritionally adequate. This paper considers the nature of the uncertainties associated with the use of the available data to establish safe levels of human exposure, and whether there is a scientific rationale for reducing the magnitude of the uncertainty factors in the case of micronutrients. The problems faced in deriving safe upper intake levels are illustrated with examples from recent evaluations by the Institute of Medicine (IOM), the Scientific Committee on Foods (SCF), and the Expert Group on Vitamins and Minerals (EVM).

Data available for assessment of hazards and risks from high intakes of micronutrients

For many vitamins and minerals, homeostatic mechanisms ensure that the amounts present in the body are largely independent of intake at low levels. Such mechanisms may be overwhelmed at very high intakes, resulting in a nonlinear relation between intake and body burden. The intakes associated with saturation of homeostatic mechanisms have only rarely been defined, and therefore saturation cannot be used as a basis for establishing safe upper intake levels.
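The core tension described in this section, that dividing a no-effect intake by the usual uncertainty factor may push the result below the nutritional requirement, can be sketched as simple arithmetic. The function names and example numbers below are illustrative only, not values from any evaluation:

```python
def safe_upper_level(noael_mg_per_day: float, uncertainty_factor: float) -> float:
    """Derive a candidate safe upper intake level from a NOAEL."""
    return noael_mg_per_day / uncertainty_factor


def conflicts_with_requirement(upper_level: float, reference_intake: float) -> bool:
    """True when the derived upper level falls below the nutritional
    requirement, i.e. the usual 100-fold default cannot be applied."""
    return upper_level < reference_intake


# Hypothetical micronutrient: NOAEL 50 mg/d, requirement 2 mg/d.
ul_default = safe_upper_level(50, 100)  # 0.5 mg/d -> below the 2 mg/d requirement
ul_reduced = safe_upper_level(50, 10)   # 5 mg/d   -> nutritionally adequate
```

The example shows why, for micronutrients, the choice of uncertainty factor can become part of risk management rather than hazard characterization: only the reduced factor yields an intake compatible with the requirement.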
Because the adverse effects associated with excessive intakes of micronutrients do not relate to the beneficial effects, there is no a priori reason to consider that these hazards are inherently different from the hazards caused by high intakes of a nonessential chemical. Indeed, micronutrients should be considered foreign compounds when the intake exceeds the homeostatic range. For convenience, the data available on the toxicities of micronutrients may be divided into data from studies with humans and data from studies with animals.

Data from studies with humans. The available data on the effects of high exposure of humans to micronutrients arise from a variety of sources: clinical studies on nutritional need, clinical studies on interactions between nutrients, clinical studies for medical uses, epidemiological analyses, and anecdotal case reports. In most cases, the studies were not designed to establish safety at high doses but, rather, to establish benefit in relation to the prevention of deficiency by the use of low doses or to study a potential therapeutic benefit at very high doses. Each of these types of data has deficiencies and problems in relation to defining an upper safe level for micronutrients. Data on adverse effects are often reported from clinical trials for therapeutic indications, but such trials usually define the incidence of adverse effects at the therapeutic dose, which may greatly exceed nutritional intakes. The effects at lower doses, which would not produce the therapeutic effect, are not usually reported (the examples of niacin and pyridoxine are discussed below), but such information would be essential in establishing a safe upper intake level. In addition, such trials are often performed with highly selected patient groups, which results in problems of extrapolation of the findings from the trial to the general population. Epidemiological studies often suffer from confounding variables (for example, selenium and fluoride), and causation defined using recognized criteria (15) is rarely established. Anecdotal case reports are useful for hazard identification but frequently involve "abuse" dosages and cannot be used to define the incidence of the adverse effect. Hazards identified from animal studies are rarely investigated systematically during human trials, so the human data do not normally provide useful information related to the adverse effects identified in animal studies.

Data from studies with animals.
In many cases, the hazard identification is based on animal experimentation, and in some cases the recognized hazards have then been investigated in studies with human volunteers. In the case of essential minerals, usually very few data are available from studies with humans, and therefore the hazard dose-response relation is derived largely from animal experimentation. A defined set of studies is necessary to establish safe daily intakes of chemicals such as pesticides and food additives (Table 1); the studies involve different durations of intake, and all major organ systems are assessed without any preconception of the hazard(s) that may be detected. The principal aim of such studies is to identify all possible hazards, recognize the hazard of greatest concern (usually that which occurs at the lowest doses), and define an intake that does not produce a detectable response, the so-called no observed adverse effect level (NOAEL).

TABLE 1  Typical toxicity database expected for nonnutrient food chemicals for which there is extensive human exposure(1)

Test          Nature of study
Genetic       Various genetic endpoints in bacteria and mammalian cells; screen for potential carcinogens
Acute         Usually a single-dose study
Short-term    Repeated daily doses for 14–28 d; identifies the target organ
Sub-chronic   Repeated daily doses for 90 d; gives dose-response, and used for dose selection in chronic studies
Chronic       Repeated daily doses for 2 y in rodents; used to investigate carcinogenicity; usual source of the NOAEL for the ADI(2)
Reproductive  Dosing occurs before, during, and after reproduction to investigate effects on fetal and neonatal development

1 Data from Renwick, 1999 (14).
2 ADI, acceptable daily intake.

Studies used for risk assessment of pesticides and additives must be performed using recognized protocols and must meet the established quality criteria of good laboratory practice. In contrast, animal studies on vitamins and minerals are usually hypothesis driven and are limited in the ranges of tissues and effects studied. Frequently, adverse effects are reported at high intakes, but the dose-response relation is not studied and the NOAEL is often not defined. Such limited data would not be considered adequate to approve the use of a pesticide or a food additive, whereas in the case of a food contaminant, an additional uncertainty factor(s) would be applied to determine a safe intake. The absence of an adequate safety database for most vitamins and minerals means that the scientific basis of hazard characterization is less secure than is usually the case, but the use of large uncertainty factors, as would traditionally be applied for nonessential compounds, could result in adverse effects due to a deficiency condition.

Rationale for the use of uncertainty factors in hazard characterization

Uncertainty factors allow for aspects of hazard characterization for which there are no compound-specific data. By definition, uncertainty factors are not precise, and the values selected are usually 1 log unit (10) or 0.5 log units (3). Uncertainty factors can be divided into those related to issues of database deficiencies and those related to extrapolation. For contaminants, database deficiencies, such as the absence of a NOAEL [in which case the lowest observed adverse effect level (LOAEL) has to be used as the starting point] or the absence of a chronic toxicity study, are often allowed for by the use of an extra 3- or 10-fold factor (16–18).
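The way these default factors combine is multiplicative. The sketch below is illustrative; the dictionary keys and function name are invented for the example, while the 3- and 10-fold values follow the defaults described above:

```python
# Default uncertainty factors combine by multiplication.
# The key names are illustrative, not terms from any guideline text.
DEFAULTS = {
    "interspecies": 10.0,            # animal-to-human extrapolation
    "intraspecies": 10.0,            # human variability
    "loael_to_noael": 3.0,           # only a LOAEL is available
    "subchronic_to_chronic": 10.0,   # no chronic study in the database
}


def composite_factor(*issues: str) -> float:
    """Multiply the default factors for each uncertainty that applies."""
    factor = 1.0
    for issue in issues:
        factor *= DEFAULTS[issue]
    return factor


# Animal LOAEL with no chronic study: 10 x 10 x 3 x 10 = 3000-fold overall.
overall = composite_factor("interspecies", "intraspecies",
                           "loael_to_noael", "subchronic_to_chronic")
```

A 3000-fold composite factor applied to a micronutrient would, in many cases, yield an "acceptable" intake well below the dietary requirement, which is precisely the conflict this paper examines.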
Traditionally, 10-fold factors have been used to allow for extrapolation issues, such as animal-to-human extrapolation, and to allow for human variability. There has been a long history of use of such factors (9,19), and their validity has been the subject of numerous reviews (20–29). Renwick (24) suggested that the 10-fold uncertainty factors for interspecies differences and human variability should be subdivided to allow for toxicokinetic differences (4-fold) and toxicodynamic differences (2.5-fold). The aim of this subdivision was to allow chemical-specific toxicokinetic or mechanistic data to contribute quantitatively to the selection of the uncertainty factor (by the use of a combination of chemical-specific data and default factors). The principle of subdivision was accepted by an IPCS workshop on the derivation of guidance values and modified to allocate even 3.16 (10^0.5) factors for the toxicokinetic and toxicodynamic differences in humans (17). IPCS has published guidance on the use of chemical-specific adjustment factors (30) to replace the toxicokinetic or toxicodynamic factors for interspecies differences or human variability (Fig. 2). There have been very few examples to date for which there are sufficient data of adequate quality to replace default uncertainty factors, but in principle, this subdivision could be used to refine the uncertainty factor used for vitamins and minerals. Although there are extensive human databases on vitamins and minerals, very few studies have defined species differences or human variability in the plasma concentrations at high and potentially toxic intakes.

FIGURE 2 The subdivision of the usual 100-fold default uncertainty factor into separate factors for species differences (10-fold) and human variability (10-fold) (19). Subdivision into separate default factors for toxicokinetics (processes that determine the concentration of the compound at its site of action) and toxicodynamics (processes that relate the concentration in the target organ to an adverse effect) was designed to allow compound-specific data to replace part of the overall uncertainty factor. The figure was adapted from IPCS (17,30).

Differences between nonessential foreign compounds and vitamins and minerals

There are major differences between animal species and humans, and between different human individuals, in the fate of foreign compounds in the body. Coefficients of variation for human variability in the main pathways of elimination are typically about 35% (31–35). These differences provide a rationale and justification for the use of large (10-fold) uncertainty factors in the establishment of safe intakes of such compounds in human food. The high variability between species and between individuals arises from protein-mediated processes, such as enzyme-catalyzed reactions, rather than from physicochemical processes, such as passive diffusion, or physiological processes, such as glomerular filtration (35). In contrast to foreign compounds, the concentrations of micronutrients in the body are controlled closely at low intakes, and the body burden may be largely independent of intake. This supports the assumption of a low coefficient of variation [10% (2) or 15% (36)] that is used to define the 95th percentile daily requirement. High-dose toxicity with micronutrients probably occurs at intakes that saturate homeostatic control, and therefore the fate at high doses and the variability in response would resemble those of a foreign compound. As a consequence, both species differences and similar person-to-person variations will exist at high intakes. Protein-mediated processes are of greater potential importance for micronutrients than for foreign compounds (Table 2), and therefore there is no a priori rationale to assume that the variability associated with micronutrient toxicity is lower than that associated with foreign compound toxicity. As a consequence, in the absence of adequate compound-specific information, there is no scientific justification to use an uncertainty factor other than the usual default in the determination of an intake of a nutritional supplement that can be consumed safely by all members of society, every day throughout life.
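The subdivision of the 100-fold default into kinetic and dynamic components, and the replacement of a sub-factor by measured chemical-specific data, can be expressed numerically. This is a sketch of the idea only; the function and key names are invented for illustration, while the sub-factor values (4.0, 2.5, 3.16) are those described above:

```python
# The 100-fold default split into kinetic and dynamic components:
# interspecies 4.0 x 2.5 = 10; intraspecies 3.16 x 3.16 ~= 10 (10**0.5 each).
INTERSPECIES = {"kinetic": 4.0, "dynamic": 2.5}
INTRASPECIES = {"kinetic": 3.16, "dynamic": 3.16}


def overall_factor(chemical_specific=None):
    """Multiply the four default sub-factors, substituting any measured,
    chemical-specific adjustment factors (a sketch of the CSAF idea)."""
    chemical_specific = chemical_specific or {}
    total = 1.0
    for group, defaults in (("inter", INTERSPECIES), ("intra", INTRASPECIES)):
        for part, value in defaults.items():
            total *= chemical_specific.get(f"{group}_{part}", value)
    return total


default_100 = overall_factor()  # ~100-fold with all defaults in place
# A measured interspecies kinetic difference of 2-fold replaces the 4.0 default:
with_kinetic_data = overall_factor(chemical_specific={"inter_kinetic": 2.0})  # ~50
```

The design point is that compound-specific data shrink (or enlarge) only the sub-factor they inform, leaving the remaining defaults intact.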
TABLE 2  Differences between foreign compounds (xenobiotics) and micronutrients(1)

Process             Foreign compound                                 Micronutrient
Absorption          PD of lipid-soluble form                         Carrier-facilitated uptake at low conc; PD at high conc
Blood transport     Free or bound to albumin                         Free or bound to specific carrier proteins
Entry into tissues  PD of lipid-soluble form                         Carrier-facilitated transport at low conc; PD at high conc
Metabolism          Enzymes with low specificity and high capacity   Often specific enzymes with low capacity
Renal excretion     Glomerular filtration + passive reabsorption     Specific reuptake transporters + glomerular filtration
                    + tubular secretion                              + passive reabsorption + tubular secretion

1 Processes mediated by interactions with proteins, such as enzymes and transporters, are italicized in the original table. PD, passive diffusion; conc, concentration.

Examples of apparent discrepancies in the setting of safe upper intake levels by different groups

The examples below illustrate how recent evaluations in the United States, the European Union, and the United Kingdom have reached similar or different conclusions about tolerable upper intake levels (the United States and the European Union) or safe upper levels (the United Kingdom) for selected vitamins and minerals. A problem that all 3 groups faced was that for some micronutrients there were concerns about safety but inadequate data for derivation of a numerical tolerable or safe upper intake level, whereas for others there was limited information but the available information did not raise safety concerns. In the United Kingdom, a guidance level was set for some micronutrients as an indication to risk managers of an intake that was likely to be without adverse effects but for which the database was not adequate to set a clear safe upper level.

Vitamin A. Vitamin A may be consumed either as preformed vitamin A (retinyl esters and retinol) or as provitamin A, such as the carotenoids (e.g., β-carotene). Each may be obtained from dietary sources and by taking vitamin supplements. In addition, β-carotene may be consumed from its use as a food additive. The toxicities of preformed vitamin A and β-carotene differ markedly, because preformed vitamin A has potential teratogenic activity, whereas this has not been reported for β-carotene. As a consequence, each was evaluated separately. Preformed vitamin A and other retinoids are well-recognized animal teratogens at high doses. 13-cis-Retinoic acid, a metabolite of retinol, has been used as an oral treatment for acne and has been associated with a significant incidence of teratogenic effects in both experimental animals and treated patients. Both 13-cis-retinoic acid and retinol are metabolized to all-trans-retinoic acid and to 9-cis-retinoic acid, which are believed to be the proximate teratogens because of their interactions with RAR and RXR nuclear receptors. Biesalski (37) reported 18 cases of retinoid-like malformations in the offspring of women who had taken very large doses of vitamin A supplements, indicating that retinoid teratogenicity may occur after high-dose vitamin A intake.
Such anecdotal data were inadequate to define the dose-response relation for teratogenicity. A major study (38) reported on the pregnancy outcomes of >22,000 women between 1984 and 1987 in relation to their intake of vitamin A. Although that study has a number of deficiencies, it indicated the possibility of an increase in risk at intakes above 10,000 international units (3000 µg)/d. Studies on the absorption and metabolism of vitamin A (39) indicate that there would be little change in the circulating concentrations of retinoid metabolites up to 30,000 international units, suggesting that this could be the upper range of homeostasis and the upper safe limit of intake. Such toxicokinetic data may help to define the risk associated with normal levels of vitamin A in food, such as liver, compared with the risk identified from the intake of supplements, because they would allow for the major differences in the conversion of retinol to all-trans-retinoic acid under these 2 different conditions (40). However, the interpretation of data on circulating levels in plasma is complex, because animal studies have indicated that there is not a clear relation between the circulating concentrations of all-trans-retinoic acid and teratogenic risk (41). In the United States, the IOM set a tolerable upper intake level of 3000 µg/d for preformed vitamin A (5) for women of childbearing age [on the basis of a NOAEL of 4500 µg/d from the large epidemiology study of Rothman et al. (38) and an uncertainty factor of 1.5 for interindividual variability], and also for other adults (on the basis of a LOAEL of 14,000 µg/d from case reports of adverse hepatic effects and an uncertainty factor of 5 to allow for the use of a LOAEL, the severity of the effects, and interindividual variability). The European SCF (42) set a tolerable upper intake level of 3000 µg/d on the basis of the NOAEL derived by regression analysis from the study of Rothman et al. (38) without applying an uncertainty factor, with advice that postmenopausal women should restrict their intake to 1500 µg/d because of an increased risk of osteoporosis and bone fracture.
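The two IOM derivations just described reduce to simple division; the point-of-departure values and uncertainty factors are those given in the text, and the adult result of 2800 µg/d was evidently rounded to the stated 3000 µg/d in the evaluation:

```python
def upper_level(point_of_departure_ug_per_day: float,
                uncertainty_factor: float) -> float:
    """Tolerable upper intake level as point of departure / uncertainty factor."""
    return point_of_departure_ug_per_day / uncertainty_factor


# Women of childbearing age: NOAEL 4500 ug/d (Rothman et al.), UF 1.5.
women = upper_level(4500, 1.5)   # 3000 ug/d
# Other adults: LOAEL 14,000 ug/d (hepatic effects), UF 5.
adults = upper_level(14000, 5)   # 2800 ug/d, reported as 3000 ug/d
```

Note that the SCF reached the same 3000 µg/d figure from the same study with an uncertainty factor of 1, illustrating the inconsistency in factor selection that this paper highlights.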
In the United Kingdom, the EVM (7) concluded that the data were not adequate for the establishment of a safe upper intake level and gave advice similar to that from the SCF, but pointed out that the risk of bone fracture increased over the normal range of intakes for adults in the United Kingdom. The shift in emphasis from the IOM, to the SCF, and to the EVM reflects an increase in the number of publications investigating an association between vitamin A intake and bone fracture (43–48).

β-Carotene. β-Carotene has been studied extensively in animal studies in relation to its use as a food color; it is largely nontoxic and is not associated with teratogenicity. Because of its antioxidant properties it has been used as a chemoprevention agent to reduce the risk of development of cancer in high-risk groups, such as cigarette smokers. The hypothesis was that oxidative stress contributes to cancer development and that prevention of oxidative damage may delay or prevent the development of lung cancer. Four major β-carotene intervention trials have been conducted, and 3 of these were performed using subjects with adequate β-carotene intakes. The studies differed slightly in their designs and differed significantly in their outcomes [a review has been presented by Woutersen et al. (49)]. The ATBC trial (α-tocopherol β-carotene prevention study) investigated a total of 29,133 male smokers who were given α-tocopherol (50 mg), β-carotene (20 mg), α-tocopherol plus β-carotene, or placebo. At the end of the study (after an average of 6 y) there was a significant increase in the incidence of lung cancer in subjects given supplemental β-carotene. A similar outcome was found in the CARET study (β-carotene and retinol efficacy trial), which included 18,314 male and female smokers and asbestos-exposed workers who were given retinol (25,000 international units) plus β-carotene (30 mg) or placebo.
The PHS study (United States Physicians' Health Study) administered aspirin (325 mg), β-carotene (50 mg), aspirin plus β-carotene, or a placebo to 22,071 healthy American male physicians, of whom 11% were smokers. There was no measurable effect on cancer incidence: neither an increased risk, as found in the other 2 studies, nor the decreased risk predicted at the onset of the study. Given the enormous sample sizes of these studies and the adverse outcomes in 2 of the 3 intervention trials, it is unlikely that there will be future studies on the dose-response relation for β-carotene and its adverse effects on the development of lung cancer in cigarette smokers. As a consequence, a risk assessment must be made on the basis of the present data. Recent studies with a ferret model (50) have indicated that β-carotene has short-term effects on the ferret lung, and this may provide information on the dose-response relationship, which can then be extrapolated to humans. In the United States, the IOM concluded that the safety of supplements containing β-carotene could not be ensured. The European SCF (51) concluded that a tolerable upper intake level could not be set because of the absence of suitable dose-response data. In the United Kingdom, the EVM (7) established, from the findings of the large supplementation trials, a safe upper level based on a LOAEL of 20 mg/d and an uncertainty factor of 3 because a NOAEL was not identified.

Pyridoxine. Pyridoxine (vitamin B-6) is an essential cofactor, in the form of pyridoxal phosphate, for a number of enzymes involved in transamination, deamination, decarboxylation, and desulfuration reactions. Pyridoxine deficiency leads to retarded growth, acrodynia, alopecia, and skeletal changes, as well as neurological effects such as seizures and convulsions. The daily requirement for pyridoxine is related to protein intake and is equivalent to ~2 to 3 mg/d (36). The principal hazard associated with pyridoxine is neurotoxicity and neuronal degeneration, which were initially identified from the results of studies with experimental animals.
Doses associated with toxicity in animals are >25 mg/kg of body weight/d, indicating a wide safety margin (about 500-fold) between normal dietary intakes and potentially toxic doses. However, pyridoxine illustrates the problems that can arise when vitamins are used at very high doses for the treatment of clinical conditions. Pyridoxine has been used to treat a number of conditions at doses up to and exceeding 1 g/d, and these clinical uses have given rise to anecdotal reports of neurotoxicity in humans. The first report (52) described 7 patients who had been taking 2–6 g of pyridoxine daily for between 2 and 40 mo. Subsequent anecdotal reports described cases with daily intakes of 200 mg or more (53,54). These data cannot be used to define the dose-response relationship, because the number of subjects taking such doses is unknown and the incidence cannot be calculated. Berger et al. (55) gave either 1 or 3 g/d of pyridoxine to 5 volunteers until signs of clinical or laboratory abnormalities were detected. That study showed sensory symptoms and quantitative sensory threshold abnormalities in all subjects. Under these controlled high-dose conditions, sensory and neurological effects were not detected until after many months of treatment. There was an inverse relationship between the dosage and the duration of intake before the onset of symptoms. This inverse relationship was also apparent in the original report by Schaumburg et al. (52). A number of publications have reported no adverse neurological effects in patients receiving <500 mg/d of pyridoxine (56–63). Many of these studies were of short duration (<6 mo), which would be insufficient time to produce toxicity even at higher doses, and therefore they do not provide evidence of the safety of low intakes.
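The inverse relation between pyridoxine dose and time to symptom onset can be caricatured as a constant cumulative-dose threshold, so that time to onset scales as 1/dose. This is purely an illustration of the qualitative pattern described above; the threshold value below is invented, not fitted to the clinical data:

```python
# Illustrative only: assume symptoms appear once cumulative intake crosses a
# fixed threshold, so time-to-onset ~ 1/daily dose. THRESHOLD_G is invented.
THRESHOLD_G = 180.0  # hypothetical cumulative grams before sensory symptoms


def months_to_onset(daily_dose_g: float, days_per_month: float = 30.0) -> float:
    """Months until the hypothetical cumulative threshold is reached."""
    return THRESHOLD_G / (daily_dose_g * days_per_month)
```

Under this caricature, a 6 g/d intake reaches the threshold 3 times faster than a 2 g/d intake, mirroring the qualitative inverse dose-duration pattern reported by Schaumburg et al. (52) and Berger et al. (55). It also makes plain why trials of <6 mo cannot establish the safety of lower doses: the threshold would simply not yet have been reached.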
Brush (58) reported a low incidence of possible pyridoxine-related side effects (5 subjects with tingling or numbness, or both) in a cohort of 336 subjects who had taken 200 mg/d, but the duration of treatment and other details were not given. Only 1 subject in this cohort reported side effects at an intake of 100–120 mg/d. The main evidence raising doubts about the safety of doses <200 mg/d comes from a study in which a group of 172 women, who were receiving pyridoxine for premenstrual syndrome, were subdivided into those who reported altered sensations in their limbs or skin or had muscle weakness or pain (n = 103) and those who did not (64). Comparison of the 2 subgroups showed that those who reported symptoms had been taking pyridoxine for a significantly greater duration (2.9 y) than those without symptoms (1.6 y). There was no significant difference in the average intakes of vitamin B-6 between the 2 groups (117 and 116 mg/d, respectively), although a higher proportion of patients with symptoms (70%, compared with 55% of patients without symptoms) had serum vitamin B-6 levels >34 µg/L. This study has been heavily criticized because of the absence of an appropriate control group and the subjective nature of the symptoms described. However, it is difficult to ignore these data, given the long duration of intake by the patients who reported symptoms and the reversal of symptoms on the cessation of intake. Some of the patients in the study had intakes <100 mg/d, and therefore, if the data are taken at face value, an intake of 100 mg/d was not equivalent to a no-effect level (in this study). The case of pyridoxine illustrates well a problem with the available data on adverse effects in both animals and humans and their interpretation in relation to the safety of high doses of vitamins. It is clear from any analysis that normal dietary intakes would be without adverse health effects; it is less clear that intakes of 100–200 mg/d would similarly be without adverse effects. Given the available data, an upper level could be selected somewhere within the range of 10–100 mg/d (the range spanned by the recent recommendations from groups in the United Kingdom and the United States).
A detailed review of all the data indicates that the influence of duration of intake was not given similar weightings in the different evaluations. In the United States, the IOM set a tolerable upper intake level of 100 mg/d on the basis of a NOAEL of 200 mg/d from a number of short-term studies showing no adverse effects and a 2-fold uncertainty factor because of deficiencies in the studies. The IOM highlighted a number of methodological weaknesses in the study of Dalton and Dalton (64) and concluded that the findings were inconsistent with the weight of evidence pertaining to the safety of higher doses of pyridoxine. The report identified a NOAEL of 200 mg/d largely on the basis of the findings of 2 studies (57,65), supported by additional studies that reported no cases of neuropathy among hundreds of individuals given pyridoxine doses of 40–500 mg/d (60,66–68). Careful scrutiny of these reports shows that the study durations were too short to have been able to detect the slowly developing neuropathy produced by pyridoxine. Bernstein and Lobitz (57) reported data for 16 patients who received doses of 150 mg/d with a duration of intake of up to 6 mo (only 5 subjects were studied after 5 mo), whereas Del Tredici et al. (65) studied 24 patients for 4 mo. The large number of subjects evaluated in the other studies used to support the NOAEL of 200 mg/d was largely due to the work of Brush and colleagues (60) (which was not unequivocally without possible adverse effects; see above), whereas Ellis et al. (66) reported on the findings for 22 subjects [Ellis (69) discusses data for 35 subjects], Mitwalli et al. (67) gave detailed data for only 7 subjects (but these subjects had been treated for 2.8 y), and Tolis et al. (68) provided information on 9 patients. The European SCF (70) set a tolerable upper intake level of 25 mg/d on the basis of the average intake in the study of Dalton and Dalton (64) with an uncertainty factor of 4 to allow
for the fact that the intake was a possible effect level (2-fold) and for deficiencies in the database (2-fold). In the United Kingdom, the EVM (7) considered that none of the human data were adequate for the establishment of a numerical upper level and derived a safe upper level of 10 mg/d by using the LOAEL from a study with dogs (50 mg/kg of body weight/d) with a 300-fold uncertainty factor to allow for LOAEL-to-NOAEL extrapolation (3-fold) and for the use of data from animal studies to predict a safe intake for humans (100-fold).

Niacin. Niacin is a term used to describe nicotinic acid and nicotinamide, both of which have biological activity. Niacin is a precursor of the essential cofactors NAD and NADP, which are involved in a vast array of redox reactions. Niacin is not a true vitamin, because it can be produced in vivo by humans from the catabolism of the essential amino acid tryptophan, and there is no absolute requirement for preformed niacin in the diet. There are interesting parallels between niacin and pyridoxine, because experiments with animals revealed a hazard, in this case hepatotoxicity (71), and the use of very high doses of niacin for therapeutic purposes gave rise to anecdotal reports of severe hepatic toxicity in treated patients. As with pyridoxine, the doses associated with this toxicity were generally many times the nutritional intake; most clinical trials used up to 3 g/d (72). A second, more minor adverse effect, reported at doses of 50 mg or more, was flushing and hypotension (3). This phenomenon was seen only with nicotinic acid and not with nicotinamide, and occurred to a greater extent when nicotinic acid was taken on an empty stomach. As a consequence, the interpretation of the safety of high doses of niacin is complicated by the difference in the chemical forms taken and the influence of food on the generation of the adverse effect.
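The divergent pyridoxine evaluations above all reduce to the same arithmetic: an upper level is a point of departure (NOAEL or LOAEL) divided by a composite uncertainty factor, with animal doses expressed per kg of body weight first scaled to an adult. A minimal sketch follows; the function name and the 60 kg adult body weight used to reproduce the EVM figure are illustrative assumptions, not values taken from the committees' reports.

```python
# Sketch of the upper-level arithmetic behind the three pyridoxine
# evaluations described above. The 60 kg adult body weight for the
# EVM calculation is an assumption chosen here for illustration.

def upper_level(pod_mg_per_day, uncertainty_factor):
    """Upper level (mg/d) = point of departure / composite uncertainty factor."""
    return pod_mg_per_day / uncertainty_factor

# IOM: NOAEL of 200 mg/d, 2-fold factor for deficiencies in the studies
iom = upper_level(200, 2)            # 100 mg/d

# SCF: possible effect level of 100 mg/d, 4-fold factor
# (2 for LOAEL-to-NOAEL extrapolation x 2 for database deficiencies)
scf = upper_level(100, 2 * 2)        # 25 mg/d

# EVM: dog LOAEL of 50 mg/kg bw/d scaled to an assumed 60 kg adult,
# 300-fold factor (3 for LOAEL-to-NOAEL x 100 for animal-to-human)
evm = upper_level(50 * 60, 3 * 100)  # 10 mg/d

print(iom, scf, evm)                 # 100.0 25.0 10.0
```

The 10-fold spread in the resulting upper levels comes almost entirely from the choice of point of departure and of uncertainty factor, which is the inconsistency discussed in the text.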
In the United States, the IOM set a tolerable upper intake level of 35 mg/d for both nicotinic acid and nicotinamide, on the basis of a LOAEL of 50 mg for nicotinic acid–induced flushing and an uncertainty factor of 1.5 to allow for LOAEL-to-NOAEL extrapolation. The European SCF (73) set different tolerable upper intake levels for nicotinic acid (10 mg/d, on the basis of a LOAEL for flushing at 30 mg and an uncertainty factor of 3) and nicotinamide (900 mg/d, on the basis of a NOAEL of 25 mg/kg of body weight/d from a number of studies with children that gave assurance about possible hepatic effects, and an uncertainty factor of 2 because adults might eliminate nicotinamide more slowly than children). In the United Kingdom, the EVM established a guidance level for nicotinic acid of 17 mg/d on the basis of a LOAEL for flushing of 50 mg and a 3-fold factor to allow for the use of a LOAEL, and a guidance level for nicotinamide of 500 mg/d using the same NOAEL as the SCF, but with a 3-fold uncertainty factor.

Selenium. Selenium is a commonly occurring micronutrient that is present in enzymes such as glutathione peroxidase. Intakes <10 μg/d have been associated with selenium deficiency, known as Keshan disease, an endemic juvenile cardiomyopathy occurring in the Keshan region of China. The amounts of selenium present in different geographical areas vary widely; some areas have high selenium intakes (74), whereas other areas, for example, in China, have much lower intakes. The adverse effects reported among subjects living in selenium-rich areas include brittle hair, new hair with no pigment, and thickened and brittle nails with spots and streaks, a condition known as selenosis (75). The data available to define a dose-response relationship for selenium-related adverse effects arise from epidemiological studies and from clinical investigations in which selenium supplementation has been given intentionally.
Epidemiological studies (76,77) indicated adverse effects with selenium intakes from foods of ~900 μg/d, whereas Longnecker et al. (74) reported no adverse effects for intakes up to about 700 μg/d. A clinical study on the effect of selenium in patients with cancer (78) reported no signs of selenosis at an intake of 200 μg/d (in addition to the normal dietary intake). In the United States, the IOM (4) set a tolerable upper intake level of 400 μg/d using a NOAEL of 800 μg/d on the basis of the findings from the studies of Yang et al. (76,77) and Longnecker et al. (74), with an uncertainty factor of 2 to allow for possible human variability in sensitivity compared with that of the study populations. The European SCF (79) set a tolerable upper intake level of 300 μg/d on the basis of a NOAEL of 900 μg/d from the study of Yang et al. (77) and an uncertainty factor of 3 for general uncertainty about the study data. In the United Kingdom, the EVM (7) set a safe upper level of 450 μg/d on the basis of a NOAEL of 900 μg/d in the studies of Yang et al. (76,77) and an uncertainty factor of 2 to allow for the use of a LOAEL for humans that was close to the NOAEL.

Future directions for combined consideration of risks and benefits of micronutrients

It is clear from the descriptions given above that there has been flexibility in the choice of the uncertainty factor used to establish a safe or tolerable upper intake level, and it is equally clear that there has been little consistency in the approaches adopted. To some extent this reflects the generally inadequate nature of the available data, which were not derived for the purposes of risk assessment, but it also reflects the need to ensure that the resulting safe level is nutritionally adequate. Human variability in different biological parameters is best represented as log-normal distributions rather than normal distributions.
The data from a single observation of the incidence of response in a group of subjects, related to either a requirement (benefit) or an excess (toxicity), can be modeled, provided that an assumption is made regarding the coefficient of variation that represents human variability in sensitivity (Fig. 3).

FIGURE 3 Dose-response relationships for the risks due to the absence of benefit or the presence of toxicity. The data were plotted assuming a log-normal distribution. The absence of benefit (equivalent to a deficiency condition) has been plotted assuming coefficients of variation of 10% (thick line) and 15% (thin line), and the toxicity line has been plotted assuming a coefficient of variation of 45%. The intersection of the 2 lines is the optimum intake, provided that the deficiency and the toxicity are of equivalent adversity (see text). ED50, the dose that gives a 50% incidence; CV, coefficient of variation.

The optimum intake is that which minimizes the
incidences of both deficiency and toxicity (provided that the effects are of similar severities). The European Branch of the International Life Sciences Institute (ILSI Europe) Expert Group on Risk-Benefit Analysis for Nutrients Added to Foods has developed an approach that integrates both of the intake-response relationships shown in Figure 3 into a risk-benefit analysis (80). There is an established method for the derivation of the dietary reference intake, which is based on the average requirement plus 2 standard deviations. For the prevention of a deficiency, a coefficient of variation of 10% has a history of use by the IOM in the United States for the establishment of dietary reference intakes, whereas a value of 15% has been used by the SCF in Europe; this difference will affect the position of the optimum (Fig. 3). A similar approach was proposed by the ILSI Europe Expert Group in relation to the dose-response for toxicity at high intakes, with the incidence of adverse effects determined assuming a log-normal distribution with a coefficient of variation of 45% to reflect human variability [see (80) for details]. Data on variability relevant to the specific micronutrient would be used when available; in the absence of such data, a value of about 45% would be a suitable default, because it represents the average for a number of pathways of foreign compound elimination (81), which are probably relevant to the elimination of micronutrients at intakes that exceed the homeostatic range (see above). Derivation of an optimum range from the graphical representation given in Figure 3 would require determination of a suitable cutoff on the log-normal distribution. A complete absence of both deficiency and toxicity is impossible under this model, because these would require intakes of infinity and zero, respectively.
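The model in Figure 3 can be sketched numerically as two log-normal intake-incidence curves whose intersection gives the optimum intake. In the sketch below, the ED50 values (2 and 100, in arbitrary intake units per day) are hypothetical placeholders, not data for any real micronutrient; the coefficients of variation are the 10% (deficiency) and 45% (toxicity) values discussed in the text.

```python
import math

def lognorm_cdf(x, ed50, cv):
    """Incidence at intake x for a log-normal tolerance distribution with
    median ED50 and coefficient of variation cv."""
    sigma = math.sqrt(math.log(1 + cv ** 2))
    z = (math.log(x) - math.log(ed50)) / sigma
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical ED50 values (arbitrary intake units); CVs from the text.
def deficiency_risk(x):
    return 1 - lognorm_cdf(x, ed50=2.0, cv=0.10)   # absence of benefit

def toxicity_risk(x):
    return lognorm_cdf(x, ed50=100.0, cv=0.45)

def optimum_intake(lo=0.1, hi=1000.0, iters=100):
    """Bisect (on a log scale) for the intake where the two risks cross."""
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if deficiency_risk(mid) > toxicity_risk(mid):
            lo = mid
        else:
            hi = mid
    return lo

# Reference-intake analogue: average requirement plus 2 standard deviations
dri = 2.0 * (1 + 2 * 0.10)

print(round(optimum_intake(), 2), round(dri, 2))
```

With these placeholder values the curves cross near an intake of about 4 units/d, where both predicted incidences are vanishingly small; applying the acceptable-incidence cutoffs discussed in the text would widen this single point to an acceptable intake range.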
As outlined above, a value of 2 standard deviations (the 95th percentile) has been used for benefit (the prevention of a deficiency), but selection of a suitable cutoff for toxicity would need to take into account the nature and the level of adversity of the effect. A 5% incidence of subjects with a sensitive biochemical parameter outside the normal range might be acceptable, but a 5% incidence of a potentially irreversible effect such as teratogenicity or neuropathy would not be. The publication of the ILSI Europe Expert Group (80) proposes that the advice provided to risk managers should indicate the incidences of both deficiency and toxicity predicted for different intakes, combined with a description of the severity of the health effects on which the incidence data are based. In this way, the acceptability of a particular incidence can be a risk-management decision that takes into account societal and other considerations, which are not strictly parts of risk characterization. An additional benefit of a structured risk-benefit analysis is that risk assessors will not be asked to set a safe or tolerable upper intake level, which requires risk-management considerations in order to balance a precautionary approach to toxicity with a pragmatic approach to maintaining benefit.

Conclusions

Unlike food additives and pesticides, which require prior approval before they can be used, micronutrients are not subject to any requirement for formal toxicity testing at high doses. Application of the procedures used for other chemicals in food to vitamins and minerals could in some cases increase the incidence of adverse effects, if the inappropriate application of large uncertainty factors were to result in an acceptable intake that caused deficiency. The major problem for groups undertaking risk assessments of micronutrients with the aim of establishing safe or tolerable upper intake levels is that there is a requirement to provide risk
managers with advice on both the risks of excessive exposure and the risks associated with deficiency. As a consequence, the normal risk assessment paradigm for food additives and contaminants should be replaced by some form of risk-benefit analysis, to prevent adverse effects arising from inappropriate advice. Recent reviews have used very low uncertainty factors in some cases, but there is no scientific rationale for these low factors, and the logic of their derivation is not clear. In some cases the uncertainty factors used seem simply to be a means of getting from the doses that characterize the adverse effect to a reasonable and practical intake that will lead to neither excessive toxicity nor deficiency. In such cases the uncertainty factor, which is a part of hazard characterization (82), is being used as a part of risk management. Micronutrients with a narrow range between essentiality and toxicity would be handled more clearly by a descriptive narrative to the risk manager than by a pseudonumerical estimate. In some cases lower than normal uncertainty factors have been applied to an adverse effect despite a very wide separation of the two intake-response curves in Figure 1. It is unclear what nutritional benefit such an approach offers consumers; indeed, it carries the risk of appearing to approve therapeutic uses of vitamins and minerals without the normal requirements for establishing safety, quality, and efficacy. The risk assessment of micronutrients requires a new approach that considers the risks of both deficiency and toxicity. The output of risk assessment should provide the risk manager with sufficient information and advice on the predicted incidences and natures of the adverse health effects that would result from low intakes and from high intakes (80).
The determination of an acceptable range of intake could then be based on the available scientific data and their interpretation (risk characterization), combined with consideration of what would be an acceptable incidence of adverse health effects (risk management).

LITERATURE CITED

1. Committee on Toxicity of Chemicals in Food, Consumer Products, and the Environment. Annual report. London: Department of Health; 1997. 2. Food and Nutrition Board, Institute of Medicine. Dietary reference intakes for calcium, phosphorus, magnesium, vitamin D, and fluoride. Washington, DC: National Academies Press; 1997. Available from: http://www.nap.edu/books/0309063507/html. 3. Food and Nutrition Board, Institute of Medicine. Dietary reference intakes for thiamin, riboflavin, niacin, vitamin B6, folate, vitamin B12, pantothenic acid, biotin, and choline. Washington, DC: National Academies Press; 1998. Available from: http://www.nap.edu/openbook/0309065542/html/. 4. Food and Nutrition Board, Institute of Medicine. Dietary reference intakes for vitamin C, vitamin E, selenium, and carotenoids. Washington, DC: National Academies Press; 2000. Available from: http://www.nap.edu/openbook/0309069351/html/index.html. 5. Food and Nutrition Board, Institute of Medicine. Dietary reference intakes for vitamin A, vitamin K, arsenic, boron, chromium, copper, iodine, iron, manganese, molybdenum, nickel, silicon, vanadium, and zinc. Washington, DC: National Academies Press; 2001. Available from: http://www.nap.edu/books/0309072794/html/. 6. Scientific Committee for Food. European Commission. Outcome of discussions. Available from: http://europa.eu.int/comm/food/fs/sc/scf/outcome_en.html. 7. Expert Group on Vitamins and Minerals. Safe upper levels for vitamins and minerals. London: Food Standards Agency of the United Kingdom; 2003. 8. Nordic Council of Ministers. Risk evaluation of essential trace elements—essential versus toxic levels of intake. Copenhagen: Nordic Council of Ministers; 1995. 9. International Programme on Chemical Safety. Principles and methods for the assessment of risk from essential trace elements. Geneva: World Health Organization; 1999. 10.
Ministère de l'Économie et des Finances, Ministère du Travail et des Affaires Sociales, Ministère de l'Agriculture, de la Pêche et de l'Alimentation. Rapport sur les limites de sécurité dans les consommations alimentaires des vitamines et minéraux. Paris: Ministère de l'Économie et des Finances, Ministère du Travail et des Affaires Sociales, Ministère de l'Agriculture, de la Pêche et de l'Alimentation; 1995.
11. Council for Responsible Nutrition. Vitamin and mineral safety. Washington, DC: Council for Responsible Nutrition; 1997. 12. European Federation of Health Product Manufacturers Associations. Vitamins and minerals. a scientific evaluation of the range of safe intakes. Surrey, UK: The Council of Responsible Nutrition; 1997. 13. Consumers for Health Choice. A summary of the science of vitamin and mineral supplement safety. Brussels: Consumers for Health Choice; 1998. 14. Renwick AG. Exposure estimation, toxicological requirements and risk assessment. In: van der Heijden K, Younes M, Fishbein L, Miller S, editors. International food safety handbook. New York: Marcel Dekker; 1999. p. 59–94. 15. Hill AB. The environment and disease: association or causation? Proc R Soc Med. 1965;58:295–300. 16. Beck BD, Conolly RB, Dourson ML, Guth D, Harris D, Kimmel C, Lewis SC. Improvements in quantitative noncancer risk assessment. Fundam Appl Toxicol. 1993;20:1–14. 17. International Programme on Chemical Safety. Assessing human health risks of chemicals: derivation of guidance values for health-based exposure limits: Environmental Health Criteria no 170. Geneva: World Health Organization; 1994. 18. Vermeire T, Stevenson H, Peiters MN, Rennen M, Slob W, Hakkert BC. Assessment factors for human health risk assessment: a discussion paper. Crit Rev Toxicol. 1999;29:439–90. 19. World Health Organization. Principles for the safety assessment of food additives and contaminants in food: Environmental Health Criteria no. 70. Geneva: World Health Organization; 1987. 20. Dourson ML, Stara JF. Regulatory history and experimental support of uncertainty (safety) factors. Regul Toxicol Pharmacol. 1983;3:224–8. 21. Calabrese EJ. Uncertainty factors and interindividual variation. Regul Toxicol Pharmacol. 1985;5:190–6. 22. Hattis D, Erdreich L, Ballew M. Human variability in susceptibility to toxic chemicals: a preliminary analysis of pharmacokinetic data from normal volunteers. Risk Anal. 
1987;7:415–26. 23. Renwick AG. Safety factors and the establishment of acceptable daily intakes. Food Addit Contam. 1991;8:135–50. 24. Renwick AG. Data-derived safety factors for the evaluation of food additives and environmental contaminants. Food Addit Contam. 1993;10:275–305. 25. Calabrese EJ, Beck BD, Chappell WR. Does the animal-to-human uncertainty factor incorporate interspecies differences in surface area? Regul Toxicol Pharmacol. 1992;15:172–9. 26. Naumann BD, Weideman PA. Scientific basis for uncertainty factors used to establish occupational exposure limits for pharmaceutical active ingredients. Hum Ecol Risk Assess. 1995;1:590–613. 27. Dourson ML, Felter SP, Robinson D. Evolution of science-based uncertainty factors for noncancer risk assessment. Regul Toxicol Pharmacol. 1996;24: 108–20. 28. Renwick AG, Lazarus NR. Human variability and noncancer risk assessment-an analysis of the default uncertainty factor. Regul Toxicol Pharmacol. 1998;27:3–20. 29. Silverman KC, Naumann BD, Holder DJ, Dixit R, Faria EC, Sargent EV, Gallo MA. Establishing data-derived adjustment factors from published pharmaceutical clinical trial data. Hum Ecol Risk Assess. 1999;5:1059–89. 30. World Health Organization, International Programme on Chemical Safety. Guidance document for the use of data in development of chemical-specific adjustment factors (CSAFs) for interspecies differences and human variability in dose/concentration-response assessment. Geneva: WHO; 2001. Available from: http://www.who.int/ipcs/methods/harmonization/areas/uncertainty/en/index.html. 31. Dorne JL, Walton K, Renwick AG. Human variability in glucuronidation in relation to uncertainty factors for risk assessment. Food Chem Toxicol. 2001;39: 1153–73. 32. Dorne JL, Walton K, Slob W, Renwick AG. Human variability in polymorphic CYP2D6 metabolism: is the kinetic default uncertainty factor adequate? Food Chem Toxicol. 2002;40:1633–56. 33. Dorne JL, Walton K, Renwick AG. 
Human variability in CYP3A4 metabolism and CYP3A4-related uncertainty factors. Food Chem Toxicol. 2003;41:201–24. 34. Dorne JL, Walton K, Renwick AG. Polymorphic CYP2C19 and N-acetylation: human variability in kinetics and pathway-related uncertainty factors. Food Chem Toxicol. 2003;41:225–45. 35. Dorne JL, Walton K, Renwick AG. Human variability in the renal elimination of foreign compounds and renal excretion-related uncertainty factors for risk assessment. Food Chem Toxicol. 2004;42:275–98. 36. Scientific Committee for Food. Reports of the Scientific Committee for Food: nutrient and energy intakes for the European Commission, thirty-first series. Brussels: European Commission; 1993. Available from: http://europa.eu.int/comm/ food/fs/sc/scf/reports/scf_reports_31.pdf. 37. Biesalski HK. Comparative assessment of the toxicology of vitamin A and retinoids in man. Toxicology. 1989;57:117–61. 38. Rothman KJ, Moore LL, Singer MR, Nguyen US, Mannino S, Milunsky A. Teratogenicity of high vitamin A intake. N Engl J Med. 1995;333:1369–73. 39. Miller RK, Hendricks AG, Mills JL, Hummler H, Wiegand UW. Periconceptional vitamin A use: how much is teratogenic? Reprod Toxicol. 1998;12:75–88. 40. Buss NE, Tembe EA, Prendergast BD, Renwick AG, George CF. The teratogenic metabolites of vitamin A in women following supplements and liver. Hum Exp Toxicol. 1994;13:33–43. 41. Tembe EA, Honeywell R, Buss NE, Renwick AG. All-trans-retinoic acid in maternal plasma and teratogenicity in rats and rabbits. Toxicol Appl Pharmacol. 1996;141:456–72.

42. Scientific Committee on Food. Opinion of the Scientific Committee on Food on the tolerable upper intake level of preformed vitamin A (retinol and retinyl esters). Brussels: European Commission; 2002. Available from: http://europa. eu.int/comm/food/fs/sc/scf/out145_en.pdf. 43. Freudenheim JL, Johnson NE, Smith EL. Relationship between usual nutrient intake and bone-mineral content of women 35–65 years of age: longitudinal and cross-sectional analysis. Am J Clin Nutr. 1986;44:863–76. 44. Houtkooper LB, Ritenbaugh C, Aickin M, Lohman TG, Going SB, Weber JL, Greaves KA, Boyden TW, Pamenter RW, Hall MC. Nutrients, body composition and exercise are related to change in bone mineral density in pre-menopausal women. J Nutr. 1995;125:1229–37. 45. Melhus H, Michaelsson K, Kindmark A, Bergstrom R, Holmberg L, Mallmin H, Wolk A, Ljunghall S. Excessive dietary intake of vitamin A is associated with reduced bone mineral density and increased risk of hip fracture. Ann Intern Med. 1998;129:770–8. 46. Ballew C, Galuska D, Gillespie C. High serum retinyl esters are not associated with reduced bone mineral density in the Third National Health and Nutrition Examination Survey, 1988–1994. J Bone Miner Res. 2001;16:2306–12. 47. Feskanich D, Singh V, Willett WC, Colditz GA. Vitamin A intake and hip fractures among postmenopausal women. JAMA. 2002;287:47–54. 48. Promislow JH, Goodman-Gruen D, Slymen DJ, Barret-Connor E. Retinol intake and bone mineral density in the elderly: the Rancho Bernado Study. J Bone Miner Res. 2002;17:1349–58. 49. Woutersen RA, Wolterbeek APM, Appel MJ, van den Berg H, Goldbohm RA, Feron VJ. Safety evaluation of synthetic b-carotene. Crit Rev Toxicol. 1999; 29:515–42. 50. Wang XD, Liu C, Bronson RT, Smith DE, Krinsky NI, Russel RM. Retinoid signalling and activator protein 1 expression in ferrets given b-carotene supplements and exposed to tobacco smoke. J Natl Cancer Inst. 1999;91:60–6. 51. Scientific Committee on Food. 
Opinion of the Scientific Committee on Food on the tolerable upper intake level of beta carotene. Brussels: European Commission; 2000. Available from: http://europa.eu.int/comm/food/fs/sc/scf/out80b_en. pdf. 52. Schaumburg H, Kaplan J, Windebank A, Vick N, Rasmus S, Pleasure D, Brown MJ. Sensory neuropathy from pyridoxine abuse. N Engl J Med. 1983; 309:445–8. 53. Berger A, Schaumburg HH. More on neuropathy from pyridoxine abuse. N Engl J Med. 1984;311:986–7. 54. Parry GJ, Bredesen DE. Sensory neuropathy with low-dose pyridoxine. Neurology. 1985;35:1466–8. 55. Berger AR, Schaumburg HH, Schroeder C, Apfel S, Reynolds R. Dose response, coasting, and differential fiber vulnerability in human toxic neuropathy. Neurology. 1992;42:1367–70. 56. Bernstein AL. Vitamin B6 in clinical neurology. Ann N Y Acad Sci. 1990; 585:250–60. 57. Bernstein AL, Lobitz CS. A clinical and electrophysiologic study of the treatment of painful diabetic neuropathies with pyridoxine. In:Leklem J, Reynolds RD, editors. Clinical and physiological applications of vitamin B-6. New York: Alan R. Liss, Inc.; 1988. p. 415–23. 58. Brush MG. Vitamin B-6 treatment of premenstrual syndrome. In: Leklem J, Reynolds RD, editors. Clinical and physiological applications of vitamin B-6. New York: Alan R. Liss, Inc.; 1988. p. 363–79. 59. Brush MG, Perry M. Pyridoxine and the premenstrual syndrome. Lancet. 1985;(8442)1:1399. 60. Brush MG, Bennett T, Hansen K. Pyridoxine in the treatment of premenstrual syndrome: a retrospective survey in 630 patients. Br J Clin Pract. 1988;42: 448–52. 61. Kerr GD. The management of premenstrual syndrome. Curr Med Res Opin. 1977;4:29–34. 62. Day JB. Clinical trials in the premenstrual syndrome. Curr Med Res Opin. 1979;6: suppl 5:40–5. 63. Williams MJ, Harris RI, Dean BC. Controlled trial of pyridoxine in the premenstrual syndrome. J Int Med Res. 1985;13:174–9. 64. Dalton K, Dalton MJT. Characteristics of pyridoxine overdose neuropathy syndrome. Acta Neurol Scand. 1987;76:8–11. 65. 
Del Tredici AM, Bernstein AL, Chinn K. Carpal tunnel syndrome and vitamin B-6 therapy. In: Vitamin B-6: its role in health and disease. New York: Alan R. Liss, Inc.; 1985. p. 459–62. 66. Ellis J, Folkers K, Watanabe T, Kaji M, Saji S, Caldwell JW, Temple CA, Wood FS. Clinical results of a cross-over treatment with pyridoxine and placebo of the carpal tunnel syndrome. Am J Clin Nutr. 1979;32:2040–6. 67. Mitwalli A, Blair G, Oreopoulos DG. Safety of intermediate doses of pyridoxine. Can Med Assoc J. 1984;131:14. 68. Tolis G, Laliberté R, Guyda H, Naftolin F. Ineffectiveness of pyridoxine (B6) to alter secretion of growth hormone and prolactin and absence of therapeutic effects on galactorrhea-amenorrhea syndromes. J Clin Endocrinol Metab. 1977;44:1197–9. 69. Ellis JM. Treatment of carpal tunnel syndrome with vitamin B6. South Med J. 1987;80:882–4. 70. Scientific Committee on Food. Opinion of the Scientific Committee on Food on the tolerable upper intake level of vitamin B6. Brussels: European Commission; 2000. Available from: http://europa.eu.int/comm/food/fs/sc/scf/out80c_en.pdf. 71. Chen KK, Rose CL, Robbins EB. Toxicity of nicotinic acid. Proc Soc Exp Biol Med. 1938;38:241–5. 72. Rader JI, Calvert RJ, Hathcock JN. Hepatic toxicity of unmodified and time-release preparations of niacin. Am J Med. 1992;92:77–81.

73. Scientific Committee on Food. Opinion of the Scientific Committee on Food on the tolerable upper intake levels of nicotinic acid and nicotinamide (niacin). Brussels: European Commission; 2002. Available from: http://europa.eu.int/comm/food/fs/sc/scf/out80j_en.pdf. 74. Longnecker MP, Taylor PR, Levander OA, Howe M, Veillon C, McAdam PA, Patterson KY, Holden JM, Stampfer MJ, et al. Selenium in diet, blood, and toenails in relation to human health in a seleniferous area. Am J Clin Nutr. 1991;53:1288–94. 75. Yang G, Wang S, Zhou R, Sun S. Endemic selenium intoxication of humans in China. Am J Clin Nutr. 1983;37:872–81. 76. Yang G, Zhou R, Yin S, Gu L, Yan B, Liu Y, Liu Y, Li X. Studies of safe maximum daily dietary selenium intake in a seleniferous area in China. I. Selenium intake and tissue selenium levels of the inhabitants. J Trace Elem Electrolytes Health Dis. 1989;3:77–87. 77. Yang G, Yin S, Zhou R, Gu L, Yan B, Liu Y, Liu Y. Studies of safe maximum daily dietary Se-intake in a seleniferous area in China. II. Relation between Se-intake and the manifestation of clinical signs and certain biochemical
alterations in blood and urine. J Trace Elem Electrolytes Health Dis. 1989;3: 123–30. 78. Clark LC, Combs GF, Turnbull BW, Slate EH, Chalker DK, Chow J, Davis LS, Glover RA, Graham GF, et al. Effects of selenium supplementation for cancer prevention in patients with carcinoma of the skin. A randomised controlled trial. JAMA. 1996;276:1957–63. 79. Scientific Committee on Food. Opinion of the Scientific Committee on Food on the tolerable upper intake level of selenium. Brussels: European Commission; 2000. Available from: http://europa.eu.int/comm/food/fs/sc/scf/out80g_en.pdf. 80. Renwick AG, Flynn A, Fletcher RJ, Muller DJG, Tuijtelaars S, Verhagen H. Risk–benefit analysis of micronutrients. Food Chem Toxicol. 2004;42:1903–22. 81. Dorne JL, Walton K, Renwick AG. Human variability in xenobiotic metabolism and pathway-related uncertainty factors for chemical risk assessment: a review. Food Chem Toxicol. 2004;43:203–16. 82. Renwick AG, Barlow SM, Hertz-Picciotto I, Boobis AR, Dybing E, Edler L, Eisenbrand G, Greig JB, Kleiner J, et al. Risk characterization of chemicals in food and diet. Food Chem Toxicol. 2003;41:1211–71.