Exposure and Risk


A. Routes of Uptake

In order for a chemical to exert a toxic effect, it must first enter the body. We can be exposed to toxic substances through inhalation, through ingestion of contaminated food or water, or through absorption across the skin (dermal exposure). Data are needed on the concentrations of toxicants in air, drinking water, and food, but these data are often unavailable or out of date. Furthermore, to calculate exposure, the assumed concentration has to be combined with the quantity of, for example, water that is ingested over a period of time by the "average" person. For food intake, there is little information on the quantities of different foodstuffs consumed by different regions, age groups, and ethnic or socioeconomic classes.
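
To make the arithmetic concrete, here is a minimal sketch of the dose calculation described above, written in Python. The concentration, intake rate, and body weight are hypothetical placeholder values, not data from the text; 2 L/day and 70 kg are conventional defaults for the "average" adult.

```python
# A minimal sketch (not from the original text) of the exposure arithmetic
# described above. All numerical values are hypothetical placeholders.

def average_daily_dose(conc_mg_per_L, intake_L_per_day, body_weight_kg):
    """Average daily dose in mg per kg of body weight per day."""
    return conc_mg_per_L * intake_L_per_day / body_weight_kg

# Example: 0.005 mg/L of a toxicant in drinking water, the commonly assumed
# 2 L/day intake, and a 70 kg "average" adult.
dose = average_daily_dose(0.005, 2.0, 70.0)
print(f"Average daily dose: {dose:.2e} mg/(kg*day)")  # ~1.43e-04
```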

Next, the compound must reach one or more sites where toxic effects can be exerted. Along the way it could be partially metabolized, perhaps into a more toxic material. For example, inhaled chloroform is oxidized in the lung to phosgene, a highly reactive and toxic species capable of damaging proteins and DNA. Many known carcinogens are metabolized to more reactive forms in this fashion. The possibilities of immobilization and metabolism (pharmacokinetic effects) complicate the assessment of risk in humans, especially since the rates and routes of metabolism of a foreign compound at the high doses usually administered in laboratory tests may be far different from those observed at low, environmentally realistic exposure concentrations.
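
One common reason for this dose dependence is that metabolic enzymes saturate. The sketch below illustrates the idea with simple Michaelis-Menten kinetics; the rate constants are hypothetical and are not taken from the text.

```python
# A sketch of one pharmacokinetic complication noted above: saturable
# (Michaelis-Menten) metabolism. The kinetic constants are hypothetical.

def metabolism_rate(dose, v_max=10.0, k_m=1.0):
    """Rate of conversion to a metabolite as a function of dose (arbitrary units)."""
    return v_max * dose / (k_m + dose)

for dose in (0.01, 0.1, 1.0, 10.0, 100.0):
    rate = metabolism_rate(dose)
    print(f"dose = {dose:7.2f}   rate = {rate:6.3f}   rate/dose = {rate/dose:7.3f}")

# rate/dose is nearly constant at low doses (first-order behavior) but drops
# sharply at high doses (saturation), so the metabolic fate seen in
# high-dose animal tests may not hold at environmental concentrations.
```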


B. Estimation of Toxicity

Typically, a substance suspected of being hazardous is subjected to experiments with animals to elucidate a dose-response relationship between the amount of the substance administered and the degree of damage observed, whether death, tumor development, birth defects, or some other endpoint. Toxicological data are often scanty, even for rather commonly used chemicals, and the limited data available are used for assessment of risk despite the possibility of false positive (or false negative) findings.
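
As an illustration of how such a dose-response relationship might be summarized, the sketch below fits a sigmoidal (Hill-type) curve to invented bioassay data; both the functional form and all numbers are assumptions for illustration only.

```python
# A sketch of fitting a sigmoidal (Hill-type) dose-response curve to animal
# bioassay results. The doses and response fractions are invented solely
# for illustration.

import numpy as np
from scipy.optimize import curve_fit

def hill(dose, ed50, n):
    """Fraction of animals showing the endpoint at a given dose."""
    return dose**n / (ed50**n + dose**n)

doses = np.array([1.0, 3.0, 10.0, 30.0, 100.0])       # mg/kg (hypothetical)
responses = np.array([0.05, 0.15, 0.45, 0.80, 0.95])  # fraction affected

(ed50, n), _ = curve_fit(hill, doses, responses, p0=(10.0, 1.0))
print(f"estimated ED50 = {ed50:.1f} mg/kg, Hill slope = {n:.2f}")
```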


C. Risk Assessment - the Traditional Approach

The computation of risk to human health for a given substance requires several assumptions related not only to sources of exposure and routes of ingestion, but also to how mathematical modeling is applied to animal toxicity data and how differences in body mass and body surface area should be taken into account. Somewhat systematic approaches to these questions have been formalized. Dose-response data obtained from animal experiments are extrapolated, using various mathematical assumptions, to expected human exposure levels to produce a (hopefully) quantitative assessment of risk. To date, this type of risk assessment has been used largely to predict the likelihood of developing or dying from cancer.

A major unresolved issue in risk assessment is the most appropriate method of extrapolating short-term, high-dose, positive toxicity data to the much lower levels to which people will actually be exposed. The central question is whether toxic compounds have a threshold of activity, a concentration below which harmful effects do not occur. For some types of hazardous compounds, such as carcinogens, it has been considered prudent to assume that there is no threshold and that a plot of the dose-hazard relationship is a straight line through the origin; any measurable dose is then assumed to carry some risk. Other kinds of toxicity data, however, often indicate a lower limit below which no physiological effects are observed. In these cases, the dose-response curve has a "hockey-stick" profile. Complications occur when dose-response data for animals are highly curvilinear, as they often are.
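
The sketch below contrasts the two extrapolation assumptions just described, a linear no-threshold model and a threshold ("hockey-stick") model; the slope and threshold values are hypothetical.

```python
# A sketch contrasting the two low-dose assumptions described above.
# Slope and threshold values are hypothetical.

def linear_no_threshold(dose, slope=0.01):
    """Risk proportional to dose; any dose carries some risk."""
    return slope * dose

def hockey_stick(dose, threshold=5.0, slope=0.01):
    """Zero risk below the threshold; linear increase above it."""
    return 0.0 if dose <= threshold else slope * (dose - threshold)

for dose in (0.0, 1.0, 5.0, 10.0, 50.0):
    print(f"dose = {dose:5.1f}   no-threshold risk = {linear_no_threshold(dose):.3f}"
          f"   threshold risk = {hockey_stick(dose):.3f}")
```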

Another major concern is the extent to which animal toxicity data can, or should, be translated to humans, given the large differences in anatomy, physiology, and metabolism, and therefore in susceptibility, of various animal species to toxicants. Furthermore, animal data are not always reproducible because of variations in experimental design and in the numbers of animals tested.

In the past, the assumption has been made that people should be considered at least as susceptible as the most susceptible animal species. Where these experiments identify a concentration at which no effect is observed (a no-observed-effect level), a "safety" or "uncertainty" factor is applied to that value to calculate acceptable (reduced) concentrations or intake levels to which humans can be assumed to be safely exposed. For example, one suggestion was that a safety factor of 10 be applied when there were valid results from studies of long-term ingestion by humans and no indications of carcinogenicity; a factor of 100 was recommended when no valid human data existed but there were valid studies of long-term effects on animals; and so on.
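
The arithmetic is simple division, as the sketch below shows; the no-observed-effect level used here is a hypothetical value, while the factors of 10 and 100 follow the scheme just described.

```python
# A sketch of the safety-factor arithmetic described above. The
# no-observed-effect level (NOEL) is a hypothetical value; the factors of
# 10 and 100 follow the scheme in the text.

def acceptable_intake(noel_mg_per_kg_day, safety_factor):
    """Acceptable human intake derived from an animal or human NOEL."""
    return noel_mg_per_kg_day / safety_factor

noel = 5.0  # mg/(kg*day), hypothetical

# Valid long-term human ingestion data, no sign of carcinogenicity:
print(acceptable_intake(noel, 10))   # 0.5 mg/(kg*day)
# Only valid long-term animal studies available:
print(acceptable_intake(noel, 100))  # 0.05 mg/(kg*day)
```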

In any event, for carcinogenesis the assumption has usually been made that a "safe" level of human exposure is one at which a person's total lifetime exposure to a compound produces no more than one chance in a million of developing cancer. This is essentially how drinking-water standards (e.g., for lead concentrations) are calculated. Occasionally, however, operational or political concerns have overridden this type of computation. An example of this is the current drinking-water standard for trihalomethanes. If the traditional extrapolation method with safety factors were applied to the data, concentrations in the range of 1-10 ppb would be obtained; however, since such levels are not achievable with common disinfection technology, a higher standard of 100 ppb is currently on the books.
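
The one-in-a-million computation at the start of this paragraph can be sketched as below, assuming a linear no-threshold dose-risk relationship; the slope factor is hypothetical, and 2 L/day and 70 kg are conventional default assumptions.

```python
# A sketch of the one-in-a-million back-calculation described above,
# assuming a linear no-threshold dose-risk relationship. The slope factor
# is hypothetical; 2 L/day and 70 kg are conventional defaults.

target_risk = 1e-6      # acceptable lifetime cancer risk
slope_factor = 0.02     # lifetime risk per mg/(kg*day) of dose, hypothetical
intake_L_per_day = 2.0
body_weight_kg = 70.0

# risk = slope_factor * dose and dose = conc * intake / body_weight, so:
conc_mg_per_L = target_risk * body_weight_kg / (slope_factor * intake_L_per_day)
print(f"allowable concentration: {conc_mg_per_L * 1e3:.2f} ug/L")  # 1.75 ug/L
```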


D. Risk Assessment - a Mechanistic Approach

A more modern approach to risk assessment is one that takes into account some of the genetic and enzymatic processes that contribute to toxicity. These studies are certainly more complex and difficult than the "count-the-dead-bodies" approach of traditional toxicology, but they hold out the eventual promise of more rational assessment of health risks.

One of the most heavily criticized aspects of traditional risk assessment is the "mathematical leap of faith" necessary to project human risk from high-dose animal studies. Mechanistic differences between test animals and humans are only one of many problems with this approach. For example, a number of chemicals cause kidney cancer in male rats only by binding to a protein found in the urine of these animals; this protein is absent from human urine, so the binding does not occur in people. In addition, mice contain a gene that causes them to activate some epigenetic carcinogens (peroxisome proliferators) to initiate liver cancer. Humans also carry the gene, but the activity it induces is only about one-tenth of that in mice, making it far less likely that people would develop cancer by this pathway.

Recently (summer 1999), the USEPA tried to formulate new drinking-water rules using findings such as these. It was suggested that the allowed concentration of chloroform (an epigenetic carcinogen) be raised to 300 ppb from its current level of less than 100 ppb. However, outcries from environmental pressure groups caused the agency to delay adoption of the rule and to roll the level back to its original value. It will be interesting to observe future trends in risk analysis as more becomes known about the molecular mechanisms of toxicity.


Professor Patricia Shapley, University of Illinois, 2010