Pesticide residue levels
The determination of appropriate pesticide residue levels on food items is a
challenging task. In practice, residue levels have been estimated in a variety
of ways, incorporating different assumptions leading to different levels of
uncertainty. Approaches taken in the estimation of residue levels range from
those that may be highly theoretical and assume that all residues are present at
a predetermined level to more complex, data-intensive approaches based on actual
measurements of residue levels at the time the food is ready to be consumed. A
variety of intermediate techniques incorporating data on pesticide use and
actual field residue levels may also be used (Winter, 1992a).
Some of the methods for assessing pesticide residue levels are shown in Figure
2. Approaches requiring the lowest cost
often yield the greatest amount of data but provide the greatest overestimation
of residue levels. As the process is refined to accommodate additional data, the accuracy of the residue estimates improves, but such data is less readily available and the cost of producing it rises significantly.
The most common method of predicting pesticide residue levels relies on the assumption that residues of all pesticides are always present on all food items at the maximum allowable level, known as the tolerance. This theoretical
approach provides enormously exaggerated residue estimates since it fails to
take into account a variety of important factors such as extent of actual
pesticide use, application practices, and post-harvest effects upon residue
levels (Archibald and Winter, 1990). The tolerance values themselves are
artificially high since tolerances are enforcement tools rather than safety
standards and are set to exceed the maximum residue levels expected during legal
application of a pesticide (Winter, 1992b).
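To make the arithmetic concrete, the sketch below (in Python, with entirely hypothetical tolerance and consumption values, not actual regulatory numbers) shows how a tolerance-based, worst-case exposure estimate is assembled: every commodity is assumed to carry residues at its full tolerance.

```python
# Worst-case dietary exposure estimate, assuming residues are always
# present at the tolerance. All values are hypothetical illustrations.

# Tolerances (maximum legal residue), mg pesticide per kg commodity.
tolerance_mg_per_kg = {"apple": 5.0, "potato": 2.0, "lettuce": 10.0}

# Average daily consumption, kg commodity per person per day.
consumption_kg_per_day = {"apple": 0.10, "potato": 0.15, "lettuce": 0.03}

# Exposure is the sum over commodities of tolerance x consumption.
worst_case_mg_per_day = sum(
    tolerance_mg_per_kg[food] * consumption_kg_per_day[food]
    for food in tolerance_mg_per_kg
)
print(f"Theoretical worst-case exposure: {worst_case_mg_per_day:.2f} mg/day")
```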
The accuracy of the residue estimates may be improved by adjusting for the actual extent of pesticide use. While such data is sometimes available, it is often difficult to obtain. As many as 41 individual states have pesticide-use reporting plans, although most suffer from major limitations, such as difficulty in keeping the data up-to-date or selective reporting of specific pesticides and/or of the crops to which they are applied.
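Where use data can be obtained, the worst-case figure can be scaled by the fraction of each crop actually treated. A minimal sketch, continuing with hypothetical values:

```python
# Refining the tolerance-based estimate with pesticide-use data:
# scale each commodity's contribution by the (hypothetical) fraction
# of the crop actually treated with the pesticide.
tolerance_mg_per_kg = {"apple": 5.0, "potato": 2.0, "lettuce": 10.0}
consumption_kg_per_day = {"apple": 0.10, "potato": 0.15, "lettuce": 0.03}
fraction_treated = {"apple": 0.40, "potato": 0.25, "lettuce": 0.60}

adjusted_mg_per_day = sum(
    tolerance_mg_per_kg[food]
    * consumption_kg_per_day[food]
    * fraction_treated[food]
    for food in tolerance_mg_per_kg
)
print(f"Use-adjusted exposure estimate: {adjusted_mg_per_day:.2f} mg/day")
```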
A further refinement involves the substitution of actual field residue data for
theoretical residue values. Most often, this refinement uses results from state
and federal regulatory pesticide monitoring programs designed to enforce
tolerances. Results over the past several years have indicated that the majority
of foods analyzed contained no detectable residues, that residues rarely
exceeded the tolerance levels, and that most residues, when detected, appeared
at only a small fraction of the tolerance level (Winter, 1992a). The monitoring programs, however, cannot routinely detect all of the pesticides that could appear as residues, and the sample sizes for specific pesticide/commodity combinations are not always large enough for the results to be applied to the general food supply.
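Because most monitoring samples contain no detectable residue, substituting monitoring results for theoretical values requires some convention for non-detects. The sketch below, with invented data, uses a common convention (an assumption here, not one drawn from the monitoring programs themselves) of substituting either zero or half the limit of detection:

```python
# Estimating a mean field residue from monitoring data in which most
# samples are non-detects (None). Invented data for illustration.
lod_mg_per_kg = 0.01  # limit of detection
samples_mg_per_kg = [None, None, 0.05, None, 0.12, None, None, 0.02]

def mean_residue(samples, nondetect_value):
    """Mean residue, substituting nondetect_value for non-detects."""
    values = [s if s is not None else nondetect_value for s in samples]
    return sum(values) / len(values)

lower = mean_residue(samples_mg_per_kg, 0.0)                 # non-detects = 0
middle = mean_residue(samples_mg_per_kg, lod_mg_per_kg / 2)  # non-detects = LOD/2
print(f"Mean residue: {lower:.4f} to {middle:.4f} mg/kg")
```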
While field residue data provides a significant improvement over theoretical
estimates of residue levels, it may not accurately represent the residue levels
that consumers are exposed to at the time the foods are eaten. A variety of
factors, such as processing, handling, transportation, cooking, peeling, and washing, have been shown to have dramatic effects upon the ultimate level of
residue reaching the consumer. In most cases, these effects reduce the residue
levels significantly, although residue levels are occasionally increased by
processing, which may also produce pesticide breakdown products of potential
toxicological concern. When available, correction factors accounting for
post-harvest effects may be used to convert field residue data into more
accurate representations of residue levels at the time of consumption. Often,
however, such data is not available.
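When correction factors are available, applying them is a simple multiplication. A sketch with hypothetical factors, where values below 1 reflect losses from washing or cooking and values above 1 would reflect concentration during processing:

```python
# Converting a field residue into an estimate at the time of consumption
# by applying post-harvest correction factors (hypothetical values).
field_residue_mg_per_kg = 0.20

# Fraction of residue remaining after each post-harvest step.
correction_factors = {"washing": 0.7, "cooking": 0.5}

residue_at_consumption = field_residue_mg_per_kg
for step, factor in correction_factors.items():
    residue_at_consumption *= factor

print(f"Estimated residue at consumption: {residue_at_consumption:.3f} mg/kg")
```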
The most accurate method to assess pesticide residue levels is through the
analysis of food at the time of consumption. Studies performed in such a manner
are often called "market-basket" studies. One such study, performed annually by
the U.S. Food and Drug Administration, is known as the Total Diet Study
(Pennington and Gunderson, 1987). In this study, foods are collected from retail
outlets four times each year, once from each of the four geographical areas of
the country. Each collection consists of the purchase of identical foods from
markets in three cities in each geographical area, and the subsamples from each
city are combined to form a market basket sample for analysis. A total of 234 different food items are selected for each market basket, and prior to analysis all of the foods are prepared into table-ready form by institutional kitchens using standard recipes and normal washing, peeling, mixing, and cooking procedures. While such an approach yields superior results, it is very costly and
limited in sample size and pesticide coverage. Due to these factors, concerns have been raised about extrapolating results from the Total Diet Study to the entire U.S. food supply. At the same
time, the Total Diet Study represents the most comprehensive market basket
survey in existence.
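The compositing step in a market-basket design can be illustrated in a few lines. In the sketch below (residue numbers invented), equal subsamples of an identical food item purchased in three cities are combined, so the single analyzed value is effectively the mean across cities:

```python
# Compositing in a market-basket survey: equal subsamples of the same
# food item from three cities are combined into one sample for analysis.
# Residue values are invented for illustration.
subsample_residues_mg_per_kg = {"city_1": 0.030, "city_2": 0.010, "city_3": 0.020}

# Combining equal-mass subsamples averages their concentrations.
composite_mg_per_kg = sum(subsample_residues_mg_per_kg.values()) / len(
    subsample_residues_mg_per_kg
)
print(f"Composite residue for this market basket: {composite_mg_per_kg:.3f} mg/kg")
```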
Food consumption estimates
Determining the amounts of particular food items consumed by the population at large, or by particular population subgroups, is also important in the assessment of dietary pesticide exposure. Multiplication of food consumption
estimates by estimated pesticide residue levels leads to an estimate of
pesticide exposure that can be combined with results from the toxicology
assessment to estimate risks.
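In its simplest form, this calculation multiplies each residue estimate by the corresponding consumption estimate, sums across foods, and expresses the result per unit of body weight for comparison against a toxicological reference value. A minimal sketch with hypothetical inputs:

```python
# Dietary exposure = sum over foods of (residue x consumption), commonly
# normalized by body weight for comparison with a toxicological reference
# value. All inputs are hypothetical.
residue_mg_per_kg = {"apple": 0.02, "potato": 0.01}
consumption_kg_per_day = {"apple": 0.10, "potato": 0.15}
body_weight_kg = 60.0
reference_dose_mg_per_kg_bw_day = 0.005  # hypothetical benchmark

exposure_mg_per_day = sum(
    residue_mg_per_kg[food] * consumption_kg_per_day[food]
    for food in residue_mg_per_kg
)
exposure_mg_per_kg_bw_day = exposure_mg_per_day / body_weight_kg
percent_of_reference = (
    100 * exposure_mg_per_kg_bw_day / reference_dose_mg_per_kg_bw_day
)
print(f"Exposure: {exposure_mg_per_kg_bw_day:.6f} mg/kg bw/day "
      f"({percent_of_reference:.1f}% of the reference dose)")
```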
In the U.S., food consumption estimates are most commonly derived from the
results of U.S. Department of Agriculture (USDA) Nationwide Food Consumption
Surveys. These surveys involve tens of thousands of individuals in the 48
contiguous states as well as Alaska, Hawaii, and Puerto Rico. A stratified probability sampling process is used, and individuals are asked to describe the types and
amounts of foods they consumed, both at home and away from home, for a defined
three-day period. Results enable consumption patterns to be summarized for 10 food groups and 43 food subgroups, broken down by 10 sex-age categories, four income levels, three urbanization categories, two racial groups, and each of the four seasons.
The adequacy of this approach has been called into question. A major limitation
appears to be the sample size for certain subpopulation groups, such as nursing infants, which in the latest (1987-88) survey was considered too small to provide statistical credibility. In addition, the accuracy of surveys of this type is limited by recall bias and recall errors. The 1987-88 survey, in fact, was severely criticized by the U.S. General Accounting Office, which concluded that, due to the survey's low response rate, the data should be used only with the greatest caution (GAO, 1991).
As a result of the problems with the 1987-88 survey, much food consumption data
used for risk assessment purposes still derives from the 1977-78 survey.
Comparisons of 1987-88 and 1977-78 results indicate that fruit and vegetable consumption has increased, as has wine consumption among adults, whereas the consumption of beef and distilled spirits has decreased (Chaisson, 1990).
Differences in food consumption patterns among children are particularly
dramatic. The 1987-88 survey indicated that children consumed 2.7 times as much apple juice as they did in 1977-78 and that their consumption of chicken and turkey had also increased. In contrast, the consumption of beef, pork, milk,
eggs, and cooked fruits and vegetables by children decreased over the 10-year
period (Chaisson, 1990).
In addition to the difficulties of acquiring accurate food consumption estimates from national surveys, individual variation in food
consumption by members of the same population subgroup is also difficult to
account for in the risk assessment process. This is particularly important in
determining acute exposure to pesticides in the diet, since short-term
consumption of particular foods may differ dramatically from average
consumption.
To compensate for some of the inaccuracies in food consumption estimation, it is
often suggested that average levels of consumption be replaced by consumption
estimates of those consuming the greatest amounts of individual food items. This
approach may avoid underestimating the exposures of some members of the population but may also lead to unrealistic exposure estimates for the general population, since it does not account for dietary compensation, in which increased consumption of one food is offset by reduced consumption of others. It is likely that this issue will be
considered in the upcoming NRC report.
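The difference between average-consumer and high-consumer estimates can be illustrated directly. The sketch below (hypothetical consumption records and residue level) computes exposure from mean consumption and from the 95th percentile of individual consumption:

```python
# Comparing exposure based on mean consumption with exposure based on
# a high-end (95th percentile) consumer. All data are hypothetical
# daily apple consumption records, kg per person per day.
daily_apple_kg = [0.0, 0.05, 0.10, 0.12, 0.0, 0.30, 0.08, 0.45, 0.02, 0.15]
residue_mg_per_kg = 0.02  # hypothetical residue estimate

def percentile(values, p):
    """Linear-interpolation percentile, with p in [0, 1]."""
    ordered = sorted(values)
    idx = (len(ordered) - 1) * p
    lo = int(idx)
    hi = min(lo + 1, len(ordered) - 1)
    return ordered[lo] + (ordered[hi] - ordered[lo]) * (idx - lo)

mean_exposure = residue_mg_per_kg * (sum(daily_apple_kg) / len(daily_apple_kg))
high_exposure = residue_mg_per_kg * percentile(daily_apple_kg, 0.95)
print(f"Mean-consumer exposure: {mean_exposure:.5f} mg/day")
print(f"High-consumer exposure: {high_exposure:.5f} mg/day")
```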