Similar Literature (20 results)
1.
We review approaches for characterizing “peak” exposures in epidemiologic studies and methods for incorporating peak exposure metrics in dose–response assessments that contribute to risk assessment. The focus was on potential etiologic relations between environmental chemical exposures and cancer risks. We searched the epidemiologic literature on environmental chemicals classified as carcinogens in which cancer risks were described in relation to “peak” exposures. These articles were evaluated to identify some of the challenges associated with defining and describing cancer risks in relation to peak exposures. We found that definitions of peak exposure varied considerably across studies. Of nine chemical agents included in our review of peak exposure, six had epidemiologic data used by the U.S. Environmental Protection Agency (US EPA) in dose–response assessments to derive inhalation unit risk values. These were benzene, formaldehyde, styrene, trichloroethylene, acrylonitrile, and ethylene oxide. All derived unit risks relied on cumulative exposure for dose–response estimation and none, to our knowledge, considered peak exposure metrics. This is not surprising, given the historical linear no‐threshold default model (generally based on cumulative exposure) used in regulatory risk assessments. With newly proposed US EPA rule language, fuller consideration of alternative exposure and dose–response metrics will be supported. “Peak” exposure has not been consistently defined and rarely has been evaluated in epidemiologic studies of cancer risks. We recommend developing uniform definitions of “peak” exposure to facilitate fuller evaluation of dose response for environmental chemicals and cancer risks, especially where mechanistic understanding indicates that the dose response is unlikely to be linear and that short‐term high‐intensity exposures increase risk.

2.
The selection and use of chemicals and materials with less hazardous profiles reflects a paradigm shift from reliance on risk minimization through exposure controls to hazard avoidance. This article introduces risk assessment and alternatives assessment frameworks in order to clarify a misconception that alternatives assessment is a less effective tool to guide decision making, discusses factors promoting the use of each framework, and also identifies how and when application of each framework is most effective. As part of an assessor's decision process to select one framework over the other, it is critical to recognize that each framework is intended to perform different functions. Although the two frameworks share a number of similarities (such as identifying hazards and assessing exposure), an alternatives assessment provides a more realistic framework with which to select environmentally preferable chemicals because of its primary reliance on assessing hazards and secondary reliance on exposure assessment. Relevant to other life cycle impacts, the hazard of a chemical is inherent, and although it may be possible to minimize exposure (and subsequently reduce risk), it is challenging to assess such exposures through a chemical's life cycle. Through increased use of alternatives assessments at the initial stage of material or product design, there will be less reliance on post facto risk‐based assessment techniques because the potential for harm is significantly reduced, if not avoided, negating the need for assessing risk in the first place.

3.
The awareness of potential risks emerging from the use of chemicals in all parts of daily life has increased the need for risk assessments that are able to cover a high number of exposure situations and thereby ensure the safety of workers and consumers. In the European Union (EU), the practice of risk assessments for chemicals is laid down in a Technical Guidance Document; it is designed to consider environmental and human occupational and residential exposure. Almost 70 EU risk assessment reports (RARs) have been finalized for high-production-volume chemicals during the last decade. In the present study, we analyze the assessment of occupational and consumer exposure to trichloroethylene and phthalates presented in six EU RARs. Exposure scenarios in these six RARs were compared to scenarios used in applications of the scenario-based risk assessment approach to the same set of chemicals. We find that scenarios used in the selected EU RARs to represent typical exposure situations in occupational or private use of chemicals and products do not necessarily represent worst-case conditions. This can be due to the use of outdated information on technical equipment and conditions in workplaces or omission of pathways that can cause consumer exposure. Considering the need for exposure and risk assessments under the new chemicals legislation of the EU, we suggest that a transparent process of collecting data on exposure situations and of generating representative exposure scenarios is implemented to improve the accuracy of risk assessments. Also, the data sets used to assess human exposure should be harmonized, summarized in a transparent fashion, and made accessible for all risk assessors and the public.

4.
Monte Carlo simulations are commonplace in quantitative risk assessments (QRAs). Designed to propagate the variability and uncertainty associated with each individual exposure input parameter in a quantitative risk assessment, Monte Carlo methods statistically combine the individual parameter distributions to yield a single, overall distribution. Critical to such an assessment is the representativeness of each individual input distribution. The authors performed a literature review to collect and compare the distributions used in published QRAs for the parameters of body weight, food consumption, soil ingestion rates, breathing rates, and fluid intake. To provide a basis for comparison, all estimated exposure parameter distributions were evaluated with respect to four properties: consistency, accuracy, precision, and specificity. The results varied depending on the exposure parameter. Even where extensive, well-collected data exist, investigators used a variety of different distributional shapes to approximate these data. Where such data do not exist, investigators have collected their own data, often leading to substantial disparity in parameter estimates and subsequent choice of distribution. The present findings indicate that more attention must be paid to the data underlying these distributional choices. More emphasis should be placed on sensitivity analyses, quantifying the impact of assumptions, and on discussion of sources of variation as part of the presentation of any risk assessment results. If such practices and disclosures are followed, it is believed that Monte Carlo simulations can greatly enhance the accuracy and appropriateness of specific risk assessments. Without such disclosures, researchers will be increasing the size of the risk assessment "black box," a concern already raised by many critics of more traditional risk assessments.
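The propagation step this abstract describes can be sketched in a few lines: sample each exposure input from its own distribution and combine the samples into one output (dose) distribution. The distribution shapes and parameter values below are illustrative assumptions, not values from any of the reviewed QRAs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# illustrative input distributions (all parameters assumed)
conc = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)        # mg/kg in soil
soil_ingest = rng.lognormal(mean=np.log(50.0), sigma=0.7, size=n)  # mg soil/day
body_weight = rng.normal(70.0, 12.0, size=n).clip(min=40.0)      # kg

# combine inputs sample-by-sample to get the overall dose distribution
dose = conc * soil_ingest * 1e-6 / body_weight  # mg/kg-day

p50, p95 = np.percentile(dose, [50, 95])
print(f"median dose: {p50:.2e} mg/kg-day, 95th percentile: {p95:.2e} mg/kg-day")
```

The output is a full distribution, so the assessor can report any percentile of interest rather than a single point estimate.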

5.
Workplace exposures to airborne chemicals are regulated in the U.S. by the Occupational Safety and Health Administration (OSHA) via the promulgation of permissible exposure limits (PELs). These limits, usually defined as eight-hour time-weighted average values, are enforced as concentrations never to be exceeded. In the case of chronic or delayed toxicants, the PEL is determined from epidemiological evidence and/or quantitative risk assessments based on long-term mean exposures or, equivalently, cumulative lifetime exposures. A statistical model was used to investigate the relation between the compliance strategy, the PEL as a limit never to be exceeded, and the health risk as measured by the probability that an individual's long-term mean exposure concentration is above the PEL. The model incorporates within-worker and between-worker variability in exposure, and assumes the relevant distributions to be log-normal. When data are inadequate to estimate the parameters of the full model, as is typically the case in compliance inspections, it is argued that the probability of a random measurement being above the PEL must be regarded as a lower bound on the probability that a randomly selected worker's long-term mean exposure concentration will exceed the PEL. It is concluded that OSHA's compliance strategy is a reasonable, as well as a practical, means of limiting health risk for chronic or delayed toxicants.
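Under the log-normal assumption the abstract invokes, the probability that a single random measurement exceeds the PEL reduces to a normal-tail probability on the log scale. The sketch below is not the authors' full within/between-worker model, just the single-measurement exceedance calculation; the GM, GSD, and PEL values are invented for illustration.

```python
from math import log
from statistics import NormalDist

def exceedance_probability(gm, gsd, pel):
    """P(X > pel) for X ~ lognormal with geometric mean gm and geometric SD gsd."""
    z = (log(pel) - log(gm)) / log(gsd)
    return 1.0 - NormalDist().cdf(z)

# illustrative shift-level exposure distribution and limit (ppm, assumed)
p = exceedance_probability(gm=25.0, gsd=2.5, pel=50.0)
print(f"P(measurement > PEL) = {p:.3f}")
```

Note that when the geometric mean equals the PEL, the exceedance probability is exactly one half regardless of the GSD, since the log-scale distribution is symmetric about its median.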

6.
Typical exposures to lead often involve a mix of long-term exposures to relatively constant exposure levels (e.g., residential yard soil and indoor dust) and highly intermittent exposures at other locations (e.g., seasonal recreational visits to a park). These types of exposures can be expected to result in blood lead concentrations that vary on a temporal scale with the intermittent exposure pattern. Prediction of short-term (or seasonal) blood lead concentrations arising from highly variable intermittent exposures requires a model that can reliably simulate lead exposures and biokinetics on a temporal scale that matches that of the exposure events of interest. If exposure model averaging times (EMATs) of the model exceed the shortest exposure duration that characterizes the intermittent exposure, uncertainties will be introduced into risk estimates because the exposure concentration used as input to the model must be time averaged to account for the intermittent nature of the exposure. We have used simulation as a means of determining the potential magnitude of these uncertainties. Simulations using models having various EMATs have allowed exploration of the strengths and weaknesses of various approaches to time averaging of exposures and impact on risk estimates associated with intermittent exposures to lead in soil. The International Commission of Radiological Protection (ICRP) model of lead pharmacokinetics in humans simulates lead intakes that can vary in intensity over time spans as small as one day, allowing for the simulation of intermittent exposures to lead as a series of discrete daily exposure events. The ICRP model was used to compare the outcomes (blood lead concentration) of various time-averaging adjustments for approximating the time-averaged intake of lead associated with various intermittent exposure patterns. Results of these analyses suggest that standard approaches to time averaging (e.g., U.S. EPA) that estimate the long-term daily exposure concentration can, in some cases, result in substantial underprediction of short-term variations in blood lead concentrations when used in models that operate with EMATs exceeding the shortest exposure duration that characterizes the intermittent exposure. Alternative time-averaging approaches recommended for use in lead risk assessment more reliably predict short-term periodic (e.g., seasonal) elevations in blood lead concentration that might result from intermittent exposures. In general, risk estimates will be improved by simulation on shorter time scales that more closely approximate the actual temporal dynamics of the exposure.
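The underprediction mechanism described here can be demonstrated with a toy one-compartment kinetic model (not the ICRP biokinetic model): driving the same model with the actual intermittent daily intake series versus its long-term average smooths away the seasonal peak in the predicted body burden. The elimination rate constant and intake values are invented.

```python
import math

def simulate(daily_intakes, k=0.05):
    """Body burden under first-order elimination (rate k per day); returns the daily series."""
    burden, series = 0.0, []
    for intake in daily_intakes:
        burden = burden * math.exp(-k) + intake
        series.append(burden)
    return series

days = 365
# intermittent pattern: elevated intake (10 units/day) during a 30-day "season", else 1
intermittent = [10.0 if 150 <= d < 180 else 1.0 for d in range(days)]
# time-averaged input with the same cumulative intake over the year
averaged = [sum(intermittent) / days] * days

peak_int = max(simulate(intermittent))
peak_avg = max(simulate(averaged))
print(f"peak burden: intermittent input {peak_int:.1f}, time-averaged input {peak_avg:.1f}")
```

Both inputs deliver identical cumulative intake, yet the averaged input misses the seasonal spike in body burden entirely, which is the EMAT mismatch the abstract warns about.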

7.
Aggregate exposure metrics based on sums or weighted averages of component exposures are widely used in risk assessments of complex mixtures, such as asbestos-associated dusts and fibers. Allowed exposure levels based on total particle or fiber counts and estimated ambient concentrations of such mixtures may be used to make costly risk-management decisions intended to protect human health and to remediate hazardous environments. We show that, in general, aggregate exposure information alone may be inherently unable to guide rational risk-management decisions when the components of the mixture differ significantly in potency and when the percentage compositions of the mixture exposures differ significantly across locations. Under these conditions, which are not uncommon in practice, aggregate exposure metrics may be "worse than useless," in that risk-management decisions based on them are less effective than decisions that ignore the aggregate exposure information and select risk-management actions at random. The potential practical significance of these results is illustrated by a case study of 27 exposure scenarios in El Dorado Hills, California, where applying an aggregate unit risk factor (from EPA's IRIS database) to aggregate exposure metrics produces average risk estimates about 25 times greater, and of uncertain predictive validity, compared to risk estimates based on specific components of the mixture that have been hypothesized to pose risks of human lung cancer and mesothelioma.
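The core argument can be made concrete with a toy example (the potencies, mixtures, and aggregate unit risk below are invented, not the El Dorado Hills data): two sites with identical total fiber counts, and therefore identical aggregate risk estimates, carry very different component-based risks when their compositions differ.

```python
# assumed component-specific unit risks, per f/cc
unit_risk = {"amphibole": 1e-3, "chrysotile": 1e-5}

def component_risk(mixture):
    """Risk from component concentrations (f/cc) weighted by component potencies."""
    return sum(conc * unit_risk[comp] for comp, conc in mixture.items())

site_a = {"amphibole": 0.09, "chrysotile": 0.01}  # total 0.10 f/cc
site_b = {"amphibole": 0.01, "chrysotile": 0.09}  # total 0.10 f/cc

# aggregate approach: one total count times one assumed aggregate unit risk
aggregate_risk = 0.10 * 5e-4

print(f"site A: {component_risk(site_a):.2e}, site B: {component_risk(site_b):.2e}, "
      f"aggregate (both sites): {aggregate_risk:.2e}")
```

The aggregate metric ranks the two sites as equally hazardous, understating one and overstating the other, so remediation priorities set from it can be systematically misdirected.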

8.
Assessing exposures to hazards in order to characterize risk is at the core of occupational hygiene. Our study examined dropped ceiling systems commonly used in schools and commercial buildings and lay‐in ceiling panels that may have contained asbestos prior to the mid to late 1970s. However, most ceiling panels and tiles do not contain asbestos. Since asbestos risk relates to dose, we estimated the distribution of eight‐hour TWA concentrations and one‐year exposures (a one‐year dose equivalent) to asbestos fibers (asbestos f/cc‐years) for five groups of workers who may encounter dropped ceilings: specialists, generalists, maintenance workers, nonprofessional do‐it‐yourself (DIY) persons, and other tradespersons who are bystanders to ceiling work. Concentration data (asbestos f/cc) were obtained through two exposure assessment studies in the field and one chamber study. Bayesian and stochastic models were applied to estimate distributions of eight‐hour TWAs and annual exposures (dose). The eight‐hour TWAs for all work categories were below current and historic occupational exposure limits (OELs). Exposures to asbestos fibers from dropped ceiling work would be categorized as “highly controlled” for maintenance workers and “well controlled” for remaining work categories, according to the American Industrial Hygiene Association exposure control rating system. Annual exposures (dose) were found to be greatest for specialists, followed by maintenance workers, generalists, bystanders, and DIY. On a comparative basis, modeled dose and thus risk from dropped ceilings for all work categories were orders of magnitude lower than published exposures for other sources of banned friable asbestos‐containing building material commonly encountered in construction trades.

9.
Assessments of aggregate exposure to pesticides and other surface contamination in residential environments are often driven by assumptions about dermal contacts. Accurately predicting cumulative doses from realistic skin contact scenarios requires characterization of exposure scenarios, skin surface loading and unloading rates, and contaminant movement through the epidermis. In this article we (1) develop and test a finite-difference model of contaminant transport through the epidermis; (2) develop archetypal exposure scenarios based on behavioral data to estimate characteristic loading and unloading rates; and (3) quantify 24-hour accumulation below the epidermis by applying a Monte Carlo simulation of these archetypal exposure scenarios. The numerical model, called Transient Transport through the epiDERMis (TTDERM), allows us to account for variable exposure times and time between exposures, temporal and spatial variations in skin and compound properties, and uncertainty in model parameters. Using TTDERM we investigate the use of a macro-activity parameter (cumulative contact time) for predicting daily (24-hour) integrated uptake of pesticides during complex exposure scenarios. For characteristic child behaviors and hand loading and unloading rates, we find that a power law represents the relationship between cumulative contact time and cumulative mass transport through the skin. With almost no loss of reliability, this simple relationship can be used in place of the more complex micro-activity simulations that require activity data on one- to five-minute intervals. The methods developed in this study can be used to guide dermal exposure model refinements and exposure measurement study design.
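A power-law relationship like the one reported here is typically fit by ordinary least squares on the log-log scale. The sketch below demonstrates the procedure on synthetic data generated from an assumed power law with noise (the coefficients are invented and have no connection to the TTDERM results).

```python
import numpy as np

rng = np.random.default_rng(3)

t = np.linspace(10, 600, 60)        # cumulative contact time, minutes (assumed)
a_true, b_true = 0.02, 0.7          # assumed power law: uptake = a * t^b
uptake = a_true * t**b_true * rng.lognormal(0.0, 0.05, t.size)  # synthetic data

# fit log(uptake) = b*log(t) + log(a) by least squares
b_fit, log_a_fit = np.polyfit(np.log(t), np.log(uptake), 1)
a_fit = np.exp(log_a_fit)
print(f"fitted: uptake ~ {a_fit:.3f} * t^{b_fit:.2f}")
```

Because the fit is linear in log space, the exponent and prefactor are recovered directly from the slope and intercept, which is why a power law is such a convenient surrogate for the minute-by-minute micro-activity simulations.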

10.
Risk Analysis, 2018, 38(6): 1223–1238
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide‐handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach.
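The exceedance calculation at the heart of this study can be sketched as follows. Every distribution and the level of concern below are invented placeholders, not values from AHED, PHED, or the Exposure Factors Handbook; the point is only the mechanics of reporting the fraction of simulated handlers above a reference level.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# illustrative handler-exposure inputs (all parameters assumed)
unit_exposure = rng.lognormal(np.log(0.5), 0.8, n)   # mg per lb a.i. handled
amount_handled = rng.lognormal(np.log(40.0), 0.6, n)  # lb a.i. per day
body_weight = rng.normal(80.0, 10.0, n).clip(min=45.0)  # kg

dose = unit_exposure * amount_handled / body_weight   # mg/kg-day
level_of_concern = 1.0                                # mg/kg-day, assumed

frac_exceeding = float((dose > level_of_concern).mean())
print(f"{100 * frac_exceeding:.1f}% of simulated handlers exceed the level of concern")
```

Unlike a deterministic point estimate, this output directly answers the regulatory question "what fraction of the handler population is above the level of concern?"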

11.
Exposure scenarios are a critical part of risk assessment; however, representative scenarios are not generally available for tribal communities where a traditional subsistence lifestyle and diet are relevant and actively encouraged. This article presents portions of a multipathway exposure scenario developed by AESE, Inc. in conjunction with the Spokane Tribal Cultural Resources Program. The scenario serves as the basis for a screening-level reasonable maximum exposure (RME) developed for the Midnite Uranium Mine Superfund site. The process used in developing this scenario balances the need to characterize exposures without revealing proprietary information. The scenario and resulting RME reflect the subsistence use of original and existing natural resources by a hypothetical but representative family living on the reservation at or near the mine site. The representative family lives in a house in a sparsely populated conifer forest, tends a home garden, partakes in a high rate of subsistence activities (hunting, gathering, fishing), uses a sweat lodge daily, has a regular schedule of other cultural activities, and has members employed in outdoor monitoring of natural and cultural resources. The scenario includes two largely subsistence diets based on fish or game, both of which include native plants and home-grown produce. Data gaps and sources of uncertainty are identified. Additional information that risk assessors and agencies need to understand before doing any kind of risk assessment or public health assessment in tribal situations is presented.

12.
The current approach to health risk assessment of toxic waste sites in the U.S. may lead to considerable expenditure of resources without any meaningful reduction in population exposure. Risk assessment methods used generally ignore background exposures and consider only incremental risk estimates for maximally exposed individuals. Such risk estimates do not address true public health risks to which background exposures also contribute. The purpose of this paper is to recommend a new approach to risk assessment and risk management concerning toxic waste sites. Under this new approach, which we have called public health risk assessment, chemical substances would be classified into a level of concern based on the potential health risks associated with typical national and regional background exposures. Site assessment would then be based on the level of concern for the particular pollutants involved and the potential contribution of site contaminants to typical background human exposures. While various problems can be foreseen with this approach, the key advantage is that resources would be allocated to reduce the most important sources of human exposure, and site remediation decisions could be simplified by focussing on exposure assessment rather than questionable risk extrapolations.

13.
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard “point” risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
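The compounding-conservatism effect behind the 99.9th-percentile claim is easy to demonstrate with generic inputs (ten independent lognormal variables, not the case-study data): multiplying each variable's 95th-percentile point estimate produces a value far out in the tail of the full Monte Carlo output distribution.

```python
import numpy as np

rng = np.random.default_rng(11)
n_vars, n_sims = 10, 200_000

# ten independent lognormal exposure variables (illustrative)
samples = rng.lognormal(0.0, 1.0, size=(n_sims, n_vars))
output = samples.prod(axis=1)  # multiplicative exposure model

# deterministic "point estimate": product of per-variable 95th percentiles
point_estimate = np.percentile(samples, 95, axis=0).prod()

# where that point estimate sits in the simulated output distribution
rank = float((output < point_estimate).mean())
print(f"the point-estimate product sits at the {100 * rank:.2f}th percentile")
```

Stacking a 95th-percentile assumption on each of ten variables yields an estimate well beyond the 99.9th percentile of the simulated population, which is the mismatch between intended and actual conservatism that the abstract describes.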

14.
Armand Maul, Risk Analysis, 2014, 34(9): 1606–1617
Microbial risk assessment is dependent on several biological and environmental factors that affect both the exposure characteristics to the biological agents and the mechanisms of pathogenicity involved in the pathogen‐host relationship. Many exposure assessment studies still focus on the location parameters of the probability distribution representing the concentration of the pathogens and/or toxin. However, the mean or median by themselves are insufficient to evaluate the adverse effects that are associated with a given level of exposure. Therefore, the effects on the risk of disease of a number of factors, including the shape parameters characterizing the distribution patterns of the pathogen in their environment, were investigated. The statistical models, which were developed to provide a better understanding of the factors influencing the risk, highlight the role of heterogeneity and its consequences on the commonly used risk assessment paradigm. Indeed, the heterogeneity characterizing the spatial and temporal distribution of the pathogen and/or the toxin contained in the water or food consumed is shown to be a major factor that may influence the magnitude of the risk dramatically. In general, the risk diminishes with higher levels of heterogeneity. This scheme is totally inverted in the presence of a threshold in the dose‐response relationship, since heterogeneity will then have a tremendous impact, namely, by magnifying the risk when the mean concentration of pathogens is below the threshold. Moreover, the approach of this article may be useful for risk ranking analysis, regarding different exposure conditions, and may also lead to improved water and food quality guidelines.
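Both regimes described in this abstract can be reproduced numerically with invented parameters: under a concave (exponential, no-threshold) dose-response, increasing heterogeneity at a fixed mean concentration lowers the expected risk, while under a threshold response with the mean below the threshold, heterogeneity raises it.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000
mean_conc, r, threshold = 10.0, 0.01, 30.0  # all values assumed

def lognormal_fixed_mean(mean, gsd, size):
    """Lognormal samples with the given arithmetic mean, regardless of GSD."""
    sigma = np.log(gsd)
    mu = np.log(mean) - 0.5 * sigma**2
    return rng.lognormal(mu, sigma, size)

results = {}
for gsd in (1.5, 4.0):  # low vs high heterogeneity
    c = lognormal_fixed_mean(mean_conc, gsd, n)
    risk_no_threshold = float((1.0 - np.exp(-r * c)).mean())  # exponential model
    risk_threshold = float((c > threshold).mean())            # threshold model
    results[gsd] = (risk_no_threshold, risk_threshold)
    print(f"GSD {gsd}: exponential risk {risk_no_threshold:.4f}, "
          f"threshold risk {risk_threshold:.4f}")
```

The exponential case is Jensen's inequality at work (the dose-response is concave, so spreading the concentration around a fixed mean lowers the expected response); the threshold case inverts this because only the upper tail of the concentration distribution contributes any risk at all.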

15.
Detection of heavy metals at trace or higher levels in foods and food ingredients is not unexpected given the widespread unavoidable presence of several metals in nature, coupled with advancement in analytical methods and lowering limits of detection. To assist risk managers with a rapid risk assessment when facing these situations, a metal dietary exposure screening tool (MDEST) was developed. The tool uses food intake rates based on the National Health and Nutrition Examination Survey 2005–2010 consumption data for the U.S. population aged two years and older and, for infants aged six months to <two years, on the Nestlé Feeding Infants and Toddlers Study, together with existing exposure limits for several frequently detected metals (e.g., inorganic arsenic, cadmium, chromium, lead, and mercury). The tool has data entry fields for detected concentrations and includes algorithms that combine metal levels with consumption data to generate screening‐level exposure estimates, which it then compares to MDEST assigned default portions of the exposure limits in the risk characterization module. As a screening‐level tool, the risk assessment output is intentionally conservative, public health protective, and useful for a rapid assessment to set aside issues that are not of concern. Issues that cannot be readily resolved using this screening tool will need to be further evaluated with more refined input data that are tailored to the specific question or situation under consideration.

16.
This article presents a process for an integrated policy analysis that combines risk assessment and benefit-cost analysis. This concept, which explicitly combines the two types of related analyses, seems to contradict the long-accepted risk analysis paradigm of separating risk assessment and risk management since benefit-cost analysis is generally considered to be a part of risk management. Yet that separation has become a problem because benefit-cost analysis uses risk assessment results as a starting point and considerable debate over the last several years focused on the incompatibility of the use of upper bounds or "safe" point estimates in many risk assessments with benefit-cost analysis. The problem with these risk assessments is that they ignore probabilistic information. As advanced probabilistic techniques for risk assessment emerge and economic analysts receive distributions of risks instead of point estimates, the artificial separation between risk analysts and the economic/decision analysts complicates the overall analysis. In addition, recent developments in countervailing risk theory suggest that combining the risk and benefit-cost analyses is required to fully understand the complexity of choices and tradeoffs faced by the decisionmaker. This article also argues that the separation of analysis and management is important, but that benefit-cost analysis has been wrongly classified into the risk management category and that the analytical effort associated with understanding the economic impacts of risk reduction actions needs to be part of a broader risk assessment process.

17.
The volume and variety of manufactured chemicals is increasing, although little is known about the risks associated with the frequency and extent of human exposure to most chemicals. The EPA and the recent signing of the Lautenberg Act have both signaled the need for high-throughput methods to characterize and screen chemicals based on exposure potential, such that more comprehensive toxicity research can be informed. Prior work of Mitchell et al. using multicriteria decision analysis tools to prioritize chemicals for further research is enhanced here, resulting in a high-level chemical prioritization tool for risk-based screening. Reliable exposure information is a key gap in currently available engineering analytics to support predictive environmental and health risk assessments. An elicitation with 32 experts informed relative prioritization of risks from chemical properties and human use factors, and the values for each chemical associated with each metric were approximated with data from EPA's CP_CAT database. Three different versions of the model were evaluated using distinct weight profiles, resulting in three different ranked chemical prioritizations with only a small degree of variation across weight profiles. Future work will aim to include greater input from human factors experts and better define qualitative metrics.

18.
Very little quantitative analysis is currently available on the cumulative effects of exposure to multiple hazardous agents that have either similar or different mechanisms of action. Over the past several years, efforts have been made to develop the methodologies for risk assessment of chemical mixtures, but mixed exposures to two or more dissimilar agents such as radiation and one or more chemical agents have not yet been addressed in any substantive way. This article reviews the current understanding of the health risks arising from mixed exposures to ionizing radiation and specific chemicals. Specifically discussed is how mixed radiation/chemical exposures, when evaluated in aggregation, were linked to chronic health endpoints such as cancer and intermediate health outcomes such as chromosomal aberrations. Also considered is the extent to which the current practices are consistent with the scientific understanding of the health risks associated with mixed-agent exposures. From this the discussion moves to the research needs for assessing the cumulative health risks from aggregate exposures to ionizing radiation and chemicals. The evaluation indicates that essentially no guidance has been provided for conducting risk assessment for two agents with different mechanisms of action (i.e., energy deposition from ionizing radiation versus DNA interactions with chemicals) but similar biological endpoints (i.e., chromosomal aberrations, mutations, and cancer). The literature review also reveals the problems caused by the absence of both the basic science and an appropriate evaluation framework for the combined effects of mixed-agent exposures. This makes it difficult to determine whether there is truly no interaction or somehow the interaction is masked by the scale of effect observation or inappropriate dose-response assumptions.

19.
Children may be more susceptible to toxicity from some environmental chemicals than adults. This susceptibility may occur during narrow age periods (windows), which can last from days to years depending on the toxicant. Breathing rates specific to narrow age periods are useful to assess inhalation dose during suspected windows of susceptibility. Because existing breathing rates used in risk assessment are typically for broad age ranges or are based on data not representative of the population, we derived daily breathing rates for narrow age ranges of children designed to be more representative of the current U.S. children's population. These rates were derived using the metabolic conversion method of Layton (1993) and energy intake data adjusted to represent the U.S. population from a relatively recent dietary survey (CSFII 1994–1996, 1998). We calculated conversion factors more specific to children than those previously used. Both nonnormalized (L/day) and normalized (L/kg-day) breathing rates were derived and found comparable to rates derived using energy estimates that are accurate for the individuals sampled but not representative of the population. Estimates of breathing rate variability within a population can be used with stochastic techniques to characterize the range of risk in the population from inhalation exposures. For each age and age-gender group, we present the mean, standard error of the mean, percentiles (50th, 90th, and 95th), geometric mean, standard deviation, 95th percentile, and best-fit parametric models of the breathing rate distributions. The standard errors characterize uncertainty in the parameter estimate, while the percentiles describe the combined interindividual and intra-individual variability of the sampled population. These breathing rates can be used for risk assessment of subchronic and chronic inhalation exposures of narrow age groups of children.
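The metabolic conversion method attributed to Layton (1993) can be sketched back-of-envelope style: ventilation rate equals energy expenditure times an oxygen uptake factor (H) times a ventilatory equivalent (VQ). The H and VQ values below are commonly cited defaults and the energy intake and body weight are illustrative assumptions; the paper's age-specific conversion factors differ.

```python
H = 0.05          # L of O2 consumed per kJ of energy expended (commonly cited default)
VQ = 27.0         # L of air ventilated per L of O2 consumed (approximate default)
KJ_PER_KCAL = 4.184

def breathing_rate_l_per_day(energy_kcal_per_day):
    """Daily ventilation (L/day) from daily energy intake via metabolic conversion."""
    return energy_kcal_per_day * KJ_PER_KCAL * H * VQ

energy = 1600.0   # kcal/day, e.g., a school-age child (assumed)
weight = 30.0     # kg (assumed)
rate = breathing_rate_l_per_day(energy)
print(f"nonnormalized: {rate:.0f} L/day; normalized: {rate / weight:.0f} L/kg-day")
```

This illustrates why population-representative energy intake data matter: the breathing rate estimate is directly proportional to the energy intake fed into the conversion.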

20.
Linear, no-threshold relationships are typically reported for time series studies of air pollution and mortality. Since regulatory standards and economic valuations typically assume some threshold level, we evaluated the fundamental question of the impact of exposure misclassification on the persistence of underlying personal-level thresholds when personal data are aggregated to the population level in the assessment of exposure-response relationships. As an example, we measured personal exposures to two particle metrics, PM2.5 and sulfate (SO4(2-)), for a sample of lung disease patients and compared these with exposures estimated from ambient measurements. Previous work has shown that ambient:personal correlations for PM2.5 are much lower than for SO4(2-), suggesting that ambient PM2.5 measurements misclassify exposures to PM2.5. We then developed a method by which the measured:estimated exposure relationships for these patients were used to simulate personal exposures for a larger population and then to estimate individual-level mortality risks under different threshold assumptions. These individual risks were combined to obtain the population risk of death, thereby exhibiting the prominence (and the value) of the threshold in the relationship between risk and estimated exposure. Our results indicated that for poorly classified exposures (PM2.5 in this example), population-level thresholds were apparent at lower ambient concentrations than specified common personal thresholds, while for well-classified exposures (e.g., SO4(2-)), the apparent thresholds were similar to these underlying personal thresholds. These results demonstrate that surrogate metrics that are not highly correlated with personal exposures obscure the presence of thresholds in epidemiological studies of larger populations, while exposure indicators that are highly correlated with personal exposures can accurately reflect underlying personal thresholds.
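The smearing mechanism described here can be shown with a minimal simulation (all parameters invented, not the study's measured exposure relationships): personal exposure is modeled as the ambient surrogate times a multiplicative error, and a sharp personal-level threshold is applied. With a poorly correlated surrogate, apparent population response appears at ambient levels well below the personal threshold; with a well-correlated one, it does not.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200_000
threshold = 20.0  # personal-level response threshold (assumed units)

def population_response(ambient, error_gsd):
    """Fraction of the population whose personal exposure exceeds the threshold."""
    personal = ambient * rng.lognormal(0.0, np.log(error_gsd), n)
    return float((personal > threshold).mean())

ambient = 10.0  # half the personal threshold
poor = population_response(ambient, error_gsd=3.0)   # weakly correlated surrogate
good = population_response(ambient, error_gsd=1.1)   # strongly correlated surrogate
print(f"apparent response at ambient = {ambient}: "
      f"poor surrogate {poor:.3f}, good surrogate {good:.4f}")
```

Even though no individual responds below the personal threshold, the poorly classified surrogate shows substantial population response at half that level, so the threshold is invisible in the aggregate exposure-response curve.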
