Similar Articles
1.
The Waste Isolation Pilot Plant (WIPP) is a geological repository for disposal of U.S. defense transuranic radioactive waste. Built and operated by the U.S. Department of Energy (DOE), it is located in the Permian-age salt beds in southeastern New Mexico at a depth of 655 m. Performance assessment for the repository's compliance with the 10,000-year containment standards was completed in 1996, and the U.S. Environmental Protection Agency (EPA) certified in 1998 that the repository meets compliance with the EPA standards 40 CFR 191 and 40 CFR 194. The Environmental Evaluation Group (EEG) review of the DOE's application for certification identified a number of issues. These related to the scenarios, conceptual models, and values of the input parameters used in the calculations. It is expected that these issues will be addressed and resolved during the first 5-year recertification process, which began with the first receipt of waste at WIPP on March 26, 1999, and is scheduled to be completed in March 2004.

2.
Treatment of Uncertainty in Performance Assessments for Complex Systems
When viewed at a high level, performance assessments (PAs) for complex systems involve two types of uncertainty: stochastic uncertainty, which arises because the system under study can behave in many different ways, and subjective uncertainty, which arises from a lack of knowledge about quantities required within the computational implementation of the PA. Stochastic uncertainty is typically incorporated into a PA with an experimental design based on importance sampling and leads to the final results of the PA being expressed as a complementary cumulative distribution function (CCDF). Subjective uncertainty is usually treated with Monte Carlo techniques and leads to a distribution of CCDFs. This presentation discusses the use of the Kaplan/Garrick ordered triple representation for risk in maintaining a distinction between stochastic and subjective uncertainty in PAs for complex systems. The topics discussed include (1) the definition of scenarios and the calculation of scenario probabilities and consequences, (2) the separation of subjective and stochastic uncertainties, (3) the construction of CCDFs required in comparisons with regulatory standards (e.g., 40 CFR Part 191, Subpart B for the disposal of radioactive waste), and (4) the performance of uncertainty and sensitivity studies. Results obtained in a preliminary PA for the Waste Isolation Pilot Plant, an uncertainty and sensitivity analysis of the MACCS reactor accident consequence analysis model, and the NUREG-1150 probabilistic risk assessments are used for illustration.

3.
A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the U.S. Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with importance sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over importance sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction.
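The Monte Carlo CCDF construction described above can be sketched in a few lines. The lognormal release distribution, sample size, and thresholds below are placeholder assumptions for illustration (real PA releases come from mechanistic transport calculations for sampled 10,000-year futures); only the exceedance-probability estimator itself is the standard construction.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for normalized radionuclide releases R over
# n_futures sampled 10,000-year futures (e.g., drilling intrusions).
n_futures = 100_000
releases = rng.lognormal(mean=-4.0, sigma=2.0, size=n_futures)

def ccdf(samples, thresholds):
    """Estimate the CCDF, P(Release > R), at each threshold R."""
    samples = np.sort(samples)
    # Fraction of sampled futures whose release exceeds each threshold.
    return 1.0 - np.searchsorted(samples, thresholds, side="right") / len(samples)

# 40 CFR 191, Subpart B compares the CCDF against points such as
# P(R > 1) < 0.1 and P(R > 10) < 0.001 in normalized release units.
thresholds = np.array([1.0, 10.0])
probs = ccdf(releases, thresholds)
print(probs)
```

The estimated curve is monotonically nonincreasing by construction, so plotting `ccdf` over a log-spaced grid of thresholds reproduces the familiar regulatory comparison plot.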

4.
Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two skeptics acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process with shortcomings that may provide results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two proponents of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The proponents describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA; they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. 
The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public.

5.
David Okrent, Risk Analysis, 1999, 19(5):877-901
This article begins with some history of the derivation of 40 CFR Part 191, the U.S. Environmental Protection Agency (EPA) standard that governs the geologic disposal of spent nuclear fuel and high-level and transuranic radioactive wastes. This is followed by criticisms of the standard that were made by a Sub-Committee of the EPA Science Advisory Board, by the staff of the U.S. Nuclear Regulatory Commission, and by a panel of the National Academies of Science and Engineering. The large disparity in the EPA approaches to regulation of disposal of radioactive wastes and disposal of hazardous, long-lived, nonradioactive chemical waste is illustrated. An examination of the intertwined matters of intergenerational equity and the discounting of future health effects follows, together with a discussion of the conflict between intergenerational equity and intragenerational equity. Finally, issues related to assumptions in the regulations concerning the future state of society and the biosphere are treated, as is the absence of any national philosophy or guiding policy for how to deal with societal activities that pose very long-term risks.

6.
Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the United States. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate, and effects of hazardous chemicals and radioactive materials found at these sites. Although the U.S. Environmental Protection Agency (EPA), the U.S. Department of Energy (DOE), and the U.S. Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. As a result, model selection is currently done on an ad hoc basis. This is administratively ineffective and costly, and can also result in technically inconsistent decision-making. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE, and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study. A mail survey was conducted to identify models in use. The survey was sent to 550 persons engaged in the cleanup of hazardous and radioactive waste sites; 87 individuals responded. They represented organizations including federal agencies, national laboratories, and contractor organizations. The respondents identified 127 computer models that were being used to help support cleanup decision-making. There were a few models that appeared to be used across a large number of sites (e.g., RESRAD). In contrast, the survey results also suggested that most sites were using models which were not reported in use elsewhere. Information is presented on the types of models being used and the characteristics of the models in use. Also shown is a list of models available, but not identified in the survey itself.

7.
At the request of the U.S. Environmental Protection Agency (EPA), the National Research Council (NRC) recently completed a major report, Science and Decisions: Advancing Risk Assessment, that is intended to strengthen the scientific basis, credibility, and effectiveness of risk assessment practices and subsequent risk management decisions. The report describes the challenges faced by risk assessment and the need to consider improvements in both the technical analyses of risk assessments (i.e., the development and use of scientific information to improve risk characterization) and the utility of risk assessments (i.e., making assessments more relevant and useful for risk management decisions). The report tackles a number of topics relating to improvements in the process, including the design and framing of risk assessments, uncertainty and variability characterization, selection and use of defaults, unification of cancer and noncancer dose-response assessment, cumulative risk assessment, and the need to increase EPA's capacity to address these improvements. This article describes and summarizes the NRC report, with an eye toward its implications for risk assessment practices at EPA.

8.
In their regulations, the U.S. Environmental Protection Agency and the U.S. Nuclear Regulatory Commission permit the omission of features, events, or processes with probabilities of less than 10^-4 in 10^4 yr (e.g., a constant frequency of less than 10^-8 per yr) in assessments of the performance of radioactive waste disposal systems. Igneous intrusion (or "volcanism") of a geologic repository at Yucca Mountain for radioactive waste is one disruptive event whose probability has a range of uncertainty that straddles this regulatory criterion, and it is considered directly in performance assessment calculations. A self-sustained nuclear chain reaction (or "criticality") is another potentially disruptive event to consider, although it has never been found to be important in evaluations of radioactive waste disposal since the early 1970s. The thesis of this article is that consideration of the joint event (volcanism and criticality) occurring in any 10,000-year period following closure can be eliminated from performance calculations at Yucca Mountain. The probability of the joint event must be less than the fairly well-accepted but low probability of volcanism. Furthermore, volcanism does not "remove" or "fail" existing hydrologic or geochemical constraints at Yucca Mountain that tend to prevent concentration of fissile material. Prior to general corrosion failure of waste packages, the mean release of fissile mass caused by a low-probability igneous intrusive event is so small that the probability of a critical event is remote, even for highly enriched spent nuclear fuel owned by the U.S. Department of Energy. After widespread failure of packages occurs, the probability of the joint event is less than the probability of criticality because of the very small influence of volcanism on the mean fissile mass release. Hence, volcanism plays an insignificant role in inducing criticality over any 10^4-yr period. We also argue that the Oklo reactors serve as a natural analogue and provide a rough bound of less than 0.10 on the probability of criticality given favorable hydrologic or geochemical conditions on the scale of the repository. Because the product of this bound with the probability of volcanism represents the probability of the joint event, and the product is less than 10^-4 in 10^4 yr, consideration of the joint event can be eliminated from performance calculations.
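The screening arithmetic behind this argument is simple enough to state numerically. The volcanism probability below is an assumed placeholder value (the abstract says only that it straddles the regulatory criterion); the 0.10 criticality bound is the Oklo-based figure quoted above.

```python
# Sketch of the probability-screening argument:
# P(volcanism AND criticality) <= P(volcanism) * P(criticality | favorable conditions)
p_volcanism = 1.7e-4             # assumed placeholder: ~1.7e-8 per yr over 10^4 yr
p_crit_given_conditions = 0.10   # rough Oklo-based bound from the abstract
p_joint_bound = p_volcanism * p_crit_given_conditions

screening_limit = 1e-4           # regulatory screening criterion: 10^-4 in 10^4 yr
print(p_joint_bound, p_joint_bound < screening_limit)
```

Under these assumed numbers the joint-event bound falls below the screening limit, which is the condition the article needs for excluding the joint event from performance calculations.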

9.
Yuri Dublyansky, Risk Analysis, 2007, 27(6):1455-1468
The U.S. Code of Federal Regulations, 10 CFR Part 63, stipulates that the expected performance of the geological high-level nuclear waste repository must be demonstrated through a total system performance assessment (TSPA). The TSPA represents an analysis that identifies all features, events, and processes (FEPs) that might affect the disposal system and examines the effects of the identified FEPs upon the performance of the system. Secondary minerals from the thick unsaturated zone of Yucca Mountain were deposited from waters with temperatures up to 70-90 degrees C. U-Pb dating constrained the ages of the elevated temperatures to the period between 10 and 5-6 million years ago. Relatively youthful circulation of thermal waters (hydrothermal activity) would be of concern for the safety of the disposal facility. A phenomenological model was advanced by the U.S. Department of Energy (DOE), which proposed that the minerals were deposited by infiltrating meteoric waters that were heated upon contact with the bedrock; it was hypothesized that the latter was conductively heated by a shallow silicic magma body. The model rendered the processes responsible for elevated water temperatures inconsequential for the safety of the proposed nuclear waste facility. However, attempts by DOE at validating the model by means of numeric thermal simulations and analogue system observations were unsuccessful. Regulations specify two criteria for exclusion of a FEP from consideration in the TSPA: low probability and low consequence. The lack of a plausible phenomenological model makes it impossible to apply either of these two criteria to the FEP Hydrothermal Activity. Despite the lack of a valid criterion for exclusion, the FEP was excluded from the TSPA. Both the development of DOE's thermal model and the formal FEP analysis were associated with deviations from DOE's quality assurance regulations.

10.
Nuclear facilities have long been seen as the top of the list of locally unwanted land uses (LULUs), with nuclear waste repositories generating the greatest opposition. Focusing on the case of the Waste Isolation Pilot Plant (WIPP) in southern New Mexico, we test competing hypotheses concerning the sources of opposition and support for siting the facility, including demographics, proximity, political ideology, and partisanship, and the unfolding policy process over time. This study tracks the changes of risk perception and acceptance of WIPP over a decade, using measures taken from 35 statewide surveys of New Mexico citizens spanning an 11-year period from fall 1990 to summer 2001. This time span includes periods before and after WIPP became operational. We find that acceptance of WIPP is greater among those whose residences are closest to the WIPP facility. Surprisingly, and contrary to expectations drawn from the broader literature, acceptance is also greater among those who live closest to the nuclear waste transportation route. We also find that ideology, partisanship, government approval, and broader environmental concerns influence support for WIPP acceptance. Finally, the sequence of procedural steps taken toward formal approval of WIPP by government agencies proved to be important to gaining public acceptance, the most significant being the opening of the WIPP facility itself.

11.
Multimedia modelers from the United States Environmental Protection Agency (EPA) and the United States Department of Energy (DOE) collaborated to conduct a detailed and quantitative benchmarking analysis of three multimedia models. The three models—RESRAD (DOE), MMSOILS (EPA), and MEPAS (DOE)—represent analytically-based tools that are used by the respective agencies for performing human exposure and health risk assessments. The study is performed by individuals who participate directly in the ongoing design, development, and application of the models. Model form and function are compared by applying the models to a series of hypothetical problems, first isolating individual modules (e.g., atmospheric, surface water, groundwater) and then simulating multimedia-based risk resulting from contaminant release from a single source to multiple environmental media. Study results show that the models differ with respect to environmental processes included (i.e., model features) and the mathematical formulation and assumptions related to the implementation of solutions. Depending on the application, numerical estimates resulting from the models may vary over several orders of magnitude. On the other hand, two or more differences may offset each other such that model predictions are virtually equal. The conclusion from these results is that multimedia models are complex due to the integration of the many components of a risk assessment, and this complexity must be fully appreciated during each step of the modeling process (i.e., model selection, problem conceptualization, model application, and interpretation of results).

12.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight and lipid content) and uncertainty (e.g., measurement error), they can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, of the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs) using a case study of sediments obtained from the New York-New Jersey Harbor and being evaluated for disposal at an open-water offshore disposal site within the northeast region. The estimates of PCB concentrations in fish and dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food chain biomagnification model developed by Gobas by specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.
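The two-dimensional (nested) sampling idea behind such analyses can be sketched minimally: an outer loop samples *uncertain* parameters, an inner loop samples *variable* ones, and the spread of a given fractile across outer iterations expresses uncertainty about that fractile of the variability distribution. The lognormal parameters below are invented for illustration and are not the Gobas model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_outer, n_inner = 200, 1000
fractiles = []
for _ in range(n_outer):
    # Uncertainty: imperfect knowledge of a biomagnification factor.
    bmf = rng.lognormal(mean=1.0, sigma=0.3)
    # Variability: true heterogeneity in ingestion rate across people.
    ingestion = rng.lognormal(mean=-2.0, sigma=0.5, size=n_inner)
    dose = bmf * ingestion
    # Record the 95th-percentile dose for this uncertainty realization.
    fractiles.append(np.percentile(dose, 95))

# Uncertainty bound on the 95th-percentile (variability) dose.
lo, hi = np.percentile(fractiles, [5, 95])
print(lo, hi)
```

Keeping the two loops separate is what allows the analysis to report, as the abstract does, whether variability across the population exceeds the uncertainty bounds on any particular fractile.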

13.
Model uncertainty is a primary source of uncertainty in the assessment of the performance of repositories for the disposal of nuclear wastes, due to the complexity of the system and the large spatial and temporal scales involved. This work considers multiple assumptions on the system behavior and corresponding alternative plausible modeling hypotheses. To characterize the uncertainty in the correctness of the different hypotheses, the opinions of different experts are treated probabilistically or, alternatively, by the belief and plausibility functions of the Dempster-Shafer theory. A comparison is made with reference to a flow model for the evaluation of the hydraulic head distributions present at a radioactive waste repository site. Three experts are assumed to be available for the evaluation of the uncertainties associated with the hydrogeological properties of the repository and the groundwater flow mechanisms.
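Belief and plausibility in Dempster-Shafer theory can be illustrated with a toy combination of two expert opinions over competing modeling hypotheses; the mass assignments below are invented for the sketch, not the paper's elicited values.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for mass functions keyed by frozensets."""
    combined, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2          # mass assigned to disjoint hypotheses
    # Normalize by the non-conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B = frozenset("A"), frozenset("B")    # two competing flow-model hypotheses
theta = frozenset("AB")                  # frame of discernment ("don't know")
m_expert1 = {A: 0.6, theta: 0.4}         # assumed masses for illustration
m_expert2 = {A: 0.3, B: 0.4, theta: 0.3}

m = combine(m_expert1, m_expert2)
belief_A = sum(v for s, v in m.items() if s <= A)   # evidence implying A
plaus_A = sum(v for s, v in m.items() if s & A)     # evidence not excluding A
print(belief_A, plaus_A)
```

The interval [belief, plausibility] is what distinguishes this treatment from a single probability: belief counts only mass that directly supports a hypothesis, while plausibility also counts the uncommitted mass on the frame.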

14.
This paper presents the history of the Environmental Restoration Priority System (ERPS), a decision aid developed by the U.S. Department of Energy (DOE) to help determine how to allocate funds for cleaning up hazardous waste sites. Although praised in technical peer review, the system was strongly criticized by stakeholders external to the DOE. Ultimately, and in the midst of a National Academy of Sciences review, DOE shelved the system. The rise and fall of ERPS provides useful lessons for analysts hoping to improve risk management in the public sector.

15.
The dose-response analyses of cancer and noncancer health effects of aldrin and dieldrin were evaluated using current methodology, including benchmark dose analysis and the current U.S. Environmental Protection Agency (U.S. EPA) guidance on body weight scaling and uncertainty factors. A literature review was performed to determine the most appropriate adverse effect endpoints. Using current methodology and information, the estimated reference dose values were 0.0001 and 0.00008 mg/kg-day for aldrin and dieldrin, respectively. The estimated cancer slope factors for aldrin and dieldrin were 3.4 and 7.0 (mg/kg-day)^-1, respectively (i.e., about 5- and 2.3-fold lower risk than the 1987 U.S. EPA assessments). Because aldrin and dieldrin are no longer used as pesticides in the United States, they are presumed to be a low priority for additional review by the U.S. EPA. However, because they are persistent and still detected in environmental samples, quantitative risk assessments based on the best available methods are required. Recent epidemiologic studies do not demonstrate a causal association between aldrin and dieldrin and human cancer risk. The proposed reevaluations suggest that these two compounds pose a lower human health risk than currently reported by the U.S. EPA.

16.
Risk Analysis, 2018, 38(1):163-176
The U.S. Environmental Protection Agency (EPA) uses health risk assessment to help inform its decisions in setting national ambient air quality standards (NAAQS). EPA's standard approach is to make epidemiologically-based risk estimates based on a single statistical model selected from the scientific literature, called the "core" model. The uncertainty presented for "core" risk estimates reflects only the statistical uncertainty associated with that one model's concentration-response function parameter estimate(s). However, epidemiologically-based risk estimates are also subject to "model uncertainty," which is a lack of knowledge about which of many plausible model specifications and data sets best reflects the true relationship between health and ambient pollutant concentrations. In 2002, a National Academies of Sciences (NAS) committee recommended that model uncertainty be integrated into EPA's standard risk analysis approach. This article discusses how model uncertainty can be taken into account with an integrated uncertainty analysis (IUA) of health risk estimates. It provides an illustrative numerical example based on risk of premature death from respiratory mortality due to long-term exposures to ambient ozone, which is a health risk considered in the 2015 ozone NAAQS decision. This example demonstrates that use of IUA to quantitatively incorporate key model uncertainties into risk estimates produces a substantially altered understanding of the potential public health gain of a NAAQS policy decision, and that IUA can also produce more helpful insights to guide that decision, such as evidence of decreasing incremental health gains from progressive tightening of a NAAQS.
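The core idea of an IUA, propagating risk through several plausible concentration-response models weighted by assessed probabilities rather than through one "core" model, can be sketched as follows. The slopes, weights, baseline count, and concentration change are all invented for illustration, not the article's values.

```python
import numpy as np

# Plausible log-linear concentration-response slopes (per ppb), including
# a "no effect" model, with assumed probability weights summing to one.
betas = np.array([0.002, 0.004, 0.0, 0.006])
weights = np.array([0.4, 0.3, 0.2, 0.1])
delta_c = 5.0                 # hypothetical ppb reduction in ambient ozone
baseline_deaths = 10_000.0    # hypothetical baseline respiratory deaths

# Deaths avoided under each model, then the probability-weighted IUA mean.
avoided = baseline_deaths * (1.0 - np.exp(-betas * delta_c))
iua_mean = float(weights @ avoided)
core = float(avoided[0])      # what a single "core" model would report
print(core, iua_mean)
```

Beyond shifting the central estimate, carrying the whole weighted ensemble also yields the spread across models, which is the model-uncertainty information the single-model approach omits.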

17.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than applied dose. It allows for the replacement of the default adult and children intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates for their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
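The ratio-based UF that the framework derives can be illustrated with an assumed lognormal tissue-dose distribution standing in for actual PBTK model output: the data-derived intraspecies toxicokinetic UF is simply the 95th/50th percentile ratio of target-tissue dose across the population.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed population distribution of annual-average target-tissue dose;
# sigma is a placeholder, not a chemical-specific PBTK result.
tissue_dose = rng.lognormal(mean=0.0, sigma=0.6, size=100_000)

p50, p95 = np.percentile(tissue_dose, [50, 95])
uf_htk = p95 / p50
print(uf_htk)   # roughly exp(1.645 * sigma) for a lognormal dose
```

For this assumed sigma the ratio comes out near 2.7, i.e., below the default 3.2x toxicokinetic half of the 10-fold intraspecies UF, which is the kind of chemical-specific comparison the framework enables.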

18.
Hybrid Processing of Stochastic and Subjective Uncertainty Data
Uncertainty analyses typically recognize separate stochastic and subjective sources of uncertainty, but do not systematically combine the two, although a large amount of data used in analyses is partly stochastic and partly subjective. We have developed methodology for mathematically combining stochastic and subjective sources of data uncertainty, based on new "hybrid number" approaches. The methodology can be utilized in conjunction with various traditional techniques, such as PRA (probabilistic risk assessment) and risk analysis decision support. Hybrid numbers have been previously examined as a potential method to represent combinations of stochastic and subjective information, but mathematical processing has been impeded by the requirements inherent in the structure of the numbers, e.g., there was no known way to multiply hybrids. In this paper, we will demonstrate methods for calculating with hybrid numbers that avoid the difficulties. By formulating a hybrid number as a probability distribution that is only fuzzily known, or alternatively as a random distribution of fuzzy numbers, methods are demonstrated for the full suite of arithmetic operations, permitting complex mathematical calculations. It will be shown how information about relative subjectivity (the ratio of subjective to stochastic knowledge about a particular datum) can be incorporated. Techniques are also developed for conveying uncertainty information visually, so that the stochastic and subjective components of the uncertainty, as well as the ratio of knowledge about the two, are readily apparent. The techniques demonstrated have the capability to process uncertainty information for independent, uncorrelated data, and for some types of dependent and correlated data. Example applications are suggested, illustrative problems are shown, and graphical results are given.
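One way to compute with such hybrid numbers, roughly in the spirit of "a random distribution of fuzzy numbers," is to carry a Monte Carlo sample of triangular fuzzy numbers and do interval arithmetic on alpha-cuts. Everything below (the distributions, spreads, and triangular shape) is an assumption chosen for illustration, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_hybrid(n, loc, spread):
    """n triangular fuzzy numbers (a, m, b): stochastic mode m, fuzzy half-width."""
    m = rng.normal(loc, 0.1 * loc, size=n)           # stochastic component
    return np.stack([m - spread, m, m + spread], 1)  # subjective fuzziness

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of each triangular fuzzy number at membership alpha."""
    a, m, b = tri[:, 0], tri[:, 1], tri[:, 2]
    return np.stack([a + alpha * (m - a), b - alpha * (b - m)], 1)

def multiply(h1, h2, alpha):
    """Interval product at a given alpha-cut, realization by realization."""
    i1, i2 = alpha_cut(h1, alpha), alpha_cut(h2, alpha)
    prods = np.stack([i1[:, 0] * i2[:, 0], i1[:, 0] * i2[:, 1],
                      i1[:, 1] * i2[:, 0], i1[:, 1] * i2[:, 1]], 1)
    return np.stack([prods.min(1), prods.max(1)], 1)

h1 = sample_hybrid(10_000, loc=5.0, spread=0.5)
h2 = sample_hybrid(10_000, loc=2.0, spread=0.2)
prod = multiply(h1, h2, alpha=0.5)
print(prod[:, 0].mean(), prod[:, 1].mean())   # mean lower/upper interval bounds
```

The output carries both components: the Monte Carlo spread across realizations reflects the stochastic part, while the interval width at each realization reflects the subjective (fuzzy) part.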

19.
As part of its preparation to review a potential license application from the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission (NRC) is examining the performance of the proposed Yucca Mountain nuclear waste repository. In this regard, we evaluated postclosure repository performance using Monte Carlo analyses with an NRC-developed system model that has 950 input parameters, of which 330 are sampled to represent system uncertainties. The quantitative compliance criterion for dose was established by NRC to protect inhabitants who might be exposed to any releases from the repository. The NRC criterion limits the peak-of-the-mean dose, which in our analysis is estimated by averaging the potential exposure at any instant in time for all Monte Carlo realizations, and then determining the maximum value of the mean curve within 10,000 years, the compliance period. This procedure contrasts in important ways with a more common measure of risk based on the mean of the ensemble of peaks from each Monte Carlo realization. The NRC chose the former (peak-of-the-mean) because it more correctly represents the risk to an exposed individual. Procedures for calculating risk in the expected case of slow repository degradation differ from those for low-probability cases of disruption by external forces such as volcanism. We also explored the possibility of risk dilution (i.e., lower calculated risk) that could result from arbitrarily defining wide probability distributions for certain parameters. Finally, our sensitivity analyses to identify influential parameters used two approaches: (1) the ensemble of doses from each Monte Carlo realization at the time of the peak risk (i.e., peak-of-the-mean) and (2) the ensemble of peak doses calculated from each realization within 10,000 years. The latter measure appears to have more discriminatory power than the former for many parameters (based on the greater magnitude of the sensitivity coefficient), but can yield different rankings, especially for parameters that influence the timing of releases.
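The difference between the two risk measures is easy to demonstrate with synthetic dose histories (Gaussian pulses with random peak times and magnitudes, not NRC model output): because individual peaks occur at different times, averaging across realizations first smooths the curve, so the peak of the mean lies below the mean of the peaks.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic dose-vs-time curves: each realization is a Gaussian pulse
# with a random peak time and a lognormally distributed peak dose.
n_real, n_t = 500, 200
t = np.linspace(0.0, 10_000.0, n_t)
peak_times = rng.uniform(2000.0, 8000.0, size=n_real)
peak_doses = rng.lognormal(mean=0.0, sigma=1.0, size=n_real)
doses = peak_doses[:, None] * np.exp(
    -((t[None, :] - peak_times[:, None]) / 800.0) ** 2
)

peak_of_mean = doses.mean(axis=0).max()   # NRC compliance measure
mean_of_peaks = doses.max(axis=1).mean()  # more common alternative
print(peak_of_mean, mean_of_peaks)
```

The peak-of-the-mean is the maximum dose any (average) individual is exposed to at a single moment, whereas the mean-of-peaks averages doses that no one receives at the same time, which is why NRC regards the former as the better representation of individual risk.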

20.
This article presents a general model for estimating population heterogeneity and "lack of knowledge" uncertainty in methylmercury (MeHg) exposure assessments using two-dimensional Monte Carlo analysis. Using data from fish-consuming populations in Bangladesh, Brazil, Sweden, and the United Kingdom, predictive model estimates of dietary MeHg exposures were compared against those derived from biomarkers (i.e., [Hg]hair and [Hg]blood). By disaggregating parameter uncertainty into components (i.e., population heterogeneity, measurement error, recall error, and sampling error) estimates were obtained of the contribution of each component to the overall uncertainty. Steady-state diet:hair and diet:blood MeHg exposure ratios were estimated for each population and were used to develop distributions useful for conducting biomarker-based probabilistic assessments of MeHg exposure. The 5th and 95th percentile modeled MeHg exposure estimates around mean population exposure from each of the four study populations are presented to demonstrate lack of knowledge uncertainty about a best estimate for a true mean. Results from a U.K. study population showed that a predictive dietary model resulted in a 74% lower lack of knowledge uncertainty around a central mean estimate relative to a hair biomarker model, and also in a 31% lower lack of knowledge uncertainty around a central mean estimate relative to a blood biomarker model. Similar results were obtained for the Brazil and Bangladesh populations. Such analyses, used here to evaluate alternative models of dietary MeHg exposure, can be used to refine exposure instruments, improve information used in site management and remediation decision making, and identify sources of uncertainty in risk estimates.
