Similar Documents
20 similar documents found (search time: 15 ms)
1.
In earlier work we assembled a database of classical pharmacokinetic parameters (e.g., elimination half-lives; volumes of distribution) in children and adults. These data were then analyzed to define mean differences between adults and children of various age groups. In this article, we first analyze the variability in half-life observations where individual data exist. The major findings are as follows. The age groups defined in the earlier analysis of arithmetic mean data (0-1 week premature; 0-1 week full term; 1 week to 2 months; 2-6 months; 6 months to 2 years; 2-12 years; and 12-18 years) are reasonable for depicting child/adult pharmacokinetic differences, but data for some of the earliest age groups are highly variable. The fraction of individual children's half-lives observed to exceed the adult mean half-life by more than the 3.2-fold uncertainty factor commonly attributed to interindividual pharmacokinetic variability is 27% (16/59) for the 0-1 week age group, and 19% (5/26) in the 1 week to 2 month age group, compared to 0/87 for all the other age groups combined between 2 months and 18 years. Children within specific age groups appear to differ from adults with respect to the amount of variability and the form of the distribution of half-lives across the population. The data indicate departure from simple unimodal distributions, particularly in the 1 week to 2 month age group, suggesting that key developmental steps affecting drug removal tend to occur in that period. Finally, in preparation for age-dependent physiologically-based pharmacokinetic modeling, nationally representative NHANES III data are analyzed for distributions of body size and fat content. The data from about age 3 to age 10 reveal important departures from simple unimodal distributional forms, in the direction suggesting a subpopulation of children that are markedly heavier than those in the major mode. For risk assessment modeling, this means that analysts will need to consider "mixed" distributions (e.g., two or more normal or log-normal modes) in which the proportions of children falling within the major versus high-weight/fat modes in the mixture change as a function of age. Biologically, the most natural interpretation of this is that these subpopulations represent children who have or have not yet received particular signals for change in growth pattern. These apparently distinct subpopulations would be expected to exhibit different disposition of xenobiotics, particularly those that are highly lipophilic and poorly metabolized.
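The "mixed" distribution idea lends itself to a short numerical sketch. The Python snippet below draws body weights from a two-mode lognormal mixture in which the high-weight mode's share grows with age; all parameters are invented for illustration and are not fitted NHANES III values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_body_weight(age_yr, n=10_000):
    """Two-mode lognormal mixture for body weight (kg); every parameter
    here is an illustrative placeholder, not a fitted NHANES III value."""
    # Assumption: the high-weight mode's mixing proportion grows with age.
    p_high = min(0.05 + 0.02 * age_yr, 0.30)
    in_high_mode = rng.random(n) < p_high
    low = rng.lognormal(mean=np.log(8 + 3 * age_yr), sigma=0.15, size=n)
    high = rng.lognormal(mean=np.log(10 + 4 * age_yr), sigma=0.20, size=n)
    return np.where(in_high_mode, high, low)

weights = sample_body_weight(age_yr=6)
print(f"mean={weights.mean():.1f} kg, 95th pct={np.percentile(weights, 95):.1f} kg")
```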

2.
The appearance of measurement error in exposure and risk factor data potentially affects any inferences regarding variability and uncertainty because the distribution representing the observed data set deviates from the distribution that represents an error-free data set. A methodology for improving the characterization of variability and uncertainty with known measurement errors in data is demonstrated in this article based on an observed data set, known measurement error, and a measurement-error model. A practical method for constructing an error-free data set is presented, and a numerical method based upon bootstrap pairs, incorporating two-dimensional Monte Carlo simulation, is introduced to address uncertainty arising from measurement error in selected statistics. When measurement error is a large source of uncertainty, substantial differences between the distribution representing variability of the observed data set and the distribution representing variability of the error-free data set will occur. Furthermore, the shape and range of the probability bands for uncertainty differ between the observed and error-free data sets. Failure to separately characterize contributions from random sampling error and measurement error will lead to bias in the variability and uncertainty estimates. However, a key finding is that total uncertainty in the mean can be properly quantified even if measurement and random sampling errors cannot be separated. An empirical case study is used to illustrate the application of the methodology.
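A minimal sketch can make the two key points concrete for the additive-error case: the observed variance overstates error-free variability by the (known) error variance, and a bootstrap of the observed mean still captures total uncertainty even without separating the two error sources. This is not the authors' exact bootstrap-pairs algorithm; the lognormal "true" distribution and the error standard deviation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Forward simulation: lognormal "true" variability plus additive measurement error.
n, sd_err = 200, 1.5                        # sd_err assumed known
true = rng.lognormal(mean=2.3, sigma=0.4, size=n)
observed = true + rng.normal(0.0, sd_err, size=n)

# Var(observed) = Var(true) + sd_err**2 for independent additive error,
# so an error-free variance estimate follows by subtraction.
var_true_hat = observed.var(ddof=1) - sd_err**2
print(f"observed var={observed.var(ddof=1):.2f}, "
      f"error-free estimate={var_true_hat:.2f}, actual={true.var(ddof=1):.2f}")

# Bootstrap of the observed mean: total uncertainty (sampling + measurement),
# quantifiable without separating the two sources.
boot_means = [rng.choice(observed, n, replace=True).mean() for _ in range(2000)]
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"95% uncertainty interval for the mean: ({lo:.2f}, {hi:.2f})")
```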

3.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposures to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) Model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emissions sources. A critical assessment was conducted of the methodology and data used in SHEDS-PM for estimation of indoor exposure to ETS. For the residential microenvironment, SHEDS-PM uses a mass-balance approach, which is comparable to best practices. The default inputs in SHEDS-PM were reviewed, and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating. Data regarding the proportion of smokers and "other smokers" and the cigarette emission rate were found to be important. SHEDS-PM does not currently account for in-vehicle ETS exposure; however, in-vehicle ETS-related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more. Therefore, a mass-balance-based methodology for estimating in-vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updating of input data and algorithms related to ETS exposure in the SHEDS-PM model. Interindividual variability in ETS exposure was quantified, and geographic variability in ETS exposure was quantified based on the varying prevalence of smokers in five selected locations in the United States.
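At steady state, the well-mixed mass-balance approach evaluated here reduces to C = E / (V · (ACH + k_dep)). A sketch follows, with the cabin volume, air exchange rate, deposition rate, and cigarette emission rate all assumed for illustration; they are not SHEDS-PM defaults.

```python
def vehicle_ets_pm25(emis_mg_min, vol_m3=3.0, ach_per_hr=20.0, dep_per_hr=0.2):
    """Steady-state well-mixed mass balance: C = E / (V * (ACH + k_dep)).
    All default inputs are illustrative assumptions, not SHEDS-PM values."""
    e_mg_hr = emis_mg_min * 60.0
    conc_mg_m3 = e_mg_hr / (vol_m3 * (ach_per_hr + dep_per_hr))
    return conc_mg_m3 * 1000.0  # ug/m3

# One cigarette emitting ~1.4 mg/min of PM2.5 in a closed cabin (assumed values):
print(f"{vehicle_ets_pm25(1.4):.0f} ug/m3")
```

With these assumed inputs the sketch yields on the order of a thousand μg/m3, consistent with the abstract's point that in-vehicle levels can exceed residential ones by a factor of 10 or more.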

4.
Biomarkers such as DNA adducts have significant potential to improve quantitative risk assessment by characterizing individual differences in metabolism of genotoxins and DNA repair and accounting for some of the factors that could affect interindividual variation in cancer risk. Inherent uncertainty in laboratory measurements and within-person variability of DNA adduct levels over time are putatively unrelated to cancer risk and should be subtracted from observed variation to better estimate interindividual variability of response to carcinogen exposure. A total of 41 volunteers, both smokers and nonsmokers, were asked to provide a peripheral blood sample every 3 weeks for several months in order to specifically assess intraindividual variability of polycyclic aromatic hydrocarbon (PAH)-DNA adduct levels. The intraindividual variance in PAH-DNA adduct levels, together with measurement uncertainty (laboratory variability and unaccounted-for differences in exposure), constituted roughly 30% of the overall variance. An estimated 70% of the total variance was contributed by interindividual variability and is probably representative of the true biologic variability of response to carcinogenic exposure in lymphocytes. The estimated interindividual variability in DNA damage after subtracting intraindividual variability and measurement uncertainty was 24-fold. Interindividual variance was higher (52-fold) in persons who constitutively lack the Glutathione S-Transferase M1 (GSTM1) gene, which is important in the detoxification pathway of PAH. Risk assessment models that do not consider the variability of susceptibility to DNA damage following carcinogen exposure may underestimate risks to the general population, especially for those people who are most vulnerable.
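The separation of intra- from interindividual variance is the classic one-way random-effects decomposition. The sketch below simulates repeated log-adduct measurements and recovers the interindividual component by method of moments; the variance components and design are illustrative, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Balanced one-way random-effects design: repeated log-adduct measurements.
n_people, n_reps = 41, 5
sd_inter, sd_intra = 0.8, 0.45                 # illustrative, not the study's fit
person_means = rng.normal(0, sd_inter, n_people)
logged = person_means[:, None] + rng.normal(0, sd_intra, (n_people, n_reps))

# Method-of-moments estimates from between- and within-person mean squares:
grand = logged.mean()
msb = n_reps * ((logged.mean(axis=1) - grand) ** 2).sum() / (n_people - 1)
msw = ((logged - logged.mean(axis=1, keepdims=True)) ** 2).sum() / (n_people * (n_reps - 1))
var_inter = max((msb - msw) / n_reps, 0.0)

# 95% fold-range implied by the interindividual component alone (lognormal scale):
print(f"interindividual fold-range ~ {np.exp(2 * 1.96 * np.sqrt(var_inter)):.1f}x")
```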

5.
The health‐related damages associated with emissions from coal‐fired power plants can vary greatly across facilities as a function of plant, site, and population characteristics, but the degree of variability and the contributing factors have not been formally evaluated. In this study, we modeled the monetized damages associated with 407 coal‐fired power plants in the United States, focusing on premature mortality from fine particulate matter (PM2.5). We applied a reduced‐form chemistry‐transport model accounting for primary PM2.5 emissions and the influence of sulfur dioxide (SO2) and nitrogen oxide (NOx) emissions on secondary particulate formation. Outputs were linked with a concentration‐response function for PM2.5‐related mortality that incorporated nonlinearities and model uncertainty. We valued mortality with a value of statistical life approach, characterizing and propagating uncertainties in all model elements. At the median of the plant‐specific uncertainty distributions, damages across plants ranged from $30,000 to $500,000 per ton of PM2.5, $6,000 to $50,000 per ton of SO2, $500 to $15,000 per ton of NOx, and $0.02 to $1.57 per kilowatt‐hour of electricity generated. Variability in damages per ton of emissions was almost entirely explained by population exposure per unit emissions (intake fraction), which itself was related to atmospheric conditions and the population size at various distances from the power plant. Variability in damages per kilowatt‐hour was highly correlated with SO2 emissions, related to fuel and control technology characteristics, but was also correlated with atmospheric conditions and population size at various distances. Our findings emphasize that control strategies that consider variability in damages across facilities would yield more efficient outcomes.
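The variability finding rests on a simple multiplication chain: emissions reach people via the intake fraction, population exposure becomes excess deaths via a concentration-response slope, and deaths become dollars via the value of statistical life. A deterministic sketch with assumed (not study-specific) inputs:

```python
# Reduced-form damage-per-ton chain: emissions -> population exposure via
# intake fraction -> excess mortality via a linear C-R slope -> dollars via VSL.
# Every number below is an illustrative assumption, not a value from the study.

def damage_per_ton(iF=5e-6, breathing_m3_day=14.5, beta_per_ugm3=0.006,
                   base_mort_per_person_yr=0.008, vsl_usd=9e6):
    grams = 1e6                                   # one metric ton of PM2.5
    intake_g_yr = grams * iF                      # aggregate population intake
    # person-weighted concentration x population (ug/m3 * persons):
    pop_conc = intake_g_yr * 1e6 / (breathing_m3_day * 365.0)
    deaths = pop_conc * beta_per_ugm3 * base_mort_per_person_yr
    return deaths * vsl_usd

print(f"${damage_per_ton():,.0f} per ton PM2.5")
```

With these assumptions the sketch lands around $400,000 per ton, inside the $30,000 to $500,000 range reported above; varying iF alone reproduces most of that spread, which is the abstract's central point.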

6.
Physical property values are used in environmental risk assessments to estimate media and risk-based concentrations. However, considerable variability has recently been reported with such values. To evaluate potential variability in physical parameter values supporting a variety of regulatory programs, eight data sources were chosen for evaluation, and chemicals appearing in at least four sources were selected; 755 chemicals were chosen. In addition, chemicals in seven environmentally important subgroups were also identified for evaluation. Nine parameters were selected for analysis: molecular weight (MolWt), melting point (MeltPt), boiling point (BoilPt), vapor pressure (VP), water solubility (AqSOL), Henry's law constant (HLC), octanol-water partition coefficient (Kow), and diffusion coefficients in air (Dair) and water (Dwater). Results show that while 71% of constituents had equal MolWts across data sources, <3% of the constituents had equivalent parameter values across data sources for AqSOL, VP, or HLC. Considerable dissimilarity between certain sources was also observed. Furthermore, measures of dispersion showed considerable variation in data sets for Kow, VP, AqSOL, and HLC compared to measures for MolWt, MeltPt, BoilPt, or Dwater. The magnitude of the observed variability was also noteworthy. For example, the 95th percentile ratio of maximum/minimum parameter values ranged from 1.0 for MolWt to well over 1.0 × 10^6 for VP and HLC. Risk and exposure metrics also varied by similar magnitudes. Results with environmentally important subgroups were similar. These results show that there is considerable variability in physical parameter values from standard sources, and that the observed variability could affect potential risk estimates and perhaps risk management decisions.
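The headline statistic, the 95th percentile of the max/min ratio across sources, is straightforward to compute once parameter values are tabulated by source. A toy example with made-up Henry's law constants for three chemicals across four hypothetical sources:

```python
import numpy as np

# Illustrative table: one parameter (Henry's law constant, atm-m3/mol) for a
# few chemicals across hypothetical sources A-D. The values are made up.
hlc = {
    "benzene": [5.5e-3, 5.6e-3, 5.5e-3, 5.4e-3],
    "phenol":  [3.3e-7, 4.0e-7, 1.0e-6, 3.9e-7],
    "lindane": [5.1e-6, 1.4e-5, 3.1e-6, 2.6e-6],
}

ratios = {chem: max(vals) / min(vals) for chem, vals in hlc.items()}
print(ratios)
print("95th percentile of max/min ratios:",
      np.percentile(sorted(ratios.values()), 95))
```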

7.
Benzene is myelotoxic and leukemogenic in humans exposed at high doses (>1 ppm, and more definitively above 10 ppm) for extended periods. However, leukemia risks at lower exposures are uncertain. Benzene occurs widely in the work environment and also in indoor air, but mostly below 1 ppm, so assessing the leukemia risks at these low concentrations is important. Here, we describe a human physiologically-based pharmacokinetic (PBPK) model that quantifies tissue doses of benzene and its key metabolites, benzene oxide, phenol, and hydroquinone after inhalation and oral exposures. The model was integrated into a statistical framework that acknowledges sources of variation due to inherent intra- and interindividual variation, measurement error, and other data collection issues. A primary contribution of this work is the estimation of population distributions of key PBPK model parameters. We hypothesized that observed interindividual variability in the dosimetry of benzene and its metabolites resulted primarily from known or estimated variability in key metabolic parameters and that a statistical PBPK model that explicitly included variability in only those metabolic parameters would sufficiently describe the observed variability. We then identified parameter distributions for the PBPK model to characterize observed variability through the use of Markov chain Monte Carlo analysis applied to two data sets. The identified parameter distributions described most of the observed variability, but variability in physiological parameters such as organ weights may also be helpful to faithfully predict the observed human-population variability in benzene dosimetry.
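The calibration step can be illustrated with a much smaller stand-in: a one-compartment decay model in place of the full benzene PBPK model, and random-walk Metropolis (one member of the Markov chain Monte Carlo family) to sample the posterior of a single elimination rate. The data, prior, and proposal scale are all invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for PBPK calibration: C(t) = C0 * exp(-k * t) with lognormal
# observation noise; MCMC samples the posterior of the elimination rate k.
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
obs = 10.0 * np.exp(-0.35 * t) * rng.lognormal(0, 0.1, t.size)

def log_post(k, sd=0.1):
    if k <= 0:
        return -np.inf
    resid = np.log(obs) - np.log(10.0 * np.exp(-k * t))        # lognormal likelihood
    log_prior = -0.5 * ((np.log(k) + 1.0) / 1.0) ** 2          # weak prior on log k
    return -0.5 * np.sum((resid / sd) ** 2) + log_prior

k, lp, samples = 0.2, None, []
lp = log_post(k)
for _ in range(20_000):                                        # random-walk Metropolis
    prop = k + rng.normal(0, 0.03)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        k, lp = prop, lp_prop
    samples.append(k)

print(f"posterior median k = {np.median(samples[5000:]):.3f} 1/h")
```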

8.
The objective of this study was to link arsenic exposure and influenza A (H1N1) infection‐induced respiratory effects to assess the impact of arsenic‐contaminated drinking water on the risk of A (H1N1)‐associated lung function exacerbation. The homogeneous Poisson process was used to approximate the related processes between arsenic exposure and influenza‐associated lung function exacerbation risk. We found that (i) estimated arsenic‐induced forced expiratory volume in 1 second (FEV1) reducing rates ranged from 0.116 to 0.179 mL/μg for ages 15–85 years, (ii) the estimated arsenic‐induced A (H1N1) viral load increasing rate was 0.5 mL/μg, (iii) the estimated A (H1N1) virus‐induced FEV1 reducing rate was 0.10 mL/logTCID50, and (iv) the relationship between arsenic exposure and A (H1N1)‐associated respiratory symptoms scores (RSS) can be described by a Hill model. We show that the maximum RSS at day 2 postinfection for Taiwan, West Bengal (India), and the United States were estimated to be in the severe range, at 0.83, 0.89, and 0.81, respectively, indicating that chronic arsenic exposure and A (H1N1) infection together are most likely to pose a potential risk of lung function exacerbation, although a 50% probability of lung function exacerbation induced by arsenic and influenza infection was within the mild and moderate ranges of RSS at days 1 and 2 postinfection. We concluded that avoidance of drinking arsenic‐containing water could significantly reduce influenza respiratory illness, and that this need will become increasingly urgent as the novel H1N1 pandemic influenza virus infects people worldwide.
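The Hill relationship in finding (iv) has the standard form RSS = RSS_max · C^n / (EC50^n + C^n). A sketch with illustrative (not fitted) parameters:

```python
def hill_rss(conc_ug_L, rss_max=0.85, ec50=120.0, n=1.8):
    """Hill relationship between arsenic concentration in drinking water and
    respiratory symptoms score; rss_max, ec50, and n are illustrative
    placeholders, not the parameters fitted in the study."""
    return rss_max * conc_ug_L**n / (ec50**n + conc_ug_L**n)

for c in (10, 50, 150, 500):
    print(f"{c:4d} ug/L -> RSS = {hill_rss(c):.2f}")
```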

9.
Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5) are quantified via case studies for children in Wake County, NC, for July 2002. The estimated mean daily average exposure is 12.9 μg/m3 for simulated children using the stratified population sampling method, and 12.2 μg/m3 using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m3 due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative: in the case studies evaluated here, it led to a 10% higher estimate of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern.
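The sensitivity to longitudinal diary assembly can be demonstrated with a toy pool of daily diaries and two chaining rules: independent redraws each day versus a "sticky" rule that crudely mimics the day-to-day persistence Markov-chain clustering introduces. The pool, population size, and persistence probability are synthetic, and the direction and size of the difference depend on them; the point is only that the repeated-exceedance metric is sensitive to the chaining rule.

```python
import numpy as np

rng = np.random.default_rng(4)

# 30 daily-average exposure "diaries" (ug/m3); 1,000 simulated children, 31 days.
pool = rng.lognormal(np.log(12), 0.5, size=30)
n_kids, n_days, bench = 1000, 31, 25.0

# Rule 1: independent random resampling each day (low day-to-day correlation).
indep = rng.choice(pool, size=(n_kids, n_days))

# Rule 2: sticky resampling, a crude stand-in for Markov-chain clustering:
# keep yesterday's diary with probability 0.8, otherwise redraw.
sticky = np.empty((n_kids, n_days))
sticky[:, 0] = rng.choice(pool, n_kids)
for d in range(1, n_days):
    keep = rng.random(n_kids) < 0.8
    sticky[:, d] = np.where(keep, sticky[:, d - 1], rng.choice(pool, n_kids))

for name, sim in (("independent", indep), ("sticky", sticky)):
    multi = ((sim > bench).sum(axis=1) >= 2).mean()
    print(f"{name}: {multi:.1%} of children exceed {bench} ug/m3 on 2+ days")
```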

10.
Physical property values are used in environmental risk assessments to estimate media and risk-based concentrations. Recently, however, considerable variability has been reported with such values. To evaluate potential variability in physical parameter values supporting a variety of regulatory programs, eight data sources were chosen for evaluation, and chemicals appearing in at least four sources were selected; 755 chemicals were chosen. In addition, chemicals in seven environmentally important subgroups were also identified for evaluation. Nine parameters were selected for analysis: molecular weight (MolWt), melting point (MeltPt), boiling point (BoilPt), vapor pressure (VP), water solubility (AqSOL), Henry's law constant (HLC), octanol-water partition coefficient (Kow), and diffusion coefficients in air (Dair) and water (Dwater). Results show that while 71% of constituents had equal MolWts across data sources, <3% of the constituents had equivalent parameter values across data sources for AqSOL, VP, or HLC. Considerable dissimilarity between certain sources was also observed. Furthermore, measures of dispersion showed considerable variation in data sets for Kow, VP, AqSOL, and HLC compared to measures for MolWt, MeltPt, BoilPt, or Dwater. The magnitude of the observed variability was also noteworthy. For example, the 95th percentile ratio of maximum/minimum parameter values ranged from 1.0 for MolWt to well over 1.0 × 10^6 for VP and HLC. Risk and exposure metrics also varied by similar magnitudes. Results with environmentally important subgroups were similar. These results show that there is considerable variability in physical parameter values from standard sources, and that the observed variability could affect potential risk estimates and perhaps risk management decisions.

11.
Risk Analysis, 2018, 38(8): 1738–1757
We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0–5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400–248,000) cases/year. Risk reduction (by 5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33–448) or 1.4 (95% CI <1–4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10–146) or <1 (95% CI <1–1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted, e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22–298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
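The intervention logic, a log10 reduction applied to the pre-treatment dose followed by a dose-response step, can be sketched in a one-dimensional Monte Carlo (the article's model is two-dimensional and far more detailed). The dose distribution, the approximate beta-Poisson parameters, and the number of contaminated servings are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def annual_cases(log10_reduction, n_iter=50_000):
    """Crude 1-D Monte Carlo sketch of the seed-treatment effect; all
    inputs below are illustrative, not the risk model's fitted values."""
    # Salmonella dose per contaminated serving (CFU), lognormal pre-treatment:
    dose = rng.lognormal(np.log(1e4), 1.5, n_iter) * 10.0 ** (-log10_reduction)
    # Approximate beta-Poisson dose-response (illustrative alpha, beta):
    p_ill = 1.0 - (1.0 + dose / 2884.0) ** -0.3126
    servings_at_risk = 2e5            # contaminated servings per year (assumed)
    return servings_at_risk * p_ill.mean()

for lr in (0, 1, 3, 5):
    print(f"{lr}-log10 treatment -> ~{annual_cases(lr):,.0f} cases/yr")
```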

12.
Understanding Uncertainty
For most critical decisions involving risks, there is more that we don't know than we do know. Our focus must be on understanding and effectively dealing with what we don't know. As a first step in achieving this focus, a classification of the types of uncertainties that must be addressed and the sources of these types of uncertainties is presented. The purpose is to provide a framework for discussion about addressing uncertainty, particularly in risk analyses.
Both uncertainty and variability of information are addressed using four main classes:
1) Metrical uncertainty and variability in measurement;
2) Structural uncertainty due to complexity, including models and their validation;
3) Temporal uncertainty in future and past states;
4) Translational uncertainty in explaining uncertain results.
The factors that contribute uncertainty and error to these classes are identified, and their interrelationships indicated. Both subjective and objective aspects are addressed.

13.
Marc Kennedy, Andy Hart. Risk Analysis, 2009, 29(10): 1427–1442
We propose new models for dealing with various sources of variability and uncertainty that influence risk assessments for dietary exposure. The uncertain or random variables involved can interact in complex ways, and the focus is on methodology for integrating their effects and on assessing the relative importance of including different uncertainty model components in the calculation of dietary exposures to contaminants, such as pesticide residues. The combined effect is reflected in the final inferences about the population of residues and subsequent exposure assessments. In particular, we show how measurement uncertainty can have a significant impact on results and discuss novel statistical options for modeling this uncertainty. The effect of measurement error is often ignored, perhaps due to the laboratory process conforming to the relevant international standards, for example, or is treated in an ad hoc way. These issues are common to many dietary risk analysis problems, and the methods could be applied to any food and chemical of interest. An example is presented using data on carbendazim in apples and consumption surveys of toddlers.
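One way to see why analytical measurement uncertainty matters for dietary exposure is to propagate it explicitly through the consumption × residue product and compare upper-tail exposures with and without it. The sketch below assumes lognormal intake and residue distributions and a 25% analytical coefficient of variation; none of these values are the carbendazim-in-apples data.

```python
import numpy as np

rng = np.random.default_rng(9)

# Dietary exposure = consumption x residue, with multiplicative measurement
# error on each residue analysis made explicit. All distributions and CVs
# are illustrative assumptions.
n = 100_000
consumption_g_kg = rng.lognormal(np.log(2.0), 0.6, n)   # apple intake, g/kg bw/day
true_residue = rng.lognormal(np.log(0.05), 1.0, n)      # mg/kg in the fruit
measured = true_residue * rng.lognormal(0, 0.25, n)     # 25% analytical CV

for name, resid in (("error-free", true_residue), ("measured", measured)):
    expo = consumption_g_kg * resid / 1000.0            # mg/kg bw/day
    print(f"{name}: 99.9th pct exposure = {np.percentile(expo, 99.9):.4f} mg/kg/day")
```

Because the error is multiplicative and unbiased on the log scale, the mean exposure barely moves, but the upper percentiles that drive dietary risk decisions inflate, which is the behavior the abstract highlights.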

14.
Risk‐based, background, and laboratory quantitation limit‐derived standards for carcinogenic polycyclic aromatic hydrocarbons (cPAHs) in residential and nonresidential soils vary across the northeast region of the United States. The magnitude and extent of this variation, however, have not been systematically studied. This article examines the technical basis and methodology used by eight northeastern states in the development of risk‐based screening values, guidelines, and standards for cPAHs in soils. Exposure pathways, human receptors, algorithms, and input variables used by each state in the calculation of acceptable human health risks are identified and reviewed within the context of environmental policy and regulatory impacts. Emphasis is placed on a comparative analysis of multipathway exposures (incidental ingestion, dermal contact, and particulate inhalation) and key science‐policy decisions that have led to the promulgation and adoption of different exposure criteria for cPAHs in the Northeast. More than 425 data points and 20 distinct exposure factors across eight state programs, 18 age subgroups, six activity scenarios, and three exposure pathways were systematically evaluated. Risk‐based values for one state varied either above or below risk‐based, background or laboratory quantitation limit‐derived standards of another state for the same cPAH and receptor. Standards for cPAHs in soils were found to differ significantly across the northeast region—in some cases, by one or two orders of magnitude. While interstate differences can be expected to persist, future changes in federal guidance could mean a shift in risk drivers, compliance status, or calculated cumulative risks for individual properties impacted by PAH releases.

15.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight, and lipid content) and uncertainty (e.g., measurement error), they can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, of the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs) using a case study of sediments obtained from the New York-New Jersey Harbor and being evaluated for disposal at an open water off-shore disposal site within the northeast region. The estimates of PCB concentrations in fish and dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food chain biomagnification model developed by Gobas by specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.
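The two-dimensional Latin Hypercube design can be sketched as an outer loop over parameters treated as uncertain (here a single biomagnification factor) and an inner loop over parameters treated as variable (lipid fraction, meal size), reporting a variability fractile per outer draw. This is not the Gobas food chain model itself, and all distributions below are illustrative.

```python
import numpy as np
from scipy.stats import qmc, lognorm

rng = np.random.default_rng(6)

# Outer Latin Hypercube: uncertainty in a lipid-normalized biomagnification factor.
outer = qmc.LatinHypercube(d=1, seed=1).random(200)
bmf = lognorm.ppf(outer[:, 0], s=0.4, scale=2.0)

p95_doses = []
for b in bmf:
    # Inner Latin Hypercube: interindividual variability in exposure inputs.
    inner = qmc.LatinHypercube(d=2, seed=int(rng.integers(10**9))).random(500)
    lipid = 0.02 + 0.08 * inner[:, 0]            # fish lipid fraction (variable)
    meal_g = 100.0 + 200.0 * inner[:, 1]         # g fish per day (variable)
    c_prey = 50.0                                # ng/g PCB in prey (fixed here)
    c_fish = c_prey * b * lipid                  # crude wet-weight fish concentration
    dose = c_fish * meal_g / 70.0                # ng/kg-day for a 70 kg adult
    p95_doses.append(np.percentile(dose, 95))    # variability fractile per outer draw

lo, hi = np.percentile(p95_doses, [5, 95])
print(f"90% uncertainty band on the 95th-percentile dose: {lo:.0f}-{hi:.0f} ng/kg-day")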

16.
There is considerable debate as to the most appropriate metric for characterizing the mortality impacts of air pollution. Life expectancy has been advocated as an informative measure. Although the life‐table calculus is relatively straightforward, it becomes increasingly cumbersome when repeated over large numbers of geographic areas and for multiple causes of death. Two simplifying assumptions were evaluated: linearity of the relation between excess rate ratio and change in life expectancy, and additivity of cause‐specific life‐table calculations. We employed excess rate ratios linking PM2.5 and mortality from cerebrovascular disease, chronic obstructive pulmonary disease, ischemic heart disease, and lung cancer derived from a meta‐analysis of worldwide cohort studies. As a sensitivity analysis, we employed an integrated exposure response function based on the observed risk of PM2.5 over a wide range of concentrations from ambient exposure, indoor exposure, second‐hand smoke, and personal smoking. Impacts were estimated in relation to a change in PM2.5 from 19.5 μg/m3 estimated for Toronto to an estimated natural background concentration of 1.8 μg/m3. Estimated changes in life expectancy varied linearly with excess rate ratios, but at higher values the relationship was more accurately represented as a nonlinear function. Changes in life expectancy attributed to specific causes of death were additive with maximum error of 10%. Results were sensitive to assumptions about the air pollution concentration below which effects on mortality were not quantified. We have demonstrated valid approximations comprising expression of change in life expectancy as a function of excess mortality and summation across multiple causes of death.
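The underlying life-table calculus is compact: convert age-specific death rates to probabilities, accumulate survivorship, and sum person-years; the loss in life expectancy from an excess rate ratio is the difference between two such runs. The baseline hazard and rate ratio below are assumptions, not the meta-analytic values used in the article.

```python
import numpy as np

def life_expectancy(mx):
    """Period life expectancy at birth from age-specific death rates mx
    (1-year intervals), using the simple qx = mx / (1 + mx/2) conversion."""
    qx = mx / (1.0 + mx / 2.0)
    lx = np.cumprod(np.concatenate(([1.0], 1.0 - qx)))[:-1]   # survivorship
    Lx = lx * (1.0 - qx / 2.0)                 # person-years lived per interval
    return Lx.sum()

# Synthetic, roughly Gompertz-shaped baseline hazard (illustrative, not real data):
ages = np.arange(0, 101)
mx = 0.0003 * np.exp(0.085 * ages)

rr = 1.06                                      # assumed excess rate ratio for PM2.5
delta = life_expectancy(mx) - life_expectancy(mx * rr)
print(f"e0 = {life_expectancy(mx):.1f} yr; loss at RR={rr}: {delta:.2f} yr")
```

Running the same difference for several cause-specific rate ratios and summing the deltas is the additivity approximation the article tests.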

17.
This paper proposes a model of decision under ambiguity deemed vector expected utility, or VEU. In this model, an uncertain prospect, or Savage act, is assessed according to (a) a baseline expected‐utility evaluation, and (b) an adjustment that reflects the individual's perception of ambiguity and her attitudes toward it. The adjustment is itself a function of the act's exposure to distinct sources of ambiguity, as well as its variability. The key elements of the VEU model are a baseline probability and a collection of random variables, or adjustment factors, which represent acts exposed to distinct ambiguity sources and also reflect complementarities among ambiguous events. The adjustment to the baseline expected‐utility evaluation of an act is a function of the covariance of its utility profile with each adjustment factor, which reflects exposure to the corresponding ambiguity source. A behavioral characterization of the VEU model is provided. Furthermore, an updating rule for VEU preferences is proposed and characterized. The suggested updating rule facilitates the analysis of sophisticated dynamic choice with VEU preferences.
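Numerically, a VEU evaluation is a baseline expected utility plus an adjustment in the covariances of the act's utility profile with the adjustment factors. The sketch below uses a four-state example and one admissible ambiguity-averse adjustment, A(c) = -Σ|c_i|; it illustrates the functional form only and is not the paper's axiomatization.

```python
import numpy as np

# Baseline prior over 4 states and one adjustment factor with E_p[zeta] = 0.
p = np.array([0.25, 0.25, 0.25, 0.25])
zeta = np.array([[1.0, -1.0, 1.0, -1.0]])

def veu(utils):
    base = p @ utils                            # baseline expected utility
    cov = zeta @ (p * (utils - base))           # Cov_p(u, zeta_i) per factor
    return base - np.abs(cov).sum()             # ambiguity-averse adjustment

unambiguous = np.array([0.5, 0.5, 0.5, 0.5])
ambiguous = np.array([1.0, 0.0, 1.0, 0.0])      # same baseline mean, loads on zeta
print(veu(unambiguous), veu(ambiguous))         # the ambiguous act is penalized
```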

18.
We analyze the benefits of inventory pooling in a multi‐location newsvendor framework. Using a number of common demand distributions, as well as the distribution‐free approximation, we compare the centralized (pooled) system with the decentralized (non‐pooled) system. We investigate the sensitivity of the absolute and relative reduction in costs to the variability of demand and to the number of locations (facilities) being pooled. We show that for the distributions considered, the absolute benefit of risk pooling increases with variability, and the relative benefit stays fairly constant, as long as the coefficient of variation of demand stays in the low range. However, under high‐variability conditions, both measures decrease to zero as the demand variability is increased. We show, through analytical results and computational experiments, that these effects are due to the different operating regimes exhibited by the system under different levels of variability: as the variability is increased, the system switches from the normal operation to the effective and then complete shutdown regimes; the decrease in the benefits of risk pooling is associated with the two latter stages. The centralization allows the system to remain in the normal operation regime under higher levels of variability compared to the decentralized system.
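For i.i.d. normal demands the comparison is closed-form: the optimal expected newsvendor cost is (h + b)·φ(z*)·σ per location, so pooling n locations scales total cost by √n rather than n, a relative saving of 1 − 1/√n. A sketch with illustrative costs and demand parameters follows; note the normal low-CV case does not exhibit the high-variability shutdown regimes analyzed in the article.

```python
import numpy as np
from scipy.stats import norm

# Pooled vs. decentralized newsvendor cost, i.i.d. normal demand at n locations.
# h = unit holding (overage) cost, b = unit penalty (underage) cost; illustrative.
h, b = 1.0, 4.0
z = norm.ppf(b / (b + h))                       # critical fractile z*
phi = norm.pdf(z)

def expected_costs(n_locations, mu=100.0, cv=0.3):
    sigma = cv * mu
    decentralized = n_locations * (h + b) * phi * sigma
    pooled = (h + b) * phi * sigma * np.sqrt(n_locations)
    return decentralized, pooled

for n in (2, 4, 8):
    d, c = expected_costs(n)
    print(f"n={n}: decentralized={d:.0f}, pooled={c:.0f}, saving={1 - c/d:.0%}")
```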

19.
The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45‐ and 65‐year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber‐oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State together with the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3–14%), and short‐term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land‐holding costs, a no‐harvest management scenario would become revenue‐positive at a carbon credit break‐point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business‐as‐usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation.
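The break-even logic can be sketched with a toy probabilistic NPV: uncertain annual sequestration credited against a buffer pool, discounted revenue, and a present-valued holding cost, scanned over carbon prices. Every input below is invented for illustration, so the resulting break-even price will not match the article's $14.17/Mg CO2e.

```python
import numpy as np

rng = np.random.default_rng(7)

def npv_no_harvest(carbon_price, n=10_000, rate=0.05, years=40):
    """Toy probabilistic NPV ($/ha) of a no-harvest IFM project; credit
    volumes, buffer share, and costs are invented for illustration."""
    # Uncertain annual creditable sequestration (Mg CO2e/ha/yr):
    seq = rng.normal(4.0, 1.2, size=(n, years)).clip(min=0.0)
    buffer = 0.2                                  # buffer-pool share withheld
    disc = (1 + rate) ** -np.arange(1, years + 1)
    revenue = (seq * (1 - buffer) * carbon_price * disc).sum(axis=1)
    pv_holding = 30.0 * (1 - (1 + rate) ** -years) / rate  # $30/ha/yr, discounted
    return revenue - pv_holding

for price in (5, 10, 15, 20):
    npv = npv_no_harvest(price)
    print(f"${price}/MgCO2e: median NPV={np.median(npv):8.0f} $/ha, "
          f"P(NPV<0)={np.mean(npv < 0):.0%}")
```

Scanning the price grid for the point where the median NPV crosses zero mirrors the break-point calculation, and the P(NPV<0) column illustrates the "large payouts and large losses" asymmetry the abstract describes.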

20.
Risk Analysis, 2018, 38(1): 194–209
This article presents the findings from a numerical simulation study that was conducted to evaluate the performance of alternative statistical analysis methods for background screening assessments when data sets are generated with incremental sampling methods (ISMs). A wide range of background and site conditions are represented in order to test different ISM sampling designs. Both hypothesis tests and upper tolerance limit (UTL) screening methods were implemented following U.S. Environmental Protection Agency (USEPA) guidance for specifying error rates. The simulations show that hypothesis testing using two‐sample t‐tests can meet standard performance criteria under a wide range of conditions, even with relatively small sample sizes. Key factors that affect the performance include unequal population variances and small absolute differences in population means. UTL methods are generally not recommended due to conceptual limitations in the technique when applied to ISM data sets from single decision units and due to insufficient power given standard statistical sample sizes from ISM.
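The simulation design, generating ISM replicate means for background and site decision units and recording test outcomes, can be sketched for the two-sample t-test case. The lognormal distributions, replicate counts, and Welch one-sided formulation below are illustrative choices, not the study's full design.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(8)

def false_negative_rate(site_log_shift, n=5, n_sims=5000, alpha=0.05):
    """Simulated power check for a background-vs-site two-sample t-test on
    ISM replicate means; distributions and sizes are illustrative."""
    misses = 0
    for _ in range(n_sims):
        bkg = rng.lognormal(np.log(10), 0.3, n)      # ISM replicates, background DU
        site = rng.lognormal(np.log(10) + site_log_shift, 0.3, n)
        # One-sided Welch test: is the site mean greater than background?
        t, p = ttest_ind(site, bkg, equal_var=False, alternative="greater")
        misses += p > alpha
    return misses / n_sims

for shift in (0.25, 0.5, 1.0):
    print(f"log-shift {shift}: false-negative rate = {false_negative_rate(shift):.2f}")
```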
