Similar Articles
20 similar articles found (search time: 140 ms)
1.
Indirect exposures to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and other toxic materials released in incinerator emissions have been identified as a significant concern for human health. As a result, regulatory agencies and researchers have developed specific approaches for evaluating exposures from indirect pathways. This paper presents a quantitative assessment of the effect of uncertainty and variation in exposure parameters on the resulting estimates of TCDD dose rates received by individuals indirectly exposed to incinerator emissions through the consumption of home-grown beef. The assessment uses a nested Monte Carlo model that separately characterizes uncertainty and variation in dose rate estimates. Uncertainty resulting from limited data on the fate and transport of TCDD is evaluated, and variations in estimated dose rates in the exposed population that result from location-specific parameters and individuals' behaviors are characterized. The analysis indicates that lifetime average daily dose rates for individuals living within 10 km of a hypothetical incinerator range over three orders of magnitude. In contrast, the uncertainty in the dose rate distribution appears to vary by less than one order of magnitude, based on the sources of uncertainty included in this analysis. Current guidance for predicting exposures from indirect exposure pathways was found to overestimate the intakes for typical and high-end individuals.
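The nested (two-dimensional) Monte Carlo design described above can be sketched as follows: an outer loop samples uncertain fate-and-transport parameters, and an inner loop samples interindividual variability. All distributions and parameter values below are hypothetical placeholders, not the paper's calibrated inputs.

```python
import math
import random

random.seed(1)

def nested_monte_carlo(n_outer=200, n_inner=500):
    """Two-dimensional Monte Carlo: the outer loop draws one realization
    of the uncertain parameters ("one possible world"); the inner loop
    draws the variable, person-specific inputs within that world."""
    median_doses = []
    for _ in range(n_outer):
        # Uncertain soil-to-beef transfer factor (one draw per world)
        transfer = math.exp(random.gauss(0.0, 0.3))
        doses = []
        for _ in range(n_inner):
            # Variable home-grown beef intake per individual
            intake = math.exp(random.gauss(-1.0, 1.0))
            doses.append(transfer * intake)
        doses.sort()
        median_doses.append(doses[n_inner // 2])
    return median_doses

medians = nested_monte_carlo()
# Spread across outer draws = uncertainty band on the population median
spread = max(medians) / min(medians)
```

Keeping the two loops separate is what lets the analysis report variability (the inner distribution) and uncertainty (how that distribution shifts across outer draws) as distinct quantities.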

2.
Exposure duration is an important component in determining long-term dose rates associated with exposure to environmental contaminants. Surveys of exposed populations collect information on individuals' past behaviors, including the durations of a behavior up to the time of the survey. This paper presents an empirical approach for determining the distribution of total durations that is consistent with the distribution of past durations obtained from surveys. This approach is appropriate where the rates of beginning and ending a behavior are relatively constant over time. The approach allows the incorporation of information on the distribution of age in a population into the determination of the distribution of durations. The paper also explores the impact of "longevity" bias on survey data. A case study of the application of this approach to two angler populations is also provided. The results of the case study have characteristics similar to the results reported by Israeli and Nelson (Risk Anal. 12, 65-72 (1992)) from their analytical model of residential duration. Specifically, the average period of time for the total duration in the entire population is shorter than the average period of time reported for historical duration in the surveyed individuals.
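The "longevity" bias mentioned above can be demonstrated with a small simulation under the abstract's own assumption of constant beginning and ending rates (which makes total durations exponential). The rate and sample size are illustrative.

```python
import random

random.seed(7)
rate = 0.2      # hypothetical constant rate of ending the behavior
n = 50_000

# Total durations across the whole population (constant hazard -> exponential)
totals = [random.expovariate(rate) for _ in range(n)]

# A survey catches only individuals still engaged in the behavior, so
# longer durations are proportionally more likely to be sampled.
surveyed_totals = random.choices(totals, weights=totals, k=n)

mean_population = sum(totals) / n          # ~ 1/rate = 5 years
mean_surveyed = sum(surveyed_totals) / n   # ~ 2/rate = 10 years, length-biased
```

The surveyed individuals' total durations average roughly twice the population mean here, which is the bias the paper's empirical approach corrects for.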

3.
This paper presents a method of estimating long-term exposures to point source emissions. The method consists of a Monte Carlo exposure model (PSEM or Point Source Exposure Model) that combines data on population mobility and mortality with information on daily activity patterns. The approach behind the model can be applied to a wide variety of exposure scenarios. In this paper, PSEM is used to characterize the range and distribution of lifetime equivalent doses received by inhalation of air contaminated by the emissions of a point source. The output of the model provides quantitative information on the dose, age, and gender of highly exposed individuals. The model is then used in an example risk assessment. Finally, future uses of the model's approach are discussed.

4.
This paper presents an approach for characterizing the probability of adverse effects occurring in a population exposed to dose rates in excess of the Reference Dose (RfD). The approach uses a linear threshold (hockey stick) model of response and is based on the current system of uncertainty factors used in setting RfDs. The approach requires generally available toxicological estimates such as No-Observed-Adverse-Effect Levels (NOAELs) or Benchmark Doses and doses at which adverse effects are observed in 50% of the test animals (ED50s). In this approach, Monte Carlo analysis is used to characterize the uncertainty in the dose response slope based on the range and magnitude of the key sources of uncertainty in setting protective doses. The method does not require information on the shape of the dose response curve for specific chemicals, but is amenable to the inclusion of such data. The approach is applied to four compounds to produce estimates of response rates for dose rates greater than the RfD.
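A minimal sketch of the linear-threshold ("hockey stick") response with Monte Carlo uncertainty in where the threshold sits. The RfD, ED50, dose, and the lognormal uncertainty spread are all illustrative values, not the paper's calibrated inputs.

```python
import math
import random

random.seed(3)

def hockey_stick(dose, threshold, ed50):
    """Hockey-stick response: zero response below the threshold, rising
    linearly to a 50% response rate at the ED50."""
    if dose <= threshold:
        return 0.0
    return min(1.0, 0.5 * (dose - threshold) / (ed50 - threshold))

# Uncertainty in the true population threshold, expressed as a lognormal
# multiplier on the RfD (hypothetical geometric standard deviation ~ e)
rfd, ed50 = 1.0, 100.0
dose = 10.0  # a dose rate ten times the RfD
draws = [hockey_stick(dose, rfd * math.exp(random.gauss(0.0, 1.0)), ed50)
         for _ in range(10_000)]
mean_response = sum(draws) / len(draws)
```

Averaging the response over the threshold uncertainty yields an expected response rate at a given multiple of the RfD, which is the kind of output the approach produces for the four compounds.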

5.
The current approach to health risk assessment of toxic waste sites in the U.S. may lead to considerable expenditure of resources without any meaningful reduction in population exposure. Risk assessment methods used generally ignore background exposures and consider only incremental risk estimates for maximally exposed individuals. Such risk estimates do not address true public health risks to which background exposures also contribute. The purpose of this paper is to recommend a new approach to risk assessment and risk management concerning toxic waste sites. Under this new approach, which we have called public health risk assessment, chemical substances would be classified into a level of concern based on the potential health risks associated with typical national and regional background exposures. Site assessment would then be based on the level of concern for the particular pollutants involved and the potential contribution of site contaminants to typical background human exposures. While various problems can be foreseen with this approach, the key advantage is that resources would be allocated to reduce the most important sources of human exposure, and site remediation decisions could be simplified by focusing on exposure assessment rather than questionable risk extrapolations.

6.
The concepts of expert systems and decision support systems have received considerable attention recently. While systems have been proposed for various problem areas in business, difficulties still exist in the knowledge acquisition phase of development. This paper presents a recursive partitioning analysis (RPA) approach to knowledge acquisition. The RPA production system approach was applied to data sets representing the mortgage, commercial, and consumer lending problems. Comparison of the classification rates across these problems to the results of a generalized inductive inference production system (Quinlan's ID3 algorithm) and across the mortgage and commercial lending problems to traditional statistical modeling approaches indicated that the RPA approach provided superior results while using fewer variables.

7.
The ultimate goal of the research reported in this series of three articles is to derive distributions of doses of selected environmental tobacco smoke (ETS)-related chemicals for nonsmoking workers. This analysis uses data from the 16-City Study collected with personal monitors over the course of one workday in workplaces where smoking occurred. In this article, we describe distributions of ETS chemical concentrations and the characteristics of those distributions (e.g., whether the distribution was log normal for a given constituent) for the workplace exposure. Next, we present population parameters relevant for estimating dose distributions and the methods used for estimating those dose distributions. Finally, we derive distributions of doses of selected ETS-related constituents obtained in the workplace for people in smoking work environments. Estimating dose distributions provided information beyond the usual point estimate of dose and showed that the preponderance of individuals exposed to ETS in the workplace were exposed at the low end of the dose distribution curve. The results of this analysis include estimations of hourly maxima and time-weighted average (TWA) doses of nicotine from workplace exposures to ETS (extrapolated from 1 day to 1 week) and doses derived from modeled lung burdens of ultraviolet-absorbing particulate matter (UVPM) and solanesol resulting from workplace exposures to ETS (extrapolated from 1 day to 1 year).
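The time-weighted average (TWA) mentioned above is a standard occupational-exposure quantity and can be computed directly. The concentrations and durations below are illustrative, not the 16-City Study measurements.

```python
def time_weighted_average(segments):
    """Time-weighted average concentration over a work shift:
    TWA = sum(c_i * t_i) / sum(t_i), for concentration c_i held
    over duration t_i."""
    total = sum(conc * hours for conc, hours in segments)
    duration = sum(hours for _, hours in segments)
    return total / duration

# e.g. a nicotine monitor reading 10 ug/m3 for 4 h, then 2 ug/m3 for 4 h
twa = time_weighted_average([(10.0, 4.0), (2.0, 4.0)])  # -> 6.0 ug/m3
```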

8.
Scientists at the CIIT Centers for Health Research (Conolly et al., 2000, 2003; Kimbell et al., 2001a, 2001b) developed a two-stage clonal expansion model of formaldehyde-induced nasal cancers in the F344 rat that made extensive use of mechanistic information. An inference of their modeling approach was that formaldehyde-induced tumorigenicity could be optimally explained without the role of formaldehyde's mutagenic action. In this article, we examine the strength of this result and modify select features to examine the sensitivity of the predicted dose response to select assumptions. We implement solutions to the two-stage cancer model that are valid for nonhomogeneous models (i.e., models with time-dependent parameters), thus accounting for time dependence in variables. In this reimplementation, we examine the sensitivity of model predictions to pooling historical and concurrent control data, and to lumping sacrificed animals in which tumors were discovered incidentally with those in which death was caused by the tumors. We found the CIIT model results were not significantly altered with the nonhomogeneous solutions. Dose-response predictions below the range of exposures where tumors occurred in the bioassays were highly sensitive to the choice of control data. In the range of exposures where tumors were observed, the model attributed up to 74% of the added tumor probability to formaldehyde's mutagenic action when our reanalysis restricted the use of the National Toxicology Program (NTP) historical control data to only those obtained from inhalation exposures. Model results were insensitive to hourly or daily temporal variations in DNA protein cross-link (DPX) concentration, a surrogate for the dose-metric linked to formaldehyde-induced mutations, prompting us to utilize weekly averages for this quantity. Various other biological and mathematical uncertainties in the model have been retained unmodified in this analysis. 
These include model specification of initiated cell division and death rates, and uncertainty and variability in the dose response for cell replication rates, issues that will be considered in a future paper.

9.
Dale Hattis, Risk Analysis, 1990, 10(2): 303-316
Neither experimental animal exposures nor real-life human exposures are delivered at a constant level over a full lifetime. Although there are strong theoretical reasons why all pharmacokinetic processes must "go linear" at the limit of low dose rates, fluctuations in dose rate may produce nonlinearities that either increase or decrease actual risks relative to what would be expected for constant lifetime exposure. This paper discusses quantitative theory and specific examples for a number of processes that can be expected to give rise to pharmacokinetic nonlinearities at high dose rates, including transport processes (e.g., renal tubular secretion), activating and detoxifying metabolism, DNA repair, and enhancement of cell replication following gross toxicity in target tissues. At the extreme, full saturation of a detoxification or DNA repair process has the potential to create as much as a dose-squared dependence of risk on dose delivered in a single burst, and if more than one detoxification step becomes fully saturated, this can be compounded. Effects via changes in cell replication rates, which appear likely to be largely responsible for the steep upward-turning curve of formaldehyde carcinogenesis in rats, can be even more profound over a relatively narrow range of dosage. General suggestions are made for experimental methods to detect nonlinearities arising from the various sources in premarket screening programs.
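The saturation mechanism described above can be illustrated with a toy model: a first-order activation pathway competing with a saturable Michaelis-Menten detoxification pathway. As detoxification saturates, the fraction of the dose escaping it rises, so the active dose grows faster than linearly. All rate constants are illustrative.

```python
def active_fraction(dose, ka=1.0, km=1.0, vmax=100.0):
    """Fraction of the dose routed to the active pathway when a
    first-order activation (rate ka*C) competes with a saturable
    Michaelis-Menten detoxification (rate vmax*C/(km + C))."""
    return ka * (km + dose) / (ka * (km + dose) + vmax)

active_low = 1.0 * active_fraction(1.0)     # active dose at d = 1
active_high = 10.0 * active_fraction(10.0)  # active dose at d = 10
# A 10x increase in delivered dose yields a ~50x increase in active
# dose here: supra-linear, trending toward dose-squared behavior.
```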

10.
Paul Price, Risk Analysis, 2020, 40(12): 2572-2583
All individuals are exposed to multiple chemicals from multiple sources. These combined exposures are a concern because they may cause adverse effects that would not occur from an exposure received from any single source. Studies of combined chemical exposures, however, have found that the risks posed by such combined exposures are almost always driven by exposures from a few chemicals and sources and frequently by a single chemical from a single source. Here, a series of computer simulations of combined exposures are used to investigate when multiple sources of chemicals drive the largest risks in a population and when a single chemical from a single source is responsible for the largest risks. The analysis found that combined exposures drive the largest risks when the interindividual variation of source-specific doses is small, moderate-to-high correlations occur between the source-specific doses, and the number of sources affecting an individual varies across individuals. These findings can be used to identify sources with the greatest potential to cause combined exposures of concern.

11.
The traditional multistage (MS) model of carcinogenesis implies several empirically testable properties for dose-response functions. These include convex (linear or upward-curving) cumulative hazards as a function of dose; symmetric effects on lifetime tumor probability of transition rates at different stages; cumulative hazard functions that increase without bound as stage-specific transition rates increase without bound; and identical tumor probabilities for individuals with identical parameters and exposures. However, for at least some chemicals, cumulative hazards are not convex functions of dose. This paper shows that none of these predicted properties is implied by the mechanistic assumptions of the MS model itself. Instead, they arise from the simplifying "rare-tumor" approximations made in the usual mathematical analysis of the model. An alternative exact probabilistic analysis of the MS model with only two stages is presented, both for the usual case where a carcinogen acts on both stages simultaneously, and also for idealized initiation-promotion experiments in which one stage at a time is affected. The exact two-stage model successfully fits bioassay data for chemicals (e.g., 1,3-butadiene) with concave cumulative hazard functions that are not well-described by the traditional MS model. Qualitative properties of the exact two-stage model are described and illustrated by least-squares fits to several real datasets. The major contribution is to show that properties of the traditional MS model family that appear to be inconsistent with empirical data for some chemicals can be explained easily if an exact, rather than an approximate model, is used. This suggests that it may be worth using the exact model in cases where tumor rates are not negligible (e.g., in which they exceed 10%). This includes the majority of bioassay experiments currently being performed.
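For reference, the traditional (approximate) multistage model that the article critiques has the familiar form below: a polynomial cumulative hazard with nonnegative coefficients, which is necessarily convex in dose. The coefficients used here are illustrative.

```python
import math

def ms_tumor_probability(dose, q):
    """Approximate multistage model: lifetime tumor probability
    P(d) = 1 - exp(-(q0 + q1*d + q2*d**2 + ...)) with nonnegative
    coefficients q.  The convexity of the hazard is exactly the
    property the article traces to the rare-tumor approximation,
    not to the underlying mechanism."""
    hazard = sum(qi * dose ** i for i, qi in enumerate(q))
    return 1.0 - math.exp(-hazard)

q = [0.01, 0.05, 0.002]                 # illustrative coefficients
p_background = ms_tumor_probability(0.0, q)
p_exposed = ms_tumor_probability(5.0, q)
```

Because `1 - exp(-H)` with a nonnegative-coefficient polynomial `H` can never produce a concave cumulative hazard, data like the 1,3-butadiene bioassays require the exact (non-approximated) two-stage analysis described in the abstract.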

12.
Daily soil/dust ingestion rates typically used in exposure and risk assessments are based on tracer element studies, which have a number of limitations and do not separate contributions from soil and dust. This article presents an alternate approach of modeling soil and dust ingestion via hand and object mouthing of children, using EPA's SHEDS model. Results for children 3 to <6 years old show that the mean and 95th percentile of total soil and dust ingestion are 68 and 224 mg/day, respectively; the means for soil ingestion, hand-to-mouth dust ingestion, and object-to-mouth dust ingestion are 41 mg/day, 20 mg/day, and 7 mg/day, respectively. In general, hand-to-mouth soil ingestion was the most important pathway, followed by hand-to-mouth dust ingestion, then object-to-mouth dust ingestion. The variability results are most sensitive to inputs on surface loadings, soil-skin adherence, hand mouthing frequency, and hand washing frequency. The predicted total soil and dust ingestion fits a lognormal distribution with geometric mean = 35.7 and geometric standard deviation = 3.3. There are two uncertainty distributions, one below the 20th percentile and the other above. Modeled uncertainties ranged within a factor of 3-30. Mean modeled estimates for soil and dust ingestion are consistent with past information but lower than the central values recommended in the 2008 EPA Child-Specific Exposure Factors Handbook. This new modeling approach, which predicts soil and dust ingestion by pathway, source type, population group, geographic location, and other factors, offers a better characterization of exposures relevant to health risk assessments as compared to using a single value.
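The reported lognormal fit (geometric mean 35.7, geometric standard deviation 3.3) determines any percentile or moment in closed form, as below. The fitted values come out slightly above the simulated mean (68) and 95th percentile (224), which is expected since the SHEDS output is only approximately lognormal.

```python
import math

# Reported lognormal fit for total soil + dust ingestion (mg/day)
gm, gsd = 35.7, 3.3
z95 = 1.6449  # standard normal 95th percentile

# For a lognormal: p-th percentile = GM * GSD**z_p,
# arithmetic mean = GM * exp(ln(GSD)**2 / 2)
p95 = gm * gsd ** z95                           # ~254 mg/day
mean = gm * math.exp(math.log(gsd) ** 2 / 2)    # ~73 mg/day
```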

13.
In evaluating the risk of exposure to health hazards, characterizing the dose‐response relationship and estimating acceptable exposure levels are the primary goals. In analyses of health risks associated with exposure to ionizing radiation, while there is a clear agreement that moderate to high radiation doses cause harmful effects in humans, little has been known about the possible biological effects at low doses, for example, below 0.1 Gy, which is the dose range relevant to most radiation exposures of concern today. A conventional approach to radiation dose‐response estimation based on simple parametric forms, such as the linear nonthreshold model, can be misleading in evaluating the risk and, in particular, its uncertainty at low doses. As an alternative approach, we consider a Bayesian semiparametric model that has a connected piece‐wise‐linear dose‐response function with prior distributions having an autoregressive structure among the random slope coefficients defined over closely spaced dose categories. With a simulation study and application to analysis of cancer incidence data among Japanese atomic bomb survivors, we show that this approach can produce smooth and flexible dose‐response estimation while reasonably handling the risk uncertainty at low doses and elsewhere. With relatively few assumptions and modeling options to be made by the analyst, the method can be particularly useful in assessing risks associated with low‐dose radiation exposures.
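The connected piecewise-linear dose-response function at the heart of the model evaluates as below. In the full Bayesian model the segment slopes are random coefficients with an autoregressive prior; here we simply evaluate one realization with illustrative knots and slopes.

```python
def piecewise_linear_response(dose, knots, slopes):
    """Connected piecewise-linear dose-response starting at 0: the i-th
    slope applies between the (i-1)-th and i-th knots, and the last
    slope extends beyond the final knot.  Continuity at the knots is
    what makes the fitted curve smooth when slopes are shrunk toward
    their neighbors by the autoregressive prior."""
    y, prev = 0.0, 0.0
    for k, s in zip(knots, slopes):
        if dose <= k:
            return y + s * (dose - prev)
        y += s * (k - prev)
        prev = k
    return y + slopes[-1] * (dose - knots[-1])

knots, slopes = [1.0, 2.0], [1.0, 0.5]  # illustrative values
```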

14.
For the vast majority of chemicals that have cancer potency estimates on IRIS, the underlying database is deficient with respect to early-life exposures. This data gap has prevented derivation of cancer potency factors that are relevant to this time period, and so assessments may not fully address children's risks. This article provides a review of juvenile animal bioassay data in comparison to adult animal data for a broad array of carcinogens. This comparison indicates that short-term exposures in early life are likely to yield a greater tumor response than short-term exposures in adults, but similar tumor response when compared to long-term exposures in adults. This evidence is brought into a risk assessment context by proposing an approach that: (1) does not prorate children's exposures over the entire life span or mix them with exposures that occur at other ages; (2) applies the cancer slope factor from adult animal or human epidemiology studies to the children's exposure dose to calculate the cancer risk associated with the early-life period; and (3) adds the cancer risk for young children to that for older children/adults to yield a total lifetime cancer risk. The proposed approach allows for the unique exposure and pharmacokinetic factors associated with young children to be fully weighted in the cancer risk assessment. It is very similar to the approach currently used by U.S. EPA for vinyl chloride. The current analysis finds that the database of early life and adult cancer bioassays supports extension of this approach from vinyl chloride to other carcinogens of diverse mode of action. This approach should be enhanced by early-life data specific to the particular carcinogen under analysis whenever possible.

15.
Won J. Lee, Decision Sciences, 1993, 24(1): 76-87
This paper presents a geometric programming (GP) approach to finding a profit-maximizing selling price and order quantity for a retailer. Demand is treated as a nonlinear function of price with a constant elasticity. The proposed GP approach finds optimal solutions for both no-quantity discounts and continuous quantity discounts cases. This approach is superior to the traditional approaches of solving a system of nonlinear equations. Since the profit function is not concave, the traditional approaches may require an exhaustive search, especially for the continuous discounts schedule case. By applying readily available theories in GP, we can easily find global optimal solutions for both cases. More importantly, the GP approach provides lower and upper bounds on the optimal profit level and sensitivity results which are unavailable from the traditional approaches. These bounding and sensitivity results are further utilized to provide additional important managerial implications on pricing and lot-sizing policies.
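For intuition on the constant-elasticity demand assumption, the textbook no-discounts special case has a closed-form optimal price, the classic markup rule. This is only that special case, not the paper's full GP formulation with lot-sizing costs and quantity discounts.

```python
def optimal_price(unit_cost, elasticity):
    """Profit-maximizing price for constant-elasticity demand
    D(p) = a * p**(-elasticity) with elasticity > 1 and no quantity
    discounts: the markup rule p* = e / (e - 1) * c.  Follows from
    setting marginal revenue equal to marginal cost."""
    if elasticity <= 1:
        raise ValueError("demand must be elastic (elasticity > 1)")
    return elasticity / (elasticity - 1.0) * unit_cost

price = optimal_price(unit_cost=10.0, elasticity=2.0)  # -> 20.0
```

Note that as elasticity approaches 1 from above, the optimal markup grows without bound, which is why the elastic-demand condition matters.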

16.
Setting action levels or limits for health protection is complicated by uncertainty in the dose-response relation across a range of hazards and exposures. To address this issue, we consider the classic newsboy problem. The principles used to manage uncertainty for that case are applied to two stylized exposure examples, one for high dose and high dose rate radiation and the other for ammonia. Both incorporate expert judgment on uncertainty quantification in the dose-response relationship. The mathematical technique of probabilistic inversion also plays a key role. We propose a coupled approach, whereby scientists quantify the dose-response uncertainty using techniques such as structured expert judgment with performance weights and probabilistic inversion, and stakeholders quantify associated loss rates.
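The classic newsboy (newsvendor) problem referenced above has a well-known closed-form solution: stock the critical fractile of the demand distribution, balancing the unit costs of under- and over-protection. The normal demand assumption and parameter values below are illustrative.

```python
from statistics import NormalDist

def newsvendor_quantity(mu, sigma, underage, overage):
    """Classic newsvendor solution: order q* = F^{-1}(cu / (cu + co)),
    where cu is the unit cost of stocking too little (underage) and
    co the unit cost of stocking too much (overage)."""
    critical = underage / (underage + overage)
    return NormalDist(mu, sigma).inv_cdf(critical)

# Equal under- and over-protection costs -> set the level at median demand
q = newsvendor_quantity(mu=100.0, sigma=20.0, underage=1.0, overage=1.0)
```

In the paper's setting, the "demand" is the uncertain dose-response relation and the asymmetric costs come from stakeholders' loss rates, so the action level shifts away from the median toward the costlier error.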

17.
Humans are continuously exposed to suspected or proven endocrine-disrupting chemicals (EDCs). Risk management of EDCs presents a major unmet challenge because the available data for adverse health effects are generated by examining one compound at a time, whereas real-life exposures are to mixtures of chemicals. In this work, we integrate epidemiological and experimental evidence toward a whole mixture strategy for risk assessment. To illustrate, we conduct the following four steps in a case study: (1) identification of single EDCs ("bad actors")—measured in prenatal blood/urine in the SELMA study—that are associated with a shorter anogenital distance (AGD) in baby boys; (2) definition and construction of a "typical" mixture consisting of the "bad actors" identified in Step 1; (3) experimentally testing this mixture in an in vivo animal model to estimate a dose-response relationship and determine a point of departure (i.e., reference dose [RfD]) associated with an adverse health outcome; and (4) use a statistical measure of "sufficient similarity" to compare the experimental RfD (from Step 3) to the exposure measured in the human population and generate a "similar mixture risk indicator" (SMRI). The objective of this exercise is to generate a proof of concept for the systematic integration of epidemiological and experimental evidence with mixture risk assessment strategies. Using a whole mixture approach, we identified a higher proportion of pregnant women at risk (13%) than with more traditional additivity models (3%) or a compound-by-compound strategy (1.6%).

18.
A Nonlinear Combination Forecasting Method for Exchange Rates
Recent econometric research has shown that combination forecasts achieve higher accuracy than single-model forecasts, but linear combination methods have significant limitations for exchange-rate modeling and forecasting. This paper proposes a new nonlinear combination modeling and forecasting method for exchange rates based on a fuzzy neural network, together with a corresponding hybrid learning algorithm. Combination modeling and forecasting results for the exchange rates of the British pound, French franc, Swiss franc, and Japanese yen against the US dollar show that the method has strong learning and generalization ability, and is well suited to combination modeling and forecasting for nonlinear systems with a degree of uncertainty, such as the foreign exchange market.
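For context, the classical linear benchmark that such nonlinear combiners aim to improve on weights each single-model forecast by the inverse of its in-sample mean squared error. The paper's contribution is the fuzzy-neural-network combiner; the error values and forecasts below are hypothetical.

```python
def inverse_mse_weights(errors_a, errors_b):
    """Linear combination baseline: weight two forecasting models by
    the inverse of their in-sample MSEs, so the historically more
    accurate model gets the larger weight."""
    mse_a = sum(e * e for e in errors_a) / len(errors_a)
    mse_b = sum(e * e for e in errors_b) / len(errors_b)
    w_a = (1.0 / mse_a) / (1.0 / mse_a + 1.0 / mse_b)
    return w_a, 1.0 - w_a

# Hypothetical past forecast errors of two exchange-rate models
w_a, w_b = inverse_mse_weights([0.1, -0.1, 0.2], [0.4, -0.3, 0.5])
combined = w_a * 7.05 + w_b * 7.20  # combine two hypothetical forecasts
```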

19.
A VaR Approach to Setting Dynamic Loan-to-Value Ratios for Steel Pledge Loans in Supply Chain Finance
Unlike pledge financing backed by bonds or equities, the core of dynamic inventory-pledge business is forecasting long-horizon price risk. Starting from the statistical characteristics of market returns on pledged inventory, and using daily data for steel (HRB335), which trades mainly in the over-the-counter spot market, we build a VaR-GARCH(1,1)-GED model that captures the heteroskedasticity and the leptokurtic, fat-tailed features of steel return series. We then propose using out-of-sample forecasts over multiple risk windows to predict the price risk level over the pledge period, derive an analytical expression for long-horizon VaR under a fat-tailed distribution, and obtain a loan-to-value (pledge) ratio consistent with the bank's risk tolerance. Further, we construct a hit-sequence function for long-horizon risk based on the failure-rate rule and backtest the long-horizon VaR values across the multiple risk windows. The empirical analysis shows that the resulting pledge ratio controls risk while reducing efficiency losses, providing commercial banks with a risk-management model and framework for dynamic loan-to-value ratios.
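The chain from conditional volatility to a pledge ratio can be sketched as follows. This toy version uses a GARCH(1,1) simulation with normal innovations and a one-day horizon; the paper fits a GED to capture fat tails and derives a multi-window, long-horizon VaR. All parameters are illustrative.

```python
import math
import random

random.seed(11)

# Simulate a GARCH(1,1) daily return series (illustrative parameters)
omega, alpha, beta = 1e-6, 0.08, 0.90
h = omega / (1.0 - alpha - beta)  # start at the unconditional variance
returns = []
for _ in range(2000):
    r = math.sqrt(h) * random.gauss(0.0, 1.0)
    returns.append(r)
    h = omega + alpha * r * r + beta * h  # next day's variance forecast

# One-day 99% VaR from the conditional variance forecast, and a
# loan-to-value (pledge) ratio leaving at least that much price cushion
z99 = 2.3263  # standard normal 99th percentile
var_1d = z99 * math.sqrt(h)
pledge_ratio = 1.0 - var_1d
```

Scaling this to the full pledge period (the paper's long-horizon VaR) widens the cushion and lowers the pledge ratio accordingly.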

20.
To better understand the risk of exposure to food allergens, food challenge studies are designed to slowly increase the dose of an allergen delivered to allergic individuals until an objective reaction occurs. These dose‐to‐failure studies are used to determine acceptable intake levels and are analyzed using parametric failure time models. Though these models can provide estimates of the survival curve and risk, their parametric form may misrepresent the survival function for doses of interest. Different models that describe the data similarly may produce different dose‐to‐failure estimates. Motivated by predictive inference, we developed a Bayesian approach to combine survival estimates based on posterior predictive stacking, where the weights are formed to maximize posterior predictive accuracy. The approach defines a model space that is much larger than traditional parametric failure time modeling approaches. In our case, we use the approach to include random effects accounting for frailty components. The methodology is investigated in simulation, and is used to estimate allergic population eliciting doses for multiple food allergens.
