Similar Documents
20 similar documents retrieved.
1.
The traditional multistage (MS) model of carcinogenesis implies several empirically testable properties for dose-response functions. These include convex (linear or upward-curving) cumulative hazards as a function of dose; symmetric effects on lifetime tumor probability of transition rates at different stages; cumulative hazard functions that increase without bound as stage-specific transition rates increase without bound; and identical tumor probabilities for individuals with identical parameters and exposures. However, for at least some chemicals, cumulative hazards are not convex functions of dose. This paper shows that none of these predicted properties is implied by the mechanistic assumptions of the MS model itself. Instead, they arise from the simplifying "rare-tumor" approximations made in the usual mathematical analysis of the model. An alternative exact probabilistic analysis of the MS model with only two stages is presented, both for the usual case where a carcinogen acts on both stages simultaneously and for idealized initiation-promotion experiments in which one stage at a time is affected. The exact two-stage model successfully fits bioassay data for chemicals (e.g., 1,3-butadiene) with concave cumulative hazard functions that are not well described by the traditional MS model. Qualitative properties of the exact two-stage model are described and illustrated by least-squares fits to several real datasets. The major contribution is to show that properties of the traditional MS model family that appear to be inconsistent with empirical data for some chemicals can be explained easily if an exact, rather than an approximate, model is used. This suggests that it may be worth using the exact model in cases where tumor rates are not negligible (e.g., where they exceed 10%), which includes the majority of bioassay experiments currently being performed.

2.
In the evaluation of chemical compounds for carcinogenic risk, regulatory agencies such as the U.S. Environmental Protection Agency and National Toxicology Program (NTP) have traditionally fit a dose-response model to data from rodent bioassays, and then used the fitted model to estimate a Virtually Safe Dose, the dose corresponding to a very small increase (usually 10^-6) in risk over background. Much recent interest has been directed at incorporating additional scientific information about the specific chemical under investigation into the risk assessment process, including biological mechanisms of cancer induction, metabolic pathways, and chemical structure and activity. Although regulatory agencies are currently poised to allow nonlinear dose-response models based on the concept of an underlying threshold for nongenotoxic chemicals, there have been few attempts to investigate the overall relationship between the shape of dose-response curves and mutagenicity. Using data from a historical database of NTP cancer bioassays, the authors conducted a repeated-measures analysis of the estimated shapes from fitting extended Weibull dose-response curves. They concluded that genotoxic chemicals have dose-response curves closer to linear than those for nongenotoxic chemicals, though on average both types of compounds have convex dose-response curves and the effect of genotoxicity is small.

3.
Data from a human feeding trial with healthy men were used to develop a dose-response model for 13 strains of Salmonella and to determine the effects of strain variation on the shape of the dose-response curve. Dose-response data for individual strains were fit to a three-phase linear model to determine minimum, median, and maximum illness doses, which were used to define Pert distributions in a computer simulation model. Pert distributions for the illness dose of individual strains were combined in an Excel spreadsheet using a discrete distribution to model strain prevalence. In addition, a discrete distribution was used to model dose groups, creating a model that simulated human feeding trials. During simulation of the model with @Risk, an illness dose and a dose consumed were randomly assigned to each consumption event in the simulated feeding trial; if the illness dose was greater than the dose consumed, the model predicted no illness, and otherwise it predicted that an illness would occur. To verify the dose-response model predictions, the original feeding trial was simulated. The dose-response model predicted a median of 69 illnesses (range 43-101), compared to 74 in the original trial. Thus, its predictions were in agreement with the data used to develop it. However, the model's predictions are only valid for eggnog, healthy men, and the strains and doses of Salmonella used to develop it. When multiple strains of Salmonella were simulated together, the predicted dose-response curves were irregular in shape. Thus, the sigmoid shape of dose-response curves in feeding trials with one strain of Salmonella may not accurately reflect dose response in naturally contaminated food, where multiple strains may be present.
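The comparison step this abstract describes, predicting an illness only when the consumed dose reaches the randomly assigned illness dose, can be sketched with the standard library alone. This is a minimal sketch, not the original @Risk spreadsheet model: the Pert parameters and dose groups below are illustrative placeholders, not fitted values for any Salmonella strain, and the middle Pert parameter is treated as the mode.

```python
import random

def pert_sample(rng, minimum, mode, maximum, lamb=4.0):
    # Pert distribution expressed through its underlying beta distribution.
    alpha = 1.0 + lamb * (mode - minimum) / (maximum - minimum)
    beta = 1.0 + lamb * (maximum - mode) / (maximum - minimum)
    return minimum + rng.betavariate(alpha, beta) * (maximum - minimum)

def simulate_feeding_trial(rng, illness_dose_params, log_doses, subjects_per_dose):
    # An illness is predicted only when the consumed dose (log10 CFU)
    # reaches or exceeds the randomly assigned illness dose.
    illnesses = 0
    for log_dose in log_doses:
        for _ in range(subjects_per_dose):
            if log_dose >= pert_sample(rng, *illness_dose_params):
                illnesses += 1
    return illnesses

# Illustrative min/mode/max illness doses (log10 CFU) for a single strain.
predicted = simulate_feeding_trial(random.Random(7), (4.0, 6.5, 9.0),
                                   [5.0, 6.0, 7.0, 8.0], 6)
```

Repeating the simulation many times would yield the distribution of predicted illness counts that the study compared against the 74 observed in the original trial.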

4.
The alleviation of food-borne diseases caused by microbial pathogens remains a great concern for ensuring the well-being of the general public. The relation between the ingested dose of organisms and the associated infection risk can be studied using dose-response models. Traditionally, a single model selected according to a goodness-of-fit criterion has been used for making inferences. In this article, we propose a modified set of fractional polynomials as competitive dose-response models in risk assessment. The article shows instances where it is not obvious how to single out one best model, and illustrates that model averaging can best circumvent this dilemma. The set of candidate models is chosen based on biological plausibility and rationale, and the risk at a dose common to all these models is estimated both with the selected models and by averaging over all models using Akaike's weights. In addition to reflecting parameter estimation inaccuracy, as with a single selected model, model averaging accounts for the uncertainty arising from the other competing models. This leads to a better and more honest estimation of standard errors and construction of confidence intervals for risk estimates. The approach is illustrated for risk estimation at low dose levels based on Salmonella typhi and Campylobacter jejuni data sets in humans. Simulation studies indicate that, compared to the best-fitting model according to the Akaike information criterion, model averaging has reduced bias, better precision, and coverage probabilities closer to the 95% nominal level.
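The Akaike-weight averaging step described here takes only a few lines. In this sketch, the AIC values and per-model low-dose risk estimates are hypothetical, purely to show the mechanics of converting AIC differences into weights and averaging.

```python
import math

def akaike_weights(aic_values):
    # Convert AIC values into Akaike weights (relative model support).
    best = min(aic_values)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

def model_averaged_risk(risk_estimates, aic_values):
    # Weighted average of per-model risk estimates at a common dose.
    w = akaike_weights(aic_values)
    return sum(wi * ri for wi, ri in zip(w, risk_estimates))

# Hypothetical risk estimates at one low dose from three candidate models.
risk = model_averaged_risk([0.012, 0.020, 0.015], [210.3, 212.1, 211.0])
```

The averaged risk always lies between the smallest and largest per-model estimates, with the best-fitting (lowest-AIC) model receiving the largest weight.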

5.
Pregnant CD-1 mice were exposed to cortisone acetate at doses ranging from 20 to 100 mg/kg/day on days 10-13 by oral and intramuscular routes. Multiple replicate assays were conducted under identical conditions to assess the reproducibility of the dose-response curve for cleft palate. The data were fitted to the probit, logistic, multistage (Armitage-Doll), and Weibull dose-response models separately for each route of exposure. The curves were then tested for parallel slopes (probit and logistic models) or coincidence of model parameters (multistage and Weibull models). The 19 replicate experiments had a wide range of slope estimates, wider for the oral than for the intramuscular experiments. For all models and both routes of exposure, the null hypothesis of equal slopes was rejected at a significance level of p < 0.001. For the intramuscular group of replicates, rejection of slope equality could in part be explained by failure to maintain a standard dosing regimen. The rejection of equivalence of dose-response curves from replicate studies shows that it is difficult to reproduce the dose-response data of a single study within the limits defined by the dose-response model. This has important consequences for quantitative risk assessment, public health measures, and the development of mechanistic theories, which are typically based on a single animal bioassay.

6.
Legionnaires' disease (LD), first reported in 1976, is an atypical pneumonia caused by bacteria of the genus Legionella, most frequently L. pneumophila (Lp). Subsequent research on exposure to the organism employed various animal models; with quantitative microbial risk assessment (QMRA) techniques, the animal model data may provide insights on human dose-response for LD. This article focuses on the rationale for selecting the guinea pig model, comparison of the dose-response model results, comparison of projected low-dose responses for guinea pigs, and risk estimates for humans. Based on both in vivo and in vitro comparisons, the guinea pig (Cavia porcellus) dose-response data were selected for modeling human risk. We completed dose-response modeling with the beta-Poisson (approximate and exact), exponential, probit, logistic, and Weibull models for Lp inhalation, mortality, and infection (endpoint: elevated body temperature) in guinea pigs. For mechanistic reasons, including low-dose exposure probability, further work on human risk estimates for LD employed the exponential and beta-Poisson models. With an exposure of 10 colony-forming units (CFU) (retained dose), the QMRA model predicted a mild infection risk of 0.4 (as evaluated by seroprevalence) and a risk of a clinically severe LD case (e.g., hospitalization and supportive care) of 0.0009. The rates calculated from estimated human exposures in the outbreaks used for QMRA model validation are within an order of magnitude of the reported LD rates. These validation results suggest that the animal model selection, dose-response modeling, and extension to human risk projections were appropriate.
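The two model forms carried forward to the human risk estimates have simple closed forms and can be written directly. This is a sketch with illustrative placeholder parameters, not the fitted guinea-pig values reported in the study; the approximate beta-Poisson is parameterized here by the median infectious dose N50.

```python
import math

def exponential_response(dose, r):
    # Exponential dose-response: each retained organism independently
    # initiates infection with probability r.
    return 1.0 - math.exp(-r * dose)

def beta_poisson_response(dose, alpha, n50):
    # Approximate beta-Poisson dose-response, parameterized by the
    # median infectious dose N50 (response is 0.5 at dose == n50).
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# Illustrative parameters only, not the study's fitted values.
risk_exp = exponential_response(10.0, 0.06)
risk_bp = beta_poisson_response(10.0, 0.2, 500.0)
```

Evaluating both models at a retained dose of 10 CFU mirrors the exposure scenario used in the abstract's risk projections.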

7.
Using Average Lifetime Dose Rate for Intermittent Exposures to Carcinogens
The effect of using the average dose rate over a lifetime as a representative measure of exposure to carcinogens is investigated by comparing the true theoretical multistage intermittent-dosing lifetime low-dose excess risk to the theoretical multistage continuous-dosing lifetime risk corresponding to the average lifetime dose rate. It is concluded that low-dose risk estimates based on the average lifetime dose rate may overestimate the true risk by several orders of magnitude, but that they never underestimate the true risk by more than a factor of k/r, where k is the total number of stages in the multistage model and r is the number of stages that are dose-related.

8.
U.S. Environmental Protection Agency benchmark doses for dichotomous cancer responses are often estimated using a multistage model based on a monotonic dose-response assumption. To account for model uncertainty in the estimation process, several model averaging methods have been proposed for risk assessment. In this article, we extend the usual parameter space in the multistage model for monotonicity to allow for the possibility of a hormetic dose-response relationship. Bayesian model averaging is used to estimate the benchmark dose and to provide posterior probabilities for monotonicity versus hormesis. Simulation studies show that the newly proposed method provides robust point and interval estimation of a benchmark dose in the presence or absence of hormesis. We also apply the method to two data sets on the carcinogenic response of rats to 2,3,7,8-tetrachlorodibenzo-p-dioxin.

9.
Dose-response curves were developed for the immobilization response of Daphnia magna to four toxicants. The purpose of this work was to study the effect of the model form and the number of concentration levels on estimates of typical low-dose effective concentrations (1%, 5%, 10%). The generalized four-parameter logistic model was used as the reference. With 12 concentration levels, one of the two- or three-parameter models of the logistic family was shown to reliably represent each of these sets of dose-response data and to provide adequate estimates of the EC01 and EC05, as well as the EC10 and EC50. For two of the toxicants, an asymmetric model was required. When the number of concentrations was reduced to five, the EC10 and EC50 were well estimated by the probit model, with acceptable results at the EC05 level.
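Once a curve from the logistic family is fitted, effective concentrations follow in closed form. This sketch assumes the symmetric two-parameter log-logistic member (the study's asymmetric and probit variants differ), and the EC50 and slope values are hypothetical, chosen only to show the inversion.

```python
def log_logistic_response(dose, ec50, slope):
    # Symmetric two-parameter log-logistic curve (response fraction in 0..1).
    return 1.0 / (1.0 + (ec50 / dose) ** slope)

def effective_concentration(p, ec50, slope):
    # Dose producing response fraction p; closed-form inverse of the curve above.
    return ec50 * (p / (1.0 - p)) ** (1.0 / slope)

# Hypothetical fitted parameters: EC50 = 2.0 (concentration units), slope = 3.0.
ec10 = effective_concentration(0.10, 2.0, 3.0)
```

Low-dose quantities such as the EC01 and EC05 come from the same inverse, which is why their quality hinges on how well the fitted curve shape matches the data near the low end.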

10.
11.
The choice of a dose-response model is decisive for the outcome of quantitative risk assessment. Since their introduction, single-hit models have played a prominent role in dose-response assessment for pathogenic microorganisms. Hit theory models are based on a few simple concepts that are attractive for their clarity and plausibility. These models, in particular the Beta Poisson model, are used for extrapolation of experimental dose-response data to the low doses often present in drinking water or food products. Unfortunately, the Beta Poisson model, as it is used throughout the microbial risk literature, is an approximation whose limits of validity are not widely known. The exact functional relation is numerically complex, especially for use in optimization or uncertainty analysis. Here it is shown that although the discrepancy between the Beta Poisson formula and the exact function is not very large for many data sets, the differences are greatest at low doses, the region of interest for many risk applications. Errors may become very large, however, in the results of uncertainty analysis, or when the data contain little low-dose information. One striking property of the exact single-hit model is that it has a maximum risk curve, limiting the upper confidence level of the dose-response relation. This is because the risk cannot exceed the probability of exposure, a property that is not retained in the Beta Poisson approximation. This maximum possible response curve is important for uncertainty analysis and for risk assessment of pathogens with unknown properties.
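The gap between the approximation and the exact single-hit form can be checked numerically. A stdlib sketch, not the authors' code: the exact risk is the expectation of 1 - exp(-p * dose) over a per-organism hit probability p ~ Beta(alpha, beta), evaluated here by midpoint-rule integration; the parameter values are illustrative, picked to make the low-dose discrepancy visible.

```python
import math

def exact_beta_poisson(dose, alpha, beta, n=20000):
    # Exact single-hit risk: E[1 - exp(-p*dose)] with p ~ Beta(alpha, beta),
    # computed by midpoint-rule integration over the beta density.
    log_b = math.lgamma(alpha) + math.lgamma(beta) - math.lgamma(alpha + beta)
    total = 0.0
    for i in range(n):
        p = (i + 0.5) / n
        density = math.exp((alpha - 1.0) * math.log(p)
                           + (beta - 1.0) * math.log(1.0 - p) - log_b)
        total += (1.0 - math.exp(-p * dose)) * density / n
    return total

def approx_beta_poisson(dose, alpha, beta):
    # The widely used Beta Poisson formula; valid only when beta >> 1
    # and beta >> alpha.
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative parameters outside the approximation's comfort zone.
exact_risk = exact_beta_poisson(0.1, 0.3, 1.0)
approx_risk = approx_beta_poisson(0.1, 0.3, 1.0)
```

With these parameters the approximation overstates the low-dose risk by roughly a quarter, illustrating the abstract's point that the discrepancy concentrates in the low-dose region.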

12.
In this study, a methodology is proposed for risk analysis of dust explosion scenarios based on Bayesian networks. The methodology also employs a bow-tie diagram to better represent the logical relationships among the contributing factors and consequences of dust explosions. The risks of dust explosion scenarios are evaluated taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating, and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, estimate the envisaged risks, and identify the vulnerable parts of the system that need additional safety measures.

13.
Stochastic two-stage clonal expansion (TSCE) models of carcinogenesis offer the following clear theoretical explanation for U-shaped cancer dose-response relations. Low doses that kill initiated (premalignant) cells thereby create a protective effect. At higher doses, this effect is overwhelmed by an increase in the net number of initiated cells. The sum of these two effects, from cell killing and cell proliferation, gives a U-shaped or J-shaped dose-response relation. This article shows that exposures that do not kill, repair, or decrease cell populations, but that only hasten transitions that lead to cancer, can also generate U-shaped and J-shaped dose-response relations in a competing-risk (modified TSCE) framework where exposures disproportionately hasten transitions into carcinogenic pathways with relatively long times to tumor. Quantitative modeling of the competing effects of more transitions toward carcinogenesis (risk increasing) and a higher proportion of transitions into the slower pathway (risk reducing) shows that a J-shaped dose-response relation can occur even if exposure increases the number of initiated cells at every positive dose level. This suggests a possible new explanation for hormetic dose-response relations in response to carcinogenic exposures that do not have protective (cell-killing) effects. In addition, the examples presented emphasize the role of time in hormesis: exposures that monotonically increase risks at younger ages may nonetheless produce a U-shaped or J-shaped dose-response relation for lifetime risk of cancer.

14.
The statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages: crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show wide variation, and hence a large spread on the cycles-versus-crack-length graph; large crack growth shows comparatively less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, treating the stages as distinct phenomena. With such independent models, it is impossible to predict one phenomenon from information about another. Experimentally, crack length measurements are easier to carry out for large cracks than for nucleating and small cracks, so statistical data for large crack growth are far easier to collect than the painstaking data required for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys commonly encountered in mechanical structures and machine components. The model was validated for crack propagation by Ray (1998) against various statistical fatigue data. Based on the model, this article proposes a technique to predict statistical information on fatigue crack nucleation and small crack growth from the statistical properties of large crack growth under constant amplitude stress excitation, which can be obtained via experiments.

15.
James Chen, Risk Analysis, 1993, 13(5): 559-564
A dose-response model is often fit to bioassay data to provide a mathematical relationship between the incidence of a developmental malformation and the dose of a toxicant. To utilize the interrelations among fetal weight, malformation incidence, and the number of live fetuses, a conditional Gaussian regression chain model is proposed to model the dose-response function for developmental malformation incidence using litter size and/or fetal weight as covariates. Litter size is modeled as a function of dose; fetal weight is modeled as a function of dose conditional on litter size; and malformation incidence is modeled as a function of dose conditional on both litter size and fetal weight, which is itself conditional on litter size. Data from a developmental experiment conducted at the National Center for Toxicological Research to investigate growth stunting and the increased incidence of cleft palate induced by dexamethasone (DEX) exposure in rats were used as an illustration.

16.
In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.
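The Monte Carlo estimator described, a weighted average of gradients over the sample, can be sketched schematically. This assumes per-sample gradients in the chosen input direction are already available (the abstract notes they may come from nonparametric estimation), and uses an expected-shortfall-style distortion weight as the example; all names and values here are illustrative.

```python
def distortion_sensitivity(outputs, gradients, gamma):
    # Sensitivity of a distortion risk measure in one input direction:
    # each sample's gradient is weighted by the distortion density gamma(u)
    # evaluated at the rank u of that sample's output.
    n = len(outputs)
    order = sorted(range(n), key=lambda j: outputs[j])
    return sum(gamma((k + 0.5) / n) * gradients[order[k]]
               for k in range(n)) / n

# Expected shortfall at level 0.9 as the distortion; gamma integrates to 1.
es_weight = lambda u: 10.0 if u > 0.9 else 0.0
```

As a sanity check, for a linear model y = 2x the gradient is 2 everywhere, so any distortion weight integrating to one should return a sensitivity of 2.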

17.
The underlying assumptions of the Rai and Van Ryzin dose-response model for reproductive toxicological data are evaluated on the basis of existing experimental data. The model under consideration is unusual in its use of litter size to completely account for extra-binomial variation in the data by associating litter size with reproductive outcome. The experimental data show that controlling for litter size is not sufficient to account for the litter-to-litter variability in responses. It is also shown that the two linear components of the Rai and Van Ryzin model are inappropriate. For the component that applies to the dam, the data suggest strong nonlinearity, supported by rejection of the linear model in statistical hypothesis tests. For the component involving litter size, a relationship with dose is not apparent. The litter size parameters offer considerable potential for bias in estimation, bias that is at least partly masked by the model's good prediction characteristics, which are due to its increased number of parameters. A simulation study is presented to illustrate how the Rai and Van Ryzin model can exaggerate litter size effects on the probability of response when the simulated data arise from a model with a nonlinear dam component, common to this type of data, and no effect of litter size.

18.
Domino Effect Analysis Using Bayesian Networks
A new methodology based on Bayesian networks is introduced both to model domino effect propagation patterns and to estimate the domino effect probability at different levels. The flexible structure and the unique modeling techniques offered by Bayesian networks make it possible to analyze domino effects through a probabilistic framework, considering synergistic effects, noisy probabilities, and common cause failures. Further, the uncertainties and the complex interactions among the domino effect components are captured using the Bayesian network. The probabilities of events are updated in the light of new information, and the most probable path of the domino effect is determined on the basis of the new data gathered. This study shows how probability updating helps to update the domino effect model either qualitatively or quantitatively. The methodology is applied to a hypothetical example and also to an earlier-studied case. These examples accentuate the effectiveness of Bayesian networks in modeling domino effects in processing facilities.

19.
The probability of tumor and the hazard function are calculated in a stochastic two-stage model of carcinogenesis when the parameters of the model are time-dependent. The method used is the method of characteristics.

20.
Risk Analysis, 2018, 38(2): 255-271
Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability models to estimate operational risk, providing real-time risk analysis. The evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)