Similar Articles
20 similar articles found (search time: 15 ms)
1.
The paper applies classical statistical principles to yield new tools for risk assessment and makes new use of epidemiological data for human risk assessment. An extensive clinical and epidemiological study of workers engaged in the manufacturing and formulation of aldrin and dieldrin provides occupational hygiene and biological monitoring data on individual exposures over the years of employment, and provides unusually accurate measures of individual lifetime average daily doses. In the cancer dose-response modeling, each worker is treated as a separate experimental unit with his own unique dose. Maximum likelihood estimates of added cancer risk are calculated for multistage, multistage-Weibull, and proportional hazards models. Distributional characterizations of added cancer risk are based on bootstrap and relative likelihood techniques. The cancer mortality data on these male workers suggest that low-dose exposures to aldrin and dieldrin do not significantly increase human cancer risk and may even decrease the human hazard rate for all types of cancer combined at low doses (e.g., 1 μg/kg/day). The apparent hormetic effect in the best-fitting dose-response models for this data set is statistically significant. The decrease in cancer risk at low doses of aldrin and dieldrin is in sharp contrast to the U.S. Environmental Protection Agency's upper bound on cancer potency based on mouse liver tumors. The EPA's upper bound implies that lifetime average daily doses of 0.0000625 and 0.00625 μg/kg body weight/day would correspond to increased cancer risks of 0.000001 and 0.0001, respectively. However, the best estimate from the Pernis epidemiological data is that there is no increase in cancer risk in these workers at these doses, or even at doses as large as 2 μg/kg/day.

2.
Use of Mechanistic Models to Estimate Low-Dose Cancer Risks
Kenny S. Crump, Risk Analysis, 1994, 14(6):1033-1038
The utility of mechanistic models of cancer for predicting cancer risks at low doses is examined. Based upon a general approximation to the dose-response that is valid at low doses, it is shown that at low doses the dose-response predicted by a mechanistic model is a linear combination of the dose-responses for each of the physiological parameters in the model that are affected by exposure. This demonstrates that, unless the mechanistic model provides a theoretical basis for determining the dose-responses for these parameters, the extrapolation of risks to low doses using a mechanistic model is basically "curve fitting," just as is the case when extrapolating using statistical models. This suggests that experiments to generate data for use in mechanistic models should emphasize measuring the dose-response for dose-related parameters as accurately as possible and at the lowest feasible doses.

3.
While microbial risk assessment (MRA) has been used for over 25 years, traditional dose-response analysis has only predicted the overall risk of adverse consequences from exposure to a given dose. An important issue for consequence assessment from bioterrorist and other microbiological exposure is the distribution of cases over time due to the initial exposure. In this study, the classical exponential and beta-Poisson dose-response models were modified to include exponential-power dependency of time post inoculation (TPI) or its simplified form, exponential-reciprocal dependency of TPI, to quantify the time of onset of an effect presumably associated with the kinetics of in vivo bacterial growth. Using the maximum likelihood estimation approach, the resulting time-dose-response models were found capable of providing statistically acceptable fits to all tested pooled animal survival dose-response data. These new models can consequently describe the development of animal infectious response over time and represent observed responses fairly accurately. This is the first study showing that a time-dose-response model can be developed for describing infections initiated by various pathogens. It provides an advanced approach for future MRA frameworks.
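As a rough illustration of the idea in this abstract, the classical exponential model 1 − exp(−k·d) can be given a time post inoculation (TPI) dependency by letting the rate parameter vary with time. The exponential-reciprocal form and all parameter values below are illustrative assumptions, not the fitted values from the study:

```python
import numpy as np

def exponential_tdr(dose, t, k0, k1):
    """Time-dose-response sketch: exponential model 1 - exp(-k*d), with the
    rate k depending on time post inoculation via an exponential-reciprocal
    form k(t) = exp(k0 - k1/t). Parameters k0, k1 are hypothetical."""
    k = np.exp(k0 - k1 / t)
    return 1.0 - np.exp(-k * dose)

# With illustrative parameters, the response probability at a fixed dose
# rises with TPI toward the plain exponential model's asymptote.
for t in (1.0, 3.0, 7.0):
    print(t, exponential_tdr(dose=100.0, t=t, k0=-4.0, k1=2.0))
```

The reciprocal term makes early-time responses rare and lets the cumulative response grow as in vivo growth proceeds, which is the qualitative behavior the abstract describes.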

4.
Since the National Food Safety Initiative of 1997, risk assessment has been an important issue in food safety areas. Microbial risk assessment is a systematic process for describing and quantifying the potential of exposure to microorganisms to cause adverse health effects. Various dose-response models for estimating microbial risks have been investigated. We have considered four two-parameter models and four three-parameter models in order to evaluate variability among the models for microbial risk assessment, using infectivity and illness data from studies with human volunteers exposed to a variety of microbial pathogens. Model variability is measured in terms of estimated ED01s and ED10s, with the view that these effective dose levels correspond to the lower and upper limits of the 1% to 10% risk range generally recommended for establishing benchmark doses in risk assessment. Parameters of the statistical models are estimated using the maximum likelihood method. In this article, a weighted average of effective dose estimates from eight two- and three-parameter dose-response models, with weights determined by the Kullback information criterion, is proposed to address model uncertainties in microbial risk assessment. The proposed procedures for incorporating model uncertainties and making inferences are illustrated with human infection/illness dose-response data sets.
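The weighting scheme described here can be sketched in a few lines. The exp(−Δ/2) form below is the standard information-criterion weighting; the ED01 estimates and criterion values are hypothetical numbers for illustration, not data from the article:

```python
import numpy as np

def ic_weights(ic_values):
    """Information-criterion weights: w_i proportional to
    exp(-0.5 * (IC_i - IC_min)), normalized to sum to 1."""
    ic = np.asarray(ic_values, dtype=float)
    delta = ic - ic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

def averaged_ed(ed_estimates, ic_values):
    """Weighted average of effective-dose estimates across candidate models."""
    return float(np.dot(ic_weights(ic_values), ed_estimates))

# Hypothetical ED01 estimates from four dose-response models and their
# (hypothetical) Kullback information criterion values.
ed01 = [12.0, 15.0, 9.5, 14.0]
ic = [102.3, 101.8, 106.0, 102.0]
print(averaged_ed(ed01, ic))
```

Models with nearly equal criterion values share the weight almost evenly, while a clearly worse-fitting model contributes little, which is the intended behavior of model averaging.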

5.
The choice of a dose-response model is decisive for the outcome of quantitative risk assessment. Since their introduction, single-hit models have played a prominent role in dose-response assessment for pathogenic microorganisms. Hit-theory models are based on a few simple concepts that are attractive for their clarity and plausibility. These models, in particular the beta-Poisson model, are used for extrapolation of experimental dose-response data to the low doses often present in drinking water or food products. Unfortunately, the beta-Poisson model, as it is used throughout the microbial risk literature, is an approximation whose validity is not widely known. The exact functional relation is numerically complex, especially for use in optimization or uncertainty analysis. Here it is shown that although the discrepancy between the beta-Poisson formula and the exact function is not very large for many data sets, the differences are greatest at low doses, the region of interest for many risk applications. Errors may become very large, however, in the results of uncertainty analysis, or when the data contain little low-dose information. One striking property of the exact single-hit model is that it has a maximum risk curve, limiting the upper confidence level of the dose-response relation. This is because the risk cannot exceed the probability of exposure, a property that is not retained in the beta-Poisson approximation. This maximum possible response curve is important for uncertainty analysis, and for risk assessment of pathogens with unknown properties.
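The contrast drawn in this abstract can be made concrete. The exact single-hit beta-Poisson risk is 1 − ₁F₁(α, α+β, −d) (a confluent hypergeometric function), while the common approximation is 1 − (1 + d/β)^−α. The parameter values below are hypothetical; the exact form also respects the bound that risk cannot exceed the probability of exposure, 1 − exp(−d):

```python
import numpy as np
from scipy.special import hyp1f1

def beta_poisson_exact(dose, alpha, beta):
    """Exact single-hit beta-Poisson: P = 1 - 1F1(alpha, alpha+beta, -dose)."""
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

def beta_poisson_approx(dose, alpha, beta):
    """Widely used approximation: P = 1 - (1 + dose/beta)^(-alpha);
    adequate only when beta >> 1 and alpha << beta."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Hypothetical parameters, for illustration only.
alpha, beta = 0.2, 40.0
for d in (0.1, 1.0, 10.0):
    print(d, beta_poisson_exact(d, alpha, beta), beta_poisson_approx(d, alpha, beta))
```

With these parameters the two curves nearly coincide, but for parameter combinations violating the approximation's conditions (and in uncertainty analysis, where such combinations are sampled) the discrepancy grows, as the abstract notes.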

6.
Charles N. Haas, Risk Analysis, 2011, 31(10):1576-1596
Human brucellosis is one of the most common zoonotic diseases worldwide. Disease transmission often occurs through the handling of domestic livestock, as well as ingestion of unpasteurized milk and cheese, but the agent can have enhanced infectivity if aerosolized. Because there is no human vaccine available, rising concerns about the threat of brucellosis to human health, and its inclusion in the Centers for Disease Control and Prevention's Category B Bioterrorism/Select Agent List, make a better understanding of the dose-response relationship of this microbe necessary. Through an extensive peer-reviewed literature search, candidate dose-response data were appraised so as to surpass certain standards for quality. The statistical programming language R was used to compute maximum likelihood estimates to fit two models, the exponential and the approximate beta-Poisson (widely used for quantitative risk assessment), to dose-response data. Dose-response models were generated for prevalent species of Brucella: Br. suis, Br. melitensis, and Br. abortus. Dose-response models were created for aerosolized Br. suis exposure to guinea pigs from pooled studies. A parallel model for guinea pigs inoculated through both aerosol and subcutaneous routes with Br. melitensis showed that the median infectious dose corresponded to a dose of about 30 colony-forming units (CFU) for Br. suis, much less than the N50 dose of about 94 CFU for Br. melitensis organisms. When Br. melitensis was tested subcutaneously on mice, the N50 dose was higher, 1,840 CFU. A dose-response model was constructed from pooled data for mice, rhesus macaques, and humans inoculated through three routes (subcutaneous/aerosol/intradermal) with Br. melitensis.
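For readers unfamiliar with the N50 values quoted above: under the approximate beta-Poisson model, the median infectious dose follows directly from the fitted parameters via the standard relation N50 = β(2^(1/α) − 1). The parameter values below are hypothetical, not the fitted Brucella parameters:

```python
def n50(alpha, beta):
    """Median infectious dose implied by an approximate beta-Poisson fit:
    the dose at which P = 0.5, i.e. N50 = beta * (2**(1/alpha) - 1)."""
    return beta * (2.0 ** (1.0 / alpha) - 1.0)

# Hypothetical (alpha, beta) pair; each N50 reported in the abstract
# (about 30, about 94, and 1,840 CFU) corresponds to some such pair.
print(n50(0.5, 20.0))
```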

7.
Climate change may impact waterborne and foodborne infectious disease, but to what extent is uncertain. Estimating climate‐change‐associated relative infection risks from exposure to viruses, bacteria, or parasites in water or food is critical for guiding adaptation measures. We present a computational tool for strategic decision making that describes the behavior of pathogens using location‐specific input data under current and projected climate conditions. Pathogen‐pathway combinations are available for exposure to norovirus, Campylobacter, Cryptosporidium, and noncholera Vibrio species via drinking water, bathing water, oysters, or chicken fillets. Infection risk outcomes generated by the tool under current climate conditions correspond with those published in the literature. The tool demonstrates that increasing temperatures lead to increasing risks for infection with Campylobacter from consuming raw/undercooked chicken fillet and for Vibrio from water exposure. Increasing frequencies of drought generally lead to an elevated infection risk of exposure to persistent pathogens such as norovirus and Cryptosporidium, but decreasing risk of exposure to rapidly inactivating pathogens, like Campylobacter. The opposite is the case with increasing annual precipitation; an upsurge of heavy rainfall events leads to more peaks in infection risks in all cases. The interdisciplinary tool presented here can be used to guide climate change adaptation strategies focused on infectious diseases.

8.
Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single‐hit dose‐response models are the most commonly used dose‐response models in QMRA. Denoting by P(d) the probability of infection at a given mean dose d, a three‐parameter generalized QMRA beta‐Poisson dose‐response model, P(d | α, β, ρ), is proposed in which the minimum number of organisms required to cause infection, Kmin, is not fixed but is a random variable following a geometric distribution with parameter ρ. The single‐hit beta‐Poisson model, P(d | α, β), is a special case of the generalized model with Kmin = 1 (which implies ρ = 1). The generalized beta‐Poisson model is based on a conceptual model with greater detail in the dose‐response mechanism. Since a maximum likelihood solution is not easily available, a likelihood‐free approximate Bayesian computation (ABC) algorithm is employed for parameter estimation. By fitting the generalized model to four experimental data sets from the literature, this study reveals that the posterior median estimates produced fall short of meeting the required condition of ρ = 1 for the single‐hit assumption. However, for three of the four data sets, the generalized model could not achieve an improvement in goodness of fit. These combined results imply that, at least in some cases, a single‐hit assumption for characterizing the dose‐response process may not be appropriate, but that the more complex models may be difficult to support, especially if the sample size is small. The three‐parameter generalized model provides a possibility to investigate the mechanism of a dose‐response process in greater detail than is possible under a single‐hit model.
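The likelihood-free estimation idea mentioned here, ABC by rejection, is easy to sketch on a toy problem. For brevity the sketch fits a one-parameter exponential model rather than the three-parameter generalized beta-Poisson; the data, prior, and tolerance are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy quantal data (mean dose, subjects, infected), fabricated for the demo
# from an exponential model with k near 0.01.
doses = np.array([50.0, 200.0, 800.0])
n = np.array([20, 20, 20])
y = np.array([8, 17, 20])

def simulate(k):
    """Simulate infection counts under the exponential model 1 - exp(-k*d)."""
    return rng.binomial(n, 1.0 - np.exp(-k * doses))

# ABC rejection: draw k from a log-uniform prior, keep draws whose simulated
# counts land close to the observed counts (summed absolute difference <= 3).
accepted = []
for _ in range(20000):
    k = 10 ** rng.uniform(-4, 0)
    if np.abs(simulate(k) - y).sum() <= 3:
        accepted.append(k)

print(len(accepted), np.median(accepted))
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is exactly why ABC suits models, like the generalized beta-Poisson, whose likelihood is awkward to compute.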

9.
The effect of bioaerosol size was incorporated into predictive dose‐response models for the effects of inhaled aerosols of Francisella tularensis (the causative agent of tularemia) on rhesus monkeys and guinea pigs, with bioaerosol diameters ranging between 1.0 and 24 μm. Aerosol‐size‐dependent models were formulated as modifications of the exponential and β‐Poisson dose‐response models, and model parameters were estimated using maximum likelihood methods and multiple data sets of quantal dose‐response data for which the aerosol sizes of inhaled doses were known. The F. tularensis dose‐response data were best fit by an exponential dose‐response model with a power function of particle diameter substituting for the rate parameter k scaling the applied dose. The pathogen's aerosol‐size‐dependence equation differed from, and the resulting models better represent the observed dose‐response results than, the estimate derived from applying the model developed by the International Commission on Radiological Protection (ICRP, 1994), which relies on differential regional lung deposition for human particle exposure.

10.
Tucker Burch, Risk Analysis, 2019, 39(3):599-615
The assumptions underlying quantitative microbial risk assessment (QMRA) are simple and biologically plausible, but QMRA predictions have never been validated for many pathogens. The objective of this study was to validate QMRA predictions against epidemiological measurements from outbreaks of waterborne gastrointestinal disease. I screened 2,000 papers and identified 12 outbreaks with the necessary data: disease rates measured using epidemiological methods and pathogen concentrations measured in the source water. Eight of the 12 outbreaks were caused by Cryptosporidium, three by Giardia, and one by norovirus. Disease rates varied from 5.5 × 10⁻⁶ to 1.1 × 10⁻² cases/person‐day, and reported pathogen concentrations varied from 1.2 × 10⁻⁴ to 8.6 × 10² per liter. I used these concentrations with single‐hit dose–response models for all three pathogens to conduct QMRA, producing both point and interval predictions of disease rates for each outbreak. Comparison of QMRA predictions to epidemiological measurements showed good agreement; interval predictions contained measured disease rates for 9 of 12 outbreaks, with point predictions off by factors of 1.0–120 (median = 4.8). Furthermore, 11 outbreaks occurred at mean doses of less than 1 pathogen per exposure. Measured disease rates for these outbreaks were clearly consistent with a single‐hit model, and not with a "two‐hit" threshold model. These results demonstrate the validity of QMRA for predicting disease rates due to Cryptosporidium and Giardia.
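The forward QMRA calculation validated in this abstract is short: mean dose is concentration times ingested volume, and a single-hit model converts dose to risk, even when the mean dose is below one organism per exposure. The concentration, volume, and dose-response parameter below are hypothetical placeholders, not values from any of the 12 outbreaks:

```python
import numpy as np

def qmra_daily_risk(conc_per_L, volume_L, r):
    """Per-day infection risk from an exponential single-hit model:
    mean dose = concentration * ingested volume; P = 1 - exp(-r * dose).
    Valid for fractional mean doses (< 1 organism per exposure)."""
    dose = conc_per_L * volume_L
    return 1.0 - np.exp(-r * dose)

# Hypothetical inputs: 0.05 oocysts/L in source water, 1 L/day consumed,
# and an assumed per-organism infectivity r = 0.09.
print(qmra_daily_risk(0.05, 1.0, 0.09))
```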

11.
This article describes the development of a weighted composite dose-response model for human salmonellosis. Data from previously reported human challenge studies were categorized into two different groups representing low and moderately virulent/pathogenic Salmonella strains based on a disease end point. Because epidemiological data indicate that some Salmonella strains are particularly pathogenic, and in the absence of human feeding study data for such strains, Shigella dysenteriae was used as a proxy for highly virulent strains. Three single-hit dose-response models were applied to the human feeding study data and evaluated for best fit using maximum likelihood estimation: (1) the exponential (E-1pop), (2) the two-subpopulation exponential (E-2pop), and (3) the Beta-Poisson (BP). Based on the goodness-of-fit test, the E-1pop and BP were the best-fit models for low and moderately virulent/pathogenic Salmonella strains, and the E-2pop and BP models were better for highly virulent/pathogenic strains. Epistemic analysis was conducted by determining the degree of confidence associated with the selected models, which was found to be 50%/50% (E-1pop/BP) for low and moderately pathogenic Salmonella strains, and 9.8%/90.2% (E-2pop/BP) for highly virulent strains. The degree of confidence for each component model and variations in the proportion of strains within each virulence/pathogenicity category were incorporated into the overall composite model. This study describes the influence of variation in strain virulence and host susceptibility on the shape of the population dose-response relationship.

12.
Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson‐distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single‐hit" dose‐response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose‐response models in terms of probability generating functions. It is shown formally that the theoretical single‐hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single‐hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single‐hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose‐response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose‐response assessment as well as practical risk characterization are discussed.
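The probability-generating-function formulation used in this abstract gives the single-hit risk as 1 − G(1 − r), where G is the PGF of the pathogen count and r the per-organism infection probability. Comparing Poisson counts with negative binomial counts (a stuttering Poisson special case) at the same mean dose illustrates the paper's main inequality; the parameter values are illustrative:

```python
import numpy as np

def risk_poisson(mu, r):
    """Single-hit risk for Poisson counts: G(s) = exp(-mu*(1-s)),
    so risk = 1 - G(1-r) = 1 - exp(-mu*r)."""
    return 1.0 - np.exp(-mu * r)

def risk_negbin(mu, r, k):
    """Single-hit risk for negative binomial counts with dispersion k:
    G(s) = (1 + mu*(1-s)/k)**(-k), so risk = 1 - (1 + mu*r/k)**(-k)."""
    return 1.0 - (1.0 + mu * r / k) ** (-k)

# At identical mean dose, clustering (overdispersion, small k) lowers the
# single-hit risk; the gap is most visible at low mean doses.
for mu in (0.1, 1.0, 10.0):
    print(mu, risk_poisson(mu, r=0.5), risk_negbin(mu, r=0.5, k=0.5))
```

The inequality risk_negbin ≤ risk_poisson follows from ln(1 + u) ≤ u, mirroring the formal result stated in the abstract.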

13.
In this paper, we propose the incremental group testing model for the gap closing problem, which assumes that we can tell the difference between the outcome of testing a subset S and the outcome of testing S ∪ {x}. We also give improvements over the current best results in the literature for some other models.

14.
We compare the regulatory implications of applying the traditional (linearized) and exact two-stage dose-response models to animal carcinogenicity data. We analyze dose-response data from six studies, representing five different substances, and determine the goodness of fit of each model as well as the 95% lower confidence limit of the dose corresponding to a target excess risk of 10⁻⁵ (the target risk dose, TRD). For the two concave data sets, we find that the exact model gives a substantially better fit to the data than the traditional model, and that the exact model gives a TRD that is an order of magnitude lower than that given by the traditional model. In the other cases, the exact model gives a fit equivalent to or better than the traditional model. We also show that although the exact two-stage model may exhibit dose-response concavity at moderate dose levels, it is always linear or sublinear, and never supralinear, in the low-dose limit. Because regulatory concern is almost always confined to low-dose extrapolation, supralinear behavior seems not to be of regulatory concern in the exact two-stage model. Finally, we find that when performing this low-dose extrapolation in cases of dose-response concavity, extrapolating the model fit leads to a more conservative TRD than taking a linear extrapolation from 10% excess risk. We conclude with a set of recommendations.

15.
Uncertainty in Cancer Risk Estimates
Several existing databases compiled by Gold et al.(1-3) for carcinogenesis bioassays are examined to obtain estimates of the reproducibility of cancer rates across experiments, strains, and rodent species. A measure of carcinogenic potency is given by the TD50 (the daily dose that causes a tumor type in 50% of the exposed animals that otherwise would not develop the tumor in a standard lifetime). The lognormal distribution can be used to model the uncertainty of the estimates of potency (TD50) and of the ratio of TD50s between two species. For near-replicate bioassays, approximately 95% of the TD50s are estimated to be within a factor of 4 of the mean. Between strains, about 95% of the TD50s are estimated to be within a factor of 11 of their mean, and the pure genetic component of variability is accounted for by a factor of 6.8. Between rats and mice, about 95% of the TD50s are estimated to be within a factor of 32 of the mean, while between humans and experimental animals the factor is 110 for the 20 chemicals reported by Allen et al.(4) The common practice of basing cancer risk estimates on the most sensitive rodent species-strain-sex and using interspecies dose scaling based on body surface area appears to overestimate cancer rates for these 20 human carcinogens by about one order of magnitude on average. Hence, for chemicals where the dose-response is nearly linear below experimental doses, cancer risk estimates based on animal data are not necessarily conservative and may range from a factor of 10 too low for human carcinogens up to a factor of 1000 too high for approximately 95% of the chemicals tested to date. These limits may need to be modified for specific chemicals where additional mechanistic or pharmacokinetic information suggests alterations, or where particularly sensitive subpopulations may be exposed. Supralinearity could lead to anticonservative estimates of cancer risk. Underestimating cancer risk by a specific factor has a much larger impact on the actual number of cancer cases than overestimating smaller risks by the same factor. This paper does not address the uncertainties in high-to-low-dose extrapolation. If the dose-response is sufficiently nonlinear at low doses to produce cancer risks near zero, then low-dose risk estimates based on linear extrapolation are likely to overestimate risk, and the limits of uncertainty cannot be established.
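The "within a factor of F" statements above translate directly into lognormal spread parameters: if about 95% of values lie within a multiplicative factor F of the central value, the log-scale standard deviation is ln(F)/1.96. A small sketch using the factors reported in the abstract:

```python
import numpy as np

def lognormal_sigma(factor, z=1.96):
    """Log-scale standard deviation implied by '~95% of values within a
    multiplicative `factor` of the center' under a lognormal model."""
    return np.log(factor) / z

# Factors from the abstract: near-replicate (4), between-strain (11),
# rat-vs-mouse (32), and human-vs-animal (110) reproducibility of TD50.
for f in (4, 11, 32, 110):
    print(f, lognormal_sigma(f))
```

This conversion is only a convenience for comparing the reported factors on a common scale; it assumes the lognormal model the abstract itself adopts.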

16.
Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. It is thought that a substantial portion of human T. gondii infections is acquired through the consumption of meat. The dose‐response relationship for human exposure to T. gondii‐infected meat is unknown because no human data are available. The goal of this study was to develop and validate dose‐response models based on animal studies, and to compute scaling factors so that animal‐derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected, and appropriate studies were selected based on animal species, stage, genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal‐shaped mathematical models, and model parameters were estimated using maximum likelihood estimation. Data from a mouse study were selected to develop the dose‐response relationship. The exponential and beta‐Poisson models, which predicted similar responses, were selected as reasonable dose‐response models based on their simplicity, biological plausibility, and goodness of fit. Confidence intervals for the parameters were determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching the predicted infection cases with the epidemiological data. The mouse‐derived models were validated against data for the dose‐infection relationship in rats. A human dose‐response model was developed as P(d) = 1 − exp(−0.0015 × 0.005 × d) or P(d) = 1 − (1 + d × 0.003 / 582.414)^−1.479. Both models predict the human response after consuming T. gondii‐infected meat, and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
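The two human dose-response models quoted in this abstract can be evaluated directly as written (the numeric constants below come from the abstract itself; the doses chosen for evaluation are arbitrary):

```python
import numpy as np

def p_exponential(d):
    """Human exponential model from the abstract: the 0.0015 scaling factor
    applied to the animal-derived rate 0.005."""
    return 1.0 - np.exp(-0.0015 * 0.005 * d)

def p_beta_poisson(d):
    """Human approximate beta-Poisson model from the abstract."""
    return 1.0 - (1.0 + d * 0.003 / 582.414) ** (-1.479)

# The abstract notes the two models predict similar responses; comparing
# them over a few (arbitrary) doses shows the agreement at low doses.
for d in (1e2, 1e4, 1e6):
    print(d, p_exponential(d), p_beta_poisson(d))
```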

17.
Food‐borne infection is caused by intake of foods or beverages contaminated with microbial pathogens. Dose‐response modeling is used to estimate exposure levels of pathogens associated with specific risks of infection or illness. When a single dose‐response model is used and confidence limits on infectious doses are calculated, only data uncertainty is captured. We propose a method to estimate the lower confidence limit on an infectious dose by including model uncertainty and separating it from data uncertainty. The infectious dose is estimated by a weighted average of effective dose estimates from a set of dose‐response models via a Kullback information criterion. The confidence interval for the infectious dose is constructed by the delta method, where data uncertainty is addressed by a bootstrap method. To evaluate the actual coverage probabilities of the lower confidence limit, a Monte Carlo simulation study is conducted under sublinear, linear, and superlinear dose‐response shapes that can be commonly found in real data sets. Our model‐averaging method achieves coverage close to nominal in almost all cases, thus providing a useful and efficient tool for accurate calculation of lower confidence limits on infectious doses.

18.
The 2-INTERVAL PATTERN problem is to find the largest constrained pattern in a set of 2-intervals. The constrained pattern is a subset of the given 2-intervals such that any pair of them are R-comparable, where the model R ⊆ { <, ⊏, ≬ }. The problem stems from the study of general representations of RNA secondary structures. In this paper, we give three improved algorithms for different models. First, an O(n log n + L) algorithm is proposed for the case R = { <, ≬ }, where L is the total length of all 2-intervals (the density d is the maximum number of 2-intervals over any point). This improves the previous O(n² log n) algorithm. Second, we use dynamic programming techniques to obtain an O(n log n + dn) algorithm for the case R = { <, ⊏ }, which improves the previous O(n²) result. Finally, we present another algorithm for the remaining model with disjoint support (interval ground set), which improves the previous upper bound. A preliminary version of this article appears in Proceedings of the 16th Annual International Symposium on Algorithms and Computation, Springer LNCS, Vol. 3827, pp. 412–421, Hainan, China, December 19–21, 2005.

19.
This study develops dose-response models for Ebolavirus using previously published data sets from the open literature. Two such articles were identified, in which three different species of nonhuman primates were challenged with aerosolized Ebolavirus in order to study pathology and clinical disease progression. Dose groups were combined and pooled across each study in order to facilitate modeling. The endpoint of each experiment was death. The exponential and exact beta-Poisson models were fit to the data using maximum likelihood estimation. The exact beta-Poisson was deemed the recommended model because it more closely approximated the probability of response at low doses, though both models provided a good fit. Although transmission is generally considered to be dominated by person-to-person contact, aerosolization is a possible route of exposure. If it occurs, this route of exposure could be particularly concerning for persons in occupational roles managing contaminated liquid wastes from patients being treated for Ebola infection, and for the wastewater community responsible for disinfection. This study therefore produces a necessary mathematical relationship between exposure dose and risk of death for the inhalation route of exposure, which can support quantitative microbial risk assessment aimed at informing risk mitigation strategies, including personal protection policies against occupational exposures.
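The maximum likelihood fitting step described in this abstract amounts to maximizing a binomial likelihood over quantal (dose, deaths/animals) data. A minimal sketch for the simpler of the two models, the exponential, on fabricated data (the dose groups and counts below are invented, not the pooled Ebolavirus data):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Fabricated quantal data for illustration: dose, animals per group, deaths.
doses = np.array([10.0, 100.0, 1000.0])
n = np.array([6, 6, 6])
deaths = np.array([1, 4, 6])

def neg_log_lik(log_k):
    """Binomial negative log-likelihood for the exponential model
    P(d) = 1 - exp(-k*d), parameterized on the log scale for stability."""
    p = 1.0 - np.exp(-np.exp(log_k) * doses)
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -np.sum(deaths * np.log(p) + (n - deaths) * np.log(1.0 - p))

fit = minimize_scalar(neg_log_lik, bounds=(-12.0, 0.0), method="bounded")
k_hat = np.exp(fit.x)
print(k_hat, np.log(2) / k_hat)  # MLE of k and the implied LD50
```

The same likelihood machinery extends to the two-parameter exact beta-Poisson by replacing P(d) and optimizing over both parameters.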

20.
In this study we introduce a generalized support vector classification problem: Let X_i, i=1,…,n be mutually exclusive sets of pattern vectors such that all pattern vectors x_{i,k}, k=1,…,|X_i| have the same class label y_i. Select only one pattern vector x_{i,k*} from each set X_i such that the margin between the sets of selected positive and negative pattern vectors is maximized. This problem is formulated as a quadratic mixed 0-1 programming problem, which is a generalization of the standard support vector classifier. The quadratic mixed 0-1 formulation is shown to be NP-hard. An alternative approach is proposed with the free slack concept. Primal and dual formulations are introduced for linear and nonlinear classification. These formulations give the separating hyperplane the flexibility to identify the pattern vectors with large margin. Iterative elimination and direct selection methods are developed to select such pattern vectors using the alternative formulations. These methods are compared with a naïve method on simulated data. The iterative elimination method is also applied to neural data from a visuomotor categorical discrimination task to classify highly cognitive brain activities.
