Similar Literature (20 records)
1.
Applying a hockey stick parametric dose-response model to data on late or retarded development in Iraqi children exposed in utero to methylmercury, with mercury (Hg) exposure characterized by the peak Hg concentration in mothers' hair during pregnancy, Cox et al. calculated the "best statistical estimate" of the threshold for health effects as 10 ppm Hg in hair, with a 95% range of uncertainty of between 0 and 13.6 ppm.(1) A new application of the hockey stick model to the Iraqi data shows, however, that the statistical upper limit of the threshold based on the hockey stick model could be as high as 255 ppm. Furthermore, the maximum likelihood estimate of the threshold using a different parametric model is virtually zero. These and other analyses demonstrate that threshold estimates based on parametric models exhibit high statistical variability and model dependency, and are highly sensitive to the precise definition of an abnormal response. Consequently, they are not a reliable basis for setting a reference dose (RfD) for methylmercury. Benchmark analyses and statistical analyses useful for deriving NOAELs are also presented. We believe these latter analyses—particularly the benchmark analyses—generally form a sounder basis for determining RfDs than the type of hockey stick analysis presented by Cox et al. However, the acute nature of the exposures, as well as other limitations in the Iraqi data, suggest that other data may be more appropriate for determining acceptable human exposures to methylmercury.
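
The hockey stick form referred to above is a flat background response up to a threshold dose, followed by a linear increase. The sketch below fits that form by derivative-free least squares to made-up response data; the data points, starting values, and fitted numbers are purely illustrative and are not the Iraqi cohort values.

```python
import numpy as np
from scipy.optimize import minimize

def hockey_stick(dose, background, threshold, slope):
    """Flat background response below the threshold, linear increase above it."""
    return background + slope * np.maximum(dose - threshold, 0.0)

# Illustrative data only (not the Iraqi cohort): response vs. hair Hg (ppm).
dose = np.array([1, 5, 10, 20, 50, 100, 150, 200, 250, 300], dtype=float)
resp = np.array([0.05, 0.06, 0.04, 0.07, 0.10, 0.22, 0.35, 0.48, 0.60, 0.75])

def sse(params):
    """Sum of squared errors for a candidate (background, threshold, slope)."""
    return np.sum((resp - hockey_stick(dose, *params)) ** 2)

fit = minimize(sse, x0=[0.05, 30.0, 0.002], method="Nelder-Mead")
background, threshold, slope = fit.x
print(f"estimated threshold ≈ {threshold:.1f} ppm, slope ≈ {slope:.4f} per ppm")
```

Refitting such a model to resampled or perturbed data is one simple way to see the high statistical variability of the threshold estimate that the abstract describes.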

2.
The effect of bioaerosol size was incorporated into predictive dose-response models for the effects of inhaled aerosols of Francisella tularensis (the causative agent of tularemia) on rhesus monkeys and guinea pigs, with bioaerosol diameters ranging between 1.0 and 24 μm. Aerosol-size-dependent models were formulated as modifications of the exponential and β-Poisson dose-response models, and model parameters were estimated using maximum likelihood methods and multiple data sets of quantal dose-response data for which the aerosol sizes of inhaled doses were known. The F. tularensis dose-response data were best fit by an exponential dose-response model in which a power function of particle diameter substitutes for the rate parameter k scaling the applied dose. The pathogen's fitted aerosol-size dependence differed from, and the fitted models represented the observed dose-response results better than, the estimate derived from applying the model developed by the International Commission on Radiological Protection (ICRP, 1994), which relies on differential regional lung deposition for human particle exposure.
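
A minimal sketch of an exponential dose-response model whose rate parameter is a power function of aerosol diameter, as described above. The functional form k(D) = a·D^b and the parameter values are illustrative assumptions, not the fitted F. tularensis values.

```python
import numpy as np

def exp_dose_response_sized(dose, diameter_um, a=1e-4, b=-1.5):
    """Exponential dose-response in which the rate parameter k is a power
    function of aerosol diameter. The values of a and b are placeholders
    for illustration, not the maximum likelihood estimates."""
    k = a * diameter_um ** b
    return 1.0 - np.exp(-k * dose)

# Same applied dose, three aerosol diameters (μm): response falls as
# particles get larger under this assumed parameterization.
for diameter in (1.0, 12.0, 24.0):
    print(diameter, round(exp_dose_response_sized(1e5, diameter), 4))
```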

3.
Kevin M. Crofton. Risk Analysis, 2012, 32(10): 1784-1797
Traditional additivity models provide little flexibility in modeling the dose–response relationships of the single agents in a mixture. While the flexible single chemical required (FSCR) methods allow greater flexibility, their implicit nature is an obstacle to forming the parameter covariance matrix, which is the basis for many statistical optimality design criteria. The goal of this effort is to develop a method for constructing the parameter covariance matrix for FSCR models, so that (local) alphabetic optimality criteria can be applied. Data from Crofton et al. are provided as motivation; in an experiment designed to determine the effect of 18 polyhalogenated aromatic hydrocarbons on serum total thyroxine (T4), the interaction among the chemicals was statistically significant. Gennings et al. fit the FSCR interaction threshold model to the data. The resulting estimate of the interaction threshold was positive and within the observed dose region, providing evidence of a dose-dependent interaction. However, the corresponding likelihood-ratio-based confidence interval was wide and included zero. In order to estimate the location of the interaction threshold more precisely, supplemental data are required. Using the available data as the first stage, the Ds-optimal second-stage design criterion was applied to minimize the variance of the hypothesized interaction threshold. Practical concerns associated with the resulting design are discussed and addressed using the penalized optimality criterion. Results demonstrate that the penalized Ds-optimal second-stage design can be used to define the interaction threshold more precisely while maintaining the characteristics deemed important in practice.

4.
A novel method was used to incorporate in vivo host–pathogen dynamics into a new robust outbreak model for legionellosis. Dose-response and time-dose-response (TDR) models were generated for Legionella longbeachae exposure to mice via the intratracheal route using a maximum likelihood estimation approach. The best-fit TDR model was then incorporated into two L. pneumophila outbreak models: an outbreak that occurred at a spa in Japan, and one that occurred in a Melbourne aquarium. The best-fit TDR from the murine dosing study was the beta-Poisson with exponential-reciprocal dependency model, which had a minimized deviance of 32.9. This model was tested against other incubation distributions in the Japan outbreak, and performed consistently well, with reported deviances ranging from 32 to 35. In the case of the Melbourne outbreak, the exponential model with exponential dependency was tested against non-time-dependent distributions to explore the performance of the time-dependent model with the lowest number of parameters. This model reported low minimized deviances around 8 for the Weibull, gamma, and lognormal exposure distribution cases. This work shows that the incorporation of a time factor into outbreak distributions provides models with acceptable fits that can provide insight into the in vivo dynamics of the host-pathogen system.

5.
In the case of low-dose exposure to a substance, its concentration in cells is likely to be stochastic. Assessing the consequences of this stochasticity in toxicological risk assessment requires coupling macroscopic dynamic models describing whole-body kinetics with microscopic tools designed to simulate stochasticity. In this article, we propose an approach to approximate the stochastic cell-level concentration of butadiene in the cells of diverse organs. We adapted the dynamics equations of a physiologically based pharmacokinetic (PBPK) model and used a stochastic simulator for the system of equations that we derived. We then coupled the kinetics simulations with a deterministic hockey stick model of carcinogenicity. Stochasticity induced substantial modifications of the dose-response curve compared with the deterministic situation. In particular, the response was nonlinear and the stochastic apparent threshold was lower than the deterministic one. The approach that we developed could easily be extended to other biological studies to assess the macroscopic-scale influence of stochasticity in compound dynamics at the cell level.
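
A toy Monte Carlo illustration of the coupling described above, under simplifying assumptions: cell-level doses are drawn as Poisson molecule counts around the deterministic concentration and pushed through a deterministic hockey-stick response whose threshold and slope are invented for illustration. This is a crude stand-in for the PBPK-coupled stochastic simulator of the article, but it reproduces the qualitative effect of a lower apparent threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def hockey_stick_response(x, threshold=50.0, slope=0.01):
    """Deterministic hockey-stick carcinogenicity model (illustrative parameters)."""
    return slope * np.maximum(x - threshold, 0.0)

def stochastic_response(mean_molecules, n_cells=10_000):
    """Draw Poisson molecule counts per cell around the deterministic mean and
    average the deterministic response over cells."""
    counts = rng.poisson(mean_molecules, size=n_cells)
    return hockey_stick_response(counts).mean()

# Below the deterministic threshold (mean 40), the stochastic response is
# already positive because some cells exceed the threshold.
for mu in (10, 40, 50, 60, 100):
    print(mu, float(hockey_stick_response(mu)), round(stochastic_response(mu), 4))
```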

6.
Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. It is thought that a substantial portion of human T. gondii infections is acquired through the consumption of meats. The dose-response relationship for human exposures to T. gondii-infected meat is unknown because no human data are available. The goal of this study was to develop and validate dose-response models based on animal studies, and to compute scaling factors so that animal-derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected and appropriate studies were selected based on animal species, stage, genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal-shaped mathematical models, and model parameters were estimated using maximum likelihood estimation. Data from a mouse study were selected to develop the dose-response relationship. The exponential and beta-Poisson models, which predicted similar responses, were selected as reasonable dose-response models based on their simplicity, biological plausibility, and goodness of fit. A confidence interval for the parameter was determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching the predicted infection cases with the epidemiological data. Mouse-derived models were validated against data for the dose-infection relationship in rats. A human dose-response model was developed as P(d) = 1 − exp(−0.0015 × 0.005 × d) or P(d) = 1 − (1 + d × 0.003/582.414)^(−1.479). Both models predict the human response after consuming T. gondii-infected meats, and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
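
The two human dose-response expressions quoted above translate directly into code. The sketch below simply evaluates them; the only assumption added here is that d denotes the ingested dose (number of organisms).

```python
import numpy as np

def p_exponential(d):
    """Human T. gondii dose-response from the abstract: exponential model with
    the animal-derived rate multiplied by the 0.005 scaling factor."""
    return 1.0 - np.exp(-0.0015 * 0.005 * d)

def p_beta_poisson(d):
    """Approximate beta-Poisson form quoted in the abstract."""
    return 1.0 - (1.0 + d * 0.003 / 582.414) ** (-1.479)

# The two models predict similar responses over a wide dose range.
for dose in (1e2, 1e4, 1e6):
    print(dose, round(p_exponential(dose), 4), round(p_beta_poisson(dose), 4))
```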

7.
We consider the problem of estimating the probability of detection (POD) of flaws in an industrial steel component. Modeled as an increasing function of the flaw height, the POD characterizes the detection process; it is also involved in the estimation of the flaw size distribution, a key input parameter of physical models describing the behavior of the steel component when submitted to extreme thermodynamic loads. Such models are used to assess the resistance of highly reliable systems whose failures are seldom observed in practice. We develop a Bayesian method to estimate the flaw size distribution and the POD function, using flaw height measures from periodic in-service inspections conducted with an ultrasonic detection device, together with measures from destructive lab experiments. Our approach, based on approximate Bayesian computation (ABC) techniques, is applied to a real data set and compared to maximum likelihood estimation (MLE) and a more classical approach based on Markov Chain Monte Carlo (MCMC) techniques. In particular, we show that the parametric model describing the POD as the cumulative distribution function (cdf) of a log-normal distribution, though often used in this context, can be invalidated by the data at hand. We propose an alternative nonparametric model, which assumes no predefined shape, and extend the ABC framework to this setting. Experimental results demonstrate the ability of this method to provide a flexible estimation of the POD function and describe its uncertainty accurately.
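
To make the ABC idea above concrete, here is a toy rejection-ABC sketch for a POD curve modeled as the cdf of a log-normal distribution (the parametric case discussed in the abstract). The inspection data are simulated, the summary statistic is just the total detection count, and the tolerance is arbitrary; all of that is invented for illustration and is far simpler than the actual application.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Simulated inspection data (not the real component data): flaw heights (mm)
# and whether the ultrasonic device detected them.
heights = rng.lognormal(mean=0.5, sigma=0.6, size=200)
true_pod = norm.cdf((np.log(heights) - 0.8) / 0.5)
detected = rng.uniform(size=200) < true_pod

def simulate_detections(mu, sigma):
    """Simulate detections under POD(h) = Phi((log h - mu) / sigma)."""
    pod = norm.cdf((np.log(heights) - mu) / sigma)
    return rng.uniform(size=heights.size) < pod

# Rejection ABC: keep (mu, sigma) draws whose simulated detection count is
# within a small tolerance of the observed count (a deliberately crude summary).
accepted = []
for _ in range(20_000):
    mu, sigma = rng.uniform(0, 2), rng.uniform(0.1, 1.5)
    if abs(simulate_detections(mu, sigma).sum() - detected.sum()) <= 2:
        accepted.append((mu, sigma))

accepted = np.array(accepted)
print("approximate posterior mean (mu, sigma):", accepted.mean(axis=0).round(3))
```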

8.
The exposure-response relationship for airborne hexavalent chromium exposure and lung cancer mortality is well described by a linear relative rate model. However, categorical analyses have been interpreted to suggest the presence of a threshold. This study investigates nonlinear features of the exposure response in a cohort of 2,357 chemical workers with 122 lung cancer deaths. In Poisson regression, a simple model representing a two-step carcinogenesis process was evaluated. In a one-stage context, fractional polynomials were investigated. Cumulative exposure dose metrics were examined corresponding to cumulative exposure thresholds, exposure intensity (concentration) thresholds, dose-rate effects, and declining burden of accumulated effect on future risk. A simple two-stage model of carcinogenesis provided no improvement in fit. The best-fitting one-stage models used simple cumulative exposure with no threshold for exposure intensity and had sufficient power to rule out thresholds as large as 30 μg/m³ CrO3 (16 μg/m³ as Cr⁶⁺) (one-sided 95% confidence limit, likelihood ratio test). Slightly better-fitting models were observed with cumulative exposure thresholds of 0.03 and 0.5 mg-yr/m³ (as CrO3), with and without an exposure-race interaction term, respectively. With the best model, cumulative exposure thresholds as large as 0.4 mg-yr/m³ CrO3 were excluded (two-sided upper 95% confidence limit, likelihood ratio test). A small departure from dose-rate linearity was observed, corresponding to (intensity)^0.8, but it was not statistically significant. Models in which risk-inducing damage burdens declined over time, based on half-lives ranging from 0.1 to 40 years, fit less well than assuming a constant burden. A half-life of 8 years or less was excluded (one-sided 95% confidence limit). Examination of nonlinear features of the hexavalent chromium–lung cancer exposure response in a population used in a recent risk assessment supports using the traditional (lagged) cumulative exposure paradigm: no intensity (concentration) threshold, linearity in intensity, and constant increment in risk following exposure.

9.
Survival models are developed to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple-dose data set to predict the probability of death through specifying functions of dose response and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) is an exponential dose–response model with a Weibull TTD distribution. Alternative models assessed use different underlying dose–response functions and use the assumption that, in a multiple-dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this article. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit data sets. More accurate survival models depend upon future development of dose–response data sets specifically designed to assess potential multiple-dose effects on response and time-to-response. The process used in this article to develop the best-fitting survival model for exposure of rabbits to multiple aerosol doses of B. anthracis spores should have broad applicability to other host–pathogen systems and dosing schedules because the empirical modeling approach is based upon pathogen-specific empirically-derived parameters.
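
A minimal sketch of combining an exponential dose-response for eventual mortality with a Weibull time-to-death distribution, roughly in the spirit of the baseline model named above. Treating the two factors as independent, and the parameter values used, are assumptions made here for illustration; they are not the fitted rabbit values.

```python
import numpy as np

def p_death_by_time(dose, t_days, k=1e-5, shape=2.0, scale=4.0):
    """Probability of death by time t_days after an aerosol dose:
    exponential dose-response (eventual mortality) times the Weibull CDF
    of time-to-death. All parameter values are illustrative placeholders."""
    p_response = 1.0 - np.exp(-k * dose)                  # eventual mortality
    f_ttd = 1.0 - np.exp(-((t_days / scale) ** shape))    # Weibull CDF of TTD
    return p_response * f_ttd

print(round(p_death_by_time(dose=1e6, t_days=3.0), 4))
print(round(p_death_by_time(dose=1e6, t_days=10.0), 4))
```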

10.
We study a two-product inventory model that allows substitution. Both products can be used to supply demand over a selling season of N periods, with a one-time replenishment opportunity at the beginning of the season. A substitution could be offered even when the demanded product is available. The substitution rule is flexible in the sense that the seller can choose whether or not to offer substitution and at what price or discount level, and the customer may or may not accept the offer, with the acceptance probability being a decreasing function of the substitution price. The decisions are the replenishment quantities at the beginning of the season, and the dynamic substitution-pricing policy in each period of the season. Using a stochastic dynamic programming approach, we present a complete solution to the problem. Furthermore, we show that the objective function is concave and submodular in the inventory levels—structural properties that facilitate the solution procedure and help identify threshold policies for the optimal substitution/pricing decisions. Finally, with a state transformation, we also show that the objective function is ‐concave, which allows us to derive similar structural properties of the optimal policy for multiple-season problems.

11.
In order to develop a dose-response model for SARS coronavirus (SARS-CoV), the pooled data sets for infection of transgenic mice susceptible to SARS-CoV and infection of mice with murine hepatitis virus strain 1, which may be a clinically relevant model of SARS, were fit to beta-Poisson and exponential models with the maximum likelihood method. The exponential model (k = 4.1 × 10²) could describe the dose-response relationship of the pooled data sets. The beta-Poisson model did not provide a statistically significant improvement in fit. With the exponential model, the infectivity of SARS-CoV was calculated and compared with those of other coronaviruses. The doses of SARS-CoV corresponding to 10% and 50% responses (illness) were estimated at 43 and 280 PFU, respectively. Its estimated infectivity was comparable to that of HCoV-229E, known as an agent of the human common cold, and also similar to those of some animal coronaviruses belonging to the same genetic group. Moreover, the exponential model was applied to the analysis of the epidemiological data from the SARS outbreak that occurred at an apartment complex in Hong Kong in 2003. The estimated dose of SARS-CoV for apartment residents during the outbreak, which was back-calculated from the reported number of cases, ranged from 16 to 160 PFU/person, depending on the floor. The exponential model developed here is the sole dose-response model for SARS-CoV at present and would enable us to understand the possibility of a reemergence of SARS.
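
A quick check of the numbers quoted above, assuming the exponential model is written with k as a scale parameter, P(d) = 1 − exp(−d/k). Under that reading, k = 4.1 × 10² reproduces the quoted 10% and 50% illness doses.

```python
import numpy as np

k = 4.1e2  # reported exponential-model parameter for SARS-CoV

def p_illness(dose_pfu):
    """Exponential dose-response with k as a scale parameter (assumed form)."""
    return 1.0 - np.exp(-dose_pfu / k)

# Doses giving 10% and 50% illness; close to the 43 and 280 PFU in the abstract.
id10 = -k * np.log(1 - 0.10)
id50 = -k * np.log(1 - 0.50)
print(f"ID10 ≈ {id10:.0f} PFU, ID50 ≈ {id50:.0f} PFU")
```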

12.
Q fever is a zoonotic disease caused by the intracellular gram-negative bacterium Coxiella burnetii (C. burnetii), which multiplies only within phagolysosomal vacuoles. Q fever may manifest as acute or chronic disease. The acute form is generally not fatal and manifests as a self-limiting febrile illness. Chronic Q fever is usually characterized by endocarditis. Many hosts, including humans and several animal models, have been studied for Q fever infection through various exposure routes. The studies considered different endpoints, including death for animal models and clinical signs for human infection. In this article, animal experimental data available in the open literature were fit to suitable dose-response models using maximum likelihood estimation. Results for severe combined immunodeficient mice inoculated intraperitoneally (i.p.) with C. burnetii were best described by the beta-Poisson dose-response model. Similar inoculation (i.p.) trial outcomes for C57BL/6J mice were best fit by an exponential model, whereas tests run on C57BL/10ScN mice were optimally represented by a beta-Poisson dose-response model.

13.
Charles N. Haas. Risk Analysis, 2011, 31(10): 1576-1596
Human brucellosis is one of the most common zoonotic diseases worldwide. Disease transmission often occurs through the handling of domestic livestock, as well as ingestion of unpasteurized milk and cheese, but the organism can have enhanced infectivity if aerosolized. Because no human vaccine is available, rising concerns about the threat of brucellosis to human health, and its inclusion on the Centers for Disease Control and Prevention's Category B Bioterrorism/Select Agent List, make a better understanding of the dose-response relationship of this microbe necessary. Through an extensive search of the peer-reviewed literature, candidate dose-response data were appraised against quality standards. The statistical programming language R was used to compute maximum likelihood estimates for two models fit to the dose-response data: the exponential and the approximate beta-Poisson (widely used for quantitative risk assessment). Dose-response models were generated for prevalent species of Brucella: Br. suis, Br. melitensis, and Br. abortus. Dose-response models were created for aerosolized Br. suis exposure of guinea pigs from pooled studies. A parallel model for guinea pigs inoculated through both aerosol and subcutaneous routes with Br. melitensis showed that the median infectious dose for Br. suis corresponded to a dose of 30 colony-forming units (CFU), much less than the N50 dose of about 94 CFU for Br. melitensis organisms. When Br. melitensis was tested subcutaneously in mice, the N50 dose was higher, 1,840 CFU. A dose-response model was also constructed from pooled data for mice, rhesus macaques, and humans inoculated through three routes (subcutaneous/aerosol/intradermal) with Br. melitensis.
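
The approximate beta-Poisson model named above is often written in terms of the median infectious dose N50, P(d) = 1 − (1 + d(2^(1/α) − 1)/N50)^(−α). The sketch below uses the N50 values quoted in the abstract; the α value is an invented placeholder, since the fitted slope parameters are not given here.

```python
import numpy as np

def beta_poisson_n50(dose, alpha, n50):
    """Approximate beta-Poisson dose-response parameterized by the median
    infectious dose N50. The alpha values below are illustrative only."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# N50 values quoted in the abstract; alpha = 0.3 is assumed for illustration.
print(beta_poisson_n50(dose=94.0, alpha=0.3, n50=94.0))     # = 0.5 by construction
print(beta_poisson_n50(dose=94.0, alpha=0.3, n50=1840.0))   # mice, subcutaneous
```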

14.
This paper discusses a consistent bootstrap implementation of the likelihood ratio (LR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates of the underlying vector autoregressive (VAR) model that obtain under the reduced rank null hypothesis. A full asymptotic theory is provided that shows that, unlike the bootstrap procedure in Swensen (2006) where a combination of unrestricted and restricted estimates from the VAR model is used, the resulting bootstrap data are I(1) and satisfy the null co-integration rank, regardless of the true rank. This ensures that the bootstrap LR test is asymptotically correctly sized and that the probability that the bootstrap sequential procedure selects a rank smaller than the true rank converges to zero. Monte Carlo evidence suggests that our bootstrap procedures work very well in practice.

15.
Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed.
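
As a concrete illustration of the PGF construction described above, the single-hit risk is 1 − G(1 − r), where G is the probability generating function of the dose distribution and r is the per-organism infection probability. The sketch below compares a Poisson dose with a negative binomial dose of the same mean (the negative binomial being one clustered special case of the stuttering Poisson family); the mean dose, r, and the dispersion parameter are illustrative values.

```python
import numpy as np

def single_hit_risk_poisson(mean_dose, r):
    """Risk = 1 - G(1 - r) with G the PGF of a Poisson dose distribution."""
    return 1.0 - np.exp(-mean_dose * r)

def single_hit_risk_negbin(mean_dose, r, k):
    """Same construction for a negative binomial dose distribution with
    dispersion parameter k (smaller k means stronger clustering)."""
    return 1.0 - (1.0 + mean_dose * r / k) ** (-k)

mu, r = 10.0, 0.05  # illustrative mean dose and per-organism infection probability
for k in (0.5, 2.0, 10.0):
    print(k, round(single_hit_risk_poisson(mu, r), 4),
          round(single_hit_risk_negbin(mu, r, k), 4))
```

At the same mean dose, the clustered (negative binomial) risk is always below the Poisson risk, which is the formal result stated in the abstract.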

16.
Yacov Y. Haimes. Risk Analysis, 2012, 32(11): 1834-1845
Natural and human-induced disasters affect organizations in myriad ways because of the inherent interconnectedness and interdependencies among human, cyber, and physical infrastructures, but more importantly, because organizations depend on the effectiveness of people and on the leadership they provide to the organizations they serve and represent. These human–organizational–cyber–physical infrastructure entities are termed systems of systems. Given the multiple perspectives that characterize them, they cannot be modeled effectively with a single model. The focus of this article is: (i) the centrality of the states of a system in modeling; (ii) the efficacious role of shared states in modeling systems of systems, in identification, and in the meta-modeling of systems of systems; and (iii) the contributions of the above to strategic preparedness, response to, and recovery from catastrophic risk to such systems. Strategic preparedness connotes a decision-making process and its associated actions. These must be implemented in advance of a natural or human-induced disaster, aimed at reducing consequences (e.g., recovery time, community suffering, and cost), and/or controlling their likelihood to a level considered acceptable (through the decision makers' implicit and explicit acceptance of various risks and tradeoffs). The inoperability input-output model (IIM), which is grounded in Leontief's input/output model, has enabled the modeling of interdependent subsystems. Two separate modeling structures are introduced: phantom system models (PSM), where shared states constitute the essence of modeling coupled systems; and the IIM, where interdependencies among sectors of the economy are manifested by the Leontief matrix of technological coefficients. This article demonstrates the potential contributions of these two models to each other, and thus to more informative modeling of systems-of-systems schema. The contributions of shared states to this modeling and to systems identification are presented with case studies.
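
For the IIM specifically, the core relationship is q = A*q + c*, where q is the vector of sector inoperabilities, A* the interdependency matrix, and c* the initial perturbation, so the equilibrium inoperability is q = (I − A*)⁻¹c*. The three-sector matrix and perturbation below are invented solely to show the computation.

```python
import numpy as np

# Inoperability input-output model (IIM): q = A* q + c*, so q = (I - A*)^-1 c*.
# A* captures interdependencies among sectors; c* is the direct perturbation.
# The 3-sector matrix and perturbation are illustrative only.
A_star = np.array([
    [0.0, 0.2, 0.1],
    [0.3, 0.0, 0.2],
    [0.1, 0.4, 0.0],
])
c_star = np.array([0.10, 0.0, 0.0])  # 10% direct inoperability in sector 1

q = np.linalg.solve(np.eye(3) - A_star, c_star)
print("equilibrium inoperability by sector:", q.round(3))
```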

17.
Threshold models have a wide variety of applications in economics. Direct applications include models of separating and multiple equilibria. Other applications include empirical sample splitting when the sample split is based on a continuously-distributed variable such as firm size. In addition, threshold models may be used as a parsimonious strategy for nonparametric function estimation. For example, the threshold autoregressive model (TAR) is popular in the nonlinear time series literature. Threshold models also emerge as special cases of more complex statistical frameworks, such as mixture models, switching models, Markov switching models, and smooth transition threshold models. It may be important to understand the statistical properties of threshold models as a preliminary step in the development of statistical tools to handle these more complicated structures. Despite the large number of potential applications, the statistical theory of threshold estimation is undeveloped. It is known that threshold estimates are super-consistent, but a distribution theory useful for testing and inference has yet to be provided. This paper develops a statistical theory for threshold estimation in the regression context. We allow for either cross-section or time series observations. Least squares estimation of the regression parameters is considered. An asymptotic distribution theory for the regression estimates (the threshold and the regression slopes) is developed. It is found that the distribution of the threshold estimate is nonstandard. A method to construct asymptotic confidence intervals is developed by inverting the likelihood ratio statistic. It is shown that this yields asymptotically conservative confidence regions. Monte Carlo simulations are presented to assess the accuracy of the asymptotic approximations. The empirical relevance of the theory is illustrated through an application to the multiple equilibria growth model of Durlauf and Johnson (1995).
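
A small simulation of the least-squares threshold estimation discussed above: the threshold is estimated by concentrating out the slopes and minimizing the sum of squared residuals over a grid of candidate thresholds. The data-generating process and grid choices are arbitrary illustrations, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated threshold regression: the slope on x changes when q crosses 1.0.
n = 500
q = rng.uniform(0, 2, n)                     # threshold variable (e.g., firm size)
x = rng.normal(size=n)
y = 1.0 + (0.5 + 1.5 * (q > 1.0)) * x + rng.normal(scale=0.5, size=n)

def ssr_at(gamma):
    """Concentrated least squares: regress y on [1, x, x*1{q > gamma}]."""
    X = np.column_stack([np.ones(n), x, x * (q > gamma)])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return resid @ resid

grid = np.quantile(q, np.linspace(0.1, 0.9, 200))
gamma_hat = grid[np.argmin([ssr_at(g) for g in grid])]
print("estimated threshold:", round(float(gamma_hat), 3))
```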

18.
Comparison of Six Dose-Response Models for Use with Food-Borne Pathogens
Food-related illness in the United States is estimated to affect over six million people per year and cost the economy several billion dollars. These illnesses and costs could be reduced if minimum infectious doses were established and used as the basis of regulations and monitoring. However, standard methodologies for dose-response assessment are not yet formulated for microbial risk assessment. The objective of this study was to compare dose-response models for food-borne pathogens and determine which models were most appropriate for a range of pathogens. The statistical models proposed in the literature and chosen for comparison purposes were the log-normal, log-logistic, exponential, beta-Poisson, and Weibull-gamma models. These were fit to four data sets, also taken from the published literature (Shigella flexneri, Shigella dysenteriae, Campylobacter jejuni, and Salmonella typhosa), using the method of maximum likelihood. The Weibull-gamma, the only model with three parameters, was also the only model capable of fitting all the data sets examined using maximum likelihood estimation. Infectious doses were also calculated using each model. Within any given data set, the infectious dose estimated to affect one percent of the population ranged across models from one order of magnitude to as much as nine orders of magnitude, illustrating the differences in extrapolation of the dose-response models. More data are needed to compare models and examine extrapolation from high to low doses for food-borne pathogens.
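
A compact sketch of the maximum likelihood fitting used throughout these comparisons: quantal dose-response data are modeled with a binomial likelihood, and the exponential and approximate beta-Poisson models are fit by minimizing the negative log-likelihood. The data set and starting values are invented for illustration and are not one of the published pathogen data sets.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

# Illustrative quantal data: dose, number exposed, number responding.
dose = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
n    = np.array([10, 10, 10, 10, 10])
pos  = np.array([0, 1, 3, 7, 10])

def negll_exponential(log_k):
    """Negative binomial log-likelihood for the exponential model P = 1 - exp(-k d)."""
    p = 1.0 - np.exp(-np.exp(log_k) * dose)
    return -binom.logpmf(pos, n, np.clip(p, 1e-12, 1 - 1e-12)).sum()

def negll_beta_poisson(params):
    """Negative log-likelihood for the approximate beta-Poisson (alpha, N50) model."""
    alpha, n50 = np.exp(params)
    p = 1.0 - (1.0 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)
    return -binom.logpmf(pos, n, np.clip(p, 1e-12, 1 - 1e-12)).sum()

fit_exp = minimize(negll_exponential, x0=[np.log(1e-5)], method="Nelder-Mead")
fit_bp  = minimize(negll_beta_poisson, x0=[0.0, np.log(1e4)], method="Nelder-Mead")
print("exponential k:", np.exp(fit_exp.x[0]), " 2*negLL:", 2 * fit_exp.fun)
print("beta-Poisson (alpha, N50):", np.exp(fit_bp.x), " 2*negLL:", 2 * fit_bp.fun)
```

Comparing twice the minimized negative log-likelihoods (or likelihood ratio tests for nested models) is the usual way such fits are ranked in these dose-response comparisons.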

19.
In nonlinear panel data models, the incidental parameter problem remains a challenge to econometricians. Available solutions are often based on ingenious, model-specific methods. In this paper, we propose a systematic approach to construct moment restrictions on common parameters that are free from the individual fixed effects. This is done by an orthogonal projection that differences out the unknown distribution function of individual effects. Our method applies generally in likelihood models with continuous dependent variables where a condition of non-surjectivity holds. The resulting method-of-moments estimators are root-N consistent (for fixed T) and asymptotically normal, under regularity conditions that we spell out. Several examples and a small-scale simulation exercise complete the paper.

20.
This paper analyzes the properties of standard estimators, tests, and confidence sets (CS's) for parameters that are unidentified or weakly identified in some parts of the parameter space. The paper also introduces methods to make the tests and CS's robust to such identification problems. The results apply to a class of extremum estimators and corresponding tests and CS's that are based on criterion functions that satisfy certain asymptotic stochastic quadratic expansions and that depend on the parameter that determines the strength of identification. This covers a class of models estimated using maximum likelihood (ML), least squares (LS), quantile, generalized method of moments, generalized empirical likelihood, minimum distance, and semi-parametric estimators. The consistency/lack-of-consistency and asymptotic distributions of the estimators are established under a full range of drifting sequences of true distributions. The asymptotic sizes (in a uniform sense) of standard and identification-robust tests and CS's are established. The results are applied to the ARMA(1, 1) time series model estimated by ML and to the nonlinear regression model estimated by LS. In companion papers, the results are applied to a number of other models.
