Similar Documents
16 similar documents found (search time: 15 ms)
1.
A novel method was used to incorporate in vivo host-pathogen dynamics into a new robust outbreak model for legionellosis. Dose-response and time-dose-response (TDR) models were generated for Legionella longbeachae exposure to mice via the intratracheal route using a maximum likelihood estimation approach. The best-fit TDR model was then incorporated into two L. pneumophila outbreak models: an outbreak that occurred at a spa in Japan, and one that occurred in a Melbourne aquarium. The best-fit TDR from the murine dosing study was the beta-Poisson with exponential-reciprocal dependency model, which had a minimized deviance of 32.9. This model was tested against other incubation distributions in the Japan outbreak, and performed consistently well, with reported deviances ranging from 32 to 35. In the case of the Melbourne outbreak, the exponential model with exponential dependency was tested against non-time-dependent distributions to explore the performance of the time-dependent model with the lowest number of parameters. This model reported low minimized deviances around 8 for the Weibull, gamma, and lognormal exposure distribution cases. This work shows that the incorporation of a time factor into outbreak distributions provides models with acceptable fits that can provide insight into the in vivo dynamics of the host-pathogen system.
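A minimal sketch of the maximum likelihood fitting step described above, using invented quantal data (doses, group sizes, responder counts, and starting values are illustrative, not the murine L. longbeachae data), the exponential model, and the approximate beta-Poisson form widely used in quantitative microbial risk assessment:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quantal dose-response data: dose (CFU), subjects, responders.
doses = np.array([1e2, 1e3, 1e4, 1e5])
subjects = np.array([10, 10, 10, 10])
responders = np.array([1, 3, 7, 10])

def p_exp(d, k):
    # Exponential dose-response model.
    return 1.0 - np.exp(-k * d)

def p_beta_poisson(d, alpha, n50):
    # Approximate beta-Poisson model parameterized by alpha and N50.
    return 1.0 - (1.0 + d * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

def neg_log_lik(p):
    # Binomial negative log-likelihood for quantal data.
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -np.sum(responders * np.log(p)
                   + (subjects - responders) * np.log(1.0 - p))

# Fit by maximum likelihood; parameters are log-transformed to stay positive.
fit_exp = minimize(lambda t: neg_log_lik(p_exp(doses, np.exp(t[0]))),
                   x0=[np.log(1e-4)], method="Nelder-Mead")
fit_bp = minimize(lambda t: neg_log_lik(
                      p_beta_poisson(doses, np.exp(t[0]), np.exp(t[1]))),
                  x0=[0.0, np.log(1e3)], method="Nelder-Mead")

# Minimized deviance relative to the saturated model (lower is better).
ll_sat = -neg_log_lik(responders / subjects)
dev_exp = 2.0 * (fit_exp.fun + ll_sat)
dev_bp = 2.0 * (fit_bp.fun + ll_sat)
```

The minimized deviance here is the fit statistic the abstract reports; the time-dose-response extension would add an incubation-time distribution on top of these dose terms.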

2.
In the Self Sufficiency Project (SSP) welfare demonstration, members of a randomly assigned treatment group could receive a subsidy for full-time work. The subsidy was available for 3 years, but only to people who began working full time within 12 months of random assignment. A simple optimizing model suggests that the eligibility rules created an "establishment" incentive to find a job and leave welfare within a year of random assignment, and an "entitlement" incentive to choose work over welfare once eligibility was established. Building on this insight, we develop an econometric model of welfare participation that allows us to separate the two effects and estimate the impact of the earnings subsidy on welfare entry and exit rates among those who achieved eligibility. The combination of the two incentives explains the time profile of the experimental impacts, which peaked 15 months after random assignment and faded relatively quickly. Our findings suggest that about half of the peak impact of SSP was attributable to the establishment incentive. Despite the extra work effort generated by SSP, the program had no lasting impact on wages and little or no long-run effect on welfare participation.

3.
We provide an exact myopic analysis for an N-stage serial inventory system with batch ordering, linear ordering costs, and nonstationary demands under a finite planning horizon. We characterize the optimality conditions of the myopic nested batching newsvendor (NBN) policy and the myopic independent batching newsvendor (IBN) policy, which is a single-stage approximation. We show that echelon reorder levels under the NBN policy are upper bounds of their counterparts under both the optimal policy and the IBN policy. In particular, we find that the IBN policy has bounded deviations from the optimal policy. We further extend our results to systems with the martingale model of forecast evolution (MMFE) and advance demand information. Moreover, we provide a recursive computing procedure and optimality conditions for both heuristics, which dramatically reduce computational complexity. We also find that the NBN problem under the MMFE faced by one stage has one more dimension for the forecast demand than the one faced by its downstream stage, and that the NBN policy is optimal for systems with advance demand information and stationary problem data. Numerical studies demonstrate that the IBN policy on average outperforms the NBN policy over all tested instances when their optimality conditions are violated.

4.
Environmental Protection Agency (EPA) ambient air quality guidelines are meant to limit long-term exposures to toxins to safe levels. Unfortunately, there is little guidance on what constitutes a safe level for a one-time (or very infrequent) short exposure. In the case of mercury, a review of the derivation of the EPA ambient air quality standard shows that it implicitly assumes a tissue burden model. The time dependence of the tissue burden is commonly described in terms of a half-life, a modeling assumption that presumes the decline in the tissue burden after a single exposure can be approximately described as an exponential decay. In this article, we use a simple exponential tissue burden model to derive a time-dependent no-observed-adverse-effect level (NOAEL) for mercury concentrations in air. The model predicts that the tissue body burden will asymptotically approach the EPA air quality level for long exposure times, and reach workplace standard levels for exposures of a few hours. The model was used along with data on mercury levels from experimental work done by the Maine Department of Environmental Protection to evaluate the risks from a broken compact fluorescent lamp in a residential setting. Mercury levels approached the NOAEL only when the debris was left in an almost sealed room. Normal common-sense cleaning measures (removal of debris to an outside area and ventilation of the room for several minutes) reduced exposures to less than 1% of the NOAEL.
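The exponential tissue burden model is simple enough to state directly. The functions below are a sketch; the half-life and intake values are caller-supplied assumptions, not mercury-specific constants from the article:

```python
import math

def burden_single(c0, t_days, half_life_days):
    """Residual tissue burden after a single exposure of size c0,
    assuming first-order (exponential) elimination with a fixed half-life."""
    return c0 * math.exp(-math.log(2) * t_days / half_life_days)

def burden_continuous(intake_per_day, t_days, half_life_days):
    """Burden under a constant intake rate R with elimination rate
    k = ln 2 / half-life: B(t) = (R/k) * (1 - exp(-k t)),
    which rises toward the steady state R/k for long exposure times."""
    k = math.log(2) / half_life_days
    return (intake_per_day / k) * (1.0 - math.exp(-k * t_days))
```

The asymptote R/k of the continuous-exposure case mirrors the abstract's observation that the body burden approaches the EPA air-quality level for long exposures, while a brief exposure leaves a burden that decays away exponentially.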

5.
Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm-to-table continuum. Any predicted change in contamination that results from new policy that regulates production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food-safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and some of the modeling steps that occur between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.

6.
Charles N. Haas. Risk Analysis, 2011, 31(10): 1576-1596
Human brucellosis is one of the most common zoonotic diseases worldwide. Disease transmission often occurs through the handling of domestic livestock, as well as ingestion of unpasteurized milk and cheese, but infectivity can be enhanced when the organism is aerosolized. Because no human vaccine is available, rising concerns about the threat of brucellosis to human health, and its inclusion on the Centers for Disease Control and Prevention's Category B Bioterrorism/Select Agent List, make a better understanding of this microbe's dose-response relationship necessary. Through an extensive peer-reviewed literature search, candidate dose-response data were appraised against quality standards. The statistical programming language R was used to compute maximum likelihood estimates for fitting two models, the exponential and the approximate beta-Poisson (widely used for quantitative risk assessment), to the dose-response data. Dose-response models were generated for the prevalent species of Brucella: Br. suis, Br. melitensis, and Br. abortus. Models were created for aerosolized Br. suis exposure in guinea pigs from pooled studies. A parallel model for guinea pigs inoculated through both aerosol and subcutaneous routes showed that the median infectious dose of Br. suis, about 30 colony-forming units (CFU), was much lower than the N50 of about 94 CFU for Br. melitensis organisms. When Br. melitensis was tested subcutaneously in mice, the N50 was higher, 1,840 CFU. A dose-response model was also constructed from pooled data for mice, rhesus macaques, and humans inoculated with Br. melitensis through three routes (subcutaneous/aerosol/intradermal).
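For the exponential dose-response model, the median infectious dose follows directly from the rate parameter: N50 = ln 2 / k. A short sketch; k is chosen here only to reproduce an N50 near the 94 CFU figure quoted above, not taken from the fitted Brucella models:

```python
import math

def exp_response(dose, k):
    # Exponential dose-response: probability of infection at a given dose.
    return 1.0 - math.exp(-k * dose)

def exp_n50(k):
    # Median infectious dose implied by the exponential model.
    return math.log(2) / k

# Illustrative rate parameter giving an N50 of ~94 CFU.
k = math.log(2) / 94.0
```

At the N50 dose the model returns exactly a 50% infection probability, which is how the 30, 94, and 1,840 CFU figures above should be read.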

7.
The effect of bioaerosol size was incorporated into predictive dose-response models for the effects of inhaled aerosols of Francisella tularensis (the causative agent of tularemia) on rhesus monkeys and guinea pigs, with bioaerosol diameters ranging between 1.0 and 24 μm. Aerosol-size-dependent models were formulated as modifications of the exponential and beta-Poisson dose-response models, and model parameters were estimated using maximum likelihood methods and multiple data sets of quantal dose-response data for which the aerosol sizes of the inhaled doses were known. The F. tularensis dose-response data were best fit by an exponential model in which a power function of particle diameter substitutes for the rate parameter k scaling the applied dose. This aerosol-size-dependence equation represented the observed dose-response results better than estimates derived from applying the model developed by the International Commission on Radiological Protection (ICRP, 1994), which relies on differential regional lung deposition for human particle exposure.
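A sketch of the aerosol-size-dependent exponential model described above, with the rate parameter replaced by a power function of particle diameter. The coefficients a and b are invented for illustration, not the fitted F. tularensis values; b < 0 encodes the assumption that larger particles are less infective per organism:

```python
import math

def p_infection(dose, diameter_um, a=5e-3, b=-1.0):
    """Aerosol-size-dependent exponential dose-response: the rate
    parameter is a power function of particle diameter, k = a * D**b.
    a and b are illustrative placeholders."""
    k = a * diameter_um ** b
    return 1.0 - math.exp(-k * dose)
```

With these placeholder coefficients, a 500-organism dose of 1.0 μm particles yields a far higher predicted response than the same dose at 24 μm, the qualitative pattern an aerosol-size-dependent model is meant to capture.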

8.
Leptospirosis is a preeminent zoonotic disease concentrated in tropical areas, and prevalent in both industrialized and rural settings. Dose-response models were generated from 22 data sets reported in 10 different studies. All of the selected studies used rodent subjects, primarily hamsters, with mortality as the predominant endpoint and the challenge strain administered intraperitoneally. Dose-response models based on a single evaluation postinfection displayed median lethal dose (LD50) estimates that ranged between 1 and 10^7 leptospirae, depending upon the strain's virulence and the period elapsed since the initial inoculation. Twelve of the 22 data sets measured the number of affected subjects daily over an extended period, so dose-response models with time-dependent parameters were estimated. Pooling between data sets produced seven common dose-response models and one time-dependent model. These pooled common models spanned data sets with different test subject hosts, as well as disparate leptospiral strains tested on identical hosts. Comparative modeling with parallel tests was used to assess the effect of changing a single variable (either strain or test host) and to quantify the difference by calculating a dose multiplication factor. Statistical pooling implies that the mechanistic processes of leptospirosis can be represented by the same dose-response model for different experimental infection tests even though they may involve different host species, routes, and leptospiral strains, although the cause of this pathophysiological phenomenon has not yet been identified.

9.
Wavelet analysis is a mathematical method that has developed into a unified field of science over the last decade or so. As a spatially adaptive analytic tool, wavelets are useful for capturing serial correlation where the spectrum has peaks or kinks, as can arise from persistent dependence, seasonality, and other kinds of periodicity. This paper proposes a new class of generally applicable wavelet-based tests for serial correlation of unknown form in the estimated residuals of a panel regression model, where error components can be one-way or two-way, individual and time effects can be fixed or random, and regressors may contain lagged dependent variables or deterministic/stochastic trending variables. Our tests are applicable to unbalanced heterogeneous panel data. They have a convenient N(0,1) limiting distribution under the null. No formulation of an alternative model is required, and our tests are consistent against serial correlation of unknown form even in the presence of substantial inhomogeneity in serial correlation across individuals. This is in contrast to existing serial correlation tests for panel models, which ignore inhomogeneity in serial correlation across individuals by assuming a common alternative, and thus have no power against alternatives where the average of serial correlations among individuals is close to zero. We propose and justify a data-driven method to choose the smoothing parameter (the finest scale in wavelet spectral estimation), making the tests completely operational in practice. The data-driven finest scale automatically converges to zero under the null hypothesis of no serial correlation and diverges to infinity as the sample size increases under the alternative, ensuring the consistency of our tests. Simulation shows that our tests perform well in finite samples relative to some existing tests.

10.
A model for binary panel data is introduced which allows for state dependence and unobserved heterogeneity beyond the effect of available covariates. The model is of quadratic exponential type and its structure closely resembles that of the dynamic logit model. However, it has the advantage of being easily estimable via conditional likelihood with as few as two observations (in addition to an initial observation), even in the presence of time dummies among the regressors.

11.
This paper introduces a nonparametric Granger-causality test for covariance stationary linear processes under, possibly, the presence of long-range dependence. We show that the test is consistent and has power against contiguous alternatives converging to the parametric rate T^(-1/2). Since the test is based on estimates of the parameters of the representation of a VAR model as a, possibly, two-sided infinite distributed lag model, we first show that a modification of Hannan's (1963, 1967) estimator is root-T consistent and asymptotically normal for the coefficients of such a representation. When the data are long-range dependent, this method of estimation becomes more attractive than least squares, since the latter can be neither root-T consistent nor asymptotically normal, as is the case with short-range dependent data.

12.
Risk Analysis, 2018, 38(8): 1685-1700
Military health risk assessors, medical planners, operational planners, and defense system developers require knowledge of human responses to doses of biothreat agents to support force health protection and chemical, biological, radiological, nuclear (CBRN) defense missions. This article reviews extensive data from 118 human volunteers administered aerosols of the bacterial agent Francisella tularensis, strain Schu S4, which causes tularemia. The data set includes the incidence of early-phase febrile illness following administration of well-characterized inhaled doses of F. tularensis. Supplemental data on human body temperature profiles over time, available from de-identified case reports, are also presented. A unified, logically consistent model of early-phase febrile illness is described as a lognormal dose-response function for febrile illness linked with a stochastic time profile of fever. Three parameters are estimated from the human data to describe the time profile: the incubation period or onset time for fever; the rise time of fever; and the near-maximum body temperature. Inhaled-dose dependence and variability are characterized for each of the three parameters. These parameters enable a stochastic model for the response of an exposed population through incorporation of individual-by-individual variability by drawing random samples from the statistical distributions of these three parameters for each individual. This model provides risk assessors and medical decision-makers reliable representations of the predicted health impacts of early-phase febrile illness for as long as one week after aerosol exposure of human populations to F. tularensis.
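The linked dose-response and fever-profile model can be sketched as follows. The lognormal dose-response parameters and the onset, rise-time, and peak-temperature distributions below are placeholders, not the values fitted to the Schu S4 volunteer data:

```python
import math
import numpy as np

def p_febrile(dose, ln_id50=math.log(1000.0), sigma=1.0):
    """Lognormal (probit) dose-response for early-phase febrile illness.
    ln_id50 and sigma are illustrative placeholders."""
    z = (math.log(dose) - ln_id50) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def temperature(t_h, onset_h, rise_h, t_max_c, baseline_c=37.0):
    """One individual's fever trajectory: baseline until the incubation
    period ends, a linear rise over rise_h hours, then a plateau at the
    near-maximum body temperature."""
    if t_h <= onset_h:
        return baseline_c
    frac = min((t_h - onset_h) / rise_h, 1.0)
    return baseline_c + frac * (t_max_c - baseline_c)

# Population variability: draw the three profile parameters per individual
# from (illustrative) lognormal distributions, as the article's stochastic
# model does with its fitted distributions.
rng = np.random.default_rng(0)
onset = rng.lognormal(np.log(96.0), 0.3, size=1000)      # hours to fever onset
rise = rng.lognormal(np.log(12.0), 0.4, size=1000)       # hours of rise
t_max = 37.0 + rng.lognormal(np.log(2.5), 0.2, size=1000)  # peak temp, deg C
```

Sampling the three parameters per individual and evaluating `temperature` over time yields the individual-by-individual variability the abstract describes for an exposed population.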

13.
14.
Mitchell J. Small. Risk Analysis, 2011, 31(10): 1561-1575
A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation, and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted.
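A minimal sketch of the model-combination step, approximating BMA posterior weights with BIC (the article uses full MCMC-based posterior inference; BIC is a common stand-in for marginal likelihoods). The log-likelihoods and per-model BMDs below are invented for illustration:

```python
import numpy as np

def bma_weights(log_liks, n_params, n_obs):
    """BIC-approximated posterior model weights for Bayesian model
    averaging: w_m proportional to exp(-BIC_m / 2)."""
    bic = (-2.0 * np.asarray(log_liks, float)
           + np.asarray(n_params, float) * np.log(n_obs))
    w = np.exp(-0.5 * (bic - bic.min()))  # subtract min for stability
    return w / w.sum()

def bma_estimate(estimates, weights):
    # Model-averaged point estimate (e.g., a BMD) as a weighted mean.
    return float(np.dot(weights, estimates))

# Illustrative: a 2-parameter logistic fit vs. a 1-parameter quantal-linear
# fit, with made-up log-likelihoods and per-model BMD estimates.
w = bma_weights([-52.3, -53.1], [2, 1], n_obs=50)
bmd_avg = bma_estimate([1.8, 2.4], w)
```

In the full methodology the averaging is applied to whole posterior BMD distributions, so the BMDL comes from a quantile of the model-averaged distribution rather than from a single weighted mean.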

15.
Risk factor selection is very important in the insurance industry; it supports precise rate making and the study of the features of high-quality insureds. Zero-inflated data are common in insurance, such as claim frequency data, and zero-inflation makes the selection of risk factors quite difficult. In this article, we propose a new risk factor selection approach, EM adaptive LASSO, for the zero-inflated Poisson regression model, which combines the EM algorithm with the adaptive LASSO penalty. Under some regularity conditions, we show that, with probability approaching 1, the important factors are selected and the redundant factors are excluded. We investigate the finite-sample performance of the proposed method through a simulation study and an analysis of car insurance data from the SAS Enterprise Miner database.
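The EM machinery underlying the proposed approach can be sketched for the simplest case: a zero-inflated Poisson with no covariates and no penalty (the article's method adds the regression structure and the adaptive LASSO penalty on top of these E- and M-steps). The simulated claim counts are illustrative:

```python
import numpy as np

def zip_em(y, n_iter=300):
    """EM for a zero-inflated Poisson mixture: a structural zero with
    probability pi, otherwise Poisson(lam). Covariate-free sketch."""
    y = np.asarray(y, float)
    pi, lam = 0.5, max(float(y.mean()), 1e-8)
    for _ in range(n_iter):
        # E-step: posterior probability that each observed zero is structural.
        z = np.where(y == 0.0, pi / (pi + (1.0 - pi) * np.exp(-lam)), 0.0)
        # M-step: closed-form updates of the mixing weight and Poisson mean.
        pi = float(z.mean())
        lam = float(y.sum() / (len(y) - z.sum()))
    return pi, lam

# Simulated claim-frequency data: 30% structural zeros, Poisson(2) otherwise.
rng = np.random.default_rng(1)
y_sim = np.where(rng.random(5000) < 0.3, 0, rng.poisson(2.0, size=5000))
pi_hat, lam_hat = zip_em(y_sim)
```

The zero-inflation difficulty the abstract mentions is visible here: a zero claim count can come from either mixture component, so the E-step must apportion the zeros before the Poisson mean (and, in the full model, the regression coefficients) can be updated.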

16.
Listeria monocytogenes is a leading cause of hospitalization, fetal loss, and death due to foodborne illnesses in the United States. A quantitative assessment of the relative risk of listeriosis associated with the consumption of 23 selected categories of ready-to-eat foods, published by the U.S. Department of Health and Human Services and the U.S. Department of Agriculture in 2003, has been instrumental in identifying the food products and practices that pose the greatest listeriosis risk and has guided the evaluation of potential intervention strategies. Dose-response models, which quantify the relationship between an exposure dose and the probability of adverse health outcomes, were essential components of the risk assessment. However, because of data gaps and limitations in the available data and modeling approaches, considerable uncertainty existed. Since publication of the risk assessment, new data have become available for modeling L. monocytogenes dose-response. At the same time, recent advances in the understanding of L. monocytogenes pathophysiology and strain diversity have warranted a critical reevaluation of the published dose-response models. To discuss strategies for modeling L. monocytogenes dose-response, the Interagency Risk Assessment Consortium (IRAC) and the Joint Institute for Food Safety and Applied Nutrition (JIFSAN) held a scientific workshop in 2011 (details available at http://foodrisk.org/irac/events/). The main findings of the workshop and the most current and relevant data identified during the workshop are summarized and presented in the context of L. monocytogenes dose-response. This article also discusses new insights on dose-response modeling for L. monocytogenes and research opportunities to meet future needs.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号