Similar Articles (18 found)
1.
Quantitative microbial risk assessment was used to assess the risk of norovirus gastroenteritis associated with consumption of raw vegetables irrigated with highly treated municipal wastewater, using Melbourne, Australia as an example. In the absence of local norovirus concentrations, three methods were developed: (1) published concentrations of norovirus in raw sewage, (2) an epidemiological method using Melbourne prevalence of norovirus, and (3) an adjustment of method 1 to account for prevalence of norovirus. The methods produced highly variable results, with estimates of norovirus concentrations ranging from 10⁴ to 10⁷ per milliliter in raw sewage and from 1 × 10⁻³ to 3 per milliliter in treated effluent (95th percentiles). Annual disease burden was very low using method 1, from 4 to 5 log10 disability-adjusted life years (DALYs) below the 10⁻⁶ threshold (0.005–0.1 illnesses per year). Results of method 2 were higher, with some scenarios exceeding the threshold by up to 2 log10 DALYs (up to 95,000 illnesses per year). Method 3, thought to be most representative of Melbourne conditions, predicted annual disease burdens >2 log10 DALYs lower than the threshold (~4 additional cases per year). Sensitivity analyses demonstrated that the input parameters used to estimate norovirus concentration accounted for much of the model output variability. This model, while constrained by a lack of knowledge of sewage concentrations, used the best available information and sound logic. Results suggest that current wastewater reuse practices in Melbourne are unlikely to cause norovirus risks in excess of the annual DALY health target.
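The comparison against the 10⁻⁶ DALY target described above can be sketched as a small Monte Carlo calculation. Every parameter below (effluent concentration distribution, retained water volume, dose-response parameters, DALYs per case) is an illustrative assumption, not one of the paper's fitted inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# All inputs below are hypothetical placeholders, not the study's values.
conc = rng.lognormal(mean=np.log(1e-2), sigma=1.0, size=n)  # norovirus/mL in treated effluent
volume_ml = 1.0           # irrigation water retained per serving of raw vegetables
servings_per_year = 70    # consumption frequency
dose = conc * volume_ml

# Approximate beta-Poisson dose-response (illustrative parameters)
alpha, beta = 0.04, 0.055
p_inf = 1.0 - (1.0 + dose / beta) ** -alpha
p_ill = 0.7 * p_inf       # assumed probability of illness given infection

annual_risk = 1.0 - (1.0 - p_ill) ** servings_per_year
dalys = annual_risk * 9e-4  # assumed DALYs per case of gastroenteritis

print(f"median annual burden: {np.median(dalys):.2e} DALYs (target: 1e-6)")
```

The same structure, with the study's actual input distributions, would reproduce the per-method burden estimates quoted above.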

2.
Enteric viruses are often detected in water used for crop irrigation. One concern is foodborne viral disease via the consumption of fresh produce irrigated with virus-contaminated water. Although the food industry routinely uses chemical sanitizers to disinfect postharvest fresh produce, it remains unknown how sanitizer and fresh produce properties affect the risk of viral illness through fresh produce consumption. A quantitative microbial risk assessment model was developed to estimate (i) the health risks associated with consumption of rotavirus (RV)-contaminated fresh produce with different surface properties (endive and kale) and (ii) how risks changed when using peracetic acid (PAA) or a surfactant-based sanitizer. The modeling results showed that the annual disease burden depended on the combination of sanitizer and vegetable type when vegetables were irrigated with RV-contaminated water. Global sensitivity analyses revealed that the most influential factors in the disease burden were RV concentration in irrigation water and postharvest disinfection efficacy. A postharvest disinfection efficacy higher than 99% (2-log10) was needed to decrease the disease burden below the World Health Organization (WHO) threshold, even in scenarios with low RV concentrations in irrigation water (i.e., river water). All scenarios tested here with at least 99.9% (3-log10) disinfection efficacy had a disease burden lower than the WHO threshold, except for the endive treated with PAA: its disinfection efficacy was only about 80%, leading to a disease burden 100 times higher than the WHO threshold. These findings should be considered and incorporated into future models for estimating foodborne viral illness risks.

3.
T. Walton 《Risk analysis》2012,32(7):1122-1138
Through the use of case-control analyses and quantitative microbial risk assessment (QMRA), relative risks of transmission of cryptosporidiosis (recreational water exposure vs. drinking water consumption) were evaluated for a Canadian community with higher than national rates of cryptosporidiosis. A QMRA was developed to assess the risk of Cryptosporidium infection through the consumption of municipally treated drinking water. Simulations were based on site-specific surface water contamination levels and drinking water treatment log10 reduction capacity for Cryptosporidium. Results suggested that the risk of Cryptosporidium infection via drinking water in the study community, assuming routine operation of the water treatment plant, was negligible (6 infections per 10¹³ persons per day; 5th percentile: 2 infections per 10¹⁵ persons per day; 95th percentile: 3 infections per 10¹² persons per day). The risk is essentially nonexistent during optimized, routine treatment operations. The study community achieves between 7 and 9 log10 Cryptosporidium oocyst reduction through routine water treatment processes. Although these results do not preclude the need for constant vigilance by both water treatment and public health professionals in this community, they suggest that the cause of the higher rates of cryptosporidiosis is more likely recreational water contact, or perhaps direct animal contact. QMRA can be successfully applied at the community level to identify data gaps, rank relative public health risks, and forecast future risk scenarios. It is most useful when performed in a collaborative way with local stakeholders, from beginning to end of the risk analysis paradigm.
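The routine-operation estimate rests on a simple chain: source oocyst concentration, log10 treatment reduction, volume consumed, and a single-hit dose-response. A minimal sketch, using an illustrative exponential parameter r rather than the study's site-specific inputs:

```python
import math

def daily_infection_risk(source_oocysts_per_litre, log10_reduction,
                         litres_per_day=1.0, r=0.004):
    """Exponential single-hit dose-response for Cryptosporidium.
    r = 0.004 is an illustrative parameter, not the study's value."""
    dose = source_oocysts_per_litre * 10.0 ** (-log10_reduction) * litres_per_day
    return 1.0 - math.exp(-r * dose)

# In the low-dose (linear) regime, each extra log10 of treatment cuts
# the daily risk by one order of magnitude.
for lr in (7, 8, 9):
    print(lr, daily_infection_risk(1.0, lr))
```

With 7–9 log10 reduction, risks of this form land in the negligible range reported above.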

4.
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway, using Canadian input parameter values where available. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year; potential reasons for this overestimation are discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail, together with food hygiene practices in private kitchens (cross-contamination due to not washing cutting boards, utensils, and hands after handling raw meat) and inadequate cooking, contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research.

5.
Dose-response models are essential to quantitative microbial risk assessment (QMRA), providing a link between levels of human exposure to pathogens and the probability of negative health outcomes. In drinking water studies, the class of semi-mechanistic models known as single-hit models, such as the exponential and the exact beta-Poisson, has seen widespread use. In this work, an attempt is made to carefully develop the general mathematical single-hit framework while explicitly accounting for variation in (1) host susceptibility and (2) pathogen infectivity. This allows a precise interpretation of the so-called single-hit probability and precise identification of a set of statistical independence assumptions that are sufficient to arrive at single-hit models. Further analysis of the model framework is facilitated by formulating the single-hit models compactly using probability generating and moment generating functions. Among the more practically relevant conclusions drawn are: (1) for any dose distribution, variation in host susceptibility always reduces the single-hit risk compared to a constant host susceptibility (assuming equal mean susceptibilities), (2) the model-consistent representation of complete host immunity is formally demonstrated to be a simple scaling of the response, (3) the model-consistent expression for the total risk from repeated exposures deviates (gives lower risk) from the conventional expression used in applications, and (4) a model-consistent expression for the mean per-exposure dose that produces the correct total risk from repeated exposures is developed.
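Conclusion (1) can be checked numerically with the two named model families. The snippet below uses the common approximation to the exact beta-Poisson; it is a sketch of the two dose-response curves, not the article's generating-function framework:

```python
import math

def p_exponential(dose, r):
    """Single-hit exponential model: every host shares the same
    per-organism infection probability r."""
    return 1.0 - math.exp(-r * dose)

def p_beta_poisson(dose, alpha, beta):
    """Common approximation to the exact beta-Poisson model, in which
    host susceptibility r ~ Beta(alpha, beta); reasonable when
    beta >> 1 and beta >> alpha."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Conclusion (1), numerically: holding the mean susceptibility equal
# (E[r] = alpha / (alpha + beta)), the variable-susceptibility model
# gives a lower single-hit risk than the constant-susceptibility one.
alpha, beta, dose = 0.2, 10.0, 5.0
print(p_beta_poisson(dose, alpha, beta)
      < p_exponential(dose, alpha / (alpha + beta)))  # True
```

The inequality follows from Jensen's inequality applied to the convex function r ↦ exp(−r·d).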

6.
To prevent and control foodborne diseases, there is a fundamental need to identify the foods that are most likely to cause illness. The goal of this study was to rank 25 commonly consumed food products associated with Salmonella enterica contamination in the Central Region of Mexico. A multicriteria decision analysis (MCDA) framework was developed to obtain an S. enterica risk score for each food product based on four criteria: probability of exposure to S. enterica through domestic food consumption (Se); S. enterica growth potential during home storage (Sg); per capita consumption (Pcc); and food attribution of S. enterica outbreaks (So). Risk scores were calculated by the equation Se*W1 + Sg*W2 + Pcc*W3 + So*W4, where each criterion was assigned a normalized value (1–5) and the relative weights (W) were defined by the opinion of 22 experts. Se had the largest effect on the risk score, being the criterion with the highest weight (35%; 95% CI 20%–60%), followed by So (24%; 5%–50%), Sg (23%; 10%–40%), and Pcc (18%; 10%–35%). The results identified chicken (4.4 ± 0.6), pork (4.2 ± 0.6), and beef (4.2 ± 0.5) as the highest-risk foods, followed by seed fruits (3.6 ± 0.5), tropical fruits (3.4 ± 0.4), and dried fruits and nuts (3.4 ± 0.5), while the food products with the lowest risk were yogurt (2.1 ± 0.3), chorizo (2.1 ± 0.4), and cream (2.0 ± 0.3). Expert-based weighting and equal weighting showed good correlation (R = 0.96) and no significant differences in ranking order within the top 20 tier. This study can help risk managers select interventions and develop targeted surveillance programs against S. enterica in high-risk food products.
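The scoring equation and the mean expert weights reported above are enough to reproduce the ranking arithmetic. The criterion values passed in below are hypothetical, not the study's data:

```python
# Weighted-sum MCDA score from the abstract: Se*W1 + Sg*W2 + Pcc*W3 + So*W4,
# with each criterion normalized to 1-5 and the mean expert-elicited weights.
WEIGHTS = {"Se": 0.35, "So": 0.24, "Sg": 0.23, "Pcc": 0.18}

def risk_score(criteria):
    """criteria: dict mapping criterion name -> normalized value in [1, 5]."""
    if any(not 1 <= v <= 5 for v in criteria.values()):
        raise ValueError("criteria must be normalized to the 1-5 scale")
    return sum(WEIGHTS[k] * criteria[k] for k in WEIGHTS)

# Hypothetical criterion values for a chicken-like product (not the paper's data)
print(round(risk_score({"Se": 5, "Sg": 4, "Pcc": 4, "So": 5}), 2))  # → 4.59
```

Swapping in equal weights (0.25 each) and re-ranking is the comparison behind the reported R = 0.96 correlation.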

7.
《Risk analysis》2018,38(4):638-652
The objective of this research was to analyze the impact of different cooking procedures (i.e., gas hob and traditional static oven) and levels of cooking (i.e., rare, medium, and well-done) on the inactivation of Listeria monocytogenes and Salmonella in pork loin chops. Moreover, the consumer's exposure to both microorganisms after simulated home storage of meat leftovers was assessed. The results showed that well-done cooking in a static oven was the only treatment able to inactivate the tested pathogens. The other cooking combinations always reached product temperatures of ≥73.6 °C, decreasing both pathogens by between 6 and 7 log10 cfu/g. However, according to simulation results, the few cells surviving cooking can multiply during storage by consumers by up to 1 log10 cfu/g, with probabilities of 0.059 (gas hob) and 0.035 (static oven) for L. monocytogenes, and 0.049 (gas hob) and 0.031 (static oven) for Salmonella. The key factors affecting consumer exposure in relation to storage practices were the probability of pathogen occurrence after cooking, degree of doneness, time of storage, and time of storage at room temperature. The results of this study can be combined with prevalence data and dose-response models in risk assessment models, and included in guidelines for consumers on practices for managing the cooking of pork meat at home.

8.
The inclusion of deep tissue lymph nodes (DTLNs) or nonvisceral lymph nodes contaminated with Salmonella in wholesale fresh ground pork (WFGP) production may pose risks to public health. To assess the relative contribution of DTLNs to human salmonellosis occurrence associated with ground pork consumption and to investigate potential critical control points in the slaughter-to-table continuum for the control of human salmonellosis in the United States, a quantitative microbial risk assessment (QMRA) model was established. The model predicted an average of 45 cases of salmonellosis (95% CI = [19, 71]) per 100,000 Americans annually due to WFGP consumption. Sensitivity analysis of all stochastic input variables showed that cooking temperature was the most influential parameter for reducing salmonellosis cases associated with WFGP meals, followed by storage temperature and Salmonella concentration on the contaminated carcass surface before fabrication. The input variables were grouped to represent three main factors along the slaughter-to-table chain influencing Salmonella doses ingested via WFGP meals: DTLN-related factors, factors at processing other than DTLNs, and consumer-related factors. The evaluation of the impact of each group of factors by second-order Monte Carlo simulation showed that DTLN-related factors had the lowest impact on the risk estimate among the three groups of factors. These findings indicate that interventions to reduce Salmonella contamination in DTLNs or to remove DTLNs from WFGP products may be less critical for reducing human infections attributable to ground pork than improving consumers' cooking habits or interventions of carcass decontamination at processing.
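The second-order Monte Carlo simulation mentioned above follows a standard nested design: an outer loop samples the *uncertain* parameters, and an inner loop samples *variability* across servings. A generic sketch with hypothetical inputs, not the model's Salmonella parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Outer loop: uncertainty about parameters; inner loop: serving-to-serving
# variability. All distributions and constants here are illustrative.
n_outer, n_inner = 200, 5_000
mean_risks = []
for _ in range(n_outer):
    log_mean_dose = rng.normal(-2.0, 0.5)                   # uncertain mean log10 dose
    doses = 10 ** rng.normal(log_mean_dose, 1.0, n_inner)   # variable doses per serving
    p_ill = 1.0 - np.exp(-0.002 * doses)                    # illustrative dose-response
    mean_risks.append(p_ill.mean())

lo, hi = np.percentile(mean_risks, [2.5, 97.5])
print(f"mean per-serving risk, 95% uncertainty interval: [{lo:.2e}, {hi:.2e}]")
```

Grouping inputs (e.g., DTLN-related vs. consumer-related) and varying one group at a time in the outer loop is what lets the analysis attribute uncertainty in the risk estimate to each factor group.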

9.
Middle East respiratory syndrome (MERS), an emerging viral infection with a global case fatality rate of 35.5%, caused major outbreaks in 2012 and 2015, though new cases are continuously reported around the world. Transmission is believed to occur mainly in healthcare settings through aerosolized particles. This study uses quantitative microbial risk assessment to develop a generalizable model that can assist with interpreting reported outbreak data or predicting the risk of infection with or without the recommended control strategies. The exposure scenario includes a single index patient emitting virus-containing aerosols into the air by coughing, leading to short- and long-range airborne exposures for other patients in the same room, nurses, healthcare workers, and family visitors. Aerosol transport modeling was coupled with Monte Carlo simulation to evaluate the risk of MERS illness for the exposed population. Results from a typical scenario show the daily mean risk of infection to be highest for nurses and healthcare workers (8.49 × 10⁻⁴ and 7.91 × 10⁻⁴, respectively) and lowest for family visitors and patients staying in the same room (3.12 × 10⁻⁴ and 1.29 × 10⁻⁴, respectively). Sensitivity analysis indicates that more than 90% of the uncertainty in the risk characterization is due to the viral concentration in saliva. Assessment of interventions showed that respiratory masks had the greatest effect in reducing risks for all the groups evaluated (>90% risk reduction), while increasing the air exchange rate was effective only for the other patients in the same room (up to 58% risk reduction).

10.
We analyze the risk of contracting illness in the United States due to the consumption of hamburgers contaminated with enterohemorrhagic Escherichia coli (EHEC) of serogroup O157, produced from manufacturing beef imported from Australia. We used a novel approach for estimating risk: the prevalence and concentration estimates of E. coli O157 in lots of beef that were withdrawn from the export chain following detection of the pathogen. For the purpose of the present assessment it was assumed that no product is removed from the supply chain following testing. This, together with a number of additional conservative assumptions, leads to an overestimation of E. coli O157-associated illness attributable to the consumption of ground beef patties manufactured only from Australian beef. We predict 49.6 illnesses (95%: 0.0–148.6) from the 2.46 billion hamburgers made from the 155,000 t of Australian manufacturing beef exported to the United States in 2012. All of these illnesses were due to undercooking in the home; fewer than one illness is predicted from consumption of hamburgers cooked to a temperature of 68 °C in quick-service restaurants.

11.
This article reports a quantitative risk assessment of human listeriosis linked to the consumption of soft cheeses made from raw milk. The risk assessment was based on data purposefully acquired over the period 2000–2001 for two French cheeses, Camembert of Normandy and Brie of Meaux. The estimated Listeria monocytogenes concentration in raw milk was on average 0.8 and 0.3 cells/L in the Normandy and Brie regions, respectively. A Monte Carlo simulation was used to account for the time-temperature history of the milk and cheeses from farm to table. It was assumed that cell progeny did not spread within the solid cheese matrix (as they would be free to do in liquid broth). The interaction between pH and temperature was accounted for in the growth model. The simulated proportion of servings with no L. monocytogenes cells was 88% for Brie and 82% for Camembert. The 99th percentile of L. monocytogenes cell numbers in a 27 g serving of cheese at the time of consumption was 131 for Brie and 77 for Camembert, corresponding to about five and three cells of L. monocytogenes per gram, respectively. The expected number of severe listeriosis cases would be ≤10⁻³ per year for the 17 million servings of Brie of Meaux and ≤2.5 × 10⁻³ per year for the 480 million servings of Camembert of Normandy.

12.
《Risk analysis》2018,38(8):1738-1757
We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0–5 log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400–248,000) cases/year. The risk reduction (5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 of seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33–448) or 1.4 (95% CI <1–4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10–146) or <1 (95% CI <1–1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted; e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22–298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.

13.
The risk analysis of the health impact of foods is increasingly focused on integrated risk-benefit assessment, which will also need to be communicated to consumers. It therefore becomes important to understand how consumers respond to integrated risk-benefit information. Quality-adjusted life years (QALYs) are one measure that can be used to assess the balance between risks and benefits associated with a particular food. The effectiveness of QALYs for communicating both positive and negative health effects of food consumption to consumers was examined using a 3 × 2 experiment varying information about health changes, in terms of QALYs, associated with the consumption of fish (n = 325). The effect of this information on consumer perceptions of the usefulness of QALYs for describing health effects, and on risk and benefit perceptions, attitudes, and intentions to consume fish, was examined. Results demonstrated that consumers perceived QALYs as useful for communicating health effects associated with food consumption. QALYs communicated as a net effect were preferred for food products associated with negative net health effects, while separate communication of risks and benefits may be preferred for food products associated with positive or zero net health effects. Information about health changes in terms of QALYs facilitated informed decision making by consumers, as indicated by the impact on risk and benefit perceptions as intended by the information. The impact of this information on actual food consumption choices merits further investigation.

14.
Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters—including social groups, relationships, and communication variables, also from survey data—are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.

15.
Wastewater from facilities processing livestock that may harbor transmissible spongiform encephalopathy (TSE) infectivity is permitted, under license, for application to land where susceptible livestock may have access. Several previous risk assessments have investigated the risk of bovine spongiform encephalopathy (BSE) associated with wastewater effluents; however, the risk of exposure to classical scrapie and atypical scrapie has not been assessed. With the prevalence of certain TSEs (BSE in cattle and classical scrapie in sheep) steadily in decline, and with considerable changes in the structure of carcass-processing industries in Great Britain, a reappraisal of the TSE risk posed by wastewater is required. Our results indicate that the predicted number of new TSE infections arising from the spreading of wastewater on pasture over one year would be low, with a mean of one infection every 1,000 years for BSE in cattle (769, 555,556), one every 30 years for classical scrapie (16, 2,500), and one every 33 years for atypical scrapie (16, 3,333), assuming that the values and assumptions used in this risk assessment remain constant. For BSE in cattle the main contributors are abattoir and rendering effluent, contributing 35% and 22% of the total number of new BSE infections, respectively. For TSEs in sheep, effluent from small incinerators and rendering plants are the major contributors (on average 32% and 31% of the total number of new classical scrapie and atypical scrapie infections, respectively). This reflects the volume of carcass material and Category 1 material flowing through such facilities.

16.
Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. It is thought that a substantial portion of human T. gondii infections is acquired through the consumption of meats. The dose-response relationship for human exposure to T. gondii-infected meat is unknown because no human data are available. The goal of this study was to develop and validate dose-response models based on animal studies, and to compute scaling factors so that animal-derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected, and appropriate studies were selected based on animal species, stage and genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal-shaped mathematical models, and model parameters were estimated using maximum likelihood estimation. Data from a mouse study were selected to develop the dose-response relationship. The exponential and beta-Poisson models, which predicted similar responses, were selected as reasonable dose-response models based on their simplicity, biological plausibility, and goodness of fit. A confidence interval for the parameter was determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching the predicted infection cases with the epidemiological data. Mouse-derived models were validated against data for the dose-infection relationship in rats. A human dose-response model was developed as P(d) = 1 − exp(−0.0015 × 0.005 × d) or P(d) = 1 − (1 + d × 0.003/582.414)^(−1.479). Both models predict the human response after consuming T. gondii-infected meats, and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
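The two fitted human models quoted above can be evaluated directly; the snippet below checks that they give similar predictions at low, food-relevant doses, as the abstract states:

```python
import math

def p_exponential(d):
    """Scaled human exponential model from the abstract:
    P(d) = 1 - exp(-0.0015 * 0.005 * d)."""
    return 1.0 - math.exp(-0.0015 * 0.005 * d)

def p_beta_poisson(d):
    """Scaled human beta-Poisson model from the abstract:
    P(d) = 1 - (1 + d * 0.003 / 582.414) ** -1.479."""
    return 1.0 - (1.0 + d * 0.003 / 582.414) ** -1.479

for dose in (10, 100, 1000):
    print(dose, p_exponential(dose), p_beta_poisson(dose))
```

At these doses both curves are in their near-linear regions, so the predicted infection probabilities track each other closely.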
