Similar Documents
20 similar documents found (search time: 343 ms)
1.
《Risk analysis》2018,38(4):724-754
A bounding risk assessment is presented that evaluates possible human health risk from a hypothetical scenario involving a 10,000-gallon release of flowback water from horizontal fracturing of Marcellus Shale. The water is assumed to be spilled on the ground and to infiltrate into groundwater that serves as a source of drinking water, which an adult and a child located downgradient then drink. Key uncertainties in estimating risk are given explicit quantitative treatment using Monte Carlo analysis. Chemicals that contribute significantly to estimated health risks are identified, as are key uncertainties and variables to which risk estimates are sensitive. The results show that hypothetical exposure via drinking water impacted by chemicals in Marcellus Shale flowback water, assumed to be spilled onto the ground surface, results in predicted bounds between 10^-10 and 10^-6 (for both adult and child receptors) for excess lifetime cancer risk. Cumulative hazard indices (HI_cumulative) resulting from these hypothetical exposures have predicted bounds (5th to 95th percentile) between 0.02 and 35 for assumed adult receptors and 0.1 and 146 for assumed child receptors. Predicted health risks are dominated by noncancer endpoints related to ingestion of barium and lithium in impacted groundwater; hazard indices above unity are largely related to exposure to lithium. Salinity taste thresholds are likely to be exceeded before drinking water exposures result in adverse health effects. The findings provide focus for policy discussions concerning flowback water risk management. They also indicate ways to improve the ability to estimate health risks from drinking water impacted by a flowback water spill (i.e., by reducing uncertainty).
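The bounding approach above pairs standard risk equations with Monte Carlo sampling of uncertain inputs and reports percentile bounds. A minimal sketch of that pattern, with purely illustrative distributions and toxicity values (not the study's actual inputs):

```python
import random

random.seed(1)

# Hypothetical input distributions -- NOT the study's parameters.
def sample_concentration():   # mg/L of a contaminant in impacted groundwater
    return random.lognormvariate(mu=0.0, sigma=1.0)

def sample_intake_rate():     # L/day of drinking water ingested
    return random.uniform(1.0, 2.5)

SLOPE_FACTOR = 1e-3   # (mg/kg-day)^-1, hypothetical cancer slope factor
RFD = 0.2             # mg/kg-day, hypothetical noncancer reference dose
BODY_WEIGHT = 70.0    # kg, adult receptor

def one_iteration():
    dose = sample_concentration() * sample_intake_rate() / BODY_WEIGHT  # mg/kg-day
    cancer_risk = dose * SLOPE_FACTOR     # linear low-dose cancer model
    hazard_quotient = dose / RFD          # noncancer endpoint
    return cancer_risk, hazard_quotient

results = [one_iteration() for _ in range(10_000)]
risks = sorted(r for r, _ in results)
his = sorted(h for _, h in results)

# Report 5th-95th percentile bounds, mirroring the abstract's presentation.
print("cancer risk bounds:", risks[500], risks[9500])
print("hazard index bounds:", his[500], his[9500])
```

In a real assessment each hard-coded constant above would itself be a distribution, and chemicals would be summed into a cumulative hazard index.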

2.
T. Walton 《Risk analysis》2012,32(7):1122-1138
Through the use of case-control analyses and quantitative microbial risk assessment (QMRA), relative risks of transmission of cryptosporidiosis (recreational water exposure vs. drinking water consumption) were evaluated for a Canadian community with higher-than-national rates of cryptosporidiosis. A QMRA was developed to assess the risk of Cryptosporidium infection through the consumption of municipally treated drinking water. Simulations were based on site-specific surface water contamination levels and the drinking water treatment plant's log10 reduction capacity for Cryptosporidium. Results suggested that the risk of Cryptosporidium infection via drinking water in the study community, assuming routine operation of the water treatment plant, was negligible (6 infections per 10^13 persons per day; 5th percentile: 2 infections per 10^15 persons per day; 95th percentile: 3 infections per 10^12 persons per day). The risk is essentially nonexistent during optimized, routine treatment operations. The study community achieves between 7 and 9 log10 Cryptosporidium oocyst reduction through routine water treatment processes. Although these results do not preclude the need for constant vigilance by both water treatment and public health professionals in this community, they suggest that the higher rates of cryptosporidiosis are more likely due to recreational water contact, or perhaps direct animal contact. QMRA can be successfully applied at the community level to identify data gaps, rank relative public health risks, and forecast future risk scenarios. It is most useful when performed collaboratively with local stakeholders, from beginning to end of the risk analysis paradigm.

3.
The aim of this study was to develop a modified quantitative microbial risk assessment (QMRA) framework that could be applied as a decision support tool to choose between alternative drinking water interventions in the developing context. The impact of different household water treatment (HWT) interventions on the overall incidence of diarrheal disease and disability adjusted life years (DALYs) was estimated, without relying on source water pathogen concentration as the starting point for the analysis. A framework was developed and a software tool constructed and then implemented for an illustrative case study for Nepal based on published scientific data. Coagulation combined with free chlorine disinfection provided the greatest estimated health gains in the short term; however, when long-term compliance was incorporated into the calculations, the preferred intervention was porous ceramic filtration. The model demonstrates how the QMRA framework can be used to integrate evidence from different studies to inform management decisions, and in particular to prioritize the next best intervention with respect to estimated reduction in diarrheal incidence. This study only considered HWT interventions; it is recognized that a systematic consideration of sanitation, recreation, and drinking water pathways is important for effective management of waterborne transmission of pathogens, and the approach could be expanded to consider the broader water-related context.
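The short-term vs. long-term ranking reversal described above falls out naturally once efficacy is weighted by compliance. A toy sketch with hypothetical intervention parameters (not the study's Nepal data):

```python
# Rank household water treatment (HWT) options by DALYs averted,
# weighting short-term efficacy by long-term compliance.
# All numbers below are illustrative assumptions.
interventions = {
    # name: (fraction of diarrheal DALYs averted at full use, long-term compliance)
    "coagulation + chlorine": (0.60, 0.30),
    "ceramic filter":         (0.45, 0.70),
    "boiling":                (0.50, 0.40),
}

BASELINE_DALYS = 1000.0  # DALYs/year in the target population (hypothetical)

def averted(efficacy, compliance):
    # Expected DALYs averted once imperfect sustained use is accounted for.
    return BASELINE_DALYS * efficacy * compliance

ranked = sorted(interventions.items(),
                key=lambda kv: averted(*kv[1]), reverse=True)
for name, (eff, comp) in ranked:
    print(f"{name}: {averted(eff, comp):.0f} DALYs averted/year")
```

With these assumed numbers the most efficacious option in the short term loses its lead once compliance is factored in, which is the qualitative pattern the abstract reports.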

4.
Many farmers in water-scarce regions of developing countries use wastewater to irrigate vegetables and other agricultural crops, a practice that may expand with climate change. There are a number of health risks associated with wastewater irrigation of human food crops, particularly with the surface irrigation techniques common in the developing world. The World Health Organization (WHO) recommends using quantitative microbial risk assessment (QMRA) to determine whether an irrigation scheme meets health standards. However, only a few vegetables have been studied for wastewater risk, and little is known about the disease burden of wastewater-irrigated vegetable consumption in China. To bridge this knowledge gap, an experiment was conducted to determine the volume of water left on Asian vegetables and lettuce after irrigation. One hundred samples each of Chinese chard (Brassica rapa var. chinensis), Chinese broccoli (Brassica oleracea var. alboglabra), Chinese flowering cabbage (Brassica rapa var. parachinensis), and lettuce (Lactuca sativa) were harvested after overhead sprinkler irrigation. Chinese broccoli and flowering cabbage were found to capture the most water and lettuce the least. QMRAs were then constructed to estimate the rotavirus disease burden from consumption of wastewater-irrigated Asian vegetables in Beijing. Results indicate that estimated risks from these reuse scenarios exceed WHO guideline thresholds for acceptable disease burden from wastewater use, signifying that reduction of pathogen concentrations or stricter risk management is necessary for safe reuse. Considering the widespread practice of wastewater irrigation for food production, particularly in developing countries, incorporating water retention factors in QMRAs can reduce uncertainty regarding health risks for consumers worldwide.

5.
Methyl tert-butyl ether (MTBE) was added to gasoline in New Hampshire (NH) between 1995 and 2006 to comply with the oxygenate requirements of the 1990 Amendments to the Clean Air Act. Leaking tanks and spills released MTBE into groundwater, and as a result, MTBE has been detected in drinking water in NH. We conducted a comparative cancer risk assessment and a margin-of-safety (MOS) analysis for several constituents, including MTBE, detected in NH drinking water. Using standard risk assessment methods, we calculated cancer risks from exposure to 12 detected volatile organic compounds (VOCs), including MTBE, and to four naturally occurring compounds (i.e., arsenic, radium-226, radium-228, and radon-222) detected in NH public water supplies. We evaluated exposures to a hypothetical resident ingesting the water, dermally contacting the water while showering, and inhaling compounds volatilizing from water in the home. We then compared risk estimates for MTBE to those of the other 15 compounds. From our analysis, we concluded that the high-end cancer risk from exposure to MTBE in drinking water is lower than the risks from all the other VOCs evaluated and several thousand times lower than the risks from exposure to naturally occurring constituents, including arsenic, radium, and radon. We also conducted an MOS analysis in which we compared toxicological points of departure to the NH maximum contaminant level (MCL) of 13 µg/L. All of the MOSs were greater than or equal to 160,000, indicating a large margin of safety and demonstrating the health-protectiveness of the NH MCL for MTBE.
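A margin of safety in this sense is simply the ratio of a toxicological point of departure to the regulatory level, once both are expressed in comparable units. A sketch with a hypothetical point of departure and intake assumptions (only the 13 µg/L MCL comes from the abstract):

```python
# Hypothetical margin-of-safety (MOS) calculation: convert an animal
# point of departure (mg/kg-day) into the equivalent drinking-water
# concentration, then divide by the regulatory level.
POD = 100.0            # mg/kg-day, hypothetical point of departure
BODY_WEIGHT = 70.0     # kg, assumed adult body weight
INTAKE = 2.0           # L/day, assumed drinking-water intake
MCL = 13e-3            # mg/L (the 13 ug/L NH MCL for MTBE)

# Concentration at which the assumed intake delivers the POD dose.
pod_as_concentration = POD * BODY_WEIGHT / INTAKE  # mg/L
mos = pod_as_concentration / MCL
print(f"MOS = {mos:,.0f}")
```

An MOS far above 1 means the regulatory level sits well below doses associated with observed effects; the study reports MOS values of 160,000 or more for MTBE.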

6.
Rural communities dependent on unregulated drinking water are potentially at increased health risk from exposure to contaminants. Perception of drinking water safety influences water consumption, exposure, and health risk. A community-based participatory approach and probabilistic Bayesian methods were applied to integrate risk perception into a holistic human health risk assessment. Tap water arsenic concentrations and risk perception data were collected from two Saskatchewan communities. Drinking water health standards were exceeded in 67% (51/76) of households in Rural Municipality #184 (RM184) and 56% (25/45) in Beardy's and Okemasis First Nation (BOFN). There was no association between the presence of a health exceedance and risk perception. Households in RM184 or with an annual income >$50,000 were most likely to have in-house water treatment. The probability of consuming tap water perceived as safe (92%) or not safe (0%) suggested that households in RM184 were unlikely to drink water perceived as not safe. The probability of drinking tap water perceived as safe (77%) or not safe (11%) suggested that households in BOFN contradicted their perception and consumed water perceived as unsafe. Integration of risk perception lowered the adult incremental lifetime cancer risk by 3% to 1.3 × 10^-5 (95% CI 8.4 × 10^-8 to 9.0 × 10^-5) for RM184 and by 8.9 × 10^-6 (95% CI 2.2 × 10^-7 to 5.9 × 10^-5) for BOFN. The probability of exposure to arsenic concentrations exceeding the 1:100,000 (negligible) cancer risk level was 23% for RM184 and 22% for BOFN.

7.
Ongoing publicity about methyl tertiary butyl ether (MTBE) suggests that this chemical is of greater concern than other contaminants commonly found in drinking water. The purpose of this article is to evaluate the available MTBE data in context with other volatile organic compounds (VOCs) that are detected in public drinking water sources in California. We find that of the 28 VOCs with a primary maximum contaminant level (MCL) in California, 21 were found in 50 or more drinking water sources from 1985 to 2002. Over the last 10 years, the most frequently detected VOCs were chloroform, tetrachloroethylene (PCE), and trichloroethylene (TCE), which were found in about 9–15% of all sampled drinking water sources. These same chemicals had the highest mean detected concentrations over the last 5 years, ranging from 13 to 15 µg/L. Many VOCs, including benzene and carbon tetrachloride, were also found to routinely exceed state and federal drinking water standards. By comparison, MTBE was found in approximately 1% of sampled drinking water sources for most years, and of those drinking water sources found to contain MTBE from 1998 to 2002, over 90% had detected concentrations below California's primary MCL of 13 µg/L. Relative to the other VOCs evaluated, MTBE has the lowest estimated California cancer potency value and was found to pose one of the lowest cancer risks from household exposures to contaminated drinking water. These findings suggest that MTBE poses an insignificant threat to public drinking water supplies and public health in California, particularly when compared to other common drinking water contaminants.

8.
Point source pollution is one of the main threats to regional environmental health. Based on a water quality model, a methodology to assess the regional risk of point source pollution is proposed. The assessment procedure includes five parts: (1) identifying risk source units and estimating source emissions using Monte Carlo algorithms; (2) observing hydrological and water quality data of the assessed area and evaluating the selected water quality model; (3) screening out the assessment endpoints and analyzing receptor vulnerability with the Choquet fuzzy integral algorithm; (4) using the water quality model introduced in the second step to predict pollutant concentrations for various source emission scenarios and analyzing the hazards of risk sources; and finally, (5) using the source hazard values and receptor vulnerability scores to estimate overall regional risk. The proposed method, based on the Water Quality Analysis Simulation Program (WASP), was applied in the region of the Taipu River, which is in the Taihu Basin, China. Results of the source hazard and receptor vulnerability analyses allowed us to describe aquatic ecological, human health, and socioeconomic risks individually, as well as integrated risks in the Taipu region, from a series of risk curves. Risk contributions of sources to receptors were ranked, and the spatial distribution of risk levels was presented. By changing the input conditions, we were able to estimate risks for a range of scenarios. Thus, the proposed procedure may also be used by decision makers for long-term dynamic risk prediction.

9.
Dose-response models are essential to quantitative microbial risk assessment (QMRA), providing a link between levels of human exposure to pathogens and the probability of negative health outcomes. In drinking water studies, the class of semi-mechanistic models known as single-hit models, such as the exponential and the exact beta-Poisson, has seen widespread use. In this work, an attempt is made to carefully develop the general mathematical single-hit framework while explicitly accounting for variation in (1) host susceptibility and (2) pathogen infectivity. This allows a precise interpretation of the so-called single-hit probability and precise identification of a set of statistical independence assumptions that are sufficient to arrive at single-hit models. Further analysis of the model framework is facilitated by formulating the single-hit models compactly using probability generating and moment generating functions. Among the more practically relevant conclusions drawn are: (1) for any dose distribution, variation in host susceptibility always reduces the single-hit risk compared to a constant host susceptibility (assuming equal mean susceptibilities), (2) the model-consistent representation of complete host immunity is formally demonstrated to be a simple scaling of the response, (3) the model-consistent expression for the total risk from repeated exposures deviates (gives lower risk) from the conventional expression used in applications, and (4) a model-consistent expression for the mean per-exposure dose that produces the correct total risk from repeated exposures is developed.
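Conclusion (1) above — that variation in host susceptibility reduces single-hit risk at equal mean susceptibility — follows from the concavity of the exponential model and can be checked numerically. The parameter values here are illustrative, not from the paper:

```python
import math

def exponential_model(mean_dose, r):
    """Single-hit risk with a Poisson-distributed dose and constant
    per-organism infection probability r: P = 1 - exp(-r * d)."""
    return 1.0 - math.exp(-r * mean_dose)

# A 50/50 mix of two susceptibility values with the same mean r = 0.01
# gives a lower risk than the constant-r model at the same mean dose,
# because 1 - exp(-r*d) is concave in r (Jensen's inequality).
d = 100.0
constant = exponential_model(d, 0.01)
mixed = 0.5 * exponential_model(d, 0.002) + 0.5 * exponential_model(d, 0.018)
print(constant, mixed)   # mixed risk is strictly lower
```

The same concavity argument works for any mixing distribution over r, which is what makes the paper's conclusion hold for arbitrary dose distributions.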

10.
Daily soil/dust ingestion rates typically used in exposure and risk assessments are based on tracer element studies, which have a number of limitations and do not separate contributions from soil and dust. This article presents an alternate approach of modeling children's soil and dust ingestion via hand and object mouthing, using EPA's SHEDS model. Results for children 3 to <6 years old show that the mean and 95th percentile values of total soil and dust ingestion are 68 and 224 mg/day, respectively; the means for soil ingestion, hand-to-mouth dust ingestion, and object-to-mouth dust ingestion are 41 mg/day, 20 mg/day, and 7 mg/day, respectively. In general, hand-to-mouth soil ingestion was the most important pathway, followed by hand-to-mouth dust ingestion, then object-to-mouth dust ingestion. The variability results are most sensitive to inputs on surface loadings, soil-skin adherence, hand mouthing frequency, and hand washing frequency. The predicted total soil and dust ingestion fits a lognormal distribution with geometric mean = 35.7 mg/day and geometric standard deviation = 3.3. There are two uncertainty distributions, one below the 20th percentile and the other above. Modeled uncertainties ranged within a factor of 3–30. Mean modeled estimates for soil and dust ingestion are consistent with past information but lower than the central values recommended in the 2008 EPA Child-Specific Exposure Factors Handbook. This new modeling approach, which predicts soil and dust ingestion by pathway, source type, population group, geographic location, and other factors, offers a better characterization of exposures relevant to health risk assessments than using a single value.
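The reported lognormal fit can be sanity-checked directly: given a geometric mean (GM) and geometric standard deviation (GSD), the arithmetic mean and 95th percentile follow in closed form. The values implied by the fit (roughly 73 and 254 mg/day) are broadly in line with, though not identical to, the model estimates of 68 and 224 mg/day quoted above, since the lognormal is only a summary fit:

```python
import math

GM, GSD = 35.7, 3.3              # reported lognormal fit (mg/day)
mu, sigma = math.log(GM), math.log(GSD)

# For a lognormal: mean = GM * exp(sigma^2 / 2), and the 95th
# percentile = GM * GSD**z95, where z95 ~ 1.645 is the standard
# normal 95th-percentile quantile.
mean_intake = GM * math.exp(sigma**2 / 2)
p95_intake = GM * GSD**1.645

print(round(mean_intake, 1), round(p95_intake, 1))
```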

11.
Evaluations of Listeria monocytogenes dose-response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well-established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal-Poisson dose-response model was chosen, and proved able to reconcile dose-response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta-Poisson dose-response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose-response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.

12.
Jocelyne Rocourt 《Risk analysis》2012,32(10):1798-1819
We used a quantitative microbiological risk assessment model to describe the risk of Campylobacter and Salmonella infection linked to chicken meals prepared in households in Dakar, Senegal. The model uses data collected specifically for this study, such as the prevalence and level of bacteria on the neck skin of chickens bought in Dakar markets, time-temperature profiles recorded from purchase to consumption, an observational survey of meal preparation in private kitchens, and detection and enumeration of pathogens on kitchenware and cooks' hands. Thorough heating kills all bacteria present on chicken during cooking, but cross-contamination of cooked chicken or ready-to-eat food prepared for the meal via kitchenware and cooks' hands leads to a high expected frequency of pathogen ingestion. Additionally, significant growth of Salmonella is predicted during food storage at ambient temperature before and after meal preparation. These high exposures lead to a high estimated risk of campylobacteriosis and/or salmonellosis in Dakar households. The public health consequences could be amplified by the high level of antimicrobial resistance of Salmonella and Campylobacter observed in this setting. A significant decrease in the number of ingested bacteria and in the risk could be achieved through a reduction of the prevalence of chicken contamination at slaughter, and by the use of simple hygienic measures in the kitchen. There is an urgent need to reinforce the hygiene education of food handlers in Senegal.

13.
Lack of data on the daily inhalation rate and activity of children has been an issue in health risk assessment of air pollutants. This study aimed to obtain the daily inhalation rate and the intensity and frequency of physical activity in relation to the environment in Japanese preschool children. Children aged four to six years (n = 138) in the suburbs of Tokyo participated in this study, which involved three days' continuous monitoring of physical activity using a tri-axial accelerometer and completion by parents of a time/location diary during daily life. The estimated three-day mean daily inhalation rate (body temperature, pressure, saturated with water vapor) was 9.9 ± 1.6 m3/day (0.52 ± 0.09 m3/kg/day). The current daily inhalation rate value of 0.580 m3/kg/day proposed for use in health risk assessment in Japan is confirmed to be valid for calculating the central value of the inhaled dose of air pollutants in five- to six-year-old children. However, the 95th percentile daily inhalation rate of 0.83 m3/kg/day, based on measurements for five-year-old children, is recommended for providing an upper-bound estimate of exposure that ensures the protection of all five- to six-year-old children from the health risks of air pollutants. Children spent the majority of their time in sedentary and light levels of physical activity (LPA) when indoors, while 85% of their time when outdoors was spent in LPA and moderate-to-vigorous physical activity. The results suggest the need to consider the variability of the minute respiratory ventilation rate according to the environment for more refined short-term health risk assessment.

14.
《Risk analysis》2018,38(5):1070-1084
Human exposure to bacteria resistant to antimicrobials, and the transfer of related genes, is a complex issue and occurs, among other pathways, via meat consumption. In a context of limited resources, the prioritization of risk management activities is essential. Since the antimicrobial resistance (AMR) situation differs substantially between countries, prioritization should be country specific. The objective of this study was to develop a systematic and transparent framework to rank combinations of bacterial species resistant to selected antimicrobial classes found in meat, based on the risk they represent for public health in Switzerland. A risk assessment model from slaughter to consumption was developed following the Codex Alimentarius guidelines for risk analysis of foodborne AMR. Using data from the Swiss AMR monitoring program, 208 combinations of animal species/bacteria/antimicrobial classes were identified as relevant hazards. Exposure assessment and hazard characterization scores were developed and combined using multicriteria decision analysis. The effect of changing the weights of scores was explored with sensitivity analysis. Attributing equal weights to each score, poultry-associated combinations represented the highest risk. In particular, contamination with extended-spectrum β-lactamase/plasmidic AmpC-producing Escherichia coli in poultry meat ranked high for both exposure and hazard characterization. Tetracycline- or macrolide-resistant Enterococcus spp., as well as fluoroquinolone- or macrolide-resistant Campylobacter jejuni, ranked among the combinations with the highest risk. This study provides a basis for prioritizing future activities to mitigate the risk associated with foodborne AMR in Switzerland. A user-friendly version of the model was provided to risk managers; it can easily be adjusted to the constantly evolving knowledge on AMR.

15.
This study developed dose-response models for determining the probability of eye or central nervous system infections from previously conducted studies using different strains of Acanthamoeba spp. The data came from animal experiments in which mice and rats were exposed corneally and intranasally to the pathogens. The corneal inoculations of Acanthamoeba isolate Ac 118 included varied amounts of Corynebacterium xerosis and were best fit by the exponential model; virulence increased with higher levels of C. xerosis. The Acanthamoeba culbertsoni intranasal study, with death as the endpoint of response, was best fit by the beta-Poisson model. The HN-3 strain of A. castellanii was studied with intranasal exposure and three different endpoints of response; for all three endpoints, the exponential model was the best fit. A model based on pooling the data sets for the intranasal exposure and death endpoint resulted in an LD50 of 19,357 amebae. The dose-response models developed in this study are an important step toward characterizing the risk associated with free-living amoebae such as Acanthamoeba in drinking water distribution systems. Understanding the human health risk posed by free-living amoebae will allow for quantitative microbial risk assessments that support building design decisions to minimize opportunities for pathogen growth and survival.
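For the exponential dose-response model used in the pooled fit, the LD50 pins down the single parameter: setting P(LD50) = 0.5 gives k = ln 2 / LD50. A sketch using the reported pooled LD50 of 19,357 amebae:

```python
import math

def exponential_response(dose, k):
    """Exponential dose-response model: P(response) = 1 - exp(-k * dose)."""
    return 1.0 - math.exp(-k * dose)

# Reported pooled LD50 (intranasal exposure, death endpoint).
LD50 = 19_357
k = math.log(2) / LD50   # P(LD50) = 0.5  =>  k = ln(2) / LD50

print(f"k = {k:.3e}")                   # per-ameba parameter
print(exponential_response(LD50, k))    # 0.5 by construction
print(exponential_response(100, k))     # low-dose response, illustrative
```

Only the LD50 value comes from the abstract; the relationship between LD50 and k is a general property of the exponential model, not a detail stated there.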

16.
The risk analysis of the health impact of foods is increasingly focused on integrated risk-benefit assessment, which will also need to be communicated to consumers. It therefore becomes important to understand how consumers respond to integrated risk-benefit information. The quality-adjusted life year (QALY) is one measure that can be used to assess the balance between risks and benefits associated with a particular food. The effectiveness of QALYs for communicating both positive and negative health effects associated with food consumption to consumers was examined, using a 3 × 2 experiment varying information about health changes in terms of QALYs associated with the consumption of fish (n = 325). The effect of this information on consumer perceptions of the usefulness of QALYs for describing health effects, and on risk and benefit perceptions, attitudes, and intentions to consume fish, was examined. Results demonstrated that consumers perceived QALYs as useful for communicating health effects associated with food consumption. QALYs communicated as a net effect were preferred for food products associated with negative net effects on health, while separate communication of both risks and benefits may be preferred for food products associated with positive or zero net health effects. Information about health changes in terms of QALYs facilitated informed decision making by consumers, as indicated by the impact on risk and benefit perceptions as intended by the information. The impact of this information on actual food consumption choices merits further investigation.

17.
《Risk analysis》2018,38(8):1738-1757
We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0–5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400–248,000) cases/year. The risk reduction (5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33–448) or 1.4 (95% CI <1–4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10–146) or <1 (95% CI <1–1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted; e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22–298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.

18.
《Risk analysis》2018,38(6):1107-1115
Coal combustion residuals (CCRs) are composed of various constituents, including radioactive materials. The objective of this study was to utilize the Environmental Protection Agency's (EPA) radionuclide risk assessment methodology to estimate the potential cancer risks associated with residential exposure to CCR-containing soil. We evaluated potential radionuclide exposure via soil ingestion, inhalation of soil particulates, and external exposure to ionizing radiation using published CCR radioactivity values for Th-232, Ra-228, U-238, and Ra-226 from the Appalachia, Illinois, and Powder River coal basins. Mean and upper-bound cancer risks were estimated individually for each radionuclide, exposure pathway, and coal basin. For each radionuclide at each coal basin, external exposure to ionizing radiation contributed the most to the overall risk estimate, followed by incidental ingestion of soil and inhalation of soil particulates. The mean cancer risks by route of exposure were 2.01 × 10^-6 (ingestion), 6.80 × 10^-9 (inhalation), and 3.66 × 10^-5 (external), while the upper-bound cancer risks were 3.70 × 10^-6 (ingestion), 1.18 × 10^-8 (inhalation), and 6.15 × 10^-5 (external), using summed radionuclide-specific data from all locations. The upper-bound cancer risk from all routes of exposure was 6.52 × 10^-5. These estimated cancer risks were within the EPA's acceptable cancer risk range of 1 × 10^-6 to 1 × 10^-4. If the CCR radioactivity values used in this analysis are generally representative of CCR waste streams, then our findings suggest that CCRs would not be expected to pose a significant radiological risk to residents living in areas where contact with CCR-containing soils might occur.

19.
Cryptosporidium human dose-response data from seven species/isolates are used to investigate six models of varying complexity that estimate infection probability as a function of dose. Previous models attempt to explicitly account for virulence differences among C. parvum isolates, using three or six species/isolates. Four models (two new) assume species/isolate differences are insignificant, and three of these (all but the exponential) allow for variable human susceptibility. These three human-focused models (fractional Poisson, exponential with immunity, and beta-Poisson) are relatively simple yet fit the data significantly better than the more complex isolate-focused models. Among these three, the one-parameter fractional Poisson model is the simplest but assumes that all Cryptosporidium oocysts used in the studies were capable of initiating infection. The exponential-with-immunity model does not require such an assumption and includes the fractional Poisson as a special case. The fractional Poisson model is an upper bound of the exponential-with-immunity model and applies when all oocysts are capable of initiating infection. The beta-Poisson model does not allow an immune human subpopulation; thus infection probability approaches 100% as dose becomes very large. All three of these models predict significantly (>10×) greater risk at the low doses that consumers might receive through drinking water or other environmental exposure (e.g., 72% vs. 4% infection probability for a one-oocyst dose) than previously predicted. This new insight into Cryptosporidium risk suggests that additional inactivation and removal via treatment may be needed to meet any specified risk target, such as a suggested 10^-4 annual risk of Cryptosporidium infection.
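The contrast drawn above between the one-parameter fractional Poisson model (an immune fraction, all oocysts infective) and the beta-Poisson model (no immune subpopulation) can be illustrated numerically. The parameter values below are illustrative, not the study's fitted values:

```python
import math

def fractional_poisson(mean_dose, frac_susceptible):
    """One-parameter fractional Poisson model: a fraction of the population
    is fully susceptible (any viable organism can initiate infection);
    the remainder is effectively immune, capping the risk."""
    return frac_susceptible * (1.0 - math.exp(-mean_dose))

def approx_beta_poisson(mean_dose, alpha, beta):
    """Approximate beta-Poisson model, P = 1 - (1 + d/beta)^(-alpha):
    risk approaches 1 as dose grows, i.e. no immune subpopulation."""
    return 1.0 - (1.0 + mean_dose / beta) ** (-alpha)

# Illustrative parameters only.
print(fractional_poisson(1.0, 0.9))           # substantial risk at one oocyst
print(approx_beta_poisson(1.0, 0.1, 100.0))   # far lower low-dose risk
print(fractional_poisson(1e9, 0.9))           # plateaus at the susceptible fraction
```

With these assumed parameters the fractional Poisson form gives much higher low-dose risk while saturating below 100%, which is the qualitative behavior the abstract attributes to the human-focused models.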

20.
Self-driving vehicles (SDVs) promise to considerably reduce traffic crashes. One pressing concern facing the public, automakers, and governments is "How safe is safe enough for SDVs?" To answer this question, a new expressed-preference approach was proposed for the first time to determine the socially acceptable risk of SDVs. In our between-subject survey (N = 499), we determined the respondents' risk-acceptance rates for scenarios with varying traffic-risk frequencies to examine the logarithmic relationship between traffic-risk frequency and risk-acceptance rate. Logarithmic regression models for SDVs were compared to those for human-driven vehicles (HDVs); the results showed that SDVs were required to be safer than HDVs. Given the same traffic-risk-acceptance rates for SDVs and HDVs, the associated acceptable risk frequencies were predicted and compared. Two risk-acceptance criteria emerged: the tolerable risk criterion, which indicates that SDVs should be four to five times as safe as HDVs, and the broadly acceptable risk criterion, which suggests that half of the respondents hoped the traffic risk of SDVs would be two orders of magnitude lower than the current estimated traffic risk. The approach and these results could provide insights for government regulatory authorities in establishing clear safety requirements for SDVs.
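The core of the approach above is a logarithmic regression of risk-acceptance rate on traffic-risk frequency, which can then be inverted to read off the frequency at any chosen acceptance level. A sketch on hypothetical survey data (not the study's):

```python
import math

# Hypothetical survey data: traffic-risk frequency vs. fraction of
# respondents accepting that risk level.
freqs  = [1e-4, 1e-5, 1e-6, 1e-7, 1e-8]
accept = [0.05, 0.20, 0.45, 0.70, 0.90]

# Fit accept = a + b * log10(freq) by ordinary least squares.
xs = [math.log10(f) for f in freqs]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(accept) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, accept))
     / sum((x - xbar) ** 2 for x in xs))
a = ybar - b * xbar

# Invert the model: the frequency at which half of respondents accept
# the risk (acceptance falls as frequency rises, so b < 0).
half_accept_freq = 10 ** ((0.5 - a) / b)
print(b, half_accept_freq)
```

Fitting such curves separately for SDV and HDV scenarios and comparing the inverted frequencies at equal acceptance rates is the kind of comparison the abstract describes.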


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号