20 similar documents found.
1.
Annual data from the Finnish National Salmonella Control Programme were used to build a probabilistic transmission model of salmonella in the primary broiler production chain. The data set consisted of information on grandparent, parent, and broiler flock populations. A probabilistic model was developed to describe the unknown true prevalences, vertical and horizontal transmission, and the dynamics of infection. By combining these with the observed data, the posterior probability distributions of the unknown parameters and variables could be derived. Predictive distributions were derived for the true number of infected broiler flocks under the adopted intervention scheme and compared with the predictions under no intervention. With the model, the effect of the intervention used in the programme, i.e., eliminating salmonella-positive breeding flocks, could be quantitatively assessed. The 95% probability interval of the posterior predictive distribution for (broiler) flock prevalence under the current (1999) situation was [1.3%, 17.4%] with no intervention and [0.9%, 5.8%] with intervention. In the scenario of one infected grandparent flock, these intervals were [2.8%, 43.1%] and [1.0%, 5.9%], respectively. Computations were performed using WinBUGS and Matlab software.
2.
Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents is not so well established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of data relevant to major accidents, since such accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures, based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States.
3.
Özkan Uğurlu Serdar Yıldız Sean Loughney Jin Wang Shota Kuntchulia Irakli Sharabidze 《Risk analysis》2020,40(12):2610-2638
This study examines and analyzes marine accidents that have occurred over the past 20 years in the Black Sea. Geographic information system, human factor analysis and classification system (HFACS), and Bayesian network models are used to analyze the marine accidents. The most important feature distinguishing this study from previous work is that it is the first to analyze accidents across the whole Black Sea. Another important feature is the application of a new HFACS structure to reveal accident formation patterns. The results of this study indicate that accidents occurred in high concentrations in coastal regions of the Black Sea, especially in the Kerch Strait, Novorossiysk, Kilyos, Constanta, Riva, and Batumi regions. The formation of grounding and sinking accidents has been found to be similar in nature; the use of inland and old vessels has been highlighted as an important factor in sinking and grounding incidents. However, the sequence of events leading to collision-contact accidents differs from the sequence of events resulting in grounding and sinking accidents. This study aims to provide information to the maritime industry regarding the occurrence of maritime incidents in the Black Sea, in order to assist with the reduction and prevention of marine accidents.
4.
Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability cannot adequately support the probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that the proposed hierarchical structure adequately characterizes variability across different populations.
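As a toy illustration of how a hierarchical structure pools information across populations, the sketch below computes empirical-Bayes shrinkage estimates of per-population background risk. This is a deliberately simplified stand-in for the paper's full Bayesian hierarchical model, and all numbers (`raw`, `group_var`) are hypothetical:

```python
import statistics

def shrinkage_estimates(group_means, group_var, prior_mean=None, prior_var=None):
    """Empirical-Bayes shrinkage: pool noisy per-population mean estimates
    toward the grand mean. group_var is the (assumed common) sampling
    variance of each group mean; prior_* default to moment estimates."""
    if prior_mean is None:
        prior_mean = statistics.mean(group_means)
    if prior_var is None:
        # between-group variance estimate, truncated at (near) zero
        prior_var = max(statistics.pvariance(group_means) - group_var, 1e-12)
    w = prior_var / (prior_var + group_var)  # shrinkage weight in (0, 1)
    return [prior_mean + w * (m - prior_mean) for m in group_means]

# Hypothetical per-population background-risk estimates (logit scale)
raw = [-2.0, -1.2, -3.1, -1.8]
pooled = shrinkage_estimates(raw, group_var=0.2)
```

Each pooled estimate lies between its raw value and the grand mean, with noisier data (larger `group_var`) producing stronger pooling; a full Bayesian treatment would instead place priors on the population-level mean and variance.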
5.
Ioannis A. Papazoglou Olga Aneziris Linda Bellamy B. J. M. Ale Joy I. H. Oh 《Risk analysis》2015,35(8):1536-1561
Occupational risk rates per hour of exposure have been quantified for 63 occupational accident types for the Dutch working population. Data were obtained from the analysis of more than 9,000 accidents that occurred over a period of six years in the Netherlands and resulted in three types of reportable consequences under Dutch law: (a) fatal injury, (b) permanent injury, and (c) serious recoverable injury requiring at least one day of hospitalization. A Bayesian uncertainty assessment on the value of the risk rates has been performed. Annual risks for each of the 63 occupational accident types have been calculated, including the variability in the annual exposure of the working population to the corresponding hazards. The suitability of three risk measures—individual risk rates, individual annual risk, and number of accidents—is examined and discussed.
6.
Recent work in the assessment of risk in maritime transportation systems has used simulation-based probabilistic risk assessment techniques. In the Prince William Sound and Washington State Ferries risk assessments, the studies' recommendations were backed up by estimates of their impact made using such techniques, and all recommendations were implemented. However, the level of uncertainty about these estimates was not available, leaving the decision makers unsure whether the evidence was sufficient to assess specific risks and benefits. The first step toward assessing the impact of uncertainty in maritime risk assessments is to model the uncertainty in the simulation models used. In this article, a study of the impact of proposed ferry service expansions in San Francisco Bay is used as a case study to demonstrate the use of Bayesian simulation techniques to propagate uncertainty throughout the analysis. The conclusions drawn in the original study are shown, in this case, to be robust to the inherent uncertainties. The main intellectual merit of this work is the development of Bayesian simulation techniques to model uncertainty in the assessment of maritime risk. To date, however, Bayesian simulations have been implemented only as theoretical demonstrations; their use in a large, complex system may be considered state of the art in the field of computational sciences.
7.
We conducted a regional‐scale integrated ecological and human health risk assessment by applying the relative risk model with Bayesian networks (BN‐RRM) to a case study of the South River, Virginia mercury‐contaminated site. Risk to four ecological services of the South River (human health, water quality, recreation, and the recreational fishery) was evaluated using a multiple stressor–multiple endpoint approach. These four ecological services were selected as endpoints based on stakeholder feedback and prioritized management goals for the river. The BN‐RRM approach allowed for the calculation of relative risk to 14 biotic, human health, recreation, and water quality endpoints from chemical and ecological stressors in five risk regions of the South River. Results indicated that water quality and the recreational fishery were the ecological services at highest risk in the South River. Human health risk for users of the South River was low relative to the risk to other endpoints. Risk to recreation in the South River was moderate with little spatial variability among the five risk regions. Sensitivity and uncertainty analysis identified stressors and other parameters that influence risk for each endpoint in each risk region. This research demonstrates a probabilistic approach to integrated ecological and human health risk assessment that considers the effects of chemical and ecological stressors across the landscape.
8.
Sensitivity Analysis, Monte Carlo Risk Analysis, and Bayesian Uncertainty Assessment
Sander Greenland 《Risk analysis》2001,21(4):579-584
Standard statistical methods understate the uncertainty one should attach to effect estimates obtained from observational data. Among the methods used to address this problem are sensitivity analysis, Monte Carlo risk analysis (MCRA), and Bayesian uncertainty assessment. Estimates from MCRAs have been presented as if they were valid frequentist or Bayesian results, but examples show that they need not be either in actual applications. It is concluded that both sensitivity analyses and MCRA should begin with the same type of prior specification effort as Bayesian analysis.
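A minimal sketch of the Monte Carlo risk analysis idea referenced above: an observed relative risk is divided by a bias factor drawn repeatedly from a prior representing uncontrolled confounding. The lognormal prior and all parameter values are illustrative assumptions, not taken from the article:

```python
import math
import random

def mc_bias_adjusted_rr(observed_rr, n_draws=20_000, seed=1):
    """Monte Carlo risk analysis sketch: divide the observed relative risk
    by a bias factor drawn from a lognormal prior (median 1) representing
    uncontrolled confounding, then summarize the adjusted distribution.
    The prior scale (0.25 on the log scale) is illustrative only."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        bias = math.exp(rng.gauss(0.0, 0.25))  # prior draw for the bias factor
        draws.append(observed_rr / bias)
    draws.sort()
    median = draws[n_draws // 2]
    interval = (draws[int(0.025 * n_draws)], draws[int(0.975 * n_draws)])
    return median, interval

median_rr, (lo, hi) = mc_bias_adjusted_rr(2.0)
```

The resulting interval is wider than a conventional confidence interval because it folds in the prior uncertainty about bias, which is exactly why the article argues such analyses need the same careful prior specification as a Bayesian analysis.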
9.
Accident Epidemiology and the U.S. Chemical Industry: Accident History and Worst-Case Data from RMP*Info
Paul R. Kleindorfer James C. Belke Michael R. Elliott Kiwan Lee Robert A. Lowe Harold I. Feldman 《Risk analysis》2003,23(5):865-881
This article reports on the data collected on one of the most ambitious government-sponsored environmental data acquisition projects of all time, the Risk Management Plan (RMP) data collected under section 112(r) of the Clean Air Act Amendments of 1990. This RMP Rule 112(r) was triggered by the Bhopal accident in 1984 and led to the requirement that each qualifying facility develop and file with the U.S. Environmental Protection Agency a Risk Management Plan (RMP) as well as accident history data for the five-year period preceding the filing of the RMP. These data were collected in 1999-2001 on more than 15,000 facilities in the United States that store or use listed toxic or flammable chemicals believed to be a hazard to the environment or to human health of facility employees or off-site residents of host communities. The resulting database, RMP*Info, has become a key resource for regulators and researchers concerned with the frequency and severity of accidents, and the underlying facility-specific factors that are statistically associated with accident and injury rates. This article analyzes which facilities actually filed under the Rule and presents results on accident frequencies and severities available from the RMP*Info database. This article also presents summaries of related results from RMP*Info on Offsite Consequence Analysis (OCA), an analytical estimate of the potential consequences of hypothetical worst-case and alternative accidental releases on the public and environment around the facility. The OCA data have become a key input in the evaluation of site security assessment and mitigation policies for both government planners as well as facility managers and their insurers. Following the survey of the RMP*Info data, we discuss the rich set of policy decisions that may be informed by research based on these data.
10.
Peter Burgherr 《Risk analysis》2013,33(1):146-160
We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining, and final end use in power plants, heating, or gas stations. The risks are quantified separately for OECD and non-OECD countries and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data resulting in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it also inherently delivers a measure of uncertainty. This approach provides a framework which comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis.
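The frequency/severity decomposition described above can be sketched as a small compound-Poisson simulation: yearly accident counts are Poisson, and each accident's fatalities beyond a severity threshold follow a Generalized Pareto tail sampled by inverse CDF. The rate and tail parameters below are illustrative assumptions, not values fitted in the article:

```python
import math
import random

def simulate_annual_fatalities(rate, sigma, xi, years=10_000, threshold=5, seed=7):
    """Frequency/severity sketch: accidents per year ~ Poisson(rate); each
    accident's fatalities are threshold + a Generalized Pareto(sigma, xi)
    excess, drawn via the GPD inverse CDF. Parameters are illustrative."""
    rng = random.Random(seed)
    totals = []
    for _ in range(years):
        # Poisson draw via Knuth's product method (fine for small rates)
        limit, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                break
            k += 1
        fatalities = 0.0
        for _ in range(k):
            u = rng.random()
            excess = sigma / xi * ((1.0 - u) ** (-xi) - 1.0)  # GPD inverse CDF
            fatalities += threshold + excess
        totals.append(fatalities)
    return totals

totals = simulate_annual_fatalities(rate=1.5, sigma=10.0, xi=0.3)
p_exceed_100 = sum(t > 100 for t in totals) / len(totals)
```

A positive shape parameter (xi > 0) gives the heavy tail that makes extrapolation beyond the historically worst accidents so uncertain, which is the motivation for the Bayesian pooling across data sets described in the abstract.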
11.
The public health community, news media, and members of the general public have expressed significant concern that methicillin‐resistant Staphylococcus aureus (MRSA) transmitted from pigs to humans may harm human health. Studies of the prevalence and dynamics of swine‐associated (ST398) MRSA have sampled MRSA at discrete points in the presumed causative chain leading from swine to human patients, including sampling bacteria from live pigs, retail meats, farm workers, and hospital patients. Nonzero prevalence is generally interpreted as indicating a potential human health hazard from MRSA infections, but quantitative assessments of resulting risks are not usually provided. This article integrates available data from several sources to construct a conservative (plausible upper bound) probability estimate for the actual human health harm (MRSA infections and fatalities) arising from ST398‐MRSA from pigs. The model provides plausible upper bounds of approximately one excess human infection per year among all U.S. pig farm workers, and one human infection per 31 years among the remaining total population of the United States. These results assume the possibility of transmission events not yet observed, so additional data collection may reduce these estimates further.
12.
Antti Mikkelä Jukka Ranta Manuel González Marjaana Hakkinen Pirkko Tuominen 《Risk analysis》2016,36(11):2065-2080
A Bayesian statistical temporal-prevalence-concentration model (TPCM) was built to assess the prevalence and concentration of pathogenic campylobacter species in batches of fresh chicken and turkey meat at retail. The data set was collected from Finnish grocery stores in all seasons of the year. Observations at low concentration levels are often censored due to the limit of determination of the microbiological methods. This model utilized the potential of Bayesian methods to borrow strength from related samples in order to perform under heavy censoring. In this extreme case, the majority of the observed batch-specific concentrations were below the limit of determination. A hierarchical structure was included in the model in order to take into account the within-batch and between-batch variability, which may have a significant impact on the sample outcome depending on the sampling plan. Temporal changes in the prevalence of campylobacter were modeled using a Markovian time series. The proposed model is adaptable for other pathogens if the same type of data set is available. The computation of the model was performed using OpenBUGS software.
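The core censoring idea can be shown in a few lines: detected concentrations contribute a density term to the likelihood, while observations below the limit of determination contribute the probability mass below that limit. This is a sketch of the censored-likelihood mechanism only, not the paper's full hierarchical TPCM, and the example data are hypothetical:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def censored_loglik(mu, sigma, detects, n_censored, lod):
    """Log-likelihood of log10 concentrations under Normal(mu, sigma):
    detected values contribute a density term; each censored observation
    contributes the CDF mass below log10(lod)."""
    ll = 0.0
    for x in detects:  # detected observations: normal log-density
        z = (math.log10(x) - mu) / sigma
        ll += -math.log(sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
    # censored observations: probability of falling below the detection limit
    ll += n_censored * math.log(norm_cdf((math.log10(lod) - mu) / sigma))
    return ll

# Hypothetical batch: 3 detected counts (CFU/g), 17 samples below a LOD of 10
ll_low = censored_loglik(0.5, 0.5, [12.0, 20.0, 15.0], n_censored=17, lod=10.0)
ll_high = censored_loglik(2.0, 0.5, [12.0, 20.0, 15.0], n_censored=17, lod=10.0)
```

With heavy censoring the likelihood strongly favors low mean concentrations, which is why ignoring the censored samples (or substituting the LOD for them) would bias the concentration estimate upward.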
13.
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
14.
The Finnish salmonella control program (FSCP) for beef production is based on both randomized and selective testing of herds and animals. Sampling of individual animals in abattoirs is randomized. Herds are selectively tested for salmonella on the basis of clinical symptoms and/or other factors. Risk assessment of FSCP is inherently difficult due to the complexity of the complete data set, especially if the detailed labeling of the testing types is lost. However, for a risk assessment of the whole production chain, methods for exploiting all available data should be considered. For this purpose, a hierarchical Bayesian model of true salmonella prevalence was constructed to combine information at different levels of aggregation: herds in geographical regions and individual animals arriving for slaughter. The conditional (municipality specific) probability of selection of a herd for testing was modeled given the underlying true infection status of the herd and information about the general sampling activity in the specific region. The model also accounted for the overall sensitivity of the sampling methods, both at the herd and at the animal level. In 1999, the 95% posterior probability intervals of true salmonella prevalence in the herd population, in individual cattle, and in slaughter animal populations were [0.54%, 1.4%] (mode 0.8%), [0.15%, 0.39%] (mode 0.2%), and [0.12%, 0.36%] (mode 0.2%), respectively. The results will be further exploited in other risk assessments focusing on the subsequent parts of the beef production chain, and in evaluation of the FSCP.
15.
Despite rapid developments in the quality and safety of consumer products, the rise of intelligent household appliances, such as sweeping robots, has introduced new safety concerns. Considering “person–product–environment” elements and the complex systems of emerging consumer products, this study presents a new method of risk assessment for consumer products: systems theoretic process analysis (STPA)–failure mode and effects analysis (FMEA). As a case study, this method is applied to the safety control of a sweeping robot. The results suggest that this method can identify all the possible failure modes and injury scenarios among the product components, and the safety constraints in the hierarchical control structure of the interactive system. Moreover, the STPA–FMEA method combines user and environmental factors with the value of product risk events, based on the risk priority number (RPN). This provides an accurate and orderly system to reduce or eliminate the root causes of accidents and injuries. Finally, analysis of unsafe control behavior and its causes can be used to suggest improved safety constraints, which can effectively reduce the risk of some injury scenarios. This paper presents a new method of risk assessment for consumer products and a general five-level complex index system.
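The RPN mentioned above is the standard FMEA product of severity, occurrence, and detection ratings. A minimal sketch, with hypothetical failure modes and ratings for a sweeping robot (the ratings are illustrative, not taken from the study):

```python
def rpn(severity, occurrence, detection):
    """Risk priority number as used in FMEA: the product of severity,
    occurrence, and detection ratings, each on the conventional 1-10 scale."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must be on a 1-10 scale")
    return severity * occurrence * detection

# Hypothetical failure modes for a sweeping robot (illustrative ratings)
failure_modes = {
    "cliff-sensor failure (falls down stairs)": rpn(8, 3, 4),
    "battery overheating": rpn(9, 2, 5),
    "brush jam": rpn(3, 6, 2),
}
# Rank failure modes from highest to lowest priority
ranked = sorted(failure_modes.items(), key=lambda kv: kv[1], reverse=True)
```

Ranking by RPN orders the mitigation effort; the study's contribution is to feed STPA-derived unsafe control actions and user/environment factors into this scoring rather than component failures alone.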
16.
In 1996, an outbreak of E. coli O157:H7-associated illness occurred in an elementary school in Japan. This outbreak has been studied in unusual detail, making this an important case for quantitative risk assessment. The availability of stored samples of the contaminated food allowed reliable estimation of exposure to the pathogens. Collection of fecal samples allowed assessment of the numbers infected, including asymptomatic cases. Comparison to other published dose-response studies for E. coli O157:H7 shows that the strain that caused the outbreak studied here must have been considerably more infectious. We use this well-documented incident as an example to demonstrate how such information on the response to a single dose can be used for dose-response assessment. In particular, we demonstrate how the high infectivity limits the uncertainty in the low-dose region.
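To illustrate how a single observed dose and attack rate can pin down a dose-response model, the sketch below fits the one-parameter exponential model P(ill) = 1 - exp(-r * dose). The dose and attack rate used here are hypothetical stand-ins, not the outbreak data:

```python
import math

def fit_exponential_r(dose, attack_rate):
    """Solve the single-parameter exponential dose-response model
    P(ill) = 1 - exp(-r * dose) for r, given one observed dose and
    attack rate. Input numbers below are illustrative only."""
    return -math.log(1.0 - attack_rate) / dose

def p_ill(r, dose):
    """Predicted probability of illness at a given dose."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical single-dose observation: ~50 organisms, 40% attack rate
r = fit_exponential_r(dose=50.0, attack_rate=0.40)
low_dose_risk = p_ill(r, dose=1.0)
```

At low doses the model is nearly linear (P is approximately r * dose), so a reliably measured response at one dose tightly constrains the low-dose extrapolation, which is the abstract's point about high infectivity limiting low-dose uncertainty.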
17.
Louis H.J. Goossens 《Risk analysis》1991,11(2):217-228
Accidents with automatic production systems are reported to occur on the order of one in a hundred to a thousand robot-years, while fatal accidents are found to occur one or two orders of magnitude less frequently. Traditions in occupational safety tend to seek safety targets in terms of zero severe accidents for automatic systems. Decision-making requires a risk assessment balancing potential risk reduction measures and costs within the cultural environment of a production company. This paper presents a simplified procedure which acts as a decision tool. The procedure is based on a risk concept approaching prevention both in a deterministic and in a probabilistic manner. Eight accident scenarios are shown to represent the potential accident processes involving robot interactions with people. Seven prevention policies are shown to cover the accident scenarios in principle. An additional probabilistic approach may indicate which extra safety measures can be taken against what risk reduction and additional costs. The risk evaluation process aims at achieving a quantitative acceptable risk level. For that purpose, three risk evaluation methods are discussed with respect to reaching broad consensus on the safety targets.
18.
When they do not use formal quantitative risk assessment methods, many scientists (like other people) make mistakes and exhibit biases in reasoning about causation, if‐then relations, and evidence. Decision‐related conclusions or causal explanations are reached prematurely based on narrative plausibility rather than adequate factual evidence. Then, confirming evidence is sought and emphasized, but disconfirming evidence is ignored or discounted. This tendency has serious implications for health‐related public policy discussions and decisions. We provide examples occurring in antimicrobial health risk assessments, including a case study of a recently reported positive relation between virginiamycin (VM) use in poultry and risk of resistance to VM‐like (streptogramin) antibiotics in humans. This finding has been used to argue that poultry consumption causes increased resistance risks, that serious health impacts may result, and therefore use of VM in poultry should be restricted. However, the original study compared healthy vegetarians to hospitalized poultry consumers. Our examination of the same data using conditional independence tests for potential causality reveals that poultry consumption acted as a surrogate for hospitalization in this study. After accounting for current hospitalization status, no evidence remains supporting a causal relationship between poultry consumption and increased streptogramin resistance. This example emphasizes both the importance and the practical possibility of analyzing and presenting quantitative risk information using data analysis techniques (such as Bayesian model averaging (BMA) and conditional independence tests) that are as free as possible from potential selection, confirmation, and modeling biases.
19.
Risk filtering, ranking, and management framework using hierarchical holographic modeling.
This paper contributes a methodological framework to identify, prioritize, assess, and manage risk scenarios of a large-scale system. Qualitative screening of scenarios and classes of scenarios is appropriate initially, while quantitative assessments may be applied once the set of all scenarios (hundreds) has been prioritized in several phases. The eight-phase methodology is described in detail and is applied to operations other than war. The eight phases are as follows. Phase I, Scenario Identification: a hierarchical holographic model (HHM) is developed to describe the system's "as planned" or "success" scenario. Phase II, Scenario Filtering: the risk scenarios identified in Phase I are filtered according to the responsibilities and interests of the current system user. Phase III, Bi-Criteria Filtering and Ranking. Phase IV, Multi-Criteria Evaluation. Phase V, Quantitative Ranking: we continue to filter and rank scenarios based on quantitative and qualitative matrix scales of likelihood and consequence, and on ordinal responses to system resiliency, robustness, and redundancy. Phase VI, Risk Management: management options for dealing with the filtered scenarios are identified, and the cost, performance benefits, and risk reduction of each are estimated. Phase VII, Safeguarding Against Missing Critical Items: we examine the performance of the options selected in Phase VI against the scenarios previously filtered out during Phases II to V. Phase VIII, Operational Feedback: we use the experience and information gained during application to refine the scenario filtering and decision processes in earlier phases. These eight phases reflect a philosophical approach rather than a mechanical methodology. In this philosophy, the filtering and ranking of discrete scenarios is viewed as a precursor to, rather than a substitute for, consideration of the totality of all risk scenarios.
20.
Kyoji Furukawa Munechika Misumi John B. Cologne Harry M. Cullings 《Risk analysis》2016,36(6):1211-1223
In evaluating the risk of exposure to health hazards, characterizing the dose‐response relationship and estimating acceptable exposure levels are the primary goals. In analyses of health risks associated with exposure to ionizing radiation, while there is a clear agreement that moderate to high radiation doses cause harmful effects in humans, little has been known about the possible biological effects at low doses, for example, below 0.1 Gy, which is the dose range relevant to most radiation exposures of concern today. A conventional approach to radiation dose‐response estimation based on simple parametric forms, such as the linear nonthreshold model, can be misleading in evaluating the risk and, in particular, its uncertainty at low doses. As an alternative approach, we consider a Bayesian semiparametric model that has a connected piece‐wise‐linear dose‐response function with prior distributions having an autoregressive structure among the random slope coefficients defined over closely spaced dose categories. With a simulation study and application to analysis of cancer incidence data among Japanese atomic bomb survivors, we show that this approach can produce smooth and flexible dose‐response estimation while reasonably handling the risk uncertainty at low doses and elsewhere. With relatively few assumptions and modeling options to be made by the analyst, the method can be particularly useful in assessing risks associated with low‐dose radiation exposures.
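The connected piecewise-linear dose-response function can be sketched as a cumulative sum of per-category slopes: the curve starts at zero excess risk at dose zero and is automatically continuous at the category boundaries. In the paper the slopes carry autoregressive priors; here they are fixed illustrative values, as are the dose knots:

```python
def piecewise_linear_response(dose, knots, slopes):
    """Evaluate a connected piecewise-linear dose-response curve.
    knots[i] is the left edge of dose category i (knots[0] == 0.0) and
    slopes[i] applies on [knots[i], knots[i+1]); the last segment extends
    to infinity. Continuity follows from accumulating segment increments."""
    risk = 0.0
    for i, slope in enumerate(slopes):
        left = knots[i]
        right = knots[i + 1] if i + 1 < len(knots) else float("inf")
        if dose <= left:
            break
        # add this segment's contribution, truncated at the evaluation dose
        risk += slope * (min(dose, right) - left)
    return risk

knots = [0.0, 0.1, 0.5, 1.0]   # Gy, left edges of dose categories (illustrative)
slopes = [0.2, 0.5, 0.4, 0.3]  # excess relative risk per Gy (illustrative)
err_low = piecewise_linear_response(0.05, knots, slopes)
err_mid = piecewise_linear_response(0.75, knots, slopes)
```

In a Bayesian fit the autoregressive prior ties adjacent slopes together, smoothing the curve without forcing a single global shape such as the linear nonthreshold model.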