Similar Literature
20 similar records found.
1.
This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process gives the experts the opportunity to interact so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, before the actual quantification task. Individual experts' assessments of the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual assessments are then aggregated into a single probability density function per variable, weighting the experts according to their expertise. The elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the quantitative assessment task. The Classical model was used to weight the experts' assessments and construct a single distribution per variable; in this model, an expert's quality is based on performance on seed variables. An application of the proposed protocol in the broad, multidisciplinary field of animal health is presented. The results show that the protocol, combined with the proposed elicitation and analysis techniques, yielded valid data on the continuous variables of interest. In conclusion, the proposed protocol for eliciting quantitative data from a heterogeneous expert panel provided satisfactory results, and may be useful for expert judgment studies in other broad or multidisciplinary fields.
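The performance-weighting idea behind the Classical model can be sketched in miniature. The following is a much-simplified illustration, not Cooke's actual scoring: experts' (5%, 50%, 95%) quantiles for seed variables are binned against the known realizations, and a crude divergence-based surrogate stands in for the chi-square calibration p-value. All seed values and expert assessments are hypothetical.

```python
import math

# Ideal probabilities of a realization falling below the 5% quantile,
# between the 5% and 50%, between the 50% and 95%, and above the 95%.
EXPECTED = (0.05, 0.45, 0.45, 0.05)

def bin_counts(quantiles, realizations):
    counts = [0, 0, 0, 0]
    for (q5, q50, q95), x in zip(quantiles, realizations):
        if x < q5: counts[0] += 1
        elif x < q50: counts[1] += 1
        elif x < q95: counts[2] += 1
        else: counts[3] += 1
    return counts

def calibration_score(quantiles, realizations):
    """Divergence between empirical bin frequencies and EXPECTED;
    exp(-n * divergence) is a crude surrogate for Cooke's p-value."""
    n = len(realizations)
    counts = bin_counts(quantiles, realizations)
    div = sum((c / n) * math.log((c / n) / p)
              for c, p in zip(counts, EXPECTED) if c > 0)
    return math.exp(-n * div)

# Hypothetical seed realizations and each expert's (5%, 50%, 95%) quantiles
seeds = [10.0, 2.5, 7.1, 40.0, 0.8]
expert_a = [(8, 10, 13), (2, 3, 4), (6, 7, 9), (35, 41, 50), (0.5, 0.9, 1.2)]
expert_b = [(20, 30, 40), (5, 6, 7), (1, 2, 3), (60, 70, 80), (2, 3, 4)]

scores = {"A": calibration_score(expert_a, seeds),
          "B": calibration_score(expert_b, seeds)}
total = sum(scores.values())
weights = {e: s / total for e, s in scores.items()}
```

Expert A's intervals capture the seeds, so A dominates the normalized weights; expert B, whose intervals systematically miss, receives almost no weight.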

2.
Risk Analysis, 2018, 38(9): 1781–1794
In risky situations characterized by imminent decisions, scarce resources, and insufficient data, policymakers rely on experts to estimate model parameters and their associated uncertainties. Different elicitation and aggregation methods can vary substantially in their efficacy and robustness. While it is generally agreed that biases in expert judgments can be mitigated using structured elicitations involving groups rather than individuals, there is still some disagreement about how best to elicit and aggregate judgments. This mostly concerns the merits of using performance-based weighting schemes to combine judgments of different individuals (rather than assigning equal weights to individual experts), and the way that interaction between experts should be handled. This article aims to contribute to, and complement, the ongoing discussion on these topics.

3.
In research on multi-attribute group decision-making methods, a combined weighting approach based on information entropy and group clustering is proposed to determine expert weights scientifically. Priority vectors are obtained by normalizing each expert's judgment matrix, and a correlation matrix is constructed from them using the correlation-coefficient method. The optimal clustering threshold is selected by analyzing the rate of change of the threshold, so that highly similar priority vectors are grouped into reasonable clusters. Information entropy is then used to weight the experts within each cluster, and the clustering results are combined with the entropy of the priority vectors to determine each expert's overall weight. A numerical example shows that the method effectively classifies similar expert evaluations, accurately measures the information content of each expert's assessment, and improves the soundness of expert weighting and the scientific basis of group decision-making.
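The within-cluster entropy weighting step can be illustrated with a minimal sketch. It assumes each expert's judgment matrix has already been reduced to a normalized priority vector; the vectors below are invented for illustration, and the (1 − entropy) "divergence degree" used here is one common variant of entropy weighting, not necessarily the exact formula of the article.

```python
import math

def entropy_weights(priority_vectors):
    """Weight experts by the information content of their normalized
    priority vectors: lower entropy -> more discriminating -> more weight."""
    n = len(priority_vectors[0])
    degrees = []
    for v in priority_vectors:
        # Normalized Shannon entropy in [0, 1]
        e = -sum(p * math.log(p) for p in v if p > 0) / math.log(n)
        degrees.append(1.0 - e)  # divergence degree
    total = sum(degrees)
    return [d / total for d in degrees]

# Hypothetical normalized priority vectors over 4 alternatives
experts = [
    [0.40, 0.30, 0.20, 0.10],  # fairly discriminating
    [0.26, 0.25, 0.25, 0.24],  # almost uniform: little information
    [0.55, 0.25, 0.12, 0.08],  # most discriminating
]
w = entropy_weights(experts)
```

The near-uniform expert contributes almost no discriminating information and so receives a weight close to zero.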

4.
Yifan Zhang. Risk Analysis, 2013, 33(1): 109–120
Expert judgment (or expert elicitation) is a formal process for eliciting judgments from subject-matter experts about the value of a decision-relevant quantity. Judgments in the form of subjective probability distributions are obtained from several experts, raising the question of how best to combine information from multiple experts. A number of algorithmic approaches have been proposed, of which the most commonly employed is the equal-weight combination (the average of the experts' distributions). We evaluate the properties of five combination methods (equal-weight, best-expert, performance, frequentist, and copula) using simulated expert-judgment data for which we know the process generating the experts' distributions. We examine cases in which two well-calibrated experts are of equal or unequal quality and their judgments are independent, positively dependent, or negatively dependent. In this setting, the copula, frequentist, and best-expert approaches perform better, and the equal-weight combination method worse, than the alternatives.

5.
A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses, from slaughter to chicken meat product, was developed by Nauta et al. (2005). This model was quantified with expert judgment. The recent availability of data allows the model's parameters to be updated to better describe the processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique for updating expert judgment with microbiological data, and use Berrang and Dickens's data to demonstrate the method's performance in updating the parameters of the chicken processing line model.
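Bayesian updating of an expert-elicited parameter is conjugate in the simplest case. A minimal sketch, assuming the expert judgment is encoded as a Beta prior on a contamination probability and the microbiological data are binomial counts; all numbers are hypothetical, not the Nauta et al. or Berrang and Dickens data:

```python
# Beta-Binomial conjugate update: expert prior + observed counts -> posterior.
def beta_update(a, b, successes, trials):
    """Posterior Beta parameters after observing binomial data."""
    return a + successes, b + trials - successes

def beta_mean(a, b):
    return a / (a + b)

a0, b0 = 2.0, 8.0   # expert prior: mean 0.2, fairly uncertain
k, n = 35, 100      # hypothetical: 35 contaminated carcasses out of 100

a1, b1 = beta_update(a0, b0, k, n)
prior_mean = beta_mean(a0, b0)      # 0.2
post_mean = beta_mean(a1, b1)       # 37/110, pulled toward the data
```

The posterior mean lies between the expert's prior mean and the observed frequency, with the data dominating as the sample size grows.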

6.
Good policy making should be based on available scientific knowledge. Sometimes this knowledge is well established through research, but often scientists must simply express their judgment, and this is particularly so in risk scenarios that are characterized by high levels of uncertainty. Usually in such cases, the opinions of several experts will be sought in order to pool knowledge and reduce error, raising the question of whether individual expert judgments should be given different weights. We argue—against the commonly advocated "classical method"—that no significant benefits are likely to accrue from unequal weighting in mathematical aggregation. Our argument hinges on the difficulty of constructing reliable and valid measures of substantive expertise upon which to base weights. Practical problems associated with attempts to evaluate experts are also addressed. While our discussion focuses on one specific weighting scheme that is currently gaining in popularity for expert knowledge elicitation, our general thesis applies to externally imposed unequal weighting schemes more generally.

7.
Combining Probability Distributions From Experts in Risk Analysis
This paper concerns the combination of experts' probability distributions in risk analysis, discussing a variety of combination methods and attempting to highlight the important conceptual and practical issues to be considered in designing a combination process in practice. The role of experts is important because their judgments can provide valuable information, particularly in view of the limited availability of hard data regarding many important uncertainties in risk analysis. Because uncertainties are represented in terms of probability distributions in probabilistic risk analysis (PRA), we consider expert information in terms of probability distributions. The motivation for the use of multiple experts is simply the desire to obtain as much information as possible. Combining experts' probability distributions summarizes the accumulated information for risk analysts and decision-makers. Procedures for combining probability distributions are often compartmentalized as mathematical aggregation methods or behavioral approaches, and we discuss both categories. However, an overall aggregation process could involve both mathematical and behavioral aspects, and no single process is best in all circumstances. An understanding of the pros and cons of different methods and the key issues to consider is valuable in the design of a combination process for a specific PRA. The output, a combined probability distribution, can ideally be viewed as representing a summary of the current state of expert opinion regarding the uncertainty of interest.
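The simplest mathematical aggregation method in this family, the linear opinion pool, can be sketched directly. Assuming each expert's distribution is summarized as a normal density (the means, standard deviations, and weights below are hypothetical), the combined density is the weighted average of the individual densities:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def linear_pool(x, experts, weights):
    """Linear opinion pool: weighted average of expert densities at x."""
    return sum(w * normal_pdf(x, mu, s) for w, (mu, s) in zip(weights, experts))

experts = [(10.0, 2.0), (14.0, 3.0)]  # hypothetical (mean, sd) per expert
weights = [0.5, 0.5]                  # equal-weight combination

# The pool's mean is the weighted average of the expert means.
pool_mean = sum(w * mu for w, (mu, _) in zip(weights, experts))
density_at_mean = linear_pool(pool_mean, experts, weights)
```

Because the pool is a mixture, it typically has heavier spread than any single expert's distribution, which is often the desired behavior when experts disagree.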

8.
Setting action levels or limits for health protection is complicated by uncertainty in the dose-response relation across a range of hazards and exposures. To address this issue, we consider the classic newsboy problem. The principles used to manage uncertainty for that case are applied to two stylized exposure examples, one for high dose and high dose rate radiation and the other for ammonia. Both incorporate expert judgment on uncertainty quantification in the dose-response relationship. The mathematical technique of probabilistic inversion also plays a key role. We propose a coupled approach, whereby scientists quantify the dose-response uncertainty using techniques such as structured expert judgment with performance weights and probabilistic inversion, and stakeholders quantify associated loss rates.
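The newsboy (newsvendor) logic sets the action level at the critical fractile of the uncertain quantity. A minimal sketch with hypothetical loss rates and a normal distribution standing in for the expert-quantified uncertainty; this illustrates the fractile principle only, not the article's probabilistic-inversion machinery:

```python
from statistics import NormalDist

def critical_fractile(underage_cost, overage_cost):
    """Newsvendor service level balancing the two one-sided losses."""
    return underage_cost / (underage_cost + overage_cost)

# Hypothetical stakeholder loss rates: setting the limit too low costs
# 9x as much as setting it too high.
cu, co = 9.0, 1.0
beta = critical_fractile(cu, co)          # 0.9

# Hypothetical expert-quantified uncertainty about the critical exposure
uncertain_dose = NormalDist(mu=100, sigma=15)
action_level = uncertain_dose.inv_cdf(beta)
```

Asymmetric losses push the action level away from the median: here the limit lands near the 90th percentile rather than at the central estimate of 100.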

9.
A model is constructed for the failure frequency of underground pipelines per kilometer-year, as a function of pipe and environmental characteristics. The parameters in the model were quantified, with uncertainty, using historical data and structured expert judgment. Fifteen experts from institutes in the Netherlands, the United Kingdom, Italy, France, Germany, Belgium, Denmark, and Canada participated in the study.

10.
Research evaluations based on quality weighted publication output are often criticized on account of the employed journal quality weights. This study shows that evaluations of entire research organizations are very robust with respect to the choice of readily available weighting schemes. We document this robustness by applying rather different weighting schemes to otherwise identical rankings. Our unit of analysis consists of German, Austrian and Swiss university departments in business administration and economics.

11.
Using Bayesian Networks to Model Expected and Unexpected Operational Losses
This report describes the use of Bayesian networks (BNs) to model statistical loss distributions in financial operational risk scenarios. Its focus is on modeling "long" tail, or unexpected, loss events using mixtures of appropriate loss frequency and severity distributions where these mixtures are conditioned on causal variables that model the capability or effectiveness of the underlying controls process. The use of causal modeling is discussed from the perspective of exploiting local expertise about process reliability and formally connecting this knowledge to actual or hypothetical statistical phenomena resulting from the process. This brings the benefit of supplementing sparse data with expert judgment and transforming qualitative knowledge about the process into quantitative predictions. We conclude that BNs can help combine qualitative data from experts and quantitative data from historical loss databases in a principled way and as such they go some way in meeting the requirements of the draft Basel II Accord (Basel, 2004) for an advanced measurement approach (AMA).

12.
A robust process minimises the effect of noise factors on the performance of a product or process. The variation in the performance of a robust process can be measured through modelling and analysis of process robustness. In this paper, a comprehensive methodology for modelling and analysing process robustness is developed using a number of relevant tools and techniques, such as multivariate regression, control charting and simulation, within the broad framework of the Taguchi method. The methodology covers process modelling using historical data on responses, input variables and parameters as well as simulated noise-variable data; identification of the model responses at each experimental setting of the controllable variables; estimation of multivariate process capability indices; and control of their variability using control charting to determine optimal settings of the process variables via the design-of-experiments-based Taguchi method. The methodology is applied to a centrifugal casting process that produces worm-wheels for steam power plants, where maintaining consistent performance under varying controllable input conditions is critically important. The results show that the determined process settings ensure minimum in-control variability with maximum performance of the centrifugal casting process, indicating an improved level of robustness.
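The Taguchi-style robustness comparison at the heart of such a methodology can be sketched with a signal-to-noise ratio. This toy example uses the nominal-the-best S/N ratio and invented replicate measurements; the article's full machinery (multivariate regression, capability indices, control charts) is not reproduced:

```python
import math

def sn_nominal_the_best(values):
    """Taguchi nominal-the-best S/N ratio: 10 * log10(mean^2 / variance).
    Higher is better: on-target mean with low spread."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

# Hypothetical replicate responses at two control-variable settings
setting_1 = [50.2, 49.8, 50.1, 49.9]  # on target, low spread
setting_2 = [50.5, 48.7, 51.4, 49.0]  # similar mean, higher spread

robust_setting = max([setting_1, setting_2], key=sn_nominal_the_best)
```

Both settings hit roughly the same mean, but the first has far less variation under noise, so it wins on the S/N criterion.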

13.
This case study examines the hazard and risk perception and the need for decontamination according to people exposed to soil pollution. Using an ecological-symbolic approach (ESA), a multidisciplinary model is developed that draws upon psychological and sociological perspectives on risk perception and includes ecological variables by using data from experts' risk assessments. The results show that hazard perception is best predicted by objective knowledge, subjective knowledge, estimated knowledge of experts, and the assessed risks. However, experts' risk assessments induce an increase in hazard perception only when residents know the urgency of decontamination. Risk perception is best predicted by trust in the risk management. Additionally, need for decontamination relates to hazard perception, risk perception, estimated knowledge of experts, and thoughts about sustainability. In contrast to the knowledge deficit model, objective and subjective knowledge did not significantly relate to risk perception and need for decontamination. The results suggest that residents can make a distinction between hazards in terms of the seriousness of contamination on the one hand, and human health risks on the other hand. Moreover, next to the importance of social determinants of environmental risk perception, this study shows that the output of experts' risk assessments—or the objective risks—can create a hazard awareness rather than an alarming risk consciousness, despite residents' distrust of scientific knowledge.

14.
In this article, we present a methodology to assess the risk incurred by a participant in an activity involving danger of injury. The lack of high-quality historical data for the case considered prevented us from constructing a sufficiently detailed statistical model. It was therefore decided to generate a risk assessment model based on expert judgment. The methodology is illustrated in a real case context: the assessment of risk to participants in a San Fermin bull-run in Pamplona (Spain). The members of the panel of "experts on the bull-run" represented very different perspectives on the phenomenon: runners, surgeons and other health care personnel, journalists, civil defense workers, security staff, organizers, herdsmen, authors of books on the bull-run, etc. We consulted 55 experts. Our methodology includes the design of a survey instrument to elicit the experts' views and the statistical and mathematical procedures used to aggregate their subjective opinions.

15.
Jan F. Van Impe. Risk Analysis, 2011, 31(8): 1295–1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advisable to characterize variability separately from uncertainty. Sensitivity analysis indicates which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive to. Although a number of methods exist for applying sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of variability and of uncertainty for all input variables, as well as of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from consumption of deli meats.
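A first-order Sobol index can be estimated by Monte Carlo without specialist libraries. The sketch below uses a pick-freeze-style estimator on a toy additive model whose analytical first-order index for x2 is 4/5; it illustrates the index itself, not the article's ANOVA-like procedure or its listeriosis model:

```python
import random

random.seed(42)

def model(x1, x2):
    # Toy model with known decomposition: Var(Y) = 1/12 + 4/12 for uniform inputs
    return x1 + 2 * x2

def first_order_sobol_x2(n=100_000):
    """Pick-freeze estimator: S_2 = Cov(Y, Y') / Var(Y), where Y' reuses
    the same x2 but resamples x1."""
    y, y_frozen = [], []
    for _ in range(n):
        x1, x2, x1_resampled = random.random(), random.random(), random.random()
        y.append(model(x1, x2))
        y_frozen.append(model(x1_resampled, x2))  # x2 frozen, x1 resampled
    m = sum(y) / n
    mf = sum(y_frozen) / n
    var = sum((v - m) ** 2 for v in y) / n
    cov = sum((a - m) * (b - mf) for a, b in zip(y, y_frozen)) / n
    return cov / var

s2 = first_order_sobol_x2()  # analytical value: 4/5
```

Because the model is additive, the two first-order indices sum to one; in interaction-heavy models the pick-freeze estimator still works but the indices sum to less than one.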

16.
In human reliability analysis (HRA), dependence analysis refers to assessing the influence of the failure of the operators to perform one task on the failure probabilities of subsequent tasks. A commonly used approach is the technique for human error rate prediction (THERP). The assessment of the dependence level in THERP is a highly subjective judgment based on general rules for the influence of five main factors. A frequently used alternative method extends the THERP model with decision trees. Such trees should increase the repeatability of the assessments but they simplify the relationships among the factors and the dependence level. Moreover, the basis for these simplifications and the resulting tree is difficult to trace. The aim of this work is a method for dependence assessment in HRA that captures the rules used by experts to assess dependence levels and incorporates this knowledge into an algorithm and software tool to be used by HRA analysts. A fuzzy expert system (FES) underlies the method. The method and the associated expert elicitation process are demonstrated with a working model. The expert rules are elicited systematically and converted into a traceable, explicit, and computable model. Anchor situations are provided as guidance for the HRA analyst's judgment of the input factors. The expert model and the FES-based dependence assessment method make the expert rules accessible to the analyst in a usable and repeatable way, with an explicit and traceable basis.
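The FES machinery can be illustrated with a deliberately tiny fuzzy system. The two rules, the triangular membership parameters, and the input factors below are invented for illustration and are far simpler than the THERP-style factor set the article elicits:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def dependence_level(time_closeness, task_similarity):
    """Two-rule fuzzy inference: min for rule firing strength,
    weighted average of rule outputs for defuzzification."""
    # Rule 1: IF tasks are close in time AND similar THEN dependence HIGH (0.9)
    r1 = min(tri(time_closeness, 0.4, 1.0, 1.6),
             tri(task_similarity, 0.4, 1.0, 1.6))
    # Rule 2: IF tasks are distant in time AND dissimilar THEN dependence LOW (0.1)
    r2 = min(tri(time_closeness, -0.6, 0.0, 0.6),
             tri(task_similarity, -0.6, 0.0, 0.6))
    if r1 + r2 == 0:
        return 0.5  # no rule fires: neutral default
    return (r1 * 0.9 + r2 * 0.1) / (r1 + r2)

high = dependence_level(0.9, 0.9)  # close, similar tasks
low = dependence_level(0.1, 0.2)   # distant, dissimilar tasks
```

Intermediate inputs fire both rules partially and produce a graded dependence level between the two rule outputs, which is the repeatability benefit over a crisp decision tree.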

17.
Research on a Navigation Method for Enterprise Tacit Knowledge
Based on an analysis of the characteristics of enterprise tacit knowledge and the current state of research, a navigation method for enterprise tacit knowledge is proposed. The method has two components: an expert navigation method based on the balanced scorecard, which takes full account of differences in experts' abilities; and methods for querying expert navigation paths and related experts based on a multi-relational expert network, which take full account of the multiple relationships among experts and give managers a basis for identifying and evaluating experts. On this foundation, the expert network is evaluated with the social network analysis tool SociometryPro, yielding the characteristics of individual experts, and concrete measures for improving tacit knowledge sharing within the enterprise are identified.

18.
Kamel Bala, Wade D. Cook. Omega, 2003, 31(6): 439–450
This paper presents an improved measurement tool for evaluating the performance of branches within a major Canadian bank. While there have been numerous previous studies of performance in the banking industry, particularly at the branch level, this study differs in a very significant way: two kinds of data are used to develop the model. The first type is standard transaction data, available from any bank, which has formed the basis of numerous previous studies. The second type, obtained from the site studied, is classification information based on branch consultant/expert judgment as to good and poor performance of branches. The purpose herein is to present a modified version of an existing benchmarking model, data envelopment analysis (DEA), and to show how this tool is applied in the banking industry. The mechanism used to incorporate expert knowledge within the DEA framework is to first apply a discriminant or classification tool to quantify the functional relation that best captures the expert's mental model of performance. The outcome of this first phase is an orientation of variables to aid in the definition of inputs and outputs. The resulting orientation then defines the DEA model that makes up the second phase.
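DEA efficiency scoring generally requires solving a linear program per decision-making unit, but in the single-input, single-output case it collapses to a normalized output/input ratio, which is enough to sketch the idea. The branch names and figures below are hypothetical, not the bank data used in the article:

```python
def ratio_efficiency(units):
    """Single-input, single-output DEA: each unit's output/input ratio,
    scaled so the best performer scores 1.0."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical branches: (staff hours as input, transactions processed as output)
branches = {"B1": (100, 900), "B2": (80, 800), "B3": (120, 600)}
efficiency = ratio_efficiency(branches)
```

With multiple inputs and outputs, the same idea generalizes: each unit picks the weights most favorable to itself, subject to no unit exceeding a score of 1, which is what the per-unit linear program encodes.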

19.
Craig W. Trumbo. Risk Analysis, 1999, 19(3): 391–400
The heuristic-systematic information processing model (HSM) holds that individuals use one or both of these modes of information processing when evaluating information to arrive at a judgment. Systematic processing is defined by effortful scrutiny and comparison of information, whereas heuristic processing is defined by the use of cues to arrive more easily at a judgment. Antecedents to the two processing modes include information sufficiency, motivation, and self-efficacy. Structural equation modeling is used to examine competing configurations of this model and to evaluate its suitability for predicting risk judgment. The model is also evaluated across three groups that vary in their level of concern. These analyses are executed within a case study involving an epidemiological investigation of a suspected cancer cluster. The analysis confirms the HSM's theoretically proposed structure and shows it to be a useful vehicle for evaluating risk judgment. In the overall analysis, antecedent variables generally function as specified by theory: systematic processing is predicted by greater motivation; heuristic processing is predicted by information sufficiency; self-efficacy is a significant predictor of both processing modes; and heuristic processing is associated with judgments of less risk. However, when the analysis is contrasted across three groups (those concerned about cancer, those not concerned, and those uncertain), the model is significantly more robust for the uncertain group. This finding may have implications for the use of the HSM in risk research specifically, and in field research generally.

20.
A Bayesian forecasting model is developed to quantify uncertainty about the postflight state of a field-joint primary O-ring (not damaged or damaged), given the O-ring temperature at the time of launch of the space shuttle Challenger in 1986. The crux of this problem is the enormous extrapolation that must be performed: 23 previous shuttle flights were launched at temperatures between 53 °F and 81 °F, but the next launch is planned at 31 °F. The fundamental advantage of the Bayesian model is its theoretic structure, which remains correct over the entire sample space of the predictor and affords flexibility of implementation. A novel approach to extrapolating the input elements based on expert judgment is presented; it recognizes that extrapolation is equivalent to changing the conditioning of the model elements. The prior probability of O-ring damage can be assessed subjectively by experts following a nominal-interacting process in a group setting. The Bayesian model can output several posterior probabilities of O-ring damage, each conditional on the given temperature and on a different strength of the temperature-effect hypothesis. A lower bound on, or a value of, the posterior probability can be selected for decision making consistently with expert judgment, which encapsulates engineering information, knowledge, and experience. The Bayesian forecasting model is posed as a replacement for the logistic regression and the nonparametric approach advocated in earlier analyses of the Challenger O-ring data. A comparison demonstrates the inherent deficiency of generalized linear models for risk analyses that require (1) forecasting an event conditional on a predictor value outside the sampling interval, and (2) combining empirical evidence with expert judgment.
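The combination of an expert-assessed prior with a temperature-effect hypothesis can be sketched with Bayes' rule in odds form. The prior probability and the likelihood ratios below are invented for illustration and are not the article's model outputs:

```python
def posterior_probability(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# Hypothetical expert-assessed prior probability of O-ring damage at the
# planned launch temperature, before conditioning on any effect hypothesis.
prior = 0.2

# Hypothetical likelihood ratios for a weak vs a strong temperature-effect
# hypothesis (how much more probable the cold observation is under damage).
weak_effect, strong_effect = 3.0, 12.0

p_weak = posterior_probability(prior, weak_effect)
p_strong = posterior_probability(prior, strong_effect)
```

The spread between the two posteriors shows why the article reports several posterior probabilities, one per hypothesized strength of the temperature effect, and lets the decisionmaker pick a value or lower bound consistent with engineering judgment.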


Copyright©北京勤云科技发展有限公司  京ICP备09084417号