Similar Literature
Found 20 similar results (search time: 62 ms)
1.
In this article, Savage's theory of decision-making under uncertainty is extended from a classical environment into a non-classical one. The Boolean lattice of events is replaced by an arbitrary ortho-complemented poset. We formulate the corresponding axioms and provide representation theorems for qualitative measures and expected utility. Then, we discuss the issue of belief updating and investigate a transition probability model. An application to a simple game context is proposed.

2.
At a very fundamental level an individual (or a computer) can process only a finite amount of information in a finite time. We can therefore model the possibilities facing such an observer by a tree with only finitely many arcs leaving each node. There is a natural field of events associated with this tree, and we show that any finitely additive probability measure on this field will also be countably additive. Hence when considering the foundations of Bayesian statistics we may as well assume countable additivity over a σ-field of events.

3.
Any dynamic decision model should be based on conditional objects and must refer to (not necessarily structured) domains containing only the elements and the information of interest. We characterize binary relations, defined on an arbitrary set of conditional events, which are representable by a coherent generalized decomposable conditional measure and we study, in particular, the case of binary relations representable by a coherent conditional probability.

4.
Recent experimental evidence suggests that experimentally observed juxtaposition effects, as predicted by regret theory, are largely attributable to "event-splitting effects" (ESEs), whereby the subjective decision weight attached to an outcome depends on the number of, as well as on the combined probability of, the disjoint events in which that outcome occurs. An experiment is reported that discriminates between juxtaposition effects and ESEs under conditions of both complete and incomplete information. The results confirm that juxtaposition effects are indeed largely due to ESEs and are robust over different informational conditions.
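The mechanism behind an event-splitting effect can be sketched with a toy subadditive weighting function. The exponent 0.7 below is an illustrative assumption, not a value estimated in the reported experiment:

```python
def w(p, gamma=0.7):
    """Toy subadditive decision-weighting function (the exponent is an
    illustrative assumption, not an estimate from the experiment)."""
    return p ** gamma

# One event carrying the outcome with probability 0.3, versus the same
# outcome split across two disjoint events of probability 0.15 each.
combined_weight = w(0.3)
split_weight = w(0.15) + w(0.15)
# Under a subadditive w, the split presentation carries more total
# decision weight than the combined one: an event-splitting effect.
```

Any strictly subadditive weighting function produces the same qualitative pattern; the specific power form is only for concreteness.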

5.
Jeffrey conditioning tells an agent how to update her priors so as to grant a given probability to a particular event. Weighted averaging tells an agent how to update her priors on the basis of testimonial evidence, by changing to a weighted arithmetic mean of her priors and another agent's priors. We show that, in their respective settings, these two seemingly very different updating rules are axiomatized by essentially the same invariance condition. As a by-product, this sheds new light on the question of how weighted averaging should be extended to deal with cases in which the other agent reveals only part of her probability distribution. The combination of weighted averaging (for the events whose probability the other agent reveals) and Jeffrey conditioning (for the events whose probability the other agent does not reveal) is a comprehensive updating rule for such cases, which is again axiomatized by invariance under embedding. We conclude that, even though one may dislike Jeffrey conditioning or weighted averaging, the two make a natural pair when a policy for partial testimonial evidence is needed.
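The two updating rules discussed above can be sketched on a tiny finite world; the three worlds and all numbers below are illustrative assumptions:

```python
def jeffrey_update(prior, partition, new_probs):
    """Jeffrey conditioning: give each partition cell E_i its new
    probability q_i, rescaling the prior proportionally within each cell."""
    posterior = {}
    for cell, q in zip(partition, new_probs):
        mass = sum(prior[s] for s in cell)
        for s in cell:
            posterior[s] = q * prior[s] / mass
    return posterior

def weighted_average(p1, p2, weight):
    """Testimonial update: weighted arithmetic mean of two distributions
    over the same worlds."""
    return {s: weight * p1[s] + (1 - weight) * p2[s] for s in p1}

prior = {'a': 0.5, 'b': 0.3, 'c': 0.2}
# Force P({a}) = 0.6, leaving the odds within {b, c} untouched.
post = jeffrey_update(prior, [['a'], ['b', 'c']], [0.6, 0.4])
# Mix the original prior with another agent's distribution (here: post).
mix = weighted_average(prior, post, 0.5)
```

A combined policy of the kind the abstract describes would apply `weighted_average` on the revealed events and `jeffrey_update` to redistribute mass over the unrevealed ones.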

6.
We resolve a useful formulation of the question of how a statistician can coherently incorporate the information in a consulted expert's probability assessment for an event into a personal posterior probability assertion. Using a framework that recognises the total information available as composed of units available only to each of them along with units available to both, we show: that a sufficient statistic for all the information available to both the expert and the statistician is the product of their odds ratios in favour of the event; that the geometric mean of their two probabilities specifies a contour of pairs of assertions in the unit square that yield the same posterior probability; that the information-combining function is parameterised by an unknown probability for the event conditioned only on the unspecified information common to both the statistician and the expert; and that an assessable mixing distribution over this unspecified probability allows an integrable mixture distribution to represent a computable posterior probability. The exact results allow the identification of the subclass of coherent probabilities that are externally Bayesian operators. This subclass is equivalent to the class of combining functions that honour the principles of uniformity and compromise.
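The product-of-odds statistic can be sketched in a simplified special case. The sketch below assumes the shared background information is summarized by a single known prior `p_common` and that the two private information units are conditionally independent; the paper itself treats the common-information probability as unknown and mixes over it:

```python
def odds(p):
    """Odds ratio in favour of the event."""
    return p / (1.0 - p)

def pool(p_expert, p_statistician, p_common):
    """Combine two assessments via the product of their odds ratios,
    discounted by the odds of the shared prior.  A simplified special
    case (known common prior, conditionally independent private
    information), not the paper's full mixture construction."""
    o = odds(p_expert) * odds(p_statistician) / odds(p_common)
    return o / (1.0 + o)

# If the statistician knows nothing beyond the shared information
# (p_statistician == p_common), pooling returns the expert's assessment.
```

The symmetry of the product makes the combining rule indifferent to which party is labelled "expert", consistent with a compromise principle.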

7.
Making Low Probabilities Useful
This paper explores how people process information on low-probability, high-consequence negative events and what it takes to make individuals sensitive to the likelihood of these types of accidents or disasters. In a set of experiments, information is presented to individuals on the likelihood of serious accidents from a chemical facility. Comparisons are made with other risks, such as fatalities from automobile accidents, to see whether laypersons can determine the relative safety of different plants. We conclude that fairly rich contextual information must be available for people to be able to judge differences between low probabilities. In particular, it appears that one needs to present comparison scenarios located on the probability scale to evoke people's own feelings of risk. The concept of evaluability recently introduced by Hsee and his colleagues provides a useful explanation of these findings.

8.
9.
Subjectively weighted linear utility
An axiomatized theory of nonlinear utility and subjective probability is presented in which assessed probabilities are allowed to depend on the consequences associated with events. The representation includes the expected utility model as a special case, but can accommodate the Ellsberg paradox and other types of ambiguity-sensitive behavior, while retaining familiar properties of subjective probability, such as additivity for disjoint events and multiplication of conditional probabilities. It is an extension, to the states model of decision making under uncertainty, of Chew's weighted linear utility representation for decision making under risk.

10.
E-Capacities and the Ellsberg Paradox
Ellsberg's (1961) famous paradox shows that decision-makers give a higher weight to events with known probabilities when evaluating outcomes. In the same article, Ellsberg suggests a preference representation which has intuitive appeal but lacks an axiomatic foundation. Schmeidler (1989) and Gilboa (1987) provide an axiomatisation for expected utility with non-additive probabilities. This paper introduces E-capacities as a representation of beliefs which incorporates objective information about the probability of events. It can be shown that the Choquet integral of an E-capacity is the Ellsberg representation. The paper further explores properties of this representation of beliefs and provides an axiomatisation for it.
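The Choquet integral at the heart of this representation can be sketched on the classic three-colour Ellsberg urn. The capacity values below are a standard illustrative choice for that urn, not figures taken from the paper:

```python
def choquet(outcomes, capacity):
    """Choquet integral of an act (state -> payoff) with respect to a
    capacity given as a dict from frozensets of states to weights."""
    states = sorted(outcomes, key=outcomes.get, reverse=True)  # best payoff first
    total, prev, cumulative = 0.0, 0.0, set()
    for s in states:
        cumulative.add(s)
        v = capacity[frozenset(cumulative)]
        total += outcomes[s] * (v - prev)  # payoff times marginal capacity
        prev = v
    return total

# Three-colour urn: red has known probability 1/3; the black/yellow
# composition is unknown, so the capacity assigns B and Y weight 0 alone.
cap = {
    frozenset({'R'}): 1/3, frozenset({'B'}): 0.0, frozenset({'Y'}): 0.0,
    frozenset({'R', 'B'}): 1/3, frozenset({'R', 'Y'}): 1/3,
    frozenset({'B', 'Y'}): 2/3, frozenset({'R', 'B', 'Y'}): 1.0,
}
bet_on_red = {'R': 1.0, 'B': 0.0, 'Y': 0.0}
bet_on_black = {'R': 0.0, 'B': 1.0, 'Y': 0.0}
# Betting on red evaluates to 1/3, betting on black to 0, matching the
# typical Ellsberg preference for the known-probability bet.
```

The non-additivity (ν({B}) + ν({Y}) < ν({B, Y})) is exactly what lets the functional penalise ambiguous bets.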

11.
Neglecting Disaster: Why Don't People Insure Against Large Losses?
This paper provides a theoretical explanation for the common observation that people often fail to purchase insurance against low-probability, high-loss events even when it is offered at favorable premiums. We hypothesize that individuals maximize expected utility but face an explicit or implicit cost to discovering the true probability of rare events. This cost constitutes a threshold that may inhibit purchase but may be offset in several ways by suppliers of insurance and by state regulators.

12.
We examine the choice-of-single-stage-experiment problem (Raiffa and Schlaifer, 1961) under the assumption that the decider's (weak) preference relation satisfies Schmeidler's (1989) or Gilboa's (1987) axiomatization and is thus representable by a nonadditive expected-utility functional, a Choquet integral with respect to a monotone, not necessarily additive, measure on events. The basic properties of information value, certainty equivalent of information cost, net gain of information, and optimal choice of experiment that obtain (La Valle, 1968) when the preference relation satisfies the Anscombe-Aumann (1963) or Savage (1954) axiomatizations continue to obtain in the more general Schmeidler-Gilboa context, provided that there is no incentive to randomize the choice of experiment. When this proviso fails, information value can in general be assigned only to the set of available experiments.

13.
14.
The study of meteorological-disaster event information is an important route to deconstructing meteorological disaster events. According to the channel and agent of transmission, meteorological-disaster event information can be divided into formal and informal information. The diffusion path of formal information is relatively fixed and its release is controllable, but formal channels cannot guarantee its authenticity. During diffusion, formal meteorological-disaster information exhibits nonlinearity, dilution, and fragmentation. A diffusion model of formal information built on this basis indicates that keeping an open attitude toward meteorological-disaster information, innovating in the modes of diffusion, translating specialized information into plain language, strengthening public training and education, and choosing appropriate release strategies will help guide information diffusion and improve emergency response to meteorological disaster events.

15.
Journal of Risk and Uncertainty - When receiving personalized rather than population-based information, agents improve their knowledge about their probability of experiencing adverse events (e.g....

16.
By definition, the subjective probability distribution of a random event is revealed by the ('rational') subject's choice between bets — a view expressed by F. Ramsey, B. De Finetti, L. J. Savage and traceable to E. Borel and, it can be argued, to T. Bayes. Since hypotheses are not observable events, no bet can be made, and paid off, on a hypothesis. The subjective probability distribution of hypotheses (or of a parameter, as in the current 'Bayesian' statistical literature) is therefore a figure of speech, an 'as if', justifiable in the limit. Given a long sequence of previous observations, the subjective posterior probabilities of events still to be observed are derived by using a mathematical expression that would approximate the subjective probability distribution of hypotheses, if these could be bet on. This position was taken by most, but not all, respondents to a 'Round Robin' initiated by J. Marschak after M. H. DeGroot's talk on Stopping Rules presented at the UCLA Interdisciplinary Colloquium on Mathematics in Behavioral Sciences. Other participants: K. Borch, H. Chernoff, R. Dorfman, W. Edwards, T. S. Ferguson, G. Graves, K. Miyasawa, P. Randolph, L. J. Savage, R. Schlaifer, R. L. Winkler. Attention is also drawn to K. Borch's article in this issue.

17.
Montesano, Aldo. Theory and Decision (2001) 51(2-4): 183-195
The Choquet expected utility model deals with nonadditive probabilities (or capacities). Their dependence on the information the decision-maker has about the possibility of the events is taken into account. Two kinds of information are examined: interval information (for instance, the percentage of white balls in an urn is between 60% and 100%) and comparative information (for instance, the information that there are more white balls than black ones). Some implications are shown with regard to the core of the capacity and to two additive measures which can be derived from capacities: the Shapley value and the nucleolus. Interval information bounds prove to be satisfied by all probabilities in the core, but they are not necessarily satisfied by the nucleolus (when the core is empty) or the Shapley value. We must introduce the constrained nucleolus in order for these bounds to be satisfied, while the Shapley value does not seem to be adjustable. On the contrary, comparative information inequalities prove not to be necessarily satisfied by all probabilities in the core, and we must introduce the constrained core in order for these inequalities to be satisfied. However, both the nucleolus and the Shapley value satisfy the comparative information inequalities, and the Shapley value does so more strictly than the nucleolus. This revised version was published online in June 2006 with corrections to the Cover Date.
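One of the additive measures named above, the Shapley value, can be sketched for a capacity encoding the interval-information example (at least 60% white balls). The two-colour urn and the labels W/B are illustrative assumptions:

```python
from itertools import permutations

def shapley(players, capacity):
    """Shapley value of a capacity: each player's marginal contribution
    averaged over all orderings of the players."""
    perms = list(permutations(players))
    value = {p: 0.0 for p in players}
    for order in perms:
        coalition = set()
        for p in order:
            before = capacity[frozenset(coalition)]
            coalition.add(p)
            value[p] += capacity[frozenset(coalition)] - before
    return {p: v / len(perms) for p, v in value.items()}

# Interval information: the share of white balls lies in [0.6, 1.0],
# encoded as nu({W}) = 0.6 (the lower bound) with nu({B}) = 0 alone.
cap = {frozenset(): 0.0, frozenset({'W'}): 0.6,
       frozenset({'B'}): 0.0, frozenset({'W', 'B'}): 1.0}
sv = shapley(['W', 'B'], cap)  # {'W': 0.8, 'B': 0.2}
```

In this tiny example the Shapley value happens to lie inside the interval bound [0.6, 1.0]; the abstract notes that this need not hold in general, which is why the constrained variants are introduced.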

18.
Both Popper and Good have noted that a deterministic microscopic physical approach to probability requires subjective assumptions about the statistical distribution of initial conditions. However, they did not use this fact to define an a priori probability, but rather resorted to the standard observation of repetitive events. This observational probability may be hard to assess for real-life decision problems under uncertainty, which very often are, strictly speaking, non-repetitive, one-time events. This may be a reason for the popularity of subjective probability in decision models. Unfortunately, such subjective probabilities often merely reflect attitudes towards risk, and not the underlying physical processes.
In order to arrive at as objective as possible a definition of probability for one-time events, this paper identifies the origin of randomness in individual chance processes. By focusing on the dynamics of the process, rather than on the (static) device, it is found that any process contains two components: observer-independent (objective) and observer-dependent (subjective). Randomness, if present, arises from the subjective definition of the rules of the game, and is not, as in Popper's propensity interpretation, a physical property of the chance device. In this way, the classical definition of probability is no longer a primitive notion based upon equally possible cases, but is derived from the underlying microscopic processes, plus a subjective, clearly identified estimate of the branching ratios in an event tree. That is, equipossibility is not an intrinsic property of the object/subject system but is forced upon the system via the rules of the game/measurement. Also, the typically undefined concept of symmetry in games of chance is broken down into objective and subjective components. It is found that macroscopic symmetry may hold under microscopic asymmetry.
A similar analysis of urn drawings shows no conceptual difference from other games of chance (contrary to Allais' opinion). Finally, the randomness in Landé's knife problem is not due to objective fortuity (as in Popper's view) but to the rules of the game (the theoretical difficulties arise from intermingling microscopic trajectories and macroscopic events). Dedicated to Professor Maurice Allais on the occasion of the Nobel Prize in Economics awarded December 1988.

19.
We study, from the standpoint of coherence, comparative probabilities on an arbitrary family E of conditional events. Given a binary comparative relation, coherence conditions on it are related to de Finetti's coherent betting system: we consider their connections to the usual properties of comparative probability and to the possibility of numerical representations of the relation. In this context, the numerical reference frame is that of de Finetti's coherent subjective conditional probability, which is not introduced (as in Kolmogoroff's approach) through a ratio between probability measures. Another relevant feature of our approach is that the family E need not have any particular algebraic structure, so that the ordering can initially be given for a few conditional events of interest and then extended by a step-by-step procedure, preserving coherence.

20.
This paper discusses two problems. (a) What happens to the conditional risk premium that a decision maker is willing to pay out of the middle prize in a lottery to avoid uncertainty concerning the middle prize outcome, when the probabilities of other prizes change? (b) What happens to the increase that a decision maker is willing to accept in the probability of an unpleasant outcome in order to avoid ambiguity concerning this probability, when this probability increases? We discuss both problems by using anticipated utility theory, and show that the same conditions on this functional predict behavioral patterns that are consistent both with a natural extension of the concept of diminishing risk aversion and with some experimental findings.
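The anticipated utility (rank-dependent) functional used in this analysis can be sketched as follows; the two-outcome lottery, the linear utility, and the power weighting function are illustrative assumptions, not the paper's conditions:

```python
def anticipated_utility(lottery, u, w):
    """Anticipated (rank-dependent) utility: outcomes ranked best-first,
    each weighted by the increment of the transformed decumulative
    probability w(P(X >= x))."""
    items = sorted(lottery.items(), key=lambda t: t[0], reverse=True)
    total, cum = 0.0, 0.0
    for x, p in items:
        total += u(x) * (w(cum + p) - w(cum))
        cum += p
    return total

# With linear u and w this reduces to expected value; a convex w
# (here p**2, an illustrative choice) underweights the best outcome.
ev = anticipated_utility({10: 0.5, 0: 0.5}, lambda x: x, lambda p: p)        # 5.0
au = anticipated_utility({10: 0.5, 0: 0.5}, lambda x: x, lambda p: p ** 2)  # 2.5
```

Conditions on the shape of `w` are the kind of restriction through which the paper derives its comparative-statics predictions about risk premia and ambiguity.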


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号