20 similar documents found (search time: 78 ms)
1.
The paper reports the results of a survey designed to elicit probability judgements for different types of events: ‘pure chance’
events, for which objective probabilities can be calculated; ‘public’ events, about which there may be some discussion in
social groups and the media; and ‘personal’ events, such as those relating to crime or accidental injury. Even among respondents
deemed to be ‘well-calibrated’ in the domain of pure chance events we find limited sensitivity to the ‘temporal scope’ of
public and personal events—this being especially marked for personal events. We discuss possible reasons and some implications
for policy-related survey work.
2.
The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis and economics.
Motivated by statistical model selection, via the principle of expected utility maximization, Friedman and Sandow (J Mach
Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates
models based on the performance of the optimal strategies that the models suggest. They interpreted their performance measures
in information theoretic terms and provided new generalizations of Shannon entropy and Kullback–Leibler relative entropy and
called them U-entropy and U-relative entropy. In this article, a utility-based criterion for independence of two random variables is defined. Then, Markov’s
inequality for probabilities is extended from the U-entropy viewpoint. Moreover, a lower bound for the U-relative entropy is obtained. Finally, a link between conditional U-entropy and conditional Renyi entropy is derived.
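Under logarithmic utility, Friedman and Sandow's U-entropy and U-relative entropy reduce to their classical counterparts. A minimal sketch of those classical special cases (Shannon entropy and Kullback–Leibler relative entropy, here in bits); the distributions are illustrative:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p||q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # illustrative distributions
skewed  = [0.70, 0.10, 0.10, 0.10]

print(shannon_entropy(uniform))       # 2.0 bits, the maximum on 4 outcomes
print(kl_divergence(skewed, uniform)) # positive; zero only when p == q
```

The U-entropy generalization replaces the logarithm with the optimal-growth value of an expected-utility investor; the log-utility case above is the one that recovers Shannon and Kullback–Leibler.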
3.
Anthony Willing, Theory and Decision 1976, 7(3): 221-229
In ‘Semantic Foundations for the Logic of Preference’ (Rescher, ed., The Logic of Decision and Action, University Press, Pittsburgh, 1967), Nicholas Rescher claims that, on the semantics developed in that paper, a certain principle, call it ‘Q’, turns out to be ‘unacceptable’. I argue, however, that, given certain assumptions that Rescher invokes in that same paper, Q can in fact be shown to be a ‘preference-tautology’, and hence Q should be classified as ‘acceptable’ on Rescher's theory.
4.
Robin Pope, Theory and Decision 1995, 39(3): 241-265
The terms negative utility of gambling and risk aversion conflate three things:
(i) Disutility from the mere act of taking a chance: i.e. negative effects that would not exist if there were no risk or uncertainty, effects which include serious business considerations such as the availability of loans — exemplified in von Neumann and Morgenstern's famous 1947 Appendix;
(ii) Diminishing marginal utility of money — exemplified in Bernoulli and Cramer's expected utility procedure; and
(iii) A preference for safety — exemplified in the rank-dependent utility models of Allais, Lopes, Quiggin and Yaari.
Factor (iii) has not been previously distinguished from (i). Factor (i) is regularly either confused with (ii) or ignored as elusive and unimportant.
5.
Nathalie Etchart-Vincent, Journal of Risk and Uncertainty 2009, 39(1): 45-63
The main goal of the experimental study described in this paper is to investigate the sensitivity of probability weighting
to the payoff structure of the gambling situation—namely the level of consequences at stake and the spacing between them—in the loss domain. For that purpose, three kinds of gambles are introduced: two kinds of homogeneous gambles
(involving either small or large losses), and heterogeneous gambles involving both large and small losses. The findings suggest
that, at least for moderate/high probabilities of loss, both ‘level’ and ‘spacing’ effects reach significance, with the impact
of ‘spacing’ being both opposite to and stronger than the impact of ‘level’. As compared to small-loss gambles, large-loss
gambles appear to enhance probabilistic optimism, while heterogeneous gambles tend to increase pessimism.
6.
Hagen Lindstädt, Theory and Decision 2007, 62(4): 335-353
Sometimes we believe that others receive harmful information. However, Marschak’s value of information framework always assigns
non-negative value under expected utility: it starts from the decision maker’s beliefs – and one can never anticipate information’s
harmfulness for oneself. The impact of decision makers’ capabilities to process information and of their expectations remains
hidden behind the individual and subjective perspective Marschak’s framework assumes. By introducing a second decision maker
as a point of reference, this paper develops a way of evaluating others’ information from a cross-individual, imperfect-expectations perspective for agents maximising expected utility. We define the cross-value of information, which can become negative – then the information is “harmful” from a cross-individual perspective – and we define the (mutual) cost of limited information-processing capabilities and imperfect expectations as an opportunity cost from this same point of reference. The simple relationship between these two expected utility-based
concepts and Marschak’s framework is shown, and we discuss evaluating short-term reactions of stock market prices to new information
as an important domain of valuing others’ information.
7.
On Loss Aversion in Bimatrix Games
In this article three different types of loss aversion equilibria in bimatrix games are studied. Loss aversion equilibria
are Nash equilibria of games where players are loss averse and where the reference points—points below which they consider
payoffs to be losses—are endogenous to the equilibrium calculation. The first type is the fixed point loss aversion equilibrium,
introduced in Shalev (2000; Int. J. Game Theory 29(2):269) under the name of ‘myopic loss aversion equilibrium.’ There, the
players’ reference points depend on the beliefs about their opponents’ strategies. The second type, the maximin loss aversion
equilibrium, differs from the fixed point loss aversion equilibrium in that the reference points are only based on the carriers of the strategies, not on the exact probabilities. In the third type, the safety level loss aversion equilibrium, the reference
points depend on the values of the own payoff matrices. Finally, a comparative statics analysis is carried out of all three
equilibrium concepts in 2 × 2 bimatrix games. It is established when a player benefits from his opponent falsely believing
that he is loss averse.
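The safety-level idea can be illustrated with pure strategies. In the sketch below (a simplification: the paper's definitions cover mixed strategies and equilibrium reference points), the row player's reference point is the maximin value of her own payoff matrix, and payoffs below it count as losses scaled by an illustrative loss-aversion coefficient λ:

```python
# Safety-level reference point for the row player in a bimatrix game
# (pure-strategy simplification; matrix and coefficient are illustrative).
A = [[4, 1],
     [2, 3]]              # row player's payoff matrix
LAMBDA = 2.0              # loss-aversion coefficient

# Reference point: the pure maximin (safety level) of A.
ref = max(min(row) for row in A)

def loss_averse(x, r=ref, lam=LAMBDA):
    """Payoffs below the reference point r are losses, scaled by lam."""
    return x if x >= r else r + lam * (x - r)

transformed = [[loss_averse(x) for x in row] for row in A]
print(ref)          # 2: the worst payoff the row player can guarantee
print(transformed)  # the payoff 1 becomes 2 + 2*(1 - 2) = 0
```

An equilibrium of the transformed game, with reference points recomputed consistently, is the kind of object the three equilibrium concepts in the abstract pin down in different ways.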
8.
The problem of asymmetric information causes a winner’s curse in many environments. Given many unsuccessful attempts to eliminate
it, we hypothesize that some people ‘prefer’ the lotteries underlying the winner’s curse. Study 1 shows that after removing
the hypothesized cause of error, asymmetric information, half the subjects still prefer winner’s curse lotteries, implying
that past efforts to de-bias the winner’s curse may have been more successful than previously recognized, since subjects prefer
these lotteries. Study 2 shows risk-seeking preferences only partially explain lottery preferences, while non-monetary sources
of utility may explain the rest. Study 2 suggests lottery preferences are not independent of context, and offers methods to
reduce the winner’s curse.
9.
An extensive literature overlapping economics, statistical decision theory and finance contrasts expected utility (EU) with
the more recent framework of mean–variance (MV). A basic proposition is that MV follows from EU under the assumption of quadratic
utility. A less recognized proposition, first raised by Markowitz, is that MV is fully justified under EU, if and only if
utility is quadratic. The existing proof of this proposition relies on an assumption from EU, described here as “Buridan’s
axiom” after the French philosopher’s fable of the ass that starved out of indifference between two bales of hay. To satisfy
this axiom, MV must represent not only “pure” strategies, but also their probability mixtures, as points in the (σ, μ) plane. Markowitz and others have argued that probability mixtures are represented sufficiently by (σ, μ) only under quadratic utility, and hence that MV, interpreted as a mathematical re-expression of EU, implies quadratic utility.
We prove a stronger form of this theorem, not involving or contradicting Buridan’s axiom, nor any more fundamental axiom of
utility theory.
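The quadratic-utility direction of the equivalence is a short computation: with u(w) = w − (b/2)w², expected utility depends on the wealth distribution only through its first two moments, hence only through the point (σ, μ):

```latex
\mathbb{E}[u(W)] \;=\; \mathbb{E}[W] - \tfrac{b}{2}\,\mathbb{E}[W^2]
\;=\; \mu - \tfrac{b}{2}\left(\sigma^2 + \mu^2\right),
\qquad \text{since } \mathbb{E}[W^2] = \sigma^2 + \mu^2 .
```

The converse, that this reduction holds for all distributions only under quadratic utility, is the direction the paper's strengthened theorem addresses.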
10.
Patrick Roger, Theory and Decision 2011, 70(1): 27-44
In a recent article entitled “Putting Risk in its Proper Place,” Eeckhoudt and Schlesinger (2006) established a theorem linking
the sign of the n-th derivative of an agent’s utility function to her preferences among pairs of simple lotteries. We characterize these lotteries
and show that, in a given pair, they only differ by their moments of order greater than or equal to n. When the n-th derivative of the utility function is positive (negative) and n is odd (even), the agent prefers a lottery with higher (lower) (n + 2p)-th moments for every positive integer p. This result links the preference for disaggregation of risks across states of
nature to the complete structure of moments preferred by mixed risk averse agents. It can be viewed as a generalization of
a proposition appearing in Ekern (1980) which focused only on the differences in the n-th moments.
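The simplest case of the Eeckhoudt–Schlesinger lottery pairs (n = 3, prudence) can be checked numerically. A prudent agent (u''' > 0) prefers to disaggregate a sure loss k and a zero-mean risk across states rather than suffer them together. A sketch with CARA utility, whose derivatives alternate in sign so the agent is mixed risk averse; the wealth and lottery numbers are illustrative:

```python
import math

def u(w):
    """CARA utility u(w) = -exp(-w): odd derivatives positive,
    even derivatives negative (mixed risk aversion)."""
    return -math.exp(-w)

w0, k = 10.0, 2.0  # initial wealth and sure loss (illustrative)

# Lottery A disaggregates the sure loss k and a zero-mean risk (+/-1)
# across equiprobable states; lottery B aggregates them in one state.
eu_A = 0.5 * u(w0 - k) + 0.5 * (0.5 * u(w0 + 1) + 0.5 * u(w0 - 1))
eu_B = 0.5 * (0.5 * u(w0 - k + 1) + 0.5 * u(w0 - k - 1)) + 0.5 * u(w0)

# The two lotteries share mean and variance; they differ only in moments
# of order three and higher, and the prudent agent prefers A.
print(eu_A > eu_B)
```

This is exactly the pattern the abstract generalizes: the pairs differ only in their moments of order n and above, and the sign of the n-th derivative dictates which member is preferred.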
11.
On Decomposing Net Final Values: EVA, SVA and Shadow Project
A decomposition model of Net Final Values (NFV), named Systemic Value Added (SVA), is proposed for decision-making purposes, based on a systemic approach introduced in Magni [Magni, C. A. (2003), Bulletin of Economic Research 55(2), 149–176; Magni, C. A. (2004) Economic Modelling 21, 595–617]. The model translates the notion of excess profit giving formal expression to a counterfactual alternative available
to the decision maker. Relations with other decomposition models are studied, among which Stewart’s [Stewart, G.B. (1991),
The Quest for Value: The EVA™ Management Guide, Harper Collins, Publishers Inc]. The index here introduced differs from Stewart’s Economic Value Added (EVA) in that it
rests on a different interpretation of the notion of excess profit and is formally connected with the EVA model by means of
a shadow project. The SVA is formally and conceptually threefold, in that it is economic, financial, and accounting-flavoured. Some results are offered, providing necessary and sufficient conditions for decomposing NFV. Relations between a project’s SVA and its shadow project’s EVA are shown; all results of Pressacco and Stucchi [Pressacco, F. and Stucchi, P. (1997), Rivista di Matematica per le Scienze Economiche e Sociali 20, 165–185] are proved by making use of the systemic approach, and the shadow counterparts of those results are also shown.
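For orientation, the standard identity behind EVA-style decomposition (the benchmark the SVA model is contrasted with, not the SVA model itself) is that residual income, discounted at the cost of capital, sums to the project's NPV regardless of the depreciation schedule. A sketch with illustrative figures:

```python
r = 0.10                      # cost of capital (illustrative)
investment = 100.0
cash_flows = [60.0, 60.0]     # end-of-year cash flows (illustrative)
book = [100.0, 50.0, 0.0]     # straight-line book values B_0, B_1, B_2

# Direct NPV.
npv = -investment + sum(cf / (1 + r) ** t
                        for t, cf in enumerate(cash_flows, start=1))

# EVA_t = NOPAT_t - r * B_{t-1}, with NOPAT = cash flow - depreciation.
evas = []
for t, cf in enumerate(cash_flows, start=1):
    depreciation = book[t - 1] - book[t]
    nopat = cf - depreciation
    evas.append(nopat - r * book[t - 1])

pv_eva = sum(eva / (1 + r) ** t for t, eva in enumerate(evas, start=1))

# Discounted EVA reproduces the NPV exactly.
print(round(npv, 4), round(pv_eva, 4))
```

The SVA decomposition differs in how each period's excess profit is defined against a counterfactual alternative, but it must satisfy the same aggregation constraint over the project's life.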
12.
Scientific ideas neither arise nor develop in a vacuum. They are always nurtured against a background of prior, partially conflicting ideas. Systemic hypothesis-testing is the problem of testing scientific hypotheses relative to various systems of background knowledge. This paper shows how the problem of systemic hypothesis-testing (Sys HT) can be systematically expressed as a constrained maximization problem. It is also shown how the error of the third kind (E_III) is fundamental to the theory of Sys HT. The error of the third kind is defined as the probability of having solved the ‘wrong’ problem when one should have solved the ‘right’ problem. This paper shows how E_III can be given both a systematic as well as a systemic treatment. Sys HT gives rise to a whole host of new decision problems, puzzles, and paradoxes.
13.
What are the minimal requirements of rational choice? Arguments from the sequential-decision setting
Katie Siobhan Steele, Theory and Decision 2010, 68(4): 463-487
There are at least two plausible generalisations of subjective expected utility (SEU) theory: cumulative prospect theory (which
relaxes the independence axiom) and Levi’s decision theory (which relaxes at least ordering). These theories call for a re-assessment
of the minimal requirements of rational choice. Here, I consider how an analysis of sequential decision making contributes
to this assessment. I criticise Hammond’s (Economica 44(176):337–350, 1977; Econ Philos 4:292–297, 1988a; Risk, decision and
rationality, 1988b; Theory Decis 25:25–78, 1988c) ‘consequentialist’ argument for the SEU preference axioms, but go on to
formulate a related ‘diachronic Dutch-book-style’ argument that better achieves Hammond’s aims. Some deny the importance of
Dutch-book sure losses, however, in which case, Seidenfeld’s (Econ Philos 4:267–290, 1988a) argument that distinguishes between
theories that relax independence and those that relax ordering is relevant. I unravel Seidenfeld’s argument in light of the
various criticisms of it and show that the crux of the argument is somewhat different and much more persuasive than what others
have taken it to be; the critical issue is the modelling of future choices between ‘indifferent’ decision-tree branches in
the sequential setting. Finally, I consider how Seidenfeld’s conclusions might nonetheless be resisted.
14.
Jeffrey Helzner, Theory and Decision 2009, 66(4): 301-315
Ellsberg (The Quarterly Journal of Economics 75, 643–669 (1961); Risk, Ambiguity and Decision, Garland Publishing (2001)) argued that uncertainty is not reducible to risk. At the center of Ellsberg’s argument lies a
thought experiment that has come to be known as the three-color example. It has been observed that a significant number of
sophisticated decision makers violate the requirements of subjective expected utility theory when they are confronted with
Ellsberg’s three-color example. More generally, such decision makers are in conflict with either the ordering assumption or
the independence assumption of subjective expected utility theory. While a clear majority of the theoretical responses to
these violations have advocated maintaining ordering while relaxing independence, a persistent minority has advocated abandoning
the ordering assumption. The purpose of this paper is to consider a similar dilemma that exists within the context of multiattribute
models, where it arises by considering indeterminacy in the weighting of attributes rather than indeterminacy in the determination
of probabilities as in Ellsberg’s example.
15.
Can ranking techniques elicit robust values?
This paper reports two experiments which examine the use of ranking methods to elicit ‘certainty equivalent’ values. It investigates
whether such methods are able to eliminate the disparities between choice and value which constitute the ‘preference reversal
phenomenon’ and which thereby pose serious problems for both theory and policy application. The results show that ranking
methods are vulnerable to distorting effects of their own, but that when such effects are controlled for, the preference reversal
phenomenon, previously so strong and striking, is very considerably attenuated.
16.
In some situations, a decision is best represented by an incompletely analyzed act: conditionally on a given event A, the consequences of the decision on sub-events are perfectly known and uncertainty becomes probabilizable, whereas the plausibility
of this event itself remains vague and the decision outcome on the complementary event Ā is imprecisely known. In this framework, we study an axiomatic decision model and prove a representation theorem. Resulting
decision criteria aggregate partial evaluations consisting of (i) the conditional expected utility associated with the analyzed
part of the decision, and (ii) the best and worst consequences of its non-analyzed part. The representation theorem is consistent
with a wide variety of decision criteria, which allows for expressing various degrees of knowledge on (A, Ā) and various types of attitude toward ambiguity and uncertainty. This diversity is taken into account by specific models
already existing in the literature. We exploit this fact and propose some particular forms of our model incorporating these
models as sub-models and moreover expressing various types of beliefs concerning the relative plausibility of the analyzed
and the non-analyzed events, ranging from probabilities to complete ignorance, including capacities.
17.
We investigate how choices for uncertain gain and loss prospects are affected by the decision maker’s perceived level of knowledge
about the underlying domain of uncertainty. Specifically, we test whether Heath and Tversky’s (J Risk Uncertain 4:5–28, 1991) competence hypothesis extends from gains to losses. We predict that the commonly-observed preference for high knowledge
over low knowledge prospects for gains reverses for losses. We employ an empirical setup in which participants make hypothetical
choices between gain or loss prospects in which the outcome depends on whether a high or low knowledge event occurs. We infer
decision weighting functions for high and low knowledge events from choices using a representative agent preference model.
For gains, we replicate the results of Kilka and Weber (Manage Sci 47:1712–1726, 2001), finding that decision makers are more attracted to choices that they feel more knowledgeable about. However, for losses,
we find limited support for our extension of the competence effect.
18.
19.
Aymeric Lardon, Theory and Decision 2012, 72(3): 387-411
In cooperative Cournot oligopoly games, it is known that the β-core is equal to the α-core, and both are non-empty if every individual profit function is continuous and concave (Zhao, Games Econ Behav 27:153–168,
1999b). Following Chander and Tulkens (Int J Game Theory 26:379–401, 1997), we assume that firms react to a deviating coalition by choosing individual best reply strategies. We deal with the problem
of the non-emptiness of the induced core, the γ-core, by two different approaches. The first establishes that the associated Cournot oligopoly Transferable Utility (TU)-games
are balanced if the inverse demand function is differentiable and every individual profit function is continuous and concave
on the set of strategy profiles, which is a step beyond Zhao’s core existence result for this class of games. The
second approach, restricted to the class of Cournot oligopoly TU-games with linear cost functions, provides a single-valued
allocation rule in the γ-core called Nash Pro rata (NP)-value. This result generalizes Funaki and Yamato’s (Int J Game Theory 28:157–171, 1999) core existence result from no capacity constraint to asymmetric capacity constraints. Moreover, we provide an axiomatic
characterization of this solution by means of four properties: efficiency, null firm, monotonicity, and non-cooperative fairness.
20.
Wändi Bruine de Bruin, Andrew M. Parker, Jürgen Maurer, Journal of Risk and Uncertainty 2011, 42(2): 145-159
Feelings of invulnerability, seen in judgments of 0% risk, can reflect misunderstandings of risk and risk behaviors, suggesting
increased need for risk communication. However, judgments of 0% risk may be given by individuals who feel invulnerable, and
by individuals who are rounding from small non-zero probabilities. We examined the effect of allowing participants to give
more precise responses in the 0–1% range on the validity of reported probability judgments. Participants assessed probabilities
for getting H1N1 influenza and dying from it conditional on infection, using a 0–100% visual linear scale. Those responding
in the 0–1% range received a follow-up question with more options in that range. This two-step procedure reduced the use of
0% and increased the resolution of responses in the 0–1% range. Moreover, revised probability responses improved predictions
of attitudes and self-reported behaviors. Hence, our two-step procedure allows for more precise and more valid measurement
of perceived invulnerability.
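The two-step procedure is easy to sketch: responses of 0–1% on the coarse scale trigger a follow-up question with finer options inside that range. The option grid below is illustrative, not the instrument used in the study:

```python
FINE = [0.0, 0.1, 0.25, 0.5, 0.75, 1.0]  # illustrative follow-up options (%)

def needs_follow_up(coarse_response):
    """Coarse responses in the 0-1% range get the finer second question."""
    return 0 <= coarse_response <= 1

def two_step(coarse_response, fine_response=None):
    """Return the final recorded probability judgement, in percent."""
    if needs_follow_up(coarse_response):
        assert fine_response in FINE, "second step required in 0-1% range"
        return fine_response
    return float(coarse_response)

print(two_step(30))        # 30.0: no follow-up needed
print(two_step(0, 0.25))   # 0.25: a respondent rounding away from 0%
```

Separating true 0% responses from rounded small probabilities in this way is what improved the predictive validity of the judgements in the study.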