Similar Documents (20 results)
1.
This paper reviews a problem for utility theory: the theory would have an agent who is compelled to play Russian roulette with one revolver or another pay as much to have one bullet removed from a six-shooter loaded with four bullets as he would pay to have a six-shooter loaded with two bullets emptied before playing. A less demanding Bayesian theory is described that would have an agent maximize the expected value of the possible total consequences of his actions. Utility theory is then located within that theory as valid for agents who satisfy certain formal conditions, that is, for agents who are, in terms of the more general theory, indifferent to certain dimensions of risk. Raiffa- and Savage-style arguments for its more general validity are then resisted. Addenda take up implications for game theory and the relations between utilities and values.
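The puzzle can be checked numerically: under expected utility, if the utility of death is independent of money, the indifference price is the same in both cases. A minimal sketch (the wealth level, square-root utility, and death utility of zero are illustrative assumptions, not taken from the paper):

```python
W = 100.0    # initial wealth (illustrative assumption)
DEATH = 0.0  # utility assigned to death (assumption: independent of money paid)

def u(x):
    """Any increasing utility over wealth-if-alive; square root is illustrative."""
    return x ** 0.5

def eu_play(bullets, price):
    """Expected utility of playing with `bullets` in a six-shooter after paying `price`."""
    p_survive = (6 - bullets) / 6
    return p_survive * u(W - price) + (1 - p_survive) * DEATH

def willingness_to_pay(b_from, b_to):
    """Price making the agent indifferent between playing with b_from bullets for
    free and paying to play with b_to bullets, found by bisection."""
    lo, hi = 0.0, W
    target = eu_play(b_from, 0.0)
    for _ in range(100):
        mid = (lo + hi) / 2
        if eu_play(b_to, mid) > target:
            lo = mid
        else:
            hi = mid
    return lo

p_4_to_3 = willingness_to_pay(4, 3)  # remove one of four bullets
p_2_to_0 = willingness_to_pay(2, 0)  # empty a two-bullet revolver
print(p_4_to_3, p_2_to_0)  # the two prices coincide under expected utility
```

Both indifference conditions reduce to the same equation in u, so the two prices agree for any increasing utility function, which is exactly the implication the paper finds troubling.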

2.
The idea that an individual's behavior is a function of its utility or value represents a very common and fundamental assumption in the study of human conduct. This paper attempts to determine the nature of this function more precisely. Adopting a probabilistic conception of human action, it appears that an exponential function perfectly satisfies the empirical as well as the formal conditions which it seems necessary to impose on it initially. Empirical research on behavioral change lends additional support to the function thus constructed.

3.
This paper discusses several concepts that can be used to provide a foundation for a unified theory of rational economic behavior. First, decision-making is defined as a process that takes place with reference to both subjective and objective time, that distinguishes between plans and actions and between information and states, and that explicitly incorporates the collection and processing of information. This conception of decision-making is then related to several important aspects of behavioral economics: the dependence of values on experience, the use of behavioral rules, the occurrence of multiple goals, and environmental feedback. Our conclusions are: (1) the non-transitivity of observed or revealed preferences is a characteristic of learning and hence is to be expected of rational decision-makers; (2) the learning of values through experience suggests the sensibleness of short time horizons and of making choices according to a flexible utility; (3) certain rules of thumb used to allow for risk are closely related to principles of safety-first and can also be based directly on the hypothesis that the feeling of risk (the probability of disaster) is identified with extreme departures from recently executed decisions; (4) the maximization of a hierarchy of goals, or of a lexicographic utility function, is closely related to the search for feasibility and the practice of satisficing; (5) when the dim perception of environmental feedback and the effect of learning on values are acknowledged, the intertemporal optimality of planned decision trajectories is seen to be a characteristic of subjective, not objective, time. This explains why decision-making is so often best characterized by rolling plans.
In short, we find that economic man, like any other, is an existential being whose plans are based on hopes and fears and whose every act involves a leap of faith. This paper is based on a talk presented at the conference New Beginnings in Economics, Akron, Ohio, March 15, 1969. Work on this paper was supported by a grant from the National Science Foundation.

4.
The traditional or orthodox decision rule of maximizing conditional expected utility has recently come under attack by critics who advance alternative causal decision theories. The traditional theory has, however, been defended. And these defenses have in turn been criticized. Here, I examine two objections to such defenses and advance a theory about the dynamics of deliberation (a diachronic theory about the process of deliberation) within the framework of which both objections to the defenses of the traditional theory fail.

5.
This paper considers two fundamental aspects of the analysis of dynamic choices under risk: the issue of the dynamic consistency of the strategies of a non-EU maximizer, and the issue that an individual whose preferences are nonlinear in probabilities may choose a strategy which is, in some appropriate sense, dominated by other strategies. A proposed way of dealing with these problems, due to Karni and Safra and called behavioral consistency, is described. The implications of this notion of behavioral consistency are explored, and it is shown that while the Karni and Safra approach obtains dynamically consistent behavior under nonlinear preferences, it may imply the choice of dominated strategies even in very simple decision trees.

6.
A fixed-agenda social choice correspondence on an outcome set X maps each profile of individual preferences into a nonempty subset of X. If the correspondence satisfies an analogue of Arrow's independence-of-irrelevant-alternatives condition, then either its range contains exactly two alternatives, or else there is at most one individual whose preferences have any bearing on it. This is the case even if the correspondence is not defined for any proper subset of X.

7.
Sometimes conducting an experiment to ascertain the state of a system changes the state of the system being measured. Kahneman & Tversky modelled this effect with support theory. Quantum physics models this effect with probability amplitude mechanics. As this paper shows, probability amplitude mechanics is similar to support theory. Additionally, Viscusi's proposed generalized expected utility model has an analogy in quantum mechanics.

8.
Two institutions that are often implicit or overlooked in noncooperative games are the assumption of Nash behavior to solve a game and the ability to correlate strategies. We consider two behavioral paradoxes: one in which maximin behavior rules out all Nash equilibria (Chicken), and another in which minimax supergame behavior leads to an inefficient outcome compared with the unique stage-game equilibrium (asymmetric Deadlock). Nash outcomes are achieved in both paradoxes by allowing for correlated strategies, even when individual behavior remains minimax or maximin. However, the interpretation of correlation as a public institution differs in each case.
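The role of public correlation can be illustrated in Chicken: a publicly observed coin flip between the two pure Nash outcomes is a correlated equilibrium that neither player wants to disobey. The payoff numbers below are illustrative, not taken from the paper:

```python
# Chicken: actions 0 = swerve, 1 = dare. U[(a1, a2)] = (payoff to 1, payoff to 2).
# Payoff numbers are illustrative, not from the paper.
U = {(0, 0): (3, 3), (0, 1): (1, 4), (1, 0): (4, 1), (1, 1): (0, 0)}

def is_correlated_equilibrium(dist):
    """Obedience check: conditional on any recommended action, no player gains
    in expectation by deviating to another action."""
    for player in (0, 1):
        for rec in (0, 1):      # recommended action
            for dev in (0, 1):  # candidate deviation
                gain = 0.0
                for (a1, a2), p in dist.items():
                    own = a1 if player == 0 else a2
                    other = a2 if player == 0 else a1
                    if own != rec:
                        continue
                    profile_dev = (dev, other) if player == 0 else (other, dev)
                    gain += p * (U[profile_dev][player] - U[(a1, a2)][player])
                if gain > 1e-12:
                    return False
    return True

# Public coin flip between the two pure Nash outcomes of Chicken:
coin_flip = {(1, 0): 0.5, (0, 1): 0.5}
print(is_correlated_equilibrium(coin_flip))  # True
```

Note that maximin play in this matrix is for both players to swerve, which is not a Nash equilibrium; the public coin flip yields Nash-consistent outcomes in the spirit of the abstract's resolution, though the paper's own constructions may differ.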

9.
We analyze the decision of individuals with time-inconsistent preferences to invest in projects yielding either current costs and future benefits or current benefits and future costs. We show that competition between agents for the same project mitigates the tendency to procrastinate on the first type of activity (i.e., to undertake it too late) and to rush on the second (i.e., to undertake it too early). Competition can therefore increase the expected welfare of each individual. By contrast, complementarity of projects exacerbates the tendencies to rush and to procrastinate, and can therefore decrease the expected welfare of each individual.

10.
Theories of economic behavior often use as-if languages: for example, analytical sentences or definitions are used as if they were synthetic, and factual-normative theoretical constructs are used as if they were empirical concepts. Such as-if languages impede the acquisition of knowledge and are apt to encourage wrong assessments of actual research strategies. The author's criticism is first leveled at revealed-preference theory: in this theory, observed behavior is often understood in an empirical sense although it is a pure theoretical construct. Another example can be found in von Mises' representations of market behavior, where theoretical valuations are used to achieve a spurious streamlining of reality. The result: scientists should not ogle at reality if they have nothing to say about it.

11.
Nash's solution of a two-person cooperative game prescribes a coordinated mixed-strategy solution involving Pareto-optimal outcomes of the game. Testing this normative solution experimentally presents problems inasmuch as rather detailed explanations must be given to the subjects of the meaning of threat strategy, strategy mixture, expected payoff, etc. To the extent that it is desired to test the solution using naive subjects, the problem arises of imparting to them a minimal level of understanding of the issues involved in the game without actually suggesting the solution. Experiments were performed to test the properties of the solution of a cooperative two-person game as embodied in three of Nash's four axioms: symmetry, Pareto-optimality, and invariance with respect to positive linear transformations. Of these, the last was definitely discorroborated, suggesting that interpersonal comparison of utilities plays an important part in the negotiations. Some evidence was also found for a conjecture generated by previous experiments, namely that an externally imposed threat (a penalty for non-cooperation) tends to bring the players closer together than the threats generated by the subjects themselves in the process of negotiation.

12.
This paper studies two models of rational behavior under uncertainty whose predictions are invariant under ordinal transformations of utility. The quantile utility model assumes that the agent maximizes some quantile of the distribution of utility. The utility mass model assumes maximization of the probability of obtaining an outcome whose utility is higher than some fixed critical value. Both models satisfy weak stochastic dominance; lexicographic refinements satisfy strong dominance. The study of these utility models suggests a significant generalization of traditional ideas of riskiness and risk preference. We define one action to be riskier than another if the utility distribution of the latter crosses that of the former from below. The single-crossing property is equivalent to a minmax spread of a random variable. With relative risk defined by the single-crossing criterion, the risk preference of a quantile utility maximizer increases with the utility distribution quantile that he maximizes. The risk preference of a utility mass maximizer increases with his critical utility value.
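The two choice rules can be sketched for finite lotteries: a quantile-utility maximizer compares a fixed quantile of each utility distribution, while a utility-mass maximizer compares the probability of exceeding a fixed critical utility. A minimal illustration, where the lotteries A and B and all numbers are invented for illustration rather than taken from the paper:

```python
def utility_quantile(lottery, tau):
    """Smallest utility u with P(U <= u) >= tau, for a finite lottery
    given as (utility, probability) pairs."""
    total = 0.0
    for u, p in sorted(lottery):
        total += p
        if total >= tau - 1e-12:
            return u
    return max(u for u, _ in lottery)

def utility_mass(lottery, critical):
    """Probability of obtaining a utility strictly above the critical value."""
    return sum(p for u, p in lottery if u > critical)

A = [(0, 0.5), (100, 0.5)]  # risky: utility 0 or 100, equally likely
B = [(40, 1.0)]             # safe: utility 40 for sure

# A median (tau = 0.5) maximizer picks the safe lottery B:
print(utility_quantile(A, 0.5), utility_quantile(B, 0.5))  # 0 40
# A utility-mass maximizer with critical value 50 picks A instead:
print(utility_mass(A, 50), utility_mass(B, 50))            # 0.5 0.0
```

Raising tau from 0.5 to 0.6 flips the quantile maximizer's choice from B to A, illustrating the abstract's claim that risk preference increases with the quantile maximized.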

13.
Both Popper and Good have noted that a deterministic microscopic physical approach to probability requires subjective assumptions about the statistical distribution of initial conditions. However, they did not use such a fact for defining an a priori probability, but rather recurred to the standard observation of repetitive events. This observational probability may be hard to assess for real-life decision problems under uncertainty that very often are - strictly speaking - non-repetitive, one-time events. This may be a reason for the popularity of subjective probability in decision models. Unfortunately, such subjective probabilities often merely reflect attitudes towards risk, and not the underlying physical processes.In order to get as objective as possible a definition of probability for one-time events, this paper identifies the origin of randomness in individual chance processes. By focusing on the dynamics of the process, rather than on the (static) device, it is found that any process contains two components: observer-independent (= objective) and observer-dependent (= subjective). Randomness, if present, arises from the subjective definition of the rules of the game, and is not - as in Popper's propensity - a physical property of the chance device. In this way, the classical definition of probability is no longer a primitive notion based upon equally possible cases, but is derived from the underlying microscopic processes, plus a subjective, clearly identified, estimate of the branching ratios in an event tree. That is, equipossibility is not an intrinsic property of the system object/subject but is forced upon the system via the rules of the game/measurement.Also, the typically undefined concept of symmetry in games of chance is broken down into objective and subjective components. It is found that macroscopic symmetry may hold under microscopic asymmetry. 
A similar analysis of urn drawings shows no conceptual difference with other games of chance (contrary to Allais' opinion). Finally, the randomness in Lande's knife problem is not due to objective fortuity (as in Popper's view) but to the rules of the game (the theoretical difficulties arise from intermingling microscopic trajectories and macroscopic events).Dedicated to Professor Maurice Allais on the occasion of the Nobel Prize in Economics awarded December, 1988.  相似文献   

14.
Aumann's (1987) theorem shows that correlated equilibrium is an expression of Bayesian rationality. We extend this result to games with incomplete information.First, we rely on Harsanyi's (1967) model and represent the underlying multiperson decision problem as a fixed game with imperfect information. We survey four definitions of correlated equilibrium which have appeared in the literature. We show that these definitions are not equivalent to each other. We prove that one of them fits Aumann's framework; the agents normal form correlated equilibrium is an expression of Bayesian rationality in games with incomplete information.We also follow a universal Bayesian approach based on Mertens and Zamir's (1985) construction of the universal beliefs space. Hierarchies of beliefs over independent variables (states of nature) and dependent variables (actions) are then constructed simultaneously. We establish that the universal set of Bayesian solutions satisfies another extension of Aumann's theorem.We get the following corollary: once the types of the players are not fixed by the model, the various definitions of correlated equilibrium previously considered are equivalent.  相似文献   

15.
A Comparison of Some Distance-Based Choice Rules in Ranking Environments
We discuss the relationships between positional rules (such as plurality and approval voting, as well as the Borda count) and Dodgson's, Kemeny's, and Litvak's methods of reaching consensus. The discrepancies between methods are seen as the result of different intuitive conceptions of consensus goal states and of ways of measuring distances from them. Saari's geometric methodology is resorted to in analyzing the consensus-reaching methods.
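The positional rules mentioned here can already disagree on small profiles; a minimal sketch (the three-candidate profile is invented for illustration) in which plurality and the Borda count pick different winners:

```python
# Each entry pairs a ranking (best to worst) with its number of voters.
# The profile is invented for illustration, not taken from the paper.
profile = [(('A', 'B', 'C'), 4), (('B', 'C', 'A'), 3), (('C', 'B', 'A'), 2)]

def plurality_winner(profile):
    """Each voter's top-ranked candidate gets one point per voter."""
    score = {}
    for ranking, n in profile:
        score[ranking[0]] = score.get(ranking[0], 0) + n
    return max(score, key=score.get)

def borda_winner(profile):
    """With m candidates, position k (0-based) earns m - 1 - k points per voter."""
    m = len(profile[0][0])
    score = {}
    for ranking, n in profile:
        for pos, cand in enumerate(ranking):
            score[cand] = score.get(cand, 0) + n * (m - 1 - pos)
    return max(score, key=score.get)

print(plurality_winner(profile), borda_winner(profile))  # A B
```

Here A has the most first-place votes, but B beats both rivals in pairwise majority comparisons and wins under Borda, the sort of discrepancy the paper traces to different conceptions of the consensus goal state.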

16.
Choices between gambles show systematic violations of stochastic dominance. For example, most people choose ($6, .05; $91, .03; $99, .92) over ($6, .02; $8, .03; $99, .95), violating dominance. Choices also violate two cumulative independence conditions: (1) if S = (z, r; x, p; y, q) ≻ R = (z, r; x′, p; y′, q), then S″ = (x′, r; y, p + q) ≻ R″ = (x′, r + p; y′, q); (2) if S′ = (x, p; y, q; z′, r) ≺ R′ = (x′, p; y′, q; z′, r), then S‴ = (x, p + q; y′, r) ≺ R‴ = (x′, p; y′, q + r), where 0 < z < x′ < x < y < y′ < z′. Violations contradict any utility theory satisfying transitivity, outcome monotonicity, coalescing, and comonotonic independence. Because rank- and sign-dependent utility theories, including cumulative prospect theory (CPT), satisfy these properties, they cannot explain these results. However, the configural weight model of Birnbaum and McIntosh (1996) predicted the observed violations of stochastic dominance, cumulative independence, and branch independence. This model assumes the utility of a gamble is a weighted average of the outcomes' utilities, where each configural weight is a function of the rank order of the outcome's value among distinct values and that outcome's probability. The configural-weight TAX model, with the same number of parameters as CPT, fit the data of most individuals better than the CPT model.
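The cited dominance violation can be checked mechanically: the second gamble, ($6, .02; $8, .03; $99, .95), first-order stochastically dominates the first, ($6, .05; $91, .03; $99, .92), yet most subjects choose the first. A minimal sketch (helper names are ours, not Birnbaum's):

```python
def cdf(gamble, x):
    """P(outcome <= x) for a finite gamble given as (outcome, probability) pairs."""
    return sum(p for v, p in gamble if v <= x)

def dominates(f, g):
    """First-order stochastic dominance: F(x) <= G(x) everywhere, strictly somewhere."""
    points = sorted({v for v, _ in f} | {v for v, _ in g})
    weakly_lower = all(cdf(f, x) <= cdf(g, x) + 1e-12 for x in points)
    strictly_lower = any(cdf(f, x) < cdf(g, x) - 1e-12 for x in points)
    return weakly_lower and strictly_lower

G_dominant = [(6, .02), (8, .03), (99, .95)]   # the gamble most subjects reject
G_dominated = [(6, .05), (91, .03), (99, .92)]  # the gamble most subjects choose
print(dominates(G_dominant, G_dominated))  # True
```

Checking the CDFs only at the outcome values of the two gambles suffices, since a step-function CDF changes only at its jump points.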

17.
We report an experiment on two treatments of an ultimatum minigame. In one treatment, responders' reactions are hidden from proposers. We observe high rejection rates reflecting responders' intrinsic resistance to unfairness. In the second treatment, proposers are informed, allowing for dynamic effects over eight rounds of play. The higher rejection rates can be attributed to responders' provision of a public good: punishment creates a group reputation for being tough and effectively educates proposers. Since rejection rates with informed proposers drop to the level of the treatment with non-informed proposers, the hypothesis that responders enjoy overt punishment is not supported.

18.
Let (μ1, σ1) and (μ2, σ2) be the mean-standard deviation pairs of two probability distributions on the real line. Mean-variance analyses presume that the preferred distribution depends solely on these pairs, with primary preference given to a larger mean and a smaller variance. This presumption, in conjunction with the assumption that one distribution is better than a second if the mass of the first lies completely to the right of the mass of the second, implies that (μ1, σ1) is preferred to (μ2, σ2) if and only if either μ1 > μ2 or (μ1 = μ2 and σ1 < σ2), provided that the set of distributions is sufficiently rich. The latter provision fails if the outcomes of all distributions lie in a finite interval, but then it is still possible to arrive at more liberal dominance conclusions between (μ1, σ1) and (μ2, σ2). This research was supported by the Office of Naval Research.
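The preference rule stated here, larger mean first with smaller standard deviation as tie-breaker, is a lexicographic comparison; a minimal sketch with invented numbers:

```python
def mv_preferred(a, b):
    """Lexicographic (mu, sigma) rule from the abstract: a is preferred to b
    iff mu_a > mu_b, or mu_a == mu_b and sigma_a < sigma_b."""
    (mu_a, sd_a), (mu_b, sd_b) = a, b
    return mu_a > mu_b or (mu_a == mu_b and sd_a < sd_b)

print(mv_preferred((5.0, 2.0), (4.0, 0.1)))  # True: higher mean wins despite sigma
print(mv_preferred((5.0, 1.0), (5.0, 2.0)))  # True: equal means, smaller sigma wins
```

The striking part of the result is that variance never overrides the mean: no trade-off rate between μ and σ survives the richness assumption.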

19.
The present paper deals with the Galbraithian theory of the managerial firm. Galbraith has stressed corporate size and has questioned the effectiveness of the market demand, technology, and capital market constraints which in conventional theory restrict the size of the firm. Galbraith represents the objectives of the corporation in terms of a conventional lexicographic objective function, with some minimal level of profits (in terms of cash flow) ranked as the dominant objective. Also, in his treatment of the corporate constraints, Galbraith does not move much beyond the current state of knowledge. The assumption of consumer sovereignty has long been relegated to the textbook literature, and the firm's control over the quality of its product (its price elasticity) has been generally recognized. Similarly, it has been known that the capital market is not perfect, so that it is unlikely to constrain the expansion of the firm with some given investor-determined earnings constraint. In his attempt to show the technostructure's ability to plan the rate and direction of technological development, Galbraith did not, however, meet with wide support from empirical research and analysis. It is extremely difficult to test the firm's control over its production technology, and while the few industry studies available can hardly be used to reject the Galbraithian position, there is not sufficient evidence to support a generalization of Galbraith's conjecture. While individually these constraints have been analyzed and discussed in the literature, Galbraith has combined these results and has been able to show that in the industrial state the qualitative laws of economic common sense do not hold. The importance of this conclusion is not only academic: efforts to control corporate allocations through rate controls, antitrust litigation, and other means emanate from the conventional theory of firms and markets and do not fit the industrial state.
In this state corporate size does matter and cannot be treated as random: the larger the corporation, the more perfect the control it assumes over its environment and the higher the efficiency with which it plans its overall operations. We acknowledge the helpful comments of a referee of this journal.

20.
Operational researchers, management scientists, and industrial engineers have been asked by Russell Ackoff to become systems scientists, yet he stated that systems science is not a science (TIMS Interfaces, 2(4), 41). A. C. Fabergé (Science 184, 1330) notes that the original intent was that operational researchers be scientists, trained to observe. Hugh J. Miser (Operations Research 22, 903) views operations research as a science, noting that its progress is indeed cyclic in nature. The present paper delineates explicitly the attributes of simulation methodology. Simulation is shown to be both an art and a science; its methodology, properly used, is founded both on confirmed (validated) observation and on scrutinised (verified) artwork. The paper delineates the existing procedures by which computer-directed models can be cyclically scrutinised and confirmed, and therefore deemed credible. The complexities of the phenomena observed by social scientists are amenable to human understanding through properly applied simulation: the methodology of the scientist of systems (the systemic scientist).
