Similar Documents
20 similar documents found.
1.
This paper develops a theory of optimal provision of commitment devices to people who value both commitment and flexibility and whose preferences differ in the degree of time inconsistency. If time inconsistency is observable, both a planner and a monopolist provide devices that help each person commit to the efficient level of flexibility. However, the combination of unobservable time inconsistency and preference for flexibility causes an adverse‐selection problem. To solve this problem, the monopolist and (possibly) the planner curtail flexibility in the device for a more inconsistent person at both ends of the efficient choice range; moreover, they may have to add unused options to the device for a less inconsistent person and also distort his actual choices. This theory has normative and positive implications for private and public provision of commitment devices.

2.
We study the random Strotz model, a version of the Strotz (1955) model with uncertainty about the nature of the temptation that will strike. We show that the random Strotz representation is unique and characterize a comparative notion of “more temptation averse.” Also, we demonstrate an unexpected connection between the random Strotz model and a generalization of the Gul–Pesendorfer (GP) (2001) model of temptation which allows for the temptation to be uncertain and which we call random GP. In particular, a preference over menus has a random GP representation if and only if it also has a representation via a random Strotz model with sufficiently smooth uncertainty about the intensity of temptation. We also show that choices of menus combined with choices from menus can distinguish the random GP and random Strotz models.

3.
We demonstrate the asymptotic equivalence between commonly used test statistics for out‐of‐sample forecasting performance and conventional Wald statistics. This equivalence greatly simplifies the computational burden of calculating recursive out‐of‐sample test statistics and their critical values. For the case with nested models, we show that the limit distribution, which has previously been expressed through stochastic integrals, has a simple representation in terms of χ²‐distributed random variables and we derive its density. We also generalize the limit theory to cover local alternatives and characterize the power properties of the test.
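To see the two objects being compared, the following hedged simulation sketch (not the authors' code; the data-generating process, the training-window length, and all variable names are illustrative assumptions) computes both a conventional full-sample Wald statistic and a recursive out-of-sample MSE-difference statistic of the McCracken type for nested linear forecasting models:

```python
# Hedged sketch: full-sample Wald statistic vs. a recursive out-of-sample
# MSE-difference statistic for nested forecasting models. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
T, R0 = 500, 200                    # sample size, initial estimation window
x = rng.normal(size=T)              # predictor excluded by the nested model
y = 0.1 * x + rng.normal(size=T)    # weak predictive signal

# (i) Full-sample Wald statistic for H0: b = 0 in y_t = a + b*x_t + e_t
X = np.column_stack([np.ones(T), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
s2 = np.sum((y - X @ b) ** 2) / (T - 2)
wald = b[1] ** 2 / (s2 * np.linalg.inv(X.T @ X)[1, 1])

# (ii) Recursive out-of-sample comparison: mean-only benchmark vs. larger model
e0, e1 = [], []
for t in range(R0, T):
    Xt = np.column_stack([np.ones(t), x[:t]])
    bt = np.linalg.lstsq(Xt, y[:t], rcond=None)[0]
    e1.append(y[t] - (bt[0] + bt[1] * x[t]))   # forecast error, larger model
    e0.append(y[t] - y[:t].mean())             # forecast error, nested model
e0, e1 = np.array(e0), np.array(e1)
oos_f = np.sum(e0 ** 2 - e1 ** 2) / np.mean(e1 ** 2)   # OOS F-type statistic
print(f"Wald: {wald:.2f}   recursive OOS statistic: {oos_f:.2f}")
```

The computational contrast is the point: the Wald statistic needs a single regression, while the recursive statistic refits the model at every forecast origin, and the equivalence result lets the former stand in for the latter.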

4.
We propose a novel technique to boost the power of testing a high‐dimensional vector H0 : θ = 0 against sparse alternatives where the null hypothesis is violated by only a few components. Existing tests based on quadratic forms such as the Wald statistic often suffer from low powers due to the accumulation of errors in estimating high‐dimensional parameters. More powerful tests for sparse alternatives such as thresholding and extreme value tests, on the other hand, require either stringent conditions or bootstrap to derive the null distribution and often suffer from size distortions due to slow convergence. Based on a screening technique, we introduce a “power enhancement component,” which is zero under the null hypothesis with high probability, but diverges quickly under sparse alternatives. The proposed test statistic combines the power enhancement component with an asymptotically pivotal statistic, and strengthens the power under sparse alternatives. The null distribution does not require stringent regularity conditions, and is completely determined by that of the pivotal statistic. The proposed methods are then applied to testing the factor pricing models and validating the cross‐sectional independence in panel data models.
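A minimal sketch of such a construction follows, assuming a slowly diverging screening threshold and a normalized quadratic form as the pivotal component; both tuning choices are illustrative, not the paper's exact recipe:

```python
# Hedged sketch of a power-enhanced test J = J0 + J1: J1 is asymptotically
# pivotal; J0 is a screening component that is zero with high probability
# under H0 but diverges under sparse alternatives. Threshold is assumed.
import numpy as np

def power_enhanced_test(theta_hat, se, n):
    """theta_hat: estimated p-vector; se: componentwise standard errors."""
    p = theta_hat.size
    t_stats = theta_hat / se
    # slowly diverging threshold so that P(J0 = 0 | H0) -> 1
    delta = np.sqrt(np.log(p)) * np.log(np.log(n))
    screened = np.abs(t_stats) > delta
    J0 = np.sqrt(p) * np.sum(t_stats[screened] ** 2)   # power enhancement
    J1 = (np.sum(t_stats ** 2) - p) / np.sqrt(2 * p)   # approx N(0,1) under H0
    return J0 + J1, J0, J1

rng = np.random.default_rng(1)
p, n = 1000, 500
theta = np.zeros(p); theta[:3] = 0.5                   # sparse alternative
theta_hat = theta + rng.normal(scale=1 / np.sqrt(n), size=p)
J, J0, J1 = power_enhanced_test(theta_hat, np.full(p, 1 / np.sqrt(n)), n)
print(f"J = {J:.1f} (enhancement J0 = {J0:.1f}, pivotal J1 = {J1:.2f})")
```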

5.
Owing to the worldwide shortage of deceased‐donor organs for transplantation, living donations have become a significant source of transplant organs. However, not all willing donors can donate to their intended recipients because of medical incompatibilities. These incompatibilities can be overcome by an exchange of donors between patients. For kidneys, such exchanges have become widespread in the last decade with the introduction of optimization and market design techniques to kidney exchange. A small but growing number of liver exchanges have also been conducted. Over the last two decades, a number of transplantation procedures emerged where organs from two living donors are transplanted to a single patient. Prominent examples include dual‐graft liver transplantation, lobar lung transplantation, and simultaneous liver‐kidney transplantation. Exchange, however, has been neither practiced nor introduced in this context. We introduce dual‐donor organ exchange as a novel transplantation modality, and through simulations show that living‐donor transplants can be significantly increased through such exchanges. We also provide a simple theoretical model for dual‐donor organ exchange and introduce optimal exchange mechanisms under various logistical constraints.
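A toy version of the two-way exchange computation can be sketched as a maximum matching on a compatibility graph. Everything below is an assumption for illustration: the blood-type compatibility rule, the requirement that a patient accept both of the other pair's donors, and the simulated population (requires the networkx package):

```python
# Toy sketch: patients, each with two willing but incompatible donors, are
# nodes; an edge joins two patients whose donor pairs can be swapped; a
# maximum matching gives the largest set of two-way exchanges. Assumed data.
import itertools, random
import networkx as nx

random.seed(0)
blood = "O A B AB".split()
compat = {"O": {"O"}, "A": {"O", "A"}, "B": {"O", "B"}, "AB": set(blood)}

patients = [{"type": random.choice(blood),
             "donors": [random.choice(blood), random.choice(blood)]}
            for _ in range(30)]

def feasible(i, j):
    """Patient i can receive both of j's donors' grafts, and vice versa."""
    return (all(d in compat[patients[i]["type"]] for d in patients[j]["donors"])
            and all(d in compat[patients[j]["type"]] for d in patients[i]["donors"]))

G = nx.Graph()
G.add_nodes_from(range(len(patients)))
G.add_edges_from((i, j)
                 for i, j in itertools.combinations(range(len(patients)), 2)
                 if feasible(i, j))
matching = nx.max_weight_matching(G, maxcardinality=True)
print(f"{2 * len(matching)} of {len(patients)} patients transplanted via exchange")
```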

6.
We propose a novel model of stochastic choice: the single‐crossing random utility model (SCRUM). This is a random utility model in which the collection of preferences satisfies the single‐crossing property. We offer a characterization of SCRUMs based on two easy‐to‐check properties: the classic Monotonicity property and a novel condition, Centrality. The identified collection of preferences and associated probabilities is unique. We show that SCRUMs nest both single‐peaked and single‐dipped random utility models and establish a stochastic monotone comparative statics result for SCRUMs.
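As a concrete toy example (all numbers assumed), a SCRUM can be simulated directly: fix a linearly ordered set of alternatives, a collection of single-peaked utilities with increasing peaks (a special case of single-crossing, as noted above), and a probability mass on each preference; the choice probability of x from a menu is the total mass of preferences whose maximum on the menu is x:

```python
# Toy SCRUM: single-peaked utilities with increasing peaks over an ordered
# set of integer alternatives. Assumed illustrative parameters throughout.
import numpy as np

peaks = [1, 2, 4, 5]           # increasing peaks across the ordered preferences
probs = [0.1, 0.4, 0.3, 0.2]   # probability mass on each preference

def choice_prob(menu):
    """rho(x, menu) for each alternative x in the (ordered) menu."""
    rho = {x: 0.0 for x in menu}
    m = np.array(menu)
    for peak, w in zip(peaks, probs):
        rho[menu[int(np.argmax(-(m - peak) ** 2))]] += w
    return rho

print(choice_prob([1, 2, 4]))  # {1: 0.1, 2: 0.4, 4: 0.5}
```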

7.
This paper develops the fixed‐smoothing asymptotics in a two‐step generalized method of moments (GMM) framework. Under this type of asymptotics, the weighting matrix in the second‐step GMM criterion function converges weakly to a random matrix and the two‐step GMM estimator is asymptotically mixed normal. Nevertheless, the Wald statistic, the GMM criterion function statistic, and the Lagrange multiplier statistic remain asymptotically pivotal. It is shown that critical values from the fixed‐smoothing asymptotic distribution are high order correct under the conventional increasing‐smoothing asymptotics. When an orthonormal series covariance estimator is used, the critical values can be approximated very well by the quantiles of a noncentral F distribution. A simulation study shows that statistical tests based on the new fixed‐smoothing approximation are much more accurate in size than existing tests.
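For concreteness, here is a minimal sketch of an orthonormal-series long-run covariance estimator of the kind referenced above, using a cosine basis with K functions; the basis, the choice K = 12, and the AR(1) test data are illustrative assumptions, not the paper's settings:

```python
# Hedged sketch: orthonormal-series long-run variance (LRV) estimator. With K
# fixed, Wald-type tests built on it admit F-type fixed-smoothing critical
# values rather than chi-squared ones. Illustrative parameters throughout.
import numpy as np

def series_lrv(v, K=12):
    """v: (T, m) array of moment conditions; returns an (m, m) LRV estimate."""
    T, m = v.shape
    v = v - v.mean(axis=0)                        # demean the moments
    r = (np.arange(1, T + 1) - 0.5) / T
    omega = np.zeros((m, m))
    for j in range(1, K + 1):
        phi = np.sqrt(2) * np.cos(j * np.pi * r)  # orthonormal cosine basis
        lam = phi @ v / np.sqrt(T)                # (m,) projection coefficient
        omega += np.outer(lam, lam)
    return omega / K                              # average of K outer products

rng = np.random.default_rng(2)
e = rng.normal(size=(500, 2))
v = np.empty_like(e); v[0] = e[0]
for t in range(1, 500):                           # AR(1) moments; true LRV = 4*I
    v[t] = 0.5 * v[t - 1] + e[t]
print(series_lrv(v, K=12))
```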

8.
This note studies some seemingly anomalous results that arise in possibly misspecified, reduced‐rank linear asset‐pricing models estimated by the continuously updated generalized method of moments. When a spurious factor (that is, a factor that is uncorrelated with the returns on the test assets) is present, the test for correct model specification has asymptotic power that is equal to the nominal size. In other words, applied researchers will erroneously conclude that the model is correctly specified even when the degree of misspecification is arbitrarily large. The rejection probability of the test for overidentifying restrictions typically decreases further in underidentified models where the dimension of the null space is larger than 1.

9.
Empirical studies have delivered mixed conclusions on whether the widely acclaimed assertions of lower electronic retail (e‐tail) prices are true and to what extent these prices impact conventional retail prices, profits, and consumer welfare. For goods that require little in‐person pre‐ or postsales support such as CDs, DVDs, and books, we extend Balasubramanian's e‐tailer‐in‐the‐center, spatial, circular market model to examine the impact of a multichannel e‐tailer's presence on retailers' decisions to relocate, on retail prices and profits, and consumer welfare. We demonstrate several counter‐intuitive results. For example, when the disutility of buying online and shipping costs are relatively low, retailers are better off by not relocating in response to an e‐tailer's entry into the retail channel. In addition, such an entry—a multichannel strategy—may lead to increased retail prices and increased profits across the industry. Finally, consumers can be better off with less channel competition. The underlying message is that inferences regarding prices, profits, and consumer welfare critically depend on specifications of the good, disutility and shipping costs versus transportation costs (or more generally, positioning), and competition.

10.
We estimate demand for residential broadband using high‐frequency data from subscribers facing a three‐part tariff. The three‐part tariff makes data usage during the billing cycle a dynamic problem, thus generating variation in the (shadow) price of usage. We provide evidence that subscribers respond to this variation, and we use their dynamic decisions to estimate a flexible distribution of willingness to pay for different plan characteristics. Using the estimates, we simulate demand under alternative pricing and find that usage‐based pricing eliminates low‐value traffic. Furthermore, we show that the costs associated with investment in fiber‐optic networks are likely recoverable in some markets, but that there is a large gap between social and private incentives to invest.
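The shadow-price mechanics can be illustrated with a small dynamic program: under a three-part tariff (fixed fee, usage allowance, overage price), the shadow price of a gigabyte mid-cycle is the derivative of the value function with respect to remaining allowance. The sketch below is not the paper's estimator; the tariff parameters, utility function, and taste-shock process are all assumed:

```python
# Hedged DP sketch: value of remaining allowance over a 30-day billing cycle.
# The shadow price of usage is dV/d(allowance). All parameters assumed.
import numpy as np

days, allowance, p_over = 30, 50.0, 10.0      # cycle length, GB cap, $/GB overage
grid = np.linspace(0, allowance, 201)         # remaining-allowance grid
usage = np.linspace(0, 5, 51)                 # feasible daily usage choices
V = np.zeros(grid.size)                       # terminal value: cycle ends

def flow_utility(q, shock):
    return shock * q - 0.5 * q ** 2           # quadratic daily utility (assumed)

shocks = [1.0, 3.0]                           # low/high taste shock, equally likely
for _ in range(days):                         # backward induction over the cycle
    EV, V_new = V, np.zeros_like(V)
    for i, a in enumerate(grid):
        vals = []
        for s in shocks:                      # usage chosen after seeing the shock
            over = np.maximum(usage - a, 0.0)               # GB beyond allowance
            cont = np.interp(np.maximum(a - usage, 0.0), grid, EV)
            vals.append(np.max(flow_utility(usage, s) - p_over * over + cont))
        V_new[i] = np.mean(vals)
    V = V_new

shadow = np.gradient(V, grid)                 # numerical dV/d(allowance)
print(f"shadow price of the marginal GB at cycle start: {shadow[-1]:.2f}")
```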

11.
This paper extends the long‐term factorization of the stochastic discount factor introduced and studied by Alvarez and Jermann (2005) in discrete‐time ergodic environments and by Hansen and Scheinkman (2009) and Hansen (2012) in Markovian environments to general semimartingale environments. The transitory component discounts at the stochastic rate of return on the long bond and is factorized into discounting at the long‐term yield and a positive semimartingale that extends the principal eigenfunction of Hansen and Scheinkman (2009) to the semimartingale setting. The permanent component is a martingale that accomplishes a change of probabilities to the long forward measure, the limit of T‐forward measures. The change of probabilities from the data‐generating to the long forward measure absorbs the long‐term risk‐return trade‐off and interprets the latter as the long‐term risk‐neutral measure.
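Schematically, and in our own assumed notation rather than the paper's, the factorization described above can be written as follows, with λ the long-term yield, π a positive (eigenfunction-type) semimartingale, and M the permanent martingale component that changes measure to the long forward measure:

```latex
S_t \;=\; \underbrace{e^{-\lambda t}\,\frac{\pi_0}{\pi_t}}_{\text{transitory}}
\;\times\; \underbrace{M_t}_{\text{permanent (martingale)}}
```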

12.
In this paper, we provide efficient estimators and honest confidence bands for a variety of treatment effects including local average (LATE) and local quantile treatment effects (LQTE) in data‐rich environments. We can handle very many control variables, endogenous receipt of treatment, heterogeneous treatment effects, and function‐valued outcomes. Our framework covers the special case of exogenous receipt of treatment, either conditional on controls or unconditionally as in randomized control trials. In the latter case, our approach produces efficient estimators and honest bands for (functional) average treatment effects (ATE) and quantile treatment effects (QTE). To make informative inference possible, we assume that key reduced‐form predictive relationships are approximately sparse. This assumption allows the use of regularization and selection methods to estimate those relations, and we provide methods for post‐regularization and post‐selection inference that are uniformly valid (honest) across a wide range of models. We show that a key ingredient enabling honest inference is the use of orthogonal or doubly robust moment conditions in estimating certain reduced‐form functional parameters. We illustrate the use of the proposed methods with an application to estimating the effect of 401(k) eligibility and participation on accumulated assets. The results on program evaluation are obtained as a consequence of more general results on honest inference in a general moment‐condition framework, which arises from structural equation models in econometrics. Here, too, the crucial ingredient is the use of orthogonal moment conditions, which can be constructed from the initial moment conditions. We provide results on honest inference for (function‐valued) parameters within this general framework where any high‐quality, machine learning methods (e.g., boosted trees, deep neural networks, random forest, and their aggregated and hybrid versions) can be used to learn the nonparametric/high‐dimensional components of the model. These include a number of supporting auxiliary results that are of major independent interest: namely, we (1) prove uniform validity of a multiplier bootstrap, (2) offer a uniformly valid functional delta method, and (3) provide results for sparsity‐based estimation of regression functions for function‐valued outcomes.
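The orthogonal-moment idea can be illustrated with a minimal cross-fitting sketch for the partially linear model Y = θD + g(X) + ε. This is an illustration of the general principle, not the paper's LATE/LQTE machinery; the random-forest learners, the five-fold split, and the simulated data are assumptions:

```python
# Hedged sketch: cross-fitted, orthogonal (partialled-out) estimate of theta.
# Residualizing both Y and D on the controls makes the final moment condition
# insensitive to first-order errors in the machine-learning steps.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(3)
n, p = 2000, 20
X = rng.normal(size=(n, p))
D = X[:, 0] + rng.normal(size=n)               # treatment depends on controls
Y = 0.5 * D + np.sin(X[:, 0]) + rng.normal(size=n)

res_y, res_d = np.empty(n), np.empty(n)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    ml_y = RandomForestRegressor(n_estimators=200, random_state=0)
    ml_d = RandomForestRegressor(n_estimators=200, random_state=0)
    res_y[test] = Y[test] - ml_y.fit(X[train], Y[train]).predict(X[test])
    res_d[test] = D[test] - ml_d.fit(X[train], D[train]).predict(X[test])

theta = res_d @ res_y / (res_d @ res_d)        # orthogonal moment estimate
psi = (res_y - theta * res_d) * res_d          # influence function
se = np.std(psi) / (np.mean(res_d ** 2) * np.sqrt(n))
print(f"theta = {theta:.3f} +/- {1.96 * se:.3f}  (true value 0.5)")
```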

13.
Strategic choice data from a carefully chosen set of ring‐network games are used to obtain individual‐level estimates of higher‐order rationality. The experimental design exploits a natural exclusion restriction that is considerably weaker than the assumptions underlying alternative designs in the literature. In our data set, 93 percent of subjects are rational, 71 percent are rational and believe others are rational, 44 percent are rational and hold second‐order beliefs that others are rational, and 22 percent are rational and hold at least third‐order beliefs that others are rational.

14.
To study the behavior of agents who are susceptible to temptation in infinite horizon consumption problems under uncertainty, we define and characterize dynamic self‐control (DSC) preferences. DSC preferences are recursive and separable. In economies with DSC agents, equilibria exist but may be inefficient; in such equilibria, steady state consumption is independent of initial endowments and increases in self‐control. Increasing the preference for commitment while keeping self‐control constant increases the equity premium. Removing nonbinding constraints changes equilibrium allocations and prices. Debt contracts can be sustained even if the only feasible punishment for default is the termination of the contract.
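For reference, the static Gul–Pesendorfer (2001) self-control representation on which the dynamic (DSC) extension builds can be stated as follows, where u is commitment utility, v is temptation utility, and the second term is the self-control cost of resisting the most tempting option in the menu A:

```latex
U(A) \;=\; \max_{x \in A}\,\bigl[\,u(x) + v(x)\,\bigr] \;-\; \max_{y \in A} v(y)
```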

15.
We propose a semiparametric two‐step inference procedure for a finite‐dimensional parameter based on moment conditions constructed from high‐frequency data. The population moment conditions take the form of temporally integrated functionals of state‐variable processes that include the latent stochastic volatility process of an asset. In the first step, we nonparametrically recover the volatility path from high‐frequency asset returns. The nonparametric volatility estimator is then used to form sample moment functions in the second‐step GMM estimation, which requires the correction of a high‐order nonlinearity bias from the first step. We show that the proposed estimator is consistent and asymptotically mixed Gaussian and propose a consistent estimator for the conditional asymptotic variance. We also construct a Bierens‐type consistent specification test. These infill asymptotic results are based on a novel empirical‐process‐type theory for general integrated functionals of noisy semimartingale processes.
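A simplified sketch of the first step and of the nonlinearity-bias issue follows: recover the spot variance path by local realized variance, then form an integrated moment such as the quarticity ∫σ⁴ dt. The volatility path, window length, and the Jacod–Rosenbaum-type correction used here are illustrative assumptions; the paper's second-step GMM layer is omitted:

```python
# Hedged sketch: spot variance by local realized variance, plus a
# nonlinearity bias correction for the integrated moment g(c) = c^2.
import numpy as np

rng = np.random.default_rng(4)
n = 23400                                  # one trading day of 1-second returns
dt = 1.0 / n
sigma = 0.2 * np.exp(0.5 * np.sin(np.linspace(0, 4, n)))   # latent vol path
r = sigma * np.sqrt(dt) * rng.normal(size=n)               # log-returns

k = 100                                    # local window (k -> inf, k/n -> 0)
c_hat = np.array([np.sum(r[i:i + k] ** 2) / (k * dt)
                  for i in range(0, n - k, k)])            # spot variance path

raw = np.sum(c_hat ** 2) * (k * dt)        # plug-in estimate of integral(c^2 dt)
# Bias correction: c_hat is noisy and g(c) = c^2 is convex, so the plug-in is
# biased upward by roughly (1/2) g''(c) Var(c_hat) = 2 c^2 / k per block.
corrected = raw - (k * dt) * np.sum(2.0 * c_hat ** 2 / k)
truth = np.sum(sigma ** 4) * dt
print(f"raw {raw:.5f}, corrected {corrected:.5f}, truth {truth:.5f}")
```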

16.
We develop an econometric methodology to infer the path of risk premia from a large unbalanced panel of individual stock returns. We estimate the time‐varying risk premia implied by conditional linear asset pricing models where the conditioning includes both instruments common to all assets and asset‐specific instruments. The estimator uses simple weighted two‐pass cross‐sectional regressions, and we show its consistency and asymptotic normality under increasing cross‐sectional and time series dimensions. We address consistent estimation of the asymptotic variance by hard thresholding, and testing for asset pricing restrictions induced by the no‐arbitrage assumption. We derive the restrictions given by a continuum of assets in a multi‐period economy under an approximate factor structure robust to asset repackaging. The empirical analysis on returns for about ten thousand U.S. stocks from July 1964 to December 2009 shows that risk premia are large and volatile in crisis periods. They exhibit large positive and negative strays from time‐invariant estimates, follow the macroeconomic cycles, and do not match risk premia estimates on standard sets of portfolios. The asset pricing restrictions are rejected for a conditional four‐factor model capturing market, size, value, and momentum effects.
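The two-pass backbone of such estimators can be sketched in a few lines (Fama–MacBeth style); the conditioning-instrument, weighting, and thresholding layers of the paper are omitted, and the one-factor data below are simulated assumptions:

```python
# Hedged two-pass sketch: time-series regressions recover betas, then
# period-by-period cross-sectional regressions recover the risk premium path.
import numpy as np

rng = np.random.default_rng(5)
T, N, K = 240, 500, 1
f = rng.normal(0.5, 2.0, size=(T, K))               # factor with premium 0.5
beta = rng.normal(1.0, 0.3, size=(N, K))            # asset exposures
R = f @ beta.T + rng.normal(scale=4.0, size=(T, N)) # excess returns

# Pass 1: time-series regression of each asset on the factors
F = np.column_stack([np.ones(T), f])
B = np.linalg.lstsq(F, R, rcond=None)[0][1:].T      # (N, K) estimated betas

# Pass 2: cross-sectional regression of returns on betas, period by period
X = np.column_stack([np.ones(N), B])
lam = np.array([np.linalg.lstsq(X, R[t], rcond=None)[0] for t in range(T)])
premia, se = lam.mean(axis=0), lam.std(axis=0) / np.sqrt(T)
print(f"risk premium: {premia[1]:.3f} (se {se[1]:.3f}), true value 0.5")
```

Averaging the period-by-period λ's gives the unconditional premium; the paper's estimator instead tracks the whole time path and its variation across the cycle.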

17.
We provide general conditions under which principal‐agent problems with either one or multiple agents admit mechanisms that are optimal for the principal. Our results cover as special cases pure moral hazard and pure adverse selection. We allow multidimensional types, actions, and signals, as well as both financial and non‐financial rewards. Our results extend to situations in which there are ex ante or interim restrictions on the mechanism, and allow the principal to have decisions in addition to choosing the agent's contract. Beyond measurability, we require no a priori restrictions on the space of mechanisms. It is not unusual for randomization to be necessary for optimality and so it (should be and) is permitted. Randomization also plays an essential role in our proof. We also provide conditions under which some forms of randomization are unnecessary.

18.
We explore the impact of private information in sealed‐bid first‐price auctions. For a given symmetric and arbitrarily correlated prior distribution over values, we characterize the lowest winning‐bid distribution that can arise across all information structures and equilibria. The information and equilibrium attaining this minimum leave bidders indifferent between their equilibrium bids and all higher bids. Our results provide lower bounds for bids and revenue with asymmetric distributions over values. We also report further characterizations of revenue and bidder surplus including upper bounds on revenue. Our work has implications for the identification of value distributions from data on winning bids and for the informationally robust comparison of alternative auction mechanisms.

19.
Mechanism design enables a social planner to obtain a desired outcome by leveraging the players' rationality and their beliefs. It is thus a fundamental, but yet unproven, intuition that the higher the level of rationality of the players, the better the set of obtainable outcomes. In this paper, we prove this fundamental intuition for players with possibilistic beliefs, a model long considered in epistemic game theory. Specifically:
• We define a sequence of monotonically increasing revenue benchmarks for single‐good auctions, G0 ≤ G1 ≤ G2 ≤ ⋯, where each Gi is defined over the players' beliefs and G0 is the second‐highest valuation (i.e., the revenue benchmark achieved by the second‐price mechanism).
• We (1) construct a single, interim individually rational, auction mechanism that, without any clue about the rationality level of the players, guarantees revenue Gk if all players have rationality levels ≥ k+1, and (2) prove that no such mechanism can guarantee revenue even close to Gk when at least two players are at most level‐k rational.
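A toy check of the baseline benchmark G0, under the assumption of i.i.d. uniform valuations and truthful bidding: the second-price mechanism's revenue is exactly the second-highest valuation, which is the floor the higher benchmarks Gk improve upon:

```python
# Toy illustration (assumed uniform valuations): second-price revenue = G0.
import numpy as np

rng = np.random.default_rng(6)
vals = rng.uniform(0, 1, size=(100_000, 3))    # 3 bidders, i.i.d. U[0,1] values
revenue = np.sort(vals, axis=1)[:, -2]         # price paid = second-highest value
print(f"mean revenue (G0 benchmark): {revenue.mean():.3f}")   # ~0.5 for 3 bidders
```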

20.
Modeling intergenerational altruism is crucial to evaluate the long‐term consequences of current decisions, and requires a set of principles guiding such altruism. We axiomatically develop a theory of pure, direct altruism: Altruism is pure if it concerns the total utility (rather than the mere consumption utility) of future generations, and direct if it directly incorporates the utility of all future generations. Our axioms deliver a new class of altruistic, forward‐looking preferences, whose weight put on the consumption of a future generation generally depends on the consumption of other generations. The only preferences lacking this dependence correspond to the quasi‐hyperbolic discounting model, which our theory characterizes. Our approach provides a framework to analyze welfare in the presence of altruistic preferences and addresses technical challenges stemming from the interdependent nature of such preferences.
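For reference, the quasi-hyperbolic (β–δ) criterion characterized in the no-dependence case is standardly written as follows (our notation): generation t puts full weight on its own consumption utility and weight βδᵏ on that of generation t+k:

```latex
U_t \;=\; u(c_t) \;+\; \beta \sum_{k=1}^{\infty} \delta^{k}\, u(c_{t+k}),
\qquad 0 < \beta \le 1,\;\; 0 < \delta < 1
```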

