Similar Documents
20 similar documents found.
1.
We revisit the comparison of mathematical programming with equilibrium constraints (MPEC) and nested fixed point (NFXP) algorithms for estimating structural dynamic models by Su and Judd (2012). Their implementation of the nested fixed point algorithm used successive approximations to solve the inner fixed point problem (NFXP‐SA). We redo their comparison using the more efficient version of NFXP proposed by Rust (1987), which combines successive approximations and Newton–Kantorovich iterations to solve the fixed point problem (NFXP‐NK). We show that MPEC and NFXP are similar in speed and numerical performance when the more efficient NFXP‐NK variant is used.
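The poly-algorithm idea behind NFXP‐NK can be sketched for a generic contraction mapping: run a few cheap successive-approximation sweeps, then switch to Newton–Kantorovich iterations on G(x) = x − F(x). The Python sketch below is only an illustration of that idea, not Rust's implementation; the function names, tolerances, and toy contraction are hypothetical.

```python
import numpy as np

def solve_fixed_point(F, x0, n_sa=20, tol=1e-12, max_newton=50):
    """Hypothetical NFXP-NK style solver for x = F(x), F a contraction on R^n."""
    x = np.asarray(x0, dtype=float)
    # Stage 1: successive approximations (cheap, linearly convergent).
    for _ in range(n_sa):
        x = F(x)
    # Stage 2: Newton-Kantorovich on G(x) = x - F(x) (locally quadratically convergent).
    n = x.size
    for _ in range(max_newton):
        g = x - F(x)
        if np.max(np.abs(g)) < tol:
            break
        # Finite-difference Jacobian of G; a real implementation would use
        # the analytic derivative of the fixed-point operator.
        J = np.empty((n, n))
        h = 1e-7
        for j in range(n):
            e = np.zeros(n); e[j] = h
            J[:, j] = ((x + e) - F(x + e) - g) / h
        x = x - np.linalg.solve(J, g)
    return x

# Toy contraction: x = 0.5 * cos(x) + b
b = np.array([0.2, 0.4, 0.6])
F = lambda x: 0.5 * np.cos(x) + b
print(solve_fixed_point(F, np.zeros(3)))
```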

2.
The widely used estimator of Berry, Levinsohn, and Pakes (1995) produces estimates of consumer preferences from a discrete‐choice demand model with random coefficients, market‐level demand shocks, and endogenous prices. We derive numerical theory results characterizing the properties of the nested fixed point algorithm used to evaluate the objective function of BLP's estimator. We discuss problems with typical implementations, including cases that can lead to incorrect parameter estimates. As a solution, we recast estimation as a mathematical program with equilibrium constraints, which can be faster and which avoids the numerical issues associated with nested inner loops. The advantages are even more pronounced for forward‐looking demand models where the Bellman equation must also be solved repeatedly. Several Monte Carlo and real‐data experiments support our numerical concerns about the nested fixed point approach and the advantages of constrained optimization. For static BLP, the constrained optimization approach can be as much as ten to forty times faster for large‐dimensional problems with many markets.
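The nested inner loop discussed above is the BLP contraction that inverts observed market shares for mean utilities. A minimal sketch under a deliberately simplified random-coefficients logit (one product characteristic, simulated taste draws; all names, tolerances, and data are hypothetical) is:

```python
import numpy as np

def simulated_shares(delta, x, sigma, nu):
    """Random-coefficients logit shares with utility delta_j + sigma * nu_i * x_j."""
    u = delta[None, :] + sigma * nu[:, None] * x[None, :]          # draws x products
    ev = np.exp(u)
    return (ev / (1.0 + ev.sum(axis=1, keepdims=True))).mean(axis=0)

def blp_contraction(s_obs, x, sigma, nu, tol=1e-13, max_iter=100000):
    """Inner fixed point: delta <- delta + log(s_obs) - log(s(delta)).
    A loose tolerance here propagates error into the outer estimation loop."""
    delta = np.log(s_obs) - np.log(1.0 - s_obs.sum())              # logit starting values
    for _ in range(max_iter):
        s = simulated_shares(delta, x, sigma, nu)
        step = np.log(s_obs) - np.log(s)
        delta = delta + step
        if np.max(np.abs(step)) < tol:
            break
    return delta

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0])                  # one product characteristic
nu = rng.standard_normal(500)                  # simulation draws
s_obs = np.array([0.20, 0.15, 0.10])           # observed market shares
print(blp_contraction(s_obs, x, sigma=0.5, nu=nu))
```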

3.
We formulate and solve a range of dynamic models of constrained credit/insurance that allow for moral hazard and limited commitment. We compare them to full insurance and exogenously incomplete financial regimes (autarky, saving only, borrowing and lending in a single asset). We develop computational methods based on mechanism design, linear programming, and maximum likelihood to estimate, compare, and statistically test these alternative dynamic models with financial/information constraints. Our methods can use both cross‐sectional and panel data and allow for measurement error and unobserved heterogeneity. We estimate the models using data on Thai households running small businesses from two separate samples. We find that in the rural sample, the exogenously incomplete saving only and borrowing regimes provide the best fit using data on consumption, business assets, investment, and income. Family and other networks help consumption smoothing there, as in a moral hazard constrained regime. In contrast, in urban areas, we find mechanism design financial/information regimes that are decidedly less constrained, with the moral hazard model fitting the combined business and consumption data best. We perform numerous robustness checks in both the Thai data and in Monte Carlo simulations and compare our maximum likelihood criterion with results from other metrics and data not used in the estimation. A prototypical counterfactual policy evaluation exercise using the estimation results is also featured.

4.
Structural breaks are pervasive in macroeconomics, and the quality of a model's estimators is sensitive to the size of the estimation sample. For time-varying parameter models, this paper establishes a criterion for selecting the rolling window width: the window size is chosen by minimizing an approximate quadratic loss function of the estimator while maximizing the Manhattan distance between the subsample estimators, thereby balancing the two conflicting goals of estimator accuracy and time variation. Monte Carlo simulations show that the proposed method works under a variety of structural-break scenarios, applies to time-varying parameter models with both linear and nonlinear relationships, and is robust in each case. Applying the method to the detection of structural breaks in China's financial network markedly improves on the results of conventional window-selection methods.
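The abstract only summarizes the selection criterion, so the toy Python sketch below illustrates the trade-off it describes rather than the paper's actual criterion: candidate window widths are scored by a crude accuracy proxy (average squared standard error of rolling OLS slopes) minus a tracking reward (mean Manhattan distance between adjacent-window estimates). The proxies, weight, and simulated data are all hypothetical.

```python
import numpy as np

def rolling_fit(y, x, h):
    """No-intercept OLS slope and its standard error in each window of width h."""
    slopes, ses = [], []
    for t in range(len(y) - h + 1):
        xs, ys = x[t:t + h], y[t:t + h]
        beta = xs @ ys / (xs @ xs)
        resid = ys - beta * xs
        slopes.append(beta)
        ses.append(np.sqrt(resid @ resid / (h - 1) / (xs @ xs)))
    return np.array(slopes), np.array(ses)

def choose_window(y, x, candidates, lam=1.0):
    """Toy criterion: penalize estimation noise, reward captured time variation."""
    scores = {}
    for h in candidates:
        beta, se = rolling_fit(y, x, h)
        scores[h] = np.mean(se ** 2) - lam * np.mean(np.abs(np.diff(beta)))
    return min(scores, key=scores.get), scores

rng = np.random.default_rng(1)
T = 400
x = rng.standard_normal(T)
beta_true = np.where(np.arange(T) < 200, 1.0, -1.0)      # one structural break
y = beta_true * x + 0.5 * rng.standard_normal(T)
h_star, scores = choose_window(y, x, candidates=[20, 40, 80, 160])
print(h_star, scores)
```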

5.
For large multi‐division firms, coordinating procurement policies across multiple divisions to leverage volume discounts from suppliers based on firm‐wide purchasing power can yield millions of dollars of savings in procurement costs. Coordinated procurement entails deciding which suppliers to use to meet each division's purchasing needs and sourcing preferences so as to minimize overall purchasing, logistics, and operational costs. Motivated by this tactical procurement planning problem facing a large industrial products manufacturer, we propose an integrated optimization model that simultaneously considers both firm‐wide volume discounts and divisional ordering and inventory costs. To effectively solve this large‐scale integer program, we develop and apply a tailored solution approach that exploits the problem structure to generate tight bounds. We identify several classes of valid inequalities to strengthen the linear programming relaxation, establish polyhedral properties of these inequalities, and develop both a cutting‐plane method and a sequential rounding heuristic procedure. Extensive computational tests for realistic problems demonstrate that our integrated sourcing model and solution method are effective and can provide significant economic benefits. The integrated approach yields average savings of 7.5% in total procurement costs compared to autonomous divisional policies, and our algorithm generates near‐optimal solutions (within 0.75% of optimality) within reasonable computational time.

6.
We study the sensitivity of investment to cash flow conditional on measures of q in an adjustment costs framework with costly external finance. We present a benchmark model in which this conditional investment–cash flow sensitivity increases monotonically with the cost premium for external finance, for firms in a financially constrained regime. Using simulated data, we show that this pattern is found in linear regressions that relate investment rates to measures of both cash flow and average q. We also derive a structural equation for investment from the first‐order conditions of our model, and show that this can be estimated directly.

7.
Many service industries use revenue management to balance demand and capacity. The assumption of risk-neutrality lies at the heart of the classical approaches, which aim at maximizing expected revenue. In this paper, we give a comprehensive overview of the existing approaches, most of which were only recently developed, and discuss the need to take risk-averse decision makers into account. We then present a heuristic that maximizes conditional value-at-risk (CVaR). Although CVaR has become increasingly popular in finance and actuarial science due to its beneficial properties, this risk measure has not yet been considered in the context of revenue management. We are able to efficiently solve the optimization problem inherent in CVaR by taking advantage of specific structural properties that allow us to reformulate this optimization problem as a continuous knapsack problem. In order to demonstrate the applicability and robustness of our approach, we conduct a simulation study that shows that the new approach can significantly improve the risk profile in various scenarios.
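For readers unfamiliar with the risk measure, CVaR at level alpha is the expected revenue over the worst alpha-fraction of scenarios. The sketch below shows a standard sample-based CVaR estimate for simulated booking revenues; it illustrates the objective only, not the paper's continuous-knapsack reformulation, and the price, capacity, and demand figures are hypothetical.

```python
import numpy as np

def cvar(revenues, alpha=0.05):
    """Conditional value-at-risk of revenue at level alpha:
    the mean over the worst alpha share of scenarios."""
    r = np.sort(np.asarray(revenues, dtype=float))
    k = max(1, int(np.ceil(alpha * r.size)))
    return r[:k].mean()

rng = np.random.default_rng(0)
demand = rng.poisson(80, size=10000)                  # simulated bookings
revenues = 100.0 * np.minimum(demand, 90)             # price 100, capacity 90
print("expected revenue:", revenues.mean())
print("CVaR(5%):", cvar(revenues, 0.05))
```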

8.
The ability to accurately forecast and control inpatient census, and thereby workloads, is a critical and long‐standing problem in hospital management. The majority of current literature focuses on optimal scheduling of inpatients, but largely ignores the process of accurate estimation of the trajectory of patients throughout the treatment and recovery process. The result is that current scheduling models are optimizing based on inaccurate input data. We developed a Clustering and Scheduling Integrated (CSI) approach to capture patient flows through a network of hospital services. CSI functions by clustering patients into groups based on similarity of trajectory using a novel semi‐Markov model (SMM)‐based clustering scheme, as opposed to clustering by patient attributes as in previous literature. Our methodology is validated by simulation and then applied to real patient data from a partner hospital where we demonstrate that it outperforms a suite of well‐established clustering methods. Furthermore, we demonstrate that extant optimization methods achieve significantly better results on key hospital performance measures under CSI, compared with traditional estimation approaches, increasing elective admissions by 97% and utilization by 22% compared to 30% and 8% using traditional estimation techniques. From a theoretical standpoint, the SMM‐clustering is a novel approach applicable to any temporal‐spatial stochastic data that is prevalent in many industries and application areas.

9.
This paper proposes a general approach and a computationally convenient estimation procedure for the structural analysis of auction data. Considering first‐price sealed‐bid auction models within the independent private value paradigm, we show that the underlying distribution of bidders' private values is identified from observed bids and the number of actual bidders without any parametric assumptions. Using the theory of minimax, we establish the best rate of uniform convergence at which the latent density of private values can be estimated nonparametrically from available data. We then propose a two‐step kernel‐based estimator that converges at the optimal rate.
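The two-step idea can be sketched for symmetric independent private values: first estimate the bid distribution G and density g with an empirical CDF and a kernel, then recover each pseudo private value from the first-order condition v = b + G(b)/((I − 1) g(b)). The sketch below omits the trimming and boundary handling needed for the optimal-rate results; the bandwidth rule and toy data are hypothetical.

```python
import numpy as np

def pseudo_values(bids, n_bidders, bandwidth=None):
    """Two-step recovery of private values from first-price auction bids."""
    b = np.asarray(bids, dtype=float)
    n = b.size
    h = bandwidth or 1.06 * b.std() * n ** (-1 / 5)      # rule-of-thumb bandwidth
    # Step 1: empirical CDF and Gaussian-kernel density of bids.
    G = np.array([(b <= bi).mean() for bi in b])
    g = np.array([np.exp(-0.5 * ((bi - b) / h) ** 2).sum() / (n * h * np.sqrt(2 * np.pi))
                  for bi in b])
    # Step 2: invert the first-order condition of equilibrium bidding.
    return b + G / ((n_bidders - 1) * g)

# Toy data: values U(0,1), I = 3 bidders, equilibrium bid = (I-1)/I * value.
rng = np.random.default_rng(0)
I, n_auctions = 3, 2000
values = rng.uniform(0, 1, n_auctions)
bids = (I - 1) / I * values
print(pseudo_values(bids, I)[:5])
print(values[:5])
```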

10.
This paper considers issues related to estimation, inference, and computation with multiple structural changes that occur at unknown dates in a system of equations. Changes can occur in the regression coefficients and/or the covariance matrix of the errors. We also allow arbitrary restrictions on these parameters, which permits the analysis of partial structural change models, common breaks that occur in all equations, breaks that occur in a subset of equations, and so forth. The method of estimation is quasi‐maximum likelihood based on Normal errors. The limiting distributions are obtained under more general assumptions than previous studies. For testing, we propose likelihood ratio type statistics to test the null hypothesis of no structural change and to select the number of changes. Structural change tests with restrictions on the parameters can be constructed to achieve higher power when prior information is present. For computation, an algorithm for an efficient procedure is proposed to construct the estimates and test statistics. We also introduce a novel locally ordered breaks model, which allows the breaks in different equations to be related yet not occurring at the same dates.
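Although the paper works with systems of equations and quasi-likelihood, the core computational device for locating break dates can be illustrated in a single-equation least-squares setting: precompute the fit of every admissible segment and use dynamic programming over break partitions. The mean-shift sketch below is only illustrative; the minimum segment length and simulated data are hypothetical.

```python
import numpy as np

def segment_ssr(y):
    """Sum of squared residuals of a mean-only fit on y[i:j] for all i < j."""
    T = len(y)
    ssr = np.full((T, T + 1), np.inf)
    for i in range(T):
        for j in range(i + 1, T + 1):
            seg = y[i:j]
            ssr[i, j] = ((seg - seg.mean()) ** 2).sum()
    return ssr

def optimal_breaks(y, m, h=5):
    """Dynamic-programming search for m breaks minimizing total SSR,
    with minimum segment length h (mean-shift model, single equation)."""
    T = len(y)
    ssr = segment_ssr(y)
    cost = {0: {t: ssr[0, t] for t in range(h, T + 1)}}   # cost[k][t]: best SSR for y[:t] with k breaks
    argmin = {}
    for k in range(1, m + 1):
        cost[k], argmin[k] = {}, {}
        for t in range((k + 1) * h, T + 1):
            cands = {s: cost[k - 1][s] + ssr[s, t] for s in range(k * h, t - h + 1)}
            s_star = min(cands, key=cands.get)
            cost[k][t], argmin[k][t] = cands[s_star], s_star
    breaks, t = [], T                                      # backtrack the break dates
    for k in range(m, 0, -1):
        t = argmin[k][t]
        breaks.append(t)
    return sorted(breaks)

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 60), rng.normal(3, 1, 40), rng.normal(-1, 1, 50)])
print(optimal_breaks(y, m=2))   # expect breaks near 60 and 100
```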

11.
We propose a framework for out‐of‐sample predictive ability testing and forecast selection designed for use in the realistic situation in which the forecasting model is possibly misspecified, due to unmodeled dynamics, unmodeled heterogeneity, incorrect functional form, or any combination of these. Relative to the existing literature (Diebold and Mariano (1995) and West (1996)), we introduce two main innovations: (i) We derive our tests in an environment where the finite sample properties of the estimators on which the forecasts may depend are preserved asymptotically. (ii) We accommodate conditional evaluation objectives (can we predict which forecast will be more accurate at a future date?), which nest unconditional objectives (which forecast was more accurate on average?), which have been the sole focus of the previous literature. As a result of (i), our tests have several advantages: they capture the effect of estimation uncertainty on relative forecast performance, they can handle forecasts based on both nested and nonnested models, they allow the forecasts to be produced by general estimation methods, and they are easy to compute. Although both unconditional and conditional approaches are informative, conditioning can help fine‐tune the forecast selection to current economic conditions. To this end, we propose a two‐step decision rule that uses current information to select the best forecast for the future date of interest. We illustrate the usefulness of our approach by comparing forecasts from leading parameter‐reduction methods for macroeconomic forecasting using a large number of predictors.

12.
This paper proposes a method to address the longstanding problem of lack of monotonicity in estimation of conditional and structural quantile functions, also known as the quantile crossing problem (Bassett and Koenker (1982)). The method consists in sorting or monotone rearranging the original estimated non‐monotone curve into a monotone rearranged curve. We show that the rearranged curve is closer to the true quantile curve than the original curve in finite samples, establish a functional delta method for rearrangement‐related operators, and derive functional limit theory for the entire rearranged curve and its functionals. We also establish validity of the bootstrap for estimating the limit law of the entire rearranged curve and its functionals. Our limit results are generic in that they apply to every estimator of a monotone function, provided that the estimator satisfies a functional central limit theorem and the function satisfies some smoothness conditions. Consequently, our results apply to estimation of other econometric functions with monotonicity restrictions, such as demand, production, distribution, and structural distribution functions. We illustrate the results with an application to estimation of structural distribution and quantile functions using data on Vietnam veteran status and earnings.
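On a grid of quantile indices, the rearrangement step amounts to sorting the estimated curve. A minimal sketch with a toy non-monotone estimate (the curve below is hypothetical, not from the paper's application):

```python
import numpy as np

u = np.linspace(0.01, 0.99, 99)                 # quantile indices
q_hat = 1.0 + 2.0 * u + 0.3 * np.sin(25 * u)    # toy estimate that is not monotone
q_rearranged = np.sort(q_hat)                   # monotone rearrangement over the grid

# The rearranged curve is weakly increasing by construction and, per the paper,
# weakly closer to the true quantile curve in finite samples.
assert np.all(np.diff(q_rearranged) >= 0)
print(np.sum(np.diff(q_hat) < 0), "crossings before, 0 after")
```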

13.
We present a new biclustering algorithm to simultaneously discover tissue classes and identify a set of genes that characterize these classes well from DNA microarray data sets. We employ a combinatorial optimization approach where the objective is to simultaneously identify an interesting set of genes and a partition of the array samples that optimizes a certain score based on a novel color island statistic. While this optimization problem is NP-complete in general, we are effectively able to solve problems of interest to optimality using a branch-and-bound algorithm. We have tested the algorithm on a 30-sample Cutaneous T-cell Lymphoma data set; it was able to almost perfectly discriminate short-term survivors from long-term survivors and normal controls. Another useful feature of our method is that it can easily handle missing expression data.

14.
Utility systems such as power and communication systems regularly experience significant damage and loss of service during hurricanes. A primary damage mode for these systems is failure of wooden utility poles that support conductors and communication lines. In this article, we present an approach for combining structural reliability models for utility poles with observed data on pole performance during past hurricanes. This approach, based on Bayesian updating, starts from an imperfect but informative prior and updates this prior with observed performance data. We consider flexural and foundation failure mechanisms in the prior, acknowledging that these are an incomplete, but still informative, subset of the possible failure mechanisms for utility poles during hurricanes. We show how a model‐based prior can be updated with observed failure data, using pole failure data from Hurricane Katrina as a case study. The results of this integration of model‐based estimates and observed performance data then offer a more informative starting point for power system performance estimation for hurricane conditions.
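The updating idea can be illustrated with a conjugate Beta-Binomial toy example in which a reliability-model prior for the pole failure probability at a reference wind load is revised with observed pole outcomes from a storm. The article works with richer fragility models and failure mechanisms; the prior strength and counts below are hypothetical.

```python
import numpy as np
from scipy import stats

# Prior from a structural reliability model (hypothetical): mean failure
# probability 0.08 at the reference gust speed, expressed as a Beta prior
# with an effective sample size of 50 "pseudo-poles".
prior_mean, prior_n = 0.08, 50
a0, b0 = prior_mean * prior_n, (1 - prior_mean) * prior_n

# Observed performance data from a past hurricane (hypothetical counts).
failed, surveyed = 31, 240

# Conjugate Beta-Binomial update.
a1, b1 = a0 + failed, b0 + (surveyed - failed)
posterior = stats.beta(a1, b1)
print("posterior mean failure probability:", posterior.mean())
print("90% credible interval:", posterior.ppf([0.05, 0.95]))
```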

15.
This paper considers the optimization of linearly constrained stochastic problems in which only noisy measurements of the loss function are available. We propose a method which combines a genetic algorithm (GA) with simultaneous perturbation stochastic approximation (SPSA) to solve linearly constrained stochastic problems. The hybrid method uses GA to search for the optimum over the whole feasible region, and SPSA to search for the optimum in a local region. During the GA and SPSA search process, the hybrid method generates new solutions according to the gradient projection direction, which is calculated based on the active constraints. Because the gradient projection method projects the search direction into the subspace tangent to the active constraints, it ensures that new solutions satisfy all constraints strictly. This paper applies the hybrid method to nine typical constrained optimization problems and the results coincide with the ideal solutions cited in the references. The numerical results reveal that the hybrid method is suitable for multimodal constrained stochastic optimization problems. Moreover, each solution generated by the hybrid method satisfies all linear constraints strictly.
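The SPSA-with-gradient-projection ingredient can be sketched as below; the GA outer search, the paper's gain sequences, and its feasibility safeguards are not reproduced, and the noisy quadratic test problem, tolerances, and constants are hypothetical.

```python
import numpy as np

def spsa_projected(loss, x0, A, b, n_iter=2000, a=0.05, c=0.1, seed=0):
    """SPSA for a noisy loss subject to A @ x <= b (x0 assumed feasible).
    The estimated gradient is projected onto the null space of the
    near-active constraints, and steps are backtracked to stay feasible."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ck, ak = c / k ** 0.101, a / k ** 0.602
        delta = rng.choice([-1.0, 1.0], size=x.size)             # Rademacher perturbation
        g_hat = (loss(x + ck * delta) - loss(x - ck * delta)) / (2 * ck * delta)
        active = A @ x >= b - 1e-6                               # near-active constraints
        if active.any():
            Aa = A[active]
            P = np.eye(x.size) - Aa.T @ np.linalg.pinv(Aa @ Aa.T) @ Aa
            g_hat = P @ g_hat                                    # gradient projection
        step = ak * g_hat
        while np.any(A @ (x - step) > b) and np.max(np.abs(step)) > 1e-12:
            step *= 0.5                                          # backtrack into the feasible set
        if not np.any(A @ (x - step) > b):
            x = x - step
    return x

# Toy problem: minimize a noisy ||x - (0.9, 0.9)||^2 subject to x >= 0, x1 + x2 <= 1.
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
noise = np.random.default_rng(1)
loss = lambda x: np.sum((x - 0.9) ** 2) + 0.01 * noise.standard_normal()
print(spsa_projected(loss, np.array([0.2, 0.2]), A, b))          # expect roughly (0.5, 0.5)
```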

16.
We develop and evaluate a modeling approach for making periodic review production and distribution decisions for a supply chain in the processed food industry. The supply chain faces several factors, including multiple products, multiple warehouses, production constraints, high transportation costs, and limited storage at the production facility. This problem is motivated by the supply chain structure at Amy's Kitchen, one of the leading producers of natural and organic foods in the United States. We develop an enhanced myopic two‐stage approach for this problem. The first stage determines the production plan and uses a heuristic, and the second stage determines the warehouse allocation plan and uses a non‐linear optimization model. This two‐stage approach is repeated every period and incorporates look‐ahead features to improve its performance in future periods. We validate our model using actual data from one factory at Amy's Kitchen and compare the performance of our model to that of the actual operation. We find that our model significantly reduces both inventory levels and stockouts relative to those of the actual operation. In addition, we identify a lower bound on the total costs for all feasible solutions to the problem and measure the effectiveness of our model against this lower bound. We perform sensitivity analysis on some key parameters and assumptions of our modeling approach.

17.
We propose a new methodology for structural estimation of infinite horizon dynamic discrete choice models. We combine the dynamic programming (DP) solution algorithm with the Bayesian Markov chain Monte Carlo algorithm into a single algorithm that solves the DP problem and estimates the parameters simultaneously. As a result, the computational burden of estimating a dynamic model becomes comparable to that of a static model. Another feature of our algorithm is that even though the number of grid points on the state variable is small per solution‐estimation iteration, the number of effective grid points increases with the number of estimation iterations. This is how we help ease the “curse of dimensionality.” We simulate and estimate several versions of a simple model of entry and exit to illustrate our methodology. We also prove that under standard conditions, the parameters converge in probability to the true posterior distribution, regardless of the starting values.

18.
Discrete‐choice models are widely used to model consumer purchase behavior in assortment optimization and revenue management. In many applications, each customer segment is associated with a consideration set that represents the set of products that customers in this segment consider for purchase. The firm has to make a decision on what assortment to offer at each point in time without the ability to identify the customer's segment. A linear program called the Choice‐based Deterministic Linear Program (CDLP) has been proposed to determine these offer sets. Unfortunately, its size grows exponentially in the number of products and it is NP‐hard to solve when the consideration sets of the segments overlap. The Segment‐based Deterministic Concave Program with some additional consistency equalities (SDCP+) is an approximation of CDLP that provides an upper bound on CDLP's optimal objective value. SDCP+ can be solved in a fraction of the time required to solve CDLP and often achieves the same optimal objective value. This raises the question of under what conditions one can guarantee equivalence of CDLP and SDCP+. In this study, we obtain a structural result to this end, namely that if the segment consideration sets overlap with a certain tree structure or if they are fully nested, CDLP can be equivalently replaced with SDCP+. We give a number of examples from the literature where this tree structure arises naturally in modeling customer behavior.

19.
We propose a systematic approach that incorporates fuzzy set theory in conjunction with portfolio matrices to assist managers in reaching a better understanding of the overall competitiveness of their business portfolios. Integer linear programming is also accommodated in the proposed integrated approach to help select strategic plans by using the results derived from the preceding portfolio analysis and other financial data. The proposed integrated approach is designed from a strategy‐oriented perspective for portfolio management at the corporate level. It has the advantage of dealing with the uncertainty decision makers face in evaluation, providing a technique that captures their varying levels of confidence and optimism. Furthermore, integer linear programming is used because it offers an effective quantitative method for managers to allocate constrained resources optimally among proposed strategies. An illustration from a real‐world situation demonstrates the integrated approach. Although a particular portfolio matrix model has been adopted in our research, the procedure proposed here can be modified to incorporate other portfolio matrices.
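The resource-allocation step can be illustrated as a small knapsack-style selection: each candidate strategy carries a score from the portfolio evaluation and a budget requirement, and the firm picks the subset maximizing total score within the budget. The sketch below uses brute-force enumeration rather than an integer programming solver, and the strategy names, scores, and budgets are hypothetical.

```python
from itertools import combinations

# Hypothetical strategies: (name, score from fuzzy portfolio evaluation, budget required)
strategies = [("expand SBU-A", 0.82, 40), ("divest SBU-B", 0.35, 10),
              ("new market entry", 0.74, 55), ("joint venture", 0.61, 30),
              ("process upgrade", 0.48, 20)]
budget = 100

best_value, best_set = 0.0, ()
for r in range(len(strategies) + 1):
    for subset in combinations(strategies, r):
        cost = sum(s[2] for s in subset)
        value = sum(s[1] for s in subset)
        if cost <= budget and value > best_value:
            best_value, best_set = value, subset

print("selected:", [s[0] for s in best_set], "total score:", round(best_value, 2))
```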

20.
There is a conventional wisdom in economics that public debt can serve as a substitute for private credit if private borrowing is limited. The purpose of this paper is to show that, while a government could in principle use such a policy to fully relax borrowing limits, this is not generally optimal. In our economy, agents invest in a short‐term asset, a long‐term asset, and government bonds. Agents are subject to idiosyncratic liquidity shocks prior to the maturity of the long‐term asset. We show that a high public debt policy fully relaxes private borrowing limits and is suboptimal. This is because agents expecting such a policy respond by investing less than is socially optimal in the short asset, which can protect them in the event of a liquidity shock. The optimal policy is more constrained and it induces a wedge between the technological rate of return on the long asset and the rate of return on bonds. In such a regime, agents subject to liquidity shocks are also borrowing constrained, and this expectation of being borrowing constrained induces them to invest the optimal level in the short asset.
