Similar Articles
20 similar articles found (search time: 15 ms)
1.
Estimating structural models is often viewed as computationally difficult, an impression partly due to a focus on the nested fixed‐point (NFXP) approach. We propose a new constrained optimization approach for structural estimation. We show that our approach and the NFXP algorithm solve the same estimation problem, and yield the same estimates. Computationally, our approach can have speed advantages because we do not repeatedly solve the structural equation at each guess of structural parameters. Monte Carlo experiments on the canonical Zurcher bus‐repair model demonstrate that the constrained optimization approach can be significantly faster.
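The contrast between the two formulations is easier to see in code. Below is a minimal, hypothetical sketch, not the authors' implementation and not the Zurcher bus model: a toy scalar structural equation with a Bernoulli likelihood. NFXP re-solves the structural equation at every candidate θ; the constrained (MPEC-style) approach treats the value v as a decision variable and imposes the structural equation as an equality constraint.

```python
import numpy as np
from scipy.optimize import minimize

# Toy model (hypothetical): the scalar value v solves the contraction
# v = theta - beta*log(1 + exp(v)), and each observed choice is Bernoulli
# with probability p = 1/(1 + exp(-v)).
rng = np.random.default_rng(0)
beta, theta_true, n = 0.9, 0.5, 500

def T(v, theta):
    return theta - beta * np.log1p(np.exp(v))

def solve_v(theta, tol=1e-12):
    """Inner loop: solve the structural equation by successive approximation."""
    v = 0.0
    for _ in range(10_000):
        v_new = T(v, theta)
        if abs(v_new - v) < tol:
            break
        v = v_new
    return v_new

def neg_loglik(p, y):
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

y = rng.binomial(1, 1.0 / (1.0 + np.exp(-solve_v(theta_true))), size=n)

# NFXP: the structural equation is re-solved at every trial value of theta.
nfxp = minimize(lambda t: neg_loglik(1.0 / (1.0 + np.exp(-solve_v(t[0]))), y),
                x0=[0.0], method="Nelder-Mead")

# Constrained approach: optimize jointly over (theta, v), with the structural
# equation imposed as an equality constraint, so no inner fixed-point loop runs.
mpec = minimize(lambda z: neg_loglik(1.0 / (1.0 + np.exp(-z[1])), y),
                x0=[0.0, 0.0], method="SLSQP",
                constraints=[{"type": "eq",
                              "fun": lambda z: z[1] - T(z[1], z[0])}])

print("NFXP estimate:", nfxp.x[0], " constrained estimate:", mpec.x[0])
```

Both routes should return essentially the same estimate, mirroring the paper's point that the two approaches solve the same statistical problem and differ only in how the structural equation is handled computationally.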

2.
We revisit the comparison of mathematical programming with equilibrium constraints (MPEC) and nested fixed point (NFXP) algorithms for estimating structural dynamic models by Su and Judd (2012). Their implementation of the nested fixed point algorithm used successive approximations to solve the inner fixed point problem (NFXP‐SA). We redo their comparison using the more efficient version of NFXP proposed by Rust (1987), which combines successive approximations and Newton–Kantorovich iterations to solve the fixed point problem (NFXP‐NK). We show that MPEC and NFXP are similar in speed and numerical performance when the more efficient NFXP‐NK variant is used.
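A compact sketch of the poly-algorithm idea on a made-up vector contraction (the map, dimensions, and switching rule are illustrative, not Rust's implementation): cheap successive approximations bring the iterate close to the fixed point, after which Newton–Kantorovich steps on V − T(V) = 0 converge quadratically.

```python
import numpy as np

# A toy vector contraction V = T(V) standing in for the inner fixed-point problem.
rng = np.random.default_rng(1)
m, beta = 5, 0.95
P = rng.dirichlet(np.ones(m), size=m)          # row-stochastic "transition" matrix
u = rng.normal(size=m)

def T(V):
    return u + beta * P @ np.log1p(np.exp(V))

def jacobian(V):
    # dT/dV = beta * P * diag(sigmoid(V)) for this particular map
    return beta * P * (1.0 / (1.0 + np.exp(-V)))

def nfxp_nk(V0, n_sa=50, tol=1e-13):
    """Successive approximations first, then Newton-Kantorovich steps."""
    V = V0
    for _ in range(n_sa):                      # cheap contraction steps to get close
        V = T(V)
    for _ in range(50):                        # quadratically convergent NK steps
        step = np.linalg.solve(np.eye(m) - jacobian(V), V - T(V))
        V = V - step
        if np.max(np.abs(step)) < tol:
            break
    return V

V_star = nfxp_nk(np.zeros(m))
print("fixed-point residual:", np.max(np.abs(V_star - T(V_star))))
```

Switching to Newton–Kantorovich once the contraction has done the cheap work is what makes NFXP‐NK far faster than pure successive approximations, which is the crux of the comparison revisited here.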

3.
This paper considers the estimation problem of structural models for which empirical restrictions are characterized by a fixed point constraint, such as structural dynamic discrete choice models or models of dynamic games. We analyze a local condition under which the nested pseudo likelihood (NPL) algorithm converges to a consistent estimator, and derive its convergence rate. We find that the NPL algorithm may not necessarily converge to a consistent estimator when the fixed point mapping does not have a local contraction property. To address the issue of divergence, we propose alternative sequential estimation procedures that can converge to a consistent estimator even when the NPL algorithm does not.

4.
This paper proposes a new nested algorithm (NPL) for the estimation of a class of discrete Markov decision models and studies its statistical and computational properties. Our method is based on a representation of the solution of the dynamic programming problem in the space of conditional choice probabilities. When the NPL algorithm is initialized with consistent nonparametric estimates of conditional choice probabilities, successive iterations return a sequence of estimators of the structural parameters which we call K-stage policy iteration estimators. We show that the sequence includes as extreme cases a Hotz–Miller estimator (for K=1) and Rust's nested fixed point estimator (in the limit when K→∞). Furthermore, the asymptotic distribution of all the estimators in the sequence is the same and equal to that of the maximum likelihood estimator. We illustrate the performance of our method with several examples based on Rust's bus replacement model. Monte Carlo experiments reveal a trade-off between finite sample precision and computational cost in the sequence of policy iteration estimators.
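In rough notation (my transcription, with Ψ denoting the policy-iteration mapping on conditional choice probabilities; the exact regularity conditions are in the paper), the K-stage recursion starting from a consistent nonparametric CCP estimate P̂_0 is

\[
\hat\theta_K = \arg\max_{\theta} \sum_{i} \log \Psi\big(\theta, \hat P_{K-1}\big)(a_i \mid x_i),
\qquad
\hat P_K = \Psi\big(\hat\theta_K, \hat P_{K-1}\big),
\]

so K = 1 gives a Hotz–Miller-type two-step estimator, while iterating to a fixed point of Ψ (K → ∞) recovers Rust's nested fixed point estimator.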

5.
In this article, we study the performance of multi‐echelon inventory systems with intermediate, external product demand in one or more upper echelons. This type of problem is of general interest in inventory theory and of particular importance in supply chain systems with both end‐product demand and spare parts (subassemblies) demand. The multi‐echelon inventory system considered here is a combination of assembly and serial stages with direct demand from more than one node. The presence of multiple sources of demand leads to interesting inventory allocation problems. The demand and capacity at each node are considered stochastic in nature. A fixed supply and manufacturing lead time is used between the stages. We develop mathematical models for these multi‐echelon systems, which describe the inventory dynamics and allow simulation of the system. A simulation‐based inventory optimization approach is developed to search for the best base‐stock levels for these systems. The gradient estimation technique of perturbation analysis is used to derive sample‐path estimators. We consider four allocation schemes: lexicographic with priority to intermediate demand, lexicographic with priority to downstream demand, predetermined proportional allocation, and proportional allocation. Based on the numerical results, we find that no single allocation policy is appropriate under all conditions. Depending on the combinations of variability and utilization, we identify conditions under which use of certain allocation policies across the supply chain results in lower costs. Further, we determine how selection of an inappropriate allocation policy in the presence of scarce on‐hand inventory could result in downstream nodes facing acute shortages. Consequently, we provide insight on why good allocation policies work well under differing sets of operating conditions.

6.
In this research, we apply robust optimization (RO) to the problem of locating facilities in a network facing uncertain demand over multiple periods. We consider a multi‐period fixed‐charge network location problem for which we find (1) the number of facilities, their location and capacities, (2) the production in each period, and (3) allocation of demand to facilities. Using the RO approach we formulate the problem to include alternate levels of uncertainty over the periods. We consider two models of demand uncertainty: demand within a bounded and symmetric multi‐dimensional box, and demand within a multi‐dimensional ellipsoid. We evaluate the potential benefits of applying the RO approach in our setting using an extensive numerical study. We show that the alternate models of uncertainty lead to very different solution network topologies, with the model with box uncertainty set opening fewer, larger facilities. Through sample path testing, we show that both the box and ellipsoidal uncertainty cases can provide small but significant improvements over the solution to the problem when demand is deterministic and set at its nominal value. For changes in several environmental parameters, we explore the effects on the solution performance.
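To make the two uncertainty sets concrete, here is a small generic numeric illustration (the data are made up and this is not the paper's model). For a fixed cost vector c, the worst case of c·d over demands d = d̄ + δ∘u with u in the box ‖u‖∞ ≤ 1 is c·d̄ + Σ_i |c_i| δ_i, while over the ellipsoid ‖u‖₂ ≤ Ω it is c·d̄ + Ω·‖δ∘c‖₂, which is typically less conservative.

```python
import numpy as np

# Illustrative data only: nominal demands, maximal deviations, and unit costs.
d_bar = np.array([100.0, 80.0, 120.0, 60.0])   # nominal demand per customer zone
delta = np.array([ 20.0, 15.0,  30.0, 10.0])   # maximal deviation per zone
c     = np.array([  2.0,  3.0,   1.5,  2.5])   # cost per unit of demand served
Omega = 1.0                                    # ellipsoid "budget" parameter

nominal   = c @ d_bar
worst_box = nominal + np.abs(c) @ delta                  # box: u in [-1, 1]^n
worst_ell = nominal + Omega * np.linalg.norm(c * delta)  # ellipsoid: ||u||_2 <= Omega

print(f"nominal cost        : {nominal:.1f}")
print(f"worst case, box     : {worst_box:.1f}")
print(f"worst case, ellipsoid: {worst_ell:.1f}")
```

Because the ellipsoid rules out every deviation hitting its extreme simultaneously, its worst case is smaller than the box worst case, which is one intuition for why the two sets lead to such different network topologies in the study.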

7.
We consider the inventory management problem of a firm reacting to potential change points in demand, which we define as known epochs at which the demand distribution may (or may not) abruptly change. Motivating examples include global news events (e.g., the 9/11 terrorist attacks), local events (e.g., the opening of a nearby attraction), or internal events (e.g., a product redesign). In the periods following such a potential change point in demand, a manager is torn between using a possibly obsolete demand model estimated from a long data history and using a model estimated from a short, recent history. We formulate a Bayesian inventory problem just after a potential change point. We pursue heuristic policies coupled with cost lower bounds, including a new lower bounding approach to non‐perishable Bayesian inventory problems that relaxes the dependence between physical demand and demand signals and that can be applied for a broad set of belief and demand distributions. Our numerical studies reveal small gaps between the costs implied by our heuristic solutions and our lower bounds. We also provide analytical and numerical sensitivity results suggesting that a manager worried about downside profit risk should err on the side of underestimating demand at a potential change point.
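A tiny sketch of the kind of Bayesian updating this problem requires (the distributions, prior, and numbers below are illustrative, not the paper's model): just after the epoch, the manager carries a posterior probability that demand changed and forms the predictive demand distribution as a mixture of the pre-change and post-change models.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Hypothetical: pre-change demand model fitted on the long history (mean 10/day),
# candidate post-change model (mean 6/day), prior probability of a change 0.3.
lam_old, lam_new, prior_change = 10.0, 6.0, 0.3
post_epoch_demand = [7, 5, 8]                     # observations since the epoch

lik_old = lik_new = 1.0
for d in post_epoch_demand:
    lik_old *= poisson_pmf(d, lam_old)
    lik_new *= poisson_pmf(d, lam_new)

post_change = (prior_change * lik_new /
               (prior_change * lik_new + (1 - prior_change) * lik_old))
predictive_mean = post_change * lam_new + (1 - post_change) * lam_old
print(f"P(change | data) = {post_change:.3f}, predictive mean demand = {predictive_mean:.2f}")
```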

8.
We present the Integrated Preference Functional (IPF) for comparing the quality of proposed sets of near‐Pareto‐optimal solutions to bi‐criteria optimization problems. Evaluating the quality of such solution sets is one of the key issues in developing and comparing heuristics for multiple objective combinatorial optimization problems. The IPF is a set functional that, given a weight density function provided by a decision maker and a discrete set of solutions for a particular problem, assigns a numerical value to that solution set. This value can be used to compare the quality of different sets of solutions, and therefore provides a robust, quantitative approach for comparing different heuristic, a posteriori solution procedures for difficult multiple objective optimization problems. We provide specific examples of decision maker preference functions and illustrate the calculation of the resulting IPF for specific solution sets and a simple family of combined objectives.
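A rough sketch of how such a set functional can be evaluated numerically for a bi-criteria minimization problem with the combined objective w·f1 + (1−w)·f2 and a weight density over w (uniform here; the function names and construction are my illustration, not the authors' exact formulation):

```python
import numpy as np

def ipf(solutions, weight_density=lambda w: 1.0, grid=10_000):
    """Integrate, over the weight density, the best combined-objective value
    attainable from the solution set (minimization convention)."""
    f = np.asarray(solutions, dtype=float)          # rows: (f1, f2) per solution
    w = (np.arange(grid) + 0.5) / grid              # midpoint rule on [0, 1]
    combined = np.outer(w, f[:, 0]) + np.outer(1.0 - w, f[:, 1])
    best = combined.min(axis=1)                     # best achievable value at each weight
    dens = np.array([weight_density(x) for x in w])
    return float(np.mean(best * dens))

set_a = [(1.0, 9.0), (3.0, 5.0), (8.0, 1.0)]        # hypothetical near-Pareto sets
set_b = [(2.0, 8.0), (5.0, 5.0), (9.0, 2.0)]
print(ipf(set_a), ipf(set_b))                        # lower is better here
```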

9.
A large‐sample approximation of the posterior distribution of partially identified structural parameters is derived for models that can be indexed by an identifiable finite‐dimensional reduced‐form parameter vector. It is used to analyze the differences between Bayesian credible sets and frequentist confidence sets. We define a plug‐in estimator of the identified set and show that asymptotically Bayesian highest‐posterior‐density sets exclude parts of the estimated identified set, whereas it is well known that frequentist confidence sets extend beyond the boundaries of the estimated identified set. We recommend reporting estimates of the identified set and information about the conditional prior along with Bayesian credible sets. A numerical illustration for a two‐player entry game is provided.

10.
The conventional heteroskedasticity‐robust (HR) variance matrix estimator for cross‐sectional regression (with or without a degrees‐of‐freedom adjustment), applied to the fixed‐effects estimator for panel data with serially uncorrelated errors, is inconsistent if the number of time periods T is fixed (and greater than 2) as the number of entities n increases. We provide a bias‐adjusted HR estimator that is √(nT)‐consistent under any sequences (n, T) in which n and/or T increase to ∞. This estimator can be extended to handle serial correlation of fixed order.

11.
The ill‐posedness of the nonparametric instrumental variable (NPIV) model leads to estimators that may suffer from poor statistical performance. In this paper, we explore the possibility of imposing shape restrictions to improve the performance of the NPIV estimators. We assume that the function to be estimated is monotone and consider a sieve estimator that enforces this monotonicity constraint. We define a constrained measure of ill‐posedness that is relevant for the constrained estimator and show that, under a monotone IV assumption and certain other mild regularity conditions, this measure is bounded uniformly over the dimension of the sieve space. This finding is in stark contrast to the well‐known result that the unconstrained sieve measure of ill‐posedness that is relevant for the unconstrained estimator grows to infinity with the dimension of the sieve space. Based on this result, we derive a novel non‐asymptotic error bound for the constrained estimator. The bound gives a set of data‐generating processes for which the monotonicity constraint has a particularly strong regularization effect and considerably improves the performance of the estimator. The form of the bound implies that the regularization effect can be strong even in large samples and even if the function to be estimated is steep, particularly so if the NPIV model is severely ill‐posed. Our simulation study confirms these findings and reveals the potential for large performance gains from imposing the monotonicity constraint.

12.
Evidence suggests that municipal water utility administrators in the western US price water significantly below its marginal cost and, in so doing, inefficiently exploit aquifer stocks and induce social surplus losses. This paper empirically identifies the objective function of those managers, measures the deadweight losses resulting from their price‐discounting decisions, and recovers the efficient water pricing policy function from counterfactual experiments. In doing so, the estimation uses a “continuous‐but‐constrained‐control” version of a nested fixed‐point algorithm in order to measure the important intertemporal consequences of groundwater pricing decisions.

13.
The demand for glass bottles is trending upward. Manufacturing new glass bottles is more time- and resource-intensive than recycling and generates more heat and environmental pollution. In response to these challenges, companies that use glass bottles need strategies for managing their reverse supply chains in conjunction with their traditional supply chains, since the economic and environmental benefits of returned products are clear. Closed-loop supply chains (CLSCs) integrate forward and reverse flows of products and information, giving companies a broader view of the whole chain. Despite these advantages, managing CLSCs can be challenging because they are exposed to many uncertainties regarding supply and demand processes, travel times, and the quantity and quality of returned products. In this study, we consider the production planning, inventory management, and vehicle routing decisions of a CLSC of beverage glass bottles. We propose an MILP model and rely on a multi-stage adjustable robust optimization (ARO) formulation to deal with the randomness in both the demand for filled bottles and the requests for pickups of empty bottles. We develop an exact oracle-based algorithm to solve the ARO problem and propose a heuristic search algorithm to reduce the solution time. Our numerical experiments not only show the shortcomings of the customary method, namely the affine decision rule approach, but also illustrate how our algorithms solve small-size problems and significantly improve the quality of the obtained solutions for large problems. Furthermore, our numerical results show that robust plans tend to be sparse: routes are chosen so that empty bottles are transported to production sites in a way that reduces the number of new bottles that must be ordered. Thus, robust planning makes CLSCs more environmentally friendly.

14.
An asymptotically efficient likelihood‐based semiparametric estimator is derived for the censored regression (tobit) model, based on a new approach for estimating the density function of the residuals in a partially observed regression. Smoothing the self‐consistency equation for the nonparametric maximum likelihood estimator of the distribution of the residuals yields an integral equation, which in some cases can be solved explicitly. The resulting estimated density is smooth enough to be used in a practical implementation of the profile likelihood estimator, but is sufficiently close to the nonparametric maximum likelihood estimator to allow estimation of the semiparametric efficient score. The parameter estimates obtained by solving the estimated score equations are then asymptotically efficient. A summary of analogous results for truncated regression is also given.

15.
We address an inventory rationing problem in a lost sales make‐to‐stock (MTS) production system with batch ordering and multiple demand classes. Each production order contains a single batch of a fixed lot size and the processing time of each batch is random. Assuming that there is at most one order outstanding at any point in time, we first address the case with the general production time distribution. We show that the optimal order policy is characterized by a reorder point and the optimal rationing policy is characterized by time‐dependent rationing levels. We then approximate the production time distribution with a phase‐type distribution and show that the optimal policy can be characterized by a reorder point and state‐dependent rationing levels. Using the Erlang production time distribution, we generalize the model to a tandem MTS system in which there may be multiple outstanding orders. We introduce a state‐transformation approach to perform the structural analysis and show that both the reorder point and rationing levels are state dependent. We show the monotonicity of the optimal reorder point and rationing levels for the outstanding orders, and generate new theoretical and managerial insights from the research findings.

16.
Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan‐Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January–March 2009.
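For reference, the product-limit survival estimate and its Greenwood variance can be computed directly. The fault distances below are made up and serve only to show the mechanics; the paper's own inputs are the 63 Autosub fault records and the expert-elicited loss probabilities, which it uses in a modified form of this estimator.

```python
import numpy as np

# Hypothetical mission distances (km) at which faults occurred; event=1 means
# the fault would have led to loss of the vehicle, event=0 means it was
# survivable (treated as censoring).  Illustrative numbers, not the Autosub data.
distance = np.array([12.0, 30.0, 30.0, 55.0, 80.0, 120.0, 150.0, 200.0])
event    = np.array([   1,    0,    1,    1,    0,     1,     0,     0])

order = np.argsort(distance)
distance, event = distance[order], event[order]

surv, var_term, results = 1.0, 0.0, []
for t in np.unique(distance):
    d = int(np.sum((distance == t) & (event == 1)))   # losses at distance t
    n = int(np.sum(distance >= t))                    # vehicles still "at risk" at t
    if d > 0:
        surv *= 1.0 - d / n                           # Kaplan-Meier product-limit step
        var_term += d / (n * (n - d))                 # Greenwood accumulator
        se = surv * np.sqrt(var_term)                 # Greenwood standard error
        results.append((t, surv, se))

for t, s, se in results:
    print(f"distance {t:6.1f} km  S(t) = {s:.3f}  (approx 95% CI {s - 1.96*se:.3f}, {s + 1.96*se:.3f})")
```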

17.
This paper considers random coefficients binary choice models. The main goal is to estimate the density of the random coefficients nonparametrically. This is an ill‐posed inverse problem characterized by an integral transform. A new density estimator for the random coefficients is developed, utilizing Fourier–Laplace series on spheres. This approach offers a clear insight on the identification problem. More importantly, it leads to a closed form estimator formula that yields a simple plug‐in procedure requiring no numerical optimization. The new estimator, therefore, is easy to implement in empirical applications, while being flexible about the treatment of unobserved heterogeneity. Extensions including treatments of nonrandom coefficients and models with endogeneity are discussed.

18.
We propose and analyze a two‐dimensional Markov chain model of an Emergency Medical Services system that repositions ambulances using a compliance table policy, which is commonly used in practice. The model is solved via a fixed‐point iteration. We validate the model against a detailed simulation model for several scenarios. We demonstrate that the model provides accurate approximations to various system performance measures, such as the response time distribution and the distribution of the number of busy ambulances, and that it can be used to identify near‐optimal compliance tables. Our numerical results show that performance depends strongly on the compliance table that is used, indicating the importance of choosing a well‐designed compliance table.

19.
Although cross‐trained workers offer numerous operational advantages for extended‐hour service businesses, they must first be scheduled for duty. The outcome from those decisions, usually made a week or more in advance, varies with realized service demand, worker attendance, and the way available cross‐trained workers are deployed once the demands for service are known. By ignoring the joint variability of attendance and demand, we show that existing workforce scheduling models tend to overstate expected schedule performance and systematically undervalue the benefits of cross‐training. We propose a two‐stage stochastic program for profit‐oriented cross‐trained workforce scheduling and allocation decisions that is driven by service completion estimates obtained from the convolution of the employee attendance and service demand distributions. Those estimates, reflecting optimal worker allocation decisions over all plausible realizations of attendance and demand, provide the gradient information used to guide workforce scheduling decisions. Comparing the performance of workforce scheduling decisions for hundreds of different hypothetical service environments, we find that solutions based on convolution estimates are more profitable, favor proportionately more cross‐trained workers and fewer specialists, and tend to recommend significantly larger (smaller) staffing levels for services under high (low) contribution margins than workforce schedules developed with independent expectations of attendance and demand.
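One simple way to read the joint-distribution step is as an expectation over independent attendance and demand distributions rather than over their separate means. The sketch below computes expected service completions for a given staffing level under hypothetical distributions; the per-worker service rate, the specific distributions, and the independence assumption are mine, not the paper's.

```python
from math import comb, exp, factorial

def expected_completions(attend_pmf, demand_pmf, jobs_per_worker=1.0):
    """E[min(capacity, demand)] where capacity = jobs_per_worker * workers present,
    assuming attendance and demand are independent discrete random variables."""
    total = 0.0
    for workers, p_a in attend_pmf.items():
        capacity = jobs_per_worker * workers
        for demand, p_d in demand_pmf.items():
            total += p_a * p_d * min(capacity, demand)
    return total

# Hypothetical: 5 scheduled workers, each shows up with probability 0.9
# (binomial attendance); demand is Poisson with mean 4, truncated for the example.
attend_pmf = {k: comb(5, k) * 0.9**k * 0.1**(5 - k) for k in range(6)}
lam = 4.0
demand_pmf = {k: exp(-lam) * lam**k / factorial(k) for k in range(15)}

print("expected completions:", expected_completions(attend_pmf, demand_pmf))
```

Because min(capacity, demand) is concave, this expectation is below min(E[capacity], E[demand]), which is the sense in which schedules built on independent expectations overstate performance.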

20.
Fixed effects estimators of panel models can be severely biased because of the well‐known incidental parameters problem. We show that this bias can be reduced by using a panel jackknife or an analytical bias correction motivated by large T. We give bias corrections for averages over the fixed effects, as well as model parameters. We find large bias reductions from using these approaches in examples. We consider asymptotics where T grows with n, as an approximation to the properties of the estimators in econometric applications. We show that if T grows at the same rate as n, the fixed effects estimator is asymptotically biased, so that asymptotic confidence intervals are incorrect, but that they are correct for the panel jackknife. We show that T growing faster than n^(1/3) suffices for correctness of the analytic correction, a property we also conjecture for the jackknife.
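A minimal numeric illustration of the panel jackknife on a toy example of my own (not one of the paper's applications): the fixed-effects estimator of the error variance in y_it = α_i + ε_it is biased by the factor (T−1)/T because each α_i is estimated from only T observations, and the jackknife combination T·θ̂ − (T−1)·(average of leave-one-period-out estimates) removes that order-1/T bias.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, sigma2 = 500, 4, 1.0
alpha = rng.normal(size=(n, 1))                       # individual fixed effects
y = alpha + rng.normal(scale=np.sqrt(sigma2), size=(n, T))

def fe_sigma2(panel):
    """Fixed-effects (within) estimator of the error variance."""
    return np.mean((panel - panel.mean(axis=1, keepdims=True)) ** 2)

theta_hat = fe_sigma2(y)                              # biased: expectation sigma2*(T-1)/T

# Panel jackknife: drop one time period at a time, re-estimate, then combine.
loo = np.array([fe_sigma2(np.delete(y, t, axis=1)) for t in range(T)])
theta_jack = T * theta_hat - (T - 1) * loo.mean()

print(f"uncorrected: {theta_hat:.3f}   jackknife: {theta_jack:.3f}   truth: {sigma2}")
```

In this toy case the jackknife is exactly unbiased; for genuinely nonlinear panel estimators it removes the leading 1/T bias term, which is the mechanism the paper studies.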
