Similar Articles (20 results)
1.
Given a dataset, an outlier can be defined as an observation that does not follow the statistical properties of the majority of the data. Computation of the location estimate is of fundamental importance in data analysis, and it is well known in statistics that classical methods, such as taking the sample average, can be greatly affected by the presence of outliers in the data. Using the median instead of the mean can partially resolve this issue, but not completely. For the univariate case, a robust version of the median is the Least Trimmed Absolute Deviation (LTAD) robust estimator introduced in Tableman (Stat Probab Lett 19(5):387–398, 1994), which has desirable asymptotic properties such as robustness, consistency, a high breakdown point and normality. There are different generalizations of the LTAD for multivariate data, depending on the choice of norm. In Chatzinakos et al. (J Comb Optim, 2015) we presented such a generalization using the Euclidean norm and proposed a solution technique for the resulting combinatorial optimization problem, based on a necessary condition, that results in a highly convergent local search algorithm. In this subsequent work, we use the L1 norm to generalize the LTAD to higher dimensions, and show that the resulting mixed integer programming problem has an integral relaxation after applying an appropriate data transformation. Moreover, we exploit the structure of the problem to show that the resulting LPs can be solved efficiently using a subgradient optimization approach. The robust statistical properties of the proposed estimator are verified by extensive computational results.
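To make the univariate definition concrete, the sketch below computes a one-dimensional LTAD location estimate by scanning contiguous windows of the sorted sample; the coverage choice h and the window search are illustrative assumptions and are not the multivariate algorithm proposed in the paper.

```python
import numpy as np

def ltad_univariate(x, h=None):
    """Least Trimmed Absolute Deviation location estimate (1-D sketch).

    Keeps the h observations whose summed absolute deviation from their
    own median is smallest; on sorted data we simply scan all contiguous
    windows of size h (an illustrative search strategy).
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if h is None:
        h = n // 2 + 1                     # ~50% coverage (assumption)
    best_cost, best_mu = np.inf, None
    for start in range(n - h + 1):
        window = x[start:start + h]
        mu = np.median(window)             # L1-optimal centre of the window
        cost = np.abs(window - mu).sum()   # trimmed absolute deviation
        if cost < best_cost:
            best_cost, best_mu = cost, mu
    return best_mu

# Example: the estimate ignores the gross outliers at 100.
data = [2.1, 1.9, 2.0, 2.2, 1.8, 100.0, 100.0]
print(ltad_univariate(data))   # close to 2.0
```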

2.
Lai Zhizhu, Yue Qun, Wang Zheng, Ge Dongmei, Chen Yulong, Zhou Zhihong. Journal of Combinatorial Optimization, 2022, 44(2): 1134–1160

An improper value of the parameter p in the robust constraints can result in no feasible solution when applying the stochastic p-robustness optimization approach (p-SRO) to facility location problems under uncertainty. Aiming at finding the lowest critical value of the parameter p and the corresponding robust optimal solution, we develop a novel robust optimization approach, named the min-p robust optimization approach (min-pRO), for the P-median problem (PMP) and the fixed cost P-median problem (FPMP). Combined with the nearest allocation strategy, the vertex substitution heuristic algorithm is improved, and the factors influencing the lowest critical p-value are analyzed. The effectiveness and performance of the proposed approach are verified by numerical examples. The results show that the fluctuation range of the data is positively correlated with the lowest critical p-value for a given number of new facilities. However, the number of new facilities has a different impact on the lowest critical p-value for a given fluctuation range of the data: as the number of new facilities increases, the lowest critical p-value for PMP increases while that for FPMP decreases.
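For context on the underlying heuristic, here is a minimal sketch of the classical vertex substitution (swap) move with nearest allocation for the plain P-median problem; the min-p robustness layer described above is omitted, and all names and data are illustrative.

```python
import numpy as np

def p_median_vertex_substitution(dist, p, seed=0):
    """Classical vertex substitution (swap) heuristic for the p-median
    problem with nearest allocation.  dist[i, j] is the distance from
    client i to candidate site j.  Returns the open sites and total cost.
    """
    rng = np.random.default_rng(seed)
    n_sites = dist.shape[1]

    def cost(sites):
        # nearest allocation: each client is served by its closest open site
        return dist[:, sites].min(axis=1).sum()

    open_sites = list(rng.choice(n_sites, size=p, replace=False))
    improved = True
    while improved:
        improved = False
        for pos in range(p):
            for cand in range(n_sites):
                if cand in open_sites:
                    continue
                trial = open_sites[:pos] + [cand] + open_sites[pos + 1:]
                if cost(trial) < cost(open_sites):
                    open_sites, improved = trial, True
    return sorted(int(s) for s in open_sites), cost(open_sites)

# Illustrative data: 12 points serve as both clients and candidate sites.
pts = np.random.default_rng(1).random((12, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
print(p_median_vertex_substitution(dist, p=3))
```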


3.
The Taguchi method is efficient for optimising process performance with a single quality characteristic (QCH) of a product or process. In practice, however, customers are concerned about multiple QCHs, which are usually correlated. This research proposes and implements an approach using principal component analysis (PCA) and two data envelopment analysis (DEA) models, namely CCR and super efficiency, for optimising multiple correlated QCHs in robust design. PCA is first utilised to obtain multiple uncorrelated linear combinations (principal components), equal in number to the QCHs, thereby avoiding the loss of information that results from ignoring some principal components. These components are then utilised in the two DEA models to determine optimal factor levels. Three real case studies are provided for illustration; in all of them the proposed approach is found more efficient than other techniques in the literature, including engineering judgement, PCA, PCA with grey analysis, and the utility concept. In conclusion, the proposed approach should provide great assistance to process/product engineers in obtaining robust designs with multiple correlated QCHs.
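A minimal sketch of the PCA step described above: correlated QCH responses are standardised and projected onto all principal components, yielding the same number of uncorrelated scores. The subsequent DEA ranking of factor levels is not shown, and the data and function name are illustrative.

```python
import numpy as np

def pca_scores(Y):
    """Transform correlated quality characteristics into the same number
    of uncorrelated principal-component scores (PCA step only)."""
    Y = np.asarray(Y, dtype=float)
    Z = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)   # standardise each QCH
    corr = np.corrcoef(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)            # symmetric matrix -> eigh
    order = np.argsort(eigvals)[::-1]                  # largest variance first
    return Z @ eigvecs[:, order], eigvals[order]

# Hypothetical experiment: 9 runs, 3 strongly correlated QCHs.
rng = np.random.default_rng(1)
base = rng.normal(size=(9, 1))
Y = np.hstack([base + 0.1 * rng.normal(size=(9, 1)) for _ in range(3)])
scores, variances = pca_scores(Y)
print(np.round(np.corrcoef(scores, rowvar=False), 3))  # ~identity: uncorrelated
```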

4.
Flood protection is of major importance to many flood-prone regions and involves substantial investment and maintenance costs. Modern flood risk management often requires determining a cost-efficient protection strategy, i.e., one which has the lowest possible long-run cost and which satisfies the flood protection standards imposed by the regulator throughout the entire planning horizon. There are two challenges that complicate the modeling: (i) uncertainty: many of the important parameters on which the strategies are based (e.g., the sea level rise) are uncertain and will be known only in the future, and (ii) adjustability: decisions implemented at later time stages need to adapt to the realized uncertainty values. We develop a new mathematical model addressing both issues, based on recent advances in integer robust optimization, and we apply it to the Rhine Estuary - Drechtsteden area in the Netherlands. Our approach shows, among other things, that (i) considering 40% uncertainty about the sea level rise leads to a solution with less than a 10% increase in the total cost, (ii) solutions taking the uncertainty into account are cheaper in the long run if the ‘bad scenarios’ for the uncertainty materialize, even if the ‘optimistic solutions’ are allowed to be repaired later on, and (iii) the optimal here-and-now investment decisions change when uncertainty and adjustability are included in the model.

5.
Under a data envelopment analysis (DEA) framework, a full ranking of a group of decision making units (DMUs) can be carried out through an adequate amalgamation of the cross-efficiency (CE henceforth) scores produced for each DMU. In this paper, we propose a ranking procedure that is based on amalgamating the weight profiles selected over the cross-evaluation rather than the related CE scores. The new approach builds, for each DMU, a collective weight profile (CWP henceforth) by exploiting the preference voting system embedded within the matrix of weights, which views the assessing DMUs as voters and the input/output factors as candidates. The occurrence of zero votes is discussed as a special case and a two-level aggregation procedure is developed. The CWPs that are produced extend the concept of collective appreciation to the input/output factors of each DMU so that group dynamics are truly reflected, mainly in decision-making circumstances where factor prioritization is necessary for making choices or allocating resources. The robustness of the proposed ranking approach is evaluated with three examples drawn from the literature.

6.
7.
Omega, 2014, 42(6): 984–997
In this paper, our major theme is a unifying framework for duality in robust linear programming. We show that there are two pairs of dual programs allied with a robust linear program: one in which the primal is constructed to be “ultra-conservative” and one in which the primal is constructed to be “ultra-optimistic.” Furthermore, as one would expect, if the uncertainty in the primal is row-based, the corresponding uncertainty in the dual is column-based, and vice versa. Several examples are provided that illustrate the properties of these primal and dual models. A second theme of the paper is modeling in robust linear programming. We replace the ordinary activity vectors (points) and right-hand sides with well-known geometric objects such as hyper-rectangles, parallel line segments and hyper-spheres. In this manner, imprecision and uncertainty can be explicitly modeled as an inherent characteristic of the model. This is in contrast to the usual approach of using vectors to model activities and/or constraints and then, subsequently, imposing further constraints on the model to accommodate imprecision and uncertainties. The unifying duality structure is then applied to these models to understand and interpret the marginal prices. The key observation is that the optimal solutions to these dual problems are comprised of two parts: a traditional “centrality” component along with a “robustness” component.
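As a hedged illustration of the row-based/column-based correspondence (our notation, assuming simple interval uncertainty with entries $a_{ij}\in[\bar a_{ij}-\delta_{ij},\,\bar a_{ij}+\delta_{ij}]$ and nonnegative variables; not necessarily the paper's exact construction), take the ultra-conservative primal obtained by enforcing every constraint at the worst realization of its own row, and form its ordinary LP dual:

$$\textbf{primal: } \min_{x\ge 0}\ c^{\top}x \quad \text{s.t.}\quad \sum_{j}\bigl(\bar a_{ij}-\delta_{ij}\bigr)\,x_j \;\ge\; b_i,\qquad i=1,\dots,m,$$

$$\textbf{dual: } \max_{y\ge 0}\ b^{\top}y \quad \text{s.t.}\quad \sum_{i}\bigl(\bar a_{ij}-\delta_{ij}\bigr)\,y_i \;\le\; c_j,\qquad j=1,\dots,n.$$

The perturbations $\delta_{ij}$ enter the primal one row at a time (one uncertain constraint per index $i$) but appear in the dual grouped by column, one perturbed dual constraint per original variable $j$, which is exactly the row-to-column switch described above.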

8.
Intensified research on multivariate Poisson models offers new opportunities for the analysis of purchase quantities in market basket data. The investigation of positive or negative correlations in quantity decisions among product categories facilitates a deeper understanding of consumer purchase behavior. The applied multivariate log-normal Poisson model introduces interdependencies between categories through multivariate normally distributed latent effects governed by a covariance matrix. As the size of this covariance matrix depends on the number of categories in the model, its estimation may become tedious. Furthermore, we assume that quantity decisions do not interact for all pairs of categories. That is why we propose to use covariance selection to derive a parsimonious representation of the correlation structure. For two market basket data sets, we show that the vast majority of off-diagonal elements in the covariance matrix are irrelevant. For a data set of product categories, the model with a partly restricted covariance matrix achieves a better fit to the holdout data than the model with the full covariance matrix. For a data set of subcategories of the broader beverage category, the proposed model with restricted covariance outperforms the model with the full covariance matrix even on the calibration data. We conclude that interactions of quantity decisions are overall the exception, even for complements-in-use.

9.
Approximation algorithms for connected facility location problems
We study Connected Facility Location problems. We are given a connected graph G=(V,E) with a nonnegative edge cost c_e for each edge e ∈ E, a set of clients D ⊆ V such that each client j ∈ D has positive demand d_j, and a set of facilities F ⊆ V, each with a nonnegative opening cost f_i and the capacity to serve all client demands. The objective is to open a subset of facilities, to assign each client j ∈ D to exactly one open facility i(j), and to connect all open facilities by a Steiner tree T such that the total cost is minimized for a given input parameter M ≥ 1. We propose an LP-rounding based 8.29-approximation algorithm, which improves the previous bound of 8.55 (Swamy and Kumar in Algorithmica, 40:245–269, 2004). We also consider the problem when the opening costs of all facilities are equal. In this case we give a 7.0-approximation algorithm.

10.
In this paper, our major theme is a unifying framework for duality in robust linear programming. We show that there are two pairs of dual programs allied with a robust linear program: one in which the primal is constructed to be “ultra-conservative” and one in which the primal is constructed to be “ultra-optimistic.” Furthermore, as one would expect, if the uncertainty in the primal is row-based, the corresponding uncertainty in the dual is column-based, and vice versa. Several examples are provided that illustrate the properties of these primal and dual models.

11.
12.
The aggregate production planning (APP) problem considers medium-term production loading plans subject to restrictions such as production capacity and workforce level. Management frequently encounters uncertainty and noisy data, in which the variables or parameters are stochastic. In this paper, a robust optimization model is developed to solve aggregate production planning problems in an environment of uncertainty, in which the production cost, labour cost, inventory cost, and hiring and layoff costs are minimized. By adjusting penalty parameters, decision-makers can determine an optimal medium-term production strategy, including the production loading plan and workforce level, while considering different economic growth scenarios. Numerical results demonstrate the robustness and effectiveness of the proposed model and show that it is realistic for dealing with uncertain economic conditions. An analysis of the tradeoff between solution robustness and model robustness is also presented.
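For orientation, here is a hedged sketch of the kind of scenario-based objective such robust optimization models typically minimize, in the style of Mulvey et al.; the symbols $\lambda$, $\omega$ and $z_s$ are our illustrative notation and the paper's exact cost terms and penalty structure may differ. The first term is the expected cost over scenarios $s$ with probabilities $p_s$, the $\lambda$-weighted term penalizes cost variability across scenarios (solution robustness), and the $\omega$-weighted term penalizes constraint violations $z_s$ (model robustness):

$$\min_{x,\;y_s,\;z_s}\ \sum_{s} p_s\,c_s(x,y_s)\;+\;\lambda\sum_{s} p_s\Bigl(c_s(x,y_s)-\sum_{s'} p_{s'}\,c_{s'}(x,y_{s'})\Bigr)^{2}\;+\;\omega\sum_{s} p_s\,\lVert z_s\rVert_{1}.$$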

13.
In this paper, we study the problem of planning a timetable for passenger trains, considering that delays might occur due to unpredictable circumstances. If a delay occurs, a timetable will not be able to manage it unless some extra time has been scheduled in advance. Delays might be managed in several ways, and the usual objective function considered for this purpose is the minimization of the overall waiting time caused to passengers. We analyze the timetable planning problem in terms of the recoverable robustness model, where a timetable is said to be recoverable robust if it is able to absorb small delays by possibly applying given limited recovery capabilities. The quality of a robust timetable is measured by the price of robustness, that is, the ratio between the cost of the recoverable robust timetable and that of a non-robust optimal one.
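In symbols (our shorthand for the ratio just defined):

$$\mathrm{PoR} \;=\; \frac{\operatorname{cost}(\text{recoverable robust timetable})}{\operatorname{cost}(\text{optimal non-robust timetable})},$$

which is at least 1 whenever every recoverable robust timetable is also feasible for the non-robust problem; a value close to 1 means that the ability to absorb small delays comes almost for free.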

14.
The uniform bounded facility location problem (UBFLP) seeks the optimal way of locating facilities to minimize total costs (opening costs plus routing costs), while the maximal routing cost of every client is at most a given bound M. After building a mixed 0–1 integer programming model for UBFLP, we present the first constant-factor approximation algorithm, with an approximation guarantee of 6.853+ε, for UBFLP in the plane; it combines the algorithm of Dai and Yu (Theor. Comp. Sci. 410:756–765, 2009) with the scheme of Xu and Xu (J. Comb. Optim. 17:424–436, 2008). We also provide a heuristic algorithm based on Benders decomposition to solve UBFLP on general graphs, and computational experience shows that the heuristic works well.

15.
This paper describes a simplified optimization algorithm used for the solution of a classical depot location problem encountered at a Greek manufacturing company. Algorithms in the literature for this type of problem are based on the assumption of predetermined fixed costs that are independent of the final size of the depots. This assumption is usually far from reality: the size of each depot does not remain constant during the optimization process, and neither does the associated fixed cost, which varies with the size of the depot. This assumption is relaxed in the proposed algorithm; the associated fixed cost is modified each time a new customer is allocated to a depot, thus changing the required depot size.

16.
The flowshop scheduling problem (FSP) has been widely studied in the literature and many techniques for its solution have been proposed. Some authors have concluded that genetic algorithms are not suitable for this hard combinatorial problem unless hybridization is used. This work proposes new genetic algorithms for solving the permutation FSP that prove to be competitive when compared to many other well-known algorithms. The optimization criterion considered is the minimization of the maximum completion time, or makespan (C_max). We present a robust genetic algorithm and a fast hybrid implementation. These algorithms use new genetic operators, advanced techniques such as hybridization with local search and an efficient population initialization, as well as a new generational scheme. A complete evaluation of the different parameters and operators of the algorithms by means of a Design of Experiments approach is also given. The algorithms' effectiveness is compared against 11 other methods, including genetic algorithms, tabu search, simulated annealing and other advanced and recent techniques. For the evaluations we use Taillard's well-known standard benchmark. The results show that the proposed algorithms are very effective and at the same time easy to implement.
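To make the objective concrete, here is a minimal sketch of the completion-time recursion that evaluates the makespan C_max of a given job permutation in a permutation flowshop; the processing times are illustrative and the genetic operators themselves are not shown.

```python
import numpy as np

def makespan(p, perm):
    """Makespan (C_max) of a permutation flowshop schedule.

    p[i][j] is the processing time of job j on machine i; `perm` is the
    job sequence to evaluate.  C[i] tracks the completion time of the
    last scheduled job on machine i.
    """
    p = np.asarray(p, dtype=float)
    m, _ = p.shape
    C = np.zeros(m)
    for j in perm:                                  # jobs enter in sequence order
        C[0] += p[0, j]
        for i in range(1, m):
            C[i] = max(C[i], C[i - 1]) + p[i, j]    # wait for machine and for the job
    return C[-1]

# 3 machines x 4 jobs with hypothetical processing times.
p = [[3, 2, 4, 1],
     [2, 4, 1, 3],
     [4, 1, 3, 2]]
print(makespan(p, perm=[0, 1, 2, 3]))
print(makespan(p, perm=[3, 1, 0, 2]))   # a different permutation to compare
```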

17.
Omega, 1987, 15(5): 429–441
Traditionally, most retail location models largely ignore the behavioral aspect of locational strategies. On the other hand, the few existing behavioral models of retail store choice are aspatial at best. This paper designs a multiobjective retail location decision model which not only considers the behavioral and spatial aspects of location scenarios, but also takes advantage of a systematic sequential decision process. The model has been applied to solve ‘real-world-like’ fast-food restaurant location problems based on actual data.

18.
Israel Brosh, Marvin Hersh. Omega, 1974, 2(6): 805–808
A warehouse location problem is treated using mixed integer programming and a heuristic algorithm. Freight rate schedules were simplified based upon shipment consolidation and a linear regression of rates versus distances. Warehousing costs were divided into fixed and variable components and related to the throughput of the warehouses. Consideration was given in the analysis to the choice between owning and leasing each warehouse. In the case studied, the analysis demonstrated that a possible saving of approximately 22 per cent in annual distribution costs could be realized under the optimized warehouse location network.

19.
This paper considers the minimum-energy symmetric network connectivity problem (MESNC) in wireless sensor networks. The aim of the MESNC is to assign a transmission power to each sensor node such that the resulting network, using only bidirectional links, is connected and the total energy consumption is minimized. We first present two new models of this problem and then propose new branch-and-cut algorithms. Based on an existing formulation, we obtain the first model by introducing additional constraints; these additional constraints allow us to relax certain binary variables to continuous ones and thus to reduce significantly the number of binary variables. Our second model strengthens the first one by adding an exponential number of lifted directed-connectivity constraints. We present two branch-and-cut procedures based on these proposed improvements. The reported computational results show that our approaches, using the proposed formulations, can efficiently solve instances with up to 120 nodes, significantly improving on the instance sizes solvable by other exact algorithms in the literature.
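As a point of reference for the problem statement (and not the paper's branch-and-cut method), a common heuristic baseline builds a minimum spanning tree on d(u,v)^α edge weights and gives each node just enough power to reach its farthest tree neighbour, which guarantees symmetric connectivity; all names and data below are illustrative.

```python
import numpy as np

def mst_power_assignment(points, alpha=2.0):
    """Baseline power assignment for symmetric connectivity (sketch).

    Builds a minimum spanning tree (Prim's algorithm) on d(u,v)**alpha
    edge weights and assigns every node enough power to reach its
    farthest MST neighbour, so all MST links are bidirectional and the
    resulting network is connected.  Heuristic upper bound only.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1) ** alpha
    in_tree, edges = [0], []
    while len(in_tree) < n:
        best = None
        for u in in_tree:                      # cheapest edge leaving the tree
            for v in range(n):
                if v in in_tree:
                    continue
                if best is None or d[u, v] < d[best[0], best[1]]:
                    best = (u, v)
        edges.append(best)
        in_tree.append(best[1])
    power = np.zeros(n)
    for u, v in edges:                         # cover the farthest MST neighbour
        power[u] = max(power[u], d[u, v])
        power[v] = max(power[v], d[v, u])
    return power, power.sum()

pts = np.random.default_rng(2).random((8, 2))
power, total_energy = mst_power_assignment(pts)
print(total_energy)
```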

20.
Chen Xin, Fang Qizhi, Liu Wenjing, Ding Yuan, Nong Qingqin. Journal of Combinatorial Optimization, 2022, 43(5): 1628–1644
We study a fairness-based model for 2-facility location games on the real line where the social objective is to minimize the maximum envy over all agents. ...

