12.
Dissemination of information derived from large contingency tables formed from confidential data is a major responsibility of statistical agencies. In this paper we present solutions to several computational and algorithmic problems that arise in the dissemination of cross-tabulations (marginal sub-tables) from a single underlying table. These include data structures that exploit sparsity to support efficient computation of marginals and algorithms such as iterative proportional fitting, as well as a generalized form of the shuttle algorithm that computes sharp bounds on (small, confidentiality threatening) cells in the full table from arbitrary sets of released marginals. We give examples illustrating the techniques.
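The iterative proportional fitting step mentioned above can be sketched as follows. This is a generic, dense two-way illustration in plain Python, not the sparse data structures the paper develops; the function name `ipf` and the toy seed table are our assumptions.

```python
# A minimal sketch of iterative proportional fitting (IPF) on a 2-way
# table: rescale a seed table so its row and column sums match released
# marginals. Dense illustration only, not the paper's sparse version.

def ipf(seed, row_targets, col_targets, iters=100, tol=1e-9):
    """Return a copy of `seed` rescaled to match the target margins."""
    t = [row[:] for row in seed]  # work on a copy
    for _ in range(iters):
        # Fit row margins.
        for i, target in enumerate(row_targets):
            s = sum(t[i])
            if s > 0:
                t[i] = [x * target / s for x in t[i]]
        # Fit column margins.
        for j, target in enumerate(col_targets):
            s = sum(row[j] for row in t)
            if s > 0:
                for row in t:
                    row[j] *= target / s
        # Columns were just fitted; stop once rows also agree.
        if all(abs(sum(t[i]) - r) < tol for i, r in enumerate(row_targets)):
            break
    return t

fitted = ipf([[1.0, 1.0], [1.0, 1.0]],
             row_targets=[3.0, 7.0], col_targets=[4.0, 6.0])
```

With a uniform seed and consistent margins this converges to the independence table in a single sweep.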
13.
This paper analyses strategy-proof mechanisms or decision schemes which map profiles of cardinal utility functions to lotteries over a finite set of outcomes. We provide a new proof of Hylland’s theorem, which shows that the only strategy-proof cardinal decision scheme satisfying a weak unanimity property is the random dictatorship. Our proof technique assumes a framework where individuals can discern utility differences only if the difference is at least some fixed number, which we call the grid size. We also prove a limit random dictatorship result, which shows that any sequence of strategy-proof and unanimous decision schemes defined on a sequence of decreasing grid sizes approaching zero must converge to a random dictatorship.
We are most grateful to an Associate Editor and two referees for very helpful comments on an earlier version of the paper.
An erratum to this article can be found at
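The random dictatorship characterized by Hylland’s theorem has a simple operational form: draw a voter uniformly at random and implement that voter’s top-ranked outcome. A hypothetical sketch (the `random_dictatorship` name and the list-of-rankings encoding are our assumptions; the paper itself works with cardinal utility profiles):

```python
# A sketch of the random-dictatorship decision scheme: each voter is
# drawn with equal probability, so each outcome's lottery weight is the
# fraction of voters who rank it first.
from collections import Counter
from fractions import Fraction

def random_dictatorship(profile):
    """profile: list of preference orders (best outcome first).
    Returns the induced lottery as {outcome: probability}."""
    n = len(profile)
    tops = Counter(order[0] for order in profile)
    return {x: Fraction(c, n) for x, c in tops.items()}

# Two of three voters rank "a" first, one ranks "b" first.
lottery = random_dictatorship([["a", "b", "c"],
                               ["b", "a", "c"],
                               ["a", "c", "b"]])
```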
14.
Data quality: A statistical perspective (Total citations: 1; self-citations: 0; citations by others: 1)
We present the old-but-new problem of data quality from a statistical perspective, in part with the goal of attracting more statisticians, especially academics, to become engaged in research on a rich set of exciting challenges. The data quality landscape is described, and its research foundations in computer science, total quality management and statistics are reviewed. Two case studies based on an EDA approach to data quality are used to motivate a set of research challenges for statistics that span theory, methodology and software tools.
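The abstract does not spell out the EDA-style checks used in the case studies; purely as a hypothetical flavour of such checks, a toy report counting missing, duplicate, and out-of-range records might look like this (function name, fields, and data are ours):

```python
# A toy sketch of simple data-quality checks: missingness, duplicate
# records, and out-of-range values. Illustration only, not the paper's
# case-study methodology.
def quality_report(records, valid_range):
    """records: list of dicts with a numeric field 'value' (or None)."""
    n = len(records)
    missing = sum(1 for r in records if r["value"] is None)
    seen, dupes = set(), 0
    for r in records:
        key = tuple(sorted(r.items()))
        if key in seen:
            dupes += 1
        seen.add(key)
    lo, hi = valid_range
    out_of_range = sum(
        1 for r in records
        if r["value"] is not None and not (lo <= r["value"] <= hi)
    )
    return {"n": n, "missing": missing, "duplicates": dupes,
            "out_of_range": out_of_range}

report = quality_report(
    [{"id": 1, "value": 5}, {"id": 1, "value": 5},
     {"id": 2, "value": None}, {"id": 3, "value": 999}],
    valid_range=(0, 100),
)
```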
15.
Ethical indices of income mobility measure the change in welfare resulting from mobility. The concept of mobility we explore consists of a welfare comparison of the actual time path of the income distribution with a hypothetical time path obtained by supposing that, starting from the actual first-period distribution, the remaining income receipts exhibit complete immobility.
16.
Bhaskar Dutta, Matthew O. Jackson, Michel Le Breton 《Social Choice and Welfare》2004,23(1):21-57
We develop a definition of equilibrium for agenda formation in general voting settings. The definition is independent of any protocol. We show that the set of equilibrium outcomes for any Pareto efficient voting rule is uniquely determined, and in fact coincides with the set of outcomes generated by considering all full agendas. Under voting by successive elimination (or amendment), the set of equilibrium outcomes corresponds with the Banks set. We also examine the implications in several specific settings and show that studying equilibrium agendas can lead to sharp predictions, in contrast with well-known chaos theorems. Financial support under NSF grant SES-9986190 and an RTDF grant from the University of Warwick are gratefully acknowledged. We thank John Duggan, Martin Osborne, and an anonymous referee for valuable suggestions, and participants at the Sixth International Meeting of the Society for Social Choice and Welfare for helpful comments.
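Voting by successive elimination pits the first two agenda items against each other, then the winner against the next item, and so on. A minimal sketch of the sincere version follows; note the paper's equilibrium analysis concerns sophisticated voting, which this toy `beats` relation setup does not capture.

```python
# Sincere voting by successive elimination over a fixed agenda, given a
# strict majority relation beats(x, y) meaning "x majority-defeats y".
def successive_elimination(agenda, beats):
    winner = agenda[0]
    for challenger in agenda[1:]:
        if beats(challenger, winner):
            winner = challenger
    return winner

# Condorcet cycle a > b > c > a: the sincere outcome depends on the
# agenda order, illustrating the agenda-sensitivity behind chaos results.
cycle = {("a", "b"), ("b", "c"), ("c", "a")}
beats = lambda x, y: (x, y) in cycle
w1 = successive_elimination(["a", "b", "c"], beats)
w2 = successive_elimination(["b", "c", "a"], beats)
```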
17.
Bhaskar Dutta Matthew O. Jackson Michel Le Breton 《Econometrica : journal of the Econometric Society》2001,69(4):1013-1037
We study the incentives of candidates to strategically affect the outcome of a voting procedure. We show that the outcomes of every nondictatorial voting procedure that satisfies unanimity will be affected by the incentives of noncontending candidates (i.e., those who cannot win the election) to influence the outcome by entering or exiting the election.
18.
Xiaojuan Zhu, William Seaver, Rapinder Sawhney, Bruce Holt, Gurudatt Bhaskar Sanil 《Journal of Applied Statistics》2017,44(8):1421-1440
In some organizations, the hiring lead time is often long because of human resource requirements associated with technical and security constraints. The human resource departments in these organizations are therefore keenly interested in forecasting employee turnover, since a good prediction of employee turnover could help the organizations minimize the costs and impacts of turnover on operational capabilities and the budget. This study aims to enhance the ability to forecast employee turnover with or without considering the impact of economic indicators. Various time series modelling techniques were used to identify optimal models for effective employee turnover prediction. More than 11 years of monthly turnover data were used to build and validate the proposed models. Compared with other models, a dynamic regression model with additive trend, seasonality, interventions, and a very important economic indicator effectively predicted the turnover, with training R2 = 0.77 and holdout R2 = 0.59. The forecasting performance of the optimal models confirms that the time series modelling approach can predict employee turnover for the specific scenario observed in our analysis.
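At its core, a model with additive trend and seasonality is a trend line plus monthly effects. A toy sketch of that idea, fit by simple least-squares and averages (our illustration only; the paper's dynamic regression model also includes interventions and an economic indicator):

```python
# A toy additive trend + monthly-seasonality forecaster: fit a linear
# trend by least squares, then estimate each month's effect as the
# average detrended value. Not the paper's model.
def fit_trend_seasonal(y, period=12):
    n = len(y)
    xbar = (n - 1) / 2
    ybar = sum(y) / n
    # Least-squares slope and intercept of y_t ~ a + b*t.
    b = (sum((t - xbar) * (y[t] - ybar) for t in range(n))
         / sum((t - xbar) ** 2 for t in range(n)))
    a = ybar - b * xbar
    # Seasonal effects: average detrended value for each month-of-year.
    resid = [y[t] - (a + b * t) for t in range(n)]
    seasonal = [
        sum(resid[t] for t in range(m, n, period))
        / len(range(m, n, period))
        for m in range(period)
    ]
    return lambda t: a + b * t + seasonal[t % period]

# Synthetic monthly series: slope 2 per month plus a December bump.
y = [2 * t + (10 if t % 12 == 11 else 0) for t in range(48)]
model = fit_trend_seasonal(y)
forecast = model(59)  # forecast December of year 5
```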
19.
Inapproximability results for the lateral gene transfer problem (Total citations: 1; self-citations: 0; citations by others: 1)
Bhaskar Dasgupta, Sergio Ferrarini, Uthra Gopalakrishnan, Nisha Raj Paryani 《Journal of Combinatorial Optimization》2006,11(4):387-405
This paper concerns the Lateral Gene Transfer Problem. This minimization problem, defined by Hallett and Lagergren (2001), is that of finding the most parsimonious lateral gene transfer scenario for a given pair of gene and species trees. Our main results are the following:
(a) We show that it is not possible to approximate the problem in polynomial time within an approximation ratio of 1 + ε, for some constant ε > 0, unless P = NP. We also provide explicit values of ε for the above claim.
(b) We provide an upper bound on the cost of any 1-active scenario and prove the tightness of this bound.
This research was supported by NSF grants CCR-0296041, CCR-0206795, CCR-0208749 and IIS-0346973.
20.
When historical data are available, incorporating them in an optimal way into the current data analysis can improve the quality of statistical inference. In Bayesian analysis, one can achieve this by using the quality-adjusted priors of Zellner, or the power priors of Ibrahim and coauthors. These rules are constructed by raising the prior and/or the sample likelihood to some exponent values, which act as measures of the compatibility of their quality or of the proximity of historical data to current data. This paper presents a general, optimum procedure that unifies these rules and is derived by minimizing a Kullback–Leibler divergence under a divergence constraint. We show that the exponent values are directly related to the divergence constraint set by the user and investigate the effect of this choice theoretically and also through sensitivity analysis. We show that this approach yields ‘100% efficient’ information processing rules in the sense of Zellner. Monte Carlo experiments are conducted to investigate the effect of historical and current sample sizes on the optimum rule. Finally, we illustrate these methods by applying them to real data sets.
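The power-prior idea of raising the historical likelihood to an exponent between 0 and 1 has a closed form in conjugate models. A hedged sketch in a Beta-Binomial setting (the function name and toy numbers are ours; the paper's contribution is choosing the exponent optimally, which this sketch takes as given):

```python
# Power prior in a Beta-Binomial model: the historical likelihood is
# raised to delta in [0, 1], so historical counts enter the posterior
# down-weighted by delta.
def power_prior_posterior(a, b, y0, n0, y, n, delta):
    """Beta(a, b) initial prior; historical data (y0 successes out of
    n0) weighted by delta; current data (y out of n). Returns the
    posterior Beta(alpha, beta) parameters."""
    alpha = a + delta * y0 + y
    beta = b + delta * (n0 - y0) + (n - y)
    return alpha, beta

# delta = 0 discards the history entirely; delta = 1 pools it fully.
full = power_prior_posterior(1, 1, y0=30, n0=100, y=12, n=40, delta=1.0)
none = power_prior_posterior(1, 1, y0=30, n0=100, y=12, n=40, delta=0.0)
```

Intermediate values of delta interpolate between these two extremes, which is exactly the role the compatibility measure plays in the rules discussed above.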