Similar literature (9 results)
1.

We consider a problem of allocation of a sample in two- and three-stage sampling. We seek an allocation that is efficient both within domains and for the population as a whole. Choudhry et al. (Survey Methods 38(1):23–29, 2012) recently considered such a problem for one-stage stratified simple random sampling without replacement in domains. Their approach was to minimize the sample size under constraints on the relative variances in all domains and on the overall relative variance, using nonlinear programming. Here, instead, we minimize the relative variances in all domains (controlling them through given priority weights) together with the overall relative variance, under a constraint on the total (expected) cost. We consider several two- and three-stage sampling schemes. Our aim is to shed some light on the analytic structure of the solutions rather than to derive a purely numerical tool for sample allocation. To this end, we develop the eigenproblem methodology introduced for optimal allocation problems in Niemiro and Wesołowski (Appl Math 28:73–82, 2001) and recently updated in Wesołowski and Wieczorkowski (Commun Stat Theory Methods 46(5):2212–2231, 2017), extending it to several new sampling schemes and, more importantly, to a single constraint on the total expected variable cost. This approach yields solutions that are a direct generalization of the Neyman-type allocation. The structure of the solution is deciphered from explicit allocation formulas given in terms of an eigenvector \({\underline{v}}^*\) of a population-based matrix \(\mathbf{D}\). The solution we provide can be viewed as a multi-domain version of the Neyman-type allocation for multistage stratified SRSWOR schemes.
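For orientation, the sketch below illustrates the classical single-stage Neyman-type allocation under a linear cost constraint, i.e. the special case that the eigenproblem-based allocation generalizes. It is not code from the paper; the stratum weights, standard deviations, unit costs and budget are generic textbook quantities chosen only for the example.

```python
import numpy as np

def neyman_allocation(W, S, c, C):
    """Neyman-type allocation: minimize sum(W_h**2 * S_h**2 / n_h)
    subject to the linear cost constraint sum(c_h * n_h) = C.
    The optimum is n_h proportional to W_h * S_h / sqrt(c_h)."""
    W, S, c = map(np.asarray, (W, S, c))
    scale = C / np.sum(W * S * np.sqrt(c))
    return scale * W * S / np.sqrt(c)

# Example: three strata with unequal variability and unit costs.
W = np.array([0.5, 0.3, 0.2])       # stratum weights N_h / N
S = np.array([10.0, 20.0, 40.0])    # stratum standard deviations
c = np.array([1.0, 2.0, 4.0])       # per-unit sampling costs
n = neyman_allocation(W, S, c, C=1000.0)
print(np.round(n, 1), "total cost:", round(float(np.sum(c * n)), 1))
```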


2.

In the context of causal mediation analysis, the prevailing notions of direct and indirect effects are based on nested counterfactuals. These can be problematic with regard to interpretation and identifiability, especially when the mediator is a time-dependent process and the outcome is a survival or, more generally, a time-to-event outcome. We propose and discuss an alternative definition of mediated effects that does not suffer from these problems and is more transparent than the current alternatives. Our proposal is based on the extended graphical approach of Robins and Richardson (in: Shrout (ed) Causality and psychopathology: finding the determinants of disorders and their cures, Oxford University Press, Oxford, 2011), in which treatment is decomposed into different components, or aspects, acting along different causal paths that correspond to real-world mechanisms. This is an interesting alternative motivation for any causal mediation setting, but especially for survival outcomes. We give assumptions under which such alternative mediated effects are identified, leading to the familiar mediation g-formula (Robins in Math Model 7:1393, 1986); this implies that a number of available estimation methods can be applied.
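As a point of reference, a schematic single-time-point form of the mediation g-formula mentioned above can be written as follows; the notation (treatment A, mediator M, outcome Y, baseline confounders C) is generic and this static form is only meant to fix ideas, not to reproduce the paper's time-dependent setting:

\[
E\bigl[Y\{a, M(a')\}\bigr] \;=\; \sum_{c}\sum_{m} E\bigl[Y \mid A=a,\, M=m,\, C=c\bigr]\, P\bigl(M=m \mid A=a',\, C=c\bigr)\, P(C=c).
\]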


3.
Hu T, Nan B, Lin X, Lifetime Data Analysis 25(3):480–506, 2019

The cross-ratio is an important local measure of the strength of dependence among correlated failure times. If a covariate is available, it may be of scientific interest to understand how the cross-ratio varies with the covariate as well as with the time components. Motivated by the Tremin study, in which the dependence between age at a marker event reflecting early lengthening of menstrual cycles and age at menopause may be affected by age at menarche, we propose a proportional cross-ratio model specified through a baseline cross-ratio function and a multiplicative covariate effect. Assuming a parametric model for the baseline cross-ratio, we generalize the pseudo-partial likelihood approach of Hu et al. (Biometrika 98:341–354, 2011) to the joint estimation of the baseline cross-ratio and the covariate effect. We show that the proposed parameter estimator is consistent and asymptotically normal. The finite-sample performance of the proposed technique is examined in simulation studies. In addition, the proposed method is applied to the Tremin study to assess the dependence between age at the marker event and age at menopause adjusting for age at menarche, and to the Australian twin data to estimate the effect of zygosity on the cross-ratio for age at appendicitis between twin pairs.
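To fix notation for readers unfamiliar with the quantity being modelled, the cross-ratio of a bivariate survival function \(S(t_1,t_2)\) and a proportional covariate-effect specification of the general kind described above can be sketched as follows; the symbols are generic and the exponential form of the covariate effect is illustrative, not taken from the paper:

\[
\theta(t_1,t_2) \;=\; \frac{S(t_1,t_2)\,\partial^2 S(t_1,t_2)/\partial t_1\,\partial t_2}
{\bigl[\partial S(t_1,t_2)/\partial t_1\bigr]\bigl[\partial S(t_1,t_2)/\partial t_2\bigr]},
\qquad
\theta(t_1,t_2 \mid z) \;=\; \theta_0(t_1,t_2)\, e^{\beta^\top z}.
\]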


4.

Motivated by penalized likelihood maximization in complex models, we study optimization problems in which neither the function to optimize nor its gradient has an explicit expression, but the gradient can be approximated by a Monte Carlo technique. We propose a new algorithm based on a stochastic approximation of the proximal-gradient (PG) algorithm. This new algorithm, named stochastic approximation PG (SAPG), combines a stochastic gradient descent step, which, roughly speaking, computes a smoothed approximation of the gradient along the iterations, with a proximal step. The choice of the step size and of the Monte Carlo batch size for the stochastic gradient descent step in SAPG is discussed. Our convergence results cover both biased and unbiased Monte Carlo approximations. While the convergence analysis of some classical Monte Carlo approximations of the gradient has already been addressed in the literature (see Atchadé et al. in J Mach Learn Res 18(10):1–33, 2017), the convergence analysis of SAPG is new. Practical implementation is discussed, and guidelines for tuning the algorithm are given. The two algorithms are compared on a linear mixed effects model as a toy example, and a more challenging application to high-dimensional nonlinear mixed effects models is presented using a pharmacokinetic data set that includes genomic covariates. To the best of our knowledge, our work provides the first convergence result for a numerical method designed to solve penalized maximum likelihood in a nonlinear mixed effects model.
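As an illustration of the kind of iteration involved, here is a schematic stochastic-approximation proximal-gradient loop for an L1-penalized objective with a user-supplied Monte Carlo gradient oracle. The step-size and averaging schedules, the batch-size rule and the toy example are placeholders of my own, not the tuning guidelines of the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal map of tau * ||x||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def sapg(theta0, mc_grad, lam, n_iter=500, gamma0=0.1, delta0=0.5):
    """Schematic stochastic-approximation proximal-gradient loop.

    theta0  : initial parameter vector
    mc_grad : callable(theta, batch_size) returning a Monte Carlo estimate
              of the gradient of the smooth part of the objective
    lam     : weight of the L1 penalty handled by the proximal step
    """
    theta = np.asarray(theta0, dtype=float).copy()
    G = np.zeros_like(theta)                  # running smoothed gradient
    for k in range(1, n_iter + 1):
        gamma = gamma0 / np.sqrt(k)           # step size (placeholder schedule)
        delta = delta0 / k ** 0.6             # averaging weight (placeholder)
        G = (1.0 - delta) * G + delta * mc_grad(theta, batch_size=10 + k)
        theta = soft_threshold(theta - gamma * G, gamma * lam)
    return theta

# Toy check: noisy gradient of 0.5 * ||theta - mu||^2 with a sparse target mu.
mu = np.array([2.0, 0.0, -1.5, 0.0])
noisy_grad = lambda th, batch_size: (th - mu) + np.random.randn(th.size) / np.sqrt(batch_size)
print(sapg(np.zeros(4), noisy_grad, lam=0.1))
```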


5.
6.
We propose a class of methods for graphon estimation based on exploiting connections with nonparametric regression. The idea is to construct an ordering of the nodes in the network, similar in spirit to Chan & Airoldi (2014). However, rather than considering orderings based only on the empirical degree as in Chan & Airoldi (2014), we use the nearest-neighbour algorithm, which is an approximate solution to the travelling salesman problem. This algorithm can in turn handle general distances \(\hat{d}\) between the nodes, allowing us to incorporate rich information from the network. Once an ordering is constructed, we formulate a two-dimensional-grid graph-denoising problem that we solve through fused-lasso regularization. For particular choices of the metric \(\hat{d}\), we show that the corresponding two-step estimator can attain competitive rates when the true model is the stochastic block model, and when the underlying graphon is piecewise Hölder or has bounded variation.
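The following sketch shows the two-step structure in schematic form: a greedy nearest-neighbour ordering computed from a distance matrix, followed by a denoising of the permuted adjacency matrix. For simplicity a moving-average smoother stands in for the two-dimensional fused-lasso solver, and all function names and the bandwidth are illustrative only.

```python
import numpy as np

def nearest_neighbour_order(D):
    """Greedy nearest-neighbour tour through the nodes based on a
    precomputed distance matrix D (a cheap travelling-salesman heuristic)."""
    n = D.shape[0]
    order = [0]
    visited = np.zeros(n, dtype=bool)
    visited[0] = True
    for _ in range(n - 1):
        d = D[order[-1]].astype(float)
        d[visited] = np.inf
        nxt = int(np.argmin(d))
        order.append(nxt)
        visited[nxt] = True
    return np.array(order)

def two_step_graphon_estimate(A, D, h=5):
    """Reorder the adjacency matrix A by the nearest-neighbour tour on D,
    then smooth it.  A simple moving-average smoother is used here as a
    stand-in for the two-dimensional fused-lasso denoising step."""
    order = nearest_neighbour_order(D)
    Ap = A[np.ix_(order, order)].astype(float)
    n = Ap.shape[0]
    W = np.empty_like(Ap)
    for i in range(n):
        for j in range(n):
            i0, i1 = max(0, i - h), min(n, i + h + 1)
            j0, j1 = max(0, j - h), min(n, j + h + 1)
            W[i, j] = Ap[i0:i1, j0:j1].mean()
    return W, order
```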

7.
8.
Doubly truncated data appear in a number of applications, including astronomy and survival analysis. For doubly truncated data, the lifetime T is observable only when \(U \le T \le V\), where U and V are the left and right truncation times, respectively. Based on the empirical likelihood approach of Zhou (J Comput Graph Statist 14:643–656, 2005), we propose a modified EM algorithm of Turnbull (J R Stat Soc Ser B 38:290–295, 1976) to construct an interval estimator of the distribution function of T. Simulation results indicate that the empirical likelihood method can be more efficient than the bootstrap method.
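For context, below is a minimal sketch of a Turnbull-type self-consistency (fixed-point) iteration for the point estimate of the distribution of T under double truncation. It does not reproduce the modified EM or the empirical-likelihood interval construction of the paper, and the function and variable names are illustrative.

```python
import numpy as np

def npmle_doubly_truncated(t, u, v, n_iter=500, tol=1e-8):
    """Self-consistency iteration for the nonparametric MLE of the
    distribution of T under double truncation: T is observed only when
    u_i <= t_i <= v_i.  Returns the support points and the estimated pmf."""
    t, u, v = map(np.asarray, (t, u, v))
    ts, counts = np.unique(t, return_counts=True)
    counts = counts.astype(float)
    # obs[i, j] = 1 if support point ts[j] lies inside subject i's window
    obs = ((u[:, None] <= ts[None, :]) & (ts[None, :] <= v[:, None])).astype(float)
    f = counts / counts.sum()                 # start from the empirical pmf
    for _ in range(n_iter):
        F = obs @ f                           # P(U_i <= T <= V_i) under current f
        f_new = counts / (obs / F[:, None]).sum(axis=0)
        f_new /= f_new.sum()
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    return ts, f
```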

9.