Similar Literature
20 similar records found.
1.
We analyze Poisson regression when covariates contain measurement errors and multiple potential instrumental variables are available. Without empirical knowledge for selecting the most suitable variable as an instrument, we propose a novel model-averaging approach to resolve this issue. We prescribe an implementation and establish its optimality in terms of minimizing prediction risk. We further show that, as long as one model among all potential instrumental variable models is correctly specified, our method leads to consistent prediction. The performance of our method is illustrated through simulations and a movie sales example.
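As a rough illustration of the model-averaging idea (not the authors' exact estimator or weight criterion), the sketch below fits a two-stage Poisson regression with each candidate instrument and then chooses averaging weights on a hold-out split; all data and names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
x_true = rng.normal(size=n)                      # latent covariate
z = np.column_stack([x_true + rng.normal(scale=s, size=n)
                     for s in (0.3, 0.6, 1.0)])  # three candidate instruments
w = x_true + rng.normal(scale=0.5, size=n)       # covariate with measurement error
y = rng.poisson(np.exp(0.2 + 0.8 * x_true))

train = np.arange(n) < 400
preds = []
for j in range(z.shape[1]):
    # Stage 1: project the error-prone covariate on instrument j.
    Zj = np.column_stack([np.ones(n), z[:, j]])
    g = np.linalg.lstsq(Zj[train], w[train], rcond=None)[0]
    xhat = Zj @ g
    # Stage 2: Poisson maximum likelihood with the projected covariate.
    X = np.column_stack([np.ones(n), xhat])
    b = np.zeros(2)
    for _ in range(25):                          # Newton iterations
        mu = np.exp(X[train] @ b)
        grad = X[train].T @ (y[train] - mu)
        hess = X[train].T @ (X[train] * mu[:, None])
        b += np.linalg.solve(hess, grad)
    preds.append(np.exp(X @ b))
P = np.column_stack(preds)

def risk(wts):
    # Hold-out prediction risk of the weighted average of the models.
    return np.mean((y[~train] - P[~train] @ wts) ** 2)

cons = ({'type': 'eq', 'fun': lambda v: v.sum() - 1},)
res = minimize(risk, np.full(3, 1 / 3), bounds=[(0, 1)] * 3, constraints=cons)
print("averaging weights:", res.x.round(3))
```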

2.
In this research, we propose simultaneous confidence intervals for all pairwise multiple comparisons in a two-way unbalanced design with unequal variances, using a parametric bootstrap approach. Simulation results show that the Type I error of the multiple comparison test is close to the nominal level even for small samples. They also show that the proposed method outperforms the Tukey-Kramer procedure when variances are heteroscedastic and group sizes are unequal.
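A minimal sketch of the parametric-bootstrap idea, reduced to a one-way heteroscedastic layout for brevity (the paper treats the two-way unbalanced case); the group sizes, variances, and pivot below are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
groups = [rng.normal(0, s, n) for s, n in [(1, 8), (2, 12), (3, 6)]]
m = [g.mean() for g in groups]
v = [g.var(ddof=1) for g in groups]
n = [len(g) for g in groups]
pairs = list(combinations(range(3), 2))

B = 5000
stats = np.empty(B)
for b in range(B):
    # Resample means and variances from the fitted normal model.
    mb = [rng.normal(0, np.sqrt(v[i] / n[i])) for i in range(3)]
    vb = [v[i] * rng.chisquare(n[i] - 1) / (n[i] - 1) for i in range(3)]
    stats[b] = max(abs(mb[i] - mb[j]) / np.sqrt(vb[i] / n[i] + vb[j] / n[j])
                   for i, j in pairs)
q = np.quantile(stats, 0.95)          # simultaneous critical value

for i, j in pairs:
    se = np.sqrt(v[i] / n[i] + v[j] / n[j])
    d = m[i] - m[j]
    print(f"mu{i}-mu{j}: [{d - q * se:.2f}, {d + q * se:.2f}]")
```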

3.
Summary. We construct empirical Bayes intervals for a large number p of means. The existing intervals in the literature assume that the variances σ_i^2 are either equal, or unequal but known. When the variances are unequal and unknown, the suggestion is typically to replace them by the unbiased estimators s_i^2. However, when p is large, there would be an advantage in 'borrowing strength' from each other. We derive double-shrinkage intervals for the means on the basis of our empirical Bayes estimators that shrink both the means and the variances. Analytical and simulation studies and application to a real data set show that, compared with the t-intervals, our intervals have higher coverage probabilities while yielding shorter lengths on average. The double-shrinkage intervals are on average shorter than the intervals from shrinking the means alone and are always no longer than the intervals from shrinking the variances alone. Also, the intervals are explicitly defined and can be computed immediately.
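A toy numeric sketch of the double-shrinkage idea, not the paper's exact empirical Bayes formulas: James-Stein-type shrinkage of the means combined with an ad hoc shrinkage of the variances toward their average, with interval half-widths from the shrunken quantities.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 50, 5
theta = rng.normal(0, 1, p)
data = theta[:, None] + rng.normal(0, 1.5, (p, n))
xbar, s2 = data.mean(axis=1), data.var(axis=1, ddof=1)

# Shrink the variances toward their average (weight chosen ad hoc here).
s2_shrunk = 0.5 * s2 + 0.5 * s2.mean()

# Empirical Bayes shrinkage of the means using the shrunken variances.
grand = xbar.mean()
tau2 = max(np.var(xbar, ddof=1) - s2_shrunk.mean() / n, 1e-8)
w = tau2 / (tau2 + s2_shrunk / n)
centers = grand + w * (xbar - grand)

half = 1.96 * np.sqrt(w * s2_shrunk / n)   # posterior-sd half-widths
cover = np.mean((theta >= centers - half) & (theta <= centers + half))
print(f"empirical coverage over the {p} means: {cover:.2f}")
```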

4.
In this paper, we propose a quantile approach to the multi-index semiparametric model for an ordinal response variable. Permitting a non-parametric transformation of the response, the proposed method achieves a root-n rate of convergence and has attractive robustness properties. Further, the proposed model allows additional indices to model the remaining correlations between the covariates and the residuals from the single index, considerably reducing the error variance and thus leading to more efficient prediction intervals (PIs). The utility of the model is demonstrated by estimating PIs for the functional status of the elderly based on data from the Second Longitudinal Study of Aging. It is shown that the proposed multi-index model provides significantly narrower PIs than competing models. Our approach can be applied to other areas in which the distribution of future observations must be predicted from ordinal response data.

5.
One of the most important issues in toxicity studies is the identification of the equivalence of treatments with a placebo. Because it is unacceptable to declare non-equivalent treatments to be equivalent, it is important to adopt a reliable statistical method that properly controls the family-wise error rate (FWER). In dealing with this issue, it is important to keep in mind that overestimating toxicity equivalence is a more serious error than underestimating it; consequently, asymmetric loss functions are more appropriate than symmetric ones. Recently, Tao, Tang & Shi (2010) developed a new procedure based on an asymmetric loss function. However, their procedure is somewhat unsatisfactory because it assumes that the variances at the various dose levels are known, an assumption that is restrictive in many applications. In this study we propose an improved approach based on asymmetric confidence intervals without the restrictive assumption of known variances. The asymmetry guarantees reliability in the sense that the FWER is well controlled. Although our procedure is developed assuming that the variances at the various dose levels are unknown but equal, simulation studies show that it still performs quite well when the variances are unequal.

6.
We begin by describing how to find the limits of confidence intervals by using a few permutation tests of significance. Next, we demonstrate how the adaptive permutation test, which maintains its level of significance, produces confidence intervals that maintain their coverage probabilities. By inverting adaptive tests, adaptive confidence intervals can be found for any single parameter in a multiple regression model. These adaptive confidence intervals are often narrower than the traditional confidence intervals when the error distributions are long-tailed or skewed. We show how much reduction in width can be achieved for the slopes in several multiple regression models and for the interaction effect in a two-way design. An R function that can compute these adaptive confidence intervals is described, and instructions are provided for its use with real data.
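For intuition, here is a sketch of the first step described above: finding interval limits by inverting an ordinary permutation test for a regression slope via bisection on the p-value (the adaptive weighting that narrows the intervals is omitted). All settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=40)
y = 2.0 * x + rng.standard_t(df=3, size=40)    # long-tailed errors

def pval(b0, B=2000):
    # Under H0: slope = b0, the residuals y - b0*x are exchangeable w.r.t. x.
    r = y - b0 * x
    obs = abs(np.corrcoef(x, r)[0, 1])
    perm = np.array([abs(np.corrcoef(x, rng.permutation(r))[0, 1])
                     for _ in range(B)])
    return (1 + np.sum(perm >= obs)) / (B + 1)

def endpoint(lo, hi, alpha=0.05, iters=25):
    # Bisection: lo is on the rejected side, hi on the accepted side.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if pval(mid) < alpha else (lo, mid)
    return 0.5 * (lo + hi)

bhat = np.polyfit(x, y, 1)[0]
print("95% CI:", endpoint(bhat - 5, bhat), endpoint(bhat + 5, bhat))
```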

7.
Under certain conditions, many multiple contrast tests based on differences of treatment means can also be conveniently expressed in terms of ratios. In this paper, a Williams-type test for trend is defined in terms of ratios to the control, for ease of interpretation and to obtain directly comparable confidence intervals. Simultaneous confidence intervals for ratios, expressed as percentages, are particularly helpful for interpretation in the case of multiple endpoints. Methods for constructing simultaneous confidence intervals are discussed under both homogeneous and heterogeneous error variances. This approach is available in the R extension package mratios. The proposed method is illustrated by testing for trend in an immunotoxicity study with several endpoints.
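As a single-comparison sketch of the ratio-to-control idea, the code below computes an unadjusted Fieller-type confidence interval for the ratio of one treatment mean to the control mean; the paper's simultaneous Williams-type intervals additionally use multiplicity-adjusted critical points (as implemented in mratios). Data are simulated.

```python
import numpy as np
from scipy.stats import t as tdist

rng = np.random.default_rng(7)
control = rng.normal(10, 2, 12)
treat = rng.normal(13, 2, 12)

m0, m1 = control.mean(), treat.mean()
s2 = (control.var(ddof=1) * 11 + treat.var(ddof=1) * 11) / 22  # pooled
n0 = n1 = 12
tq = tdist.ppf(0.975, df=22)

# Fieller: {r : (m1 - r*m0)^2 <= t^2 s^2 (1/n1 + r^2/n0)} is a quadratic in r.
a = m0 ** 2 - tq ** 2 * s2 / n0
bq = -2 * m0 * m1
c = m1 ** 2 - tq ** 2 * s2 / n1
disc = bq ** 2 - 4 * a * c
lo, hi = sorted([(-bq - np.sqrt(disc)) / (2 * a),
                 (-bq + np.sqrt(disc)) / (2 * a)])
print(f"ratio estimate {m1 / m0:.3f}, 95% Fieller CI [{lo:.3f}, {hi:.3f}]")
```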

8.
Communications in Statistics: Theory and Methods, 2012, 41(16-17): 2932-2943
In measurement system analysis, a relevant issue is how to find confidence intervals for the parameters used to evaluate the capability of a gauge. Approximate solutions are available in the literature, but they produce intervals so wide that they are often not effective in the decision process. In this article we introduce a new approach and, with particular reference to the parameter γ_R, i.e., the ratio of the variance due to the process to the variance due to the instrument, we show that, under quite realistic assumptions, we obtain confidence intervals narrower than those of other methods. An application to a real microelectronic case study is reported.

9.
In this article we study the coverage accuracy of one-sided bootstrap-t confidence intervals for population variances, combined with Hall's and Johnson's transformations. We compare the coverage accuracy of all suggested intervals with that of intervals based on the chi-square statistic, for variances of positively skewed distributions. In addition, we describe and discuss an application of the presented methods to measuring and analyzing revenue variability within the food retail industry. The results show that both Hall's and Johnson's transformation approaches yield good coverage accuracy for the lower-endpoint confidence intervals, better than the method based on the chi-square statistic. For the upper-endpoint confidence intervals, Hall's bootstrap-t method yields the best coverage accuracy when compared with the other methods.
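A minimal bootstrap-t sketch for a one-sided bound on a variance, without the Hall or Johnson transformations the article evaluates; the data and the plug-in standard error are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.lognormal(size=30)               # positively skewed data
n, s2 = len(x), x.var(ddof=1)

def se_var(a):
    # Plug-in standard error of the sample variance via squared deviations.
    d = (a - a.mean()) ** 2
    return d.std(ddof=1) / np.sqrt(len(a))

B = 4000
t_star = np.empty(B)
for b in range(B):
    xb = rng.choice(x, n, replace=True)
    t_star[b] = (xb.var(ddof=1) - s2) / se_var(xb)

# 95% upper confidence bound: invert the studentized bootstrap pivot.
upper = s2 - np.quantile(t_star, 0.05) * se_var(x)
print(f"sample variance {s2:.3f}, bootstrap-t upper bound {upper:.3f}")
```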

10.
The mixed effects model, in its various forms, is a common model in applied statistics. A useful strategy for fitting this model implements EM-type algorithms by treating the random effects as missing data. Such implementations, however, can be painfully slow when the variances of the random effects are small relative to the residual variance. In this paper, we apply the 'working parameter' approach to derive alternative EM-type implementations for fitting mixed effects models, which we show empirically can be hundreds of times faster than the common EM-type implementations. In our limited simulations, they also compare well with the routines in S-PLUS® and Stata® in terms of both speed and reliability. The central idea of the working parameter approach is to search for efficient data augmentation schemes for implementing the EM algorithm by minimizing the augmented information over the working parameter, and in the mixed effects setting this leads to a transfer of the mixed effects variances into the regression slope parameters. We also describe a variation for computing the restricted maximum likelihood estimate and an adaptive algorithm that takes advantage of both the standard and the alternative EM-type implementations.
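For context, here is a compact version of the standard EM implementation being accelerated, for the simplest case of a balanced one-way random-intercept model; the working-parameter reformulation itself is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)
m, k = 30, 5                              # groups, observations per group
b = rng.normal(0, 0.3, m)                 # small random-effect SD: the slow regime
y = 1.0 + b[:, None] + rng.normal(0, 1.0, (m, k))

mu, tau2, sig2 = y.mean(), 1.0, 1.0
for _ in range(2000):
    # E-step: posterior mean e_i and variance v of each random effect b_i.
    v = 1.0 / (1.0 / tau2 + k / sig2)
    e = v * (y - mu).sum(axis=1) / sig2
    # M-step: update the fixed effect and both variance components.
    mu = (y - e[:, None]).mean()
    tau2 = np.mean(e ** 2 + v)
    sig2 = np.mean((y - mu - e[:, None]) ** 2 + v)
print(f"mu={mu:.3f}  tau^2={tau2:.3f}  sigma^2={sig2:.3f}")
```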

11.
E. Brunel & A. Roche, Statistics, 2015, 49(6): 1298-1321
Our aim is to estimate the unknown slope function in the functional linear model when the response Y is real and the random function X is a second-order stationary and periodic process. We obtain our estimator by minimizing a standard (and very simple) mean-square contrast on finite-dimensional linear spaces spanned by trigonometric bases. Our approach provides a penalization procedure that automatically selects the adequate dimension, from a non-asymptotic point of view. In fact, we show that our penalized estimator reaches the optimal (minimax) rate of convergence in the sense of the prediction error. We complement the theoretical results with a simulation study and a real example that illustrate how the procedure works in practice.
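A small simulation sketch of the projection estimator described above, with the dimension m fixed by hand rather than selected by the paper's penalized criterion; the data-generating process is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)
tgrid = np.linspace(0, 1, 201)

def phi(k, t):
    # Orthonormal trigonometric (Fourier) basis on [0, 1].
    if k == 0:
        return np.ones_like(t)
    j = (k + 1) // 2
    f = np.sin if k % 2 else np.cos
    return np.sqrt(2) * f(2 * np.pi * j * t)

K, n, m = 15, 300, 7
Phi = np.array([phi(k, tgrid) for k in range(K)])       # K x grid
a = rng.normal(size=(n, K)) / np.arange(1, K + 1)       # curve coefficients
X = a @ Phi                                             # observed curves
b_true = np.array([1.0, 0.8, -0.5] + [0.0] * (K - 3))   # slope coefficients
y = a @ b_true + 0.2 * rng.normal(size=n)               # Y = <b, X> + eps

# Empirical basis coefficients of each curve by numerical integration,
# then least squares on the first m coefficients.
ahat = np.trapz(X[:, None, :] * Phi[None, :m, :], tgrid, axis=2)
b_coef = np.linalg.lstsq(ahat, y, rcond=None)[0]
b_hat = b_coef @ Phi[:m]                                # estimated slope curve
print("first estimated coefficients:", b_coef[:3].round(2))
print("sup-norm error of slope curve:", np.abs(b_hat - b_true @ Phi).max().round(3))
```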

12.
Abstract. It is well known that ignoring heteroscedasticity in regression analysis adversely affects the efficiency of estimation and renders the usual procedure for constructing prediction intervals inappropriate. In some applications, such as off-line quality control, knowledge of the variance function is also of considerable interest in its own right. Thus the modeling of variance constitutes an important part of regression analysis. A common practice in modeling variance is to assume that a certain function of the variance can be closely approximated by a function of a known parametric form. The logarithm link function is often used even when it does not fit the observed variation satisfactorily, as other alternatives may yield negative estimated variances. In this paper we propose a rich class of link functions for more flexible variance modeling, which alleviates the major difficulty of negative estimated variances. We also suggest an alternative analysis for heteroscedastic regression models that exploits the principle of "separation" discussed in Box (Signal-to-Noise Ratios, Performance Criteria and Transformation. Technometrics 1988, 30, 1-31). The proposed method does not require any distributional assumptions once an appropriate link function for modeling variance has been chosen. Unlike the analysis in Box (1988), the estimated variances and their associated asymptotic variances are found in the original metric (although a transformation has been applied to achieve separation on a different scale), making interpretation of the results considerably easier.
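To make the baseline concrete, here is a minimal sketch of heteroscedastic regression with the common log variance link (the single link the paper generalizes into a richer class); the model and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 400
x = rng.uniform(0, 2, n)
sd = np.exp(-0.5 + 0.6 * x)                 # true log-linear variance
y = 1.0 + 2.0 * x + sd * rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

w = np.ones(n)
for _ in range(10):
    # Weighted least squares for the mean parameters.
    beta = np.linalg.lstsq(X * w[:, None] ** 0.5, y * w ** 0.5, rcond=None)[0]
    r2 = (y - X @ beta) ** 2
    # Log link for the variance: regress log squared residuals on x.
    # (The intercept absorbs the E[log chi^2_1] bias of about -1.27;
    #  only relative weights matter for the WLS step.)
    lam = np.linalg.lstsq(X, np.log(r2 + 1e-12), rcond=None)[0]
    w = 1.0 / np.exp(X @ lam)
print("mean coefs:", beta.round(2), " variance (log-link) coefs:", lam.round(2))
```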

13.
In stratified sample surveys, the problem of determining the optimum allocation is well known, owing to the articles published by Tschuprow in 1923 and by Neyman in 1934. These articles derive the optimum sample sizes to be selected from each stratum, for which the sampling variance of the estimator is minimized for a fixed total cost of the survey, or the cost is minimized for a fixed precision of the estimator. If more than one characteristic is to be measured on each selected unit of the sample, that is, if the survey is a multi-response survey, then the problem of allocating the optimum sample sizes to the various strata becomes more complex because no single optimality criterion suits all the characteristics. Many authors have discussed compromise criteria that provide a compromise allocation, which is optimum for all characteristics, at least in some sense. Almost all of these authors worked out the compromise allocation by minimizing some function of the sampling variances of the estimators under a single cost constraint. A serious objection to this approach is that the variances are not unit free, so minimizing any function of the variances may not be an appropriate objective for obtaining a compromise allocation. This fact suggests the use of coefficients of variation instead of variances. In the present article, the problem of compromise allocation is formulated as a multi-objective non-linear programming problem. By linearizing the non-linear objective functions at their individual optima, the problem is approximated by an integer linear programming problem. The goal programming technique is then used to obtain a solution to the approximated problem.
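A quick numeric sketch of the allocation problem: Neyman optimum allocations computed separately for two characteristics, followed by a simple averaging compromise (a classical heuristic; the article instead formulates a goal-programming problem on coefficients of variation). Stratum figures are made up.

```python
import numpy as np

N  = np.array([400, 300, 300])        # stratum sizes
S1 = np.array([6.0, 10.0, 20.0])      # stratum SDs, characteristic 1
S2 = np.array([15.0, 4.0, 8.0])       # stratum SDs, characteristic 2
n_total = 120

def neyman(N, S, n):
    # Neyman allocation: n_h proportional to N_h * S_h.
    a = N * S
    return n * a / a.sum()

n1, n2 = neyman(N, S1, n_total), neyman(N, S2, n_total)
compromise = np.rint((n1 + n2) / 2).astype(int)
print("optimum for ch.1:", np.rint(n1).astype(int))
print("optimum for ch.2:", np.rint(n2).astype(int))
print("compromise      :", compromise)
```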

14.
Although estimating the five parameters of an unknown Generalized Normal Laplace (GNL) density by minimizing the distance between the empirical and true characteristic functions seems appealing, the approach cannot be advocated in practice. This conclusion is based on extensive numerical simulations in which a fast minimization procedure delivers poor estimators whose values are quite far from the truth. These findings are predicted by the very large values obtained for the true asymptotic variances of the estimators of the five parameters of the true GNL density.
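For concreteness, a sketch of the minimum characteristic-function-distance estimator on simulated data, taking the GNL characteristic function to be [αβ e^{iμt - σ²t²/2} / ((α - it)(β + it))]^ρ as in Reed's formulation; the grid, starting values, and bounds are arbitrary choices of this illustration. The article's point is that such fits tend to land far from the truth despite the appeal of the criterion.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
rho, alpha, beta, mu, sig = 1.5, 2.0, 3.0, 0.0, 1.0
n = 2000
# Simulate GNL draws as a normal component plus a difference of gammas.
x = (rho * mu + sig * np.sqrt(rho) * rng.normal(size=n)
     + rng.gamma(rho, 1 / alpha, n) - rng.gamma(rho, 1 / beta, n))

t = np.linspace(-2, 2, 41)
ecf = np.exp(1j * np.outer(t, x)).mean(axis=1)     # empirical cf on a grid

def gnl_cf(t, r, a, b, m, s):
    # Component-wise powers keep the principal branch well behaved.
    norm = np.exp(r * (1j * m * t - 0.5 * s ** 2 * t ** 2))
    return norm * (a / (a - 1j * t)) ** r * (b / (b + 1j * t)) ** r

def dist(p):
    return np.sum(np.abs(ecf - gnl_cf(t, *p)) ** 2)

res = minimize(dist, x0=[1.0, 1.0, 1.0, 0.1, 0.8],
               bounds=[(0.1, 5)] * 3 + [(-2, 2), (0.1, 3)])
print("estimates (rho, alpha, beta, mu, sigma):", res.x.round(2))
```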

15.
The nonlinear responses of species to environmental variability can play an important role in the maintenance of ecological diversity. Nonetheless, many models use parametric nonlinear terms which predetermine the ecological conclusions. Motivated by this concern, we study estimation of the second derivative (curvature) of the link function in a functional single-index model. Since the coefficient function and the link function are both unknown, the estimate is expressed as a nested optimization. We first estimate the coefficient function by minimizing squared error, where the link function is estimated with a Nadaraya-Watson estimator for each candidate coefficient function. The first and second derivatives of the link function are then estimated via local quadratic regression using the estimated coefficient function. In this paper, we derive a convergence rate for the curvature of the nonlinear response. In addition, we prove that the argument of the linear predictor can be estimated root-n consistently. However, practical implementation of the method requires solving a nonlinear optimization problem, and our results show that the estimates of the link function and the coefficient function are quite sensitive to the choice of starting values.

16.
Multiple outcomes are increasingly used to assess chronic disease progression. We discuss and show how desirability functions can be used to assess a patient's overall response to a treatment using multiple outcome measures, each of which may contribute unequally to the final assessment. Because judgments on disease progression and on the relative contribution of each outcome can be subjective, we propose a data-driven approach that minimizes these biases by using desirability functions with shapes and weights estimated from a given gold standard. Our method provides each patient with a meaningful overall progression score that facilitates comparison and clinical interpretation. We also extend the methodology in a novel way to monitor patients' disease progression when there are multiple time points, and we illustrate our method using a longitudinal data set from a randomized two-arm clinical trial for scleroderma patients.

17.
We propose a class of Bayesian semiparametric mixed-effects models whose distinctive feature is the randomness of the grouping of observations, which can be inferred from the data. The model can be viewed, under a more natural perspective, as a Bayesian semiparametric regression model on the log scale; hence, on the original scale, the error is a mixture of Weibull densities mixed on both parameters by a normalized generalized gamma random measure, encompassing the Dirichlet process. As an estimate of the posterior distribution of the clustering of the random-effects parameters, we consider the partition minimizing the posterior expectation of a suitable class of loss functions. As an illustrative application of our model, we consider a Kevlar fibre lifetime dataset (with censoring). We implement an MCMC scheme, obtaining posterior credibility intervals for the predictive distributions and for the quantiles of the failure times under different stress levels. Compared with a previous parametric Bayesian analysis, we obtain narrower credibility intervals and a better fit to the data. We find that there are three main clusters among the random-effects parameters, in accordance with a previous frequentist analysis.

18.
Richmond (1982) uses a linear programming approach to construct simultaneous confidence intervals for a set of estimable linear parametric functions of the normal mean vector. We present a quadratic programming approach which constructs narrower confidence intervals than the linear programming approach of Richmond (1982).

19.
Abstract. We study the coverage properties of Bayesian confidence intervals for the smooth component functions of generalized additive models (GAMs) represented using any penalized regression spline approach. The intervals are the usual generalization, to the GAM component context, of the intervals first proposed by Wahba and Silverman in 1983 and 1985, respectively. We present simulation evidence showing that these intervals have close to nominal 'across-the-function' frequentist coverage probabilities, except when the truth is close to a straight line/plane function. We extend the argument introduced by Nychka in 1988 for univariate smoothing splines to explain these results. The theoretical argument suggests that close to nominal coverage probabilities can be achieved, provided that heavy oversmoothing is avoided, so that the bias is not too large a proportion of the sampling variability. The theoretical results allow us to derive alternative intervals from a purely frequentist point of view, and to explain the impact that neglecting smoothing parameter variability has on confidence interval performance. They also suggest switching the target of inference for component-wise intervals away from smooth components in the space of the GAM identifiability constraints.

20.
A modelling approach to optimizing a multiresponse system is presented. The approach aims to identify the setting of the input variables that maximizes the degree of overall satisfaction with respect to all the responses. An exponential desirability functional form is suggested to simplify the desirability function assessment process. The proposed approach does not require any assumptions regarding the form or degree of the estimated response models and is robust to potential dependences between response variables. It also takes into consideration the difference in predictive ability as well as the relative priority among the response variables. Properties of the approach are revealed via two real examples: one classical example taken from the literature and another that the authors encountered in the steel industry.
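A minimal sketch of the exponential-desirability idea under hypothetical fitted response models: each response gets an exponential desirability score, the scores are combined by a weighted geometric mean, and the input setting maximizing the overall score is found numerically. The functional forms, targets, scales, and weights are all assumptions of this illustration (the paper's form additionally accounts for predictive ability and dependence between responses).

```python
import numpy as np
from scipy.optimize import minimize

def yhat1(x): return 10 - (x[0] - 1) ** 2 + 0.5 * x[1]     # fitted response 1
def yhat2(x): return 5 + x[0] - 0.8 * (x[1] - 2) ** 2      # fitted response 2

def desir(y, target, scale):
    # Exponential desirability: 1 at the target, decaying with distance.
    return np.exp(-abs(y - target) / scale)

def overall(x, weights=(0.6, 0.4)):
    d = [desir(yhat1(x), target=10, scale=2.0),
         desir(yhat2(x), target=5, scale=1.5)]
    # Weighted geometric mean as the overall degree of satisfaction.
    return np.prod([di ** w for di, w in zip(d, weights)])

res = minimize(lambda x: -overall(x), x0=[0.0, 0.0],
               bounds=[(-3, 3), (-3, 3)])
print("optimal setting:", res.x.round(3),
      " overall desirability:", round(overall(res.x), 3))
```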
