Similar Literature

20 similar documents found (search time: 12 ms)
1.
This article proposes a simplification of the model for dependent binary variables presented in Cox and Snell (1989). The new model, referred to as the simplified Cox model, is developed for identically distributed and dependent binary variables. Properties of the model are presented, including expressions for the log-likelihood function and the Fisher information. Under mutual independence, a general expression for the restrictions on the parameters is derived. The simplified Cox model is illustrated using a data set from a clinical trial.

2.
Box-Behnken designs are popular with experimenters who wish to estimate a second-order model because they have three levels, are simple, and are highly efficient for the second-order model. However, there are situations in which the model is inadequate due to lack of fit caused by higher-order terms, and these designs have little ability to estimate third-order terms. Using combinations of factorial points, axial points, and complementary design points, we augment these designs and develop catalogues of third-order designs for 3–12 factors. These augmented designs can be used to estimate the parameters of a third-order response surface model. Since the aim is to make the most of a situation in which the experiment was designed for an inadequate model, the designs are clearly suboptimal and not rotatable for the third-order model, but they can still provide useful information.
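The base design being augmented here is the standard Box-Behnken construction: for each pair of factors, a full ±1 factorial with all other factors held at 0, plus center runs. A minimal sketch of that base construction (not the paper's augmentation scheme) is:

```python
from itertools import combinations
import numpy as np

def box_behnken(k, n_center=3):
    """Box-Behnken design for k factors: a +/-1 factorial in each pair of
    factors with the remaining factors held at 0, plus n_center center runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k] * n_center        # center points
    return np.array(runs)

D = box_behnken(3)
print(D.shape)  # 3 factor pairs x 4 sign combinations + 3 center runs = (15, 3)
```

Each non-center run varies exactly two factors, which is why these designs have three levels per factor yet far fewer runs than a full 3^k factorial.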

3.

Formulas for A- and C-optimal allocations for binary factorial experiments in the context of generalized linear models are derived. Since the optimal allocations depend on GLM weights, which are often unknown, a minimax strategy is considered and shown to be simple to apply to factorial experiments. Efficiency is used to evaluate the resulting designs. In some cases the minimax design equals the optimal design; in others no general conclusion can be drawn. An example of a two-factor logit model suggests that the minimax design performs well, and often better than a uniform allocation.

4.
This article considers model selection procedures based on choosing the model with the largest maximized log-likelihood minus a penalty, when key parameters are restricted to lie in a closed interval. The main emphasis is on how these penalties might be chosen in small samples to give good properties of the resultant procedure. We illustrate two model selection problems in the context of Box–Cox transformations and their application to the linear regression model. Simulation results for both problems indicate that the new procedure clearly dominates existing procedures in terms of having higher probabilities of correctly selecting the true model.
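The paper's small-sample penalties are specific to its procedure; as a generic sketch of the "maximized log-likelihood minus a penalty" criterion in the Box–Cox setting, one can compare fixed transformations against a freely estimated exponent, charging the estimated model an assumed penalty for its extra parameter:

```python
import numpy as np

def boxcox_loglik(y, lam):
    """Profile log-likelihood of the Box-Cox transform, up to an additive constant."""
    n = len(y)
    z = np.log(y) if lam == 0 else (y**lam - 1.0) / lam
    return -0.5 * n * np.log(z.var()) + (lam - 1.0) * np.log(y).sum()

def select_model(y, penalty=2.0):
    """Choose among the log model (lambda = 0), the linear model (lambda = 1),
    and a freely estimated lambda, which pays `penalty` for its extra parameter."""
    grid = np.linspace(-2.0, 2.0, 81)
    scores = {
        "log":    boxcox_loglik(y, 0.0),
        "linear": boxcox_loglik(y, 1.0),
        "boxcox": max(boxcox_loglik(y, l) for l in grid) - penalty,
    }
    return max(scores, key=scores.get)

rng = np.random.default_rng(1)
y = np.exp(rng.normal(size=200))  # log-normal data: the log model should be competitive
print(select_model(y))
```

The value `penalty=2.0` is an assumption (the AIC-style charge per parameter); the paper's point is precisely that better small-sample choices exist.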

5.
This paper presents three small-sample tests for heteroscedasticity among regression disturbances. The power of these tests is compared with that of two leading tests for this hypothesis, one by Goldfeld and Quandt [5] and the other by Theil [17]. We also provide a heuristic method for selecting the number of middle observations to be deleted in Goldfeld-Quandt type tests.

6.
Design of experiments is considered for the situation where estimation of the slopes of a response surface is the main interest. Under the D-minimax criterion, the objective is to minimize the generalized variance of the estimated axial slopes at a point maximized over all points in the region of interest in the factor space. For the third-order model over spherical regions, the D-minimax designs are derived in two and three dimensions. The efficiencies of some two- and three-dimensional designs available in the literature are also investigated.

7.
Vining and co-workers have used plots of the prediction variance trace (PVT) along the so-called prediction rays to compare mixture designs in a constrained region R. In the present paper, we propose a method for describing the distribution of the prediction variance within the region R by using quantile plots. More comprehensive comparisons between mixture designs are possible through the proposed plots than with the PVT plots. The utility of the quantile plots is illustrated with a four-component fertilizer experiment that was initiated in São Paulo, Brazil.

8.
9.
Supersaturated designs are factorial designs in which the number of potential effects is greater than the run size. They are commonly used in screening experiments, with the aim of identifying the dominant active factors at low cost. However, the analysis of such designs with non-normal responses is a poorly developed research field. In this article, we develop a variable selection strategy through a modification of the PageRank algorithm, which is commonly used in the Google search engine for ranking web pages. The proposed method incorporates an appropriate information-theoretic measure into this algorithm and, as a result, can be used efficiently for factor screening. A noteworthy advantage of the procedure is that it allows supersaturated designs to be used for analyzing discrete data, so a generalized linear model is assumed. A thorough simulation study, in which the Type I and Type II error rates are computed for a wide range of underlying models and designs, shows the approach to be quite advantageous and effective.
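The paper's contribution is the information-theoretic weighting of the graph; the backbone it modifies is the standard PageRank power iteration, which in a generic form (not the authors' exact weighting) looks like this:

```python
import numpy as np

def pagerank(A, d=0.85, tol=1e-10):
    """Power iteration for PageRank scores on adjacency matrix A, where
    A[i, j] = 1 means node i links to node j; dangling nodes spread rank
    uniformly, and d is the damping factor."""
    n = A.shape[0]
    out = A.sum(axis=1, keepdims=True)
    # row-stochastic transition matrix; rows with no out-links become uniform
    P = np.where(out > 0, A / np.where(out == 0, 1, out), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1.0 - d) / n + d * (P.T @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
r = pagerank(A)
print(r)  # node 2, with two in-links, receives the largest score
```

For factor screening, the idea is to replace the 0/1 link matrix with pairwise measures of association between factors and the response, so that highly ranked nodes correspond to candidate active factors.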

10.
Recently Beh and Farver investigated and evaluated three non-iterative procedures for estimating the linear-by-linear parameter of an ordinal log-linear model. The study demonstrated that these non-iterative techniques provide estimates that are, for most types of contingency tables, statistically indistinguishable from estimates from Newton's unidimensional algorithm. Here we show how two of these techniques are related using the Box–Cox transformation. We also show that by using this transformation, accurate non-iterative estimates are achievable even when a contingency table contains sampling zeros.

11.
This paper presents a simple and exact test for detecting a monotonic relation between the mean and the variance in linear regression through the origin. The test combines uncorrelated Theil residuals with the Goldfeld-Quandt peak test. A numerical example is provided to illustrate the method, and a simulation experiment compares its empirical power with those of existing tests.

12.
This paper discusses deviance residual approximations in von Mises regression models. By using a relationship between the von Mises and the wrapped normal distributions, the paper shows that the deviance component of the von Mises distribution is approximately a linear function of the standard normal distribution. Two standardized forms are proposed for the deviance residual, and a simulation study is performed to compare the approximation of the proposed residuals to the standard normal distribution. An illustrative example is given.

13.
14.
Gnanadesikan (1977) illustrates the utility of the power transformations considered by Moore and Tukey (1954), Box and Cox (1964), and Andrews, Gnanadesikan, and Warner (1971). These transformations have been used to obtain and assess both the marginal and joint normality of the underlying distributions. This paper investigates the utility of this procedure in defining homoscedastic transformations in multivariate populations.

15.
The paper describes the use of frequentist and Bayesian shared-parameter joint models of longitudinal measurements of prostate-specific antigen (PSA) and the risk of prostate cancer (PCa). The motivating dataset corresponds to the screening arm of the Spanish branch of the European Randomized Screening for Prostate Cancer study. The results show that PSA is highly associated with the risk of being diagnosed with PCa and that there is an age-varying effect of PSA on PCa risk. Both the frequentist and Bayesian paradigms produced very close parameter estimates and subsequent 95% confidence and credibility intervals. Dynamic estimations of disease-free probabilities obtained using Bayesian inference highlight the potential of joint models to guide personalized risk-based screening strategies.

16.
Sa and Edwards (1993) first proposed the Multiple Comparisons with a Control problem in Response Surface Methodology. They provided an exact solution for one predictor variable and a conservative solution when the number of predictor variables exceeds one. Merchant et al. (1998) improved the solution for the latter case. This article improves Merchant et al.'s solution for the case of rotatable designs in two predictor variables.

17.
Various methods have been proposed for smoothing under a monotonicity constraint. We review the literature and implement an approach to monotone smoothing with B-splines for a generalized linear model response. The approach is expressed as a quadratic programming problem and is easily solved using the statistical software R. In a simulation study, we find that the approach performs better than other approaches, with much faster computation time. It can also be used for smoothing under other shape constraints or mixed constraints. Supplementary materials, comprising the appendices and R code to implement the developed approach, are available online.
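The authors solve a quadratic program over B-spline coefficients in R. As a self-contained illustration of monotonicity-constrained least squares, here is the much simpler pool-adjacent-violators algorithm (a special case of the same constrained-fitting idea, not the paper's B-spline QP):

```python
def pava(y, w=None):
    """Pool-adjacent-violators: weighted least-squares fit of y under a
    non-decreasing constraint. Returns the fitted values as a list of floats."""
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    vals, wts, cnt = [], [], []            # current blocks: value, weight, size
    for i in range(n):
        vals.append(float(y[i])); wts.append(w[i]); cnt.append(1)
        # merge adjacent blocks while the monotone constraint is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            wv = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / wv
            wts[-2] = wv; cnt[-2] += cnt[-1]
            vals.pop(); wts.pop(); cnt.pop()
    out = []
    for v, k in zip(vals, cnt):
        out.extend([v] * k)
    return out

print(pava([1, 3, 2, 4]))  # the violating pair (3, 2) pools to 2.5
```

The B-spline QP generalizes this by imposing the monotonicity constraint on spline coefficients rather than directly on fitted values, which yields a smooth rather than a step-shaped fit.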

18.
19.
Motivated by involvement in an intervention study, the paper proposes a robust, heteroscedastic generalization of what is popularly known as Cohen's d. The approach has the additional advantage of being readily extended to situations where the goal is to compare more than two groups. The method arises quite naturally from a regression perspective in conjunction with a robust version of explanatory power. Moreover, it provides a single numeric summary of how the groups compare, in contrast to other strategies aimed at dealing with heteroscedasticity. Kulinskaya and Staudte studied a heteroscedastic measure of effect size similar to the one proposed here, but their measure depends on the sample sizes, making it difficult for applied researchers to interpret the results. The approach used here is based on a generalization of Cohen's d that obviates the issue of unequal sample sizes. Simulations and illustrations demonstrate that the new measure of effect size can make a practical difference regarding the conclusions reached.
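The paper's measure replaces means and variances with robust analogues; for orientation, the non-robust heteroscedastic analogue of Cohen's d, which standardizes by the average of the two group variances instead of a pooled estimate, is:

```python
import numpy as np

def het_d(x, y):
    """Heteroscedastic analogue of Cohen's d: the mean difference divided by
    the square root of the average of the two group variances (no pooling,
    so unequal variances do not distort the denominator)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    s2 = (x.var(ddof=1) + y.var(ddof=1)) / 2.0
    return (x.mean() - y.mean()) / np.sqrt(s2)

print(het_d([5, 6, 7], [1, 2, 3]))  # mean difference 4, each variance 1 -> 4.0
```

A robust version in the paper's spirit would swap in trimmed means and Winsorized variances; the key property shared with this sketch is that the denominator does not pool across groups, so heteroscedasticity does not bias the standardization.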

20.
In this paper, the Schwarz Information Criterion (SIC) is proposed to locate a change point in the simple linear regression model, as well as in the multiple linear regression model. The method is then applied to a financial data set, and a change point is successfully detected.
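A minimal sketch of the SIC search for a single change point in simple linear regression follows; the penalty constants below are assumptions (AIC/BIC-style parameter counts), whereas the paper derives the exact criterion:

```python
import numpy as np

def seg_sic(x, y):
    """n * log(sigma2_hat) for an OLS line fit to (x, y). The additive
    constants of SIC cancel when comparing splits, so only this term plus
    the parameter penalty matters."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return n * np.log(resid @ resid / n)

def change_point(x, y, min_seg=3):
    """Return the split index minimizing total SIC, or None if the
    no-change model wins."""
    n = len(y)
    best_k, best = None, seg_sic(x, y) + 2 * np.log(n)   # single-line model
    for k in range(min_seg, n - min_seg):
        cand = seg_sic(x[:k], y[:k]) + seg_sic(x[k:], y[k:]) + 4 * np.log(n)
        if cand < best:
            best_k, best = k, cand
    return best_k

rng = np.random.default_rng(0)
x = np.arange(40, dtype=float)
y = np.where(x < 20, x, 3.0 * x - 40.0) + rng.normal(scale=0.1, size=40)
print(change_point(x, y))  # the slope changes at index 20
```

The same scan extends to multiple regression by widening the design matrix inside `seg_sic`.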


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号