Similar Documents
20 similar documents found (search time: 593 ms)
1.
Many time series encountered in practice are nonstationary, and instead are often generated from a process with a unit root. Because of the process of data collection or the practice of researchers, time series used in analysis and modeling are frequently obtained through temporal aggregation. As a result, the series used in testing for a unit root are often time series aggregates. In this paper, we study the effects of the use of aggregate time series on the Dickey–Fuller test for a unit root. We start by deriving a proper model for the aggregate series. Based on this model, we find the limiting distributions of the test statistics and illustrate how the tests are affected by the use of aggregate time series. The results show that those distributions shift to the right and that this effect increases with the order of aggregation, causing a strong impact both on the empirical significance level and on the power of the test. To correct this problem, we present tables of critical points appropriate for the tests based on aggregate time series and demonstrate their adequacy. Examples illustrate the conclusions of our analysis.
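The aggregation effect can be illustrated with a small simulation. The sketch below (a minimal illustration, not the paper's method: pure Python, a no-constant Dickey-Fuller regression, and hypothetical helper names `df_t_stat` and `aggregate`) compares the DF t-statistic on a simulated random walk with the statistic on its order-3 aggregate formed by non-overlapping sums:

```python
import random

def df_t_stat(y):
    """Dickey-Fuller t-statistic: regress dy_t on y_{t-1} with no constant."""
    x = y[:-1]
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    sxx = sum(v * v for v in x)
    beta = sum(a * b for a, b in zip(x, dy)) / sxx
    resid = [d - beta * v for d, v in zip(dy, x)]
    s2 = sum(e * e for e in resid) / (len(dy) - 1)  # residual variance
    return beta / (s2 / sxx) ** 0.5

def aggregate(y, m):
    """Flow-type temporal aggregation: non-overlapping sums of order m."""
    return [sum(y[i:i + m]) for i in range(0, len(y) - len(y) % m, m)]

random.seed(1)
walk = [0.0]
for _ in range(1200):                    # simulate a unit-root process
    walk.append(walk[-1] + random.gauss(0, 1))

t_basic = df_t_stat(walk)
t_agg = df_t_stat(aggregate(walk, 3))    # statistic on the order-3 aggregate
print(round(t_basic, 2), round(t_agg, 2))
```

Repeating this over many replications would trace out the rightward shift of the null distribution that the paper documents; standard DF critical values would then be inappropriate for the aggregate series.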

2.
The paper is largely concerned with twenty-one possible methods of sampling a plane area, with points as sampling units, for the purpose of estimating a portion of this area having certain defined characteristics. These methods result from a combination, two at a time, of random, stratified and systematic sampling in two perpendicular directions, with or without alignment of the sampled points. Eleven of these methods involve systematic sampling in one or both directions. The estimate of the proportion of the area of interest is simply the proportion of points in the total area, falling within the area having the defined characteristics. For each method the variance function is derived. Fourteen different types of space covariance functions are involved in these variance functions.

3.
A common approach to the design of an acceptance sampling plan is to require that the operating characteristic (OC) curve should pass through two designated points that would fix the curve in accordance with a desired degree of discrimination. This paper presents a search procedure for the selection of double sampling inspection plans of type DSP-(0, 1) for specified two points on the OC curve, namely acceptance quality limit, producer's risk, limiting quality and consumer's risk. Selection of the plans is discussed for both the cases of fraction non-conforming and the number of non-conformities per unit.

4.
By running the life tests at higher stress levels than normal operating conditions, accelerated life testing quickly yields information on the lifetime distribution of a test unit. The lifetime at the design stress is then estimated through extrapolation using a regression model. In constant-stress testing, a unit is tested at a fixed stress level until failure or the termination time point of the test, while step-stress testing allows the experimenter to gradually increase the stress levels at some pre-fixed time points during the test. In this article, the optimal k-level constant-stress and step-stress accelerated life tests are compared for the exponential failure data under Type-I censoring. The objective is to quantify the advantage of using the step-stress testing relative to the constant-stress one. A log-linear relationship between the mean lifetime parameter and stress level is assumed and the cumulative exposure model holds for the effect of changing stress in step-stress testing. The optimal design point is then determined under C-optimality, D-optimality, and A-optimality criteria. The efficiency of step-stress testing compared to constant-stress testing is discussed in terms of the ratio of optimal objective functions based on the information matrix.
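The constant-stress ingredients of this setup are easy to sketch. Below is a minimal simulation (illustrative only; the coefficients `a`, `b` and the stress levels are made-up values, and only the single-stress exponential MLE is shown, not the paper's optimal-design comparison): exponential lifetimes with a log-linear mean-stress relationship, Type-I censoring at time `tau`, and the standard MLE total-time-on-test / number-of-failures for the mean at each stress:

```python
import math
import random

random.seed(7)

def simulate_constant_stress(theta, tau, n):
    """Type-I censored exponential sample: (total time on test, failure count)."""
    ttt, r = 0.0, 0
    for _ in range(n):
        t = random.expovariate(1.0 / theta)
        if t <= tau:
            ttt += t          # observed failure time
            r += 1
        else:
            ttt += tau        # censored at the termination point
    return ttt, r

# Log-linear life-stress relationship; a, b are illustrative values
a, b = 5.0, -2.0
for x in [1.0, 1.5, 2.0]:
    theta = math.exp(a + b * x)              # true mean life at stress x
    ttt, r = simulate_constant_stress(theta, tau=20.0, n=200)
    theta_hat = ttt / r                      # exponential MLE under Type-I censoring
    print(x, round(theta, 2), round(theta_hat, 2))
```

Extrapolating `log(theta_hat)` back to the design stress via the fitted log-linear model is the step whose precision the optimality criteria in the article are designed to control.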

5.
Before a surrogate end point can replace a final (true) end point in the evaluation of an experimental treatment, it must be formally 'validated'. The validation will typically require large numbers of observations. It is therefore useful to consider situations in which data are available from several randomized experiments. For two normally distributed end points Buyse and co-workers suggested a new definition of validity in terms of the quality of both trial level and individual level associations between the surrogate and true end points. This paper extends this approach to the important case of two failure time end points, using bivariate survival modelling. The method is illustrated by using two actual sets of data from cancer clinical trials.

6.
Stochastic Models, 2013, 29(1): 93-107
We study the optimal control of a production process subject to a deterministic drift and to random shocks. The process mean is observable at discrete points of time after producing a batch and, at each such point, a decision is made whether to reset the process mean to some initial value or to continue with the production. The objective is to find the initial setting of the process mean and the resetting time that minimize the expected average cost per unit time. It is shown that the optimal control policy is of a control limit type. An algorithm for finding the optimal control parameters is presented.
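A control-limit policy of this kind can be explored by simulation. The sketch below is an illustration under made-up assumptions (a quadratic off-target cost, a fixed reset cost, and Gaussian shocks), not the paper's model or algorithm; it estimates the long-run average cost of resetting the mean to zero whenever it crosses a limit, and does a crude grid search for the best limit:

```python
import random

def average_cost(limit, drift=0.1, shock_sd=0.2, reset_cost=5.0,
                 off_target_cost=1.0, periods=20000, seed=0):
    """Simulate a control-limit policy: reset the mean to 0 once it crosses `limit`."""
    rng = random.Random(seed)
    mean, total = 0.0, 0.0
    for _ in range(periods):
        mean += drift + rng.gauss(0, shock_sd)  # deterministic drift + random shock
        total += off_target_cost * mean * mean  # per-period off-target penalty
        if mean >= limit:                       # control-limit rule
            mean = 0.0
            total += reset_cost
    return total / periods

# Crude one-dimensional search over candidate control limits
best = min((average_cost(L / 10), L / 10) for L in range(5, 40))
print(best)   # (estimated minimal average cost, best limit found)
```

The paper's contribution is to show that a policy of exactly this control-limit form is optimal and to give an algorithm for the parameters; the simulation merely illustrates why average cost per unit time trades off reset frequency against off-target drift.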

7.
Regression analyses are commonly performed with doubly limited continuous dependent variables; for instance, when modeling the behavior of rates, proportions and income concentration indices. Several models are available in the literature for use with such variables, one of them being the unit gamma regression model. In all such models, parameter estimation is typically performed using the maximum likelihood method, and testing inferences on the model's parameters are usually based on the likelihood ratio test. Such a test can, however, deliver quite imprecise inferences when the sample size is small. In this paper, we propose two modified likelihood ratio test statistics for use with unit gamma regressions that deliver much more accurate inferences when the number of data points is small. Numerical (i.e. simulation) evidence is presented for both fixed dispersion and varying dispersion models, and also for tests that involve nonnested models. We also present and discuss two empirical applications.

8.
In this paper we propose a family of relatively simple nonparametric tests for a unit root in a univariate time series. Almost all the tests proposed in the literature test the unit root hypothesis against the alternative that the time series involved is stationary or trend stationary. In this paper we take the (trend) stationarity hypothesis as the null and the unit root hypothesis as the alternative. The major difference with most of the tests proposed in the literature is that in all four cases the asymptotic null distribution is of a well-known type, namely standard Cauchy. In the first instance we propose four Cauchy tests of the stationarity hypothesis against the unit root hypothesis. Under H1 the test statistics involved, divided by the sample size n, converge weakly to a non-central Cauchy distribution, to one, and to the product of two normal variates, respectively. Hence, the absolute values of these test statistics converge in probability to infinity (at order n). The tests involved are therefore consistent against the unit root hypothesis. Moreover, the small sample performance of these tests is compared by Monte Carlo simulations. Furthermore, we propose two additional Cauchy tests of the trend stationarity hypothesis against the alternative of a unit root with drift.

9.
Cross-lagged panel studies are studies in which two or more variables are measured for a large number of subjects at each of several points in time. The variables divide naturally into two sets, and the purpose of the analysis is to estimate and test the cross-effects between the two sets. One approach to this analysis is to treat the cross-effects as parameters in regression equations. This study contributes to this approach by extending the regression model to a multivariate model that captures the correlation among the variables and allows the errors in the model to be correlated over time.

10.
We present two stochastic models that describe the relationship between biomarker process values at random time points, event times, and a vector of covariates. In both models the biomarker processes are degradation processes that represent the decay of systems over time. In the first model the biomarker process is a Wiener process whose drift is a function of the covariate vector. In the second model the biomarker process is taken to be the difference between a stationary Gaussian process and a time drift whose drift parameter is a function of the covariates. For both models we present statistical methods for estimation of the regression coefficients. The first model is useful for predicting the residual time from study entry to the time a critical boundary is reached while the second model is useful for predicting the latency time from the infection until the time the presence of the infection is detected. We present our methods principally in the context of conducting inference in a population of HIV infected individuals.
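For the first (Wiener) model, the residual-time prediction has a simple closed form: a Brownian motion with positive drift mu started at level z first hits a boundary c after mean time (c - z)/mu. The sketch below (a minimal illustration with hypothetical function names, not the paper's covariate-dependent estimation method) checks this prediction against a discretized simulation:

```python
import random

def predicted_residual_time(z, boundary, mu):
    """Mean first-passage time of a Wiener process with drift mu > 0 from level z."""
    return (boundary - z) / mu

def simulated_hitting_time(z, boundary, mu, sigma=1.0, dt=0.01, rng=None):
    """Euler discretization of the degradation path until the boundary is reached."""
    rng = rng or random.Random(3)
    t, x = 0.0, z
    while x < boundary:
        x += mu * dt + sigma * rng.gauss(0, dt ** 0.5)
        t += dt
    return t

mu, boundary = 0.5, 10.0
print(predicted_residual_time(0.0, boundary, mu))   # 20.0
times = [simulated_hitting_time(0.0, boundary, mu, rng=random.Random(s))
         for s in range(30)]
print(sum(times) / len(times))   # hovers near the predicted 20 on average
```

In the paper the drift is a function of the covariate vector, so the prediction is subject-specific; the formula above is the building block applied at each estimated drift.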

11.
This paper is a comment on Søren Johansen's (1994) paper on estimating systems of trending variables. The pedagogical and diagnostic value of using univariate time series methods is emphasized, together with the use of small-scale experiments that give insight into the sensitivity of unit root test procedures to misspecification of the deterministic components. The test statistics used in the likelihood approach advocated by Johansen are compared with several other test statistics, in particular those of Box and Tiao (1977) and Stock and Watson (1988). We also compare the corresponding methods to estimate pulling equilibria. We present the outcomes of two Monte Carlo experiments to illustrate some points.

13.
In this paper, we propose a design that uses a short-term endpoint for accelerated approval at interim analysis and a long-term endpoint for full approval at final analysis, with sample size adaptation based on the long-term endpoint. Two sample size adaptation rules are compared: an adaptation rule that maintains the conditional power at a prespecified level, and a step-function-type adaptation rule that better addresses the bias issue. Three testing procedures are proposed: alpha splitting between the two endpoints; alpha exhaustive between the endpoints; and alpha exhaustive with an improved critical value based on the correlation. The family-wise error rate is proved to be strongly controlled for the two endpoints, sample size adaptation, and two analysis time points with the proposed designs. We show that the alpha-exhaustive designs greatly improve the power when both endpoints are effective, and that the power difference between the two adaptation rules is minimal. The proposed design can be extended to more general settings. Copyright © 2015 John Wiley & Sons, Ltd.
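The first adaptation rule, maintaining conditional power at a prespecified level, can be sketched in its simplest form. The code below is a generic illustration (a one-sample z-test with known unit variance and conditional power computed under the observed trend), not the paper's two-endpoint procedure; the function names and the cap `n2_max` are hypothetical:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def conditional_power(z1, n1, n2, z_alpha=1.96):
    """Conditional power of a one-sample z-test (sigma = 1), given the interim
    statistic z1 from n1 observations, assuming the observed trend continues."""
    mu_hat = z1 / math.sqrt(n1)        # effect estimate implied by z1
    s1 = z1 * math.sqrt(n1)            # stage-1 cumulative sum
    num = s1 + n2 * mu_hat - z_alpha * math.sqrt(n1 + n2)
    return phi(num / math.sqrt(n2))

def adapt_n2(z1, n1, target_cp=0.9, n2_max=2000):
    """Smallest second-stage size reaching the target conditional power (capped)."""
    for n2 in range(1, n2_max + 1):
        if conditional_power(z1, n1, n2) >= target_cp:
            return n2
    return n2_max

print(adapt_n2(z1=1.5, n1=100))   # second-stage size for a promising interim result
```

The step-function rule in the paper replaces the continuous search by a small set of prespecified sample size levels, which is what mitigates the bias that a fully data-driven increase can introduce.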

14.
The efficient use of surrogate or auxiliary information has been investigated within both model-based and design-based approaches to data analysis, particularly in the context of missing data. Here we consider the use of such data in epidemiological studies of disease incidence in which surrogate measures of disease status are available for all subjects at two time points, but definitive diagnoses are available only in stratified subsamples. We briefly review methods for the analysis of two-phase studies of disease prevalence at a single time point, and we discuss the extension of four of these methods to the analysis of incidence studies. Their performance is compared with special reference to a study of the incidence of senile dementia.

15.
Determination of preventive maintenance is an important issue for systems under degradation. A typical maintenance policy calls for complete preventive repair actions at pre-scheduled times and minimal repair actions whenever a failure occurs. Under minimal repair, failures are modeled according to a nonhomogeneous Poisson process. A perfect preventive maintenance restores the system to the as-good-as-new condition. The motivation for this article was a maintenance data set related to power switch disconnectors. Two different types of failures could be observed for these systems according to their causes. The major difference between these types of failures is their costs. Assuming that the system will be in operation for an infinite time, we find the expected cost per unit of time for each preventive maintenance policy and hence obtain the optimal strategy as a function of the processes' intensities. Assuming a parametric form for the intensity function, large-sample estimates for the optimal maintenance check points are obtained and discussed.
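The cost structure behind such policies is standard and easy to sketch. The code below is a simplified single-failure-type illustration (the article distinguishes two failure types with different costs) under an assumed power-law NHPP mean function alpha * t**beta: the long-run cost rate over a PM cycle of length tau is (c_p + c_m * alpha * tau**beta) / tau, and for beta > 1 the minimizer has a closed form:

```python
def cost_rate(tau, cp, cm, alpha, beta):
    """Expected cost per unit time over a PM cycle of length tau:
    one PM action (cost cp) plus minimal repairs (cost cm each), where the
    expected failure count is the NHPP mean function alpha * tau**beta."""
    return (cp + cm * alpha * tau ** beta) / tau

def optimal_tau(cp, cm, alpha, beta):
    """Closed-form minimizer for beta > 1: cm*alpha*(beta-1)*tau**beta = cp."""
    return (cp / (cm * alpha * (beta - 1))) ** (1.0 / beta)

# Illustrative cost and intensity parameters (not from the article's data)
cp_, cm_, alpha_, beta_ = 50.0, 5.0, 0.1, 2.0
tau_star = optimal_tau(cp_, cm_, alpha_, beta_)
print(round(tau_star, 3), round(cost_rate(tau_star, cp_, cm_, alpha_, beta_), 3))
# → 10.0 10.0 for these illustrative parameters
```

Plugging large-sample estimates of alpha and beta into `optimal_tau` is the plug-in step the article's estimation results make precise.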

16.
When we consider the improvement of the functional performances that are released by new updates of the products, it is an interesting problem to revisit the existing replacement policies. From such a viewpoint, four replacement models with product update announcements (PUA for short) are given in this paper. In Model 1, the unit is replaced at time T or at the PUA over time T. In Model 2, the unit is replaced at the Kth failure or at the PUA over the Kth failure. By considering both time T and failure number K, Models 3 and 4 are obtained based on the approaches of replacement first and replacement last. We obtain the expected cost rates for the four models and discuss their optimal replacement policies analytically. Further, numerical examples are given when the time to the PUA has an exponential distribution.

17.
Preventive maintenance (PM) scheduling of units is a crucial issue that affects both the economy and the reliability of power systems. In this paper, we describe an application of statistical analysis for determining the best PM strategy in the case of parallel, series, and single-item replacement systems. A key aspect of industrial maintenance is the trade-off between the cost and the timing of performing PM operations. The goal of this study is to determine the best time for performing PM operations in each system, and also to find the number of spare parts in single-item replacement systems and the number of facilities in parallel systems, so that the average cost per unit time is minimized. In the proposed maintenance strategy, PM operations are regularly performed on the production unit in equal time intervals. Finally, three examples are presented to demonstrate the effectiveness of the proposed models.

18.
This paper considers the use of orthogonal arrays of strength two as experimental designs for fitting a surrogate model. Contrary to standard space-filling designs or Latin hypercube designs, the points of an orthogonal array of strength two are well distributed when they are projected on the two-dimensional faces of the unit cube. The aim is to determine if this property allows one to fit an accurate surrogate model when the computer response is governed by second-order interactions of some input variables. The first part of the paper is devoted to the construction of orthogonal arrays with space-filling properties. In the second part, orthogonal arrays are compared with standard designs for fitting a Gaussian process model.
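The strength-two property is concrete enough to verify directly. The sketch below (a textbook construction, not the paper's space-filling variants) builds the classical OA(9, 4, 3, 2) from linear forms over GF(3) and checks that every pair of columns contains each pair of levels equally often:

```python
from itertools import combinations, product

# OA(9, 4, 3, 2): 9 runs, 4 three-level factors, strength two,
# built from the linear forms x, y, x+y, x+2y over GF(3).
oa = [[x, y, (x + y) % 3, (x + 2 * y) % 3]
      for x, y in product(range(3), repeat=2)]

def is_strength_two(array, levels=3):
    """Check that every pair of columns contains each level pair equally often."""
    for i, j in combinations(range(len(array[0])), 2):
        counts = {}
        for row in array:
            key = (row[i], row[j])
            counts[key] = counts.get(key, 0) + 1
        if len(counts) != levels ** 2 or len(set(counts.values())) != 1:
            return False
    return True

print(is_strength_two(oa))  # True
```

Rescaling the 9 runs into the unit cube gives exactly the well-distributed two-dimensional projections the abstract refers to, which is why such arrays are candidates for fitting models driven by second-order interactions.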

19.
In this paper we consider a system which has three subsystems A, B and C. A and B are one-unit and two-unit systems, respectively, and C is exposed to a damage process. The units of B have exponential lifetimes. In Model 1, A has a general lifetime and the damage process of C is Poisson; in Model 2, A has an exponential lifetime and C is exposed to a renewal damage process. Introducing a repair facility which repairs all the failures one by one, this paper presents the joint Laplace-Stieltjes transforms of the up and down times. Marginal down-time distributions are calculated when there exists a repair facility for every damage.

20.
Circular data are observations that are represented as points on a unit circle. Times of day and directions of wind are two such examples. In this work, we present a Bayesian approach to regress a circular variable on a linear predictor. The regression coefficients are assumed to have a nonparametric distribution with a Dirichlet process prior. The semiparametric Bayesian approach gives added flexibility to the model and is useful especially when the likelihood surface is ill behaved. Markov chain Monte Carlo techniques are used to fit the proposed model and to generate predictions. The method is illustrated using an environmental data set.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号