Similar Documents
20 similar documents found
1.
In this article, a condition-based maintenance policy is proposed for a linear consecutive-k-out-of-n:F system. The failure times of components are assumed to be independent and identically distributed. It is assumed that the component states in the system can be known at any time and that system failure is detected immediately. The preventive maintenance action is based on the number of working components in the minimal cut sets of the system: if at least one minimal cut set contains only one working component, the system is maintained preventively after a certain time interval. The proposed policy is compared with corrective-maintenance and age-based maintenance policies. As an extended case, it is assumed that the component states can be known only by inspection, while system failure is still detected immediately; the system is then inspected periodically and maintained preventively based on the system state at inspection. Numerical examples are presented to evaluate the performance of the proposed policy and to investigate the effects of the cost parameters on the expected cost rate.
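
The preventive-maintenance trigger described above can be sketched directly: for a linear consecutive-k-out-of-n:F system the minimal cut sets are the windows of k consecutive components, so the policy fires when some window contains exactly one working component. A minimal sketch (function names are illustrative, not from the article):

```python
def system_failed(states, k):
    """A linear consecutive-k-out-of-n:F system fails iff some run of
    k consecutive components has failed (state 0)."""
    run = 0
    for s in states:
        run = run + 1 if s == 0 else 0
        if run >= k:
            return True
    return False

def pm_triggered(states, k):
    """PM condition from the abstract: at least one minimal cut set
    (a window of k consecutive components) contains exactly one
    working component."""
    n = len(states)
    return any(sum(states[i:i + k]) == 1 for i in range(n - k + 1))
```

For example, with k = 3 the state vector (1, 0, 0, 1, 1) has not failed, but its first cut-set window holds a single working component, so preventive maintenance would be scheduled.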

2.
From an economic viewpoint in reliability theory, this paper addresses a scheduling replacement problem for a single operating system which works at random times on multiple jobs. The system is subject to stochastic failure, which triggers an imperfect maintenance activity depending on the failure mechanism: minimal repair for a type-I (repairable) failure, or corrective replacement for a type-II (non-repairable) failure. Three scheduling models for the system with multiple jobs are considered: a single job, N tandem jobs, and N parallel jobs. To control the deterioration process, a preventive replacement is planned at a scheduled time T or at the completion time of the jobs in each model. The objective is to determine the optimal scheduling parameters (T* or N*) that minimize the mean cost rate function over a finite time horizon for each model. A numerical example is provided to illustrate the proposed analytical model. Because the framework and analysis are general, the proposed models extend several existing results.

3.
This article presents a generalization of the imperfect sequential preventive maintenance (PM) policy with minimal repair. The system experiences one of two types of failures: a Type-I (minor) failure, rectified by a minimal repair, or a Type-II (catastrophic) failure that calls for unplanned maintenance. In each maintenance period, the system is maintained following the occurrence of a Type-II failure or at a predetermined age, whichever occurs first. At the Nth maintenance, the system is replaced rather than maintained. The imperfect PM model adopted in this study incorporates improvement factors in the hazard-rate function. Taking age-dependent minimal repair costs into consideration, the objective is to find the optimal PM and replacement schedule that minimizes the expected cost per unit time over an infinite time horizon.

4.
As a flexible alternative to the Cox model, the accelerated failure time (AFT) model assumes that the event time of interest depends on the covariates through a regression function. The AFT model with non-parametric covariate effects is investigated when variable selection is desired along with estimation. Formulated in the framework of the smoothing spline analysis of variance model, the proposed method, based on the Stute estimate (Stute, 1993 [Consistent estimation under random censorship when covariables are present, J. Multivariate Anal. 45, 89–103]), can achieve a sparse representation of the functional decomposition by utilizing a reproducing kernel Hilbert norm penalty. Computational algorithms and theoretical properties of the proposed method are investigated. The finite-sample performance of the proposed approach is assessed via simulation studies. The primary biliary cirrhosis data are analyzed for demonstration.
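
For background on the abstract above, a toy sketch of the AFT idea: the log event time is linear in the covariate, so with no censoring, ordinary least squares on the log scale recovers the regression coefficients. The censoring machinery (the Stute estimate) and the non-parametric effects are omitted here; all names and parameter values are mine:

```python
import random

def simulate_aft(n, b0, b1, rng):
    """AFT model: log(T) = b0 + b1*x + sigma*eps, with no censoring."""
    data = []
    for _ in range(n):
        x = rng.uniform(0.0, 1.0)
        log_t = b0 + b1 * x + rng.gauss(0.0, 0.5)
        data.append((x, log_t))
    return data

def ols(data):
    """Least squares on the log scale recovers (b0, b1)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data)
    sxy = sum((x - mx) * (y - my) for x, y in data)
    b1 = sxy / sxx
    return my - b1 * mx, b1
```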

5.
The hazard function describes the instantaneous rate of failure at a time t, given that the individual survives up to t. In applications, the effects of covariates produce changes in the hazard function. In survival analysis, it is of interest to identify when a change point in the hazard has occurred. In this work, covariates and censored variables are incorporated in order to estimate a change point in the Weibull regression hazard model, a generalization of the exponential model. For this more general model, maximum likelihood estimators can be obtained for the change point and for the parameters involved. A Monte Carlo simulation study shows that the model can indeed be implemented in practice. An application with clinical trial data from a treatment of chronic granulomatous disease is also included.
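
The estimation idea above can be illustrated on a stripped-down special case: a piecewise-constant (exponential) hazard with a single change point and no covariates, where the hazard MLEs given the change point have closed forms and the change point is found by a profile-likelihood grid search. This is a simplification of the article's Weibull regression model; all names are assumptions:

```python
import math

def loglik(times, events, tau, l1, l2):
    """Censored log-likelihood for a piecewise-constant hazard:
    l1 on [0, tau), l2 on [tau, inf)."""
    ll = 0.0
    for t, d in zip(times, events):
        a = min(t, tau)            # exposure under hazard l1
        b = max(t - tau, 0.0)      # exposure under hazard l2
        ll += -l1 * a - l2 * b
        if d:                      # observed failure: add log-hazard
            ll += math.log(l1 if t < tau else l2)
    return ll

def profile_mle_tau(times, events, grid):
    """Profile likelihood over a grid of candidate change points.
    Given tau, the hazard MLEs are events / exposure in each piece."""
    best = None
    for tau in grid:
        e1 = sum(min(t, tau) for t in times)
        e2 = sum(max(t - tau, 0.0) for t in times)
        d1 = sum(1 for t, d in zip(times, events) if d and t < tau)
        d2 = sum(1 for t, d in zip(times, events) if d and t >= tau)
        if e1 <= 0 or e2 <= 0 or d1 == 0 or d2 == 0:
            continue
        l1, l2 = d1 / e1, d2 / e2
        ll = loglik(times, events, tau, l1, l2)
        if best is None or ll > best[0]:
            best = (ll, tau, l1, l2)
    return best
```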

6.
For time-to-event outcomes, current methods for sample size determination are based on the proportional hazards model. However, if the proportionality assumption fails to capture the relationship between the event time and the covariates, the proportional hazards model is not suitable for analyzing survival data. The accelerated failure time model is an alternative for such data. In this article, we design a multi-regional trial for a phase III clinical study under the assumption that the relationship between the event time and the treatment effect follows the accelerated failure time model. The log-rank test is employed to deal with heterogeneous effect sizes among regions. The test statistic for the overall treatment effect is used to determine the total sample size for the multi-regional trial, and a consistency criterion is used to apportion the sample size among regions.
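
For orientation, the standard proportional-hazards baseline that this entry departs from is Schoenfeld's event-count formula for the log-rank test; under the AFT model the article replaces this calculation, so the sketch below is only the conventional reference point, not the article's method:

```python
import math
from statistics import NormalDist

def schoenfeld_events(hr, alpha=0.025, power=0.80):
    """Required number of events for a 1:1 two-arm log-rank test under
    proportional hazards (Schoenfeld's formula):
        d = 4 * (z_{1-alpha} + z_{power})^2 / (log hr)^2
    with one-sided significance level alpha."""
    z = NormalDist().inv_cdf
    return 4.0 * (z(1.0 - alpha) + z(power)) ** 2 / math.log(hr) ** 2
```

For a hazard ratio of 0.7 at one-sided alpha 0.025 and 80% power, this gives roughly 247 events in total.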

7.
This paper presents a preventive replacement problem for a system performing successive jobs of random durations while subject to stochastic shocks. The jobs cause random amounts of additive damage to the system, and the system fails whenever the cumulative damage reaches a failure threshold. When an external shock occurs, the system experiences one of two types of shock under an age-dependent maintenance mechanism: a type-I (minor) shock is rectified by a minimal repair, while a type-II (catastrophic) shock causes the system to fail. To control the deterioration process, preventive replacement is scheduled at a continuous age T or at a discrete number N of working cycles, whichever occurs first, and corrective replacement is performed immediately whenever the system fails due to either shock or damage. The optimal preventive replacement schedule that minimizes the expected cost rate is derived analytically and computed numerically. The proposed model provides a general framework for analyzing maintenance policies and extends several existing results.
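
A renewal-reward simulation makes the cost-rate objective concrete. The sketch below keeps only the cumulative-damage failure mode and the (T, N) preventive rule, omitting the external shocks and minimal repairs of the full model; all names and cost values are illustrative:

```python
import random

def simulate_cycle(rng, T, N, K, job_rate=1.0, dmg_mean=1.0,
                   c_p=1.0, c_f=5.0):
    """One renewal cycle: jobs have Exp(job_rate) durations, each adds
    exponential damage with mean dmg_mean; replace preventively at age T
    or after N jobs, correctively when cumulative damage reaches K.
    Returns (cycle length, cycle cost)."""
    t, dmg = 0.0, 0.0
    for _ in range(N):
        dt = rng.expovariate(job_rate)
        if t + dt >= T:                # age limit hit first: preventive
            return T, c_p
        t += dt
        dmg += rng.expovariate(1.0 / dmg_mean)
        if dmg >= K:                   # damage failure: corrective
            return t, c_f
    return t, c_p                      # N-th job done: preventive

def cost_rate(T, N, K, n_cycles=20000, seed=1):
    """Renewal-reward estimate of the long-run expected cost rate."""
    rng = random.Random(seed)
    total_t, total_c = 0.0, 0.0
    for _ in range(n_cycles):
        length, cost = simulate_cycle(rng, T, N, K)
        total_t += length
        total_c += cost
    return total_c / total_t
```

As a sanity check, a fragile system (tiny damage threshold K) should incur a much higher cost rate than one that effectively never fails.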

8.
The focus of this article is on simultaneous confidence bands over a rectangular covariate region for a linear regression model with k>1 covariates, for which only conservative or approximate confidence bands are available in the statistical literature stretching back to Working & Hotelling (J. Amer. Statist. Assoc. 24, 1929, 73–85). Formulas for the simultaneous confidence levels of the hyperbolic and constant-width bands are provided. These involve only a k-dimensional integral; it is unlikely that the simultaneous confidence levels can be expressed as an integral of dimension less than k. These formulas allow the construction, for the first time, of exact hyperbolic and constant-width confidence bands, at least for small k (>1), by numerical quadrature. The hyperbolic and constant-width bands are then compared under both the average-width and minimum-volume confidence set criteria. It is observed that the constant-width band can be drastically less efficient than the hyperbolic band when k>1. Finally, it is pointed out how the methods given in this article can be applied to more general regression models such as fixed-effect or random-effect generalized linear regression models.

9.
Several replacement policies in the literature model the whole life cycle or operating interval of an operating unit as finite rather than infinite, in contrast with the traditional approach. It is more natural, however, to treat the finite life cycle as a random parameter that can be used to estimate replacement times, which is the approach taken in this article. We first formulate a general model in which the unit is replaced at a random age U, at a random time Y for the first working number, at a random life cycle S, or at failure X, whichever occurs first. Two special cases of the general model are then optimized: replacement at age T, when the variable U has a degenerate distribution, and replacement at working number N, given by the sum of N copies of the variable Y. We obtain the total expected cost until replacement and the expected replacement cost rate for each model. The optimal age T, working number N, and pair (T, N) are discussed analytically and computed numerically.
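
When U is degenerate at T and the other random replacement triggers are switched off, the general model above reduces to the classical age-replacement policy, whose cost rate has the familiar closed form C(T) = [c_p S(T) + c_f (1 - S(T))] / E[min(X, T)], with E[min(X, T)] equal to the integral of S(t) over [0, T]. A numerical sketch with a Weibull lifetime (the parameterization and names are mine):

```python
import math

def age_replacement_cost_rate(T, rate, shape, c_p, c_f, dt=1e-3):
    """Classical age-replacement cost rate for a Weibull lifetime with
    survival S(t) = exp(-(rate*t)**shape):
        C(T) = [c_p*S(T) + c_f*(1 - S(T))] / integral_0^T S(t) dt
    where c_p, c_f are preventive/failure replacement costs. The
    integral is evaluated by the midpoint rule with step dt."""
    S = lambda t: math.exp(-(rate * t) ** shape)
    steps = int(round(T / dt))
    e_min = sum(S((i + 0.5) * dt) for i in range(steps)) * dt
    return (c_p * S(T) + c_f * (1.0 - S(T))) / e_min
```

With a constant hazard (shape 1) age replacement cannot help, so the cost rate approaches the run-to-failure value c_f times the failure rate; with an increasing hazard (shape 2) an interior optimum T appears.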

10.
In this article, a repairable system with age-dependent failure types and minimal repair under a cumulative repair-cost limit policy is studied, in which the entire repair-cost history is used to decide whether the system is repaired or replaced. When failures occur, the system experiences one of two failure types: (i) a Type-I (minor) failure, rectified by a minimal repair, and (ii) a Type-II (catastrophic) failure that calls for a replacement. We consider a bivariate replacement policy, denoted by (n,T), in which the system is replaced at age T, at the n-th Type-I failure, at the k-th Type-I failure (k < n) at which the accumulated repair cost exceeds a pre-determined limit, or at the first Type-II failure, whichever occurs first. The existence and uniqueness of the optimal minimum-cost replacement policy (n,T)* are established analytically. Several classical models in the maintenance literature can be regarded as special cases of the presented model. Finally, a numerical example is given to illustrate the theoretical results.

11.
Consider a system that is subject to shocks arriving according to a non-homogeneous Poisson process. When a shock occurs, the system experiences one of m + 1 failure modes: (i) a non-repairable (catastrophic) failure mode that calls for a replacement, or (ii) one of m repairable (non-catastrophic) failure modes that are rectified by minimal repairs. In this article, we propose an age-replacement model with minimal repair based on the natural conjugate prior in a Bayesian framework. In addition, a safety constraint is imposed to control the risk of catastrophic failures occurring in a specified time interval. The minimum-cost replacement policy is studied in terms of its existence and the safety constraint. A numerical example is also presented to illustrate the proposed model.

12.
Joint modeling of degradation and failure time data
This paper surveys approaches to modeling the relationship between failure time data and covariate data such as internal degradation and external environmental processes. These models, which reflect the dependency between system state and system reliability, include threshold models and hazard-based models. In particular, we consider the class of degradation–threshold–shock models (DTS models), in which failure is due to the competing causes of degradation and trauma. For this class of reliability models we express the failure time in terms of degradation and covariates. We compute the survival function of the resulting failure time and derive the likelihood function for the joint observation of failure times and degradation data at discrete times. We consider a special class of DTS models in which degradation is modeled by a process with stationary independent increments and related to external covariates through a random time scale, and extend this model class to repairable items via a marked point process approach. The proposed model class provides a rich conceptual framework for the study of degradation–failure issues.
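
A pure threshold special case of the DTS class above is easy to simulate: degradation follows a stationary independent-increment (gamma) process and failure occurs at the first crossing of a fixed level, with no traumatic shocks. A hedged sketch (names and parameters are illustrative, not from the paper):

```python
import random

def dts_survival(level, shape_per_t, scale, t_grid, n_paths=4000, seed=7):
    """Monte Carlo survival estimate for a degradation-threshold model:
    degradation is a gamma process with shape shape_per_t per unit time
    and the given scale; failure occurs when it first reaches `level`.
    t_grid must be equally spaced starting at its spacing.
    Returns estimated P(failure time > t) for each t in t_grid."""
    rng = random.Random(seed)
    surv = [0] * len(t_grid)
    dt = t_grid[0]                      # grid spacing (see docstring)
    for _ in range(n_paths):
        x = 0.0
        for j, _t in enumerate(t_grid):
            x += rng.gammavariate(shape_per_t * dt, scale)
            if x >= level:              # threshold crossed: path fails
                break
            surv[j] += 1
    return [s / n_paths for s in surv]
```

The estimated survival curve is non-increasing by construction, starts near 1 well before the mean crossing time, and drops toward 0 well after it.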

13.
The outcomes AUC_t (area under the curve from time zero to time t) of n individuals randomized to one of two groups, TR or RT, where the group name denotes the order in which the subjects receive a test formulation (T) or a reference formulation (R), are used to assess average bioequivalence of the two formulations. The classical method is the mixed model; for example, PROC MIXED or PROC GLM with a RANDOM statement in SAS can be used to analyze this type of data. This is equivalent to the marginal likelihood approach in a normal–normal model. This approach has limitations: it is not appropriate if the random effect is not normally distributed. In this article, we introduce a hierarchical quasi-likelihood approach. Instead of assuming the random effect is normal, we make assumptions only about the mean and the variance function of the random effect, so the random effect is modeled flexibly. Since we can estimate the random effect for each individual, we can check the adequacy of the distributional assumption about the random effect. This method can also handle high-dimensional crossover data. Simulation studies are conducted to check the finite-sample performance of the method under various conditions, and two real data examples are used for illustration.

14.
In many clinical studies where time to failure is of primary interest, patients may fail or die from one of many causes, and failure times can be right censored. In some circumstances, patients may be known to have died while the cause of death is unavailable for some of them. Under the assumption that cause of death is missing at random, we compare the Goetghebeur and Ryan (1995, Biometrika, 82, 821–833) partial likelihood approach with the Dewanji (1992, Biometrika, 79, 855–857) partial likelihood approach. We show that the estimator of the regression coefficients based on the Dewanji partial likelihood is not only consistent and asymptotically normal, but also semiparametric efficient. While the Goetghebeur and Ryan estimator is more robust than the Dewanji partial likelihood estimator against misspecification of proportional baseline hazards, the Dewanji partial likelihood estimator allows the probability of a missing cause of failure to depend on covariate information without the need to model the missingness mechanism. Tests for proportional baseline hazards are also suggested, and a robust variance estimator is derived.

15.
In this article, we propose a two-stage generalized case–cohort design and develop an efficient inference procedure for data collected under this design. In the first stage, we observe the failure time, the censoring indicator, and covariates that are easy or cheap to measure; in the second stage, we select, by simple random sampling, a subcohort and a subset of the remaining failures from the first-stage subjects, and observe their exposures, which are difficult or expensive to measure. We derive estimators for the regression parameters in the accelerated failure time model under the two-stage generalized case–cohort design through an estimated augmented estimating equation and the kernel function method. The resulting estimators are shown to be consistent and asymptotically normal. The finite-sample performance of the proposed method is evaluated through simulation studies. The proposed method is applied to a real data set from the National Wilms' Tumor Study Group.

16.
This article addresses issues in creating public-use data files in the presence of missing ordinal responses and in the subsequent statistical analyses of the dataset by users. The authors propose a fully efficient fractional imputation (FI) procedure for ordinal responses with missing observations. The proposed imputation strategy retrieves the missing values through the full conditional distribution of the response given the covariates and results in a single imputed data file that can be analyzed by different data users with different scientific objectives. The two most critical aspects of statistical analyses based on the imputed data set, validity and efficiency, are examined through regression analysis involving the ordinal response and a selected set of covariates. It is shown through both theoretical development and simulation studies that, when the ordinal responses are missing at random, the proposed FI procedure leads to valid and highly efficient inferences compared to existing methods. Variance estimation using the fractionally imputed data set is also discussed. The Canadian Journal of Statistics 48: 138–151; 2020 © 2019 Statistical Society of Canada

17.
In this article, we obtain an exact expression for the distribution of the time to failure of a discrete-time cold standby repairable system under the classical assumptions that both the working and repair times of components are geometric. Our method is based on an alternative representation of the lifetime as a waiting-time random variable on a binary sequence, together with combinatorial arguments. Such an exact expression for the time-to-failure distribution is new in the literature. Furthermore, we obtain the probability generating function and the first two moments of the lifetime random variable.
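
For the two-unit special case, the time-to-failure distribution can be computed exactly by a state recursion, which is a useful numerical check on (but not the same as) the article's general combinatorial expression. The reduction to two units and all names are mine:

```python
def lifetime_pmf(p, r, tmax):
    """Exact PMF (truncated at tmax steps) of the time to failure of a
    two-unit discrete-time cold-standby repairable system: each step the
    working unit fails w.p. p and a unit under repair is finished w.p. r
    (geometric working/repair times). The system fails when the working
    unit fails while the other unit is still in repair.
    State A: standby unit ready; state B: standby unit under repair."""
    pA, pB = 1.0, 0.0          # probability mass in each transient state
    pmf = []
    for _ in range(tmax):
        pmf.append(pB * p * (1.0 - r))       # fail now: working unit
        pA, pB = (                           # fails, repair unfinished
            pA * (1.0 - p) + pB * (1.0 - p) * r,
            pA * p + pB * (p * r + (1.0 - p) * (1.0 - r)),
        )
    return pmf
```

With p = r = 0.5 the system cannot fail at step 1 (the standby starts ready), fails at step 2 with probability 1/8, and the PMF sums to 1 as tmax grows.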

18.
We propose a consistent and locally efficient method of estimating the model parameters of a logistic mixed effect model with random slopes. Our approach relaxes two typical assumptions: that the random effects are normally distributed, and that the covariates and random effects are independent of each other. Adhering to these assumptions is particularly difficult in health studies where, in many cases, there are limited resources to design experiments and gather data in long-term studies, while new findings from other fields might emerge suggesting the violation of such assumptions. It is therefore crucial to have an estimator that is robust to such violations, so that better use can be made of data harvested with valuable resources. Our method generalizes the framework presented in Garcia & Ma (2016), which also deals with a logistic mixed effect model but considers only a random intercept. A simulation study reveals that our proposed estimator remains consistent even when the independence and normality assumptions are violated. This contrasts favourably with the traditional maximum likelihood estimator, which is likely to be inconsistent when there is dependence between the covariates and random effects. Application of this work to a study of Huntington's disease reveals that disease diagnosis can be enhanced using assessments of cognitive performance. The Canadian Journal of Statistics 47: 140–156; 2019 © 2019 Statistical Society of Canada

19.
A right-censored version of a U-statistic with a kernel of degree m ≥ 1 is introduced via a mean-preserving reweighting scheme, which is also applicable when the dependence between failure times and the censoring variable is explainable through observable covariates. Its asymptotic normality and an expression for its standard error are obtained through a martingale argument. We study the performance of the U-statistic by simulation and compare it with theoretical results. A doubly robust version of this reweighted U-statistic is also introduced to gain efficiency under correct models while preserving consistency in the face of model misspecification. Using Kendall's kernel, we obtain a test statistic for testing homogeneity of failure times for multiple failure causes in a multiple decrement model. The performance of the proposed test is studied through simulations, and its usefulness is illustrated by applying it to a real data set on graft-versus-host disease.
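
The uncensored building block of this construction is the plain U-statistic with Kendall's concordance kernel; the article's contribution, the inverse-censoring reweighting of each pair, is omitted in this sketch:

```python
from itertools import combinations

def kendall_u(pairs):
    """Plain (uncensored) U-statistic with Kendall's concordance kernel
    h((x1, y1), (x2, y2)) = sign((x1 - x2) * (y1 - y2)), averaged over
    all distinct pairs. The censored version in the abstract reweights
    each pair by inverse probabilities of censoring."""
    s = n = 0
    for (x1, y1), (x2, y2) in combinations(pairs, 2):
        d = (x1 - x2) * (y1 - y2)
        s += (d > 0) - (d < 0)   # +1 concordant, -1 discordant, 0 tied
        n += 1
    return s / n
```

Perfectly concordant data give 1, perfectly discordant data give -1, and ties contribute 0 to the kernel.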

20.
In this paper, we consider the problem of adaptive density or survival function estimation in the additive model Z = X + Y with X independent of Y, when both random variables are non-negative. This model is relevant, for instance, in reliability, where we are interested in the failure time of a material that cannot be isolated from the system to which it belongs. Our goal is to recover the distribution of X (density or survival function) from n observations of Z, assuming that the distribution of Y is known. This can be seen as the classical statistical problem of deconvolution, which has been tackled in many settings using Fourier-type approaches. In the present case, however, the random variables have the particularity of being supported on the non-negative half-line. Exploiting this, we propose a new angle of attack by building a projection estimator on an appropriate Laguerre basis. We present upper bounds on the mean integrated squared risk of our density and survival function estimators, and then describe a non-parametric data-driven strategy for selecting a relevant projection space. The procedures are illustrated on simulated data and compared with a more classical deconvolution setting using a Fourier approach. Our procedure achieves faster convergence rates than Fourier methods for estimating these functions.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号