Similar Documents
20 similar documents found (search time: 46 ms)
1.
Due to the high reliability and high testing cost of electro-explosive devices, one may observe very few failures, or even none at all, because of censoring, even when an accelerated test is performed. In this paper, we model the reliability of such devices with an exponential lifetime distribution in which the failure rate is assumed to be a function of covariates and the observed data are binary. The Bayesian approach, with three different prior settings, is used to develop inference on the failure rate, the lifetime and the reliability under several settings. A Monte Carlo simulation study shows that this approach is useful and well suited for analysing data of this form, especially when the failure rates are very small. Finally, illustrative data are analysed using this approach.
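The abstract does not give the exact likelihood or priors. Below is a minimal Python sketch of one such formulation, assuming a log-linear link between the failure rate and a single covariate, independent normal priors on the coefficients, and made-up go/no-go data; none of these choices are taken from the paper.

import numpy as np

# Hypothetical go/no-go data: covariate x_i, exposure time t_i, failure indicator y_i.
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
t = np.array([10., 10., 10., 10., 10., 10.])
y = np.array([0, 0, 0, 1, 0, 1])

def log_post(beta, x, t, y):
    """Log-posterior for (beta0, beta1) under a N(0, 10^2) prior on each coefficient."""
    lam = np.exp(beta[0] + beta[1] * x)      # covariate-dependent exponential failure rate
    p = 1.0 - np.exp(-lam * t)               # P(failure observed by the end of the test)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    logprior = -0.5 * np.sum(beta ** 2) / 100.0
    return loglik + logprior

# Random-walk Metropolis sampler for the posterior of (beta0, beta1)
rng = np.random.default_rng(1)
beta = np.zeros(2)
draws = []
for _ in range(20000):
    prop = beta + 0.3 * rng.standard_normal(2)
    if np.log(rng.uniform()) < log_post(prop, x, t, y) - log_post(beta, x, t, y):
        beta = prop
    draws.append(beta.copy())
draws = np.array(draws[5000:])               # discard burn-in
print("posterior mean of (beta0, beta1):", draws.mean(axis=0))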

2.
This article extends a random preventive maintenance scheme, called the repair alert model, to the case where environmental variables affect the system lifetimes. It can be used for implementing age-dependent maintenance policies on engineering devices. In other words, consider a device that performs a job and is subject to failure at a random time X; the maintenance crew can avoid the failure by a possible replacement at some random time Z. The new model is flexible enough to include covariates with both fixed and random effects. The problem of estimating the parameters is also investigated in detail. Here, the observations are in the form of random signs censoring data (RSCD) with covariates, so this article generalizes the statistical inferences derived in the earlier literature for RSCD without covariates. It is assumed that the system lifetime distribution belongs to the log-location-scale family of distributions. A real dataset is analyzed on the basis of the results obtained.
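For reference, and as recalled from the repair alert literature rather than stated in this abstract: random signs censoring means the event {Z < X} is independent of the failure time X, and the repair alert model additionally specifies
\[ P(Z \le z \mid X = x,\; Z < X) = \frac{G(z)}{G(x)}, \qquad 0 < z \le x, \]
for a nondecreasing cumulative repair alert function G.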

3.
By running the life tests at higher stress levels than normal operating conditions, accelerated life testing quickly yields information on the lifetime distribution of a test unit. The lifetime at the design stress is then estimated through extrapolation using a regression model. In constant-stress testing, a unit is tested at a fixed stress level until failure or the termination time point of the test, while step-stress testing allows the experimenter to gradually increase the stress levels at some pre-fixed time points during the test. In this article, the optimal k-level constant-stress and step-stress accelerated life tests are compared for exponential failure data under Type-I censoring. The objective is to quantify the advantage of using step-stress testing relative to constant-stress testing. A log-linear relationship between the mean lifetime parameter and the stress level is assumed, and the cumulative exposure model holds for the effect of changing stress in step-stress testing. The optimal design point is then determined under C-optimality, D-optimality, and A-optimality criteria. The efficiency of step-stress testing compared to constant-stress testing is discussed in terms of the ratio of optimal objective functions based on the information matrix.
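As a sketch of the two modelling assumptions, in illustrative notation for a single two-step test with a stress change at time \tau:
\[ \log\theta_i = \beta_0 + \beta_1 s_i, \qquad
F(t) = \begin{cases} 1 - e^{-t/\theta_1}, & 0 \le t < \tau, \\ 1 - e^{-(t-\tau)/\theta_2 - \tau/\theta_1}, & t \ge \tau, \end{cases} \]
where \theta_i is the mean lifetime at stress level s_i; the second branch is the cumulative exposure model, which carries over the exposure accumulated under the first stress level.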

4.
The non-parametric maximum likelihood estimators (MLEs) are derived for the survival functions associated with individual risks or system components in a reliability framework. Lifetimes are observed for systems that contain one or more of those components. Analogous to a competing risks model, the system is assumed to fail upon the first instance of any component failure; i.e. the system is configured in series. For any given risk or component type, the asymptotic distribution is shown to depend explicitly on the unknown survival function of the other risks, as well as on the censoring distribution. Survival functions with increasing failure rate are investigated as a special case. The order-restricted MLE is shown to be consistent under mild assumptions on the underlying component lifetime distributions.
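A one-line reminder of the series-system structure, under the additional assumption of independent component lifetimes (an assumption made here for illustration, not stated in the abstract): the system survival function factorizes as
\[ S_{\mathrm{sys}}(t) = \prod_{j=1}^{k} S_j(t), \]
so the first component failure brings the whole system down.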

5.
We consider the problem of estimating the current failure intensity of the power-law (Weibull) process. Closed-form optimum estimators under the criteria of minimum risk as well as Pitman closeness are derived for the failure-truncated case. A unique Pitman-closest estimator, which is also invariant to the choice of loss function within a very wide class of loss functions, is obtained. In the frequentist setup, no admissible estimator under these criteria is available for the time-truncated scheme because of the lack of any pivotal quantity. We present a Bayesian approach which circumvents this problem and provides a unified solution. In the Bayesian framework, we provide an algorithm based on the Markov chain Monte Carlo (MCMC) technique, which facilitates the evaluation of the estimators. The theoretical findings are supplemented by substantial numerical investigation.
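For orientation, the power-law process has intensity
\[ \lambda(t) = \frac{\beta}{\theta}\left(\frac{t}{\theta}\right)^{\beta-1}, \]
and, with failure times t_1 < ... < t_n observed until the n-th failure, the standard failure-truncated MLEs give
\[ \hat\beta = \frac{n}{\sum_{i=1}^{n-1}\ln(t_n/t_i)}, \qquad \hat\lambda(t_n) = \frac{n\hat\beta}{t_n}. \]
These are textbook results, not the optimum or Pitman-closest estimators derived in the paper, which need not coincide with the MLEs.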

6.
Degradation testing (DT) is a useful approach to assessing the reliability of highly reliable products which are not likely to fail under traditional life tests or accelerated life tests. There have been a great number of excellent studies investigating the estimation of the failure time distribution and the optimal design (e.g., the optimal setting of the inspection frequency, the number of measurements, and the termination time) for DTs. However, the lifetime distributions considered in those studies are all without a failure-free life; here, failure-free life is characterized by a threshold parameter below which no failure is possible. The main purpose of this article is to deal with the optimal design of a DT with a two-parameter exponential lifetime distribution. More specifically, for a DT where a linearized degradation model describes the degradation process and the lifetime is assumed to follow a two-parameter exponential distribution, and under the constraint that the total experimental cost does not exceed a predetermined budget, the optimal combination of the inspection frequency, the sample size, and the termination time is determined by minimizing the mean squared error of the estimated 100p-th percentile of the lifetime distribution of the product. An example is provided to illustrate the proposed method and the corresponding sensitivity analysis is also discussed.
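For reference (notation assumed), the two-parameter exponential lifetime distribution and the 100p-th percentile being estimated are
\[ F(t) = 1 - e^{-(t-\gamma)/\theta}, \quad t \ge \gamma, \qquad t_p = \gamma - \theta\ln(1-p), \]
where \gamma > 0 is the threshold (failure-free life) and \theta the scale parameter.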

7.
In this paper, we develop a double acceptance sampling plan for the half exponential power distribution when the life test is truncated at a prefixed time. The zero- and one-failure schemes are considered. We obtain the minimum sizes of the first and second samples necessary to ensure the specified mean life at the given consumer's confidence level. The operating characteristic values and the minimum ratios of the mean life to the specified life are also analyzed. A numerical example is provided to illustrate the double acceptance sampling plan.
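As a sketch of one common zero-one double sampling scheme (an assumption about the plan's structure, not a formula quoted from the paper): accept on a first sample of size n_1 if it contains no failures by the truncation time, reject if it contains more than one, and draw a second sample of size n_2 when exactly one failure occurs, accepting only if the second sample is failure-free. With p the probability that a single unit fails before the truncation time, the acceptance probability is
\[ P_a(p) = (1-p)^{n_1} + n_1\,p\,(1-p)^{n_1-1}(1-p)^{n_2}. \]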

8.
System characteristics of a redundant repairable system are studied from a Bayesian viewpoint with different types of priors assumed for the unknown parameters. The system consists of two primary units, one standby unit, and one repair facility which is activated when switching to standby fails. Times to failure and times to repair of the operating units are assumed to follow exponential distributions. When time to failure and time to repair have uncertain parameters, a Bayesian approach is adopted to evaluate system characteristics. Monte Carlo simulation is used to derive the posterior distribution for the mean time to system failure and steady-state availability. Some numerical experiments are performed to illustrate the results derived in this paper.

9.
The mixture of Type I and Type II censoring schemes, called hybrid censoring, is quite important in life-testing experiments. Epstein (1954, 1960) introduced this testing scheme and proposed a two-sided confidence interval to estimate the mean lifetime, θ, when the underlying lifetime distribution is assumed to be exponential. Two-sided confidence intervals and credible intervals were subsequently proposed by Fairbanks et al. (1982) and Draper and Guttman (1987), respectively. In this paper we obtain the exact two-sided confidence interval for θ following the approach of Chen and Bhattacharya (1988). We also obtain the asymptotic confidence intervals in the hybrid censoring case. It is important to observe that the results for the Type I and Type II censoring schemes can be obtained as particular cases of the hybrid censoring scheme. We analyze one data set and compare the different methods by Monte Carlo simulations.
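A compact statement of the scheme (standard notation, not quoted from the paper): with n units on test, Type-I hybrid censoring stops the experiment at
\[ T^{*} = \min\{X_{r:n},\, T\}, \]
where X_{r:n} is the r-th failure time and T is a prefixed time; letting T → ∞ recovers Type II censoring, while r = n recovers Type I censoring.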

10.
Assessing the lifetime performance of products is an important topic in the manufacturing and service industries. The lifetime performance index C_L is used to measure larger-the-better quality characteristics and to evaluate process performance for the improvement of quality and productivity. The lifetimes of products are assumed to have a Burr XII distribution. The maximum likelihood estimator is used to estimate the lifetime performance index based on a progressive type I interval censored sample, and the asymptotic distribution of this estimator is also developed. We use this estimator to build a hypothesis testing procedure with respect to a lower specification limit. Finally, two practical examples are given to illustrate the use of this testing procedure to determine whether the process is capable.
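For context, one common definition of the larger-the-better lifetime performance index in this literature is
\[ C_L = \frac{\mu - L}{\sigma}, \]
where \mu and \sigma are the mean and standard deviation of the (possibly transformed) lifetime and L is the lower specification limit; whether and how the Burr XII lifetimes are transformed before applying this index is an assumption here, not something stated in the abstract.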

11.
In recent years, signatures have been widely used for the analysis of coherent systems consisting of unreliable components. If the component lifetimes are independent and identically distributed, then the system lifetime distribution function is a convex combination of the distribution functions of the order statistics of the component lifetimes; the coefficients of this convex combination are called the signature. This article considers the case where a system operates in a so-called random environment, i.e., the component failure rates are jointly modulated by a finite-state continuous-time Markov chain. In this model, the component lifetimes remain exchangeable. An expression for the distribution function of the time to system failure is derived; a crucial role is played by an elaborated procedure for deriving the distribution function of the order statistics of the component lifetimes. A numerical example illustrates the suggested approach and analyzes the influence of the random environment on the distribution function of the system lifetime.
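In symbols (standard signature notation, not taken verbatim from the article): for a coherent system with lifetime T built from n exchangeable component lifetimes X_1, ..., X_n,
\[ F_T(t) = \sum_{i=1}^{n} s_i\, F_{X_{i:n}}(t), \qquad s_i = P(T = X_{i:n}), \quad \sum_{i=1}^{n} s_i = 1, \]
where X_{i:n} is the i-th order statistic and (s_1, ..., s_n) is the system signature.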

12.
The median service lifetime of respirator safety devices produced by different manufacturers is determined using frailty models to account for unobserved differences in manufacturing processes and raw materials. The gamma and positive stable frailty distributions are used to obtain survival distribution estimates when the baseline hazard is assumed to be Weibull. The frailty distributions are compared using laboratory test data on the failure times of 104 respirator cartridges produced by 10 different manufacturers and tested with three different challenge agents. Likelihood ratio tests indicate that both frailty models provide a significant improvement over a Weibull model assuming independence. The results are compared with fixed-effects approaches to the analysis of these data.
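As a sketch of the shared-frailty structure (the parameterization is assumed here, not taken from the paper): conditional on a manufacturer-level frailty Z, the hazard is Z h_0(t) with Weibull baseline cumulative hazard H_0(t) = (t/\alpha)^{\kappa}; for a gamma frailty with mean 1 and variance \theta, the marginal survival function is
\[ S(t) = E\!\left[e^{-Z H_0(t)}\right] = \bigl(1 + \theta H_0(t)\bigr)^{-1/\theta}. \]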

13.
In the past few years, the Lindley distribution has gained popularity for modeling lifetime data as an alternative to the exponential distribution. This paper provides two new characterizations of the Lindley distribution. The first characterization is based on a relation between left-truncated moments and the failure rate function. The second characterization is based on a relation between right-truncated moments and the reversed failure rate function.
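For reference (standard results for the Lindley distribution, not restated in the abstract), the density and failure rate are
\[ f(x) = \frac{\theta^{2}}{1+\theta}\,(1+x)\,e^{-\theta x}, \quad x > 0, \qquad
h(x) = \frac{\theta^{2}(1+x)}{1+\theta+\theta x}. \]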

14.
The Wehrly–Johnson family of bivariate circular distributions is by far the most general one currently available for modelling data on the torus. It allows complete freedom in the specification of the marginal circular densities as well as the binding circular density which regulates any dependence that might exist between them. We propose a parametric bootstrap approach for testing the goodness-of-fit of Wehrly–Johnson distributions when the forms of their marginal and binding densities are assumed known. The approach admits the use of any test for toroidal uniformity, and we consider versions of it incorporating three such tests. Simulation is used to illustrate the operating characteristics of the approach when the underlying distribution is assumed to be bivariate wrapped Cauchy. An analysis of wind direction data recorded at a Texan weather station illustrates the use of the proposed goodness-of-fit testing procedure.
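For context, one standard form of the Wehrly–Johnson construction (recalled from the circular statistics literature; the sign convention varies with the direction of dependence) has joint density
\[ f(\theta, \phi) = 2\pi\, g\bigl(2\pi[F_1(\theta) - F_2(\phi)]\bigr)\, f_1(\theta)\, f_2(\phi), \]
where f_1 and f_2 are the marginal circular densities with distribution functions F_1 and F_2, and g is the binding circular density.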

15.
In this paper we consider the determination of Bayesian life test acceptance sampling plans for finite lots when the underlying lifetime distribution is the two-parameter exponential. It is assumed that the prior distribution is the natural conjugate prior, that the costs associated with the actions accept and reject are known functions of the lifetimes of the items, and that the cost of testing a sample is proportional to the duration of the test. Type II censored sampling is considered, in which a sample of size n is observed only until the r-th failure occurs and the decision of whether to accept or reject the remainder of the lot is made on the basis of the r observed lifetimes. Obtaining the optimal sample size and the optimal censoring number is difficult when the location parameter of the distribution is restricted to be non-negative; the case in which the positivity restriction on the location parameter is removed is investigated here. An example is provided for illustration.

16.
In this paper an axiomatic approach is used to construct accelerated life testing (ALT) models for Nonhomogeneous Poisson Processes (NHPPs). First, the models of random lifetime variables and Nonhomogeneous Poisson Processes used for modeling non-repairable and repairable systems are compared. Then, an axiomatic approach for the construction of ALT models for NHPPs is given. Particular models are considered that can be constructed by this method.

17.
We extend the bivariate Wiener process considered by Whitmore and co-workers and model the joint process of a marker and health status. The health status process is assumed to be latent, or unobservable. The time to reach the primary end point or failure (death, onset of disease, etc.) is the time when the latent health status process first crosses a failure threshold level. Inferences for the model are based on two kinds of data: censored survival data and marker measurements. Covariates, such as treatment variables, risk factors and baseline conditions, are related to the model parameters through generalized linear regression functions. The model offers a much richer potential for the study of treatment efficacy than do conventional models: treatment effects can be assessed in terms of their influence on both the failure threshold and the health status process parameters. We derive an explicit formula for the prediction of residual failure times given the current marker level, and we also discuss model validation. This model does not require the proportional hazards assumption and hence can be widely used. To demonstrate the usefulness of the model, we apply the methods in analysing data from protocol 116a of the AIDS Clinical Trials Group.
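For intuition, a standard fact about Wiener first-passage times (a univariate special case, not the paper's bivariate model): if the latent health status followed X(t) = \nu t + \sigma B(t) with drift \nu > 0 toward a threshold a > 0, the failure time T, i.e. the first passage to a, would have an inverse Gaussian distribution with density
\[ f(t) = \frac{a}{\sigma\sqrt{2\pi t^{3}}}\exp\!\left(-\frac{(a-\nu t)^{2}}{2\sigma^{2} t}\right), \quad t > 0, \]
with mean a/\nu and shape parameter a^{2}/\sigma^{2}.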

18.
For the recapture debugging design introduced by Nayak (1988), we consider the problem of estimating the hitting rates of the faults remaining in a system. In the context of a conditional likelihood, moment estimators are derived and are shown to be asymptotically normal and fully efficient. Fixed-sample properties of the moment estimators are compared, through simulation, with those of the conditional maximum likelihood estimators. Also considered is a procedure for testing the assumption that faults have identical hitting rates; this provides a test of fit of the Jelinski-Moranda (1972) model. It is assumed that the residual hitting rates follow a log-linear rate model and that the testing process is truncated when the gaps between the detection of new errors exceed a fixed amount of time.
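For reference (a standard statement of the Jelinski-Moranda model, not quoted from the paper): with N initial faults each contributing an identical rate \varphi, the program failure rate after i-1 faults have been detected and removed is
\[ \lambda_i = \varphi\,(N - i + 1), \]
so testing the equal-hitting-rate assumption amounts to a goodness-of-fit test of this model.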

19.
Modelling accelerated life test data by using a Bayesian approach
Because of the high reliability of many modern products, accelerated life tests are becoming widely used to obtain timely information about their time-to-failure distributions. We propose a general class of accelerated life testing models which are motivated by the actual failure process of units from a limited failure population, in which there is a positive probability of not failing during the technological lifetime. We demonstrate a Bayesian approach to this problem, using a new class of models with non-monotone hazard rates that has potential scope for use far beyond accelerated life testing. Our methods are illustrated with the modelling and analysis of a data set on lifetimes of printed circuit boards under humidity accelerated life testing.
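A minimal way to write the limited-failure-population idea (notation assumed): if a proportion \pi \in (0,1) of units are susceptible and fail according to a distribution G while the remainder never fail during the technological lifetime, the population failure-time distribution is
\[ F(t) = \pi\, G(t), \]
so F(t) approaches \pi < 1 as t grows, which is what permits a positive probability of observing no failure at all.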

20.
We pursue a Bayesian nonparametric approach to the hierarchical mixture modelling of lifetime data in two situations: density estimation, when the distribution is a mixture of parametric densities with a nonparametric mixing measure, and accelerated failure time (AFT) regression modelling, when the same type of mixture is used for the distribution of the error term. The Dirichlet process is a popular choice for the mixing measure, yielding a Dirichlet process mixture model for the error; as an alternative, we also allow the mixing measure to be a normalized inverse-Gaussian prior, built from normalized inverse-Gaussian finite-dimensional distributions, as recently proposed in the literature. Markov chain Monte Carlo techniques are used to estimate the predictive distribution of the survival time, along with the posterior distribution of the regression parameters. A comparison between the two models is carried out on the grounds of their predictive power and their ability to identify the number of components in a given mixture density.
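As a sketch of the mixture structure (notation assumed): the lifetime (or AFT error) density is modelled as
\[ f(t) = \int k(t \mid \vartheta)\, dG(\vartheta), \]
where k(\cdot \mid \vartheta) is a parametric kernel and the nonparametric prior on the mixing measure G is either a Dirichlet process, G \sim \mathrm{DP}(M, G_0), or a normalized inverse-Gaussian process, as described above.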
