Similar Literature
A total of 20 similar documents were found.
1.
Modern highly reliable products usually have complex structure and many functions. This means that they may have two or more performance characteristics. All the performance characteristics can reflect the product's performance degradation over time, and they may be independent or dependent. If the performance characteristics are independent, they can be modelled separately. But if they are not independent, it is very important to find the joint distribution function of the performance characteristics so as to estimate the reliability of the product as accurately as possible. Here, we suppose that a product has two performance characteristics, that the degradation paths of these two performance characteristics can be governed by a Wiener process with a time-scale transformation, and that the dependency of the performance characteristics can be described by a copula function. The parameters of the two performance characteristics and the copula function can be estimated jointly. The model in such a situation is very complicated and analytically intractable, and becomes cumbersome from a computational viewpoint. For this reason, a Bayesian Markov chain Monte Carlo method is developed for this problem, which allows the maximum-likelihood estimates of the parameters to be determined in an efficient manner. For an illustration of the proposed model, a numerical example about fatigue cracks is presented.
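
As an illustration of the kind of model described above, the sketch below simulates two dependent degradation paths, each following a Wiener process with a power time-scale transformation, with the dependence between the increments of the two performance characteristics induced by a Gaussian copula. All parameter values (drifts, diffusions, the power exponents, the copula correlation) are hypothetical, and the Gaussian copula is used only for illustration; the paper's own copula choice and its Bayesian MCMC estimation are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical parameters for the two performance characteristics (PCs)
mu = np.array([1.2, 0.8])       # drift of each PC
sigma = np.array([0.4, 0.3])    # diffusion of each PC
b = np.array([0.9, 1.1])        # exponents of the time-scale transform Lambda(t) = t**b
rho = 0.6                       # Gaussian-copula correlation between the PCs' increments

t = np.linspace(0.0, 10.0, 101)             # measurement times
cov = np.array([[1.0, rho], [rho, 1.0]])

def simulate_paths(n_units):
    """Simulate X_k(t) = mu_k*Lambda_k(t) + sigma_k*W(Lambda_k(t)) for two PCs,
    with the increments of the two PCs coupled through a Gaussian copula."""
    paths = np.zeros((n_units, len(t), 2))
    for i in range(n_units):
        for j in range(1, len(t)):
            d_lambda = t[j]**b - t[j-1]**b                  # increment of the transformed time scale
            z = rng.multivariate_normal(np.zeros(2), cov)   # correlated standard normals
            u = norm.cdf(z)                                 # copula (uniform) coordinates
            # Transform each uniform back to its marginal Wiener increment
            incr = norm.ppf(u, loc=mu * d_lambda, scale=sigma * np.sqrt(d_lambda))
            paths[i, j] = paths[i, j-1] + incr
    return paths

paths = simulate_paths(n_units=5)
print("final degradation levels:\n", paths[:, -1, :])
```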

2.
Testing the reliability at a nominal stress level may lead to extensive test time. Estimates of reliability parameters can be obtained faster thanks to step-stress accelerated life tests (ALT). Usually, a transfer functional defined among a given class of parametric functions is required, but Bagdonavičius and Nikulin showed that ALT tests are still possible without any assumption about this functional. When the shape and scale parameters of the lifetime distribution change with the stress level, they suggested an ALT method using a model called CHanging Shape and Scale (CHSS). They estimated the lifetime parameters at the nominal stress with maximum likelihood estimation (MLE). However, this method usually requires an initialization of the lifetime parameters, which may be difficult when no similar product has been tested before. This paper aims to address this issue by using an iterative least squares estimation (LSE) method. It enables one to initialize the optimization required to carry out the MLE, and it gives estimates that can sometimes be better than those given by MLE.
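
The idea of using a least-squares fit to initialize the likelihood optimization can be illustrated, in a much simplified form, for a single Weibull sample: a median-rank regression gives starting values for the shape and scale, which are then passed to a numerical maximizer of the log-likelihood. The CHSS transfer between stress levels is not implemented here, and the data and parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
data = 100.0 * rng.weibull(2.0, size=50)   # hypothetical lifetimes (shape 2, scale 100)

# Step 1: least-squares (median-rank regression) initialization
x = np.sort(data)
F = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)   # Benard's median-rank approximation
slope, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
beta0 = slope                        # initial shape estimate
eta0 = np.exp(-intercept / slope)    # initial scale estimate

# Step 2: maximum likelihood estimation, started from the LSE values
def neg_loglik(params):
    beta, eta = params
    if beta <= 0 or eta <= 0:
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=beta, scale=eta))

res = minimize(neg_loglik, x0=[beta0, eta0], method="Nelder-Mead")
print("LSE starting values:", beta0, eta0)
print("MLE                :", res.x)
```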

3.
An assumed hypothetical consensus category corresponding to a case being classified provides a basis for assessment of the reliability of judges. Equivalent judges are characterised by the joint probability distribution of the judge assignment and the consensus category. Estimates of the conditional probabilities of judge assignment given consensus category, and of consensus category given judge assignments, are indices of reliability. All parameters can be estimated if the data include classifications of a number of cases by 3 or more judges. Restrictive assumptions are imposed to obtain models for data from classifications by two judges. Maximum likelihood estimation is discussed and illustrated by example for the case of 3 or more judges.
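
A minimal sketch of this kind of model, assuming equivalent judges: each case has a latent consensus category, each judge's assignment follows the same conditional distribution given that category, and an EM algorithm yields maximum likelihood estimates of the category prevalences and of the judge assignment probabilities. The simulated data, the number of categories, and the agreement probability are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
K, n_judges, n_cases = 3, 3, 200      # categories, judges (the 3-or-more case), cases

# Hypothetical data: judges report the true consensus category with probability 0.7
true_c = rng.integers(0, K, size=n_cases)
A = np.where(rng.uniform(size=(n_cases, n_judges)) < 0.7,
             true_c[:, None],
             rng.integers(0, K, size=(n_cases, n_judges)))

# Initial parameter guesses
pi = np.full(K, 1.0 / K)                     # P(consensus category = c)
theta = rng.dirichlet(np.ones(K), size=K)    # theta[c, k] = P(judge assigns k | consensus c)

for _ in range(200):                         # EM iterations
    # E-step: posterior probability of each consensus category for every case
    logpost = np.log(pi) + np.stack(
        [np.log(theta[c, :])[A].sum(axis=1) for c in range(K)], axis=1)
    logpost -= logpost.max(axis=1, keepdims=True)
    post = np.exp(logpost)
    post /= post.sum(axis=1, keepdims=True)

    # M-step: update prevalences and the common judge assignment matrix
    pi = post.mean(axis=0)
    counts = np.stack([(A == k).sum(axis=1) for k in range(K)], axis=1)  # per-case counts
    theta = (post.T @ counts) / (n_judges * post.sum(axis=0))[:, None]

print("estimated prevalences:", np.round(pi, 3))
print("estimated P(assignment | consensus):\n", np.round(theta, 3))
```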

4.
The paper considers Bayesian analysis of the generalized four-parameter gamma distribution. Estimation of parameters using classical techniques is associated with important technical problems while Bayesian methods are not currently available for such distributions. Posterior inference is performed using numerical methods organized around Gibbs sampling. Predictive distributions and reliability can be estimated routinely using the proposed methods.
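
To give a flavour of posterior inference for a generalized gamma family, the sketch below runs a plain random-walk Metropolis sampler for the three-parameter generalized gamma available in scipy (shape a, power c, scale), with flat priors on the log-parameters. This is only a stand-in for the four-parameter model and the Gibbs-based scheme described in the abstract; the data, priors, and tuning constants are all hypothetical.

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(3)
data = gengamma.rvs(2.0, 1.5, scale=10.0, size=100, random_state=rng)  # hypothetical sample

def log_post(params):
    """Log-posterior on (log a, log c, log scale) with flat priors on the logs."""
    a, c, scale = np.exp(params)
    return np.sum(gengamma.logpdf(data, a, c, scale=scale))

chain = np.zeros((5000, 3))
cur = np.log([1.0, 1.0, 5.0])            # starting values (on the log scale)
cur_lp = log_post(cur)
for i in range(len(chain)):
    prop = cur + rng.normal(scale=0.05, size=3)    # random-walk proposal
    prop_lp = log_post(prop)
    if np.log(rng.uniform()) < prop_lp - cur_lp:   # Metropolis accept/reject
        cur, cur_lp = prop, prop_lp
    chain[i] = cur

post = np.exp(chain[1000:])                        # discard burn-in, back-transform
print("posterior means (a, c, scale):", post.mean(axis=0))
print("posterior mean reliability R(15):",
      np.mean(gengamma.sf(15.0, post[:, 0], post[:, 1], scale=post[:, 2])))
```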

5.
This article considers the constant-stress accelerated life test for series system products, where independent log-normal distributed lifetimes are assumed for the components. Based on Type-I progressive hybrid censored and masked data, the expectation-maximization algorithm is applied to obtain estimates of the unknown parameters, and the parametric bootstrap method is used for the standard deviation estimation. In addition, a Bayesian approach combining latent variables with Gibbs sampling is developed. Further, the reliability functions of the system and components are estimated at the use stress level. The proposed method is illustrated through a numerical example under different masking probabilities and censoring schemes.

6.
For some operable products with critical reliability constraints, it is important to estimate their residual lives accurately so that maintenance actions can be arranged suitably and efficiently. In the literature, most publications have dealt with this issue by considering only one-dimensional degradation data. However, this may not be reasonable in situations wherein a product may have two or more performance characteristics (PCs). In such situations, multi-dimensional degradation data should be taken into account. Here, for a target product with multivariate PCs, methods of residual life (RL) estimation are developed. This is done under the assumption that the degradation of the PCs over time is governed by a multivariate Wiener process with nonlinear drifts. Both the population-based degradation information and the degradation history of the target product to date are combined to estimate the RL of the product. Specifically, the population-based degradation information is first used to obtain estimates of the unknown parameters of the model through the EM algorithm. Then, the degradation history of the target product is adopted to update the degradation model, based on which the RL is estimated accordingly. To illustrate the validity and the usefulness of the proposed method, a numerical example about fatigue cracks is finally presented and analysed.
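
For a single PC whose degradation follows a Wiener process with linear drift, the residual life given the current degradation level has a closed form: the first-passage time over the remaining distance to the failure threshold follows an inverse Gaussian distribution. The sketch below uses this simplification (linear rather than nonlinear drift, one PC rather than several) with hypothetical parameter values; the EM estimation and Bayesian updating of the paper are not reproduced.

```python
import numpy as np
from scipy.stats import invgauss

# Hypothetical fitted Wiener degradation parameters and current state of the target product
mu, sigma = 0.5, 0.2      # drift and diffusion per unit time
D = 10.0                  # failure threshold
x_t = 6.5                 # degradation level observed at the current time

d = D - x_t               # remaining distance to the threshold
# The first-passage time of a Wiener process with drift mu over distance d is
# inverse Gaussian with mean d/mu and shape d**2/sigma**2.
mean_fp = d / mu
lam = d**2 / sigma**2
# scipy's invgauss(mu=m, scale=s) has mean m*s and shape s, so set s = lam and m = mean_fp/lam.
rl = invgauss(mu=mean_fp / lam, scale=lam)

print("mean residual life   :", rl.mean())
print("median residual life :", rl.median())
print("P(residual life > 5) :", rl.sf(5.0))
```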

7.
There may be situations in which either the reliability data do not fit popular lifetime models or the estimation of the parameters is not easy, while there may be other distributions which are not popular but either provide better goodness-of-fit or have a smaller number of parameters to be estimated, or have both advantages. This paper proposes the Maxwell distribution as a lifetime model and supports its usefulness in reliability theory through real data examples. Important distributional properties and reliability characteristics of this model are elucidated. Estimation procedures for the parameter, mean life, reliability and failure-rate functions are developed. In view of cost constraints and the convenience of intermediate removals, progressively Type-II censored sample information is used in the estimation. The efficiencies of the estimates are studied through simulation. Apart from researchers and practitioners in reliability theory, the study is also useful for scientists in physics and chemistry, where the Maxwell distribution is widely used.
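
A small sketch, on hypothetical data, of maximum likelihood estimation of the Maxwell scale parameter from a progressively Type-II censored sample: the likelihood combines the density at each observed failure with the survival function raised to the number of units withdrawn at that failure. The censoring scheme and the evaluation times below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import maxwell

# Hypothetical progressively Type-II censored sample:
# x[i] is the i-th observed failure time and R[i] units are withdrawn at that failure.
x = np.array([0.6, 0.9, 1.1, 1.4, 1.8, 2.3, 2.9])
R = np.array([2,   0,   1,   0,   2,   0,   3])

def neg_loglik(a):
    """Progressively censored log-likelihood for the Maxwell scale parameter a:
    sum of log f(x_i; a) + R_i * log S(x_i; a) (the ordering constant is dropped)."""
    if a <= 0:
        return np.inf
    return -np.sum(maxwell.logpdf(x, scale=a) + R * maxwell.logsf(x, scale=a))

res = minimize_scalar(neg_loglik, bounds=(1e-3, 10.0), method="bounded")
a_hat = res.x
print("MLE of the Maxwell scale parameter:", a_hat)
print("estimated mean life    :", maxwell.mean(scale=a_hat))
print("estimated R(2.0)       :", maxwell.sf(2.0, scale=a_hat))
print("estimated hazard h(2.0):", maxwell.pdf(2.0, scale=a_hat) / maxwell.sf(2.0, scale=a_hat))
```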

8.
This article considers a k-level step-stress accelerated life test (ALT) on series system products, where independent Weibull-distributed lifetimes are assumed for the components. Due to cost considerations or environmental restrictions, causes of system failures are masked, and Type-I censored observations might occur in the collected data. A Bayesian approach combined with auxiliary variables is developed for estimating the parameters of the model. Further, the reliability and hazard rate functions of the system and components are estimated at a specified time at the use stress level. The proposed method is illustrated through a numerical example based on two priors and various masking probabilities.

9.
Most software reliability models use the maximum likelihood method to estimate the parameters of the model. The maximum likelihood method assumes that the inter-failure time distributions contribute equally to the likelihood function. Since software reliability is expected to exhibit growth, a weighted likelihood function that gives higher weights to later inter-failure times compared to earlier ones is suggested. The accuracy of the predictions obtained using the weighted likelihood method is compared with that of the predictions obtained when the parameters are estimated by the maximum likelihood method on three real datasets. A simulation study is also conducted.
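
A sketch of the weighting idea under simple assumptions: inter-failure times are exponential with a failure rate that follows a log-linear trend over the failure index (a hypothetical stand-in for the models used in the paper), and each log-likelihood term is multiplied by a weight that increases with the failure index, so that later inter-failure times count more than earlier ones. The weighting scheme and the simulated data are illustrative, not those of the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 40
# Hypothetical inter-failure times showing reliability growth (later gaps tend to be longer)
true_rates = np.exp(1.0 - 0.05 * np.arange(1, n + 1))
t = rng.exponential(1.0 / true_rates)

i = np.arange(1, n + 1)
weights = i / i.sum()              # later inter-failure times receive larger weights

def neg_weighted_loglik(params, w):
    b0, b1 = params
    rates = np.exp(b0 + b1 * i)                        # log-linear failure rate over the index
    return -np.sum(w * (np.log(rates) - rates * t))    # weighted exponential log-likelihood

w_equal = np.full(n, 1.0 / n)      # ordinary (equally weighted) likelihood
mle = minimize(neg_weighted_loglik, x0=[0.0, 0.0], args=(w_equal,), method="Nelder-Mead")
wle = minimize(neg_weighted_loglik, x0=[0.0, 0.0], args=(weights,), method="Nelder-Mead")

# Predicted failure rate of the next, (n+1)-th, failure under each fit
for name, fit in [("maximum likelihood", mle), ("weighted likelihood", wle)]:
    b0, b1 = fit.x
    print(f"{name}: predicted rate of next failure = {np.exp(b0 + b1 * (n + 1)):.3f}")
```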

10.
To assess the reliability of highly reliable products that have two or more performance characteristics (PCs) in an accurate manner, the relations between the PCs should be duly taken into account. If they are not independent, it becomes important to describe the dependence of the PCs. For many products, a constant-stress degradation test cannot provide sufficient data for reliability evaluation, and for this reason an accelerated degradation test is usually performed. In this article, we assume that a product has two PCs, that the PCs are governed by a Wiener process with a time-scale transformation, and that the relationship between the PCs is described by the Frank copula function. The copula parameter is dependent on stress and is assumed to be a function of the stress level that can be described by a logistic function. Based on these assumptions, a bivariate constant-stress accelerated degradation model is proposed here. The direct likelihood estimation of the parameters of such a model is analytically intractable, and so the Bayesian Markov chain Monte Carlo (MCMC) method is developed here for this model for obtaining the maximum likelihood estimates (MLEs) efficiently. For an illustration of the proposed model and the method of inference, a simulated example is presented along with the associated computational results.
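
The two ingredients specific to this model can be sketched directly: the Frank copula density, and a logistic function mapping the stress level to the copula parameter. The particular logistic form and all numerical values below are hypothetical; the full bivariate Wiener likelihood and the MCMC estimation are not reproduced.

```python
import numpy as np

def frank_copula_density(u, v, theta):
    """Density of the Frank copula with parameter theta (theta != 0)."""
    num = theta * (1.0 - np.exp(-theta)) * np.exp(-theta * (u + v))
    den = ((1.0 - np.exp(-theta))
           - (1.0 - np.exp(-theta * u)) * (1.0 - np.exp(-theta * v))) ** 2
    return num / den

def copula_parameter(stress, a, b, theta_max):
    """Hypothetical logistic link: the copula parameter increases with stress toward theta_max."""
    return theta_max / (1.0 + np.exp(-(a + b * stress)))

# Example: dependence strength and copula density at three hypothetical stress levels
for s in [40.0, 60.0, 80.0]:
    theta = copula_parameter(s, a=-3.0, b=0.06, theta_max=12.0)
    print(f"stress {s:5.1f}: theta = {theta:5.2f}, "
          f"c(0.7, 0.8) = {frank_copula_density(0.7, 0.8, theta):.3f}")
```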

11.
The exponential distribution has extensive applications in reliability. Introducing a shape parameter to this distribution has produced various distribution functions. In their 2009 study, Gupta and Kundu introduced another distribution using Azzalini's method, which is applicable in reliability and is named the weighted exponential (WE) distribution. The parameters of this distribution have recently been estimated by the above two authors using classical statistics. In this paper, Bayesian estimates of the parameters are derived. To achieve this purpose, we use Lindley's approximation method for the integrals that cannot be solved in closed form. Furthermore, a Gibbs sampling procedure is used to draw Markov chain Monte Carlo samples from the posterior distribution indirectly, and then the Bayes estimates of the parameters are derived. The estimation of the reliability and hazard functions is also discussed. At the end of the paper, some comparisons between classical and Bayesian estimation methods are studied by using a Monte Carlo simulation study. The simulation study incorporates complete and Type-II censored samples.
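
For reference, the sketch below assumes the weighted exponential density has the form f(x) = ((alpha+1)/alpha) * lambda * exp(-lambda*x) * (1 - exp(-alpha*lambda*x)) for x > 0, codes the corresponding reliability and hazard functions, and obtains maximum likelihood estimates numerically from hypothetical data. The Lindley-approximation and Gibbs-sampling Bayes estimates of the paper are not reproduced.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

def we_pdf(x, alpha, lam):
    """Assumed weighted exponential density with shape alpha and rate lam."""
    return (alpha + 1.0) / alpha * lam * np.exp(-lam * x) * (1.0 - np.exp(-alpha * lam * x))

def we_reliability(t, alpha, lam):
    """Reliability R(t), obtained here by numerical integration of the density."""
    return quad(we_pdf, t, np.inf, args=(alpha, lam))[0]

def we_hazard(t, alpha, lam):
    return we_pdf(t, alpha, lam) / we_reliability(t, alpha, lam)

rng = np.random.default_rng(5)

def we_sample(n, alpha, lam):
    """Rejection sampling against an exponential envelope (acceptance prob 1 - exp(-alpha*lam*x))."""
    out = []
    while len(out) < n:
        x = rng.exponential(1.0 / lam)
        if rng.uniform() < 1.0 - np.exp(-alpha * lam * x):
            out.append(x)
    return np.array(out)

data = we_sample(100, alpha=1.5, lam=0.5)   # hypothetical complete sample

def neg_loglik(params):
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    return -np.sum(np.log(we_pdf(data, alpha, lam)))

res = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
alpha_hat, lam_hat = res.x
print("MLE (alpha, lambda):", res.x)
print("R(2.0) =", we_reliability(2.0, alpha_hat, lam_hat),
      " h(2.0) =", we_hazard(2.0, alpha_hat, lam_hat))
```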

12.
Temporal aggregation of cyclical models with business cycle applications
This paper focuses on temporal aggregation of the cyclical component model as introduced by Harvey (1989). More specifically, it provides the properties of the aggregate process for any generic period of aggregation. As a consequence, the exact link between the aggregate and disaggregate parameters can be easily derived. The cyclical model is important due to its relevance in the analysis of the business cycle. Given this, two empirical applications are presented in order to compare the estimated parameters of the quarterly models for German and US gross domestic products with those of the corresponding models aggregated to annual frequency.
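
To make the aggregation issue concrete, the sketch below simulates a Harvey-style stochastic cycle at quarterly frequency and aggregates it to annual sums, then compares the sample autocorrelations of the two series. The recursion used here, psi_t = rho*(cos(lam)*psi_{t-1} + sin(lam)*psi*_{t-1}) + kappa_t (with its companion equation), and all numerical values are assumptions for illustration; the paper's analytical aggregation results are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)
rho, lam, sigma_k = 0.9, 2 * np.pi / 20, 1.0   # damping, frequency (period 20 quarters), shock sd
n_quarters = 400

# Simulate the stochastic cycle (psi, psi_star) at the quarterly frequency
psi = np.zeros(n_quarters)
psi_star = np.zeros(n_quarters)
for t in range(1, n_quarters):
    k, k_star = rng.normal(scale=sigma_k, size=2)
    psi[t] = rho * (np.cos(lam) * psi[t-1] + np.sin(lam) * psi_star[t-1]) + k
    psi_star[t] = rho * (-np.sin(lam) * psi[t-1] + np.cos(lam) * psi_star[t-1]) + k_star

# Temporal aggregation: annual series formed as the sum of four quarters (flow variable)
annual = psi.reshape(-1, 4).sum(axis=1)

def acf(x, max_lag):
    x = x - x.mean()
    return np.array([np.sum(x[:len(x)-h] * x[h:]) / np.sum(x * x) for h in range(1, max_lag + 1)])

print("quarterly ACF (lags 1-4):", np.round(acf(psi, 4), 3))
print("annual    ACF (lags 1-4):", np.round(acf(annual, 4), 3))
```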

13.
In this paper, the Erlang–Lindley distribution (ErLD) is proposed, which offers a more flexible model for waiting time data. It has the property that it can accommodate increasing, bathtub, and inverted bathtub shapes. Several statistical and reliability properties are derived and studied. The moments, their associated measures, and the limiting distributions of order statistics are derived. The model parameters are estimated by maximum likelihood and the method of moments. An application of the proposed distribution to some waiting time data shows that it can give a better fit than other important lifetime models.

14.
The step-stress accelerated degradation test (SSADT) plays an important role in assessing the lifetime distribution of highly reliable products under normal operating conditions when not enough test units are available for testing purposes. Recently, optimal SSADT plans have been presented based on the underlying assumption that there is only one performance characteristic. However, many highly reliable products usually have a complex structure, with their reliability being evaluated by two or more performance characteristics. At the same time, the degradation of these performance characteristics is always positive and strictly increasing. In such a case, the gamma process is usually considered as the degradation process due to its independent and non-negative increments. Therefore, it is of great interest to design an efficient SSADT plan for products with multiple performance characteristics based on gamma processes. In this work, we first introduce the reliability model of degradation products with two performance characteristics based on gamma processes, and then present the corresponding SSADT model. Next, under the constraint of total experimental cost, the optimal settings such as sample size, measurement times, and measurement frequency are obtained by minimizing the asymptotic variance of the estimated 100q-th percentile of the product's lifetime distribution. Finally, a numerical example is given to illustrate the proposed procedure.
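
A minimal sketch of the degradation model underlying such a plan: each performance characteristic degrades according to a gamma process whose shape rate depends on the stress level (here through a hypothetical log-linear link), and the stress is raised at fixed change times. The sample size, measurement schedule, and all parameter values are assumptions; the cost-constrained optimization of the plan itself is not implemented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical step-stress schedule: stress is held constant on each interval
stress_levels = [50.0, 65.0, 80.0]
change_times = [0.0, 100.0, 200.0, 300.0]   # stress is raised at t = 100 and t = 200
dt = 5.0                                    # measurement interval

def shape_rate(stress, a, b):
    """Hypothetical log-linear acceleration: gamma-process shape rate per unit time."""
    return np.exp(a + b * stress)

# Two PCs, each with its own acceleration parameters and scale parameter
params = [dict(a=-4.0, b=0.05, scale=0.8),   # PC 1
          dict(a=-4.5, b=0.06, scale=1.0)]   # PC 2

def simulate_unit():
    """Simulate one test unit's two degradation paths over the step-stress schedule."""
    times = np.arange(0.0, change_times[-1] + dt, dt)
    paths = np.zeros((len(times), 2))
    for j in range(1, len(times)):
        t_mid = 0.5 * (times[j-1] + times[j])
        k = np.searchsorted(change_times, t_mid) - 1   # which stress step we are in
        s = stress_levels[k]
        for p, prm in enumerate(params):
            # Gamma-process increment: shape = rate(s)*dt, scale = prm["scale"]
            incr = rng.gamma(shape=shape_rate(s, prm["a"], prm["b"]) * dt, scale=prm["scale"])
            paths[j, p] = paths[j-1, p] + incr
    return times, paths

times, paths = simulate_unit()
print("degradation of the two PCs at the end of the test:", paths[-1])
```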

15.
Storage reliability, which measures the ability of products in a dormant state to retain their required functions, is studied in this paper. Unlike operational reliability, storage reliability for certain types of products may not be 100% at the beginning of storage, since there may be initial failures that are normally neglected in storage reliability models. In this paper, a new combined approach is proposed for estimating and predicting storage reliability with possible initial failures: a nonparametric measure for estimating the number of failed products and the current reliability at each testing time in storage, and a parametric measure for estimating the initial reliability and the failure rate based on the exponential reliability function. The proposed method takes into consideration that initial failure data and reliability testing data, before and during the storage process, are available, which provides more accurate estimates of both the initial failure probability and the probability of storage failures. When a storage reliability prediction, the main concern in this field, is to be made, the nonparametric estimates of failure numbers can be used in the parametric models for the failure process in storage. For the case of exponential models, the assessment and prediction method for storage reliability is provided in this paper. Finally, numerical examples are given to illustrate the method. Furthermore, a detailed comparison between the proposed method and the traditional method is presented to examine the rationality of the assessment and prediction of storage reliability. The results should be useful for planning a storage environment, making decisions concerning the maximum length of storage, and identifying production quality.
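
Under the exponential model with a possible initial failure probability, storage reliability can be written as R(t) = R0 * exp(-lambda*t), where 1 - R0 is the initial failure probability. The sketch below first forms nonparametric reliability estimates at each testing time from hypothetical counts of failed products, then fits R0 and lambda by least squares on the log scale and uses the fit for prediction. The data and the fitting choice are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

# Hypothetical storage test data: testing times (months), units tested, units found failed
t = np.array([0.0, 6.0, 12.0, 24.0, 36.0])
n_tested = np.array([100, 100, 100, 100, 100])
n_failed = np.array([3, 6, 8, 13, 17])

# Nonparametric reliability estimate at each testing time
R_hat = 1.0 - n_failed / n_tested

# Parametric model R(t) = R0 * exp(-lam * t)  =>  log R(t) = log R0 - lam * t
slope, intercept = np.polyfit(t, np.log(R_hat), 1)
R0 = np.exp(intercept)          # initial reliability (1 - initial failure probability)
lam = -slope                    # storage failure rate

print(f"initial reliability R0 = {R0:.4f}  (initial failure probability {1 - R0:.4f})")
print(f"storage failure rate   = {lam:.5f} per month")
for horizon in [48.0, 60.0]:
    print(f"predicted reliability after {horizon:.0f} months of storage:",
          round(R0 * np.exp(-lam * horizon), 4))
```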

16.
The concept of reliability denotes one of the most important psychometric properties of a measurement scale. Reliability refers to the capacity of the scale to discriminate between subjects in a given population. In classical test theory, it is often estimated by using the intraclass correlation coefficient based on two replicate measurements. However, the modelling framework that is used in this theory is often too narrow when applied in practical situations. Generalizability theory has extended reliability theory to a much broader framework but is confronted with some limitations when applied in a longitudinal setting. We explore how the definition of reliability can be generalized to a setting where subjects are measured repeatedly over time. On the basis of four defining properties for the concept of reliability, we propose a family of reliability measures which circumscribes the area in which reliability measures should be sought. It is shown how different members assess different aspects of the problem and that the reliability of the instrument can depend on the way that it is used. The methodology is motivated by and illustrated on data from a clinical study on schizophrenia. On the basis of this study, we estimate and compare the reliabilities of two different rating scales to evaluate the severity of the disorder.
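
In the classical two-replicate setting mentioned above, the intraclass correlation coefficient can be estimated from the between-subject and residual variance components: ICC = var_subject / (var_subject + var_residual). The sketch below does this with the usual one-way ANOVA estimator on hypothetical data; the longitudinal generalizations proposed in the paper are not implemented.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical data: n subjects, each measured twice with the same rating scale
n, k = 60, 2
true_subject = rng.normal(scale=3.0, size=n)                               # between-subject variation
scores = 50 + true_subject[:, None] + rng.normal(scale=2.0, size=(n, k))   # two replicates

# One-way ANOVA variance components
subject_means = scores.mean(axis=1)
grand_mean = scores.mean()
msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)          # between-subject mean square
msw = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))   # within-subject mean square

var_subject = max((msb - msw) / k, 0.0)   # between-subject variance component
icc = var_subject / (var_subject + msw)   # reliability: capacity to discriminate between subjects
print(f"between-subject variance: {var_subject:.2f}, residual variance: {msw:.2f}")
print(f"estimated reliability (ICC): {icc:.3f}")
```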

17.
From the viewpoint of service level agreements (SLAs), Internet service providers and customers are gradually focusing on transmission accuracy. The Internet service provider should provide each customer with the specific bandwidth and individual accuracy rate required by their SLAs. This paper mainly evaluates the system reliability, defined as the probability that a stochastic computer network can fulfill all requirements at all sinks. An efficient algorithm is proposed to generate the lower boundary points, that is, the minimal capacity vectors satisfying the demand and accuracy rate requirements for all sinks. The system reliability can then be computed in terms of such points by applying the recursive sum of disjoint products.

18.
For reliability-critical and expensive products, it is necessary to estimate their residual lives based on available information, such as degradation data, so that proper maintenance actions can be arranged to reduce or even avoid the occurrence of failures. In this work, by assuming that the product-to-product variability of the degradation is characterized by a skew-normal distribution, a generalized Wiener process-based degradation model is developed. Following that, the issue of residual life (RL) estimation of the target product is addressed in detail. The proposed degradation model provides greater flexibility to capture a variety of degradation processes, since several commonly used Wiener process-based degradation models can be seen as special cases. Through the EM algorithm, the population-based degradation information is used to estimate the parameters of the model. Whenever new degradation measurement information of the target product becomes available, the degradation model is first updated based on the Bayesian method. In this way, the RL of the target product can be estimated in an adaptive manner. Finally, the developed methodology is demonstrated by a simulation study.
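
A short sketch of the random-effects idea: unit-to-unit variability is introduced by drawing each unit's drift from a skew-normal distribution, and each path then evolves as a Wiener process with that drift. The parameter values are hypothetical, and neither the EM estimation nor the Bayesian updating described in the paper is reproduced.

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(9)

n_units, n_steps, dt = 8, 100, 0.1
sigma = 0.3                                   # diffusion coefficient (common to all units)
# Unit-specific drifts drawn from a skew-normal distribution (shape a controls skewness)
drifts = skewnorm.rvs(a=4.0, loc=0.5, scale=0.4, size=n_units, random_state=rng)

t = np.arange(n_steps + 1) * dt
paths = np.zeros((n_units, n_steps + 1))
for i in range(n_units):
    incr = rng.normal(loc=drifts[i] * dt, scale=sigma * np.sqrt(dt), size=n_steps)
    paths[i, 1:] = np.cumsum(incr)

print("unit-specific drifts         :", np.round(drifts, 3))
print("degradation at the final time:", np.round(paths[:, -1], 3))
```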

19.
Environmental variables have an important effect on the reliability of many products such as coatings and polymeric composites. Long-term prediction of the performance or service life of such products must take into account the probabilistic/stochastic nature of the outdoor weather. In this article, we propose a time series modeling procedure for daily accumulated degradation data. Daily accumulated degradation is the total amount of degradation accrued within one day and can be obtained by using a degradation rate model for the product together with the weather data. The fitted time series model can then be used to estimate the future distribution of cumulative degradation over a period of time, and to compute reliability measures such as the probability of failure. The modeling technique and estimation method are illustrated using the degradation of a solar reflector material. We also provide a method to construct approximate confidence intervals for the probability of failure.
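
A simplified sketch of the procedure: fit a first-order autoregression to hypothetical daily accumulated degradation, Monte Carlo simulate future daily values from the fitted model, accumulate them, and estimate the probability that cumulative degradation crosses a failure threshold within the prediction horizon. The AR(1) choice, the threshold, and all numbers are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical daily accumulated degradation (e.g. from a degradation rate model + weather data)
n_days = 730
daily = 0.05 + 0.02 * rng.standard_normal(n_days)
for j in range(1, n_days):                      # inject some day-to-day autocorrelation
    daily[j] += 0.5 * (daily[j-1] - 0.05)

# Fit an AR(1) model  d_t = c + phi * d_{t-1} + e_t  by ordinary least squares
y, x = daily[1:], daily[:-1]
phi, c = np.polyfit(x, y, 1)
resid_sd = np.std(y - (c + phi * x), ddof=2)

# Monte Carlo: simulate future daily degradation, accumulate, and check threshold crossing
horizon, threshold, n_sim = 365 * 3, 90.0, 2000
already = daily.sum()                           # degradation accumulated so far
crossed = 0
for _ in range(n_sim):
    d, cum = daily[-1], already
    for _ in range(horizon):
        d = c + phi * d + rng.normal(scale=resid_sd)
        cum += max(d, 0.0)                      # treat negative simulated values as zero degradation
        if cum >= threshold:
            crossed += 1
            break

print(f"estimated probability of failure within {horizon} days: {crossed / n_sim:.3f}")
```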

20.
Meta-analytical approaches have been extensively used to analyze medical data. In most cases, the data come from different studies or independent trials with similar characteristics. However, these methods can be applied in a broader sense. In this paper, we show how existing meta-analytic techniques can also be used when dealing with parameters estimated from individual hierarchical data. Specifically, we propose to apply statistical methods that account for the variances (and possibly covariances) of such measures. The estimated parameters together with their estimated variances can be incorporated into a general linear mixed model framework. We illustrate the methodology using data from a first-in-man study and a simulated data set. The analysis was implemented with the SAS procedure MIXED and example code is offered.
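
A small sketch of the inverse-variance idea the paper builds on: each study (or each subject-level fit) contributes an estimated parameter and its estimated variance, and a DerSimonian-Laird random-effects model pools them. This plain-Python version stands in for the SAS PROC MIXED implementation mentioned in the abstract; the estimates and variances below are hypothetical.

```python
import numpy as np

# Hypothetical parameter estimates and their estimated variances (one per trial / subject-level fit)
y = np.array([0.42, 0.31, 0.55, 0.48, 0.26, 0.39])
v = np.array([0.010, 0.020, 0.015, 0.008, 0.025, 0.012])

# DerSimonian-Laird estimate of the between-study variance tau^2
w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled estimate and its standard error
w_star = 1.0 / (v + tau2)
y_pooled = np.sum(w_star * y) / np.sum(w_star)
se_pooled = np.sqrt(1.0 / np.sum(w_star))

print(f"tau^2 = {tau2:.4f}")
print(f"pooled estimate = {y_pooled:.3f} (95% CI {y_pooled - 1.96*se_pooled:.3f} "
      f"to {y_pooled + 1.96*se_pooled:.3f})")
```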

