Similar Documents (20 results)
1.
Continuing the studies of Johnson et al. (1980) and Johnson and Kotz (1981), further distributions arising from models of errors in the inspection and grading of samples from finite, possibly stratified lots are obtained. Screening and hierarchical screening forms of inspection are also considered, and the effects of errors on the advantages of these techniques are assessed.

2.
Graff and Roeloffs' (1972) modification of the Dorfman (1943) screening procedure, and their analysis of the effects of inspection error on properties of the procedure, are extended to hierarchical procedures using the results of Kotz and Johnson (1982).
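For orientation, a minimal sketch of the error-free Dorfman (1943) baseline that these papers build on: with pools of size k and item defect rate p, the expected number of tests per item is 1/k + 1 - (1-p)^k. The Graff-Roeloffs inspection-error model and the hierarchical extension are not reproduced here.

```python
# Expected tests per item under classical Dorfman (1943) group screening,
# assuming error-free inspection (the baseline case only).

def dorfman_tests_per_item(p: float, k: int) -> float:
    """One pooled test is always run (cost 1/k per item); if the pool is
    positive -- probability 1 - (1 - p)**k -- all k members are retested
    individually (additional cost 1 per item)."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

# Example: at p = 0.01, pools of k = 10 need about 0.196 tests per item,
# roughly an 80% saving over testing every item individually.
if __name__ == "__main__":
    for k in (2, 5, 10, 20):
        print(k, round(dorfman_tests_per_item(0.01, k), 4))
```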

3.
4.
5.
We derive an explicit and computationally convenient form for the probability density function of the estimator of the process capability index Cpmk (Pearn, Kotz and Johnson) when sampling from a normal distribution.
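For reference, the index itself has the standard form Cpmk = min(USL - mu, mu - LSL) / (3 * sqrt(sigma^2 + (mu - T)^2)). A minimal sketch of its plug-in estimator follows; this is not the closed-form density derived in the paper, and the variance divisor is an illustrative choice.

```python
import numpy as np

def cpmk_hat(x: np.ndarray, lsl: float, usl: float, target: float) -> float:
    """Plug-in estimator of Cpmk = min(USL - mu, mu - LSL)
                                   / (3 * sqrt(sigma^2 + (mu - T)^2)),
    with mu and sigma^2 replaced by the sample mean and variance."""
    mu_hat = x.mean()
    var_hat = x.var(ddof=1)  # divisor n-1 is a convention, not prescribed here
    denom = 3.0 * np.sqrt(var_hat + (mu_hat - target) ** 2)
    return min(usl - mu_hat, mu_hat - lsl) / denom
```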

6.
The well-known Johnson system of distributions was developed by N. L. Johnson (1949). Slifker and Shapiro (1980) presented a criterion for choosing a member from the three distributional classes (SB, SL, and SU) in the Johnson system to fit a set of data. The criterion is based on the value of a quantile ratio, which depends on a specified positive z value and the parameters of the distribution. In this paper, we present some properties of the quantile ratio for various distributions and for some selected z values. Some comments are made on using the criterion for selecting a Johnson distribution to fit empirical data.
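A minimal sketch of the Slifker-Shapiro selection rule as commonly stated: compare the quantile ratio mn/p^2, built from the four sample quantiles matched to the standard-normal points -3z, -z, z, 3z, against 1. The default z and tolerance below are illustrative choices, not the paper's recommendations.

```python
import numpy as np
from scipy.stats import norm

def johnson_family(x: np.ndarray, z: float = 0.524, tol: float = 0.05) -> str:
    """Slifker-Shapiro (1980) selection rule, sketched from the abstract.

    Take the four sample quantiles matching standard-normal points
    -3z, -z, z, 3z, form m = x(3z)-x(z), n = x(-z)-x(-3z), p = x(z)-x(-z),
    and classify by the quantile ratio m*n / p**2:
       > 1 -> SU (unbounded),  < 1 -> SB (bounded),  ~ 1 -> SL (lognormal).
    """
    q = np.quantile(x, norm.cdf([-3 * z, -z, z, 3 * z]))
    m, n, p = q[3] - q[2], q[1] - q[0], q[2] - q[1]
    ratio = m * n / p**2
    if abs(ratio - 1.0) <= tol:
        return "SL"
    return "SU" if ratio > 1.0 else "SB"
```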

7.
The influence of individual points in an ordinal logistic model is considered when the aim is to determine their effects on the predictive probability in a Bayesian predictive approach. Our concern is to study the effects produced when the data are slightly perturbed, in particular by observing how these perturbations affect the predictive probabilities and consequently the classification of future cases. We consider the extent of the change in the predictive distribution when an individual point is omitted (deleted) from the sample, by use of a divergence measure suggested by Johnson (1985) as a measure of discrepancy between the full data and the data with the case deleted. The methodology is illustrated on some data used in Titterington et al. (1981).
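As an illustration of the case-deletion workflow, a sketch using the Kullback-Leibler divergence as a generic stand-in; the specific discrepancy measure of Johnson (1985) is not reproduced here.

```python
import numpy as np

def kl_divergence(p_full: np.ndarray, p_deleted: np.ndarray) -> float:
    """KL divergence between the predictive class probabilities under the
    full data (p_full) and with a case deleted (p_deleted); a generic
    stand-in for the Johnson (1985) measure, not the paper's exact choice.
    Both arguments are probability vectors over the ordinal categories."""
    p_full = np.asarray(p_full, float)
    p_deleted = np.asarray(p_deleted, float)
    mask = p_full > 0
    return float(np.sum(p_full[mask] * np.log(p_full[mask] / p_deleted[mask])))

# Cases whose deletion yields a large divergence are flagged as influential.
```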

8.
This article compares four methods used to approximate value at risk (VaR) from the first four moments of a probability distribution: Cornish–Fisher, Edgeworth, Gram–Charlier, and Johnson distributions. Increasing rearrangements are applied to the first three methods. Simulation results suggest that for large sample situations, Johnson distributions yield the most accurate VaR approximation. For small sample situations with small tail probabilities, Johnson distributions yield the worst approximation. A particularly relevant case would be in banking applications for calculating the size of operational risk to cover certain loss types. For this case, the rearranged Gram–Charlier method is recommended.
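A sketch of the first of the four methods, the fourth-moment Cornish-Fisher quantile approximation, shown without the increasing rearrangement the authors also apply; sign conventions for quoting VaR vary.

```python
import numpy as np
from scipy.stats import norm

def cornish_fisher_quantile(alpha, mu, sigma, skew, ex_kurt):
    """Fourth-moment Cornish-Fisher approximation to the alpha-quantile.

    skew is the skewness and ex_kurt the excess kurtosis of the target
    distribution; z is the standard-normal alpha-quantile."""
    z = norm.ppf(alpha)
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * ex_kurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return mu + sigma * z_cf

# e.g. a 1% quantile for a left-skewed P&L distribution:
# var_1pct = -cornish_fisher_quantile(0.01, mu=0.0, sigma=1.0,
#                                     skew=-0.5, ex_kurt=1.2)
```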

9.
The Wehrly–Johnson family of bivariate circular distributions is by far the most general one currently available for modelling data on the torus. It allows complete freedom in the specification of the marginal circular densities as well as the binding circular density which regulates any dependence that might exist between them. We propose a parametric bootstrap approach for testing the goodness-of-fit of Wehrly–Johnson distributions when the forms of their marginal and binding densities are assumed known. The approach admits the use of any test for toroidal uniformity, and we consider versions of it incorporating three such tests. Simulation is used to illustrate the operating characteristics of the approach when the underlying distribution is assumed to be bivariate wrapped Cauchy. An analysis of wind direction data recorded at a Texan weather station illustrates the use of the proposed goodness-of-fit testing procedure.
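The parametric-bootstrap skeleton underlying the approach can be sketched generically; the Wehrly-Johnson-specific pieces (fitting the marginal and binding densities, and the chosen test of toroidal uniformity) are abstracted into user-supplied callables here.

```python
import numpy as np

def parametric_bootstrap_pvalue(data, fit, simulate, statistic, n_boot=999,
                                rng=None):
    """Generic parametric-bootstrap goodness-of-fit p-value.

    fit(data) -> theta_hat; simulate(theta_hat, n, rng) -> synthetic sample;
    statistic(data, theta_hat) -> test statistic (large values = bad fit).
    All three callables are placeholders for the paper's circular machinery.
    """
    rng = np.random.default_rng(rng)
    theta_hat = fit(data)
    t_obs = statistic(data, theta_hat)
    exceed = 0
    for _ in range(n_boot):
        boot = simulate(theta_hat, len(data), rng)
        if statistic(boot, fit(boot)) >= t_obs:
            exceed += 1
    return (1 + exceed) / (n_boot + 1)
```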

10.
We describe an algorithm to fit an SU curve of the Johnson system by moment matching. The algorithm follows from a new parametrization and reduces the problem to a root-finding procedure that can be implemented efficiently using a bisection or Newton-Raphson method. This allows the four parameters of the Johnson curve to be determined to any desired degree of accuracy, and is fast enough to be implemented in a real-time setting. A practical application of the method lies in the fact that many firms use the Johnson system to manage financial risk.
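To fix notation: a Johnson SU variate is X = xi + lam * sinh((Z - gamma) / delta) with Z standard normal. The sketch below draws samples from that transform; the paper's new parametrization and root-finding fit are not reproduced.

```python
import numpy as np

def johnson_su_sample(gamma, delta, xi, lam, size, rng=None):
    """Draw from a Johnson SU distribution via its defining transform
    X = xi + lam * sinh((Z - gamma) / delta), Z ~ N(0, 1)."""
    rng = np.random.default_rng(rng)
    z = rng.standard_normal(size)
    return xi + lam * np.sinh((z - gamma) / delta)

# A moment-matching fit would adjust (gamma, delta, xi, lam) until the first
# four sample moments match their targets. For checking a candidate fit,
# scipy.stats.johnsonsu gives the same family under the parametrization
# (a=gamma, b=delta, loc=xi, scale=lam).
```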

11.
The expansion, in standard form, consists of some 66 terms involving polynomials in a normal deviate, and cumulants and cumulant products to order ten. An assumed order of magnitude reduces these terms to eight groups. Sign patterns in the terms are not obvious. We take a number of Pearson densities and assess from the expansions a set of standard percentiles (1%, 5%, 95%, 99%). Validity of the assessments is pivoted on two alternative models: (i) the Bowman-Shenton algorithm for percentage points of Pearson densities, and (ii) the 4-moment Johnson translation model. This approach has wide application, since the models have proved to be remarkably reliable when compared with one another, and also when compared with simulation assessments. A brief account is given of acceleration of convergence for the series, but there seems to be no analogue of the Padé or Levin algorithms.

The Cornish-Fisher application to the Fisher z-statistic is studied and the cumulants defined in general. Irwin's expression for the density of means from Pearson Type II is recalled. There is an error in the Cornish-Fisher treatment of the z-statistic, but it is one which has its source in the write-up. Again, the Irwin density in the general case has a factor missing.

12.
13.
We consider a model when a process involving the production of elements is under inspection. The elements have possible failures due to competing risks. We assume the availability of a data set of failure times, D1, obtained when the process is under control. Our main goal is to test whether the failure rates in D1 are equal to or less than the failure rates in another data set, D2, against "undesirable" neighbouring alternatives. A class of tests based on a two-dimensional vector statistic is obtained. Linear test statistics with weight functions giving optimal local asymptotic power are derived. Martingale techniques are used. An example illustrates the derivation of reasonable tests.

14.
15.
A regular supply of applicants to Queen's University in Kingston, Ontario, is provided by 65 high schools. Each high school can be characterized by a series of grading standards which change from year to year. To aid admissions decisions, it is desirable to forecast the current year's grading standards for all 65 high schools using grading standards estimated from past years' data. We develop and apply a Bayesian break-point time-series model that generates forecasts which involve smoothing across time for each school and smoothing across schools. “Break point” refers to a point in time which divides the past into the “old past” and the “recent past”, where the yearly observations in the recent past are exchangeable with the observations in the year to be forecast. We show that this model works fairly well when applied to 11 years of Queen's University data. The model can be applied to other data sets with the parallel time-series structure and short history, and can be extended in several ways to more complicated structures.

16.
In this paper we shall deal with acceptance sampling plans when the remainder of rejected lots is inspected. We shall consider two types of LTPD plans: for inspection by variables, and for inspection by variables and attributes (all items from the sample are inspected by variables; the remainder of rejected lots is inspected by attributes). We shall report on an algorithm allowing the calculation of these plans when the non-central t distribution is used for the operating characteristic. The calculation is considerably difficult: the algorithm for the non-central t distribution takes several minutes. For the calculation we shall use an original method.
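For context, the exact OC curve of a standard single-limit variables plan (sigma unknown) is a noncentral t probability, which is the distribution whose evaluation the authors note dominates the computing time. A minimal sketch follows; the remainder-inspection LTPD plans themselves are not reproduced.

```python
from scipy.stats import nct, norm

def oc_variables_plan(p, n, k):
    """Probability of accepting a lot with fraction nonconforming p under a
    variables plan with a single lower specification limit L:
    accept iff (xbar - L) / s >= k, with sigma unknown.

    If P(X < L) = p, then sqrt(n) * (xbar - L) / s follows a noncentral t
    with n-1 degrees of freedom and noncentrality sqrt(n) * z_{1-p}."""
    delta = norm.ppf(1.0 - p) * n ** 0.5   # noncentrality parameter
    return float(nct.sf(k * n ** 0.5, df=n - 1, nc=delta))

# e.g. oc_variables_plan(p=0.01, n=50, k=2.0) gives the acceptance
# probability at 1% nonconforming for this illustrative plan.
```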

17.
From the perspective of efficiency and social welfare, we construct a Cournot competition model of mixed-ownership enterprises and private enterprises, and find that the factors affecting enterprise efficiency and social welfare are quite complex, with no definite conclusion as to which type is superior. When the two have the same technical efficiency, however, a mixed-ownership enterprise not only earns higher profits but also brings greater welfare to society; this is the theoretical mechanism for its development. Using data on more than 300,000 firms from the China Industrial Enterprise Database (《中国工业企业数据库》) for 1998-2007, a three-year rolling-window test finds that the indicators of mixed-ownership enterprises have been rising steadily, showing that they are a type of enterprise with great development potential; this is the practical mechanism for their development.

18.
In this paper we consider properties of the logarithmic and Tukey's lambda-type transformations of random variables that follow beta or unit-gamma distributions. Beta distributions often arise as models for random proportions, and unit-gamma distributions, although not well-known, may serve the same purpose. The latter possess many properties similar to those of beta distributions. Some transformations of random variables that follow a beta distribution are considered by Johnson (1949) and Johnson and Kotz (1970, 1973). These are used to obtain a "new" random variable that potentially approximately follows a normal distribution, so that practical analyses become possible. We study normality-related properties of the above transformations. This is done for the first time for unit-gamma distributions. Under the logarithmic transformation the beta and unit-gamma distributions become, respectively, the logarithmic F and generalized logistic distributions. The distributions of the transformed beta and unit-gamma distributions after application of Tukey's lambda-type transformations cannot be derived easily; however, we obtain the first four moments and expressions for the skewness and kurtosis of the transformed variables. Values of skewness and kurtosis for a variety of different parameter values are calculated, and in consequence, the near (or not near) normality of the transformed variables is evaluated. Comments on the use of the various transformations are provided.
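As an illustration of the normality check the paper performs analytically, here is a simulation sketch of the logit-type logarithmic transform of a beta variate; whether this matches the paper's exact transform is an assumption, and the unit-gamma case is omitted.

```python
import numpy as np
from scipy.stats import beta, skew, kurtosis

rng = np.random.default_rng(0)

def logit_beta_shape(a, b, n=200_000):
    """Simulate Y = log(X / (1 - X)) for X ~ Beta(a, b), a logistic-type
    (log F) variable, and report its skewness and excess kurtosis -- the
    quantities used to judge how nearly normal the transform is."""
    x = beta.rvs(a, b, size=n, random_state=rng)
    y = np.log(x / (1.0 - x))
    return skew(y), kurtosis(y)  # kurtosis() returns excess kurtosis

# Symmetric case: Beta(2, 2) gives skewness near 0; for Beta(2, 8) the
# transform is noticeably skewed, so normality is a poorer approximation.
```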

19.
In this article, we apply the simulated annealing algorithm to determine optimally spaced inspection times for the two-parameter Weibull distribution for any given progressive Type-I grouped censoring plan. We examine how the asymptotic relative efficiencies of the estimates are affected by the position of the monitoring points and the number of monitoring points used. A comparison of different inspection plans is made that will enable the user to select a plan for a specified quality goal. Using the same algorithm, we can also determine an optimal progressive Type-I grouped censoring plan when the inspection times and the expected proportions of total failures in the experiment are pre-fixed. Finally, we discuss the sample size and the acceptance constant of the progressively Type-I grouped censored reliability sampling plan when the optimal inspection times are used.
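A bare-bones sketch of the simulated-annealing search over inspection times; the objective `criterion` is a hypothetical placeholder (e.g. some function of the asymptotic variances of the Weibull estimates), not the authors' actual efficiency measure.

```python
import numpy as np

def anneal_inspection_times(criterion, t_init, n_iter=5000, t0=1.0,
                            cooling=0.999, step=0.05, rng=None):
    """Minimize criterion(times) over a vector of ordered inspection times
    by simulated annealing; a generic sketch of the search strategy only."""
    rng = np.random.default_rng(rng)
    cur = np.sort(np.asarray(t_init, float))
    cur_val, temp = criterion(cur), t0
    best, best_val = cur.copy(), cur_val
    for _ in range(n_iter):
        # Perturb all times, then re-sort to keep them ordered.
        cand = np.sort(cur + rng.normal(0.0, step, cur.shape))
        cand_val = criterion(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if cand_val < cur_val or rng.random() < np.exp((cur_val - cand_val) / temp):
            cur, cur_val = cand, cand_val
            if cur_val < best_val:
                best, best_val = cur.copy(), cur_val
        temp *= cooling
    return best, best_val
```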

20.
This paper attempts to develop a repetitive group sampling (RGS) plan by variables inspection for controlling the process fraction defective or the number of nonconformities when the quality characteristic follows a normal distribution and has only a lower or upper specification limit. The proposed sampling plan is derived from the exact sampling distribution rather than an approximation approach. The plan parameters are solved by a nonlinear optimization model which minimizes the average sample number required for inspection and fulfills the classical two-point conditions on the operating characteristic (OC) curve. The efficiency of the proposed variables RGS plan is examined and compared with the existing variables single sampling plan in terms of the sample size required for inspection. The results indicate that the proposed variables RGS plan can significantly reduce the number of samples required for inspection compared to the traditional variables single sampling plan.
