Similar Articles
A total of 20 similar articles were found.
1.
This paper presents a new Laplacian approximation to the posterior density of η = g(θ). It has a simpler analytical form than the approximation described by Leonard et al. (1989). The approximation of Leonard et al. requires a conditional information matrix Rη to be positive definite for every fixed η; in many cases, however, not all Rη are positive definite, and the computation of their approximation fails because it cannot be normalized. The new approximation can be modified so that the corresponding conditional information matrix is positive definite for every fixed η. In addition, a Bayesian procedure for contingency-table model checking is provided. An example of the cross-classification between a wife's educational level and a couple's fertility-planning status is used for illustration. Various Laplacian approximations are computed and compared in this example and in an example of public school expenditures, in the context of a Bayesian analysis of the multiparameter Fisher-Behrens problem.
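For context, the sketch below shows a basic (first-order) Laplace approximation to a posterior density for a scalar parameter, checked against a conjugate case where the exact posterior is known. It is not the paper's modified approximation for η = g(θ); the Poisson/Gamma model, the data, and the prior values are illustrative assumptions.

```python
# A minimal sketch of a basic Laplace approximation to a posterior density
# (NOT the modified approximation of the paper): the posterior of a Poisson
# rate theta under a Gamma(2, 1) prior is approximated by a normal centred at
# the posterior mode with variance equal to the inverse negative Hessian of
# the log posterior at that mode.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gamma, norm

data = np.array([3, 5, 4, 6, 2, 5])          # illustrative Poisson counts
a, b = 2.0, 1.0                              # Gamma prior shape and rate

def neg_log_post(theta):
    # unnormalised negative log posterior: Gamma prior x Poisson likelihood
    if theta <= 0:
        return np.inf
    return -((a - 1) * np.log(theta) - b * theta
             + data.sum() * np.log(theta) - data.size * theta)

opt = minimize_scalar(neg_log_post, bounds=(1e-6, 50), method="bounded")
mode = opt.x
# numerical second derivative of the negative log posterior at the mode
h = 1e-4
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode) + neg_log_post(mode - h)) / h**2
laplace = norm(loc=mode, scale=np.sqrt(1.0 / curv))

# the exact posterior is Gamma(a + sum(x), rate b + n), so the fit can be checked
exact = gamma(a + data.sum(), scale=1.0 / (b + data.size))
print(laplace.pdf(mode), exact.pdf(mode))
```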

2.
A modified normal-based approximation for calculating the percentiles of a linear combination of independent random variables is proposed. The approximation is applicable whenever the expectations and percentiles of the individual random variables can be readily obtained. Its merits are evaluated for the chi-square and beta distributions using Monte Carlo simulation. An approximation to the percentiles of the ratio of two independent random variables is also given. Solutions based on the approximations are given for some classical problems, such as interval estimation of the normal coefficient of variation, of a survival probability, and of the difference between or the ratio of two binomial proportions, as well as for some other problems. An approximation to the percentiles of a doubly noncentral F distribution is also provided. For all the problems considered, the approximation yields simple, satisfactory solutions. Two examples illustrate its application.
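As a point of reference only, the sketch below uses a plain moment-matching normal approximation (not the modified approximation of the abstract) to a percentile of a linear combination of independent chi-square variables, and compares it with Monte Carlo; the coefficients and degrees of freedom are arbitrary illustrative values.

```python
# A minimal sketch of a plain moment-matching normal approximation to a
# percentile of T = sum_i c_i X_i for independent chi-square X_i, checked
# against Monte Carlo.  This is a baseline, not the modified approximation
# proposed in the paper.
import numpy as np
from scipy.stats import norm, chi2

rng = np.random.default_rng(0)
c  = np.array([1.0, -0.5, 2.0])     # illustrative coefficients
df = np.array([3, 7, 2])            # degrees of freedom of the X_i

# match the first two moments of T and read the percentile off a normal
mean_T = np.sum(c * df)             # E[chi2_k] = k
var_T  = np.sum(c**2 * 2 * df)      # Var[chi2_k] = 2k
p = 0.95
approx_q = norm.ppf(p, loc=mean_T, scale=np.sqrt(var_T))

# Monte Carlo benchmark
T = np.sum(c * chi2.rvs(df, size=(1_000_000, 3), random_state=rng), axis=1)
mc_q = np.quantile(T, p)
print(f"normal approx: {approx_q:.3f}   Monte Carlo: {mc_q:.3f}")
```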

3.
An economical quality control procedure is given for the case where measurement error is present. The functional quality characteristic is observed indirectly and is assumed to follow an integrated moving average process of order one, independent of the white noise that represents the measurement error. The limiting long-run average cost rate is evaluated. The optimum values of the control parameters, namely the inspection interval and the control limit, are obtained by minimizing an approximation of the loss function or of the long-run cost rate.

4.
Progressively censored data from a classical Pareto distribution are used to make inferences about its shape and precision parameters and about the reliability function. An approximation due to Tierney and Kadane (1986) is used to obtain the Bayes estimates. Bayesian prediction of further observations from this distribution is also considered. For the Bayesian approach, conjugate priors are considered for both the one-parameter and the two-parameter cases. A numerical example and a simulation study illustrate the given procedures.
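To make the Tierney-Kadane idea concrete, the sketch below applies it to a conjugate exponential/Gamma setup where the exact posterior mean is available for comparison; this is only an illustration of the approximation itself, not of the Pareto model or the progressive censoring scheme treated in the paper, and the data and prior values are assumptions.

```python
# A minimal sketch of the Tierney-Kadane (1986) approximation to a posterior
# mean E[g(theta)|x] ~= (sigma*/sigma_hat) * exp{ l*(theta*) - l(theta_hat) },
# where l is the unnormalised log posterior and l* = l + log g, each maximised
# at its own mode with curvature-based sigma.  Illustrated for exponential
# data with a Gamma conjugate prior, so the exact answer is known.
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([0.8, 1.9, 0.4, 2.6, 1.1, 0.7])   # illustrative exponential data
a, b = 2.0, 1.0                                 # Gamma(a, b) prior, rate b

def log_post(theta):            # log prior + log likelihood, up to a constant
    return (a - 1) * np.log(theta) - b * theta + x.size * np.log(theta) - theta * x.sum()

def log_post_star(theta):       # add log g(theta), here g(theta) = theta
    return log_post(theta) + np.log(theta)

def laplace_pieces(f):
    opt = minimize_scalar(lambda t: -f(t), bounds=(1e-8, 100), method="bounded")
    m, h = opt.x, 1e-4
    curv = -(f(m + h) - 2 * f(m) + f(m - h)) / h**2   # minus the second derivative
    return m, 1.0 / np.sqrt(curv), f(m)               # mode, sigma, maximised value

m0, s0, l0 = laplace_pieces(log_post)
m1, s1, l1 = laplace_pieces(log_post_star)
tk_mean = (s1 / s0) * np.exp(l1 - l0)

exact_mean = (a + x.size) / (b + x.sum())              # Gamma posterior mean
print(f"Tierney-Kadane: {tk_mean:.4f}   exact: {exact_mean:.4f}")
```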

5.
The cost of certain types of warranties is closely related to functions that arise in renewal theory, so the problem of estimating the warranty cost for a random sample of size n can be reduced to estimating these functions. In an earlier paper, I gave several methods of estimating the expected number of renewals, called the renewal function; this answered an important accounting question of how to arrive at a good approximation of the expected warranty cost. In this article, estimation of the renewal function is reviewed and several extensions are given. In particular, a resampling estimator of the renewal function is introduced. I further argue that managers may wish to examine other summary measures of the warranty cost, in particular its variability. To estimate this variability, I introduce parametric and nonparametric estimators of the variance of the number of renewals. Several numerical examples are provided.
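The sketch below illustrates the general resampling idea: estimate the renewal function M(t) = E[N(t)] and Var N(t) over a warranty horizon by simulating renewal processes whose inter-renewal times are drawn with replacement from observed lifetimes. It is only an illustration, not the specific estimators developed in the article, and the "observed" lifetimes are simulated stand-ins.

```python
# A minimal sketch of a resampling-type estimate of the renewal function
# M(t) = E[N(t)] and of Var N(t) over a warranty horizon t: renewal paths are
# simulated with inter-renewal times resampled from the observed lifetimes.
import numpy as np

rng = np.random.default_rng(1)
lifetimes = rng.weibull(1.5, size=200) * 2.0   # stand-in for observed lifetimes
t_warranty = 3.0
n_paths = 20_000

counts = np.empty(n_paths, dtype=int)
for i in range(n_paths):
    total, n = 0.0, 0
    while True:
        total += rng.choice(lifetimes)          # resample an inter-renewal time
        if total > t_warranty:
            break
        n += 1
    counts[i] = n

print(f"renewal function M({t_warranty}) ~= {counts.mean():.3f}")
print(f"Var N({t_warranty})              ~= {counts.var(ddof=1):.3f}")
```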

6.
First-order probability matching priors are priors for which Bayesian and frequentist inference, in the form of posterior quantiles or confidence intervals, agree to a second order of approximation. The authors show that the matching priors developed by Peers (1965) and Tibshirani (1989) are readily and uniquely implemented in a third-order approximation to the posterior marginal density. They further show how strong orthogonality of the parameters simplifies the arguments. Several examples illustrate the results.
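As a reminder of the form these priors take (a standard result recalled here for context, not a formula quoted from this particular paper): when the interest parameter ψ is orthogonal to the nuisance parameter λ, Tibshirani's first-order matching priors can be written as

```latex
% First-order probability matching prior under parameter orthogonality
% (i_{\psi\lambda} = 0); g(\lambda) > 0 is an arbitrary smooth function.
\pi(\psi, \lambda) \;\propto\; i_{\psi\psi}(\psi, \lambda)^{1/2} \, g(\lambda)
```

where i_{ψψ} denotes the (ψ, ψ) entry of the Fisher information matrix.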

7.
Properties of a scaled Burr type X distribution are given. Closed-form expressions for the moments exist only in certain special cases, so upper and lower bounds for the first moment are given, along with an approximation based on these bounds. Maximum likelihood estimation is considered, and the asymptotic properties of the estimators are discussed for i.i.d. samples as well as for Type I and Type II censoring. Finally, an extension to a multivariate Burr type X distribution is introduced.

8.
This article considers the problem of testing the slopes of k straight lines with heterogeneous variances. The statistic Fβ is proposed, and the null and non-null distributions of Fβ are derived under the normality assumption. The power function values are then approximated by a Laguerre polynomial expansion for normal and non-normal universes. For the example given in Graybill (1976, p. 295), it is shown that the Satterthwaite approximation provides a close approximation to the null and non-null distributions in all cases; it is also shown that the Fβ test is quite robust to departures from normality in the case of mixtures of two normals.

9.
In this paper, a notion of generalized inner product spaces is introduced to study optimal estimating functions. The basic technique involves the idea of orthogonal projection introduced by Small and McLeish (1988, 1989, 1991, 1992, 1994). A characterization of orthogonal projections in generalized inner product spaces is given. It is shown that the orthogonal projection of the score function onto a linear subspace of estimating functions is optimal in that subspace, and a characterization of optimal estimating functions is given. As special cases of the main results, we derive the results of Godambe (1985) on the foundations of estimation in stochastic processes, the result of Godambe and Thompson (1989) on the extension of quasi-likelihood, and the generalized estimating equations for multivariate data due to Liang and Zeger (1986). Optimal estimating functions in the Bayesian framework are also derived.

10.
The construction of a balanced incomplete block design (BIBD) is formulated as a combinatorial optimization problem by defining a cost function that reaches its lower bound on all and only those configurations corresponding to a BIBD. This cost function is a linear combination of distribution measures for each of the properties of a block design (number of plots, uniformity of rows, uniformity of columns, and balance). The approach generalizes naturally to a super-class of BIBDs, which we call maximally balanced maximally uniform designs (MBMUDs), that allow two consecutive values for their design parameters [r, r+1; k, k+1; λ, λ+1]. In terms of combinatorial balance, MBMUDs are the closest possible approximation to BIBDs for all experimental settings in which no set of admissible parameters exists. Thus, other design classes previously proposed with the same approximation aim, such as RDGs, SRDGs and NBIBDs of type I, can be viewed as particular cases of MBMUDs. Interestingly, experimental results show that the proposed combinatorial cost function has a monotonic relation with A- and D-statistical optimality in the space of designs with uniform rows and columns, while its computational cost is much lower.
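The sketch below computes simple uniformity and balance components (variance of row sums, of column sums, and of pairwise concurrences) for a binary incidence matrix, and checks that all three vanish on the (7, 7, 3, 3, 1) Fano-plane BIBD. The exact cost function and weighting used in the paper are not reproduced; this is only an illustration of the kind of distribution measures involved.

```python
# A minimal sketch of balance/uniformity cost components for a binary block
# design (rows = treatments, columns = blocks).  All three components are
# zero for a BIBD, illustrated here with the (7, 7, 3, 3, 1) Fano-plane design.
import numpy as np
from itertools import combinations

blocks = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5)]
X = np.zeros((7, 7), dtype=int)                 # incidence matrix
for j, blk in enumerate(blocks):
    X[list(blk), j] = 1

def cost_components(X):
    row_var = X.sum(axis=1).var()               # replication uniformity (r)
    col_var = X.sum(axis=0).var()               # block-size uniformity (k)
    concurrences = [np.sum(X[i] * X[j]) for i, j in combinations(range(X.shape[0]), 2)]
    bal_var = np.var(concurrences)              # pairwise balance (lambda)
    return row_var, col_var, bal_var

print(cost_components(X))                       # (0.0, 0.0, 0.0) for a BIBD
```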

11.
This paper addresses the problem of maximum a posteriori (MAP) sequence estimation in general state-space models. We consider two algorithms based on the sequential Monte Carlo (SMC) methodology, also known as particle filtering. We prove that they produce approximations of the MAP estimator and that they converge almost surely. We also derive a lower bound on the number of particles needed to achieve a given approximation accuracy. In the last part of the paper, we investigate the application of particle filtering and MAP estimation to the global optimization of a class of (possibly non-convex and possibly non-differentiable) cost functions. In particular, we show how to convert the cost-minimization problem into one of MAP sequence estimation for a state-space model that is "matched" to the cost of interest. Examples and numerical results illustrate the methodology.
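To illustrate the general idea of particle-based MAP sequence estimation (not the specific algorithms analysed in the paper), the sketch below runs a bootstrap particle filter on a toy linear-Gaussian state-space model and then applies a Viterbi-style dynamic programme over the stored particle positions to recover an approximate MAP trajectory. The model, its parameters, and the particle count are illustrative assumptions.

```python
# A minimal sketch of MAP sequence estimation over a particle grid for a toy
# linear-Gaussian model:  x_t = phi x_{t-1} + N(0, q),  y_t = x_t + N(0, r).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
T, N = 50, 200                      # time steps, particles
phi, q, r = 0.9, 1.0, 0.5

# simulate states and observations
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), size=T)

# bootstrap particle filter, storing the particle clouds at each time
clouds = np.zeros((T, N))
parts = rng.normal(0, 1, N)
for t in range(T):
    if t > 0:
        parts = phi * parts + rng.normal(0, np.sqrt(q), N)
    w = norm.pdf(y[t], loc=parts, scale=np.sqrt(r))
    parts = rng.choice(parts, size=N, p=w / w.sum())        # resample
    clouds[t] = parts

# Viterbi-style dynamic programme over the particle positions
delta = norm.logpdf(y[0], clouds[0], np.sqrt(r)) + norm.logpdf(clouds[0], 0, 1)
back = np.zeros((T, N), dtype=int)
for t in range(1, T):
    trans = norm.logpdf(clouds[t][:, None], phi * clouds[t - 1][None, :], np.sqrt(q))
    scores = delta[None, :] + trans                           # (current, previous)
    back[t] = scores.argmax(axis=1)
    delta = scores.max(axis=1) + norm.logpdf(y[t], clouds[t], np.sqrt(r))

# backtrack the approximate MAP path
path = np.zeros(T)
j = int(delta.argmax())
for t in range(T - 1, -1, -1):
    path[t] = clouds[t, j]
    j = back[t, j]
print(f"RMSE of particle-MAP path vs. true states: {np.sqrt(np.mean((path - x)**2)):.3f}")
```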

12.
A double sampling scheme is used when cheap auxiliary variables can be measured to improve the estimation of a finite population parameter. Several estimators of the population mean, the ratio of means, and the variance are available when two dependent samples are drawn, but there are few proposals for the case of independent samples. In this paper, both the dependent and the independent case are dealt with. A general approach for estimating a finite population parameter is given, showing that all the proposed estimators are particular cases of the same general class. The minimum variance bound for any estimator in this class is provided (to the first order of approximation), and an optimal estimator that attains this minimum is found.
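For orientation, the sketch below simulates the classical two-phase (double-sampling) ratio estimator of a population mean in the dependent (nested) case; it is just one familiar member of the kind of general class the paper studies, and the population, sample sizes, and variable relationship are assumptions.

```python
# A minimal sketch of the two-phase ratio estimator: a large first-phase
# sample measures only the cheap auxiliary variable x, and a second-phase
# subsample measures the study variable y as well.
import numpy as np

rng = np.random.default_rng(3)
N = 10_000
x_pop = rng.gamma(4.0, 2.0, N)                    # auxiliary variable
y_pop = 3.0 * x_pop + rng.normal(0, 4.0, N)       # study variable

n1, n2 = 1_000, 100                               # first- and second-phase sizes
phase1 = rng.choice(N, n1, replace=False)
phase2 = rng.choice(phase1, n2, replace=False)    # dependent (nested) case

xbar1 = x_pop[phase1].mean()                      # cheap, from the large sample
xbar2 = x_pop[phase2].mean()
ybar2 = y_pop[phase2].mean()

ratio_est = ybar2 * xbar1 / xbar2                 # two-phase ratio estimator
print(f"ratio estimate: {ratio_est:.2f}   plain mean: {ybar2:.2f}   truth: {y_pop.mean():.2f}")
```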

13.
This article considers a sampling plan for the case where the quality characteristic follows the exponential distribution. We provide the exact approach and propose an approximate approach. In the proposed approximation, a new statistic combined with a Weibull transformation is used for the normal approximation. The plan parameters are obtained through the two-point approach at the acceptable quality level (AQL) and the limiting quality level (LQL). Tables of plan parameters are reported for various values of the AQL and the LQL when the producer's and the consumer's risks are given. A real example illustrates the proposed approximation approach.
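The sketch below only illustrates the idea behind a Weibull-type power transformation for exponential data: if X is exponential, X**(1/k) is Weibull with shape k, and a shape near 3.6 has skewness close to zero, which makes a normal approximation workable. The exponent 1/3.6 and the skewness/kurtosis check are illustrative assumptions; the article's actual statistic and transformation are not reproduced here.

```python
# A minimal check that raising exponential data to the power 1/3.6 (a Weibull
# shape of about 3.6 after the transform) greatly reduces skewness.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=200_000)
y = x ** (1.0 / 3.6)                       # Weibull(shape 3.6) after transform

print(f"skewness before: {skew(x):.2f}   after: {skew(y):.2f}")
print(f"excess kurtosis before: {kurtosis(x):.2f}   after: {kurtosis(y):.2f}")
```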

14.
Monotone bounds, depending on the underlying multinomial probabilities and the sample size, are given for the chi-squared approximation to the distribution of Pearson's goodness-of-fit statistic for simple hypotheses. These bounds apply to the distribution of a single statistic and to the joint distribution of two statistics associated with the margins of a two-way table, in both the central and non-central cases.
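The short Monte Carlo check below shows the chi-squared approximation being bounded in the paper at work for a simple multinomial hypothesis; the bounds themselves are not reproduced, and the cell probabilities and sample size are arbitrary illustrative choices.

```python
# A minimal Monte Carlo check of the chi-squared approximation to Pearson's
# goodness-of-fit statistic for a simple multinomial hypothesis.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(5)
p = np.array([0.2, 0.3, 0.1, 0.4])      # hypothesised cell probabilities
n, reps = 50, 100_000

counts = rng.multinomial(n, p, size=reps)
expected = n * p
pearson = ((counts - expected) ** 2 / expected).sum(axis=1)

crit = chi2.ppf(0.95, df=len(p) - 1)
print(f"simulated exceedance probability: {np.mean(pearson > crit):.4f} (nominal 0.05)")
```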

15.
This paper is concerned with the measurement of total factor productivity (TFP) growth and technical change in Swedish manufacturing industries during the period 1964-1989. Two alternative formulations of technical change, viz. the time trend model and the general index model, are used to measure TFP growth and technical change. These measures are embedded in a cost-minimization framework in which we estimate a translog cost function along with the associated cost share equations. The cost function accommodates industry-specific variability through an error component model. The models are estimated by the iterative seemingly unrelated regression method after some transformations of the cost function. The empirical results show that the general index model is superior to the time trend model when the pattern of technical change and total factor productivity growth is examined.

16.
For a sum of independently but not identically distributed discrete random variables, a higher-order large-deviation approximation is given. The approximations are compared with the normal and Edgeworth-type approximations in various cases. The large-deviation approximations give sufficiently accurate results.

17.
This paper is concerned with the measurement of total factor productivity (TFP) growth and technical change in Swedish manufacturing industries during the period 1964-1989. Two alternative formulations of technical change, viz. the time trend model and the general index model, are used to measure TFP growth and technical change. These measures are embedded in a cost-minimization framework in which we estimate a translog cost function along with the associated cost share equations. The cost function accommodates industry-specific variability through an error component model. The models are estimated by the iterative seemingly unrelated regression method after some transformations of the cost function. The empirical results show that the general index model is superior to the time trend model when the pattern of technical change and total factor productivity growth is examined.

18.
Screening programs for breast cancer are widely used to reduce the impact of breast cancer in populations. For example, the South Australian Breast X-ray Service, BreastScreen SA, established in 1989, is a participant in the National Program of Early Detection of Breast Cancer. BreastScreen SA has collected information on both screening-detected and interval or self-reported cases, which enables the estimation of various important attributes of the screening mechanism. In this paper, a tailored model is fitted to the BreastScreen SA data. The probabilities that the screening detects a tumour of a given size and that an individual reports a tumour by a specified size in the absence of screening are estimated. Estimates of the distribution of sizes detected in the absence of screening, and at the first two screenings, are also given.

19.
A dual acceptance criterion in terms of the sample mean and an extremum (minimum or maximum) has been used in many inspection procedures across diverse industries. An approximation to the joint distribution of the sample mean and an extremum when the population is normally distributed is given by Vangel (Technometrics, 2002, pp. 242-248). In this paper we obtain a simple expression that depends on the distributions of the sample mean and the truncated sample mean. This expression allows us to evaluate the joint distribution exactly in two cases, and approximately in more general cases, making the dual acceptance criterion easier to apply in practice. We also present a saddlepoint approximation for the joint tail probability, with application to the dual acceptance criterion under the assumption of normality.
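For intuition only, the sketch below evaluates the joint tail probability behind such a dual acceptance criterion, P(sample mean ≥ a and sample minimum ≥ b) for a normal sample, by brute-force Monte Carlo; it is a numerical benchmark for the exact and saddlepoint expressions discussed in the paper, not a substitute for them, and the lot parameters and limits are assumed values.

```python
# A minimal Monte Carlo evaluation of the joint acceptance probability
# P(mean >= a, min >= b) for an i.i.d. normal sample of size n.
import numpy as np

rng = np.random.default_rng(6)
n, mu, sigma = 10, 100.0, 5.0             # illustrative lot parameters
a, b = 101.0, 92.0                        # mean and minimum acceptance limits
reps = 500_000

samples = rng.normal(mu, sigma, size=(reps, n))
accept = (samples.mean(axis=1) >= a) & (samples.min(axis=1) >= b)
print(f"P(accept) ~= {accept.mean():.4f}")
```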

20.
The approximate normality of the cube root of a noncentral chi-square variable, observed by Abdel-Aty (1954), and an Edgeworth-series expansion are used to derive an approximation for the doubly noncentral F distribution. Another approximation, in terms of a noncentral F distribution, is also proposed. Both approximations compare favorably with earlier approximations due to Das Gupta (1968) and Tiku (1972). The problem of approximating the cumulants of the doubly noncentral F variable, which is pivotal in Tiku's approximation, is examined, and a noncentral F distribution is seen to provide a good solution for it. A FORTRAN routine for the Edgeworth-series approximation is given.
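The sketch below shows the cube-root normal approximation to the noncentral chi-square distribution that this kind of construction builds on, checked against SciPy's exact CDF. The Edgeworth correction and the doubly noncentral F construction of the paper are not reproduced, and the mean/variance constants below follow the standard Abdel-Aty-type formula as I recall it, so treat them as an assumption to verify against the original source.

```python
# A minimal sketch of the cube-root normal approximation to the noncentral
# chi-square CDF, compared with scipy.stats.ncx2.
import numpy as np
from scipy.stats import norm, ncx2

def ncx2_cdf_cuberoot(x, df, nc):
    # (X / (df + nc))**(1/3) is treated as normal with mean 1 - c and
    # variance c, where c = 2 (df + 2 nc) / (9 (df + nc)**2)
    c = 2.0 * (df + 2.0 * nc) / (9.0 * (df + nc) ** 2)
    z = ((x / (df + nc)) ** (1.0 / 3.0) - (1.0 - c)) / np.sqrt(c)
    return norm.cdf(z)

df, nc = 6, 4.0
for x in (5.0, 10.0, 15.0, 20.0):
    print(f"x={x:5.1f}  approx={ncx2_cdf_cuberoot(x, df, nc):.4f}  exact={ncx2.cdf(x, df, nc):.4f}")
```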
