Similar Literature
20 similar documents found.
1.
A maximum likelihood methodology for the parameters of models with an intractable likelihood is introduced. We produce a likelihood-free version of the stochastic approximation expectation-maximization (SAEM) algorithm to maximize the likelihood function of model parameters. While SAEM is best suited for models having a tractable “complete likelihood” function, its application to moderately complex models is a difficult or even impossible task. We show how to construct a likelihood-free version of SAEM by using the “synthetic likelihood” paradigm. Our method is completely plug-and-play, requires almost no tuning and can be applied to both static and dynamic models.
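The “synthetic likelihood” ingredient can be sketched in a few lines: simulate summary statistics at a candidate parameter, fit a Gaussian to them, and evaluate the observed summaries under that Gaussian. This is a hypothetical illustration; the `simulate` and `summarize` callables and all names are placeholders, not the authors' implementation.

```python
import numpy as np

def synthetic_loglik(theta, simulate, summarize, s_obs, n_sim=200, rng=None):
    """Gaussian synthetic log-likelihood at parameter theta.

    simulate(theta, rng) -> one simulated dataset;
    summarize(data)      -> 1-D vector of summary statistics.
    """
    rng = np.random.default_rng(rng)
    # Simulate n_sim replicate summary vectors at theta.
    S = np.array([summarize(simulate(theta, rng)) for _ in range(n_sim)])
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False) + 1e-8 * np.eye(S.shape[1])  # small ridge for stability
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(Sigma)
    # Log-density of s_obs under N(mu, Sigma), up to an additive constant.
    return -0.5 * (logdet + diff @ np.linalg.solve(Sigma, diff))
```

Maximizing this surface over theta, for example inside a stochastic-approximation loop, is the spirit of the likelihood-free SAEM construction described above.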

2.
How do we communicate nuanced regulatory information to different audiences, recognizing that the consumer audience is very different from the physician audience? In particular, how do we communicate the heterogeneity of treatment effects - the potential differences in treatment effects based on sex, race, and age? That is a fundamental question at the heart of this panel discussion. Each panelist addressed a specific “challenge question” during their 5-minute presentation, and the list of questions is provided. The presentations were followed by a question and answer session with members of the audience and the panelists.

3.
In this article, we study a new class of nonnegative distributions generated by symmetric distributions around zero. For the special case of the distribution generated using the normal distribution, properties such as the moment generating function, stochastic representation, reliability connections, and inference aspects using the methods of moments and maximum likelihood are studied. Moreover, a real data set is analyzed, illustrating the fact that good fits can result.

4.
The Erlang distribution has wide applications in reliability models, stochastic activity networks, and many other fields. In this paper, a recurrence relation for computing all moments of all order statistics arising from independent non-identically distributed Erlang variables is established.

5.
A new method has been proposed to introduce an extra parameter to a family of distributions for more flexibility. A special case, the one-parameter exponential distribution, has been considered in detail. Various properties of the proposed distribution, including explicit expressions for the moments, quantiles, mode, moment-generating function, mean residual lifetime, stochastic orders, order statistics, and entropies, are derived. The maximum likelihood estimators of the unknown parameters cannot be obtained in explicit form and have to be obtained by solving nonlinear equations. Further, we also consider an extension of the two-parameter exponential distribution, mainly for data analysis purposes. Two datasets have been analyzed to show how the proposed models work in practice.

6.
Several authors have discussed Kalman filtering procedures using a mixture of normals as a model for the distributions of the noise in the observation and/or the state space equations. Under this model, resulting posteriors involve a mixture of normal distributions, and a “collapsing method” must be found in order to keep the recursive procedure simple. We prove that the Kullback-Leibler distance between the mixture posterior and that of a single normal distribution is minimized when we choose the mean and variance of the single normal distribution to be the mean and variance of the mixture posterior. Hence, “collapsing by moments” is optimal in this sense. We then develop the resulting optimal algorithm for “Kalman filtering” for this situation, and illustrate its performance with an example.
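The KL-optimal “collapsing by moments” step is easy to state concretely: replace the posterior mixture with the single normal whose mean and variance match the mixture's. A minimal univariate sketch (function and variable names are illustrative):

```python
import numpy as np

def collapse_mixture(weights, means, variances):
    """Collapse a univariate normal mixture to the moment-matched normal."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    m = np.asarray(means, dtype=float)
    v = np.asarray(variances, dtype=float)
    mean = np.sum(w * m)
    # Law of total variance: within-component variance plus variance of means.
    var = np.sum(w * (v + m ** 2)) - mean ** 2
    return mean, var
```

For example, an equal-weight mixture of N(0, 1) and N(2, 1) collapses to N(1, 2).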

7.
Hubert (1987, Assignment Methods in Combinatorial Data Analysis) presented a class of permutation, or random assignment, techniques for assessing correspondence between general k-dimensional proximity measures on a set of “objects.” A major problem in higher-order assignment models is the prohibitive level of computation that is required. We present the first three exact moments of a test statistic for the symmetric cubic assignment model. Efficient computational formulas for the first three moments have been derived, thereby permitting approximation of the permutation distribution using well-known methods.

8.
Summary. The availability of intraday data on the prices of speculative assets means that we can use quadratic variation-like measures of activity in financial markets, called realized volatility, to study the stochastic properties of returns. Here, under the assumption of a rather general stochastic volatility model, we derive the moments and the asymptotic distribution of the realized volatility error—the difference between realized volatility and the discretized integrated volatility (which we call actual volatility). These properties can be used to allow us to estimate the parameters of stochastic volatility models without recourse to the use of simulation-intensive methods.
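As a minimal sketch of the estimator under discussion (the computation only, not the paper's asymptotic theory), realized variance over an interval is simply the sum of squared intraday log-returns:

```python
import numpy as np

def realized_variance(prices):
    """Sum of squared log-returns computed from an intraday price path."""
    r = np.diff(np.log(np.asarray(prices, dtype=float)))
    return float(np.sum(r ** 2))
```

On a fine sampling grid this estimator approaches the integrated (actual) variance; the abstract's results characterize the distribution of the difference between the two.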

9.
We describe and illustrate approaches to data augmentation in multi-way contingency tables for which partial information, in the form of subsets of marginal totals, is available. In such problems, interest lies in questions of inference about the parameters of models underlying the table together with imputation for the individual cell entries. We discuss questions of structure related to the implications for inference on cell counts arising from assumptions about log-linear model forms, and a class of simple and useful prior distributions on the parameters of log-linear models. We then discuss “local move” and “global move” Metropolis–Hastings simulation methods for exploring the posterior distributions for parameters and cell counts, focusing particularly on higher-dimensional problems. As a by-product, we note potential uses of the “global move” approach for inference about numbers of tables consistent with a prescribed subset of marginal counts. Illustration and comparison of MCMC approaches are given, and we conclude with a discussion of areas for further development and current open issues.

10.
C. R. Rao pointed out that “The role of statistical methodology is to extract the relevant information from a given sample to answer specific questions about the parent population” and raised the question “What population does a sample represent?” Wrong specification can lead to invalid inference, giving rise to a third kind of error. Rao introduced the concept of weighted distributions as a method of adjustment applicable to many situations.

In this paper, we study the relationship between the weighted distributions and the parent distributions in the context of reliability and life testing. These relationships depend on the nature of the weight function and give rise to interesting connections between the different ageing criteria of the two distributions. As special cases, the length biased distribution, the equilibrium distribution of the backward and forward recurrence times and the residual life distribution, which frequently arise in practice, are studied and their relationships with the original distribution are examined. Their survival functions, failure rates and mean residual life functions are compared and some characterization results are established.
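A quick numerical illustration of the weighted-distribution idea (a sketch, not taken from the paper): for the length-biased weight w(x) = x, one can sample from f_w(x) = x f(x)/E[X] by weighted resampling from the parent. For an Exponential(1) parent, the length-biased distribution is Gamma(2, 1), so the resampled mean should be near 2.

```python
import numpy as np

def length_biased_sample(sample_parent, n, rng=None):
    """Draw n points from the length-biased version of a positive parent
    distribution via weighted resampling: f_w(x) = x f(x) / E[X]."""
    rng = np.random.default_rng(rng)
    pool = sample_parent(10 * n, rng)   # oversampled draws from the parent
    p = pool / pool.sum()               # resampling weights proportional to x
    return rng.choice(pool, size=n, p=p, replace=True)
```

The `sample_parent` callable is a placeholder for any positive-valued sampler.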

11.
We develop a discrete-time affine stochastic volatility model with time-varying conditional skewness (SVS). Importantly, we disentangle the dynamics of conditional volatility and conditional skewness in a coherent way. Our approach allows current asset returns to be asymmetric conditional on current factors and past information, which we term contemporaneous asymmetry. Conditional skewness is an explicit combination of the conditional leverage effect and contemporaneous asymmetry. We derive analytical formulas for various return moments that are used for generalized method of moments (GMM) estimation. Applying our approach to S&P500 index daily returns and option data, we show that one- and two-factor SVS models provide a better fit for both the historical and the risk-neutral distribution of returns, compared to existing affine generalized autoregressive conditional heteroscedasticity (GARCH), and stochastic volatility with jumps (SVJ) models. Our results are not due to an overparameterization of the model: the one-factor SVS models have the same number of parameters as their one-factor GARCH competitors and fewer than the SVJ benchmark.

12.
In many practical situations, regression analysis with stochastic regressors is used. The estimates of such a model are often affected by a high degree of multicollinearity. To avoid this, a criterion and a procedure for selecting an optimal subset of regressors are derived, based on the partition of the moments of the conditional normal distribution of the regressand given the regressors. Further, two-stage procedures that improve the result of the subset regression, also based on the partition of the conditional moments, are given.

13.
关于大统计学理论问题的再思考 (Rethinking the Theoretical Issues of “General Statistics”)
杨灿 (Yang Can). 《统计研究》 (Statistical Research), 1997, 14(1): 24-28
ABSTRACT: “General Statistics”, as an innovatory conception, is reasonable because it accords with the inherent requirements...

14.
In this article, we give a new family of univariate distributions generated by the Logistic random variable. A special case of this family is the Logistic-Uniform distribution. We show that the Logistic-Uniform distribution provides great flexibility in modeling for symmetric, negatively and positively skewed, bathtub-shaped, “J”-shaped, and reverse “J”-shaped distributions. We discuss simulation issues, estimation by the methods of moments, maximum likelihood, and the new method of minimum spacing distance estimator. We also derive Shannon entropy and asymptotic distribution of the extreme order statistics of this distribution. The new distribution can be used effectively in the analysis of survival data since the hazard function of the distribution can be “J,” bathtub, and concave-convex shaped. The usefulness of the new distribution is illustrated through two real datasets by showing that it is more flexible in analyzing the data than the Beta Generalized-Exponential, Beta-Exponential, Beta-Normal, Beta-Laplace, Beta Generalized half-Normal, β-Birnbaum-Saunders, Gamma-Uniform, Beta Generalized Pareto, Beta Modified Weibull, Beta-Pareto, Generalized Modified Weibull, Beta-Weibull, and Modified-Weibull distributions.

15.
In this paper, we introduce a new distribution, called the alpha-skew generalized normal (ASGN), for GARCH models in modeling daily Value-at-Risk (VaR). Basic structural properties of the proposed distribution are derived, including the probability density and cumulative distribution functions, moments, and stochastic representation. A real-data application based on the ISE-100 index is given to show the performance of the GARCH model specified under the ASGN innovation distribution relative to normal, Student’s-t, skew normal, and generalized normal models in terms of VaR accuracy. The empirical results show that the GARCH model with the ASGN innovation distribution generates the most accurate VaR forecasts for all confidence levels.

16.
After a brief review of social applications of Markov chains, the paper discusses nonlinear (“interactive”) Markov models in discrete and continuous time. The rather subtle relationship between the deterministic and stochastic versions of such models is explored by means of examples. It is shown that the behaviour of nonlinear systems over time periods of practical interest depends critically on the total size as well as on the system parameters. Particular attention is paid to strong and weak forms of quasi-stationarity exhibited by stochastic systems.

17.
The Minimum Wage     
Do moderate increases in the minimum wage reduce employment? If not, do they nevertheless raise wages? To examine these questions, we apply techniques of time series analysis and systems estimation that are commonly used in macroeconomics and finance to five panels of data that contain between 11 and 34 low-wage industries. Our answers are “No” and “Yes,” respectively. We find that increases in the federal minimum wage between 1947 and 1997 have raised average wages in many of these industries, especially the lowest wage ones. The effect on employment, however, is mixed and typically nonsignificant, even when average wages have risen.

18.
The frailty model in survival analysis accounts for unobserved heterogeneity between individuals by assuming that the hazard rate of an individual is the product of an individual-specific quantity, called the “frailty,” and a baseline hazard rate. It is well known that the choice of the frailty distribution strongly affects the nonparametric estimate of the baseline hazard as well as that of the conditional probabilities. This paper reviews the basic concepts of a frailty model and presents various probability inequalities and other monotonicity results which may prove useful in choosing among alternative specifications. More specifically, our main result lies in seeing how well-known stochastic orderings between the distributions of two frailties translate into orderings between the corresponding survival functions. Some probabilistic aspects and implications of the models resulting from competing choices of the distributions of frailty or the baseline are compared.

19.
Conventional production function specifications are shown to impose restrictions on the probability distribution of output that cannot be tested with the conventional models. These restrictions have important implications for firm behavior under uncertainty. A flexible representation of a firm's stochastic technology is developed based on the moments of the probability distribution of output. These moments are a unique representation of the technology and are functions of inputs. Large-sample estimators are developed for a linear moment model that is sufficiently flexible to test the restrictions implied by conventional production function specifications. The flexible moment-based approach is applied to milk production data. The first three moments of output are statistically significant functions of inputs. The cross-moment restrictions implied by conventional models are rejected.
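The moment-based idea can be caricatured in a few lines: fit the conditional mean by least squares, then regress powers of the centered residuals on the same inputs to obtain the higher conditional moments as functions of inputs. This is a hypothetical sketch under simple linear specifications, not the paper's large-sample estimator:

```python
import numpy as np

def moment_regressions(X, y):
    """Sketch of a linear moment model: OLS for the conditional mean, then
    OLS of centered residual powers on the same inputs for higher moments."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta1, *_ = np.linalg.lstsq(Z, y, rcond=None)        # conditional mean
    e = y - Z @ beta1
    beta2, *_ = np.linalg.lstsq(Z, e ** 2, rcond=None)   # conditional variance
    beta3, *_ = np.linalg.lstsq(Z, e ** 3, rcond=None)   # third central moment
    return beta1, beta2, beta3
```

With simulated data whose variance rises in an input, the second-moment regression recovers that dependence even though a homoskedastic specification would miss it.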

20.
The composed error of a stochastic frontier (SF) model consists of two random variables, and the identification of the model relies heavily on the distribution assumptions for each of these variables. While the literature has put much effort into applying various SF models to a wide range of empirical problems, little has been done to test the distribution assumptions of these two variables. In this article, by exploiting the specification structures of the SF model, we propose a centered-residuals-based method of moments which can be easily and flexibly applied to testing the distribution assumptions on both of the random variables and to estimating the model parameters. A Monte Carlo simulation is conducted to assess the performance of the proposed method. We also provide two empirical examples to demonstrate the use of the proposed estimator and test using real data.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号