11.
《Econometrica : journal of the Econometric Society》2017,85(3):991-1012
We propose a novel methodology for evaluating the accuracy of numerical solutions to dynamic economic models. It consists in constructing a lower bound on the size of approximation errors. A small lower bound on errors is a necessary condition for accuracy: If a lower error bound is unacceptably large, then the actual approximation errors are even larger, and hence, the approximation is inaccurate. Our lower‐bound error analysis is complementary to the conventional upper‐error (worst‐case) bound analysis, which provides a sufficient condition for accuracy. As an illustration of our methodology, we assess approximation in the first‐ and second‐order perturbation solutions for two stylized models: a neoclassical growth model and a new Keynesian model. The errors are small for the former model but unacceptably large for the latter model under some empirically relevant parameterizations.
12.
《Risk analysis》2018,38(8):1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed‐form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling‐based methods: the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed‐form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and the Wilks method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
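The two sampling-based comparison methods can be sketched as follows. This is a minimal illustration, not the paper's model: the three-event fault tree, the medians, and the error factors are invented, and the top event probability uses a rare-event approximation for the OR gate.

```python
import random, math

random.seed(1)

# Hypothetical fault tree: TOP = A OR (B AND C).
# Basic-event probabilities are lognormal with median m and error
# factor EF (95th percentile / median), so sigma = ln(EF) / 1.645.
events = {"A": (1e-4, 3.0), "B": (1e-3, 5.0), "C": (2e-3, 3.0)}

def sample_top():
    p = {}
    for name, (median, ef) in events.items():
        sigma = math.log(ef) / 1.645
        p[name] = median * math.exp(sigma * random.gauss(0.0, 1.0))
    # Rare-event approximation: P(A or B*C) ~ P(A) + P(B)P(C).
    return p["A"] + p["B"] * p["C"]

# Full Monte Carlo estimate of the 95th percentile of the top event.
n = 20_000
draws = sorted(sample_top() for _ in range(n))
p95_mc = draws[int(0.95 * n)]

# Wilks one-sided 95/95 bound: with n >= 59 samples, the sample maximum
# exceeds the true 95th percentile with at least 95% confidence,
# because 1 - 0.95**59 >= 0.95.
wilks_bound = max(sample_top() for _ in range(59))

print(f"MC 95th percentile : {p95_mc:.3e}")
print(f"Wilks 95/95 bound  : {wilks_bound:.3e}")
```

The Wilks bound needs only 59 model evaluations versus 20,000 for the full Monte Carlo percentile, which is the computational trade-off the abstract describes.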
13.
14.
《Journal of Statistical Computation and Simulation》2012,82(2-4):223-238
We propose a Monte Carlo sampling algorithm for estimating guaranteed-coverage tolerance factors for non-normal continuous distributions with known shape but unknown location and scale. The algorithm is based on reformulating this root-finding problem as a quantile-estimation problem. The reformulation leads to a geometrical interpretation of the tolerance-interval factor. For arbitrary distribution shapes, we analytically and empirically investigate various relationships among tolerance-interval coverage, confidence, and sample size.
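The quantile-estimation reformulation can be sketched for one concrete shape. The standard logistic distribution, the sample size, and the coverage/confidence levels below are my choices for illustration; the paper treats arbitrary known shapes.

```python
import random, math

random.seed(0)

def logistic_sample():
    # Standard logistic via inverse CDF (the "known shape").
    u = random.random()
    return math.log(u / (1.0 - u))

def tolerance_factor(n, coverage, confidence, reps=20_000):
    """Monte Carlo tolerance factor k such that xbar + k*s covers at
    least `coverage` of the population with probability `confidence`,
    for location-scale samples of size n from the standard logistic."""
    q = math.log(coverage / (1.0 - coverage))  # true coverage quantile
    ks = []
    for _ in range(reps):
        x = [logistic_sample() for _ in range(n)]
        m = sum(x) / n
        s = math.sqrt(sum((v - m) ** 2 for v in x) / (n - 1))
        # Factor needed for THIS sample to reach the true quantile.
        ks.append((q - m) / s)
    ks.sort()
    # Guaranteed coverage <=> the `confidence` quantile of the needed
    # factors: this is the quantile-estimation reformulation.
    return ks[int(confidence * reps)]

k = tolerance_factor(n=20, coverage=0.95, confidence=0.95)
print(f"95/95 upper tolerance factor (n=20, logistic): {k:.3f}")
```

Because location and scale are unknown but the shape is known, the needed factor (q − m)/s is a pivot, so simulating it once under the standardized distribution suffices for all location-scale members of the family.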
15.
In this paper we provide a comprehensive Bayesian posterior analysis of trend determination in general autoregressive models. Multiple lag autoregressive models with fitted drifts and time trends as well as models that allow for certain types of structural change in the deterministic components are considered. We utilize a modified information matrix-based prior that accommodates stochastic nonstationarity, takes into account the interactions between long-run and short-run dynamics and controls the degree of stochastic nonstationarity permitted. We derive analytic posterior densities for all of the trend determining parameters via the Laplace approximation to multivariate integrals. We also address the sampling properties of our posteriors under alternative data generating processes by simulation methods. We apply our Bayesian techniques to the Nelson-Plosser macroeconomic data and various stock price and dividend data. Contrary to DeJong and Whiteman (1989a,b,c), we do not find that the data overwhelmingly favor the existence of deterministic trends over stochastic trends. In addition, we find evidence supporting Perron's (1989) view that some of the Nelson and Plosser data are best construed as trend stationary with a change in the trend function occurring at 1929.
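The Laplace approximation the authors use for posterior integrals can be illustrated on a toy integral with a known closed form. The Gamma-style integrand below is my example, not the paper's model: the idea is to expand the log-integrand to second order around its maximizer.

```python
import math

# Laplace approximation: integral of exp(g(x)) dx is approximately
#   exp(g(x0)) * sqrt(2*pi / -g''(x0)),  x0 = argmax g.
# Toy example: I = integral_0^inf x^a e^{-x} dx = Gamma(a+1),
# with g(x) = a*ln(x) - x.

a = 10.0
x0 = a                       # g'(x) = a/x - 1 = 0  =>  x0 = a
g = lambda x: a * math.log(x) - x
g2 = -a / x0 ** 2            # g''(x0) = -a/x0^2

laplace = math.exp(g(x0)) * math.sqrt(2.0 * math.pi / -g2)
exact = math.gamma(a + 1.0)

print(f"Laplace: {laplace:.4e}  exact: {exact:.4e}  "
      f"ratio: {laplace / exact:.4f}")
```

For a = 10 the approximation is already within about 1% of the exact value (it reproduces Stirling's formula), which is why the method yields usable analytic posterior densities.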
16.
Results of the Monte Carlo study of the performance of maximum likelihood estimation in a Weibull parametric regression model with two explanatory variables are presented. One simulation run contained 1000 samples, censored on average by 0-30%. Each simulated sample was generated in the form of a two-factor, two-level balanced experiment. The confidence intervals were computed using the large-sample normal approximation via the matrix of observed information. For small sample sizes the estimates of the scale parameter b of the log-lifetime were significantly negatively biased, which resulted in poor-quality confidence intervals for b and the low-level quantiles. All estimators improved in quality as the nominal value of b decreased. A moderate amount of censoring improved the quality of point and confidence estimation. A suitable reparametrization of b produced rather accurate confidence intervals. Exact confidence intervals for b in the uncensored case were obtained using the pivotal quantity b̂/b.
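The pivotal-quantity construction mentioned at the end can be sketched by simulation: the log-lifetime of a Weibull variable follows a smallest-extreme-value (SEV) distribution, and for complete samples b̂/b is parameter-free, so its distribution can be tabulated once. The sample size, replication count, and illustrative b̂ value below are my choices.

```python
import random, math

random.seed(2)

def sev_sample(n):
    # Smallest-extreme-value sample, location 0, scale 1
    # (the log of a standard Weibull lifetime): y = ln(-ln(1-U)).
    return [math.log(-math.log(1.0 - random.random())) for _ in range(n)]

def mle_scale(y):
    # SEV scale MLE solves  b = sum(y_i*w_i)/sum(w_i) - mean(y),
    # w_i = exp(y_i/b); the right-hand side decreases in b, so use
    # bisection.  Weights are shifted by max(y) for numerical
    # stability (the ratio is unchanged).
    ybar, ymax = sum(y) / len(y), max(y)
    def rhs(b):
        w = [math.exp((v - ymax) / b) for v in y]
        return sum(v * wi for v, wi in zip(y, w)) / sum(w) - ybar
    lo, hi = 1e-3, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if rhs(mid) > mid else (lo, mid)
    return 0.5 * (lo + hi)

# The pivot b_hat/b is parameter-free: simulate it once under b = 1
# and invert its quantiles to get an exact CI for b.
n, reps = 10, 2_000
pivot = sorted(mle_scale(sev_sample(n)) for _ in range(reps))
q_lo, q_hi = pivot[int(0.025 * reps)], pivot[int(0.975 * reps)]

b_hat = 0.9                  # hypothetical estimate from data
ci = (b_hat / q_hi, b_hat / q_lo)
print(f"95% CI for b given b_hat={b_hat}: ({ci[0]:.3f}, {ci[1]:.3f})")
```

The downward bias of b̂ reported in the abstract shows up here as a pivot distribution whose median falls below 1 at small n; the exact interval corrects for it automatically.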
17.
江水法 《南昌航空大学学报》2004,6(2):9-14
Drawing on results from psychology, philosophy, economics, and education, and taking Chinese undergraduate education as its example, this paper explores the fuzzy measurement of labor complexity. It argues that the complexity of professional labor is determined by the difficulty of the corresponding professional study; that this difficulty is in turn determined by the content of cross-disciplinary and experiential knowledge involved; and that these contents are reflected in the core curricula prescribed by expert systems. On this basis, a fuzzy spectrum of disciplinary learning difficulty and a corresponding fuzzy spectrum of professional labor complexity are established.
18.
This paper provides the percentiles, obtained by simulation, of an informational test statistic. It gives evidence that the widely used chi-square approximation to this test statistic is not suitable.
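The abstract does not reproduce the informational statistic itself, but the simulate-the-percentiles procedure it relies on can be sketched with a stand-in: below, the Pearson chi-square statistic at a small sample size (my choice) is simulated under the null and its 95th percentile compared with the asymptotic chi-square critical value.

```python
import random, math

random.seed(3)

# Stand-in statistic: Pearson chi-square for a 4-cell multinomial
# with small n, where the asymptotic approximation is doubtful.
probs = [0.1, 0.2, 0.3, 0.4]
n = 10

def pearson_stat():
    counts = [0, 0, 0, 0]
    for _ in range(n):
        u, c = random.random(), 0.0
        for i, p in enumerate(probs):
            c += p
            if u < c:
                counts[i] += 1
                break
    return sum((o - n * p) ** 2 / (n * p) for o, p in zip(counts, probs))

# Simulate the null distribution and read off its percentiles.
reps = 50_000
stats = sorted(pearson_stat() for _ in range(reps))
p95_sim = stats[int(0.95 * reps)]

chi2_95_df3 = 7.815          # asymptotic 95% critical value, df = 3
print(f"simulated 95th percentile: {p95_sim:.3f}  "
      f"chi-square approximation: {chi2_95_df3}")
```

A noticeable gap between the simulated percentile and the asymptotic critical value is exactly the kind of evidence the paper reports against the chi-square approximation.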
19.
20.
《Journal of Statistical Computation and Simulation》2012,82(1):41-56
In this paper we present an upper bound for the distribution function of quadratic forms in a normal vector with mean zero and positive definite covariance matrix. We also show that the new upper bound is more precise than the bounds introduced by Okamoto [4] and by Siddiqui [5]. Theoretical error bounds for both the new and the Okamoto upper bounds are derived. For a larger number of terms in a given positive definite quadratic form, a rougher but easier-to-compute upper bound is suggested.
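The flavor of such tail bounds can be illustrated by comparing a Monte Carlo estimate with a generic Chernoff bound built from the chi-square moment generating function E[exp(s Z^2)] = (1 - 2s)^(-1/2). This is not the bound derived in the paper (nor Okamoto's or Siddiqui's), and the eigenvalues and threshold below are invented.

```python
import random, math

random.seed(4)

# In principal-axis coordinates a positive definite quadratic form is
# Q = sum(lam_i * Z_i^2) with Z_i iid N(0,1).
lam = [3.0, 2.0, 1.0, 0.5]
t = 20.0

def chernoff_bound(t):
    # P(Q > t) <= min over 0 < s < 1/(2*max(lam)) of
    #   exp(-s*t) * prod_i (1 - 2*s*lam_i)^(-1/2).
    best, smax = 1.0, 1.0 / (2.0 * max(lam))
    for k in range(1, 200):
        s = smax * k / 200.0
        val = math.exp(-s * t)
        for l in lam:
            val /= math.sqrt(1.0 - 2.0 * s * l)
        best = min(best, val)
    return best

# Monte Carlo estimate of the same tail probability.
reps = 200_000
hits = sum(sum(l * random.gauss(0.0, 1.0) ** 2 for l in lam) > t
           for _ in range(reps))
mc = hits / reps
print(f"P(Q > {t}):  MC ~ {mc:.4f},  Chernoff bound {chernoff_bound(t):.4f}")
```

The Chernoff bound is valid but loose; sharper closed-form bounds of the kind the paper develops aim to close this gap while remaining cheap to evaluate.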