20 similar documents found; search time: 46 ms
1.
I. Introduction. The traditional bond pricing method, discounting of future cash flows, was derived from present-value theory by the American John Henry Williams and long served investors as a gauge of a bond's investment value. However, because the model provides no definite standard for choosing the discount rate, the choice is largely arbitrary, and the computed bond prices are correspondingly arbitrary, so its shortcomings have gradually become apparent. With the continuing development of the theory of the term structure of interest rates, bond pricing methods have advanced considerably. In particular, over the past decade or so, no-arbitrage stochastic-process analyses of the term structure have emerged, which hold that the term structure and bond prices are related to certain random factors (i.e., state variables)…
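For concreteness, here is a minimal sketch of the discounted-cash-flow pricing the abstract describes; the coupon schedule and the flat discount rate are hypothetical, and the stochastic term-structure models mentioned later are not implemented.

```python
# Minimal sketch of discounted-cash-flow bond pricing (hypothetical figures).
# Price = sum of each future cash flow discounted at a single assumed rate y.

def bond_price(face: float, coupon_rate: float, years: int, y: float) -> float:
    """Price a bond paying annual coupons, discounted at a flat yield y."""
    coupon = face * coupon_rate
    price = sum(coupon / (1 + y) ** t for t in range(1, years + 1))
    price += face / (1 + y) ** years  # redemption of face value at maturity
    return price

# Example: 5-year, 6% annual-coupon bond, face value 100, discounted at 5%.
print(round(bond_price(100, 0.06, 5, 0.05), 2))  # about 104.33
```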
2.
3.
On Cluster Sampling Techniques with Unequal Probabilities, by 彭念一. ABSTRACT: The paper discussed the important significance of sampling technology with unequal probability, expounded its basic pr…
4.
I. The relation between index numbers and random variables. The degree of change in the price or quantity of a single commodity is measured by computing the price index p_1/p_0 or the quantity index q_1/q_0. When studying changes in a complex aggregate made up of many commodities, however, one usually computes a weighted average-of-relatives index or a weighted aggregative index (the composite index for short). The central question to be settled is whether the weights should be base-period or current-period figures, a question the statistical community has long debated without resolution. Here we introduce a discrete random variable and set up a one-to-one correspondence between the individual indexes and the random variable, and between the weights and its probabilities.
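The correspondence the abstract sets up can be written out explicitly. As one standard illustration (the base-period-weighted Laspeyres form is assumed here, since the abstract does not fix the weighting), the composite price index is the expectation of a discrete random variable whose values are the individual price relatives and whose probabilities are the base-period value shares:

\[
I_p \;=\; \frac{\sum_i p_{1i} q_{0i}}{\sum_i p_{0i} q_{0i}}
\;=\; \sum_i \frac{p_{1i}}{p_{0i}}\, w_i ,
\qquad
w_i \;=\; \frac{p_{0i} q_{0i}}{\sum_j p_{0j} q_{0j}},
\qquad \sum_i w_i = 1 ,
\]

so that \(I_p = E[K]\), where \(K\) takes the value \(p_{1i}/p_{0i}\) with probability \(w_i\).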
5.
6.
Industrial structure transformation capability refers to the capacity of a country or region, in pursuit of economic development, to promote the evolution of its industrial structure at the right time and in an appropriate way. The strength of this capability determines how fast the industrial structure of a country or region evolves: the stronger the transformation capability, the faster the industrial structure evolves and the more rapidly it is upgraded; conversely, the industrial str…
7.
8.
China's Statistical System in Transition, by 吴涧生. Editor's note: this article was compiled and translated from the report submitted after World Bank economists and statistics experts conducted a comprehensive review of China's statistical system in 1991. Its main distinction is that it takes the statistical system of a market economy as its point of reference and the establishment and development of China's new system of national accounts as its main thread, systematically and concisely evaluating China's…
9.
Transforming Dummy-Variable Coefficients in Regression Analysis, by 曹志祥. In regression analysis for industrial statistics one sometimes encounters an explanatory variable that is an attribute variable, that is, the phenomenon it describes is qualitative or merely classifiable. Such a variable should not be entered into the regression directly, because the equal spacing of the discrete values assigned to an attribute variable masks the differences between its categories; using the attribute variable directly in the regr…
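As a minimal sketch of the dummy-variable device the abstract motivates (the coefficient-transformation step itself is cut off in the abstract and is not reproduced; the data and category labels below are hypothetical), a three-level attribute coded 1/2/3 is replaced by indicator columns before fitting:

```python
# Hypothetical example: regress y on a 3-level attribute variable using
# dummy (indicator) variables instead of the raw codes 1/2/3.
import numpy as np

codes = np.array([1, 2, 3, 1, 2, 3, 1, 3])            # attribute variable (3 categories)
y     = np.array([5.1, 6.8, 9.2, 4.9, 7.1, 9.0, 5.3, 8.8])

# One-hot encode, dropping the first category as the reference level.
D = np.column_stack([(codes == k).astype(float) for k in (2, 3)])
X = np.column_stack([np.ones(len(y)), D])              # intercept + dummies

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # [category-1 mean, category-2 effect vs 1, category-3 effect vs 1]
```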
10.
In sampling we often encounter observation units of unequal size, for example crop-yield estimation with townships as the units, or student fitness surveys with schools as the units. When the study variable varies roughly in proportion to unit size, giving every unit the same selection probability (equal-probability sampling) while still using the simple estimator means that drawing large units produces an overestimate of the population total and drawing small units an underestimate, so the estimate is dominated by a large sampling error. Estimating the national number of births from a sample of counties is a case in point. To reduce this error, unequal-probability sampling designs were developed, in which units are selected with probabilities proportional to their sizes and the population estimate is constructed accordingly. A major drawback of unequal-probability sampling, however, is that it is cumbersome to carry out; sampling without replacement with unequal probabilities is especially difficult once the sample size exceeds 2. In practice one would therefore like, while maintaining the required precision, to handle populations of unequal-sized units with the simpler equal-probability sampling, that is, to convert the unequal-probability problem into an equal-probability one. Based on a preliminary study, this paper proposes three such methods: stratified sampling estimation, transformation of the observation variable, and regression estimation.
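The contrast drawn above can be made concrete with a small simulation. In the sketch below the population is hypothetical, and a textbook ratio estimator stands in for the general idea of exploiting known unit sizes; it is not claimed to be any of the three methods the paper proposes.

```python
# Sketch: with units of very different sizes, an equal-probability sample with the
# simple expansion estimator N*ybar is noisy; a ratio estimator that uses known
# unit sizes is far more stable.  Population and variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N = 200
size = rng.lognormal(mean=3.0, sigma=1.0, size=N)      # unit "sizes"
y = 0.3 * size * rng.normal(1.0, 0.1, size=N)          # study variable roughly proportional to size
total = y.sum()

n = 20
expansion, ratio = [], []
for _ in range(2000):
    idx = rng.choice(N, size=n, replace=False)         # equal-probability SRS
    expansion.append(N * y[idx].mean())                # simple expansion estimator
    ratio.append(y[idx].sum() / size[idx].sum() * size.sum())  # ratio estimator using sizes

print("true total:", round(total, 1))
print("expansion estimator: mean %.1f, sd %.1f" % (np.mean(expansion), np.std(expansion)))
print("ratio estimator:     mean %.1f, sd %.1f" % (np.mean(ratio), np.std(ratio)))
```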
11.
For the assessment of agreement using probability criteria, we obtain an exact test, and for sample sizes exceeding 30, we give a bootstrap-t test that is remarkably accurate. We show that for assessing agreement, the total deviation index approach of Lin [2000. Total deviation index for measuring individual agreement with applications in laboratory performance and bioequivalence. Statist. Med. 19, 255–270] is not consistent and may not preserve its asymptotic nominal level, and that the coverage probability approach of Lin et al. [2002. Statistical methods in assessing agreement: models, issues and tools. J. Amer. Statist. Assoc. 97, 257–270] is overly conservative for moderate sample sizes. We also show that the nearly unbiased test of Wang and Hwang [2001. A nearly unbiased test for individual bioequivalence problems using probability criteria. J. Statist. Plann. Inference 99, 41–58] may be liberal for large sample sizes, and suggest a minor modification that gives numerically equivalent approximation to the exact test for sample sizes 30 or less. We present a simple and accurate sample size formula for planning studies on assessing agreement, and illustrate our methodology with a real data set from the literature.
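As a generic illustration only, the sketch below applies the standard bootstrap-t recipe to an agreement-style probability criterion P(|Y1 - Y2| < delta) for hypothetical paired data; it is not the exact test or the specific bootstrap-t test of the paper.

```python
# Toy bootstrap-t interval for an agreement "probability criterion"
# p = P(|Y1 - Y2| < delta); data, delta and names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n, delta = 60, 1.0
y1 = rng.normal(10, 1, n)
y2 = y1 + rng.normal(0.2, 0.8, n)          # paired measurements by two methods
d = np.abs(y1 - y2)

def p_and_se(x):
    p = np.mean(x < delta)
    return p, np.sqrt(max(p * (1 - p), 1e-12) / len(x))

p_hat, se_hat = p_and_se(d)
t_star = []
for _ in range(4000):
    xb = rng.choice(d, size=n, replace=True)
    pb, seb = p_and_se(xb)
    t_star.append((pb - p_hat) / seb)
lo, hi = np.percentile(t_star, [2.5, 97.5])
print("95% bootstrap-t CI:", (p_hat - hi * se_hat, p_hat - lo * se_hat))
```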
12.
《Journal of statistical planning and inference》2005,135(2):477-486
The probability of selecting the correct model is calculated for likelihood-ratio-based criteria that compare two nested models. If the more extended of the two models is true, the difference between twice the maximised log-likelihoods is approximately noncentral chi-square distributed, with degrees of freedom equal to the difference in the number of parameters. The noncentrality parameter of this noncentral chi-square distribution can be approximated by twice the minimum Kullback–Leibler divergence (MKLD) of the best-fitting simple model to the true version of the extended model. The MKLD, and therefore the probability of selecting the correct model, increases approximately proportionally to the number of observations if all observations are performed under the same conditions. If a new set of observations can only be performed under different conditions, the model parameters may depend on the conditions and therefore have to be estimated for each set of observations separately. An increase in observations will then go together with an increase in the number of model parameters. In this case, the power of the likelihood-ratio test will increase with an increasing number of observations. However, the probability of choosing the correct model with the AIC will only increase if the MKLD for each set of observations exceeds 0.5; if the MKLD is less than 0.5, that probability will decrease. The probability of choosing the correct model with the BIC will always decrease, sometimes after an initial increase for a small number of observation sets. The results are illustrated by a simulation study with a set of five nested nonlinear models for binary data.
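Written out (the notation is added here; the abstract itself gives no formula), the approximation described above is

\[
2\bigl[\ell_{\max}^{\text{ext}} - \ell_{\max}^{\text{simple}}\bigr]
\;\stackrel{\text{approx.}}{\sim}\; \chi^{2}_{d}(\lambda),
\qquad
\lambda \;\approx\; 2 \min_{\theta \in \Theta_{\text{simple}}} \mathrm{KL}\bigl(f_{\text{true}} \,\|\, f_{\theta}\bigr),
\]

where d is the difference in the number of free parameters and the Kullback–Leibler divergence is accumulated over the full set of observations.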
13.
Robert G. Staudte 《Statistics》2017,51(4):782-800
For every discrete or continuous location-scale family having a square-integrable density, there is a unique continuous probability distribution on the unit interval that is determined by the density-quantile composition introduced by Parzen in 1979. These probability density quantiles (pdQs) only differ in shape, and can be usefully compared with the Hellinger distance or Kullback–Leibler divergences. Convergent empirical estimates of these pdQs are provided, which leads to a robust global fitting procedure of shape families to data. Asymmetry can be measured in terms of distance or divergence of pdQs from the symmetric class. Further, a precise classification of shapes by tail behaviour can be defined simply in terms of pdQ boundary derivatives.
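A small numerical sketch of the idea, under the assumption that the pdQ is the density-quantile composition f(Q(u)) normalised to integrate to one on (0,1) (the abstract does not display the formula), comparing two shapes by the Hellinger distance:

```python
# Assumed form of the pdQ: for a density f with quantile function Q, take
# f(Q(u)) on (0,1), normalise it to integrate to 1, then compare two such
# curves with the Hellinger distance.  Not claimed to be the paper's code.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

u = np.linspace(1e-4, 1 - 1e-4, 4001)

def pdq(dist):
    g = dist.pdf(dist.ppf(u))          # density-quantile composition f(Q(u))
    return g / trapezoid(g, u)         # normalise so the pdQ integrates to 1

p, q = pdq(stats.norm), pdq(stats.logistic)
hellinger = np.sqrt(0.5 * trapezoid((np.sqrt(p) - np.sqrt(q)) ** 2, u))
print(round(hellinger, 4))             # shape comparison, free of location and scale
```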
14.
15.
16.
Luigi Greco 《Statistical Methods and Applications》1992,1(2):289-294
Summary. The probability integral (p.i.) values of the correlation coefficient in samples from a normal bivariate population are usually computed by approximate methods, except for the first few values of n. In this note we shall obtain the explicit expression for any sample size through a relation which also enables us to calculate easily and quickly the p.i. exact values as well as those of the density function (d.f.). From this p.i. expression it is also possible to obtain, among others, that of Student's t.
17.
Hui Quan Zhixing Xu Junxiang Luo Gautier Paux Meehyung Cho Xun Chen 《Pharmaceutical statistics》2023,22(4):633-649
To design a phase III study with a final endpoint and calculate the required sample size for the desired probability of success, we need a good estimate of the treatment effect on the endpoint. It is prudent to fully utilize all available information including the historical and phase II information of the treatment as well as external data of the other treatments. It is not uncommon that a phase II study may use a surrogate endpoint as the primary endpoint and has no or limited data for the final endpoint. On the other hand, external information from the other studies for the other treatments on the surrogate and final endpoints may be available to establish a relationship between the treatment effects on the two endpoints. Through this relationship, making full use of the surrogate information may enhance the estimate of the treatment effect on the final endpoint. In this research, we propose a bivariate Bayesian analysis approach to comprehensively deal with the problem. A dynamic borrowing approach is considered to regulate the amount of historical data and surrogate information borrowing based on the level of consistency. A much simpler frequentist method is also discussed. Simulations are conducted to compare the performances of different approaches. An example is used to illustrate the applications of the methods.
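As a toy illustration of the general idea of mapping a surrogate-endpoint effect to a final-endpoint effect through external-study information, here is a crude frequentist stand-in; it is not the paper's bivariate Bayesian or dynamic-borrowing model, and all numbers are hypothetical.

```python
# Regress final-endpoint effects on surrogate effects across hypothetical
# external studies, then predict the new treatment's final-endpoint effect
# from its phase II surrogate estimate.
import numpy as np

# Hypothetical external studies: (surrogate effect, final-endpoint effect)
surrogate = np.array([0.10, 0.18, 0.25, 0.31, 0.40])
final     = np.array([0.06, 0.12, 0.15, 0.20, 0.27])

# Least-squares line: final ~ a + b * surrogate
b, a = np.polyfit(surrogate, final, 1)

phase2_surrogate = 0.22                     # new treatment, phase II estimate
predicted_final  = a + b * phase2_surrogate
print(round(predicted_final, 3))            # feeds the phase III sample-size calculation
```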
18.
A concept of the lack-of-memory property at a given time point c > 0 is introduced. It is equivalent to the concept of the almost-lack-of-memory (ALM) property of the random variables. A representation theorem is given for the cumulative distribution function of such random variables as well as for corresponding decompositions in terms of independent random variables. It is shown that a periodic failure rate for a random variable is equivalent to the ALM property. In addition some properties of the service time of an unreliable server are observed.
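One common way to write the lack-of-memory property at a fixed point c > 0 (notation added here) is

\[
P(X > c + x \mid X > c) \;=\; P(X > x) \qquad \text{for every } x \ge 0 ,
\]

and requiring this at the single point c forces it at 2c, 3c, ..., which is the ALM property the abstract refers to; the exponential law satisfies it at every c > 0.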
19.
Peter Challenor 《Significance》2004,1(4):155-158
If you look at a map of the air temperature of the surface of the Earth, you will see that North West Europe, including the UK, is warmer than Alaska, which is at the same latitude but on the Pacific rather than the Atlantic Ocean. At school you were probably told that this was because of the Gulf Stream. However, there is a very similar current in the Pacific, the Kuroshio, which takes warm water north past Japan and then out into the Pacific. Peter Challenor asks: what is the unique feature of the Atlantic that keeps us warm, and could it change in the next few years?
20.
Market efficiency has been tested extensively. It concerns whether the capital market fully and accurately reflects all relevant information in the formation of security prices. If stock prices reflect all information obtainable from market trading data, such as past price history, trading volume and short interest, the market has reached weak-form efficiency. If stock prices reflect the historical information on prices and performance available to investors as well as all other public information bearing on a company's prospects, the market has reached semi-strong-form efficiency. If stock prices reflect all information relevant to the company, including even information known only to insid…