Similar Documents (20 results)
1.
Many stochastic processes considered in applied probability models, and in particular in reliability theory, are processes of the following form: shocks occur according to some point process, and each shock causes the process to have a random jump. Between shocks the process increases or decreases in some deterministic fashion. In this paper we study processes for which the rate of increase or decrease between shocks depends only on the height of the process. For such processes we find conditions under which the processes can be stochastically compared. We also study hybrid processes in which periods of increase and periods of decrease alternate. A further result yields a stochastic comparison of processes that start with a random jump, rather than processes in which there is at the beginning some random delay time before the first jump. Supported by NSF Grant DMS 9303891.
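The class of processes described here can be sketched with a small simulation. The Python sketch below (function names and parameter values are mine, not the paper's) couples two such processes to the same Poisson shocks and exponential jumps, with state-dependent deterministic decay dx/dt = -θx between shocks; the coupling makes the pathwise, and hence stochastic, comparison of the two processes visible.

```python
import random
import math

def simulate_pair(theta_fast, theta_slow, lam, mu, horizon, seed=1):
    """Simulate two shock processes driven by the SAME shocks.

    Shocks arrive at Poisson (rate lam) epochs and add an Exp(mean mu) jump;
    between shocks each process decays deterministically at a rate that
    depends only on its current height: dx/dt = -theta * x, whose exact
    solution over a gap of length w is x(t0 + w) = x(t0) * exp(-theta * w).
    """
    rng = random.Random(seed)
    t, x_fast, x_slow = 0.0, 0.0, 0.0
    heights = []
    while True:
        w = rng.expovariate(lam)            # waiting time to the next shock
        if t + w > horizon:
            break
        t += w
        # deterministic decay between shocks (exact ODE solution)
        x_fast *= math.exp(-theta_fast * w)
        x_slow *= math.exp(-theta_slow * w)
        jump = rng.expovariate(1.0 / mu)    # common random jump at the shock
        x_fast += jump
        x_slow += jump
        heights.append((x_fast, x_slow))
    return heights

heights = simulate_pair(theta_fast=2.0, theta_slow=0.5, lam=1.0, mu=1.0,
                        horizon=50.0)
# Coupling argument: with identical shocks, the slower-decaying process
# dominates pathwise, which implies the stochastic (distributional) ordering.
assert all(s >= f for f, s in heights)
```

The coupling trick is exactly how such stochastic-comparison results are usually proved: construct both processes on one probability space so that the ordering holds sample path by sample path.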

2.
Point processes are the stochastic models most suitable for describing physical phenomena that occur at irregularly spaced times, such as earthquakes. These processes are uniquely characterized by their conditional intensity, that is, by the probability that an event will occur in the infinitesimal interval (t, t+Δt), given the history of the process up to t. The seismic phenomenon displays different behaviours on different time and size scales; in particular, the occurrence of destructive shocks over some centuries in a seismogenic region may be explained by the elastic rebound theory. This theory has inspired the so-called stress release models: their conditional intensity translates the idea that an earthquake produces a sudden decrease in the amount of strain accumulated gradually over time along a fault, and the subsequent event occurs when the stress exceeds the strength of the medium. This study has a double objective: the formulation of these models in the Bayesian framework, and the assignment to each event of a mark, that is, its magnitude, modelled through a distribution that depends at time t on the stress level accumulated up to that instant. The resulting parameter space is constrained and dependent on the data, complicating Bayesian computation and analysis. We have resorted to Monte Carlo methods to solve these problems.
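A minimal simulation may make the conditional-intensity idea concrete. The sketch below assumes the exponential form λ(t) = exp(a + b(ρt − S(t))) common in the stress release literature, where S(t) is the cumulative stress released by past events; names and parameter values are mine. Because S is constant between events, the integrated hazard inverts in closed form.

```python
import math
import random

def simulate_stress_release(a, b, rho, mean_drop, horizon, seed=7):
    """Simulate a stress release point process by inverting its
    integrated conditional intensity lambda(t) = exp(a + b*(rho*t - S(t))).

    Between events S is constant, so the integrated hazard from t0 to t1 is
    exp(a - b*S) * (exp(b*rho*t1) - exp(b*rho*t0)) / (b*rho), which is
    solved for t1 in closed form given an Exp(1) draw.
    """
    rng = random.Random(seed)
    t, S = 0.0, 0.0
    times = []
    while True:
        e = rng.expovariate(1.0)   # Exp(1) increment of the integrated hazard
        val = math.exp(b * rho * t) + b * rho * e * math.exp(-a + b * S)
        t = math.log(val) / (b * rho)
        if t > horizon:
            break
        times.append(t)
        S += rng.expovariate(1.0 / mean_drop)  # random stress drop (the mark)
    return times

times = simulate_stress_release(a=0.5, b=1.0, rho=1.0, mean_drop=1.0,
                                horizon=20.0)
assert all(t1 < t2 for t1, t2 in zip(times, times[1:]))  # strictly increasing
```

Each event knocks the intensity down through S, after which the tectonic loading term ρt rebuilds it, mirroring the elastic rebound idea described above.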

3.
We propose a new procedure for combining multiple tests in samples of right-censored observations. The new method is based on multiple constrained censored empirical likelihood, where the constraints are formulated as linear functionals of the cumulative hazard functions. We prove a version of Wilks' theorem for the multiple constrained censored empirical likelihood ratio, which provides a simple reference distribution for the test statistic of our proposed method. A useful application of the proposed method is, for example, examining the survival experience of different populations by combining different weighted log-rank tests. Real data examples are given using the log-rank and Gehan-Wilcoxon tests. In a simulation study of two sample survival data, we compare the proposed method of combining tests to previously developed procedures. The results demonstrate that, in addition to its computational simplicity, the combined test performs comparably to, and in some situations more reliably than, previously developed procedures. Statistical software is available in the R package 'emplik'.
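As background for the weighted log-rank tests being combined, here is a from-scratch sketch of the plain two-sample log-rank statistic (this is not the paper's empirical-likelihood combination; the data and names are hypothetical):

```python
from collections import defaultdict

def logrank_statistic(data):
    """Two-sample log-rank test from (time, event, group) triples.

    event = 1 for an observed death, 0 for right censoring; group in {0, 1}.
    Returns the chi-square statistic (O - E)**2 / V for group 1.
    """
    deaths = defaultdict(lambda: [0, 0])   # deaths per (time, group)
    for time, event, group in data:
        if event:
            deaths[time][group] += 1

    o_minus_e, var = 0.0, 0.0
    for t in sorted(deaths):
        n = sum(1 for time, _, _ in data if time >= t)   # at risk, both groups
        n1 = sum(1 for time, _, g in data
                 if time >= t and g == 1)                # at risk, group 1
        d = deaths[t][0] + deaths[t][1]                  # total deaths at t
        o_minus_e += deaths[t][1] - d * n1 / n           # observed - expected
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var if var > 0 else 0.0

# Perfectly symmetric groups: observed deaths equal their expectation,
# so the statistic is (numerically) zero.
sym = [(1, 1, 0), (2, 1, 0), (3, 1, 0), (1, 1, 1), (2, 1, 1), (3, 1, 1)]
assert abs(logrank_statistic(sym)) < 1e-12
```

The paper's method replaces the usual chi-square reference for sums of such statistics with a single constrained empirical likelihood ratio, for which Wilks' theorem supplies the reference distribution.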

4.
Considering the Fisher information about a single parameter contained in a progressively Type-II censored sample, the problem of optimal progressive censoring plans arises. By introducing the notion of asymptotically optimal designs, we show that right censoring is 'almost' optimal for many important distributions including scale families of extreme value, normal, logistic, and Laplace distributions. Moreover, it turns out that for the extreme value distribution right censoring is the only asymptotically optimal one-step censoring scheme.

5.
Based on the characteristics of cross-border e-commerce, this paper constructs a three-sector open-economy dynamic stochastic general equilibrium model with heterogeneous tradable-goods producers, introduces a cross-border e-commerce export intermediary sector into the model, and quantitatively analyzes the economic effects of a foreign tax shock on cross-border e-commerce. The results show that: (1) the foreign cross-border e-commerce tax shock has a pronounced impact on China's output, with a suppression intensity reaching 71.3% and a duration of roughly 10 quarters; (2) for producers, the tax shock raises product prices, suppresses foreign household consumption, and triggers a decline in domestic output; in pursuit of maximum profit, domestic capital and labour shift toward traditional tradable and non-tradable goods; (3) raising the share of cross-border e-commerce in export trade lengthens the convergence period but allows the economy to cope with the tax shock more effectively; (4) the smaller the elasticity of substitution among cross-border e-commerce products, the smaller the negative impact of the tax shock; moreover, reducing the substitution elasticity attenuates the intensity of the shock response at a rate of 64.22% and shortens the output fluctuation cycle by 4.75 quarters on average. Accordingly, the paper proposes countermeasures such as cultivating brand sellers and strengthening consumers' brand identification, segmenting target consumers and raising consumer price stickiness, and introducing a temporary export-subsidy response mechanism while resolving difficulties with export tax rebates.

6.
Based on a dynamic stochastic general equilibrium (DSGE) model that incorporates the mechanisms through which the real and virtual economies affect money supply, this paper examines whether China's money supply is endogenous or exogenous, focusing on the impact of money demand shocks and speculative shocks on overall macroeconomic stability. The results show that China's money supply is endogenous; money demand shocks and speculative shocks have a substantial effect on money supply fluctuations and, in turn, a significant effect on inflation. Strengthening the guidance and management of public expectations, and thereby weakening the effects of money demand and speculative shocks, is therefore important for China's prevention and containment of inflation.

7.
The main goal in small area estimation is to use models to 'borrow strength' from the ensemble because the direct estimates of small area parameters are generally unreliable. However, model-based estimates from the small areas do not usually match the value of the single estimate for the large area. Benchmarking is done by applying a constraint, internally or externally, to ensure that the 'total' of the small areas matches the 'grand total'. This is particularly useful because it is difficult to check model assumptions owing to the sparseness of the data. We use a Bayesian nested error regression model, which incorporates unit-level covariates and sampling weights, to develop a method to internally benchmark the finite population means of small areas. We use two examples to illustrate our method. We also perform a simulation study to further assess the properties of our method.
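The benchmarking constraint itself is easy to illustrate. The sketch below uses simple external ratio benchmarking rather than the paper's Bayesian internal benchmarking, with made-up numbers, just to show the requirement that the weighted 'total' of the small areas matches the 'grand total':

```python
def benchmark_means(model_means, weights, direct_mean):
    """Ratio-benchmark model-based small area means so that their
    weighted combination reproduces the direct large-area estimate.

    weights are the areas' population shares (they sum to 1).
    """
    combined = sum(w * m for w, m in zip(weights, model_means))
    ratio = direct_mean / combined
    return [m * ratio for m in model_means]

model_means = [10.2, 8.7, 12.5, 9.9]   # hypothetical model-based estimates
weights = [0.4, 0.3, 0.2, 0.1]         # hypothetical population shares
direct_mean = 10.0                     # hypothetical direct large-area estimate

bench = benchmark_means(model_means, weights, direct_mean)
combined = sum(w * b for w, b in zip(weights, bench))
assert abs(combined - direct_mean) < 1e-12   # the constraint now holds
```

Internal benchmarking, as in the paper, instead builds this constraint into the model (or the Bayes estimator) rather than rescaling estimates after the fact.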

8.
Research on the Shock Transmission Mechanisms of China's Economic Transition to the New Normal
This paper analyzes the shock transmission mechanisms of China's gradual transition to the New Normal economy since 2007. Using an SV-TVP-VAR model, we analyze the dynamic impact on China's economic fluctuations over 2001Q1-2015Q3 of supply and demand shocks, represented by technology shocks and investment shocks. The results show that the transmission mechanisms of both shocks underwent structural change around 2007. In terms of direction, the short-run impact of investment shocks remains positive but becomes more volatile, while their medium- and long-run impact turns negative and gradually strengthens; the positive impact of technology shocks on China's economic growth strengthens gradually over both horizons, but has declined since 2014. In terms of magnitude, subperiod variance decompositions show that after 2007 the explanatory power of investment shocks for output fluctuations rises sharply, while that of technology shocks remains stable. These findings suggest that China's transition to the New Normal stems mainly from adverse demand-side shocks, although both supply and demand shocks have recently shown adverse effects; corresponding policy recommendations are offered.

9.
Hierarchical models are popular in many applied statistics fields, including small area estimation. One well known model employed in this particular field is the Fay–Herriot model, in which unobservable parameters are assumed to be Gaussian. In hierarchical models, assumptions about unobservable quantities are difficult to check. For a special case of the Fay–Herriot model, Sinharay and Stern [2003. Posterior predictive model checking in hierarchical models. J. Statist. Plann. Inference 111, 209-221] showed that violations of the assumptions about the random effects are difficult to detect using posterior predictive checks. In the present paper we consider two extensions of the Fay–Herriot model in which the random effects are assumed to be distributed according to either an exponential power (EP) distribution or a skewed EP distribution. We aim to explore the robustness of the Fay–Herriot model for the estimation of individual area means as well as the empirical distribution function of their 'ensemble'. Our findings, which are based on a simulation experiment, are largely consistent with those of Sinharay and Stern as far as the efficient estimation of individual small area parameters is concerned. However, when estimating the empirical distribution function of the 'ensemble' of small area parameters, results are more sensitive to the failure of distributional assumptions.

10.
Econometric Reviews, 2013, 32(4): 385-424
This paper introduces nonlinear dynamic factor models for various applications related to risk analysis. Traditional factor models represent the dynamics of processes driven by movements of latent variables, called the factors. Our approach extends this setup by introducing factors defined as random dynamic parameters and stochastic autocorrelated simulators. This class of factor models can represent processes with time varying conditional mean, variance, skewness and excess kurtosis. Applications discussed in the paper include dynamic risk analysis, such as risk in price variations (models with stochastic mean and volatility), extreme risks (models with stochastic tails), risk on asset liquidity (stochastic volatility duration models), and moral hazard in insurance analysis.

We propose estimation procedures for models with the marginal density of the series and the factor dynamics parameterized by distinct subsets of parameters. Such a partitioning of the parameter vector, found in many applications, allows statistical inference to be simplified considerably. We develop a two-stage maximum likelihood method, called the Finite Memory Maximum Likelihood, which is easy to implement in the presence of multiple factors. We also discuss simulation-based estimation, testing, prediction and filtering.
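One of the simplest members of this model class, a process whose conditional variance is driven by a stochastic autocorrelated factor, can be simulated directly. This sketch (parameter choices and names are mine, not the paper's) uses a Gaussian AR(1) log-volatility factor:

```python
import math
import random

def simulate_sv_factor(n, phi, sigma_eta, seed=3):
    """Simulate y_t = exp(h_t / 2) * eps_t, where the latent factor h_t
    (the log-volatility) follows a Gaussian AR(1): h_t = phi*h_{t-1} + eta_t.

    The factor acts as a random dynamic parameter, so the observed series
    has time-varying conditional variance and fat (leptokurtic) tails.
    """
    rng = random.Random(seed)
    h, ys, hs = 0.0, [], []
    for _ in range(n):
        h = phi * h + sigma_eta * rng.gauss(0, 1)   # latent volatility factor
        hs.append(h)
        ys.append(math.exp(h / 2) * rng.gauss(0, 1))  # observed series
    return ys, hs

ys, hs = simulate_sv_factor(n=5000, phi=0.95, sigma_eta=0.3)
# Marginally, y is a scale mixture of normals: its stationary kurtosis is
# 3 * exp(sigma_eta**2 / (1 - phi**2)) > 3, i.e. excess kurtosis is positive.
```

Extending the factor to drive the conditional mean, skewness, or tail thickness gives the richer risk models discussed in the abstract.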

11.
In this paper, two classes of censored δ-shock models are studied. The first is the lattice renewal binomial censored δ-shock model, in which shocks occur independently at discrete time epochs according to Bernoulli trials. The second is the Markovian censored δ-shock model, in which shocks arrive in accordance with a discrete-time Markov chain. In both censored δ-shock models, the lifetime distributions of the considered systems are derived. Using the probability generating function method, the expectation and variance of the lifetime in the binomial censored δ-shock model are also obtained. Some numerical examples are provided to illustrate the proposed models.
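A Monte Carlo sketch of the binomial censored δ-shock model can be checked against a known waiting-time formula. One common convention in the censored δ-shock literature, assumed here (names and parameters are mine), is that the system fails once δ consecutive shock-free epochs elapse since the last shock:

```python
import random

def lifetime_censored_delta_shock(p, delta, rng):
    """One realization of the binomial censored delta-shock model.

    Shocks occur independently at epochs 1, 2, 3, ... with probability p
    (Bernoulli trials).  Under the convention assumed here, the system
    fails at the first epoch completing delta consecutive shock-free
    periods since the last shock (or since time 0).
    """
    t, gap = 0, 0
    while True:
        t += 1
        if rng.random() < p:   # a shock arrives, the gap clock restarts
            gap = 0
        else:
            gap += 1
            if gap == delta:   # no shock for delta consecutive epochs
                return t

rng = random.Random(42)
p, delta, n = 0.5, 3, 20000
mean_lifetime = sum(lifetime_censored_delta_shock(p, delta, rng)
                    for _ in range(n)) / n

# For Bernoulli trials the expected waiting time for a run of delta
# shock-free epochs (probability q = 1 - p each) is (q**-delta - 1) / (1 - q),
# which equals 14 for p = 0.5 and delta = 3.
q = 1 - p
expected = (q ** -delta - 1) / (1 - q)
```

The paper obtains the exact distribution, expectation, and variance via probability generating functions; the simulation only serves as a sanity check on such formulas.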

12.
宋潇, 柳明. 《统计研究》, 2016, 33(10): 46-56
By adding firms' overseas financing and a financial accelerator mechanism, this paper builds a dynamic general equilibrium model in which the degree of capital account openness can be varied, and uses it to analyze how shocks from overseas capital markets affect the domestic economy under a free-trade-zone regime versus full openness. The simulations show that, in the long run, capital account openness is more conducive to economic stability when overseas capital market volatility is low, whereas indiscriminate opening amplifies economic fluctuations when overseas volatility is high. In the short run, opening the capital account increases the economy's exposure to overseas shocks regardless of whether overseas capital markets are stable. Compared with full openness, the partial openness embodied in the free-trade-zone regime effectively mitigates the short-run negative impact of overseas capital market volatility on domestic investment and output.

13.
李政 et al. 《统计研究》, 2018, 35(2): 29-39
Using a recursive MVMQ-CAViaR model, this paper empirically tests extreme risk spillover effects between onshore and offshore RMB interest rates. The results show that such spillovers exist: short-maturity rates exhibit significant bidirectional extreme risk spillovers, while for long maturities the spillover runs mainly one way, from onshore to offshore rates. At the extreme-risk level, the onshore market remains the center of interest rate information, and for now there is no concern about that role shifting to the offshore market. When the offshore RMB rate moves extremely, the central bank intervenes promptly and guides market participants' expectations, reducing the onshore rate's future extreme risk; in the face of small offshore shocks, however, the central bank prefers to let the onshore rate self-adjust, so offshore shocks raise the onshore rate's future extreme risk. The proposed model has good predictive ability and can help financial regulators manage offshore interest rate extreme risk dynamically and accurately.

14.
In brain mapping, the regions of the brain that are 'activated' by a task or external stimulus are detected by thresholding an image of test statistics. Often the experiment is repeated on several different subjects or for several different stimuli on the same subject, and the researcher is interested in the common points in the brain where 'activation' occurs in all test statistic images. The conjunction is thus defined as those points in the brain that show 'activation' in all images. We are interested in which parts of the conjunction are noise, and which show true activation in all test statistic images. We would expect truly activated regions to be larger than usual, so our test statistic is based on the volume of clusters (connected components) of the conjunction. Our main result is an approximate P-value for this in the case of the conjunction of two Gaussian test statistic images. The results are applied to a functional magnetic resonance experiment in pain perception.
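The construction of the conjunction and its cluster volumes can be sketched on synthetic Gaussian noise images (a pure-Python flood fill; the image size, threshold, and names are illustrative, and no approximate P-value is computed here):

```python
import random

def conjunction_clusters(shape, threshold, seed=0):
    """Threshold two independent Gaussian noise images, form their
    conjunction (points exceeding the threshold in BOTH images), and
    return the sizes of its 4-connected clusters via flood fill.
    """
    rng = random.Random(seed)
    rows, cols = shape
    img1 = [[rng.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]
    img2 = [[rng.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]
    conj = [[img1[i][j] > threshold and img2[i][j] > threshold
             for j in range(cols)] for i in range(rows)]

    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for i in range(rows):
        for j in range(cols):
            if conj[i][j] and not seen[i][j]:
                size, stack = 0, [(i, j)]
                seen[i][j] = True
                while stack:           # flood fill one connected component
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and conj[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return conj, sizes

conj, sizes = conjunction_clusters((64, 64), threshold=1.0)
n_conj = sum(row.count(True) for row in conj)
assert sum(sizes) == n_conj   # every conjunction pixel is in exactly one cluster
```

Under pure noise the conjunction clusters are small; the paper's contribution is an approximate P-value for observing a cluster as large as the largest one found.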

15.
Absolute risk is the chance that a person with given risk factors and free of the disease of interest at age a will be diagnosed with that disease in the interval (a, a + τ]. Absolute risk is sometimes called cumulative incidence. Absolute risk is a "crude" risk because it is reduced by the chance that the person will die of competing causes of death before developing the disease of interest. Cohort studies admit flexibility in modeling absolute risk, either by allowing covariates to affect the cause-specific relative hazards or to affect the absolute risk itself. An advantage of cause-specific relative risk models is that various data sources can be used to fit the required components. For example, case–control data can be used to estimate relative risk and attributable risk, and these can be combined with registry data on age-specific composite hazard rates for the disease of interest and with national data on competing hazards of mortality to estimate absolute risk. Family-based designs, such as the kin-cohort design and collections of pedigrees with multiple affected individuals can be used to estimate the genotype-specific hazard of disease. Such analyses must be adjusted for ascertainment, and failure to take into account residual familial risk, such as might be induced by unmeasured genetic variants or by unmeasured behavioral or environmental exposures that are correlated within families, can lead to overestimates of mutation-specific absolute risk in the general population.
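The competing-risk reduction that makes absolute risk "crude" can be shown with a discrete-time toy calculation (the hazard values are hypothetical):

```python
def absolute_risk(h_disease, h_death):
    """Crude absolute risk over discrete age intervals.

    h_disease[k], h_death[k]: cause-specific hazards of the disease of
    interest and of competing mortality in interval k.  Each interval
    contributes the disease hazard times the probability of having
    survived BOTH causes up to that interval.
    """
    surv, risk = 1.0, 0.0
    for hd, hm in zip(h_disease, h_death):
        risk += surv * hd
        surv *= (1 - hd) * (1 - hm)   # survive both causes this interval
    return risk

h_disease = [0.01] * 10   # hypothetical annual disease hazard over 10 years
h_death = [0.02] * 10     # hypothetical competing mortality hazard

crude = absolute_risk(h_disease, h_death)
net = absolute_risk(h_disease, [0.0] * 10)   # same, with no competing deaths
# Competing mortality removes people before they can develop the disease,
# so the crude risk is below the net ("pure") risk.
assert 0.0 < crude < net < 1.0
```

In practice the disease hazard would be assembled from relative risks (e.g. from case-control data) and registry composite rates, and the competing hazard from national mortality data, as the abstract describes.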

16.
This article investigates if the impact of uncertainty shocks on the U.S. economy has changed over time. To this end, we develop an extended factor augmented vector autoregression (VAR) model that simultaneously allows the estimation of a measure of uncertainty and its time-varying impact on a range of variables. We find that the impact of uncertainty shocks on real activity and financial variables has declined systematically over time. In contrast, the response of inflation and the short-term interest rate to this shock has remained fairly stable. Simulations from a nonlinear dynamic stochastic general equilibrium (DSGE) model suggest that these empirical results are consistent with an increase in the monetary authorities' anti-inflation stance and a "flattening" of the Phillips curve. Supplementary materials for this article are available online.

17.
The additive hazard rate model is sometimes more important to study than the celebrated proportional hazard rate model (Cox, 1972). But the concept of the hazard function is rather abstract in comparison with that of the mean residual life function. In this paper, we define a new model, the 'dynamic additive mean residual life model', in which the covariates are time dependent, and study the closure of this model under different stochastic orders.
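To make the mean residual life function concrete (the quantity the proposed model treats additively), here is a minimal empirical version with made-up data:

```python
def mean_residual_life(sample, t):
    """Empirical mean residual life m(t) = E[T - t | T > t] computed
    from an (uncensored) sample of lifetimes."""
    survivors = [x for x in sample if x > t]
    if not survivors:
        raise ValueError("no observations beyond t")
    return sum(x - t for x in survivors) / len(survivors)

sample = [2.0, 3.0, 5.0, 7.0, 11.0]   # hypothetical lifetimes
# At t = 0 the mean residual life is just the ordinary mean lifetime.
assert abs(mean_residual_life(sample, 0.0) - sum(sample) / len(sample)) < 1e-12
# m(4) averages the remaining life of the three survivors: (1 + 3 + 7) / 3.
assert abs(mean_residual_life(sample, 4.0) - 11.0 / 3.0) < 1e-12
```

Unlike the hazard, m(t) answers a directly interpretable question, how much longer a unit that has survived to t is expected to last, which is the motivation the abstract gives for modelling it instead.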

18.
Summary MultiGaussian models have the intrinsic property of imposing very little continuity on extreme values. If the variable being modeled is hydraulic conductivity and the processes being studied are groundwater flow and mass transport, the absence of continuous paths of extreme values will have a retardation effect on the computed travel times. In the case of radionuclide release of nuclear waste from a deep geological repository, underestimation of travel times may lead to unsafe decision making. To demonstrate the impact of the low continuity of extreme values implicit in multiGaussian models, travel times are computed in a site similar to Finnsjön (one of the sites in crystalline rock studied in Sweden) using two stochastic models with the same histogram and covariance; one of them is multiGaussian, and the other is not and displays high connectivity of extreme high values. The results show that the multiGaussian model leads to less conservative results than the non-multiGaussian one. Invoking the parsimony principle to select a multiGaussian model, as the simplest model that can be fully described by a mean value and a covariance function, should not be justification enough for such a selection. If there are not enough data to characterize the connectivity of the extreme values, and therefore to discriminate whether a multiGaussian model is appropriate or not, less parsimonious models must also be considered.

19.
The Impact of International Oil Price Shocks on China's Macroeconomy
段继红. 《统计研究》, 2010, 27(7): 25-29
Oil price shocks have long been accompanied by severe international economic and social turbulence, making their macroeconomic impact an increasingly important research topic. This paper is the first to use a structural vector autoregression (SVAR) model to study the dynamic effect of international oil price fluctuations on China's macroeconomy. The empirical results show that rising international oil prices do have a negative impact on output, although output, after returning to zero, crosses it and continues to rise; rising oil prices have a positive but insignificant effect on the CPI, which does not respond in the current period but only after a considerable lag, then peaks, declines slowly back toward zero, and continues downward past zero; and rising international oil prices have essentially no effect on the one-year deposit rate. The paper concludes with explanations for these results and corresponding policy recommendations.

20.
In a seminal paper, Godambe [1985. The foundations of finite sample estimation in stochastic processes. Biometrika 72, 419-428] introduced the 'estimating function' approach to estimation of parameters in semi-parametric models under a filtering associated with a martingale structure. Later, Godambe [1987. The foundations of finite sample estimation in stochastic processes II. Bernoulli, Vol. 2. V.N.V. Science Press, 49-54] and Godambe and Thompson [1989. An extension of quasi-likelihood estimation. J. Statist. Plann. Inference 22, 137-172] replaced this filtering by a more flexible conditioning. Abraham et al. [1997. On the prediction for some nonlinear time-series models using estimating functions. In: Basawa, I.V., et al. (Eds.), IMS Selected Proceedings of the Symposium on Estimating Functions, Vol. 32, pp. 259-268] and Thavaneswaran and Heyde [1999. Prediction via estimating functions. J. Statist. Plann. Inference 77, 89-101] invoked the theory of estimating functions for one-step-ahead prediction in time-series models. This paper addresses the problem of simultaneous estimation of parameters and multi-step-ahead prediction of a vector of future random variables in semi-parametric models by extending the approach of the latter two works. The proposed technique is in conformity with the paradigm of the modern theory of estimating functions, leading to finite sample optimality within a chosen class of estimating functions, which in turn are used to get the predictors. Particular applications of the technique give predictors that enjoy optimality properties with respect to other well-known criteria.
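A toy instance of the estimating-function idea, far simpler than the paper's semi-parametric multi-step setting but in the same spirit (names are mine): for an AR(1) model the quasi-score estimating equation has a closed-form root, which then feeds multi-step predictors.

```python
def ar1_estimating_equation(x):
    """Solve the martingale estimating equation
        g(theta) = sum_t (x_t - theta * x_{t-1}) * x_{t-1} = 0
    for an AR(1) process; this quasi-score is the optimal estimating
    function, in Godambe's finite-sample sense, within the linear class
    for this model.
    """
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def ar1_predict(x, theta, steps):
    """Multi-step-ahead predictions: E[x_{n+k} | x_n] = theta**k * x_n."""
    return [theta ** k * x[-1] for k in range(1, steps + 1)]

# On a noiseless AR(1) path the estimating equation recovers theta exactly.
theta_true = 0.8
x = [1.0]
for _ in range(20):
    x.append(theta_true * x[-1])
theta_hat = ar1_estimating_equation(x)
assert abs(theta_hat - theta_true) < 1e-12
preds = ar1_predict(x, theta_hat, 3)   # 1-, 2-, 3-step-ahead predictors
```

The paper's contribution is to carry out this estimate-then-predict program jointly, and optimally, for vectors of future values in general semi-parametric models.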
