71.
We describe a novel deterministic approximate inference technique for conditionally Gaussian state space models, i.e. state space models where the latent state consists of both multinomial and Gaussian distributed variables. The method can be interpreted as a smoothing pass and iteration scheme symmetric to an assumed density filter. It improves upon previously proposed smoothing passes by not making more approximations than implied by the projection onto the chosen parametric form, the assumed density. Experimental results show that the novel scheme outperforms these alternative deterministic smoothing passes. Comparisons with sampling methods suggest that the performance does not degrade with longer sequences.
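For reference, in the purely Gaussian special case of such state space models the exact smoothing pass is the Rauch-Tung-Striebel (RTS) recursion. The sketch below is that classical filter/smoother pair for a scalar linear-Gaussian model, not the paper's approximate scheme (which also handles the multinomial component); the parameters `a`, `q`, `r` are illustrative assumptions.

```python
import numpy as np

def kalman_filter(y, a, q, r, m0, p0):
    """Forward filtering pass for x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r)."""
    n = len(y)
    m, p = np.zeros(n), np.zeros(n)
    mp, pp = m0, p0
    for t in range(n):
        if t > 0:
            mp, pp = a * m[t - 1], a * a * p[t - 1] + q   # predict
        k = pp / (pp + r)                                  # Kalman gain
        m[t] = mp + k * (y[t] - mp)                        # update mean
        p[t] = (1 - k) * pp                                # update variance
    return m, p

def rts_smoother(m, p, a, q):
    """Backward Rauch-Tung-Striebel smoothing pass over filtered moments."""
    n = len(m)
    ms, ps = m.copy(), p.copy()
    for t in range(n - 2, -1, -1):
        pp = a * a * p[t] + q                  # one-step predicted variance
        g = a * p[t] / pp                      # smoother gain
        ms[t] = m[t] + g * (ms[t + 1] - a * m[t])
        ps[t] = p[t] + g * g * (ps[t + 1] - pp)
    return ms, ps
```

The smoothed variances are never larger than the filtered ones, since the backward pass folds in future observations.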
72.
In bearing-only cross-localization from a single moving observer, the filter initial values fall into two classes. This paper explains, from geometric principles, how the two classes of initial values affect the localization result, and proposes adding phase-difference measurements: using the rate of change of the phase difference together with the bearing information, the nonlinear system is converted into a linear one, and a Kalman filter is then applied to reduce the dependence of the localization result on the filter initialization. For the converted linear system, the Kalman filter's initial value can be obtained from previous measurements. Simulations show that the method effectively removes the dependence of the localization result on the filter initialization, with localization accuracy within 2%.
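The bearing-crossing geometry underlying this approach can be illustrated without the phase-difference measurements: each bearing taken from a known observer position defines a line through the target, and several such lines give the target position by linear least squares. The sketch below shows only this geometric step, not the paper's Kalman-filter formulation; observer positions and bearings are hypothetical.

```python
import numpy as np

def bearings_fix(obs_pos, bearings):
    """Least-squares target position from bearings (radians from the x-axis)
    taken at known observer positions. Each bearing b at (ox, oy) defines
    the line sin(b)*x - cos(b)*y = sin(b)*ox - cos(b)*oy."""
    a, c = [], []
    for (ox, oy), b in zip(obs_pos, bearings):
        a.append([np.sin(b), -np.cos(b)])
        c.append(np.sin(b) * ox - np.cos(b) * oy)
    sol, *_ = np.linalg.lstsq(np.array(a), np.array(c), rcond=None)
    return sol
```

With noisy bearings from a moving observer, the least-squares fix degrades near-collinear geometries, which is exactly the initialization sensitivity the paper's linearized Kalman formulation addresses.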
73.
The Loss of Control of Online Public Opinion and an Analysis of Its Impact on Opinion Dissemination   (Cited: 1; self-citations: 0; other citations: 1)
Online discourse provides conditions for equal dialogue between individuals and the public, but like-minded information gathering and site linking, together with the homogeneity of collective opinion in "virtual communities", often produce a spontaneous collaborative filtering of information. What netizens can hear and see is only their own "echoes" and "shadows", and this narrow opinion climate can in turn give rise to group polarization: "public opinion" may amount to little more than unchecked emotional venting and uncontrollable symbolic violence. The online opinion environment can thus allow "false public opinion" to spread, lend a certain legitimacy and uncontrollability to deviant behavior, and raise the danger of a "tyranny of the majority". Online public opinion therefore requires effective regulation and guidance.
74.
Since some known similarity measures and distance formulas between Vague sets are flawed, this paper proposes a definition of the closeness degree between Vague sets (and Vague values) expressed by piecewise functions, and gives three weighted closeness formulas. A method for filtering network information in a Vague environment using the closeness degree between Vague sets is presented, and worked examples show that the proposed formulas are effective.
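The paper's piecewise formulas are not reproduced in the abstract; the sketch below uses one common simple closeness measure between vague values (t, f), with t the truth-membership and f the false-membership, to show how closeness-based information filtering could work. The threshold step is an assumed usage pattern, not the paper's method.

```python
def closeness(x, y):
    """A simple illustrative closeness between vague values x=(t_x, f_x),
    y=(t_y, f_y); equals 1 when the values coincide, 0 when maximally apart."""
    tx, fx = x
    ty, fy = y
    return 1.0 - (abs(tx - ty) + abs(fx - fy)) / 2.0

def weighted_closeness(a, b, w):
    """Weighted closeness between two vague sets (attribute-wise lists of
    vague values), with attribute weights w summing to 1."""
    return sum(wi * closeness(x, y) for wi, x, y in zip(w, a, b))

def filter_documents(profile, docs, w, threshold=0.8):
    """Keep documents whose vague-set representation is close to the profile."""
    return [d for d in docs if weighted_closeness(profile, d, w) >= threshold]
```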
75.
76.
We develop a continuous-time model for analyzing and valuing catastrophe mortality contingent claims based on stochastic modeling of the force of mortality. We derive parameter estimates from a 105-year time series of U.S. population mortality data using a simulated maximum likelihood approach based on a particle filter. Relying on the resulting parameters, we calculate loss profiles for a representative catastrophe mortality transaction and compare them to the “official” loss profiles that are provided by the issuers to investors and rating agencies. We find that although the loss profiles are subject to great uncertainties, the official figures fall significantly below the corresponding risk statistics based on our model. In particular, we find that the annualized incidence probability of a mortality catastrophe, defined as a 15% increase in aggregated mortality probabilities, is about 1.4%—compared to about 0.1% according to the official loss profiles.
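A bootstrap (sequential importance resampling) particle filter of the kind used for simulated maximum likelihood can be sketched on a toy model: propagate particles through the state equation, weight them by the observation density, accumulate the log-likelihood, and resample. The linear-Gaussian toy model and parameters below are illustrative; the paper's mortality model is different.

```python
import numpy as np

def particle_loglik(y, a, q, r, n_part=500, seed=0):
    """Bootstrap particle filter log-likelihood for the toy model
    x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r), with |a| < 1."""
    rng = np.random.default_rng(seed)
    x = rng.normal(scale=np.sqrt(q / (1 - a * a)), size=n_part)  # stationary init
    ll = 0.0
    for yt in y:
        x = a * x + rng.normal(scale=np.sqrt(q), size=n_part)    # propagate
        w = np.exp(-0.5 * (yt - x) ** 2 / r) / np.sqrt(2 * np.pi * r)
        ll += np.log(w.mean() + 1e-300)        # simulated likelihood increment
        w = w / w.sum()
        x = x[rng.choice(n_part, size=n_part, p=w)]              # resample
    return ll
```

Simulated maximum likelihood then maximizes this (noisy) log-likelihood over the model parameters, typically with common random numbers to keep the surface smooth across parameter values.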
77.
In this paper, we investigate a bootstrap method based on a martingale representation of the relevant statistic, for inference on a class of functionals of the survival distribution. The method is similar in spirit to Efron's (1981) bootstrap, and in the present paper will be referred to as the "martingale-based bootstrap". The method derives from Lin, Wei and Ying (1993), who applied it to checking the Cox model with cumulative sums of martingale-based residuals. It is shown that this martingale-based bootstrap gives a correct first-order asymptotic approximation to the distribution function of the corresponding functional of the Kaplan-Meier estimator. As a consequence, confidence intervals constructed by the martingale-based bootstrap have asymptotically correct coverage probability. Our simulation study indicates that for small and moderate sample sizes the martingale-based bootstrap can be uniformly better than the usual bootstrap method in estimating the sampling distribution of a mean function and of a point probability in survival analysis.
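The functionals in question are functionals of the Kaplan-Meier estimator. A minimal implementation of that estimator (of the estimator only, not of the martingale-based bootstrap itself) is:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. events[i] = 1 for an observed death,
    0 for right censoring. Returns the distinct event times and S(t) there."""
    t = np.asarray(times, dtype=float)
    d = np.asarray(events, dtype=int)
    uniq = np.unique(t[d == 1])
    s, surv = 1.0, []
    for u in uniq:
        at_risk = np.sum(t >= u)               # subjects still under observation
        deaths = np.sum((t == u) & (d == 1))   # deaths at this time
        s *= 1.0 - deaths / at_risk            # product-limit update
        surv.append(s)
    return uniq, np.array(surv)
```

A bootstrap for a functional of S (a mean survival time, or a point probability S(t0)) would repeatedly perturb or resample this estimate and read off the functional; the paper's variant replaces Efron-style resampling with resampling of a martingale representation.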
78.
Existing real-options research rests on the assumption of full information. This paper instead studies an investor's optimal investment-consumption problem of maximizing infinite-horizon consumption utility under partial information, and derives the consumption-utility indifference price of a real option. Using the separation principle for the controlled system, Kalman filtering techniques, and stochastic control methods, we obtain the free-boundary partial differential equation of the real option under a CARA utility function. A finite-difference method then yields numerical solutions for the option's implied value and the optimal exercise level, and hence for the optimal investment-consumption strategy and the utility function. Monte Carlo simulation illustrates the differences in the investor's dynamic decisions under full and partial information, and a comparison of investor welfare under the two information levels provides a measure of the value of information.
79.
Franz Konecny, Statistics (2013), 47(1), 113-118
In this paper we are concerned with a class of simple point processes whose unobservable stochastic intensity is a shot-noise process. We derive a stochastic equation for the conditional moment generating function of the intensity, which can be solved in a recursive way. This yields explicit expressions for the minimum variance estimate of the intensity, as well as for the likelihood ratio with respect to the reference measure, on the basis of point process observations.
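A shot-noise intensity is a superposition of decaying responses to shocks arriving at random times. A minimal simulation sketch with exponential response (jump height `h` and decay rate `beta` are illustrative assumptions) is:

```python
import numpy as np

def shot_noise(t_grid, shot_times, h=1.0, beta=2.0):
    """Shot-noise intensity lambda(t) = sum over past shots tau_i of
    h * exp(-beta * (t - tau_i)), evaluated on a time grid."""
    lam = np.zeros_like(t_grid, dtype=float)
    for tau in shot_times:
        mask = t_grid >= tau
        lam[mask] += h * np.exp(-beta * (t_grid[mask] - tau))
    return lam
```

Driving the shot times by a Poisson process and generating points from the resulting intensity gives a Cox (doubly stochastic) point process of the kind the paper filters.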
80.
This article presents a review of some modern approaches to trend extraction for one-dimensional time series, which is one of the major tasks of time series analysis. The trend of a time series is usually defined as a smooth additive component which contains information about the time series' global change, and we discuss this and other definitions of the trend. We do not aim to review all the novel approaches, but rather to observe the problem from different viewpoints and from different areas of expertise. The article contributes to understanding the concept of a trend and the problem of its extraction. We present an overview of advantages and disadvantages of the approaches under consideration, which are: the model-based approach (MBA), nonparametric linear filtering, singular spectrum analysis (SSA), and wavelets. The MBA assumes the specification of a stochastic time series model, which is usually either an autoregressive integrated moving average (ARIMA) model or a state space model. The nonparametric filtering methods do not require specification of a model and are popular because of their simplicity in application. We discuss the Henderson, LOESS, and Hodrick–Prescott filters and their versions derived by exploiting the Reproducing Kernel Hilbert Space methodology. In addition to these prominent approaches, we consider SSA and wavelet methods. SSA is widespread in the geosciences; its algorithm is similar to that of principal components analysis, but SSA is applied to time series. Wavelet methods are the de facto standard for denoising in signal processing, and recent works have revealed their potential in trend analysis.
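Of the nonparametric filters mentioned, the Hodrick-Prescott filter has a particularly compact form: the trend minimizes a squared fit term plus a penalty on second differences, and the first-order condition is a linear system. The dense-matrix sketch below is fine for short series (production code would use sparse matrices); the smoothing parameter 1600 is the common quarterly-data convention.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend: minimize ||y - tau||^2 + lam * ||D2 @ tau||^2,
    where D2 is the second-difference matrix; the minimizer solves
    (I + lam * D2.T @ D2) @ tau = y."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    d2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        d2[i, i:i + 3] = [1.0, -2.0, 1.0]      # second-difference stencil
    a = np.eye(n) + lam * d2.T @ d2
    return np.linalg.solve(a, y)
```

A linear series has zero second differences, so it passes through the filter unchanged; larger `lam` forces the trend closer to a straight line.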
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号