Full-text availability:
  Fee-based full text: 891 articles
  Free: 36 articles
  Free (domestic): 22 articles
By discipline:
  Management: 281 articles
  Ethnology: 1 article
  Demography: 14 articles
  Collected works: 8 articles
  Theory and methodology: 9 articles
  General: 261 articles
  Sociology: 11 articles
  Statistics: 364 articles
By publication year:
  2024: 1 article
  2023: 4 articles
  2022: 13 articles
  2021: 9 articles
  2020: 30 articles
  2019: 34 articles
  2018: 42 articles
  2017: 52 articles
  2016: 34 articles
  2015: 31 articles
  2014: 42 articles
  2013: 127 articles
  2012: 63 articles
  2011: 71 articles
  2010: 39 articles
  2009: 42 articles
  2008: 58 articles
  2007: 45 articles
  2006: 53 articles
  2005: 43 articles
  2004: 26 articles
  2003: 19 articles
  2002: 17 articles
  2001: 13 articles
  2000: 5 articles
  1999: 9 articles
  1998: 4 articles
  1997: 5 articles
  1996: 6 articles
  1995: 2 articles
  1994: 2 articles
  1993: 1 article
  1992: 2 articles
  1990: 2 articles
  1988: 1 article
  1987: 1 article
  1981: 1 article
949 results in total (search time: 15 ms)
1.
In this paper, we consider the deterministic trend model in which the error process is allowed to be weakly or strongly correlated and subject to non-stationary volatility. Extant estimators of the trend coefficient are analysed. We find that under heteroskedasticity, the Cochrane-Orcutt-type estimator (with some initial condition) can be less efficient than Ordinary Least Squares (OLS) when the process is highly persistent, whereas it is asymptotically equivalent to OLS when the process is less persistent. An efficient non-parametrically weighted Cochrane-Orcutt-type estimator is then proposed, whose efficiency is uniform over weak or strong serial correlation and non-stationary volatility of unknown form. The feasible estimator relies on non-parametric estimation of the volatility function, and the asymptotic theory is provided. We use a data-dependent smoothing bandwidth that automatically adjusts to the strength of non-stationarity in the volatilities. The implementation requires neither pretesting the persistence of the process nor specifying the form of the non-stationary volatility. Finite-sample evaluation via simulations and an empirical application demonstrates the good performance of the proposed estimators.
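As a point of reference for the comparison above, here is a minimal Python sketch contrasting OLS with a one-step Cochrane-Orcutt fit of a linear trend under AR(1) errors. The errors here are homoskedastic, and the paper's non-parametric volatility weighting is not reproduced; the sample size, slope, and AR coefficient are illustrative choices.

    # Minimal sketch: OLS vs. one-step Cochrane-Orcutt trend estimation
    # under AR(1) errors (homoskedastic; illustrative parameters only).
    import numpy as np

    rng = np.random.default_rng(0)
    n, a, b, rho = 500, 1.0, 0.05, 0.8

    # simulate y_t = a + b*t + u_t with u_t = rho*u_{t-1} + e_t
    e = rng.standard_normal(n)
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    t_idx = np.arange(1, n + 1)
    y = a + b * t_idx + u

    # OLS trend estimate
    X = np.column_stack([np.ones(n), t_idx])
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

    # Cochrane-Orcutt: estimate rho from OLS residuals,
    # quasi-difference both sides, then refit
    resid = y - X @ beta_ols
    rho_hat = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
    y_star = y[1:] - rho_hat * y[:-1]
    X_star = X[1:] - rho_hat * X[:-1]
    beta_co = np.linalg.lstsq(X_star, y_star, rcond=None)[0]

    print("OLS slope:", beta_ols[1], "Cochrane-Orcutt slope:", beta_co[1])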
2.
Many recent papers have used semiparametric methods, especially the log-periodogram regression, to detect and estimate long memory in the volatility of asset returns. In these papers, the volatility is proxied by measures such as squared, log-squared, and absolute returns. While the evidence for the existence of long memory is strong using any of these measures, the actual long memory parameter estimates can be sensitive to which measure is used. In Monte Carlo simulations, I find that if the data are conditionally leptokurtic, the log-periodogram regression estimator based on squared returns has a large downward bias, which is avoided by using the other volatility measures. In United States stock return data, I find that squared returns give much lower estimates of the long memory parameter than the alternative volatility measures, consistent with the simulation results. I conclude that researchers should avoid using squared returns in the semiparametric estimation of long memory volatility dependencies.
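For concreteness, a hedged sketch of the log-periodogram (GPH) regression applied to three volatility proxies. The returns below are simulated iid Student-t draws (so the true d is zero for every proxy), and the bandwidth m = n^0.5 is an illustrative convention, not the paper's design:

    # GPH log-periodogram regression of the memory parameter d,
    # applied to squared, absolute, and log-squared return proxies.
    import numpy as np

    def gph_d(x, m):
        """GPH estimate: regress log periodogram on -2*log(2*sin(lam/2))."""
        n = len(x)
        x = np.asarray(x, dtype=float) - np.mean(x)
        lam = 2 * np.pi * np.arange(1, m + 1) / n
        I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
        X = -2 * np.log(2 * np.sin(lam / 2))   # slope on X is d
        Xc = X - X.mean()
        return Xc @ (np.log(I) - np.log(I).mean()) / (Xc @ Xc)

    rng = np.random.default_rng(1)
    n = 4096
    r = rng.standard_t(df=5, size=n) * 0.01    # leptokurtic iid "returns"
    m = int(n ** 0.5)
    for name, proxy in [("squared", r**2), ("absolute", np.abs(r)),
                        ("log-squared", np.log(r**2 + 1e-12))]:
        print(name, gph_d(proxy, m))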
3.
There is an emerging consensus in empirical finance that realized volatility series typically display long-range dependence, with a memory parameter (d) around 0.4 (Andersen et al., 2001; Martens et al., 2004). The present article provides some illustrative analysis of how long memory may arise from the accumulative process underlying realized volatility. The article also uses results in Lieberman and Phillips (2004, 2005) to refine statistical inference about d by higher-order theory. Whereas standard asymptotic theory has an O(n^{-1/2}) error rate in rejection probabilities, the theory used here refines the approximation to an error rate of o(n^{-1/2}). The new formula is independent of unknown parameters, simple to calculate, and user-friendly. The method is applied to test whether the long memory parameter estimates reported by Andersen et al. (2001) and Martens et al. (2004) differ significantly from the lower boundary (d = 0.5) of nonstationary long memory, and it generally confirms the earlier findings.
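A first-order-only illustration of the kind of test described: a z-test of H0: d = 0.5 based on the standard asymptotic variance pi^2/(24m) of the log-periodogram estimator. The Lieberman-Phillips higher-order correction used in the article is not reproduced, and the estimate and bandwidth below are hypothetical numbers:

    # First-order z-test of H0: d = 0.5 against d < 0.5, using the
    # asymptotic variance pi^2/(24*m) of the GPH estimator.
    import math
    from scipy.stats import norm

    d_hat, m = 0.40, 200                 # hypothetical estimate and bandwidth
    se = math.pi / math.sqrt(24 * m)
    z = (d_hat - 0.5) / se
    print("z =", z, "one-sided p-value:", norm.cdf(z))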
4.
Impacts of complex emergencies and relief interventions have often been evaluated by comparing absolute mortality with international standardized mortality rates. A better evaluation would compare mortality with the local baseline of the affected population. Projecting population-based survival data from before the emergency into the period of the emergency or intervention can provide such a local baseline reference. We find that a log-transformed Gaussian time-series model, in which the standard errors of the estimated rates are included in the variance, has the best forecasting capacity. If time-at-risk during the forecast period is known, however, forecasting may instead be based on a Poisson time-series model with overdispersion. In either case, the standard errors of the estimated rates must enter the variance of the model: additively in the Gaussian model, or multiplicatively, through overdispersion, in the Poisson model. The data on which the forecast is based must be modelled carefully, with respect not only to calendar-time trends but also to periods of excess event frequency (epidemics) and to seasonal variation, in order to eliminate residual autocorrelation and to provide a proper reference that reflects changes over time during the emergency. Hence, when modelled properly, it is possible to predict a baseline for an emergency-affected population based on local conditions. We predicted childhood mortality during the 1998-1999 war in Guinea-Bissau and found increased mortality in the first half-year of the war, with mortality in the last half-year corresponding to the expected level.
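A minimal sketch, not the authors' fitted model: an overdispersed (quasi-Poisson) log-linear model with a calendar-time trend, seasonal terms, and log time-at-risk as an offset, projected forward to form a baseline. The monthly data are synthetic placeholders:

    # Quasi-Poisson baseline mortality model with trend, seasonality,
    # and log time-at-risk offset; synthetic monthly data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    months = np.arange(60)                         # 5 pre-emergency years
    risk_time = rng.uniform(900, 1100, size=60)    # person-months at risk
    lam = np.exp(-4.0 - 0.003 * months
                 + 0.2 * np.sin(2 * np.pi * months / 12)) * risk_time
    deaths = rng.poisson(lam)

    X = sm.add_constant(np.column_stack([
        months,
        np.sin(2 * np.pi * months / 12),
        np.cos(2 * np.pi * months / 12),
    ]))
    res = sm.GLM(deaths, X, family=sm.families.Poisson(),
                 offset=np.log(risk_time)).fit(scale="X2")
    print("overdispersion estimate:", res.scale)   # Pearson chi2 scale

    # project the expected baseline 12 months ahead, given time-at-risk
    fut = np.arange(60, 72)
    Xf = sm.add_constant(np.column_stack([
        fut,
        np.sin(2 * np.pi * fut / 12),
        np.cos(2 * np.pi * fut / 12),
    ]))
    print(res.predict(Xf, offset=np.log(np.full(12, 1000.0))))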
5.
To capture mean and variance asymmetries and time-varying volatility in financial time series, we generalize the threshold stochastic volatility (THSV) model and incorporate a heavy-tailed error distribution. Unlike existing stochastic volatility models, this model simultaneously accounts for uncertainty about the unobserved threshold value and about the time-delay parameter. Both self-exciting and exogenous threshold variables are considered, in order to investigate the impact of a number of market news variables on volatility changes. Adopting a Bayesian approach, we use Markov chain Monte Carlo methods to estimate all unknown parameters and latent variables. A simulation experiment demonstrates good estimation performance for reasonable sample sizes. In a study of two international financial market indices, we consider two variants of the generalized THSV model, with US market news as the threshold variable. Finally, we compare models using Bayesian forecasting in a value-at-risk (VaR) study. The results show that the proposed model generates more accurate VaR forecasts than standard models.
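A simulation-only sketch loosely in the spirit of the THSV model described above: a self-exciting threshold switches the level and persistence of log-volatility, with Student-t observation errors. The parameter values, the fixed delay d = 1, and the threshold r = 0 are illustrative assumptions; the paper treats the threshold and delay as unknown and estimates them by MCMC:

    # Simulate a self-exciting threshold SV process with t errors.
    import numpy as np

    rng = np.random.default_rng(3)
    n, r, d = 2000, 0.0, 1
    mu  = (-0.2, -0.4)      # regime-specific mean of log-volatility
    phi = (0.95, 0.90)      # regime-specific persistence
    s_eta, nu = 0.15, 7     # vol-of-vol and t degrees of freedom

    y = np.zeros(n)
    h = np.zeros(n)
    for t in range(1, n):
        k = 0 if y[t - d] <= r else 1          # regime from lagged return
        h[t] = mu[k] + phi[k] * (h[t - 1] - mu[k]) + s_eta * rng.standard_normal()
        y[t] = np.exp(h[t] / 2) * rng.standard_t(nu)

    print("sample kurtosis of y:", ((y - y.mean())**4).mean() / y.var()**2)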
6.
Urban transportation planning methods under traveller information provision   Cited: 1 (self-citations: 0; by others: 1)
The introduction of information into transportation networks has changed travellers' behaviour, so conventional travel-demand forecasting models need appropriate modification under information provision. This paper proposes a new methodological framework for urban transportation planning under information provision that integrates the traditional trip-generation model with a dynamic traffic simulation model.
7.
China's stock markets use two trading mechanisms: call auction and continuous auction. The difference between them affects the volatility of stock prices, and volatility is a double-edged sword for a stock market, so pursuing moderate volatility from the perspective of the trading mechanism has become a central theoretical concern. Taking the Shanghai stock market as the subject, this paper conducts an empirical study of all trading data for 2001. The results show that the variance of returns on opening prices, formed by call auction, exceeds the variance of returns on closing prices, formed by continuous auction. The reason is that, compared with continuous auction, the call auction's trading process is less transparent and its orders cannot be modified once submitted. Policy recommendations on the trading system are offered accordingly.
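A hedged sketch of the kind of comparison reported: the variance of open-to-open returns (opening prices set by call auction) versus close-to-close returns (closing prices from continuous trading), with a two-sided F-test. The prices are simulated placeholders, not Shanghai market data:

    # Variance ratio of open-to-open vs. close-to-close returns.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    n = 244                                   # roughly one trading year
    close = 100 * np.exp(np.cumsum(rng.normal(0, 0.015, n)))
    open_ = close * np.exp(rng.normal(0, 0.008, n))   # noisier opens

    r_open  = np.diff(np.log(open_))
    r_close = np.diff(np.log(close))
    F = r_open.var(ddof=1) / r_close.var(ddof=1)
    dfn = dfd = len(r_open) - 1
    p = 2 * min(stats.f.sf(F, dfn, dfd), stats.f.cdf(F, dfn, dfd))
    print("variance ratio:", F, "two-sided p-value:", p)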
8.
This article discusses the importance of demand management for capacity-constrained manufacturers after the transition to a buyer's market, and analyses in detail the meaning of demand management, its basic content (sales forecasting, order management, shipment management, and sales analysis), and its methods, tools, and workflow.
9.
Research on the error information matrix of combination forecasting   Cited: 11 (self-citations: 0; by others: 11)
This paper studies the connection between the structure of the error information matrix of combination forecasting and the properties of combination forecasting methods, introduces the concept of redundant information for the first time, and investigates the combination structure of optimal combination forecasting methods.
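A minimal sketch under the assumption that the error information matrix is the sample cross-product matrix of the individual forecast errors; the classical optimal combination weights that minimize the combined error variance are then w = E^{-1}1 / (1'E^{-1}1). The error series are synthetic, and the paper's redundant-information analysis is not reproduced:

    # Optimal combination weights from the forecast-error matrix E.
    import numpy as np

    rng = np.random.default_rng(5)
    T, k = 200, 3
    errors = rng.multivariate_normal(
        mean=np.zeros(k),
        cov=[[1.0, 0.8, 0.2], [0.8, 1.0, 0.3], [0.2, 0.3, 0.5]],
        size=T)                               # errors of k individual forecasts

    E = errors.T @ errors / T                 # error information matrix
    ones = np.ones(k)
    w = np.linalg.solve(E, ones)
    w /= ones @ w                             # weights sum to one
    print("optimal weights:", w)
    print("combined error variance:", w @ E @ w)   # <= min of diag(E)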
10.
Econometric Reviews, 2008, 27(1): 268-297
Nonlinear functions of multivariate financial time series can exhibit long memory and fractional cointegration. However, the tools for analysing these phenomena have principally been justified under assumptions that are invalid in this setting. Deriving asymptotic theory under more plausible assumptions can be complicated and lengthy. We discuss these issues and present a Monte Carlo study showing that asymptotic theory should not necessarily be expected to provide a good approximation to finite-sample behavior.
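A small Monte Carlo sketch of the point being made: even in a simple linear ARFIMA(0, d, 0) setting, the finite-sample spread of a semiparametric estimate of d can sit some way from its first-order N(d, pi^2/(24m)) approximation. The fractional series is built from the MA coefficients of (1-L)^(-d); the sample size, d, bandwidth, and replication count are illustrative:

    # Monte Carlo: finite-sample vs. asymptotic behaviour of the GPH
    # estimator on simulated ARFIMA(0, d, 0) series.
    import numpy as np

    def frac_series(d, n, rng):
        # MA(inf) weights of (1-L)^(-d): psi_j = psi_{j-1}*(j-1+d)/j
        psi = np.ones(n)
        for j in range(1, n):
            psi[j] = psi[j - 1] * (j - 1 + d) / j
        e = rng.standard_normal(n)
        return np.convolve(psi, e)[:n]

    def gph_d(x, m):
        # same helper as in the sketch under item 2 above
        n = len(x)
        lam = 2 * np.pi * np.arange(1, m + 1) / n
        I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
        X = -2 * np.log(2 * np.sin(lam / 2))
        Xc = X - X.mean()
        return Xc @ (np.log(I) - np.log(I).mean()) / (Xc @ Xc)

    rng = np.random.default_rng(6)
    n, d, reps = 512, 0.3, 500
    m = int(n ** 0.5)
    est = [gph_d(frac_series(d, n, rng), m) for _ in range(reps)]
    print("mean:", np.mean(est), "sd:", np.std(est),
          "asymptotic sd:", np.pi / np.sqrt(24 * m))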