51.
Estimation and Properties of a Time-Varying EGARCH(1,1) in Mean Model   (total citations: 1; self-citations: 1; citations by others: 0)
Time-varying GARCH-M models are commonly employed in econometrics and financial economics. Yet the recursive nature of the conditional variance makes likelihood analysis of these models computationally infeasible. This article outlines the issues and suggests employing a Markov chain Monte Carlo algorithm which allows the calculation of a classical estimator via the simulated EM algorithm, or a simulated Bayesian solution, in only O(T) computational operations, where T is the sample size. Furthermore, the theoretical dynamic properties of a time-varying-parameter EGARCH(1,1)-M are derived. We discuss these properties and apply the suggested Bayesian estimation to three major stock markets.
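The O(T) cost per likelihood evaluation comes from the variance recursion itself, which a short simulation makes concrete. The sketch below generates one EGARCH(1,1)-in-mean path with constant parameters; the parameter values are illustrative assumptions, not estimates from the article, and the time-varying-parameter extension is omitted.

```python
import numpy as np

def simulate_egarch_m(T, mu=0.0, lam=0.05, omega=-0.1, beta=0.95,
                      alpha=0.1, gamma=-0.05, seed=0):
    """Simulate an EGARCH(1,1)-in-mean path (constant parameters).

    log h_t = omega + beta*log h_{t-1} + alpha*(|z_{t-1}| - E|z|) + gamma*z_{t-1}
    r_t     = mu + lam*h_t + sqrt(h_t)*z_t,   z_t ~ N(0, 1)

    Parameter values are illustrative only.  Returns (returns, variances).
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(T)
    logh = np.empty(T)
    r = np.empty(T)
    logh[0] = omega / (1.0 - beta)  # start at the unconditional level
    for t in range(T):              # the O(T) recursion behind each likelihood call
        if t > 0:
            logh[t] = (omega + beta * logh[t - 1]
                       + alpha * (abs(z[t - 1]) - np.sqrt(2 / np.pi))
                       + gamma * z[t - 1])
        h = np.exp(logh[t])
        r[t] = mu + lam * h + np.sqrt(h) * z[t]
    return r, np.exp(logh)

r, h = simulate_egarch_m(1000)
```

Each pass through this loop is one likelihood evaluation; the point of the article's MCMC/simulated-EM approach is that the overall estimation cost stays proportional to this single pass.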
52.
A seasonal fractional ARIMA (ARFISMA) model with infinite-variance innovations is used in the analysis of seasonal long-memory time series with large fluctuations (heavy-tailed distributions). Two methods are proposed to estimate the parameters of the stable ARFISMA model: the empirical characteristic function (ECF) procedure developed by Knight and Yu [The empirical characteristic function in time series estimation. Econometric Theory. 2002;18:691–721], and a two-step method (TSM). The ECF method estimates all the parameters simultaneously, while the TSM applies in its first step the Markov chain Monte Carlo–Whittle approach introduced by Ndongo et al. [Estimation of long-memory parameters for seasonal fractional ARIMA with stable innovations. Stat Methodol. 2010;7:141–151], combined in its second step with the maximum likelihood estimation method developed by Alvarez and Olivares [Méthodes d'estimation pour des lois stables avec des applications en finance. Journal de la Société Française de Statistique. 2005;1(4):23–54]. Monte Carlo simulations are also used to evaluate the finite-sample performance of these estimation techniques.
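The ECF idea can be illustrated on the simplest piece of the problem: estimating the stable index alpha. For a symmetric alpha-stable law, |phi(t)| = exp(-|ct|^alpha), so log(-log|ECF(t)|) is linear in log t with slope alpha. The sketch below is a simplified regression version of this idea, not the full simultaneous ECF procedure of Knight and Yu; the grid of t values is an arbitrary choice.

```python
import numpy as np

def ecf_stable_alpha(x, t_grid=None):
    """Estimate the stable index alpha from the empirical characteristic
    function: regress log(-log|ECF(t)|) on log t; the slope is alpha.
    A simplified sketch, not the article's full ECF estimator."""
    if t_grid is None:
        t_grid = np.linspace(0.5, 2.0, 10)  # arbitrary illustrative grid
    # ECF(t) = (1/n) * sum_j exp(i * t * x_j)
    ecf = np.array([np.abs(np.exp(1j * t * x).mean()) for t in t_grid])
    y = np.log(-np.log(ecf))
    slope, _ = np.polyfit(np.log(t_grid), y, 1)
    return slope

rng = np.random.default_rng(1)
# Gaussian data are the alpha = 2 boundary case of the stable family.
alpha_hat = ecf_stable_alpha(rng.standard_normal(20000))
```

With heavy-tailed (alpha < 2) data the same regression recovers the smaller index; the full ECF procedure estimates the dependence parameters jointly rather than alpha alone.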
53.
This article analyzes a growing group of fixed T dynamic panel data estimators with a multifactor error structure. We use a unified notational approach to describe these estimators and discuss their properties in terms of deviations from an underlying set of basic assumptions. Furthermore, we consider the extendability of these estimators to practical situations that may frequently arise, such as their ability to accommodate unbalanced panels and common observed factors. Using a large-scale simulation exercise, we consider scenarios that remain largely unexplored in the literature, despite their great empirical relevance. In particular, we examine (i) the effect of the presence of weakly exogenous covariates, (ii) the effect of changing the magnitude of the correlation between the factor loadings of the dependent variable and those of the covariates, (iii) the impact of the number of moment conditions on bias and size for GMM estimators, and finally (iv) the effect of sample size. We apply each of these estimators to a crime application using a panel data set of local government authorities in New South Wales, Australia; we find that the results bear substantially different policy implications relative to those potentially derived from standard dynamic panel GMM estimators. Thus, our study may serve as a useful guide to practitioners who wish to allow for multiplicative sources of unobserved heterogeneity in their model.
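The key modelling ingredient, a multifactor error structure with loadings correlated across the outcome and the covariate, can be made concrete with a small data generator. The sketch below is an illustrative DGP in the spirit of the simulation design described, not the article's exact design; all parameter values and the two-factor choice are assumptions.

```python
import numpy as np

def simulate_factor_panel(N=200, T=6, rho=0.5, beta=1.0,
                          loading_corr=0.5, seed=9):
    """Generate a fixed-T dynamic panel with a multifactor error structure:

        y_it = rho*y_{i,t-1} + beta*x_it + lambda_i' f_t + e_it,

    where the covariate x also loads on the common factors, with loadings
    correlated (loading_corr) with those of y.  Illustrative DGP only."""
    rng = np.random.default_rng(seed)
    f = rng.standard_normal((T + 1, 2))            # two common factors
    lam_y = rng.standard_normal((N, 2))            # outcome loadings
    # covariate loadings correlated with outcome loadings
    lam_x = (loading_corr * lam_y
             + np.sqrt(1 - loading_corr ** 2) * rng.standard_normal((N, 2)))
    x = lam_x @ f.T + rng.standard_normal((N, T + 1))   # N x (T+1)
    y = np.zeros((N, T + 1))
    for t in range(1, T + 1):
        y[:, t] = (rho * y[:, t - 1] + beta * x[:, t]
                   + lam_y @ f[t] + rng.standard_normal(N))
    return y[:, 1:], x[:, 1:]

y, x = simulate_factor_panel()
```

Varying `loading_corr` reproduces scenario (ii) above; standard dynamic panel GMM ignores the `lam_y @ f[t]` term, which is the source of the bias the article documents.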
54.
Tests for equality of variances using independent samples are widely used in data analysis. Conover et al. [A comparative study of tests for homogeneity of variance, with applications to the outer continental shelf bidding data. Technometrics. 1981;23:351–361] won the Youden Prize by comparing 56 variations of popular tests for variance on the basis of robustness and power in 60 different scenarios. None of the tests they compared were robust and powerful for the skewed distributions they considered. This study looks at 12 variations they did not consider, and shows that 10 are robust for the skewed distributions they considered plus the lognormal distribution, which they did not study. Three of these 12 have clearly superior power for skewed distributions, and are competitive in terms of robustness and power for all of the distributions considered. They are recommended for general use based on robustness, power, and ease of application.
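The general flavour of the median-centred variants that perform well under skewness can be seen in the Brown–Forsythe construction: a one-way ANOVA on absolute deviations from each group median. This is a minimal numpy sketch of that classical test, shown here as background; it is not one of the article's 12 new variations.

```python
import numpy as np

def brown_forsythe(*samples):
    """Brown-Forsythe test for equality of variances: one-way ANOVA F
    statistic computed on z_ij = |x_ij - median_i|.  Median centring is
    what gives robustness under skewed distributions."""
    k = len(samples)
    z = [np.abs(s - np.median(s)) for s in samples]
    n = np.array([len(zi) for zi in z])
    N = n.sum()
    zbar_i = np.array([zi.mean() for zi in z])
    zbar = np.concatenate(z).mean()
    # between-group and within-group mean squares of the deviations
    ms_between = (n * (zbar_i - zbar) ** 2).sum() / (k - 1)
    ms_within = sum(((zi - m) ** 2).sum()
                    for zi, m in zip(z, zbar_i)) / (N - k)
    return ms_between / ms_within   # compare with F(k-1, N-k)

rng = np.random.default_rng(2)
a = rng.lognormal(sigma=0.5, size=200)        # skewed, unit scale
b = rng.lognormal(sigma=0.5, size=200)        # same variance as a
c = 3.0 * rng.lognormal(sigma=0.5, size=200)  # inflated variance
F_null = brown_forsythe(a, b)
F_alt = brown_forsythe(a, c)
```

Under equal variances the statistic stays near its F reference distribution even for lognormal data, while the scale-inflated comparison produces a much larger value.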
55.
In simulation studies for discriminant analysis, misclassification errors are often computed using the Monte Carlo method, by testing a classifier on large samples generated from known populations. Although large samples are expected to follow the underlying distributions closely, they may not do so in a small interval or region, and thus may lead to unexpected results. We demonstrate with an example that the LDA misclassification error computed via the Monte Carlo method may often be smaller than the Bayes error. We give a rigorous explanation and recommend a method to properly compute misclassification errors.
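The phenomenon is easy to reproduce in one dimension. Below, two unit-variance Gaussian classes give a known Bayes error of Phi(-1) ≈ 0.1587; a midpoint classifier trained on small samples is then scored on a large Monte Carlo test set. This is an illustrative toy setup (my own choice of means and sample sizes), not the article's example, but it shows Monte Carlo estimates dipping below the Bayes error purely through sampling noise.

```python
import numpy as np
from math import erf, sqrt

def phi(x):  # standard normal CDF
    return 0.5 * (1 + erf(x / sqrt(2)))

mu0, mu1, n_test = 0.0, 2.0, 100_000
bayes_error = phi(-(mu1 - mu0) / 2)  # ≈ 0.1587, the true lower bound

def mc_error(seed):
    """Train a midpoint (1-D LDA) rule on small samples, then estimate
    its misclassification rate on a large Monte Carlo test set."""
    r = np.random.default_rng(seed)
    m0 = r.normal(mu0, 1, 50).mean()
    m1 = r.normal(mu1, 1, 50).mean()
    cut = (m0 + m1) / 2
    x0 = r.normal(mu0, 1, n_test // 2)   # class 0 test points
    x1 = r.normal(mu1, 1, n_test // 2)   # class 1 test points
    return ((x0 > cut).sum() + (x1 <= cut).sum()) / n_test

errors = np.array([mc_error(s) for s in range(200)])
# The classifier's TRUE error can never beat the Bayes error, yet a
# nontrivial share of the Monte Carlo estimates fall below it.
share_below = (errors < bayes_error).mean()
```

Since the true error of any rule is at least the Bayes error, every estimate below 0.1587 is pure Monte Carlo noise, which is exactly the trap the article warns about.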
56.
In this paper, we investigate four existing and three new confidence interval estimators for the negative binomial proportion (i.e., the proportion under inverse/negative binomial sampling). An extensive and systematic comparative study among these confidence interval estimators through Monte Carlo simulations is presented. The performance of these confidence intervals is evaluated in terms of their coverage probabilities and expected interval widths. Our simulation studies suggest that the confidence interval estimator based on the saddlepoint approximation is more appealing for large coverage levels (e.g., nominal level ≤ 1%), whereas the score confidence interval estimator is more desirable for commonly used coverage levels (e.g., nominal level > 1%). We illustrate these confidence interval construction methods with a real data set from a maternal congenital heart disease study.
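The coverage-probability comparison the paper describes can be sketched for the simplest baseline: a Wald-type interval with the delta-method variance p^2(1-p)/r for p_hat = r/N under inverse binomial sampling. This baseline is my illustrative choice, not one of the seven estimators compared in the article; the same simulation loop would evaluate any of them.

```python
import numpy as np

def wald_ci_coverage(p, r, z=1.96, reps=20000, seed=4):
    """Monte Carlo coverage of a Wald-type interval for the success
    probability p under inverse (negative) binomial sampling: sample
    until r successes, observe total trials N, set p_hat = r / N.
    Interval: p_hat +/- z * sqrt(p_hat^2 * (1 - p_hat) / r)."""
    rng = np.random.default_rng(seed)
    # numpy's negative_binomial returns failures before the r-th
    # success, so the total number of trials is failures + r.
    N = rng.negative_binomial(r, p, size=reps) + r
    p_hat = r / N
    half = z * np.sqrt(p_hat ** 2 * (1 - p_hat) / r)
    covered = (p_hat - half <= p) & (p <= p_hat + half)
    return covered.mean()

cov = wald_ci_coverage(p=0.3, r=20)  # nominal 95% interval
```

Repeating this over a grid of (p, r) values, and recording mean interval width alongside coverage, reproduces the kind of comparison table the paper builds for its seven estimators.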
57.
The finite-sample size properties of momentum-threshold autoregressive (MTAR) asymmetric unit root tests are examined in the presence of level shifts under the null hypothesis. The original MTAR test using a fixed threshold is found to exhibit severe size distortion when a break in level occurs early in the sample period, leading to an increased probability of an incorrect inference of asymmetric stationarity. For later breaks the test is also shown to suffer from undersizing. In contrast, the use of consistent-threshold estimation results in a test that is relatively robust to level shifts.
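The fixed-threshold MTAR regression at the heart of these size experiments is short enough to write out. The sketch below implements the Enders–Granger MTAR equation with threshold tau = 0 and no augmentation lags (a simplification of the tests studied); the F statistic must be compared with the nonstandard Enders–Granger critical values, not the usual F table.

```python
import numpy as np

def mtar_stat(y, tau=0.0):
    """Fixed-threshold MTAR regression:
        dy_t = I_t*rho1*y_{t-1} + (1 - I_t)*rho2*y_{t-1} + e_t,
    with momentum indicator I_t = 1 if dy_{t-1} >= tau.
    Returns the F statistic for the joint null rho1 = rho2 = 0
    (nonstandard critical values apply)."""
    dy = np.diff(y)
    I = (dy[:-1] >= tau).astype(float)       # momentum indicator
    ylag = y[1:-1]
    X = np.column_stack([I * ylag, (1 - I) * ylag])
    d = dy[1:]
    beta, *_ = np.linalg.lstsq(X, d, rcond=None)
    resid = d - X @ beta
    rss1 = resid @ resid
    rss0 = d @ d                             # restricted model: rho1 = rho2 = 0
    return ((rss0 - rss1) / 2) / (rss1 / (len(d) - 2))

rng = np.random.default_rng(5)
y_rw = np.cumsum(rng.standard_normal(500))   # pure unit root: null is true
F = mtar_stat(y_rw)
```

Adding a level shift to `y_rw` early in the sample and re-running many replications is the size experiment the article reports; consistent-threshold estimation replaces the fixed `tau` with the value minimizing the residual sum of squares over a grid.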
58.
This paper presents an efficient Monte Carlo simulation scheme based on variance reduction methods to evaluate arithmetic-average Asian options in the context of the double Heston stochastic volatility model with jumps. This paper consists of two essential parts. The first part presents a new flexible stochastic volatility model, namely, the double Heston model with jumps. In the second part, by combining two variance reduction procedures via Monte Carlo simulation, we propose an efficient Monte Carlo simulation scheme for pricing arithmetic-average Asian options under the double Heston model with jumps. Numerical results illustrate the efficiency of our method.
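What "variance reduction for arithmetic Asian options" means in practice can be shown in the simplest setting: below, the underlying follows plain geometric Brownian motion (a stand-in for the article's double Heston model with jumps, which is not implemented here) and antithetic variates serve as one example of a variance reduction device; the article combines two such procedures.

```python
import numpy as np

def asian_call_mc(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, steps=50,
                  paths=20000, antithetic=True, seed=6):
    """Price an arithmetic-average Asian call by Monte Carlo under GBM.
    With antithetic=True each path is paired with its sign-flipped
    mirror, reducing the variance of the estimator.  Returns
    (price, standard error)."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    z = rng.standard_normal((paths, steps))
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * np.sqrt(dt)

    def payoff(zz):
        logS = np.log(S0) + np.cumsum(drift + vol * zz, axis=1)
        avg = np.exp(logS).mean(axis=1)      # arithmetic average price
        return np.maximum(avg - K, 0.0)

    pay = payoff(z)
    if antithetic:
        pay = 0.5 * (pay + payoff(-z))       # pair each path with its mirror
    disc = np.exp(-r * T)
    return disc * pay.mean(), disc * pay.std(ddof=1) / np.sqrt(paths)

price_av, se_av = asian_call_mc(antithetic=True)
price_plain, se_plain = asian_call_mc(antithetic=False)
```

Because the Asian payoff is monotone in the driving shocks, the mirrored payoffs are negatively correlated and the antithetic standard error comes in well below the plain one at the same path count.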
59.
We develop a novel computational methodology for Bayesian optimal sequential design for nonparametric regression. This computational methodology, which we call inhomogeneous evolutionary Markov chain Monte Carlo, combines ideas from simulated annealing, genetic or evolutionary algorithms, and Markov chain Monte Carlo. Our framework allows optimality criteria with general utility functions and general classes of priors for the underlying regression function. We illustrate the usefulness of our novel methodology with applications to experimental design for nonparametric function estimation using Gaussian process priors and free-knot cubic spline priors.
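One ingredient of the methodology, annealed stochastic search over designs, can be illustrated on a toy problem: choosing points in [0,1] to minimize the integrated posterior variance of a Gaussian process. This is a drastically simplified stand-in for the article's inhomogeneous evolutionary MCMC (no population of chains, no evolutionary moves, a fixed cooling schedule); kernel, length-scale, and utility are my illustrative assumptions.

```python
import numpy as np

def gp_post_var(design, grid, ell=0.2, noise=1e-4):
    """Posterior variance of a zero-mean GP with squared-exponential
    kernel, observed (with small noise) at `design`, averaged over `grid`."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(design, design) + noise * np.eye(len(design))
    Ks = k(grid, design)
    var = 1.0 - np.einsum('ij,ij->i', Ks @ np.linalg.inv(K), Ks)
    return var.mean()

def anneal_design(n=5, iters=300, seed=7):
    """Simulated-annealing search for an n-point design minimizing the
    integrated GP posterior variance (the utility in this toy example)."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0, 1, 101)
    d = rng.uniform(0, 1, n)
    cur = gp_post_var(d, grid)
    for i in range(iters):
        temp = 0.1 * (1 - i / iters) + 1e-3         # cooling schedule
        prop = np.clip(d + rng.normal(0, 0.05, n), 0, 1)
        val = gp_post_var(prop, grid)
        # accept improvements always, worse moves with annealed probability
        if val < cur or rng.random() < np.exp((cur - val) / temp):
            d, cur = prop, val
    return np.sort(d), cur

design, ivar = anneal_design()
```

The accepted designs spread out across the interval, as one would expect for a stationary kernel; the article's method runs such searches with MCMC-style moves over a population of candidate designs and general utility functions.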
60.
Previous literature has shown that the addition of an untested surplus lag can make Granger causality tests highly robust to stationary, nonstationary, long-memory, and structural-break processes in the forcing variables. This study extends this approach to the partial unit root framework by simulation. Results show good size and power. Therefore, the surplus-lag approach is also robust to partial unit root processes.
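The surplus-lag (Toda–Yamamoto style) construction is simple: fit the regression with p + surplus lags but apply the Wald test only to the first p lags of the forcing variable, leaving the surplus lag untested. The sketch below implements a single-equation version with numpy; the bivariate setup and parameter values are illustrative assumptions, not the study's design.

```python
import numpy as np

def surplus_lag_granger(y, x, p=2, surplus=1):
    """Wald test of 'x does not Granger-cause y' in a regression of y on
    a constant and p + surplus lags of both y and x, testing only the
    first p lags of x.  The untested surplus lag is what buys
    robustness to (partial) unit roots.  Asymptotically chi-square(p)."""
    m = p + surplus
    T = len(y)
    rows = [np.concatenate(([1.0],
                            y[t - m:t][::-1],   # lags 1..m of y
                            x[t - m:t][::-1]))  # lags 1..m of x
            for t in range(m, T)]
    X = np.array(rows)
    d = y[m:]
    beta, *_ = np.linalg.lstsq(X, d, rcond=None)
    resid = d - X @ beta
    s2 = resid @ resid / (len(d) - X.shape[1])
    XtX_inv = np.linalg.inv(X.T @ X)
    idx = np.arange(1 + m, 1 + m + p)           # positions of x lags 1..p
    b = beta[idx]
    V = s2 * XtX_inv[np.ix_(idx, idx)]
    return b @ np.linalg.solve(V, b)            # Wald statistic

rng = np.random.default_rng(8)
x = np.cumsum(rng.standard_normal(400))         # I(1) forcing variable
y = np.empty(400)
y[0] = 0.0
for t in range(1, 400):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.standard_normal()
W = surplus_lag_granger(y, x, p=2)
```

Here x genuinely causes y, so the statistic far exceeds the chi-square(2) critical value despite x being nonstationary; with the causal link removed, the same test keeps approximately correct size, which is the robustness property the study extends to partial unit roots.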
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号