941.
In April 2013, all of the major academic publishing houses moved thousands of journal titles to a novel hybrid model, under which authors of accepted papers can choose between an expensive open access (OA) track and the traditional track available only to subscribers. This paper argues that authors might now use their publication strategy as a quality-signaling device. The imperfect-information game between authors and readers admits several types of Perfect Bayesian Equilibria, including a separating equilibrium in which only authors of high-quality papers are driven toward the open access track. The publishing house should choose an open-access publication fee that supports the emergence of the highest-return equilibrium. Journal structures will then evolve over time according to the journals’ accessibility and quality profiles.
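As an illustration of the separating-equilibrium logic described in the abstract, the following sketch checks the incentive-compatibility conditions for two author types; the signalling benefits and fee values are purely hypothetical and do not come from the paper.

    # Minimal sketch of a separating equilibrium in the OA signalling game.
    # All payoff numbers below are hypothetical illustrations.

    def separating_equilibrium_exists(benefit_high, benefit_low, oa_fee):
        """A separating equilibrium requires that high-quality authors prefer
        to pay the OA fee (benefit_high >= oa_fee) while low-quality authors
        prefer the subscription track (benefit_low < oa_fee)."""
        return benefit_high >= oa_fee and benefit_low < oa_fee

    # Readers reward an OA signal more when the paper is of high quality.
    benefit_high, benefit_low = 3000.0, 1200.0   # hypothetical signalling benefits
    for fee in (1000.0, 2000.0, 3500.0):
        print(fee, separating_equilibrium_exists(benefit_high, benefit_low, fee))
    # Only the intermediate fee separates the two author types, which is why
    # the fee choice matters for which equilibrium emerges.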
942.
Robust parameter designs (RPDs) enable the experimenter to discover how to modify the design of a product so as to minimize the effect of variation from noise sources. The aim of this article is to show how the required experimental effort can be reduced under a modified central composite design (MCCD). We propose an extended scaled prediction variance (ESPV) measure for evaluating RPDs on an MCCD. Using this measure, we show how to assess the error or bias associated with estimating the model parameters, and we suggest the values of α recommended for the MCCD under minimum ESPV.
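The paper's ESPV measure is not reproduced here; the sketch below only illustrates the standard scaled prediction variance that it extends, evaluated on a two-factor central composite design.

    # Scaled prediction variance SPV(x) = N * f(x)' (F'F)^{-1} f(x) for a
    # two-factor central composite design and a full second-order model.
    import numpy as np
    from itertools import product

    def ccd_points(alpha, n_center=3):
        factorial = np.array(list(product([-1.0, 1.0], repeat=2)))
        axial = np.array([[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]])
        center = np.zeros((n_center, 2))
        return np.vstack([factorial, axial, center])

    def model_matrix(x):
        # second-order model terms: 1, x1, x2, x1*x2, x1^2, x2^2
        x1, x2 = x[:, 0], x[:, 1]
        return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1**2, x2**2])

    def spv(design, point):
        F = model_matrix(design)
        f = model_matrix(point[None, :]).ravel()
        return len(design) * f @ np.linalg.inv(F.T @ F) @ f

    design = ccd_points(alpha=np.sqrt(2))        # rotatable axial distance
    print(spv(design, np.array([0.0, 0.0])))     # SPV at the design centre
    print(spv(design, np.array([1.0, 1.0])))     # SPV at a factorial corner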
943.
We investigate the exact coverage and expected length properties of the model averaged tail area (MATA) confidence interval proposed by Turek and Fletcher (CSDA, 2012) in the context of two nested, normal linear regression models. The simpler model is obtained by applying a single linear constraint to the regression parameter vector of the full model. For a given length of the response vector and a given nominal coverage of the MATA confidence interval, we consider all possible models of this type and all possible true parameter values, together with a wide class of design matrices and parameters of interest. Our results show that, while not ideal, MATA confidence intervals perform surprisingly well in our regression scenario, provided that we place the minimum weight, within the class of weights we consider, on the simpler model.
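A minimal sketch of a MATA-type interval for two candidate models, following the tail-area construction described in the abstract; the estimates, standard errors, degrees of freedom and weights below are illustrative assumptions, not values from the paper.

    # Model-averaged tail area interval: the endpoints L and U solve
    #   sum_k w_k * P(T_dfk > (est_k - L)/se_k) = alpha/2        (lower)
    #   sum_k w_k * P(T_dfk > (est_k - U)/se_k) = 1 - alpha/2    (upper)
    import numpy as np
    from scipy.stats import t
    from scipy.optimize import brentq

    def mata_interval(est, se, df, w, alpha=0.05):
        est, se, df, w = map(np.asarray, (est, se, df, w))

        def tail(theta):
            return np.sum(w * t.sf((est - theta) / se, df))

        span = 10 * np.max(se)
        lo = brentq(lambda th: tail(th) - alpha / 2,
                    est.min() - span, est.max() + span)
        hi = brentq(lambda th: tail(th) - (1 - alpha / 2),
                    est.min() - span, est.max() + span)
        return lo, hi

    # Full model vs. simpler (constrained) model estimates of the parameter.
    print(mata_interval(est=[1.8, 1.2], se=[0.6, 0.4], df=[20, 21], w=[0.6, 0.4]))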
944.
We present the parallel and interacting stochastic approximation annealing (PISAA) algorithm, a stochastic simulation procedure for global optimisation that extends and improves stochastic approximation annealing (SAA) by using population Monte Carlo ideas. The efficiency of the standard SAA algorithm depends crucially on its self-adjusting mechanism, which presents stability issues in high-dimensional or rugged optimisation problems. The proposed algorithm simulates a population of SAA chains that interact with each other in a manner that significantly improves the stability of the self-adjusting mechanism and the search for the global optimum in the sampling space, while inheriting the desirable convergence properties of SAA when a square-root cooling schedule is used. It can be implemented in parallel computing environments to mitigate the computational overhead. As a result, PISAA can address complex optimisation problems that SAA would find difficult to address satisfactorily. We demonstrate the good performance of the proposed algorithm on challenging applications, including Bayesian network learning and protein folding. Our numerical comparisons suggest that PISAA outperforms simulated annealing, stochastic approximation annealing, and annealing evolutionary stochastic approximation Monte Carlo.
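A heavily simplified sketch of the population-of-chains idea with a square-root cooling schedule follows; it uses plain Metropolis moves and a crude copy-the-best interaction, and omits the stochastic-approximation self-adjusting mechanism that SAA and PISAA are built on.

    # Population of annealing chains with square-root cooling on a rugged
    # multimodal test function (Rastrigin-style). Not the PISAA algorithm.
    import numpy as np

    rng = np.random.default_rng(0)

    def rugged(x):
        return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)

    def population_annealing(n_chains=8, dim=5, n_iter=20000, t0=10.0):
        chains = rng.uniform(-5, 5, size=(n_chains, dim))
        energies = np.array([rugged(c) for c in chains])
        best = energies.min()
        for it in range(1, n_iter + 1):
            temp = t0 / np.sqrt(it)                      # square-root cooling
            for k in range(n_chains):
                prop = chains[k] + rng.normal(scale=0.3, size=dim)
                d = rugged(prop) - energies[k]
                if d < 0 or rng.random() < np.exp(-d / temp):   # Metropolis step
                    chains[k], energies[k] = prop, energies[k] + d
            # crude "interaction": occasionally copy the best chain's state
            if it % 500 == 0:
                worst, best_k = energies.argmax(), energies.argmin()
                chains[worst], energies[worst] = chains[best_k].copy(), energies[best_k]
            best = min(best, energies.min())
        return best

    print(population_annealing())   # should move toward the global minimum at 0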
945.
946.
Crime and disease surveillance commonly rely on space-time clustering methods to identify emerging patterns. The goal is to detect spatio-temporal clusters as soon as possible after their occurrence while controlling the rate of false alarms. With this in mind, a spatio-temporal multiple cluster detection method was developed as an extension of a previous proposal based on a spatial version of the Shiryaev–Roberts statistic. Besides being able to detect multiple clusters, the method has fewer input parameters than the previous proposal, making it more intuitive for practitioners to use. To evaluate the new methodology, a simulation study is performed over several scenarios, highlighting many advantages of the proposed method. Finally, we present a case study on a crime data set from Belo Horizonte, Brazil.
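The sketch below implements only the classical, purely temporal Shiryaev–Roberts recursion for a change in a Poisson rate; the spatial and spatio-temporal extensions discussed in the abstract are not reproduced, and all rates and thresholds are illustrative.

    # Shiryaev-Roberts recursion: R_n = (1 + R_{n-1}) * L_n, with likelihood
    # ratio L_n = f(x_n | rate1) / f(x_n | rate0); alarm when R_n >= threshold.
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(1)

    def shiryaev_roberts(counts, rate0, rate1, threshold):
        r = 0.0
        for n, x in enumerate(counts, start=1):
            lr = poisson.pmf(x, rate1) / poisson.pmf(x, rate0)
            r = (1.0 + r) * lr
            if r >= threshold:
                return n          # first alarm time
        return None

    # Weekly case counts: baseline rate 5, jumping to 9 from week 40 onward.
    counts = np.concatenate([rng.poisson(5, 40), rng.poisson(9, 20)])
    print(shiryaev_roberts(counts, rate0=5, rate1=9, threshold=1000))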
947.
Residual marked empirical process-based tests are commonly used in regression models. However, they suffer from data sparseness in high-dimensional space when there are many covariates. This paper has three purposes. First, we suggest a partial dimension reduction, adaptive-to-model testing procedure that can be omnibus against general global alternative models while fully using the dimension reduction structure under the null hypothesis. This is because the procedure automatically adapts to the null and alternative models, and thus largely overcomes the dimensionality problem. Second, to achieve this goal, we propose a ridge-type eigenvalue ratio estimate to automatically determine the number of linear combinations of the covariates under the null and alternative hypotheses. Third, a Monte Carlo approximation to the sampling null distribution is suggested. Unlike existing bootstrap approximation methods, it gives an approximation that is as close to the sampling null distribution as possible by fully utilising the dimension reduction model structure under the null model. Simulation studies and real data analysis are then conducted to illustrate the performance of the new test and compare it with existing tests.
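One common form of a ridge-type eigenvalue-ratio criterion is sketched below; the exact criterion and the choice of ridge constant in the paper may differ, so treat this only as an illustration of the idea.

    # Ridge-type eigenvalue-ratio criterion for choosing the number of
    # informative linear combinations (the structural dimension).
    import numpy as np

    def ridge_eigenvalue_ratio(eigvals, c):
        """Given eigenvalues sorted in decreasing order and a small ridge
        constant c, pick the index that minimises (lam_{i+1}+c)/(lam_i+c)."""
        lam = np.sort(np.asarray(eigvals))[::-1]
        ratios = (lam[1:] + c) / (lam[:-1] + c)
        return int(np.argmin(ratios)) + 1    # estimated dimension

    # Illustrative spectrum: two dominant directions plus noise eigenvalues.
    eigvals = [4.1, 2.7, 0.08, 0.05, 0.03, 0.02]
    print(ridge_eigenvalue_ratio(eigvals, c=0.1))   # -> 2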
948.
\(\alpha\)-Stable distributions are a family of probability distributions found to be suitable for modelling many complex processes and phenomena in several research fields, such as medicine, physics, finance and networking, among others. However, the lack of closed-form expressions makes their evaluation analytically intractable, and alternative approaches are computationally expensive. Existing numerical programs are not fast enough for certain applications and do not exploit the parallel power of general-purpose graphics processing units. In this paper, we develop novel parallel algorithms for the probability density function and cumulative distribution function (including a parallel Gauss–Kronrod quadrature), the quantile function, the random number generator and maximum likelihood estimation of \(\alpha\)-stable distributions using OpenCL, achieving significant speedups and precision in all cases. Thanks to the use of OpenCL, we also evaluate our library on different GPU architectures.
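As a small CPU reference (not the OpenCL library described in the abstract), SciPy's levy_stable distribution can reproduce, on a small scale, the quantities the paper parallelises:

    # CPU reference for alpha-stable pdf, cdf and random variates via SciPy;
    # parameter values are illustrative.
    import numpy as np
    from scipy.stats import levy_stable

    alpha, beta = 1.7, 0.5           # stability and skewness parameters
    x = np.linspace(-5, 5, 11)

    print(levy_stable.pdf(x, alpha, beta))     # density via numerical integration
    print(levy_stable.cdf(x, alpha, beta))     # distribution function
    print(levy_stable.rvs(alpha, beta, size=5, random_state=0))  # random variates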
949.
This study extends the affine Nelson–Siegel model by introducing a time-varying volatility component in the observation equation of the yield curve, modeled as a standard EGARCH process. The model is cast in a state-space framework and empirically compared with the standard affine and dynamic Nelson–Siegel models in terms of in-sample fit and out-of-sample forecast accuracy. The affine-based extended model that accounts for time-varying volatility outperforms the other models in fitting the yield curve and produces relatively more accurate 6- and 12-month-ahead forecasts, while the standard affine model yields more precise forecasts at very short horizons. The study concludes that the standard and affine Nelson–Siegel models have higher forecasting capability than their EGARCH-based counterparts at short forecast horizons (i.e., 1 month), whereas the EGARCH-based extended models perform very well at medium and longer horizons.
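The sketch below covers only the static Nelson–Siegel building block (factor loadings and a cross-sectional least-squares fit of one curve); the affine state-space and EGARCH volatility extensions are not reproduced, and the yields are made up for illustration.

    # Nelson-Siegel loadings and a cross-sectional least-squares fit.
    import numpy as np

    def ns_loadings(tau, lam=0.0609):
        """Level, slope and curvature loadings of the Nelson-Siegel curve."""
        x = lam * tau
        slope = (1 - np.exp(-x)) / x
        return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

    maturities = np.array([3, 6, 12, 24, 36, 60, 120], dtype=float)  # months
    yields = np.array([1.9, 2.1, 2.4, 2.9, 3.2, 3.6, 4.0])           # illustrative

    X = ns_loadings(maturities)
    beta, *_ = np.linalg.lstsq(X, yields, rcond=None)   # [level, slope, curvature]
    print(beta)
    print(X @ beta)    # fitted yields at the observed maturities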
950.
Optimum experimental design theory has recently been extended to parameter estimation in copula models. These models allow one to gain flexibility by splitting the model parameter set into marginal and dependence parameters. However, this separation also raises the natural issue of estimating only a subset of all model parameters. In this work, we treat this problem by applying \(D_s\)-optimality to copula models. First, we provide an extension of the corresponding equivalence theory. Then, we analyze a wide range of flexible copula models to highlight the usefulness of \(D_s\)-optimality in many possible scenarios. Finally, we discuss how use of the introduced design criterion also relates to the more general issues of copula selection and optimal design for model discrimination.
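The \(D_s\)-criterion itself can be illustrated on a simple polynomial regression design rather than a copula model: with the information matrix partitioned so that the s parameters of interest come first, the criterion value is det(M)/det(M_22), where M_22 is the nuisance-parameter block, and a larger value is better. The designs below are illustrative.

    # D_s-criterion for a quadratic regression f(x) = (1, x, x^2), where the
    # first s parameters are of interest and the rest are nuisance parameters.
    import numpy as np

    def ds_criterion(design_points, s):
        F = np.column_stack([np.ones_like(design_points),
                             design_points, design_points**2])
        M = F.T @ F / len(design_points)
        M22 = M[s:, s:]
        return np.linalg.det(M) / np.linalg.det(M22)

    # Compare two candidate designs on [-1, 1] for estimating the s = 2
    # parameters (intercept and slope) in the presence of the quadratic term.
    design_a = np.array([-1.0, -1.0, 0.0, 1.0, 1.0])
    design_b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
    print(ds_criterion(design_a, s=2), ds_criterion(design_b, s=2))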