31.
There is an emerging consensus in empirical finance that realized volatility series typically display long range dependence with a memory parameter (d) around 0.4 (Andersen et al., 2001; Martens et al., 2004). The present article provides some illustrative analysis of how long memory may arise from the accumulative process underlying realized volatility. The article also uses results in Lieberman and Phillips (2004, 2005) to refine statistical inference about d by higher order theory. Standard asymptotic theory has an O(n^(-1/2)) error rate for error rejection probabilities, and the theory used here refines the approximation to an error rate of o(n^(-1/2)). The new formula is independent of unknown parameters and simple to calculate. The method is applied to test whether the reported long memory parameter estimates of Andersen et al. (2001) and Martens et al. (2004) differ significantly from the lower boundary (d = 0.5) of nonstationary long memory, and generally confirms earlier findings.
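The abstract's refined inference builds on higher order theory, but the baseline object is a semiparametric estimate of the memory parameter d. As an illustrative sketch only (not the authors' refinement), the following simulates a fractionally integrated ARFIMA(0, d, 0) series and recovers d with the standard Geweke–Porter–Hudak log-periodogram regression; the sample size, bandwidth m, and seed are arbitrary choices for the example.

```python
import numpy as np

def simulate_arfima(n, d, rng):
    """Simulate ARFIMA(0, d, 0) via a truncated MA(inf) fractional filter."""
    k = np.arange(1, n)
    psi = np.cumprod((k - 1 + d) / k)        # psi_k = Gamma(k+d) / (Gamma(d) k!)
    psi = np.concatenate(([1.0], psi))
    eps = rng.standard_normal(2 * n)
    # convolve innovations with the fractional MA weights; keep n "warmed-up" points
    return np.convolve(eps, psi, mode="full")[n:2 * n]

def gph_estimate(x, m):
    """Geweke-Porter-Hudak estimate of d: regress log periodogram ordinates
    on log(4 sin^2(lambda/2)) over the first m Fourier frequencies."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    reg = np.log(4 * np.sin(lam / 2) ** 2)
    slope = np.polyfit(reg, np.log(I), 1)[0]  # OLS slope equals -d
    return -slope

rng = np.random.default_rng(0)
x = simulate_arfima(8192, 0.4, rng)
d_hat = gph_estimate(x, m=128)
```

With d = 0.4 the series is stationary but strongly persistent; the same estimator applied to realized volatility series is what produces the d ≈ 0.4 values the abstract cites.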
32.
Summary. We develop a general methodology for tilting time series data. Attention is focused on a large class of regression problems, where errors are expressed through autoregressive processes. The class has a range of important applications and in the context of our work may be used to illustrate the application of tilting methods to interval estimation in regression, robust statistical inference and estimation subject to constraints. The method can be viewed as 'empirical likelihood with nuisance parameters'.
33.
Expectations, Capital Gains, and Income
A theoretical framework for the measurement of income under uncertainty is developed that addresses some long-standing controversies about the treatment of capital gains. The consequences for economic analysis and policy making are potentially serious, because the treatment of capital gains can significantly affect some major macroeconomic aggregates, including national income and savings, balance of payments deficits, government deficits, and depreciation. (JEL O47, P44, Q32)
34.
Missing data, and the bias they can cause, are an almost ever‐present concern in clinical trials. The last observation carried forward (LOCF) approach has been frequently utilized to handle missing data in clinical trials, and is often specified in conjunction with analysis of variance (LOCF ANOVA) for the primary analysis. Considerable advances in statistical methodology, and in our ability to implement these methods, have been made in recent years. Likelihood‐based, mixed‐effects model approaches implemented under the missing at random (MAR) framework are now easy to implement, and are commonly used to analyse clinical trial data. Furthermore, such approaches are more robust to the biases from missing data, and provide better control of Type I and Type II errors than LOCF ANOVA. Empirical research and analytic proof have demonstrated that the behaviour of LOCF is uncertain, and in many situations it has not been conservative. Using LOCF as a composite measure of safety, tolerability and efficacy can lead to erroneous conclusions regarding the effectiveness of a drug. This approach also violates the fundamental basis of statistics as it involves testing an outcome that is not a physical parameter of the population, but rather a quantity that can be influenced by investigator behaviour, trial design, etc. Practice should shift away from using LOCF ANOVA as the primary analysis and focus on likelihood‐based, mixed‐effects model approaches developed under the MAR framework, with missing not at random methods used to assess robustness of the primary analysis. Copyright © 2004 John Wiley & Sons, Ltd.
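To make the critique concrete, it helps to see what LOCF actually does mechanically: each missing visit is filled with the subject's most recent observed value, freezing the trajectory at dropout. The sketch below (toy data, not from any trial) illustrates the imputation step only; as the abstract argues, this should not be mistaken for an adequate primary analysis.

```python
import numpy as np

def locf(visits):
    """Last observation carried forward: replace each NaN with the most
    recent observed value in the same row (leading NaNs stay missing)."""
    out = np.array(visits, dtype=float)
    for row in out:
        last = np.nan
        for j, v in enumerate(row):
            if np.isnan(v):
                row[j] = last            # carry the previous observation forward
            else:
                last = v
    return out

# one row per subject, one column per scheduled visit (hypothetical scores)
data = [[10.0,  9.5, np.nan, np.nan],   # subject dropped out after visit 2
        [12.0, np.nan, 11.0, np.nan]]   # subject missed visits 2 and 4
imputed = locf(data)
# imputed -> [[10.0, 9.5, 9.5, 9.5], [12.0, 12.0, 11.0, 11.0]]
```

Subject 1's score is assumed unchanged after dropout, which is exactly the implicit (and often anti-conservative) assumption the abstract objects to; a likelihood-based mixed model instead uses all observed data under MAR without inventing values.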
35.
Summary. We discuss the inversion of the gas profiles (ozone, NO3, NO2, aerosols and neutral density) in the upper atmosphere from the spectral occultation measurements. The data are produced by the 'Global ozone monitoring of occultation of stars' instrument on board the Envisat satellite that was launched in March 2002. The instrument measures the attenuation of light spectra at various horizontal paths from about 100 km down to 10–20 km. The new feature is that these data allow the inversion of the gas concentration height profiles. A short introduction is given to the present operational data management procedure with examples of the first real data inversion. Several solution options for a more comprehensive statistical inversion are presented. A direct inversion leads to a non-linear model with hundreds of parameters to be estimated. The problem is solved with an adaptive single-step Markov chain Monte Carlo algorithm. Another approach is to divide the problem into several non-linear smaller dimensional problems, to run parallel adaptive Markov chain Monte Carlo chains for them and to solve the gas profiles in repetitive linear steps. The effect of grid size is discussed, and we present how the prior regularization takes the grid size into account in a way that effectively leads to a grid-independent inversion.
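The paper's adaptive, high-dimensional samplers are specialized, but the MCMC engine underneath is the standard Metropolis step. As a hedged, minimal sketch (a non-adaptive random-walk Metropolis on a one-dimensional toy posterior, nothing like the hundreds-of-parameter inversion model), the mechanics look like this:

```python
import numpy as np

def metropolis(logpost, x0, n_steps, step, rng):
    """Random-walk Metropolis: Gaussian proposals, accepted with
    probability min(1, posterior ratio) computed on the log scale."""
    x, lp = x0, logpost(x0)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = logpost(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop        # accept the proposal
        chain[i] = x                     # otherwise repeat the current state
    return chain

rng = np.random.default_rng(1)
# toy target: standard normal log-density, up to an additive constant
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 20000, 1.0, rng)
```

Adaptive variants, as used in the paper, tune the proposal scale (or covariance) from the chain's own history; splitting the inversion into smaller blocks with parallel chains is a way to keep each such proposal tractable.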
36.
With its roots in American pragmatism, symbolic interactionism has created a distinctive perspective and produced numerous important contributions and now offers significant prospects for the future. In this article, I review my intellectual journey with this perspective over forty years. This journey was initiated within the American society, sociology, and symbolic interaction of circa 1960. I note many of the contributions made by interactionists since that time, with particular focus on those who have contributed to the study of social organization and social process. I offer an agenda for the future based on currently underdeveloped areas that have potential. These are inequality orders, institutional analysis, collective action across space and time, and the integration of temporal and spatial orders. The article concludes with calls for further efforts at cross‐perspective dialogues, more attention to feminist scholars, and an elaborated critical pragmatism.
40.
One hundred and sixty-four elite union leaders in the United States completed a survey to assess the condition of the American labor movement and the factors affecting it. The respondents included high-level international union and state federation officers, central labor presidents, and 58 shop stewards. This study compares the responses of these labor officials and finds that they have similar, negative assessments of the condition of the American labor movement. The study also finds that these leaders agree that the most important factors affecting the labor movement are collective bargaining rights, union leadership, union member solidarity, and the NLRB.