1.
In research on how monetary policy affects the risk-taking of economic agents and thereby the fluctuation of the financial cycle, work based on the risk-taking channel is relatively mature. Unlike most prior studies, which focus on the stance monetary policy actually adopts, this paper examines, through the policy reaction function channel, the time-varying mechanism by which quantity-based and price-based monetary policy reaction functions affect financial-cycle fluctuations. Rolling-regression results show that, under both quantity-based and price-based policy rules, the sensitivity of monetary policy to credit fluctuations is the main driver of financial-cycle volatility. Under the price-based rule, when the financial cycle is observed from the credit perspective, the effects of the policy's credit sensitivity and asset-price sensitivity on the financial cycle differ little. Compared with the price-based rule, under the quantity-based rule the sensitivity of monetary policy to credit fluctuations has a more pronounced effect on financial-cycle volatility, and that effect tends to widen over time. The paper's contribution is to emphasize that monetary policy affects financial-cycle fluctuations through the policy reaction function channel, rather than the narrow risk-taking channel stressed in earlier work, and to build an econometric model that characterizes the time-varying mechanism of this channel in detail.
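The time-varying sensitivity described above can be traced with rolling-window regressions. The sketch below is a minimal, hypothetical illustration: the series, the window length, and the single-regressor setup are all made up for demonstration and are far simpler than the paper's econometric model.

```python
# Rolling-window OLS slope: a toy illustration of tracing a time-varying
# policy-reaction sensitivity. Data and window length are hypothetical.
from statistics import mean

def ols_slope(x, y):
    """Slope of y on x by ordinary least squares (single regressor)."""
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def rolling_slopes(x, y, window):
    """One OLS slope per window; the slope path is the 'time-varying sensitivity'."""
    return [ols_slope(x[i:i + window], y[i:i + window])
            for i in range(len(x) - window + 1)]

# Hypothetical data: credit growth (x) and a policy-rate response (y).
credit = [1.0, 1.2, 0.8, 1.5, 2.0, 2.2, 1.9, 2.5, 3.0, 2.8]
rate   = [0.5, 0.6, 0.4, 0.9, 1.3, 1.5, 1.2, 1.8, 2.2, 2.0]
slopes = rolling_slopes(credit, rate, window=5)
print(len(slopes))  # 6 windows for 10 observations and window = 5
```

A widening or shrinking of the slope path over successive windows is the kind of time variation the rolling-regression evidence refers to.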
2.
In quality-of-life research, researchers may ask respondents to rate both the importance and the satisfaction of various life domains (such as job and health) and use the importance ratings as weights when calculating overall, or global, life satisfaction. The practice of giving more important domains more weight, known as importance weighting, has not been without controversy. Several previous studies assessed importance weighting using the analytical approach of moderated regression. This study discusses major issues in how importance weighting has been assessed. Specifically, it highlights that studies of importance weighting that ignore statistical power are prone to type II error, i.e., failing to reject the null hypothesis of no weighting effect when that hypothesis is actually false. The sample size required for adequate statistical power to detect importance weighting appears larger than most previous studies could offer.
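The power argument can be made concrete with a back-of-the-envelope sample-size calculation. Moderated regression power is normally computed from an f² effect size and the noncentral F distribution; the sketch below substitutes the simpler Fisher-z approximation for detecting a correlation of size r, which makes the same qualitative point: small effects of the kind importance weighting produces need large samples. All numbers are illustrative.

```python
# Sample size for detecting a correlation r via the Fisher z approximation.
# A simplified stand-in for moderated-regression power analysis.
import math
from statistics import NormalDist

def n_for_correlation(r, alpha=0.05, power=0.80):
    """Smallest n with roughly the target power for a two-sided test of r."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(((z_alpha + z_beta) / math.atanh(r)) ** 2) + 3

print(n_for_correlation(0.10))  # 783: a "small" effect needs nearly 800 cases
print(n_for_correlation(0.30))  # a "medium" effect needs far fewer
```

An interaction (weighting) effect is typically small, so a study with, say, 150 respondents is badly underpowered to detect it, which is exactly the type II error risk the abstract describes.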
3.
4.
In this article, the employment characteristics of pre-industrial and industrial cohorts of deaf men and women are compared with each other, as well as with a cohort of non-disabled siblings. The aim is to determine the extent to which the employment patterns of deaf persons lined up with those of non-disabled people and to see how nineteenth-century industrialization processes influenced their employment opportunities. This article challenges the widely held assumption that the nineteenth century constituted a definitive break by arguing that the professional lives of deaf people were not necessarily better before industrialization. Moreover, I demonstrate that the development of deaf schools in the course of the nineteenth century opened a new range of career opportunities for deaf individuals.
5.
The financial stress index (FSI) is an important risk-management tool for quantifying financial vulnerabilities. This paper proposes a new framework, based on a hybrid classifier model, that integrates rough set theory (RST), the FSI, support vector regression (SVR), and a control chart to identify stressed periods. First, the RST method is applied to select variables; the outputs are used as input data for the FSI–SVR computation. The empirical analysis is based on the monthly FSI of the Federal Reserve Bank of Saint Louis from January 1992 to June 2011. A comparison study is performed between an FSI based on principal component analysis and the FSI–SVR. A control chart based on FSI–SVR and extreme value theory is proposed to identify the extremely stressed periods. The approach identified distinct stressed periods, including the internet bubble, the subprime crisis, and actual financial stress episodes, along with the calmest periods, agreeing with those given in Federal Reserve System reports.
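The control-chart idea can be sketched very simply. A full extreme-value-theory chart would fit a generalized Pareto distribution to the index's tail; the minimal stand-in below just flags observations at or above a high empirical quantile of the index. The series is synthetic and the quantile choice is an assumption for illustration.

```python
# Peaks-over-threshold style flagging of "extremely stressed" periods:
# flag observations at or above a high empirical quantile of the index.
def stress_flags(fsi, q=0.95):
    """Return (positions at/above the empirical q-quantile, the threshold)."""
    s = sorted(fsi)
    threshold = s[min(len(s) - 1, int(q * len(s)))]
    return [i for i, v in enumerate(fsi) if v >= threshold], threshold

# Synthetic monthly index: calm values with two stress spikes.
index = [0.1, 0.2, 0.15, 0.1, 2.5, 0.2, 0.1, 0.05, 3.1, 0.2,
         0.1, 0.15, 0.2, 0.1, 0.1, 0.05, 0.2, 0.1, 0.15, 0.1]
flagged, thr = stress_flags(index, q=0.90)
print(flagged)  # -> [4, 8], the positions of the two spikes
```

In the paper's framework, the monitored series would be the FSI–SVR output rather than a raw index, and the threshold would come from a fitted tail distribution rather than an empirical quantile.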
6.
As part of the celebration of the 40th anniversary of the Society for Risk Analysis and Risk Analysis: An International Journal, this essay reviews the 10 most important accomplishments of risk analysis from 1980 to 2010, outlines major accomplishments in three major categories from 2011 to 2019, discusses how editors circulate authors' accomplishments, and proposes 10 major risk-related challenges for 2020–2030. The authors conclude that the next decade will severely test the field of risk analysis.
7.
This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of the approach is to generate a large sample of flood events that can be regarded as a temporal extrapolation of observed floods. These are combined with cumulative flood impact indicators, such as building damages, to derive time series of damages for risk estimation. To this end, a multivariate modeling procedure that accounts for the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. From these, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to deepen the understanding of flood risk in a region and thereby support risk management by, for example, risk analysts, policymakers, or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately.
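Once a large sample of event damages is in hand, the two risk measures named above are simple summaries of that sample. The sketch below is a minimal, hypothetical illustration with made-up damage figures: the expected annual damage is the sample mean, and the low-probability impact is a high empirical quantile of the simulated annual damages.

```python
# Risk summaries from a sample of synthetic annual flood damages:
# expected annual damage (mean) and a high-quantile "low-probability" impact.
from statistics import mean

def risk_summary(annual_damages, p=0.99):
    """Expected annual damage and the damage exceeded with probability ~1 - p."""
    s = sorted(annual_damages)
    ead = mean(s)
    var_index = min(len(s) - 1, int(p * len(s)))
    return ead, s[var_index]

# Ten synthetic "years" of aggregated building damages (monetary units).
damages = [0, 0, 5, 0, 12, 0, 0, 80, 3, 0]
ead, extreme = risk_summary(damages, p=0.90)
print(ead, extreme)  # -> 10.0 80
```

A real application would use a far larger event sample (the temporal extrapolation the paper describes) so that the high quantile is estimated from many tail observations, not one.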
8.
Emergency material allocation is an important part of post-disaster emergency logistics and is critical for improving rescue effectiveness and reducing disaster losses. However, the traditional single-period allocation model often causes local surpluses or shortages and high cost, and prevents the system from achieving an equitable or optimal multiperiod allocation. To achieve equitable allocation of emergency materials when supply falls seriously short of victims' demand, this article introduces a multiperiod model for allocating emergency materials to multiple affected locations (using an exponential utility function to reflect the disutility loss due to material shortfalls), and illustrates the relationship between the equity of allocations and the cost of emergency response. Numerical examples demonstrate both the feasibility and the usefulness of the proposed model for achieving multiperiod equitable allocation of emergency material among multiple disaster locations. The results indicate that introducing a nonlinear utility function to reflect the disutility of large shortfalls can make the material allocation fairer and minimize large losses due to shortfalls. We found that achieving equity has a significant but not unreasonable impact on emergency costs. We also illustrate that using differing utility functions for different types of materials adds an important dimension of flexibility.
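The equity mechanism of an exponential disutility function can be sketched in a few lines. Because the disutility of a shortfall grows sharply, the marginal benefit of one more unit is always largest at the worst-off location, so a greedy allocator naturally equalizes shortfalls. This is a single-period, hypothetical simplification of the paper's multiperiod model; the demands, supply, and scale parameter k are invented for illustration.

```python
# Equity-oriented allocation under shortage: hand out supply one unit at a
# time to the location with the largest marginal reduction in exponential
# disutility. All parameters are illustrative.
import math

def disutility(shortfall, k=0.5):
    """Exponential disutility of unmet demand; grows sharply with the shortfall."""
    return math.exp(k * shortfall) - 1.0

def allocate(demands, supply, k=0.5):
    alloc = [0] * len(demands)
    for _ in range(supply):
        # Marginal gain of one more unit at each location (0 once demand is met).
        gains = [disutility(d - a, k) - disutility(d - a - 1, k) if a < d else 0.0
                 for d, a in zip(demands, alloc)]
        best = max(range(len(demands)), key=gains.__getitem__)
        alloc[best] += 1
    return alloc

demands = [10, 6, 4]
alloc = allocate(demands, supply=12)
shortfalls = [d - a for d, a in zip(demands, alloc)]
print(alloc, shortfalls)
```

Note how the remaining shortfalls end up nearly equal across locations: that is the fairness effect of the convex disutility the abstract describes, in contrast to a linear objective, which is indifferent to where the shortage falls.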
9.
In studies with recurrent event endpoints, misspecified assumptions about event rates or dispersion can lead to underpowered trials or overexposure of patients. Specifying overdispersion is often a particular problem, as it is usually not reported in clinical trial publications. Changing event rates over the years have been described for some diseases, adding to the uncertainty in planning. To mitigate the risks of inadequate sample sizes, internal pilot study designs have been proposed, with a preference for blinded sample size reestimation procedures, as they generally do not affect the type I error rate and maintain trial integrity. Blinded sample size reestimation procedures are available for trials with recurrent events as endpoints. However, the variance in the reestimated sample size can be considerable, in particular with early sample size reviews. Motivated by a randomized controlled trial in paediatric multiple sclerosis, a rare neurological condition in children, we apply the concept of blinded continuous monitoring of information, which is known to reduce the variance in the resulting sample size. Assuming negative binomial distributions for the counts of recurrent relapses, we derive information criteria and propose blinded continuous monitoring procedures. Their operating characteristics are assessed in Monte Carlo trial simulations, demonstrating favourable properties with regard to type I error rate, power, and stopping time, i.e., sample size.
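The blinded quantities such a monitoring rule tracks can be estimated without unblinding treatment labels. The sketch below shows a method-of-moments estimate of the pooled event rate and the overdispersion parameter φ (under Var = μ + φμ²) from relapse counts with arms mixed together; it is an illustrative simplification, not the paper's information criterion, and the counts are made up.

```python
# Blinded (pooled, labels hidden) method-of-moments estimates of the overall
# relapse rate and overdispersion for negative binomial counts.
from statistics import mean, pvariance

def blinded_nb_estimates(counts, followup_years=1.0):
    """Pooled rate and overdispersion phi under Var = mu + phi * mu**2."""
    m = mean(counts)
    v = pvariance(counts)
    rate = m / followup_years
    phi = max(0.0, (v - m) / m ** 2)   # truncate at 0 (the Poisson case)
    return rate, phi

relapses = [0, 0, 1, 3, 6]          # pooled annual relapse counts, both arms mixed
rate, phi = blinded_nb_estimates(relapses)
print(rate, phi)  # -> 2.0 0.8
```

In a continuous-monitoring design, estimates like these would be updated as data accrue and the trial would stop recruiting once the accumulated information reaches the level needed for the target power.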
10.