Search results: 9,303 records.
1.
Among studies of how monetary policy affects economic agents' risk-taking and thereby the mechanism of financial-cycle fluctuations, research based on the risk-taking channel is relatively mature. Unlike prior work, which mostly focuses on the stance monetary policy actually takes, this paper uses the policy reaction function channel to examine the time-varying mechanism by which quantity-based and price-based monetary policy reaction functions affect financial-cycle fluctuations. Rolling-regression results show that, under both quantity-based and price-based policy rules, the sensitivity of the monetary policy response to credit fluctuations is the main driver of financial-cycle volatility; under the price-based rule, however, when the financial cycle is observed from the credit perspective, the effects of the policy's credit sensitivity and its asset-price sensitivity on the financial cycle differ little. Compared with the price-based rule, the sensitivity of the policy response to credit fluctuations has a more pronounced effect on financial-cycle fluctuations under the quantity-based rule, and this effect shows some tendency to grow over time. The paper's contributions are to emphasize that monetary policy affects financial-cycle fluctuations through the policy reaction function channel, rather than the narrow risk-taking channel stressed in earlier research, and to construct an econometric model that characterizes in detail the time-varying mechanism of this channel.
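The rolling-regression exercise behind these results can be sketched in a few lines. The data-generating process, window length, and variable names below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def rolling_ols(y, X, window):
    """Estimate OLS coefficients over a rolling window.

    Returns an array of shape (n - window + 1, k): one coefficient
    vector per window, so time variation in the response coefficient
    can be traced."""
    n, k = X.shape
    betas = np.empty((n - window + 1, k))
    for t in range(n - window + 1):
        betas[t], *_ = np.linalg.lstsq(X[t:t + window], y[t:t + window],
                                       rcond=None)
    return betas

# Hypothetical illustration: a policy rate responding to credit growth
# with a sensitivity that drifts upward over time.
rng = np.random.default_rng(0)
n = 200
credit = rng.normal(size=n)
sensitivity = np.linspace(0.2, 0.8, n)          # time-varying coefficient
rate = sensitivity * credit + rng.normal(scale=0.1, size=n)
X = np.column_stack([np.ones(n), credit])
paths = rolling_ols(rate, X, window=60)
```

The estimated credit-sensitivity path (`paths[:, 1]`) then rises over the sample, mirroring the kind of time-varying effect the paper documents.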
2.
This article highlights three dimensions to understanding children's well-being during and after parental imprisonment which have not been fully explored in current research. A consideration of 'time' reveals the importance of children's past experiences and their anticipated futures. A focus on 'space' highlights the impact of new or altered environmental dynamics. A study of 'agency' illuminates how children cope within structural, material and social confines which intensify vulnerability and dependency. This integrated perspective reveals important differences in individual children's experiences and commonalities in broader systemic and social constraints on prisoners' children. The paper analyses data from a prospective longitudinal study of 35 prisoners' children during and after their (step) father's imprisonment to illustrate the arguments.
3.
In quality of life research, researchers may ask respondents to rate both the importance and the satisfaction of various life domains (such as job and health) and use the importance ratings as weights when calculating overall, or global, life satisfaction. The practice of giving more important domains more weight, known as importance weighting, has not been without controversy. Several previous studies assessed importance weighting using the analytical approach of moderated regression. This study discusses major issues related to how importance weighting has been assessed. Specifically, it highlights that studies of importance weighting that do not consider statistical power are prone to type II error, i.e., failing to reject the null hypothesis of no significant weighting effect when that hypothesis is actually false. The sample size required for adequate statistical power to detect importance weighting appears to be larger than most previous studies could offer.
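The power issue is easy to make concrete with a small Monte Carlo: the weighting effect appears as an interaction term in a moderated regression, and its detection rate depends strongly on sample size. The data-generating model and effect sizes below are illustrative assumptions, not taken from any particular study:

```python
import numpy as np

def interaction_power(n, beta_int, n_sim=500, alpha=0.05, seed=0):
    """Monte Carlo power of the test on the interaction term in the
    moderated regression  y ~ domain + importance + domain:importance."""
    rng = np.random.default_rng(seed)
    zcrit = 1.96  # two-sided 5% critical value, normal approximation
    rejections = 0
    for _ in range(n_sim):
        d = rng.normal(size=n)               # domain satisfaction
        w = rng.normal(size=n)               # domain importance
        y = 0.5 * d + 0.2 * w + beta_int * d * w + rng.normal(size=n)
        X = np.column_stack([np.ones(n), d, w, d * w])
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (n - X.shape[1])
        se = np.sqrt(sigma2 * XtX_inv[3, 3])
        if abs(beta[3] / se) > zcrit:
            rejections += 1
    return rejections / n_sim
```

With a modest interaction of 0.15, power is low at n = 100 but high at n = 500, illustrating why underpowered studies risk the type II error the article describes.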
4.
The financial stress index (FSI) is considered an important risk-management tool for quantifying financial vulnerabilities. This paper proposes a new framework based on a hybrid classifier model that integrates rough set theory (RST), the FSI, support vector regression (SVR) and a control chart to identify stressed periods. First, the RST method is applied to select variables; the outputs are used as input data for the FSI-SVR computation. The empirical analysis is based on the monthly FSI of the Federal Reserve Bank of St. Louis from January 1992 to June 2011. A comparison is performed between the FSI based on principal component analysis and the FSI-SVR. A control chart based on the FSI-SVR and extreme value theory is proposed to identify extremely stressed periods. Our approach identified distinct stressed periods, including the Internet bubble, the subprime crisis and actual financial stress episodes, along with the calmest periods, agreeing with those given by Federal Reserve System reports.
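The benchmark construction the paper compares against, a PCA-based index with a control-chart limit, can be sketched compactly. The aggregation and the empirical-quantile control limit below are simplified stand-ins (the paper uses SVR for aggregation and extreme value theory for the limit):

```python
import numpy as np

def stress_index_pca(indicators):
    """First principal component of the standardized indicators, a
    common baseline construction of a financial stress index."""
    Z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    fsi = Z @ Vt[0]
    # Orient the index so that higher values mean more stress.
    if np.corrcoef(fsi, Z.mean(axis=1))[0, 1] < 0:
        fsi = -fsi
    return fsi

def flag_stressed(fsi, q=0.95):
    """Control-chart style flag: periods above the empirical
    q-quantile upper control limit."""
    return fsi > np.quantile(fsi, q)
```

On simulated indicators with a common stress spike in the final months, the flagged periods fall exactly in the spike, which is the behaviour the paper's control chart is designed to deliver on real FSI data.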
5.
In this paper, we consider the deterministic trend model where the error process is allowed to be weakly or strongly correlated and subject to non-stationary volatility. Extant estimators of the trend coefficient are analysed. We find that under heteroskedasticity, the Cochrane-Orcutt-type estimator (with some initial condition) could be less efficient than Ordinary Least Squares (OLS) when the process is highly persistent, whereas it is asymptotically equivalent to OLS when the process is less persistent. An efficient non-parametrically weighted Cochrane-Orcutt-type estimator is then proposed. The efficiency is uniform over weak or strong serial correlation and non-stationary volatility of unknown form. The feasible estimator relies on non-parametric estimation of the volatility function, and the asymptotic theory is provided. We use the data-dependent smoothing bandwidth that can automatically adjust for the strength of non-stationarity in volatilities. The implementation does not require pretesting persistence of the process or specification of non-stationary volatility. Finite-sample evaluation via simulations and an empirical application demonstrates the good performance of proposed estimators.
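For readers unfamiliar with the baseline being improved upon, here is a textbook Cochrane-Orcutt iteration for the linear trend model y_t = a + b*t + u_t with AR(1) errors. This is the classical estimator the paper analyses, not its proposed non-parametrically weighted version:

```python
import numpy as np

def cochrane_orcutt_trend(y, n_iter=10):
    """Iterated Cochrane-Orcutt estimation of y_t = a + b*t + u_t:
    estimate the AR(1) coefficient rho from OLS residuals,
    quasi-difference both sides, re-estimate, and repeat."""
    n = len(y)
    X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # OLS start
    for _ in range(n_iter):
        u = y - X @ beta
        rho = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])      # AR(1) estimate
        ys = y[1:] - rho * y[:-1]                       # quasi-difference
        Xs = X[1:] - rho * X[:-1]
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta, rho
```

Because the regressors are transformed along with y, the returned coefficients are on the original scale; the paper's contribution is to weight this transformation non-parametrically so efficiency also holds under non-stationary volatility.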
6.
Strong orthogonal arrays (SOAs) were recently introduced and studied as a class of space-filling designs for computer experiments. An important problem that has not been addressed in the literature is that of design selection for such arrays. In this article, we conduct a systematic investigation into this problem, and we focus on the most useful SOA(n, m, 4, 2+)s and SOA(n, m, 4, 2)s. This article first addresses the problem of design selection for SOAs of strength 2+ by examining their three-dimensional projections. Both theoretical and computational results are presented. When SOAs of strength 2+ do not exist, we formulate a general framework for the selection of SOAs of strength 2 by looking at their two-dimensional projections. The approach is fruitful, as it is applicable when SOAs of strength 2+ do not exist and it gives rise to them when they do. The Canadian Journal of Statistics 47: 302-314; 2019 © 2019 Statistical Society of Canada
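The two-dimensional projection property that drives the strength-2 selection can be checked mechanically: every ordered pair of 4-level columns should stratify the relevant coarse grid. The checker below is a generic sketch (function name and grid convention are my own, not the article's notation):

```python
import numpy as np
from itertools import product

def stratifies(design, grid=(4, 2)):
    """Check whether every ordered pair of columns of a 4-level design
    (levels 0..3) achieves stratification on a grid[0] x grid[1] grid,
    i.e. each cell receives the same number of runs after collapsing
    the first column to grid[0] levels and the second to grid[1]."""
    n, m = design.shape
    cells = grid[0] * grid[1]
    if n % cells:
        return False
    for i, j in product(range(m), range(m)):
        if i == j:
            continue
        a = design[:, i] * grid[0] // 4     # collapse to grid[0] levels
        b = design[:, j] * grid[1] // 4     # collapse to grid[1] levels
        counts = np.bincount(a * grid[1] + b, minlength=cells)
        if not np.all(counts == n // cells):
            return False
    return True
```

A full 4x4 factorial passes for both 4x2 and 2x4 grids, while a design with two identical columns fails, which is the kind of discrimination a selection criterion for strength-2 SOAs must make.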
7.
Proportional hazards are a common assumption when designing confirmatory clinical trials in oncology. This assumption affects not only the analysis but also the sample size calculation. The presence of delayed effects causes a change in the hazard ratio while the trial is ongoing, since at the beginning we do not observe any difference between treatment arms, and after some unknown time point the differences between treatment arms start to appear. Hence, the proportional hazards assumption no longer holds, and both the sample size calculation and the analysis methods should be reconsidered. The weighted log-rank test allows weighting of early, middle, and late differences through the Fleming and Harrington class of weights and is proven to be more efficient when the proportional hazards assumption does not hold. The Fleming and Harrington class of weights, along with the estimated delay, can be incorporated into the sample size calculation in order to maintain the desired power once the treatment arm differences start to appear. In this article, we explore the impact of delayed effects in group sequential and adaptive group sequential designs and make an empirical evaluation, in terms of power and type I error rate, of the weighted log-rank test in a simulated scenario with fixed values of the Fleming and Harrington class of weights. We also give some practical recommendations regarding which methodology should be used in the presence of delayed effects, depending on certain characteristics of the trial.
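The weighted log-rank statistic with Fleming and Harrington G(rho, gamma) weights can be written down directly from the pooled Kaplan-Meier estimate. The implementation below is a compact generic sketch, not the authors' simulation code; rho = 0 with gamma > 0 up-weights late differences, which is the natural choice for delayed effects:

```python
import numpy as np

def fh_weighted_logrank(time, event, group, rho=0.0, gamma=1.0):
    """Two-sample weighted log-rank Z statistic with Fleming-Harrington
    G(rho, gamma) weights w(t) = S(t-)^rho * (1 - S(t-))^gamma, where S
    is the pooled Kaplan-Meier estimate. rho = gamma = 0 recovers the
    ordinary log-rank test."""
    s_km = 1.0      # pooled KM survival just before the current time
    num = 0.0
    var = 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n_t = at_risk.sum()
        n1 = (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        w = (s_km ** rho) * ((1.0 - s_km) ** gamma)
        num += w * (d1 - d * n1 / n_t)                 # observed - expected
        var += w * w * d * (n1 / n_t) * (1 - n1 / n_t) \
               * (n_t - d) / max(n_t - 1, 1)           # hypergeometric var
        s_km *= 1.0 - d / n_t                          # KM update at t
    return num / np.sqrt(var)
```

Under a clear treatment effect both the unweighted (gamma = 0) and late-weighted (gamma = 1) versions reject; the article's point is that their relative power reverses once the effect is delayed.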
8.
Risk Analysis, 2018, 38(9): 1988-2009
Harbor seals in Iliamna Lake, Alaska, are a small, isolated population, and one of only two freshwater populations of harbor seals in the world, yet little is known about their abundance or risk for extinction. Bayesian hierarchical models were used to estimate abundance and trend of this population. Observational models were developed from aerial survey and harvest data, and they included effects for time of year and time of day on survey counts. Underlying models of abundance and trend were based on a Leslie matrix model that used prior information on vital rates from the literature. We developed three scenarios for variability in the priors and used them as part of a sensitivity analysis. The models were fitted using Markov chain Monte Carlo methods. The population production rate implied by the vital rate estimates was about 5% per year, very similar to the average annual harvest rate. After a period of growth in the 1980s, the population appears to be relatively stable at around 400 individuals. A population viability analysis assessing the risk of quasi-extinction, defined as any reduction to 50 animals or below in the next 100 years, ranged from 1% to 3%, depending on the prior scenario. Although this is moderately low risk, it does not include genetic or catastrophic environmental events, which may have occurred to the population in the past, so our results should be applied cautiously.
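The Leslie-matrix core of such an analysis is easy to sketch. The vital rates, noise level, and harvest offset below are illustrative stand-ins chosen to imply roughly 5% annual growth, not the priors or posterior estimates from the paper:

```python
import numpy as np

# Hypothetical two-stage (juvenile/adult) female Leslie matrix.
juv_survival, adult_survival, fecundity = 0.70, 0.92, 0.21

L = np.array([[0.0,          fecundity * adult_survival],
              [juv_survival, adult_survival]])

# Dominant eigenvalue = asymptotic annual growth rate implied by the rates.
lam = float(np.max(np.abs(np.linalg.eigvals(L))))

def quasi_extinct_prob(n0, years=100, threshold=50, n_sim=2000, seed=1):
    """Fraction of stochastic projections that ever fall to the
    quasi-extinction threshold, with lognormal environmental noise on
    the growth rate and a harvest factor that roughly offsets growth."""
    rng = np.random.default_rng(seed)
    harvest = 0.952                  # ~5% annual harvest rate (assumed)
    hits = 0
    for _ in range(n_sim):
        n = float(n0)
        for _ in range(years):
            n *= lam * harvest * rng.lognormal(mean=0.0, sigma=0.10)
            if n <= threshold:
                hits += 1
                break
    return hits / n_sim
```

Starting from around 400 animals, a balanced growth/harvest regime yields a small but non-negligible quasi-extinction probability, qualitatively consistent with the 1-3% range the paper reports.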
9.
This article proposes several estimators of the ridge parameter k for the Poisson ridge regression (RR) model. These estimators have been evaluated by means of Monte Carlo simulations. As performance criteria, we have calculated the mean squared error (MSE), the mean value, and the standard deviation of k. The first criterion is commonly used, while the other two have never been used when analyzing Poisson RR. However, these performance criteria are very informative because, if several estimators have an equal estimated MSE, then those with a low average value and standard deviation of k should be preferred. Based on the simulation results, we recommend some biasing parameters that may be useful for practitioners in the fields of health, social, and physical sciences.
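The model being penalized here is Poisson regression fitted by iteratively reweighted least squares with a ridge term on the normal equations. The sketch below is a generic textbook implementation (no intercept, for brevity), not the authors' simulation code:

```python
import numpy as np

def poisson_irls(X, y, k=0.0, n_iter=50):
    """Poisson regression via iteratively reweighted least squares,
    with an optional ridge penalty k added to the weighted
    cross-product matrix."""
    p = X.shape[1]
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                    # fitted means
        z = X @ beta + (y - mu) / mu             # working response
        A = X.T @ (mu[:, None] * X) + k * np.eye(p)
        beta = np.linalg.solve(A, X.T @ (mu * z))
    return beta
```

Under collinearity the ridge fit shrinks the coefficient vector relative to maximum likelihood (k = 0); a Hoerl-Kennard-type choice such as k = 1 / max(beta_hat_j^2) could then be plugged in and evaluated by simulated MSE, which is the kind of comparison the article carries out.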
10.
Bioequivalence (BE) studies are designed to show that two formulations of one drug are equivalent, and they play an important role in drug development. At the design stage, there may be a high degree of uncertainty about the variability of the formulations and the actual performance of the test versus the reference formulation. An interim look may therefore be desirable, to stop the study if there is no chance of claiming BE at the end (futility), to claim BE if the evidence is sufficient (efficacy), or to adjust the sample size. Sequential design approaches specifically for BE studies have been proposed in previous publications. We modify the existing methods, focusing on simplified multiplicity adjustment and futility stopping, and name our method the modified sequential design for BE studies (MSDBE). Simulation results demonstrate comparable performance between MSDBE and the originally published methods, while MSDBE offers more transparency and better applicability. The R package MSDBE is available at https://sites.google.com/site/modsdbe/ . Copyright © 2015 John Wiley & Sons, Ltd.
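The fixed-sample BE criterion that any such sequential design builds on is the two one-sided tests (TOST) procedure on the log scale. The sketch below shows TOST and a naive interim futility rule; both are illustrative simplifications (normal approximation, no multiplicity adjustment) and not the MSDBE procedure itself:

```python
from math import log

Z_ONE_SIDED = 1.6449   # one-sided 5% normal critical value

def tost_be(diff, se, margin=log(1.25)):
    """Average bioequivalence by TOST on the log scale, expressed as
    the equivalent 90% CI-inclusion rule: claim BE if the CI for the
    log formulation difference lies inside (-margin, margin)."""
    return (diff - Z_ONE_SIDED * se > -margin) and \
           (diff + Z_ONE_SIDED * se < margin)

def interim_futile(diff, se_final, margin=log(1.25)):
    """Naive futility check at an interim look: stop if, even with the
    smaller final-analysis standard error, a CI centred at the current
    estimate could not fit inside the margins."""
    return abs(diff) + Z_ONE_SIDED * se_final >= margin
```

A small observed difference with moderate noise passes TOST, a large one fails, and the futility rule stops only trials whose interim estimate already rules out success at the end.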