301.
One of the objectives of personalized medicine is to base treatment decisions on a biomarker measurement. It is therefore often of interest to evaluate how well a biomarker predicts the response to a treatment. A popular methodology for doing so is to fit a regression model and test for an interaction between treatment assignment and the biomarker. However, the existence of an interaction is necessary but not sufficient for a biomarker to be predictive. Hence, the use of the marker-by-treatment predictiveness curve has been recommended. In addition to evaluating how well a single continuous biomarker predicts treatment response, it can also help to define an optimal threshold. This curve displays the risk of a binary outcome as a function of the quantiles of the biomarker, for each treatment group. Methods that assume a binary outcome, or that rely on a proportional hazards model for a time-to-event outcome, have been proposed to estimate this curve. In this work, we propose extensions for censored data. They rely on a time-dependent logistic model, which we propose to estimate via inverse probability of censoring weighting. We present simulation results and three applications to prostate cancer, liver cirrhosis, and lung cancer data. They suggest that a large number of events must be observed to define a threshold with sufficient accuracy for clinical use. They also illustrate that when the treatment effect varies with the time horizon that defines the outcome, the optimal threshold depends on this time horizon as well.
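The inverse probability of censoring weighting idea mentioned in this abstract can be sketched as follows. This is a minimal illustration under our own simplifying assumptions (censoring independent of the event time; weights based on a Kaplan-Meier estimate of the censoring distribution), not the authors' estimator; the function names are ours.

```python
import numpy as np

def censor_km(time, status, u):
    """Kaplan-Meier estimate of the censoring survival G(u) = P(C > u).
    status = 1 for an observed event, 0 for a censored observation."""
    g = 1.0
    for t in np.unique(time[status == 0]):   # censoring times, sorted
        if t > u:
            break
        n_risk = np.sum(time >= t)           # still under observation at t
        d = np.sum((time == t) & (status == 0))
        g *= 1.0 - d / n_risk
    return g

def ipcw_risk(time, status, t0):
    """IPCW estimate of the risk P(T <= t0): subjects whose status at the
    horizon t0 is known get weight 1/G; the rest get weight 0."""
    y = (time <= t0) & (status == 1)         # event observed by t0
    known = y | (time > t0)                  # status at t0 is known
    w = np.array([1.0 / censor_km(time, status, min(ti, t0)) if k else 0.0
                  for ti, k in zip(time, known)])
    return np.sum(w * y) / np.sum(w)
```

With no censoring every weight is 1 and the estimator reduces to the empirical proportion of events by `t0`; a weighted logistic regression with these weights would give the time-dependent logistic model described above.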
302.
Insights into the dynamics of human behavior in response to flooding are urgently needed for developing effective integrated flood risk management strategies and for integrating human behavior into flood risk modeling. However, our understanding of the dynamics of risk perceptions, attitudes, individual recovery processes, and adaptive (i.e., risk-reducing) intention and behavior is currently limited because of the predominant use of cross-sectional surveys in the flood risk domain. Here, we present results from one of the first panel surveys in the flood risk domain, covering a relatively long period (four years after a damaging event), three survey waves, and a wide range of topics relevant to the role of citizens in integrated flood risk management. The panel data, consisting of 227 individuals affected by the 2013 flood in Germany, were analyzed using repeated-measures ANOVA and latent class growth analysis (LCGA) to exploit the unique temporal dimension of the data set. Results show that attitudes, such as respondents' perceived responsibility within flood risk management, remain fairly stable over time. Changes are observed partly for risk perceptions and mainly for individual recovery and intentions to undertake risk-reducing measures. LCGA reveals heterogeneous recovery and adaptation trajectories that need to be taken into account in policies supporting individual recovery and stimulating societal preparedness. More panel studies in the flood risk domain are needed to gain better insight into the dynamics of individual recovery, risk-reducing behavior, and the associated risk and protective factors.
303.
Personalized demand and component innovation make product demand and replenishment lead times uncertain, with significant effects on supply-chain replenishment decisions and operating costs. This paper introduces lead-time uncertainty into the study of Supply-hub collaborative replenishment, examining the replenishment decision problem of three suppliers and a single manufacturer producing two customized products, with component-matching requirements, under stochastic lead times and uncertain demand. Three replenishment strategies are proposed; with minimization of supply-chain operating cost as the objective, a replenishment model is built for each strategy and solved for the optimal replenishment quantity and the minimum operating cost. All three strategies are found to admit a unique optimal replenishment quantity. The two Supply-hub-based collaborative strategies and the decentralized independent-supplier strategy each have advantages, but the Supply-hub strategy that coordinates both quantity and timing always dominates the Supply-hub centralized replenishment strategy. Finally, a numerical example implemented in MATLAB verifies these conclusions and shows that the quantity-and-timing coordinated Supply-hub strategy effectively reduces the cost risk caused by demand uncertainty. Lead-time variability of common components affects expected supply-chain operating cost more than that of customized components, so common-component lead times deserve particular attention when choosing a replenishment strategy.
304.
As a driver of e-government development, big data technology has brought not only profound technological change but also innovative development in government collaboration and public-service models. As the "oil of the future," big data has become an important national strategic resource, leading a new round of scientific and technological innovation and strongly influencing the sustainable, rapid development of e-government in China. Big data is already valued and applied in strengthening top-level design, innovating institutional mechanisms, integrating information resources, analyzing and processing data, promoting inter-agency collaboration, supporting decision-making and management, improving public services, and ensuring information security. "Speaking with data, deciding with data, managing with data, innovating with data" has become a new path to enhancing government governance capacity.
305.
Based on panel data for 31 Chinese provinces and municipalities from 2004 to 2013, this paper uses a dynamic panel model with difference-GMM estimation, taking chemical oxygen demand (COD) and ammonia nitrogen emissions as indicators of organic and inorganic water pollutants respectively, to study empirically how the structural effect of trade openness drives China's water pollution emissions. The results show that the scale and technique effects of economic growth are the main determinants of water pollution emissions; the direct structural effect on China's water pollution emissions is insignificant, while the structural effect of trade openness aggravates emissions to some extent. Introducing interaction terms for trade openness to identify the sources of comparative advantage underlying the structural effect, the paper finds no evidence of a "pollution haven effect" or "factor endowment effect" in China's water pollution emissions. The structural effect of trade reduces water pollution emissions in the economically less-developed western regions, while in the relatively developed central and eastern regions it increases both the volume and the intensity of emissions.
306.
After the economic crisis of 1980-1982, a wave of wage arrears triggered by mass business failures in Hong Kong caused the wage-priority mechanism to fail. Modeled on foreign experience, Hong Kong established a wage protection fund system, which advances unpaid wages from a dedicated fund through well-defined legal procedures and has gradually raised its advance-payment standards. This legal design is distinctive, and operating data show it has been quite successful. It offers an institutionalized approach of considerable reference value for addressing the persistent problems of frequent extreme wage-recovery incidents among migrant workers and seasonal outbreaks of wage arrears. China should encourage local governments to explore and establish enterprise wage-arrears protection systems.
307.
Against the background of intensifying capital-market risk, this paper takes "consistent improvement of both corporate operating performance and stock-market performance" as the core element of conservative value investing and, building on two key elements, interval-valued data representation and accounting-information measurement, develops multi-criteria decision models for conservative stock value investment. Oriented toward the conservative investment decision objective, it proposes an ordering mechanism satisfying the three properties of "robustness," "locality," and "globality," and establishes a systematic multi-criteria decision method along the main line of key feature selection, feature evaluation, and full-order modeling, thereby constructing a research framework for "conservative stock value investment decisions."
308.
Using systems science and a process approach, this paper studies the government innovation process along its dual dimensions of time and space, a process composed of the temporal flow of government innovation and its spatial elements. It finds that both theorists and practitioners tend to misread the temporal government innovation process as a simple linear one, that systematic and holistic inquiry into the spatial dimension of the process is lacking, and that the time and space dimensions have not been integrated. Accordingly, deepening research on the dynamic, cyclical process of government innovation in the time dimension, on government innovation ecosystems in the spatial dimension, and on the government innovation journey across both dimensions is an important direction for unlocking the black box of government innovation and advancing research on the government innovation process.
309.
Single-cohort stage-frequency data are considered when the stage reached by individuals is assessed through destructive sampling. For this type of data, when all hazard rates are assumed constant and equal, Laplace transform methods have been applied in the past to estimate the parameters of each stage-duration distribution and the overall hazard rate. If the hazard rates are not all equal, estimating stage-duration parameters using Laplace transform methods becomes complex. In this paper, two new models are proposed to estimate stage-dependent maturation parameters using Laplace transform methods where non-trivial hazard rates apply. The first model allows hazard rates that are constant within each stage but vary between stages. The second allows time-dependent hazard rates within stages. Moreover, this paper introduces a method for estimating the hazard rate in each stage under the stage-wise constant hazard rates model. This work presents methods that could be used in specific types of laboratory studies, but the main motivation is to explore relationships between stage maturation parameters that, in future work, could be exploited in Bayesian approaches. The application of the methodology for each model is evaluated using simulated data in order to illustrate the structure of these models.
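The stage-wise constant hazard case can be illustrated with a two-stage example of our own (not the paper's estimator): stage durations are independent exponentials with rates `lam1` and `lam2`, so the total time through the stages has Laplace transform ∏ⱼ λⱼ/(λⱼ + s), and the occupancy probability of stage 2 has a closed form that can be checked by Monte Carlo.

```python
import numpy as np

def p_stage2(t, lam1, lam2):
    """Occupancy probability P(individual is in stage 2 at time t),
    for stage-wise constant hazards with lam1 != lam2."""
    return lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))

def p2_laplace(s, lam1, lam2):
    """Laplace transform of p_stage2: lam1 / ((s + lam1)(s + lam2)).
    At s = 0 it gives the mean time spent in stage 2, i.e. 1 / lam2."""
    return lam1 / ((s + lam1) * (s + lam2))

def simulate_stage2(t, lam1, lam2, n=200_000, seed=0):
    """Monte Carlo check of p_stage2 by sampling the two stage durations."""
    rng = np.random.default_rng(seed)
    d1 = rng.exponential(1 / lam1, n)   # time to leave stage 1
    d2 = rng.exponential(1 / lam2, n)   # time then spent in stage 2
    return np.mean((d1 <= t) & (d1 + d2 > t))
```

The simple rational form of `p2_laplace` is what makes Laplace transform methods attractive here; with unequal or time-varying hazards, as in the paper's second model, this transform no longer factorizes so cleanly.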
310.
Bayesian methods are increasingly used in proof-of-concept studies. An important benefit of these methods is the potential to use informative priors, thereby reducing the sample size. This is particularly relevant for treatment arms for which a substantial amount of historical information exists, such as placebo and active comparators. One issue with using an informative prior is the possibility of a mismatch between the informative prior and the observed data, referred to as prior-data conflict. We focus on two methods for dealing with this: a testing approach and a mixture prior approach. The testing approach assesses prior-data conflict by comparing the observed data to the prior predictive distribution, and resorts to a non-informative prior if prior-data conflict is declared. The mixture prior approach uses a prior with a precise and a diffuse component. We assess these approaches for the normal case via simulation and show that they have attractive features compared with the standard one-component informative prior. For example, when the discrepancy between the prior and the data is sufficiently marked, and one intuitively feels less certain about the results, both the testing and mixture approaches typically yield wider posterior credible intervals than when there is no discrepancy. In contrast, when there is no discrepancy, their results are typically similar to those of the standard approach. While the operating characteristics of any selected approach should be assessed and agreed at the design stage of a specific study, we believe both approaches are worthy of consideration. Copyright © 2015 John Wiley & Sons, Ltd.
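The mixture prior approach for the normal case can be sketched as follows. This is an illustration under standard conjugate assumptions (known sampling variance, normal components), not the paper's exact implementation; the function names and the example numbers are ours.

```python
import numpy as np

def norm_pdf(x, mean, var):
    """Density of a normal distribution with the given mean and variance."""
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def posterior_mixture(ybar, n, sigma2, comps):
    """Update a mixture-of-normals prior for a normal mean.
    comps: list of (weight, prior_mean, prior_var) components.
    Each component is updated conjugately, and the mixture weights are
    re-weighted by the marginal likelihood of the sample mean ybar."""
    out = []
    for w, m, v in comps:
        post_v = 1.0 / (1.0 / v + n / sigma2)          # conjugate update
        post_m = post_v * (m / v + n * ybar / sigma2)
        marg = norm_pdf(ybar, m, v + sigma2 / n)       # marginal likelihood
        out.append((w * marg, post_m, post_v))
    z = sum(w for w, _, _ in out)
    return [(w / z, m, v) for w, m, v in out]
```

When the data agree with the informative (precise) component, its posterior weight grows and the prior contributes precision; under prior-data conflict, the diffuse component takes over, which is what produces the wider posterior credible intervals described above.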