871.
Vickrey's bottleneck model is trip-based: it takes the trip as the unit of analysis and ignores the interdependence between trips and activities. This paper extends Vickrey's bottleneck model into an activity-based bottleneck model of commuters' morning departure-time decisions, in which commuters trade off travel disutility against activity utility. Building on this activity-based model, dynamic congestion tolls and step tolls at the bottleneck are studied under constant and linear marginal activity utility, and the solutions are compared with those of the traditional bottleneck model. The results show that when the marginal utility of activities is linear, the optimal dynamic toll at the bottleneck is no longer piecewise linear but piecewise quadratic; that, relative to the activity-based model, the traditional trip-based bottleneck model overestimates the queueing delay at the bottleneck, the step-toll level, and the start and end times of the morning peak; that the optimal step toll under both the trip-based model and the constant-marginal-activity-utility model equals half the maximum of the optimal dynamic toll and eliminates exactly half of the queueing delay at the bottleneck; and that, relative to the model with linear marginal activity utility, both underestimate the queueing a step toll can eliminate, and hence the efficiency of step tolling.
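The trip-based model that the paper extends has a well-known closed-form equilibrium, which already exhibits the half-toll property the abstract mentions. A minimal sketch with illustrative parameter values (the numbers below are assumptions, not taken from the paper):

```python
# Classic trip-based Vickrey bottleneck equilibrium: peak period,
# optimal dynamic toll peak, and the single-step toll that equals
# half of it. All parameter values are illustrative assumptions.
N = 6000.0      # number of commuters
s = 4000.0      # bottleneck capacity (veh/h)
beta, gamma = 3.9, 15.21   # early/late schedule-delay penalties
t_star = 9.0    # common desired arrival time (h)

delta = beta * gamma / (beta + gamma)
peak = N / s                              # duration of the congested period (h)
t_start = t_star - gamma / (beta + gamma) * peak
t_end = t_star + beta / (beta + gamma) * peak
toll_max = delta * peak                   # peak of the optimal dynamic toll
step_toll = toll_max / 2                  # optimal single-step toll: half the max

print(f"peak from {t_start:.2f} to {t_end:.2f} h")
print(f"max dynamic toll = {toll_max:.2f}, step toll = {step_toll:.2f}")
```

Under linear marginal activity utility, the paper's result is that this piecewise-linear toll profile becomes piecewise quadratic, so the sketch above only covers the trip-based benchmark.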
872.
Sleep problems are common and impair the health and productivity of employees. Work characteristics constitute one possible cause of sleep problems, and sleeping poorly might influence well-being and performance at work. This study examines the reciprocal associations between sleep problems and psychosocial work characteristics. The participants were 1744 full-time employed individuals (56% women; mean age 38 years in 2007) from the Young Finns study who responded to questionnaires on work characteristics (conceptualised by the demand–control model and effort–reward imbalance model) and sleep problems (Jenkins Sleep Scale) in 2007 and 2012. Cross-lagged structural equation models are used to examine the associations. The results show that low control and low rewards at baseline predicted sleep problems. Baseline sleep problems predicted higher effort, higher effort–reward imbalance, and lower reward. Sleep problems also predicted lower odds of belonging to the low (rather than high) job strain group and the active jobs group. The association between work characteristics and sleep problems appears to be reciprocal, with a stressful work environment increasing sleep problems, and sleep problems influencing future work characteristics. The results emphasise the importance of interventions aimed at both enhancing sleep quality and reducing psychosocial risks at work.
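The cross-lagged idea can be illustrated with a toy two-wave regression: each 2012 variable is regressed on both 2007 variables, and the "cross" coefficients capture the reciprocal paths. The simulated data, effect sizes, and plain least-squares fit below are invented for illustration and are not the study's SEM estimates:

```python
import numpy as np

# Toy cross-lagged panel: sleep problems and job control at two waves.
# True simulated paths: low control -> more sleep problems, and sleep
# problems -> lower future control (both negative cross-lagged effects).
rng = np.random.default_rng(0)
n = 1744
control07 = rng.normal(size=n)
sleep07 = -0.3 * control07 + rng.normal(size=n)
sleep12 = 0.5 * sleep07 - 0.2 * control07 + rng.normal(size=n)
control12 = 0.6 * control07 - 0.15 * sleep07 + rng.normal(size=n)

X = np.column_stack([np.ones(n), sleep07, control07])
b_sleep, *_ = np.linalg.lstsq(X, sleep12, rcond=None)   # control07 -> sleep12 path
b_ctrl, *_ = np.linalg.lstsq(X, control12, rcond=None)  # sleep07 -> control12 path
print("control07 -> sleep12:", round(b_sleep[2], 2))
print("sleep07 -> control12:", round(b_ctrl[1], 2))
```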
873.
Smart manufacturing systems (SMSs) are envisioned to contain highly automated and IT-driven production systems. To address the complexity that arises in such systems, a standard and holistic model for describing their activities and interrelationships is needed. This paper introduces a factory design and improvement (FDI) activity model and illustrates a case study of FDI in an electromechanical component factory. In essence, FDI is a reference activity model that encompasses a range of manufacturing system activities for designing and improving a factory during its initial development and also its operational phases. The FDI model shows not only the dependency between activities and manufacturing control levels but also the pieces of information and software functions each activity relies on. We envision that the availability of these pieces of information in digital form, integrated across the software functions, will increase the agility of factory design and improvement projects. Our future work therefore lies in contributing to standards for exchanging such information.
874.
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods of various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures.
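The contrast between the two premium rules can be sketched on simulated losses: an average-annual-loss premium is routinely exceeded in flood years, while a quantile-based premium funds losses up to a chosen recurrence. The loss distribution and quantile level below are invented for illustration and are not the paper's ICRM optimization:

```python
import numpy as np

# Compare a traditional average-annual-loss (AAL) premium with a
# quantile-based premium on simulated rare, heavy-tailed flood losses.
rng = np.random.default_rng(1)
years = 10_000
occurs = rng.random(years) < 0.02           # flood in ~2% of years
loss = np.where(occurs, rng.lognormal(mean=3.0, sigma=1.0, size=years), 0.0)

aal_premium = loss.mean()                   # traditional average annual loss
robust_premium = np.quantile(loss, 0.995)   # covers up to the 1-in-200-yr event

ruin_aal = (loss > aal_premium).mean()      # share of years premiums fall short
ruin_robust = (loss > robust_premium).mean()
print(f"AAL premium {aal_premium:.2f}, shortfall in {ruin_aal:.1%} of years")
print(f"robust premium {robust_premium:.2f}, shortfall in {ruin_robust:.2%} of years")
```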
875.
A novel method was used to incorporate in vivo host–pathogen dynamics into a new robust outbreak model for legionellosis. Dose‐response and time‐dose‐response (TDR) models were generated for Legionella longbeachae exposure to mice via the intratracheal route using a maximum likelihood estimation approach. The best‐fit TDR model was then incorporated into two L. pneumophila outbreak models: an outbreak that occurred at a spa in Japan, and one that occurred in a Melbourne aquarium. The best‐fit TDR from the murine dosing study was the beta‐Poisson with exponential‐reciprocal dependency model, which had a minimized deviance of 32.9. This model was tested against other incubation distributions in the Japan outbreak, and performed consistently well, with reported deviances ranging from 32 to 35. In the case of the Melbourne outbreak, the exponential model with exponential dependency was tested against non‐time‐dependent distributions to explore the performance of the time‐dependent model with the lowest number of parameters. This model reported low minimized deviances around 8 for the Weibull, gamma, and lognormal exposure distribution cases. This work shows that the incorporation of a time factor into outbreak distributions provides models with acceptable fits that can provide insight into the in vivo dynamics of the host‐pathogen system.
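The non-time-dependent versions of the dose-response families named here are standard in quantitative microbial risk assessment. A sketch with illustrative parameters (not the fitted ones, and without the exponential-reciprocal time dependency):

```python
import numpy as np

# Standard dose-response forms: exponential and approximate beta-Poisson.
# Parameter values (k, alpha, N50) are illustrative assumptions only.
def exponential(dose, k):
    """Exponential dose-response: P(ill) = 1 - exp(-k * dose)."""
    return 1.0 - np.exp(-k * dose)

def beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson: P = 1 - (1 + dose*(2**(1/alpha)-1)/N50)**(-alpha).

    By construction, P = 0.5 exactly when dose == N50 (the median infectious dose).
    """
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

doses = np.array([1e2, 1e3, 1e4, 1e5])
print("beta-Poisson:", beta_poisson(doses, alpha=0.2, n50=1e4))
print("exponential: ", exponential(doses, k=1e-4))
```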
876.
Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to its occurrence), unknown knowns (known to some, but not to relevant analysts), or known knowns where the probability of occurrence is judged as negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans. In this article, we show how a simulation model can be used to identify previously unknown potentially extreme events that, if not identified and treated, could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to consider them more fully. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States, identifying hurricanes whose impacts would be well beyond what the historic record suggests is possible.
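The mechanics of the method reduce to sweeping a validated impact model over hazard inputs outside the experienced range and flagging combinations whose predicted impact dwarfs the historic maximum. The toy impact function, thresholds, and sweep ranges below are invented for illustration:

```python
import numpy as np

# Toy "validated model" sweep: scan hazard conditions beyond the historic
# record and flag combinations exceeding twice the worst observed impact.
def outage_model(wind_speed_ms, landfall_angle_deg):
    """Stand-in for a validated hurricane-outage model (nonlinear in wind)."""
    return 1e3 * (wind_speed_ms / 30.0) ** 4 * (1 + 0.01 * landfall_angle_deg)

historic_max_wind = 70.0                               # m/s, edge of the record
historic_max_impact = outage_model(historic_max_wind, 45.0)

winds = np.linspace(historic_max_wind, 1.3 * historic_max_wind, 50)
angles = np.linspace(0.0, 90.0, 10)
candidates = [(w, a) for w in winds for a in angles
              if outage_model(w, a) > 2.0 * historic_max_impact]
print(f"{len(candidates)} hazard combinations exceed 2x the historic impact")
```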
877.
A recent string of high-visibility catastrophic events has garnered increased attention to process safety issues. While Behavior-Based Safety interventions demonstrably reduce workplace injuries by targeting employee behavior, we believe that process safety requires a greater focus on the behavior of leaders (e.g., creating and executing strategy). One effective method to begin targeting leader behavior for the improvement of process safety is to teach leaders about the principles of behavior, including ways in which the science may be applied within their own organizational models.
878.
Manufacturing applications address business-to-business (B2B) needs with highly customised applications developed for specific requirements, offering specialised solution-oriented and service-based software components, systems, and digital tools that aim at fast and accurate decision-making support. The purpose of this paper is to describe the implementation of digital technologies for operations management using manufacturing or engineering apps (eApps) for product design and manufacturing processes. Starting from the specific needs of two companies from mature European industries, automotive and food, this work depicts how such solutions can support companies and improve their operations, and highlights the benefits obtained and the challenges faced during full implementation of the developed tools. Moreover, a business model for exploiting the manufacturing apps is proposed: it supports the commercialisation of all the revenue streams offered by this rapidly growing sector, taking into account the specific needs of the concerned stakeholders through a diversified value proposition.
879.
In this paper, we provide efficient estimators and honest confidence bands for a variety of treatment effects including local average (LATE) and local quantile treatment effects (LQTE) in data‐rich environments. We can handle very many control variables, endogenous receipt of treatment, heterogeneous treatment effects, and function‐valued outcomes. Our framework covers the special case of exogenous receipt of treatment, either conditional on controls or unconditionally as in randomized control trials. In the latter case, our approach produces efficient estimators and honest bands for (functional) average treatment effects (ATE) and quantile treatment effects (QTE). To make informative inference possible, we assume that key reduced‐form predictive relationships are approximately sparse. This assumption allows the use of regularization and selection methods to estimate those relations, and we provide methods for post‐regularization and post‐selection inference that are uniformly valid (honest) across a wide range of models. We show that a key ingredient enabling honest inference is the use of orthogonal or doubly robust moment conditions in estimating certain reduced‐form functional parameters. We illustrate the use of the proposed methods with an application to estimating the effect of 401(k) eligibility and participation on accumulated assets. The results on program evaluation are obtained as a consequence of more general results on honest inference in a general moment‐condition framework, which arises from structural equation models in econometrics. Here, too, the crucial ingredient is the use of orthogonal moment conditions, which can be constructed from the initial moment conditions. 
We provide results on honest inference for (function‐valued) parameters within this general framework where any high‐quality, machine learning methods (e.g., boosted trees, deep neural networks, random forest, and their aggregated and hybrid versions) can be used to learn the nonparametric/high‐dimensional components of the model. These include a number of supporting auxiliary results that are of major independent interest: namely, we (1) prove uniform validity of a multiplier bootstrap, (2) offer a uniformly valid functional delta method, and (3) provide results for sparsity‐based estimation of regression functions for function‐valued outcomes.
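The orthogonal-moment idea can be sketched in its simplest partially linear special case: residualize both the outcome and the treatment on the controls, then regress residual on residual. Here the "machine learner" is plain least squares and the data are simulated, so this is only an illustration of the moment condition, not the paper's estimators:

```python
import numpy as np

# Partialling-out (Neyman-orthogonal moment) sketch for a partially
# linear model Y = theta*D + g(X) + e, with g, m linear for simplicity.
rng = np.random.default_rng(2)
n, p = 2000, 10
X = rng.normal(size=(n, p))
g = X @ rng.normal(size=p) * 0.3          # outcome nuisance g(X)
m = X @ rng.normal(size=p) * 0.3          # treatment nuisance m(X)
D = m + rng.normal(size=n)                # treatment depends on controls
theta = 1.5                               # true treatment effect
Y = theta * D + g + rng.normal(size=n)

def residualize(target, X):
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ coef

Y_res, D_res = residualize(Y, X), residualize(D, X)
theta_hat = (D_res @ Y_res) / (D_res @ D_res)   # orthogonal moment estimator
print("theta_hat =", round(theta_hat, 2))
```

Because the moment is orthogonal, small errors in the two residualization steps enter the estimate only at second order, which is what licenses plugging in regularized learners.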
880.
The GM model with x(1)(n) as its initial condition (cited 55 times, 3 self-citations)
Since grey system theory was founded in 1982, GM models have been built using the first component of the sequence X(1) as the initial condition of the grey differential model, which makes insufficient use of new information. By the new-information-priority principle of grey system theory, giving newer information greater weight in the modelling process improves the effectiveness of grey modelling. This paper therefore improves the GM model by taking the n-th component of X(1) as the initial condition of the grey differential model, which substantially raises the prediction accuracy of the resulting model. Finally, a numerical example verifies the practicality and reliability of the model.