Full-Text Access Type
Paid full text | 6984 articles |
Free | 255 articles |
Free (domestic) | 81 articles |
Subject Classification
Management | 1813 articles |
Ethnology | 2 articles |
Demography | 29 articles |
Book series and collections | 128 articles |
Theory and methodology | 71 articles |
General / interdisciplinary | 1496 articles |
Sociology | 71 articles |
Statistics | 3710 articles |
Publication Year
2024 | 13 articles |
2023 | 66 articles |
2022 | 115 articles |
2021 | 146 articles |
2020 | 167 articles |
2019 | 244 articles |
2018 | 264 articles |
2017 | 407 articles |
2016 | 261 articles |
2015 | 237 articles |
2014 | 339 articles |
2013 | 1267 articles |
2012 | 607 articles |
2011 | 334 articles |
2010 | 247 articles |
2009 | 306 articles |
2008 | 316 articles |
2007 | 311 articles |
2006 | 276 articles |
2005 | 268 articles |
2004 | 232 articles |
2003 | 143 articles |
2002 | 118 articles |
2001 | 102 articles |
2000 | 83 articles |
1999 | 79 articles |
1998 | 67 articles |
1997 | 44 articles |
1996 | 35 articles |
1995 | 31 articles |
1994 | 37 articles |
1993 | 19 articles |
1992 | 28 articles |
1991 | 23 articles |
1990 | 7 articles |
1989 | 11 articles |
1988 | 15 articles |
1987 | 4 articles |
1986 | 7 articles |
1985 | 8 articles |
1984 | 9 articles |
1983 | 5 articles |
1982 | 10 articles |
1981 | 4 articles |
1980 | 2 articles |
1979 | 1 article |
1978 | 2 articles |
1977 | 1 article |
1975 | 2 articles |
Sort order: 7320 results found (search time: 31 ms)
1.
Christian Dormann, Mikaela Owen, Maureen Dollard, Christina Guthier. Work and Stress, 2018, 32(3): 248-261
Longitudinal studies are the gold standard of empirical work and stress research whenever experiments are not feasible. Frequently, scales are used to assess risk factors and their consequences, and cross-lagged effects are estimated to determine possible risks. Methods to translate cross-lagged effects into risk ratios to facilitate risk assessment do not yet exist, which creates a divide between psychological and epidemiological work stress research. The aim of the present paper is to demonstrate how cross-lagged effects can be used to assess the risk ratio of different levels of psychosocial safety climate (PSC) in organisations, an important psychosocial risk for the development of depression. We used available longitudinal evidence from the Australian Workplace Barometer (N = 1905) to estimate cross-lagged effects of PSC on depression. We applied continuous time modelling to obtain time-scalable cross effects. These were further investigated in a 4-year Monte Carlo simulation, which translated them into 4-year incident rates. Incident rates were determined by relying on clinically relevant 2-year periods of depression. We suggest a critical value of PSC = 26 (corresponding to -1.4 SD), which is indicative of more than 100% increased incidents of persistent depressive disorder in 4-year periods compared to average levels of PSC across 4 years.
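The time-rescaling step at the heart of continuous time modelling can be sketched as follows. The drift-matrix relationship A(Δt) = exp(QΔt) is standard; the coefficient values below are hypothetical illustrations, not the Australian Workplace Barometer estimates.

```python
import numpy as np
from scipy.linalg import expm, logm

# Hypothetical autoregressive/cross-lagged matrix estimated at a 1-year interval;
# rows/cols = (PSC, depression), off-diagonal entries are cross-lagged effects.
A_1yr = np.array([[0.70, -0.05],
                  [-0.12, 0.65]])

# Recover the continuous-time drift matrix Q from A(dt) = expm(Q * dt), dt = 1 year.
Q = logm(A_1yr)

# Rescale to any interval, e.g. the 2-year periods used for depressive episodes.
A_2yr = expm(Q * 2.0)
print(np.round(A_2yr.real, 3))
```

Because A(2) = A(1)², the rescaled matrix agrees with iterating the 1-year dynamics twice, which is exactly what makes cross effects comparable across studies with different lag intervals.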
2.
From the perspective of global value chain (GVC) participation, and based on 2005-2015 value-added trade data between China and 40 countries along the Belt and Road, this paper uses a static panel fixed-effects model to empirically analyse the impact of transport infrastructure and communications infrastructure on the trade gains of Belt and Road countries. The results show that strengthening transport and communications infrastructure raises the domestic value added embodied in a country's exports, thereby increasing its trade gains. Analysis of the underlying mechanisms shows that both transport and communications infrastructure promote trade gains by lowering trade costs, and that communications infrastructure additionally contributes by improving the time efficiency of trade. Belt and Road countries should therefore build on infrastructure connectivity, strengthen their domestic transport and communications infrastructure, and take the initiative in the international division of production, so as to achieve shared development and prosperity.
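The static fixed-effects estimator used above amounts to demeaning each country's observations and running OLS on the residuals. A minimal sketch on simulated data (the panel dimensions mirror the paper, but the data and the single regressor, an "infrastructure index", are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy panel: 40 countries x 11 years, one regressor correlated with
# unobserved country effects (the case fixed effects are designed to handle).
n_countries, n_years = 40, 11
country = np.repeat(np.arange(n_countries), n_years)
alpha = rng.normal(size=n_countries)                          # country effects
x = rng.normal(size=n_countries * n_years) + alpha[country]   # regressor
y = 2.0 + 0.5 * x + alpha[country] + rng.normal(scale=0.1, size=x.size)

# Within transformation: subtract each country's mean, removing fixed effects.
def demean(v):
    means = np.bincount(country, weights=v) / n_years
    return v - means[country]

x_w, y_w = demean(x), demean(y)
beta_fe = (x_w @ y_w) / (x_w @ x_w)   # OLS slope on demeaned data
print(round(beta_fe, 3))              # close to the true 0.5
```

Pooled OLS on the raw data would be biased upward here because the regressor is correlated with the country effects; the within estimator removes that bias.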
3.
Stephen J. Ruberg, Frank E. Harrell Jr., Margaret Gamalo-Siebers, Lisa LaVange, J. Jack Lee, Karen Price. The American Statistician, 2019, 73(1): 319-327
The cost and time of pharmaceutical drug development continue to grow at rates that many say are unsustainable. These trends have enormous impact on what treatments get to patients, when they get them and how they are used. The statistical framework for supporting decisions in regulated clinical development of new medicines has followed a traditional path of frequentist methodology. Trials using hypothesis tests of “no treatment effect” are done routinely, and the p-value < 0.05 is often the determinant of what constitutes a “successful” trial. Many drugs fail in clinical development, adding to the cost of new medicines, and some evidence points blame at the deficiencies of the frequentist paradigm. An unknown number of effective medicines may have been abandoned because trials were declared “unsuccessful” due to a p-value exceeding 0.05. Recently, the Bayesian paradigm has shown utility in the clinical drug development process for its probability-based inference. We argue for a Bayesian approach that employs data from other trials as a “prior” for Phase 3 trials so that synthesized evidence across trials can be utilized to compute probability statements that are valuable for understanding the magnitude of treatment effect. Such a Bayesian paradigm provides a promising framework for improving statistical inference and regulatory decision making.
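The "evidence from earlier trials as a prior" idea can be illustrated with a conjugate normal-normal update. All numbers below are invented for illustration; in practice the prior would be built from a formal synthesis of the earlier-phase data.

```python
import math

# Prior on the treatment effect, summarising earlier trials (assumed values).
prior_mean, prior_sd = 0.3, 0.4
# Hypothetical Phase 3 estimate and its standard error.
obs_effect, obs_se = 0.25, 0.12

# Conjugate normal-normal update: precision-weighted average.
w_prior, w_obs = 1 / prior_sd**2, 1 / obs_se**2
post_var = 1 / (w_prior + w_obs)
post_mean = post_var * (w_prior * prior_mean + w_obs * obs_effect)

# A direct probability statement about the effect, unavailable from a p-value.
z = post_mean / math.sqrt(post_var)
p_effect_positive = 0.5 * (1 + math.erf(z / math.sqrt(2)))
print(round(post_mean, 3), round(p_effect_positive, 3))
```

The output is a statement of the form "the probability that the treatment effect is positive", which is the kind of quantity the authors argue is more useful for regulatory decision making than a binary significance verdict.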
4.
A. M. Abd El-Raheem. Journal of Statistical Computation and Simulation, 2019, 89(16): 3075-3104
The generalized half-normal (GHN) distribution and progressive type-II censoring are considered in this article for studying statistical inference in constant-stress accelerated life testing. The EM algorithm is used to calculate the maximum likelihood estimates. The Fisher information matrix is formed using the missing information principle and is utilized to construct asymptotic confidence intervals. Further, interval estimation is discussed through bootstrap intervals. The Tierney and Kadane method, an importance sampling procedure and the Metropolis-Hastings algorithm are utilized to compute Bayesian estimates. Furthermore, predictive estimates for censored data and the related prediction intervals are obtained. We consider three optimality criteria to find the optimal stress level. A real data set is used to illustrate the value of the GHN distribution as an alternative lifetime model to well-known distributions. Finally, a simulation study is provided with discussion.
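A minimal sketch of censored maximum likelihood under the GHN distribution. For simplicity it uses type-I right censoring as a stand-in for the paper's progressive type-II scheme (the likelihood structure, density terms for failures plus survival terms for censored units, is analogous), and direct numerical optimization rather than EM. It relies on the fact that if Z is standard normal, X = θ|Z|^(1/α) is GHN(α, θ).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Generate GHN lifetimes: X = theta * |Z|**(1/alpha), Z standard normal.
alpha_true, theta_true = 2.0, 1.5
n = 300
x = theta_true * np.abs(rng.standard_normal(n)) ** (1 / alpha_true)

# Type-I right censoring at time c (stand-in for progressive type-II).
c = np.quantile(x, 0.8)
observed = x <= c
t = np.where(observed, x, c)

def neg_loglik(params):
    a, th = params
    if a <= 0 or th <= 0:
        return np.inf
    z = (t / th) ** a
    # GHN density term for failures: f(x) = sqrt(2/pi)*(a/x)*(x/th)^a * exp(-z^2/2)
    log_f = 0.5 * np.log(2 / np.pi) + np.log(a / t) + a * np.log(t / th) - 0.5 * z**2
    # GHN survival term for censored units: S(x) = 2 * (1 - Phi(z))
    log_S = np.log(2) + norm.logsf(z)
    return -(np.sum(log_f[observed]) + np.sum(log_S[~observed]))

fit = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
print(np.round(fit.x, 2))   # estimates of (alpha, theta)
```

With n = 300 the estimates land near the true (2.0, 1.5); the EM algorithm in the paper targets the same likelihood but handles the censored observations through expected complete-data statistics.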
5.
European Management Journal, 2020, 38(6): 900-913
Based on the environment-strategy-performance perspective and the dynamic capabilities framework, we develop a theoretical model and hypotheses specifying how supply chain collaboration, as a response to environmental context factors (competitive intensity, supply uncertainty, technological turbulence and market turbulence) under a lean and agile strategy, may influence firm performance. We test the model using partial least squares structural equation modelling on data collected from a field survey with responses from 152 manufacturing firms representing a variety of industries. Empirical findings generally support the relationship between collaboration and firm performance under a lean and agile strategy. Also, for firms in industries facing environments characterised by high supply uncertainty, competitive intensity and technological turbulence, the study finds evidence of a direct relationship between these environmental factors and supply chain collaboration. The findings provide an initial strategic response framework for appropriately aligning a lean and agile supply chain strategy, through collaboration, with environmental context factors to achieve firm performance improvements.
6.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation and the Wilks method appear to be attractive, practical alternatives for evaluating uncertainty in the output of fault trees and similar multilinear models.
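The lognormal closed form is easiest to see for a single AND gate: a product of lognormal basic-event probabilities is itself lognormal, with the log-scale parameters simply summed. A sketch comparing this closed form with Monte Carlo sampling (the event parameters are invented; the paper's full method also handles OR combinations via approximation):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two basic events with lognormal uncertainty, combined by an AND gate.
mu = np.array([np.log(1e-3), np.log(5e-4)])   # medians 1e-3 and 5e-4
sigma = np.array([0.8, 0.6])

# Closed form: the product of lognormals is lognormal with summed parameters.
mu_top = mu.sum()
sigma_top = np.sqrt((sigma**2).sum())
median_cf = np.exp(mu_top)
p95_cf = np.exp(mu_top + 1.645 * sigma_top)

# Monte Carlo check.
samples = np.exp(rng.normal(mu, sigma, size=(200_000, 2))).prod(axis=1)
median_mc = np.median(samples)
p95_mc = np.quantile(samples, 0.95)
print(f"median: {median_cf:.2e} (closed form) vs {median_mc:.2e} (MC)")
print(f"95th:   {p95_cf:.2e} (closed form) vs {p95_mc:.2e} (MC)")
```

For this gate the closed form is exact, so the 200,000-sample Monte Carlo estimates agree with it to within sampling noise, at a fraction of the cost.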
7.
Nuttanan Wichitaksorn. Communications in Statistics - Theory and Methods, 2020, 49(8): 1801-1817
This article proposes a new approach to analyzing multiple vector autoregressive (VAR) models, yielding a newly constructed matrix autoregressive (MtAR) model based on a matrix-variate normal distribution with two covariance matrices. The MtAR is a generalization of VAR models in which the two covariance matrices allow the extension of the MtAR to a structural MtAR analysis. The proposed MtAR can also accommodate different lag orders across VAR systems, giving the model more flexibility. Estimation results from a simulation study and an empirical study on a macroeconomic application show favorable performance of the proposed models and method.
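The connection between a matrix autoregression and a stacked VAR can be made concrete with the identity vec(AXB') = (B ⊗ A) vec(X). A one-step sketch (coefficient matrices are random placeholders, not estimates from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# A first-order matrix autoregression X_t = A X_{t-1} B' + E_t stacks into an
# ordinary VAR on vec(X_t) with coefficient matrix (B kron A).
m, n = 3, 2
A = 0.5 * rng.standard_normal((m, m))   # row (cross-series) dynamics
B = 0.5 * rng.standard_normal((n, n))   # column (cross-system) dynamics

X_prev = rng.standard_normal((m, n))
E = rng.standard_normal((m, n))

X_next = A @ X_prev @ B.T + E

# Equivalent VAR step on the vectorised state (column-major vec).
vec = lambda M: M.flatten(order="F")
vec_next = np.kron(B, A) @ vec(X_prev) + vec(E)
print(np.allclose(vec(X_next), vec_next))   # True
```

The Kronecker structure is what makes the MtAR parsimonious: the stacked VAR coefficient has mn × mn entries, but the MtAR parameterises it with only m² + n².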
8.
Keisuke Himoto. Risk Analysis, 2020, 40(6): 1124-1138
Post-earthquake fires are high-consequence events with extensive damage potential. They are also low-frequency events, so their nature remains underinvestigated. One difficulty in modeling post-earthquake ignition probabilities is reducing the model uncertainty attributed to the scarce source data. The data scarcity problem has been addressed by pooling data indiscriminately collected from multiple earthquakes. However, this approach neglects the inter-earthquake heterogeneity in regional and seasonal characteristics, which is indispensable for risk assessment of future post-earthquake fires. Thus, the present study analyzes the post-earthquake ignition probabilities of five major earthquakes in Japan from 1995 to 2016 (the 1995 Kobe, 2003 Tokachi-oki, 2004 Niigata-Chuetsu, 2011 Tohoku, and 2016 Kumamoto earthquakes) using a hierarchical Bayesian approach. As the ignition causes of earthquakes share a certain commonality, common prior distributions were assigned to the parameters, and samples were drawn from the target posterior distribution of the parameters by a Markov chain Monte Carlo simulation. The results of the hierarchical model were comparatively analyzed with those of pooled and independent models. Although the pooled and hierarchical models were both robust in comparison with the independent model, the pooled model underestimated the ignition probabilities of earthquakes with few data samples. Among the tested models, the hierarchical model was least affected by the source-to-source variability in the data. Accounting for the heterogeneity of post-earthquake ignitions across regional and seasonal characteristics has long been desired in the modeling of post-earthquake ignition probabilities but has not been properly addressed by existing approaches. The presented hierarchical Bayesian approach provides a systematic and rational framework to effectively cope with this problem, which consequently enhances the statistical reliability and stability of estimated post-earthquake ignition probabilities.
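The contrast between independent, pooled, and hierarchical estimates can be sketched with a much simpler stand-in for the paper's MCMC: empirical-Bayes partial pooling of per-earthquake ignition rates under a moment-matched beta prior. The counts below are invented, not the study's data.

```python
import numpy as np

# Hypothetical ignition counts and exposed-building counts for five earthquakes.
ignitions = np.array([60, 8, 5, 40, 12])
exposed   = np.array([40000, 9000, 6000, 35000, 15000])

rate_indep  = ignitions / exposed                 # independent: no pooling
rate_pooled = ignitions.sum() / exposed.sum()     # complete pooling: one rate

# Partial pooling: fit a Beta(a, b) prior to the independent rates by moment
# matching, then take beta-binomial posterior means (shrinkage estimates).
m, v = rate_indep.mean(), rate_indep.var(ddof=1)
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common
rate_hier = (ignitions + a) / (exposed + a + b)

print(np.round(rate_hier * 1e3, 3))   # per-1000 rates, shrunk toward the prior mean
```

Each partially pooled rate lies between that earthquake's own rate and the prior mean, with earthquakes that have little data shrunk the most; this is the behavior that lets the hierarchical model avoid the pooled model's underestimation for sparse earthquakes.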
9.
10.
A NOTE ON EVANESCENT PROCESSES (cited: 1; self-citations: 0; citations by others: 1)
This note examines the connection between μ-invariant measures for the transition function of a continuous-time Markov chain and those of its q-matrix, Q. The major result establishes a necessary and sufficient condition for a convergent μ-invariant measure for Q to be μ-invariant for the minimal transition function, P, under the assumption that P is honest. This corrects Theorem 6 of Vere-Jones (1969) and the first part of Corollary 1 of Pollett (1986), both of which assert that the above conclusion holds in the absence of this condition. The error was pointed out by E.A. van Doorn (1991), and the counterexample which he presented provides the basis for the present arguments. In determining where the error occurred in the original proof, we are able to identify a simple sufficient condition for μ-invariance.
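For context, the two notions of μ-invariance being related can be written out; these are the standard definitions (following Vere-Jones) for a measure m = (m_i) on the state space:

```latex
% mu-invariance for the q-matrix Q = (q_{ij}):
\sum_{i} m_i \, q_{ij} = -\mu \, m_j \quad \text{for all } j,
% mu-invariance for the transition function P(t) = (p_{ij}(t)):
\sum_{i} m_i \, p_{ij}(t) = e^{-\mu t}\, m_j \quad \text{for all } j \text{ and } t \ge 0.
```

The note's main result concerns when the first (algebraic) condition transfers to the second (dynamic) one for the minimal process.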