Full text (subscription access): 2784 articles
Free access: 131 articles
Free access (domestic): 9 articles

By subject:
  Management: 333
  Ethnology: 3
  Demography: 35
  Collected works: 29
  Theory and methodology: 64
  General: 201
  Sociology: 59
  Statistics: 2200

By year:
  2023: 42    2022: 40    2021: 46    2020: 52    2019: 107
  2018: 124   2017: 223   2016: 105   2015: 94    2014: 128
  2013: 573   2012: 223   2011: 109   2010: 88    2009: 112
  2008: 82    2007: 96    2006: 84    2005: 91    2004: 80
  2003: 57    2002: 60    2001: 39    2000: 41    1999: 35
  1998: 31    1997: 26    1996: 14    1995: 16    1994: 17
  1993: 9     1992: 12    1991: 18    1990: 6     1989: 9
  1988: 8     1987: 3     1986: 3     1985: 4     1984: 2
  1983: 3     1982: 5     1981: 1     1980: 4     1979: 1
  1975: 1

2924 results found (search time: 15 ms)
1.
Poor quality of care may have a detrimental effect on access and take-up and can become a serious barrier to the universality of health services. This consideration is of particular interest given that health systems in many countries must address a growing public-sector deficit and respond to increasing pressures due to COVID-19 and an aging population, among other factors. In line with a rapidly emerging literature, we focus on patient satisfaction as a proxy for quality of health care. Drawing on rich longitudinal and cross-sectional data for Spain and multilevel estimation techniques, we show that in addition to individual-level differences, policy levers (in particular, public health spending and the patient-doctor ratio) exert a considerable influence on the quality of a health care system. Our results suggest that policymakers seeking to enhance the quality of care should be cautious about compromising the level of health resources, and health personnel in particular, in response to economic downturns, in a sector that has traditionally been short of human resources in many countries, a shortage that has become even more evident in light of the current health crisis. Additionally, we provide evidence that the increasing reliance on the private health sector may be indicative of inefficiencies in the public system and/or the existence of features of private insurance that patients deem important.
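The multilevel structure this abstract relies on (patients nested in regions, with policy levers acting at the regional level) can be illustrated with a toy variance-decomposition sketch. All numbers and the two-level setup below are hypothetical, not taken from the study; the sketch only shows how an intraclass correlation attributes part of the variation in satisfaction to the higher level.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-level data: patients nested in regions, with a
# region-level effect (standing in for policy levers such as health
# spending) shifting mean satisfaction.
n_regions, n_per_region = 17, 200
sigma_region, sigma_patient = 0.8, 1.5

region_effect = rng.normal(0.0, sigma_region, size=n_regions)
satisfaction = (region_effect[:, None]
                + rng.normal(0.0, sigma_patient, size=(n_regions, n_per_region)))

# One-way ANOVA estimate of the intraclass correlation (ICC): the share
# of satisfaction variance attributable to the regional level.
group_means = satisfaction.mean(axis=1)
msb = n_per_region * group_means.var(ddof=1)       # between-region mean square
msw = satisfaction.var(axis=1, ddof=1).mean()      # within-region mean square
var_between = max((msb - msw) / n_per_region, 0.0)
icc = var_between / (var_between + msw)
print(f"estimated ICC = {icc:.2f}")
```

A nonzero ICC is the signal that region-level covariates belong in the model; the study's multilevel estimation generalizes this idea to regression with covariates at both levels.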
2.
Longitudinal studies are the gold standard of empirical work stress research whenever experiments are not feasible. Frequently, scales are used to assess risk factors and their consequences, and cross-lagged effects are estimated to determine possible risks. Methods to translate cross-lagged effects into risk ratios, which would facilitate risk assessment, do not yet exist, creating a divide between psychological and epidemiological work stress research. The aim of the present paper is to demonstrate how cross-lagged effects can be used to assess the risk ratio of different levels of psychosocial safety climate (PSC) in organisations, an important psychosocial risk factor for the development of depression. We used available longitudinal evidence from the Australian Workplace Barometer (N = 1905) to estimate cross-lagged effects of PSC on depression. We applied continuous time modelling to obtain time-scalable cross effects. These were further investigated in a 4-year Monte Carlo simulation, which translated them into 4-year incident rates. Incident rates were determined by relying on clinically relevant 2-year periods of depression. We suggest a critical value of PSC = 26 (corresponding to −1.4 SD), which is indicative of a more than 100% increase in incidents of persistent depressive disorder over 4-year periods compared with average levels of PSC across 4 years.
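The "time-scalable cross effects" from continuous time modelling can be sketched with a toy drift matrix: a single continuous-time process implies different cross-lagged coefficients at every lag length. The drift values below are hypothetical illustrations, not estimates from the Australian Workplace Barometer.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical continuous-time drift matrix for (PSC, depression):
# diagonal entries are auto-effects (negative = mean reversion) and
# A[1, 0] < 0 encodes "higher PSC lowers depression".
A = np.array([[-0.5,  0.0],
              [-0.2, -0.4]])

def discrete_effects(A, dt):
    """Auto- and cross-lagged coefficients implied by drift A over lag dt."""
    return expm(A * dt)

# The same process yields different cross-lagged coefficients at
# different lags -- which is what makes the effects time-scalable.
for dt in (1.0, 2.0, 4.0):
    Phi = discrete_effects(A, dt)
    print(f"lag {dt}: cross effect PSC -> depression = {Phi[1, 0]:.3f}")
```

This is why comparing cross-lagged coefficients across studies with different measurement intervals is misleading unless they are first put on the continuous-time scale.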
3.
ABSTRACT

The cost and time of pharmaceutical drug development continue to grow at rates that many say are unsustainable. These trends have enormous impact on what treatments get to patients, when they get them and how they are used. The statistical framework for supporting decisions in regulated clinical development of new medicines has followed a traditional path of frequentist methodology. Trials using hypothesis tests of "no treatment effect" are done routinely, and the p-value < 0.05 is often the determinant of what constitutes a "successful" trial. Many drugs fail in clinical development, adding to the cost of new medicines, and some evidence points blame at the deficiencies of the frequentist paradigm. An unknown number of effective medicines may have been abandoned because trials were declared "unsuccessful" due to a p-value exceeding 0.05. Recently, the Bayesian paradigm has shown utility in the clinical drug development process for its probability-based inference. We argue for a Bayesian approach that employs data from other trials as a "prior" for Phase 3 trials so that synthesized evidence across trials can be utilized to compute probability statements that are valuable for understanding the magnitude of treatment effect. Such a Bayesian paradigm provides a promising framework for improving statistical inference and regulatory decision making.
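The proposed use of earlier-trial evidence as a prior can be sketched with a conjugate normal-normal update. The effect sizes and standard errors below are hypothetical, and a real regulatory analysis would involve much more care (e.g. discounting or robustifying the prior); the point is only that the output is a direct probability statement about the treatment effect.

```python
import math

# Hypothetical numbers: a synthesized earlier-trial estimate serves as the
# prior for Phase 3, and both are summarised as (effect estimate, SE).
prior_mean, prior_se = 0.30, 0.20      # evidence from other trials
data_mean, data_se = 0.18, 0.10        # Phase 3 trial estimate

# Normal-normal conjugate update (precision weighting).
w_prior, w_data = 1 / prior_se**2, 1 / data_se**2
post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
post_se = (w_prior + w_data) ** -0.5

def normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# A probability statement about the magnitude of the treatment effect --
# the kind of direct inference a frequentist p-value does not provide.
p_positive = 1 - normal_cdf((0 - post_mean) / post_se)
print(f"posterior mean = {post_mean:.3f}, P(effect > 0) = {p_positive:.3f}")
```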
4.
Risk Analysis, 2018, 38(9): 1988–2009
Harbor seals in Iliamna Lake, Alaska, are a small, isolated population, and one of only two freshwater populations of harbor seals in the world, yet little is known about their abundance or risk of extinction. Bayesian hierarchical models were used to estimate the abundance and trend of this population. Observational models were developed from aerial survey and harvest data, and they included effects for time of year and time of day on survey counts. Underlying models of abundance and trend were based on a Leslie matrix model that used prior information on vital rates from the literature. We developed three scenarios for variability in the priors and used them as part of a sensitivity analysis. The models were fitted using Markov chain Monte Carlo methods. The population production rate implied by the vital rate estimates was about 5% per year, very similar to the average annual harvest rate. After a period of growth in the 1980s, the population appears to be relatively stable at around 400 individuals. A population viability analysis estimated the risk of quasi-extinction, defined as any reduction to 50 animals or below in the next 100 years, at 1% to 3%, depending on the prior scenario. Although this risk is moderately low, the analysis does not account for genetic factors or catastrophic environmental events, which may have affected the population in the past, so our results should be applied cautiously.
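A stripped-down version of this machinery, a stage-structured Leslie matrix projected forward with demographic noise and a quasi-extinction threshold of 50 animals, might look as follows. The matrix entries, harvest rate, and starting population are illustrative guesses tuned to the abstract's ~5% production rate, not the study's estimates (which come from priors on vital rates and MCMC).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-stage Leslie matrix (pup / juvenile / adult), tuned so the
# implied production rate is roughly 5% per year before harvest.
L = np.array([[0.00, 0.00, 0.30],   # adult fecundity (pups per adult)
              [0.60, 0.00, 0.00],   # pup -> juvenile survival
              [0.00, 0.80, 0.92]])  # juvenile -> adult and adult survival

harvest_rate = 0.05                 # close to the production rate, as in the abstract
years, n_sims, threshold = 100, 500, 50

def quasi_extinction_prob(n0):
    """Monte Carlo share of trajectories falling to <= threshold animals."""
    hits = 0
    for _ in range(n_sims):
        n = n0.astype(float)
        for _ in range(years):
            # project one year ahead, then add demographic (Poisson) noise
            n = rng.poisson(L @ n * (1 - harvest_rate))
            if n.sum() <= threshold:
                hits += 1
                break
    return hits / n_sims

p_qe = quasi_extinction_prob(np.array([80, 60, 260]))   # ~400 animals total
print("P(quasi-extinction within 100 years) ~", p_qe)
```

With the harvest rate nearly matching production, the deterministic trajectory is roughly flat and quasi-extinction is driven by the accumulated demographic noise, mirroring the "stable at around 400, low but nonzero risk" picture in the abstract.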
5.
The generalized half-normal (GHN) distribution and progressive type-II censoring are considered in this article for studying statistical inference in constant-stress accelerated life testing. The EM algorithm is used to compute the maximum likelihood estimates. The Fisher information matrix is obtained via the missing information principle and used to construct asymptotic confidence intervals. Further, interval estimation is discussed through bootstrap intervals. The Tierney and Kadane method, an importance sampling procedure, and the Metropolis-Hastings algorithm are used to compute Bayesian estimates. Furthermore, predictive estimates for censored data and the related prediction intervals are obtained. We consider three optimality criteria to find the optimal stress level. A real data set is used to illustrate the value of the GHN distribution as an alternative lifetime model to well-known distributions. Finally, a simulation study is provided with discussion.
6.
Keisuke Himoto, Risk Analysis, 2020, 40(6): 1124–1138
Post-earthquake fires are high-consequence events with extensive damage potential. They are also low-frequency events, so their nature remains underinvestigated. One difficulty in modeling post-earthquake ignition probabilities is reducing the model uncertainty attributed to the scarce source data. The data scarcity problem has been resolved by pooling data collected indiscriminately from multiple earthquakes. However, this approach neglects the inter-earthquake heterogeneity in regional and seasonal characteristics, which is indispensable for risk assessment of future post-earthquake fires. Thus, the present study analyzes the post-earthquake ignition probabilities of five major earthquakes in Japan from 1995 to 2016 (the 1995 Kobe, 2003 Tokachi-oki, 2004 Niigata–Chuetsu, 2011 Tohoku, and 2016 Kumamoto earthquakes) with a hierarchical Bayesian approach. As the ignition causes share a certain commonality across earthquakes, common prior distributions were assigned to the parameters, and samples were drawn from the target posterior distribution of the parameters by a Markov chain Monte Carlo simulation. The results of the hierarchical model were comparatively analyzed with those of pooled and independent models. Although the pooled and hierarchical models were both robust in comparison with the independent model, the pooled model underestimated the ignition probabilities of earthquakes with few data samples. Among the tested models, the hierarchical model was least affected by the source-to-source variability in the data. Accounting for the heterogeneity of post-earthquake ignitions across different regional and seasonal characteristics has long been needed in modeling post-earthquake ignition probabilities but has not been properly addressed by existing approaches. The presented hierarchical Bayesian approach provides a systematic and rational framework to cope effectively with this problem, which consequently enhances the statistical reliability and stability of estimates of post-earthquake ignition probabilities.
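The pooled / independent / hierarchical contrast can be illustrated with a crude empirical-Bayes shrinkage sketch: hierarchical estimates sit between each earthquake's own rate and the fully pooled rate, moving furthest for data-poor events. The counts and the prior strength below are invented for illustration; the paper itself fits common prior distributions by full MCMC rather than this shortcut.

```python
import numpy as np

# Hypothetical counts: ignitions k out of n strongly shaken buildings for
# five earthquakes (illustrative numbers, not the study's data).
k = np.array([60, 12, 2, 110, 15])
n = np.array([40000, 3000, 2500, 90000, 9000])

independent = k / n                    # one rate per earthquake, no sharing
pooled = k.sum() / n.sum()             # one common rate, heterogeneity ignored

# Crude empirical-Bayes stand-in for the hierarchical model: shrink each
# earthquake's rate toward the pooled rate, with a prior "sample size" m
# playing the role of the common prior. Data-poor events are shrunk
# strongly; data-rich events barely move.
m = 5000.0                             # assumed prior strength
hierarchical = (k + m * pooled) / (n + m)

for name, est in [("independent", independent),
                  ("pooled", np.full(5, pooled)),
                  ("hierarchical", hierarchical)]:
    print(name, np.round(est, 5))
```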
7.
Summary.  Alongside the development of meta-analysis as a tool for summarizing research literature, there is renewed interest in broader forms of quantitative synthesis that are aimed at combining evidence from different study designs or evidence on multiple parameters. These have been proposed under various headings: the confidence profile method, cross-design synthesis, hierarchical models and generalized evidence synthesis. Models that are used in health technology assessment are also referred to as representing a synthesis of evidence in a mathematical structure. Here we review alternative approaches to statistical evidence synthesis, and their implications for epidemiology and medical decision-making. The methods include hierarchical models, models informed by evidence on different functions of several parameters and models incorporating both of these features. The need to check for consistency of evidence when using these powerful methods is emphasized. We develop a rationale for evidence synthesis that is based on Bayesian decision modelling and expected value of information theory, which stresses not only the need for a lack of bias in estimates of treatment effects but also a lack of bias in assessments of uncertainty. The increasing reliance of governmental bodies like the UK National Institute for Clinical Excellence on complex evidence synthesis in decision modelling is discussed.
8.
Let (X_k)_k be a sequence of i.i.d. random variables taking values in a set, and consider the problem of estimating the law of X_1 in a Bayesian framework. We prove, under mild conditions on the prior, that the sequence of posterior distributions satisfies a moderate deviation principle.
9.
Modelling daily multivariate pollutant data at multiple sites    (cited 7 times: 1 self-citation, 6 by others)
Summary. This paper considers the spatiotemporal modelling of four pollutants measured daily at eight monitoring sites in London over a 4-year period. Such multiple-pollutant data sets measured over time at multiple sites within a region of interest are typical. Here, the modelling was carried out to provide the exposure for a study investigating the health effects of air pollution. Alternative objectives include the design problem of positioning a new monitoring site, or determining for regulatory purposes whether environmental standards are being met. In general, analyses are hampered by missing data due, for example, to a particular pollutant not being measured at a site, a monitor being inactive by design (e.g. a 6-day monitoring schedule) or an unreliable or faulty monitor. Data of this type are modelled here within a dynamic linear modelling framework, in which the dependences across time, space and pollutants are exploited. Throughout, the approach is Bayesian, with implementation via Markov chain Monte Carlo sampling.
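The way a dynamic linear modelling framework absorbs missing observations (such as a 6-day monitoring schedule) can be sketched with a minimal univariate local-level Kalman filter: the filter simply skips the measurement update on missing days. The full model in the paper additionally links sites and pollutants; the noise variances and data below are synthetic.

```python
import numpy as np

def kalman_local_level(y, q=0.1, r=1.0):
    """Filtered state means for a local-level DLM; np.nan marks missing days."""
    mu, p = 0.0, 10.0                  # vague initial state
    means = []
    for obs in y:
        p = p + q                      # predict: state variance grows
        if not np.isnan(obs):          # update only when a measurement exists
            gain = p / (p + r)
            mu = mu + gain * (obs - mu)
            p = (1 - gain) * p
        means.append(mu)
    return np.array(means)

rng = np.random.default_rng(2)
truth = np.cumsum(rng.normal(0, 0.3, 60)) + 20.0   # slowly drifting pollutant level
y = truth + rng.normal(0, 1.0, 60)
y[::6] = np.nan                                    # a 6-day monitoring gap pattern
print(kalman_local_level(y)[-5:])
```

Because the state evolves through the prediction step regardless of whether data arrive, the gaps simply leave the state estimate with wider uncertainty rather than breaking the analysis.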
10.
This paper briefly introduces the concepts and characteristics of fractal theory, and focuses on the applications and methods of fractal theory in pattern art design. The functions and architecture of a computer-aided pattern art design system are designed.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号