Full-text access type
Paid full text | 17,907 articles |
Free | 472 articles |
Free (domestic) | 4 articles |
Subject classification
Management | 2,622 articles |
Ethnology | 86 articles |
Talent studies | 3 articles |
Demography | 1,604 articles |
Book series and collected works | 92 articles |
Theory and methodology | 1,745 articles |
General | 300 articles |
Sociology | 9,131 articles |
Statistics | 2,800 articles |
Publication year
2020 | 236 articles |
2019 | 365 articles |
2018 | 394 articles |
2017 | 531 articles |
2016 | 402 articles |
2015 | 302 articles |
2014 | 402 articles |
2013 | 2,965 articles |
2012 | 510 articles |
2011 | 524 articles |
2010 | 417 articles |
2009 | 401 articles |
2008 | 424 articles |
2007 | 455 articles |
2006 | 443 articles |
2005 | 436 articles |
2004 | 390 articles |
2003 | 301 articles |
2002 | 354 articles |
2001 | 421 articles |
2000 | 395 articles |
1999 | 347 articles |
1998 | 294 articles |
1997 | 270 articles |
1996 | 272 articles |
1995 | 263 articles |
1994 | 284 articles |
1993 | 271 articles |
1992 | 282 articles |
1991 | 288 articles |
1990 | 319 articles |
1989 | 274 articles |
1988 | 284 articles |
1987 | 275 articles |
1986 | 234 articles |
1985 | 271 articles |
1984 | 291 articles |
1983 | 267 articles |
1982 | 214 articles |
1981 | 164 articles |
1980 | 192 articles |
1979 | 224 articles |
1978 | 189 articles |
1977 | 151 articles |
1976 | 161 articles |
1975 | 145 articles |
1974 | 163 articles |
1973 | 115 articles |
1972 | 97 articles |
1971 | 92 articles |
Sort order: 10,000 query results found; search took 15 ms
71.
Low dose risk estimation via simultaneous statistical inferences  Total citations: 2 (self-citations: 0, citations by others: 2)
Walter W. Piegorsch, R. Webster West, Wei Pan, Ralph L. Kodell. Journal of the Royal Statistical Society. Series C, Applied Statistics, 2005, 54(1): 245-258
Summary. The paper develops and studies simultaneous confidence bounds that are useful for making low dose inferences in quantitative risk analysis. Application is intended for risk assessment studies where human, animal or ecological data are used to set safe low dose levels of a toxic agent, but where study information is limited to high dose levels of the agent. Methods are derived for estimating simultaneous, one-sided, upper confidence limits on risk for end points measured on a continuous scale. From the simultaneous confidence bounds, lower confidence limits on the dose that is associated with a particular risk (often referred to as a bench-mark dose) are calculated. An important feature of the simultaneous construction is that any inferences that are based on inverting the simultaneous confidence bounds apply automatically to inverse bounds on the bench-mark dose.
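As a rough illustration of the inversion idea only (not the paper's actual construction), the sketch below fits a hypothetical linear dose-response model, forms crude Bonferroni-adjusted one-sided simultaneous upper bounds over a dose grid, and inverts them to obtain a lower confidence limit on the benchmark dose. All function names and parameters are illustrative.

```python
import numpy as np
from scipy import stats

def simultaneous_upper_bounds(doses, responses, dose_grid, conf=0.95):
    # Crude one-sided simultaneous upper bounds on the mean response:
    # a hypothetical linear dose-response fit with a Bonferroni adjustment
    # stands in for the paper's sharper construction.
    X = np.column_stack([np.ones_like(doses), doses])
    beta, *_ = np.linalg.lstsq(X, responses, rcond=None)
    resid = responses - X @ beta
    n, p = X.shape
    s2 = resid @ resid / (n - p)
    cov = s2 * np.linalg.inv(X.T @ X)
    # Bonferroni over the grid gives (conservative) simultaneity.
    t_crit = stats.t.ppf(1 - (1 - conf) / len(dose_grid), df=n - p)
    G = np.column_stack([np.ones_like(dose_grid), dose_grid])
    se = np.sqrt(np.einsum("ij,jk,ik->i", G, cov, G))
    return G @ beta + t_crit * se

def benchmark_dose_lower_limit(dose_grid, upper_bounds, target):
    # Inversion: the smallest grid dose whose simultaneous upper bound
    # reaches the target response level serves as a lower confidence
    # limit on the benchmark dose.
    hits = np.nonzero(upper_bounds >= target)[0]
    return dose_grid[hits[0]] if hits.size else None
```

Because the band is simultaneous over the whole grid, the inverted lower limit inherits the same confidence level, which is the feature the abstract highlights.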
72.
73.
The well-known chi-squared goodness-of-fit test for a multinomial distribution is generally biased when the observations are subject to misclassification. In Pardo and Zografos (2000) the problem was considered using a double sampling scheme and φ-divergence test statistics. A new problem appears if the null hypothesis is not simple because it is necessary to give estimators for the unknown parameters. In this paper the minimum φ-divergence estimators are considered and some of their properties are established. The proposed φ-divergence test statistics are obtained by calculating φ-divergences between probability density functions and by replacing parameters by their minimum φ-divergence estimators in the derived expressions. Asymptotic distributions of the new test statistics are also obtained. The testing procedure is illustrated with an example.
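For orientation, the classical Pearson statistic that this line of work generalizes is itself a member of the φ-divergence family: taking φ(x) = (x − 1)²/2, with φ″(1) = 1, the φ-divergence statistic reduces exactly to Pearson's chi-squared. A minimal sketch for the simple-null case, with no misclassification or double sampling:

```python
import numpy as np
from scipy import stats

def pearson_chi2(counts, probs):
    # Classical goodness-of-fit statistic for a multinomial sample
    # against hypothesised cell probabilities.
    counts = np.asarray(counts, float)
    expected = counts.sum() * np.asarray(probs, float)
    stat = ((counts - expected) ** 2 / expected).sum()
    return stat, stats.chi2.sf(stat, df=len(counts) - 1)

def phi_divergence_stat(counts, probs, phi, phi_dd1):
    # General phi-divergence statistic (2n / phi''(1)) * sum_i p_i * phi(phat_i / p_i).
    # The paper builds its tests from this family, replacing unknown
    # parameters by minimum phi-divergence estimators when the null
    # hypothesis is composite.
    counts = np.asarray(counts, float)
    n = counts.sum()
    phat = counts / n
    p = np.asarray(probs, float)
    return (2.0 * n / phi_dd1) * np.sum(p * phi(phat / p))
```

Other choices of φ (e.g. the Kullback-Leibler divergence φ(x) = x log x − x + 1) yield the likelihood-ratio statistic and further members of the power-divergence family.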
74.
75.
Longitudinal data often contain missing observations, and it is in general difficult to justify particular missing data mechanisms, whether random or not, that may be hard to distinguish. The authors describe a likelihood‐based approach to estimating both the mean response and association parameters for longitudinal binary data with drop‐outs. They specify marginal and dependence structures as regression models which link the responses to the covariates. They illustrate their approach using a data set from the Waterloo Smoking Prevention Project. They also report the results of simulation studies carried out to assess the performance of their technique under various circumstances.
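The marginal piece of such a model is typically a logistic regression linking the response mean to covariates. The sketch below fits only that piece by Newton-Raphson maximum likelihood; the paper's association (dependence) model and drop-out handling are omitted, and all names are illustrative.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    # Newton-Raphson ML fit of the marginal mean model
    # logit P(Y=1 | x) = x'beta. This covers only the "marginal
    # structure" component; within-subject association and drop-out
    # would need the paper's full likelihood.
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = p * (1.0 - p)                      # observation weights
        grad = X.T @ (y - p)                   # score vector
        hess = (X * W[:, None]).T @ X          # observed information
        beta = beta + np.linalg.solve(hess, grad)
    return beta
```

Because the logistic log-likelihood is concave, the Newton iteration from a zero start converges quickly for well-conditioned designs.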
76.
77.
Christian P. Robert, Xiao-Li Meng, Jesper Møller, Jeffrey S. Rosenthal, C. Jennison, M. A. Hurn, F. Al-Awadhi, Peter McCullagh, Christophe Andrieu, Arnaud Doucet, Petros Dellaportas, Ioulia Papageorgiou, Ricardo S. Ehlers, Elena A. Erosheva, Stephen E. Fienberg, Jonathan J. Forster, Roger C. Gill, Nial Friel, Peter Green, David Hastie, R. King, Hans R. Künsch, N. A. Lazar, C. Osinski. Journal of the Royal Statistical Society. Series B, Statistical Methodology, 2003, 65(1): 39-55
78.
Cathy W. S. Chen, F. C. Liu, Mike K. P. So. Australian & New Zealand Journal of Statistics, 2008, 50(1): 29-51
To capture mean and variance asymmetries and time‐varying volatility in financial time series, we generalize the threshold stochastic volatility (THSV) model and incorporate a heavy‐tailed error distribution. Unlike existing stochastic volatility models, this model simultaneously accounts for uncertainty in the unobserved threshold value and in the time‐delay parameter. Self‐exciting and exogenous threshold variables are considered to investigate the impact of a number of market news variables on volatility changes. Adopting a Bayesian approach, we use Markov chain Monte Carlo methods to estimate all unknown parameters and latent variables. A simulation experiment demonstrates good estimation performance for reasonable sample sizes. In a study of two international financial market indices, we consider two variants of the generalized THSV model, with US market news as the threshold variable. Finally, we compare models using Bayesian forecasting in a value‐at‐risk (VaR) study. The results show that our proposed model can generate more accurate VaR forecasts than can standard models.
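To give a feel for the self-exciting threshold mechanism, the toy simulation below lets the log-volatility intercept switch according to whether the previous return falls below a threshold, then reads off VaR as an empirical return quantile. The parameter values are illustrative, not from the paper, which instead estimates the full model (including the unknown threshold and time delay) by MCMC.

```python
import numpy as np

def simulate_thsv(n, rng, mu=(-0.1, 0.1), phi=0.95, sigma_eta=0.2, threshold=0.0):
    # Toy self-exciting threshold SV process: the log-volatility
    # intercept mu switches by regime, where the regime is set by
    # whether the previous return is below the threshold.
    h = np.zeros(n)   # log-volatility
    r = np.zeros(n)   # returns
    for t in range(1, n):
        regime = 0 if r[t - 1] < threshold else 1
        h[t] = mu[regime] + phi * h[t - 1] + sigma_eta * rng.normal()
        r[t] = np.exp(h[t] / 2.0) * rng.normal()
    return r, h

def empirical_var(returns, level=0.01):
    # One-period Value-at-Risk as (minus) the lower empirical
    # quantile of the return distribution.
    return -np.quantile(returns, level)
```

In the paper, VaR forecasts come from the Bayesian predictive distribution rather than a raw empirical quantile; this sketch only shows the quantities being compared.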
79.
S. Vansteelandt, E. Goetghebeur. Journal of the Royal Statistical Society. Series B, Statistical Methodology, 2003, 65(4): 817-835
Summary. We estimate cause–effect relationships in empirical research where exposures are not completely controlled, as in observational studies or with patient non-compliance and self-selected treatment switches in randomized clinical trials. Additive and multiplicative structural mean models have proved useful for this but suffer from the classical limitations of linear and log-linear models when accommodating binary data. We propose the generalized structural mean model to overcome these limitations. This is a semiparametric two-stage model which extends the structural mean model to handle non-linear average exposure effects. The first-stage structural model describes the causal effect of received exposure by contrasting the means of observed and potential exposure-free outcomes in exposed subsets of the population. For identification of the structural parameters, a second stage 'nuisance' model is introduced. This takes the form of a classical association model for expected outcomes given observed exposure. Under the model, we derive estimating equations which yield consistent, asymptotically normal and efficient estimators of the structural effects. We examine their robustness to model misspecification and construct robust estimators in the absence of any exposure effect. The double-logistic structural mean model is developed in more detail to estimate the effect of observed exposure on the success of treatment in a randomized controlled blood pressure reduction trial with self-selected non-compliance.
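In the simplest special case, a randomized binary instrument, a scalar exposure and an additive (identity-link) structural mean model, the estimating equations reduce to the classical instrumental-variable (Wald) estimator. The sketch below shows only that degenerate case, not the paper's generalized two-stage construction; the simulated data and names are illustrative.

```python
import numpy as np

def additive_smm_effect(z, x, y):
    # Additive structural mean model in a randomized trial: choose psi
    # so that the mean of the "exposure-free" outcome Y - psi*X is equal
    # across randomization arms. Solving E[(Y - psi*X)(Z - E[Z])] = 0
    # gives the IV (Wald) estimator Cov(Y,Z) / Cov(X,Z).
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    z = np.asarray(z, float)
    return np.cov(y, z)[0, 1] / np.cov(x, z)[0, 1]
```

Unlike a naive exposed-versus-unexposed contrast, this estimator remains consistent under unmeasured confounding of the exposure-outcome relation, provided randomization is valid.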
80.
While shows like The X-Files and 24 have merged conspiracy theories with popular science (fictions), some video games have been pushing the narrative even further. Electronic Arts' Majestic game was released in July 2001 and quickly generated media buzz with its unusual multi-modal gameplay. Mixing phone calls, faxes, instant messaging, real and "fake" websites, and email, the game provides a fascinating case of an attempt at new directions for gaming communities. Through story, mode of playing, and use of technology, Majestic highlights the uncertain status of knowledge, community and self in a digital age; at the same time, it allows examination of alternative ways of understanding games' role and purpose in the larger culture. Drawing on intricate storylines involving government conspiracies, techno-bio warfare, murder and global terror, players were asked to solve mysteries in the hopes of preventing a devastating future of domination. Because the game drew in both actual and Majestic-owned/-designed websites, it constantly pushed those playing the game right to borders where simulation collides with "factuality". Given the wide variety of "legitimate" conspiracy theory, alien encounters and alternative science web pages, users often could not distinguish when they were leaving the game's pages and venturing into "real" World Wide Web sites. Its further use of AOL's instant messenger system, in which gamers spoke not only to bots but to other players, pushed users to evaluate constantly both the status of those they were talking to and the information being provided. Additionally, the game required players to occupy unfamiliar subject positions, ones where agency was attenuated, and which subsequently generated a multi-layered sense of unease among players. This mix of authentic and staged information in conjunction with technologically mediated roles highlights what are often seen as phenomena endemic to the Internet itself; that is, the destabilization of categories of knowing, relating, and being.