Results: 468 articles in total (449 subscription full text, 19 free).

By subject: Management 74; Ethnology 4; Demography 38; Collected works 2; Theory and methodology 48; General 14; Sociology 232; Statistics 56.

By publication year: 2023 (2), 2022 (2), 2020 (8), 2019 (21), 2018 (11), 2017 (21), 2016 (15), 2015 (12), 2014 (18), 2013 (69), 2012 (13), 2011 (14), 2010 (12), 2009 (5), 2008 (13), 2007 (20), 2006 (19), 2005 (17), 2004 (11), 2003 (7), 2002 (11), 2001 (13), 2000 (6), 1999 (7), 1998 (7), 1997 (6), 1996 (2), 1995 (10), 1994 (9), 1993 (5), 1992 (5), 1991 (8), 1990 (5), 1989 (2), 1988 (8), 1987 (4), 1986 (3), 1985 (4), 1984 (4), 1981 (6), 1980 (5), 1979 (4), 1978 (3), 1976 (2), 1975 (4), 1974 (3), 1973 (4), 1968 (1), 1966 (1), 1963 (2).
1.
2.
We discuss Bayesian analyses of traditional normal-mixture models for classification and discrimination. The development involves application of an iterative resampling approach to Monte Carlo inference, commonly called Gibbs sampling, and demonstrates its routine application. We stress the benefits of exact analyses over traditional classification and discrimination techniques, including the ease with which such analyses may be performed in a quite general setting, with possibly several normal-mixture components having different covariance matrices, the computation of exact posterior classification probabilities for observed data and for future cases to be classified, and posterior distributions for these probabilities that allow for assessment of second-level uncertainties in classification.
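Purely as an illustration of the sampler this abstract describes, the sketch below runs a Gibbs sweep for a two-component univariate normal mixture with conjugate priors; the paper's setting (several multivariate components with distinct covariance matrices) generalizes this. The simulated data, prior hyperparameters, and variable names are assumptions for the sketch, not taken from the paper, and label switching is ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from a two-component univariate normal mixture (illustrative only).
n = 300
z_true = rng.random(n) < 0.4
y = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(3.0, 1.5, n))

# Conjugate priors (hyperparameters are assumptions for this sketch):
# weight pi ~ Beta(1, 1), means mu_k ~ N(0, 10^2), variances sig2_k ~ Inv-Gamma(2, 2).
a0, b0 = 1.0, 1.0
m0, v0 = 0.0, 100.0
ag, bg = 2.0, 2.0

pi, mu, sig2 = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
n_iter, burn = 2000, 500
class_prob = np.zeros(n)                     # running estimate of P(z_i = 1 | data)

for it in range(n_iter):
    # 1. Sample the latent allocations z_i given the current parameters.
    dens = np.stack([
        (1 - pi) * np.exp(-0.5 * (y - mu[0]) ** 2 / sig2[0]) / np.sqrt(sig2[0]),
        pi       * np.exp(-0.5 * (y - mu[1]) ** 2 / sig2[1]) / np.sqrt(sig2[1]),
    ])
    p1 = dens[1] / dens.sum(axis=0)          # posterior classification probabilities
    z = rng.random(n) < p1

    # 2. Sample the mixture weight from its Beta full conditional.
    n1 = int(z.sum())
    pi = rng.beta(a0 + n1, b0 + n - n1)

    # 3. Sample each component's mean and variance from their full conditionals.
    for k, idx in enumerate([~z, z]):
        yk, nk = y[idx], int(idx.sum())
        vpost = 1.0 / (1.0 / v0 + nk / sig2[k])
        mpost = vpost * (m0 / v0 + yk.sum() / sig2[k])
        mu[k] = rng.normal(mpost, np.sqrt(vpost))
        prec = rng.gamma(ag + nk / 2.0, 1.0 / (bg + 0.5 * ((yk - mu[k]) ** 2).sum()))
        sig2[k] = 1.0 / prec

    if it >= burn:                           # accumulate Monte Carlo averages
        class_prob += p1 / (n_iter - burn)   # (label switching ignored for simplicity)

print("mixture weight (final draw):", round(float(pi), 3))
print("posterior classification probabilities, first 5 obs:", np.round(class_prob[:5], 3))
```

The accumulated `class_prob` values are the exact posterior classification probabilities the abstract refers to, estimated by Monte Carlo; a new case would be classified by evaluating the same component densities at its observed value inside the loop.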
3.
Low dose risk estimation via simultaneous statistical inferences
Summary. The paper develops and studies simultaneous confidence bounds that are useful for making low dose inferences in quantitative risk analysis. Application is intended for risk assessment studies where human, animal or ecological data are used to set safe low dose levels of a toxic agent, but where study information is limited to high dose levels of the agent. Methods are derived for estimating simultaneous, one-sided, upper confidence limits on risk for end points measured on a continuous scale. From the simultaneous confidence bounds, lower confidence limits on the dose that is associated with a particular risk (often referred to as a bench-mark dose) are calculated. An important feature of the simultaneous construction is that any inferences that are based on inverting the simultaneous confidence bounds apply automatically to inverse bounds on the bench-mark dose.
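A rough sketch of the inversion idea: put a simultaneous upper band on excess risk over a dose grid and read off a lower limit on the bench-mark dose as the smallest dose at which the band reaches the benchmark risk. The linear dose-response model, the normal-error risk definition, and the Scheffé-type critical value below are simplifying assumptions standing in for the paper's sharper construction.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative continuous-endpoint study: the response declines with dose.
dose = np.repeat([0.0, 0.25, 0.5, 1.0, 2.0], 10)
y = 10.0 - 1.5 * dose + rng.normal(0.0, 1.0, dose.size)

# Straight-line dose-response fit by least squares.
X = np.column_stack([np.ones_like(dose), dose])
beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
df = dose.size - 2
sigma = np.sqrt(rss[0] / df)
XtX_inv = np.linalg.inv(X.T @ X)

# Simultaneous (Scheffe-type) band on the fitted mean over a dose grid.
crit = np.sqrt(2 * stats.f.ppf(0.95, 2, df))
grid = np.linspace(0.0, 2.0, 201)
G = np.column_stack([np.ones_like(grid), grid])
mean_hat = G @ beta
se_mean = sigma * np.sqrt(np.einsum("ij,jk,ik->i", G, XtX_inv, G))
mean_lo = mean_hat - crit * se_mean
mean_hi = mean_hat + crit * se_mean

# Risk at dose d: probability the response falls below a fixed adverse cutoff.
# Risk decreases in the mean, so the lower mean band gives an upper risk bound
# at each dose, and the upper mean band at dose 0 gives a lower baseline risk.
cutoff = 7.0
risk_upper = stats.norm.cdf(cutoff, mean_lo, sigma)
excess_upper = risk_upper - stats.norm.cdf(cutoff, mean_hi[0], sigma)

# Invert: the lower limit on the bench-mark dose is the smallest dose whose
# upper excess-risk bound reaches the benchmark risk (here 10%).
bmr = 0.10
hit = np.nonzero(excess_upper >= bmr)[0]
bmdl = float(grid[hit[0]]) if hit.size else np.inf
print("lower confidence limit on the 10% bench-mark dose:", round(bmdl, 3))
```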
4.
This paper develops a Coase-like solution of the problem of inducing Pareto optimal behavior in the presence of reciprocal externalities. In place of Coasean direct compensation between the parties to an externality problem, actors strategically match each other's externality-producing activity, and thus induce counterparts to internalize the external benefits or costs of their actions. The analysis suggests a general framework for analyzing social interactions in the presence of reciprocal externalities. As an application of the theory, a solution of the duopoly problem is noted.
We are indebted to L. Danziger, D. Samet, participants in faculty seminars at Bar-Ilan and New York Universities and at the University of Maryland, and two anonymous referees for helpful comments and discussions. All remaining errors are our own.
5.
We discuss the issue of using benchmark doses for quantifying (excess) risk associated with exposure to environmental hazards. The paradigm of low-dose risk estimation in dose-response modeling is used as the primary application scenario. Emphasis is placed on making simultaneous inferences on benchmark doses when data are in the form of proportions, although the concepts translate easily to other forms of outcome data.
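For the proportion-data case the abstract emphasizes, a small sketch: fit a logistic dose-response to quantal counts, define the benchmark dose through extra risk, and take a lower limit by a parametric bootstrap. The data are invented, and the bootstrap is a simple stand-in for the simultaneous inferences discussed in the paper.

```python
import numpy as np
from scipy.special import expit, logit
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Illustrative quantal bioassay data: animals responding out of animals tested.
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
tested = np.array([50, 50, 50, 50, 50])
responded = np.array([2, 4, 8, 19, 38])

def fit_and_bmd(resp, bmr=0.10):
    """Fit a logistic dose-response and return the benchmark dose for the
    given benchmark (extra) risk, along with the fitted model."""
    endog = np.column_stack([resp, tested - resp])
    res = sm.GLM(endog, sm.add_constant(dose), family=sm.families.Binomial()).fit()
    b0, b1 = res.params
    p0 = expit(b0)
    p_target = p0 + bmr * (1.0 - p0)          # extra-risk definition of the BMD
    return (logit(p_target) - b0) / b1, res

bmd_hat, fit = fit_and_bmd(responded)

# Parametric bootstrap lower limit on the benchmark dose.
p_hat = fit.predict(sm.add_constant(dose))
bmds = []
for _ in range(500):
    resp_b = rng.binomial(tested, p_hat)      # resample counts from the fitted model
    try:
        bmds.append(fit_and_bmd(resp_b)[0])
    except Exception:
        continue                              # skip rare non-converged refits
print("BMD estimate:", round(float(bmd_hat), 3))
print("bootstrap BMDL (5th percentile):", round(float(np.percentile(bmds, 5)), 3))
```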
6.
While most of epidemiology is observational, rather than experimental, the culture of epidemiology is still derived from agricultural experiments, rather than from other observational fields, such as astronomy or economics. The mismatch is made greater as focus has turned to continuous risk factors, multifactorial outcomes, and outcomes with large variation unexplainable by available risk factors. The analysis of such data is often viewed as hypothesis testing with statistical control replacing randomization. However, such approaches often test restricted forms of the hypothesis being investigated, such as the hypothesis of a linear association, when there is no prior empirical or theoretical reason to believe that if an association exists, it is linear. In combination with the large nonstochastic sources of error in such observational studies, this suggests the more flexible alternative of exploring the association. Conclusions on the possible causal nature of any discovered association will rest on the coherence and consistency of multiple studies. Nonparametric smoothing in general, and generalized additive models in particular, represent an attractive approach to such problems. This is illustrated using data examining the relationship between particulate air pollution and daily mortality in Birmingham, Alabama; between particulate air pollution, ozone, and SO2 and daily hospital admissions for respiratory illness in Philadelphia; and between ozone and particulate air pollution and coughing episodes in children in six eastern U.S. cities. The results indicate that airborne particles and ozone are associated with adverse health outcomes at very low concentrations, and that there are likely no thresholds for these relationships.
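In the spirit of the flexible modeling the abstract advocates, a minimal sketch: a Poisson regression for daily death counts with spline terms for particulate concentration and season instead of a forced linear pollution term. The simulated series and variable names are placeholders, not the Birmingham or Philadelphia data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Simulated daily time series (illustrative only): mortality counts with a
# mildly nonlinear dependence on particulate levels and a seasonal pattern.
n_days = 730
pm10 = rng.gamma(shape=4.0, scale=10.0, size=n_days)           # particulates, ug/m3
temp = 20 + 10 * np.sin(2 * np.pi * np.arange(n_days) / 365)   # seasonal temperature
log_rate = 2.5 + 0.004 * pm10 + 0.02 * np.sqrt(pm10) - 0.01 * (temp - 20)
df = pd.DataFrame({"deaths": rng.poisson(np.exp(log_rate)),
                   "pm10": pm10, "temp": temp, "day": np.arange(n_days)})

# Flexible alternative to a forced linear term: Poisson regression with a
# spline in PM10 and a smooth seasonal control, in the spirit of a GAM.
flexible = smf.glm("deaths ~ bs(pm10, df=4) + bs(day, df=7) + temp",
                   data=df, family=sm.families.Poisson()).fit()
linear = smf.glm("deaths ~ pm10 + bs(day, df=7) + temp",
                 data=df, family=sm.families.Poisson()).fit()

# Compare the restricted (linear-in-PM10) fit with the flexible one.
print("deviance, linear in PM10:", round(linear.deviance, 1))
print("deviance, spline in PM10:", round(flexible.deviance, 1))
print("AIC, linear vs spline:", round(linear.aic, 1), "vs", round(flexible.aic, 1))
```

Inspecting the fitted spline term (rather than a single slope) is what lets the analyst explore the shape of the association, including possible thresholds, instead of testing only a linear hypothesis.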
7.
This article examines why members of the U.S. House of Representatives voted for H.R. 4437, the controversial 2005 bill to construct a 700‐mile immigration barrier along the U.S.‐Mexican border and to criminalize illegal presence and aid to undocumented immigrants. Logit analysis suggests that being a first‐term House member or a Republican and representing a district that was in the South or the West or heavily blue‐collar substantially boosted the odds of supporting H.R. 4437. If a member's district was disproportionately Asian, Latino, or, especially, African American, he or she was instead more likely to oppose the measure.
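A schematic of the kind of logit model the article estimates; the member-level predictors, effect sizes, and simulated vote below are placeholders rather than the actual roll-call and district data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Placeholder data for 435 House members (not the actual district measures).
n = 435
df = pd.DataFrame({
    "republican":  rng.integers(0, 2, n),
    "first_term":  rng.integers(0, 2, n),
    "south_west":  rng.integers(0, 2, n),           # district in the South or West
    "blue_collar": rng.normal(0.25, 0.08, n),        # blue-collar share of workers
    "black_share": rng.beta(2, 10, n),               # Black share of residents
})
# Simulate the roll-call vote with effects in the directions the article reports.
logit_p = (-1.0 + 2.2 * df.republican + 0.5 * df.first_term
           + 0.8 * df.south_west + 3.0 * (df.blue_collar - 0.25)
           - 2.5 * (df.black_share - df.black_share.mean()))
df["yea"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logit model of voting for the bill.
model = smf.logit("yea ~ republican + first_term + south_west + blue_collar"
                  " + black_share", data=df).fit(disp=False)
print(model.summary())
print(np.exp(model.params).round(2))   # odds ratios are easier to interpret
```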
8.
Subjects imagined situations in which they reported enjoying themselves either alone or with others. Electromyographic (EMG) activity was recorded bilaterally from regions overlying the zygomaticus major muscles responsible for smiling. Controlling for equal rated happiness in the two conditions, subjects showed more smiling in high-sociality than low-sociality imagery. In confirming imaginary audience effects during imagery, these data corroborate hypotheses that solitary facial displays are mediated by the presence of imaginary interactants, and suggest caution in employing them as measures of felt emotion.
Avery Gilbert and Amy Jaffey had compelling insights throughout the course of the study. We thank Paul Ekman, Carroll Izard, and Paul Rozin for extensive comments on earlier drafts. We also thank Bernard Apfelbaum, Jon Baron, Janet Bavelas, John Cacioppo, Linda Camras, Dean Delis, Rob DeRubeis, Alan Fiske, Stephen Fowler, Greg McHugo, Harriet Oster, David Premack, W. John Smith, and David Williams for their valuable comments and suggestions.
9.
Simulations of forest inventory in several populations compared simple random with “quick probability proportional to size” (QPPS) sampling. The latter may be applied in the absence of a list sampling frame and/or prior measurement of the auxiliary variable. The correlation between the auxiliary and target variables required to render QPPS sampling more efficient than simple random sampling varied over the range 0.3–0.6 and was lower when sampling from populations that were skewed to the right. Two possible analytical estimators of the standard error of the estimate of the mean for QPPS sampling were found to be less reliable than bootstrapping.
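A small simulation in the spirit of the comparison described here, using probability-proportional-to-size sampling with replacement (the Hansen–Hurwitz estimator) as a stand-in for the paper's QPPS procedure; the artificial population and the naive resampling bootstrap are assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(5)

# Skewed artificial forest population: auxiliary x (e.g. basal area) strongly
# correlated with the target y (e.g. volume).  Not the populations in the paper.
N, n, reps = 2000, 50, 2000
x = rng.lognormal(mean=0.0, sigma=0.7, size=N)
y = 5.0 * x + rng.normal(0.0, 2.0, N)
p = x / x.sum()                                   # selection probabilities prop. to size

srs_means, pps_means = [], []
for _ in range(reps):
    # Simple random sample mean.
    srs = rng.choice(N, size=n, replace=False)
    srs_means.append(y[srs].mean())
    # PPS-with-replacement (Hansen-Hurwitz) estimate of the population mean.
    pps = rng.choice(N, size=n, replace=True, p=p)
    pps_means.append((y[pps] / p[pps]).mean() / N)

print("true mean:", round(float(y.mean()), 3))
print("empirical SE, simple random sampling:", round(float(np.std(srs_means)), 3))
print("empirical SE, PPS sampling:          ", round(float(np.std(pps_means)), 3))

# Bootstrap standard error from a single PPS sample (resampling the n draws).
pps = rng.choice(N, size=n, replace=True, p=p)
est = (y[pps] / p[pps]) / N                       # per-draw estimates of the mean
boot = [est[rng.integers(0, n, n)].mean() for _ in range(1000)]
print("bootstrap SE from one PPS sample:    ", round(float(np.std(boot)), 3))
```

Lowering the correlation (for example, by increasing the noise standard deviation on y) shows the crossover the abstract describes, where size-based selection stops paying off relative to simple random sampling.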
10.