71.
72.
This article presents a synthetic control chart for detecting shifts in the process median. The synthetic chart combines a sign chart with a conforming run-length chart. Performance evaluation indicates that the synthetic chart has higher power for detecting shifts in the process median than Shewhart charts based on the sign statistic, as well as the classical Shewhart X-bar chart, for various symmetric distributions. The improvement is significant for moderate to large shifts in the median. Robustness studies against outliers indicate that the proposed synthetic control chart is robust to contamination by outliers.
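As an illustrative sketch only (not the authors' implementation), the decision logic of a synthetic sign chart can be outlined as follows. The subgroup data, control limit `ucl`, and conforming-run-length limit `L` are hypothetical parameters chosen for demonstration:

```python
def sign_statistic(sample, target_median):
    """Number of observations in the subgroup above the target median."""
    return sum(1 for x in sample if x > target_median)

def synthetic_sign_chart(subgroups, target_median, ucl, L):
    """Illustrative synthetic chart: a subgroup is 'nonconforming' when its
    sign statistic falls at or beyond the sign-chart limits; the chart signals
    when two nonconforming subgroups occur within L subgroups of each other
    (the conforming-run-length rule). Returns the signaling subgroup index,
    or None if no signal occurs."""
    last_nc = None  # index of the previous nonconforming subgroup
    for t, sub in enumerate(subgroups, start=1):
        sn = sign_statistic(sub, target_median)
        nonconforming = sn >= ucl or sn <= len(sub) - ucl
        if nonconforming:
            if last_nc is not None and t - last_nc <= L:
                return t  # signal: short conforming run length
            last_nc = t
    return None
```

With a clear upward median shift (all observations above target), the chart signals on the second nonconforming subgroup; an in-control stream with balanced signs never signals.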
73.
74.
The age-old number-theoretic concept of continued fractions can enhance certain Bayesian computations. The crux of this claim is that numerically challenging special-function ratios arising in Bayesian computing admit continued fraction representations. Continued fraction approximation via Lentz's algorithm often leads to efficient and stable computation of such quantities. Copyright © 2012 John Wiley & Sons, Ltd.
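For illustration, a generic modified Lentz evaluator for a continued fraction b0 + a1/(b1 + a2/(b2 + ...)) can be sketched as follows. This is the textbook formulation of the algorithm, not code from the paper; the tolerance and iteration cap are arbitrary choices:

```python
TINY = 1e-30  # guard value to avoid division by zero

def lentz(a, b, max_iter=200, tol=1e-14):
    """Evaluate b(0) + a(1)/(b(1) + a(2)/(b(2) + ...)) by the modified
    Lentz algorithm: maintain ratios C and D of successive numerator and
    denominator convergents, updating the running value f multiplicatively."""
    f = b(0) or TINY
    C, D = f, 0.0
    for j in range(1, max_iter):
        aj, bj = a(j), b(j)
        D = bj + aj * D
        if D == 0.0:
            D = TINY
        C = bj + aj / C
        if C == 0.0:
            C = TINY
        D = 1.0 / D
        delta = C * D
        f *= delta
        if abs(delta - 1.0) < tol:
            return f  # converged
    return f
```

As a sanity check, the all-ones continued fraction 1 + 1/(1 + 1/(1 + ...)) converges to the golden ratio.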
75.
Joseph W. McKean, Jeff T. Terpstra, John D. Kloke, Wiley Interdisciplinary Reviews: Computational Statistics, 2009, 1(2): 132-140
This review discusses two algorithms that can be used to compute rank-based regression estimates. For completeness, a brief overview of rank-based inference procedures in the context of a linear model is presented. The discussion includes geometry, estimation, inference, and diagnostics. In regard to computing the rank-based estimates, we discuss two approaches. The first is based on an algebraic identity that allows one to compute the (Wilcoxon) estimates using an L1 regression routine. The other is a Newton-type algorithm. In addition, we discuss how rank-based inference can be generalized to nonlinear and random-effects models. Some simple examples using existing statistical software are also presented for the sake of illustration and comparison.

Traditional least squares (LS) procedures offer the user an encompassing methodology for analyzing models, linear or nonlinear. These procedures are based on the simple premise of fitting the model by minimizing the Euclidean distance between the vector of responses and the model. Besides the fit, the LS procedures include diagnostics to check the quality of fit and an array of inference procedures, including confidence intervals (regions) and tests of hypotheses. LS procedures, though, are not robust: a single outlier can spoil the LS fit, its associated inference, and even its diagnostic procedures (i.e., the methods that should detect the outliers). Rank-based procedures also offer the user a complete methodology. The only essential change is to replace the Euclidean norm by another norm, so the geometry remains the same. As with the LS procedures, these rank-based procedures offer the user diagnostic tools to check the quality of fit and associated inference procedures; in contrast to the LS procedures, however, they are robust to the effect of outliers. They are generalizations of simple nonparametric rank procedures, such as the Wilcoxon one- and two-sample methods, and they retain the high efficiency of these simple rank methods. Further, depending on knowledge of the underlying error distribution, the rank-based analysis can be optimized by the choice of the norm (scores). Weighted versions of the fit can attain high (50%) breakdown. Copyright © 2009 John Wiley & Sons, Inc. This article is categorized under:
- Statistical and Graphical Methods of Data Analysis > Nonparametric Methods
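The L1 identity mentioned above can be illustrated in the simple-regression case: minimizing the Wilcoxon dispersion, which is proportional to the sum of |e_i - e_j| over pairs, reduces to a weighted median of pairwise slopes with weights |x_i - x_j|. A minimal sketch of this estimator (illustrative only, not the authors' software):

```python
def wilcoxon_slope(x, y):
    """Rank-based (Wilcoxon) slope estimate for simple regression.
    Minimizing sum_{i<j} |(y_j - y_i) - beta * (x_j - x_i)| is a weighted
    median problem over the pairwise slopes, weighted by |x_j - x_i|."""
    pairs = []
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            dx = x[j] - x[i]
            if dx != 0:
                pairs.append(((y[j] - y[i]) / dx, abs(dx)))
    pairs.sort()  # sort pairwise slopes ascending
    total = sum(w for _, w in pairs)
    acc = 0.0
    for s, w in pairs:  # walk to the weighted median
        acc += w
        if acc >= total / 2:
            return s
```

On exact linear data the estimate recovers the true slope, and a single gross outlier in y leaves it unchanged, illustrating the robustness claimed above.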
76.
This study, based on quantitative and qualitative surveys conducted from July 2004 to September 2005, examines the perceptions of Hanoi consumers and their reactions to the Avian Influenza epizootic (H5N1). Hanoi consumers clearly link the risk of human contamination by the virus to the preparation and ingestion of poultry. During the first crisis, consumers reacted quickly and intensely (74% of them had already stopped eating poultry in January 2004). Nevertheless, once the crisis abated, they quickly resumed their consumption of poultry. This behavior corresponds to the pattern described by empirical studies of other crises, such as BSE. What is more surprising is the speed with which the successive stages of this common pattern unfolded. It may be explained by a rapid decrease in risk anxiety. A logit model shows that, soon after the beginning of the crisis, AI risk anxiety was tempered by confidence in the information and recommendations issued by the government concerning AI and, in the long term, by a high perceived self-efficacy in dealing with AI. Indeed, not only has poultry consumption been affected in terms of the quantity consumed, but alternative ways of selecting and preparing poultry have also been adopted as anti-risk practices. Risk communication strategies should take this into account and rely on a prior assessment of the consumer practices adopted to deal with the risk.
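A logit model of the kind described relates a binary outcome (e.g., stopped eating poultry or not) to predictors such as confidence and perceived self-efficacy. The sketch below is a generic gradient-descent fit on entirely synthetic data; neither the predictors nor the numbers come from the survey:

```python
import math

def fit_logit(X, y, lr=0.1, iters=5000):
    """Minimal logistic regression by batch gradient descent.
    X is a list of feature rows, y a list of 0/1 outcomes.
    Returns [intercept, coef_1, ..., coef_p]."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)  # w[0] is the intercept
    for _ in range(iters):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            pi = 1.0 / (1.0 + math.exp(-z))  # predicted probability
            err = pi - yi
            grad[0] += err
            for k in range(p):
                grad[k + 1] += err * xi[k]
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w
```

With a hypothetical predictor that increases with the outcome, the fitted coefficient comes out positive, matching the direction of association one would report.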
77.
Kenneth T. Bogen, Risk Analysis, 2008, 28(4): 1033-1051
The U.S. Environmental Protection Agency (USEPA) guidelines for cancer risk assessment recognize that some chemical carcinogens may have a site-specific mode of action (MOA) involving mutation and cell-killing-induced hyperplasia. The guidelines recommend that for such dual MOA (DMOA) carcinogens, judgment should be used to compare and assess results using separate "linear" (genotoxic) versus "nonlinear" (nongenotoxic) approaches to low-level risk extrapolation. Because the guidelines allow this only when evidence supports reliable risk extrapolation using a validated mechanistic model, they effectively prevent addressing MOA uncertainty when data do not fully validate such a model but otherwise clearly support a DMOA. An adjustment-factor approach is proposed to address this gap, analogous to reference-dose procedures used for classic toxicity endpoints. By this method, even when a "nonlinear" toxicokinetic model cannot be fully validated, the effect of DMOA uncertainty on low-dose risk can be addressed. Application of the proposed approach was illustrated for the case of risk extrapolation from bioassay data on rat nasal tumors induced by chronic lifetime exposure to naphthalene. Bioassay data, toxicokinetic data, and pharmacokinetic analyses were determined to indicate that naphthalene is almost certainly a DMOA carcinogen. Plausibility bounds on rat-tumor-type-specific DMOA-related uncertainty were obtained using a mechanistic two-stage cancer risk model adapted to reflect the empirical link between genotoxic and cytotoxic effects of the most potent identified genotoxic naphthalene metabolites, 1,2- and 1,4-naphthoquinone. Bound-specific adjustment factors were then used to reduce naphthalene risk estimated by linear extrapolation (under the default genotoxic MOA assumption), to account for the DMOA exhibited by this compound.
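The adjustment-factor idea can be illustrated with a toy calculation: estimate low-dose risk by default linear extrapolation, then divide by a bound-specific factor to reflect DMOA uncertainty. All doses, risks, and factor values below are hypothetical, not naphthalene results:

```python
def linear_low_dose_risk(poc_dose, poc_risk, dose):
    """Default linear (genotoxic-MOA) extrapolation: scale risk at a
    point of comparison (poc) proportionally down to a lower dose."""
    return poc_risk * dose / poc_dose

def dmoa_adjusted_risk(linear_risk, adjustment_factor):
    """Divide the linear estimate by a bound-specific adjustment factor,
    analogous to reference-dose uncertainty factors, to reflect a partly
    nonlinear (cytotoxicity-driven) mode of action."""
    return linear_risk / adjustment_factor

# Hypothetical numbers: 10% extra risk at dose 10, environmental dose 1,
# and an adjustment factor of 10 derived from a plausibility bound.
risk_linear = linear_low_dose_risk(10.0, 0.1, 1.0)   # 0.01
risk_adjusted = dmoa_adjusted_risk(risk_linear, 10.0)  # 0.001
```

The point of the sketch is only the arithmetic structure: the DMOA-adjusted estimate is the linear estimate scaled down by the factor, never an independent model fit.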
78.
The vast majority of research on self‐monitoring in the workplace focuses on the benefits that accrue to chameleon‐like high self‐monitors (relative to true‐to‐themselves low self‐monitors). In this study, we depart from the mainstream by focusing on a potential liability of being a high self‐monitor: high levels of experienced role conflict. We hypothesize that high self‐monitors tend to choose work situations that, although consistent with the expression of their characteristic personality, inherently involve greater role conflict (i.e. competing role expectations from different role senders). Data collected from a 116‐member high‐tech firm showed support for this mediation hypothesis: relative to low self‐monitors, high self‐monitors tended to experience greater role conflict in work organizations because high self‐monitors were more likely to occupy boundary spanning positions. To help draw a more realistic and balanced portrait of self‐monitoring in the workplace, we call for more theoretically grounded research on the price chameleons pay. 相似文献
79.
Worldwide data on terrorist incidents between 1968 and 2004 gathered by the RAND Corporation and the Oklahoma City National Memorial Institute for the Prevention of Terrorism (MIPT) were assessed for patterns and trends in morbidity/mortality. Adjusted data analyzed involve a total of 19,828 events, 7,401 "adverse" events (each causing ≥ 1 victim), and 86,568 "casualties" (injuries), of which 25,408 were fatal. Most terror-related adverse events, casualties, and deaths involved bombs and guns. Weapon-specific patterns and terror-related risk levels in Israel (IS) have differed markedly from those of all other regions combined (OR). IS had a fatal fraction of casualties about half that of OR, but has experienced relatively constant lifetime terror-related casualty risks on the order of 0.5%, a level 2 to 3 orders of magnitude higher than those experienced in OR, which increased approximately 100-fold over the same period. Individual event fatality has increased steadily, the median increasing from 14% to 50%. Lorenz curves obtained indicate substantial dispersion among victim/event rates: about half of all victims were caused by the top 2.5% (or 10%) of harm-ranked events in OR (or IS). Extreme values of victim/event rates were approximated fairly well by generalized Pareto models (typically used to fit data on forest fires, sea levels, earthquakes, etc.). These results were in turn used to forecast maximum OR- and IS-specific victim/event rates through 2080, illustrating empirically based methods that could be applied to improve strategies to assess, prevent, and manage terror-related risks and consequences.
80.
Simon J. T. Pollard, Ray V. Kemp, Mark Crawford, Raquel Duarte-Davidson, James G. Irwin, Roger Yearsley, Risk Analysis, 2004, 24(6): 1551-1560
Environmental policymakers and regulators are often in the position of having to prioritize their actions across a diverse range of environmental pressures to secure environmental protection and improvements. Information on environmental issues to inform this type of strategic analysis can be disparate; it may be too voluminous or even absent. Data on a range of issues are rarely presented in a common format that allows easy analysis and comparison. Nevertheless, judgments are required on the significance of various environmental pressures and on the inherent uncertainties to inform strategic assessments such as "state of the environment" reports. How can decisionmakers go about this type of strategic and comparative risk analysis? In an attempt to provide practical tools for the analysis of environmental risks at a strategic level, the Environment Agency of England and Wales has conducted a program of developmental research on strategic risk assessment since 1996. The tools developed under this program use the concept of "environmental harm" as a common metric, viewed from technical, social, and economic perspectives, to analyze impacts from a range of environmental pressures. Critical to an informed debate on the relative importance of these perspectives is an understanding and analysis of the various characteristics of harm (spatial and temporal extent, reversibility, latency, etc.) and of the social response to actual or potential environmental harm from a range of hazards. Recent developments in our approach, described herein, allow the analysis to be presented in a structured fashion so as to better inform risk-management decisions.