Full-text access type
Paid full text | 2271 articles |
Free full text | 68 articles |
Subject classification
Management | 264 articles |
Ethnology | 20 articles |
Demography | 244 articles |
Collected works | 12 articles |
Theory and methodology | 265 articles |
General | 27 articles |
Sociology | 1011 articles |
Statistics | 496 articles |
Publication year
2023 | 27 articles |
2022 | 8 articles |
2021 | 29 articles |
2020 | 65 articles |
2019 | 93 articles |
2018 | 113 articles |
2017 | 134 articles |
2016 | 80 articles |
2015 | 60 articles |
2014 | 77 articles |
2013 | 459 articles |
2012 | 97 articles |
2011 | 86 articles |
2010 | 72 articles |
2009 | 57 articles |
2008 | 71 articles |
2007 | 69 articles |
2006 | 68 articles |
2005 | 57 articles |
2004 | 42 articles |
2003 | 50 articles |
2002 | 44 articles |
2001 | 45 articles |
2000 | 29 articles |
1999 | 38 articles |
1998 | 22 articles |
1997 | 20 articles |
1996 | 23 articles |
1995 | 22 articles |
1994 | 22 articles |
1993 | 24 articles |
1992 | 26 articles |
1991 | 25 articles |
1990 | 10 articles |
1989 | 13 articles |
1988 | 12 articles |
1987 | 11 articles |
1986 | 8 articles |
1985 | 15 articles |
1984 | 15 articles |
1983 | 13 articles |
1982 | 10 articles |
1981 | 17 articles |
1980 | 10 articles |
1979 | 8 articles |
1978 | 5 articles |
1976 | 5 articles |
1975 | 8 articles |
1974 | 9 articles |
1966 | 3 articles |
Sort order: 2339 results found, search took 0 ms
911.
Rafael A. Ponce, Scott M. Bartell, Eva Y. Wong, Denise LaFlamme, Clark Carrington, Robert C. Lee, Donald L. Patrick, Elaine M. Faustman & Michael Bolger, Risk Analysis, 2000, 20(4): 529-542
Risks associated with toxicants in food are often controlled by exposure reduction. When exposure recommendations are developed for foods with both harmful and beneficial qualities, however, they must balance the associated risks and benefits to maximize public health. Although quantitative methods are commonly used to evaluate health risks, such methods have not been generally applied to evaluating the health benefits associated with environmental exposures. A quantitative method for risk-benefit analysis is presented that allows for consideration of diverse health endpoints that differ in their impact (i.e., duration and severity) using dose-response modeling weighted by quality-adjusted life years saved. To demonstrate the usefulness of this method, the risks and benefits of fish consumption are evaluated using a single health risk and health benefit endpoint. Benefits are defined as the decrease in myocardial infarction mortality resulting from fish consumption, and risks are defined as the increase in neurodevelopmental delay (i.e., talking) resulting from prenatal methylmercury exposure. Fish consumption rates are based on information from Washington State. Using the proposed framework, the net health impact of eating fish is estimated in either a whole population or a population consisting of women of childbearing age and their children. It is demonstrated that across a range of fish methylmercury concentrations (0-1 ppm) and intake levels (0-25 g/day), individuals would have to weight the neurodevelopmental effects 6 times more (in the whole population) or 250 times less (among women of childbearing age and their children) than the myocardial infarction benefits in order to be ambivalent about whether or not to consume fish. These methods can be generalized to evaluate the merits of other public health and risk management programs that involve trade-offs between risks and benefits.
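Once both endpoints are expressed in quality-adjusted life years, the "ambivalence weight" idea above reduces to simple arithmetic: find the relative weight on the risk endpoint at which the net impact is zero. The sketch below is illustrative only; the function names and QALY inputs are hypothetical, not the paper's estimates.

```python
# Hypothetical sketch of the ambivalence-weight calculation. All numbers
# are illustrative, not the study's estimates.

def net_qaly(benefit_qaly, risk_qaly, w):
    """Net health impact when the risk endpoint is weighted w-fold."""
    return benefit_qaly - w * risk_qaly

def ambivalence_weight(benefit_qaly, risk_qaly):
    """Weight at which an individual is indifferent (net impact = 0)."""
    return benefit_qaly / risk_qaly

# Illustrative inputs: QALYs saved via reduced MI mortality vs. QALYs lost
# to neurodevelopmental delay at a given methylmercury level and intake.
benefit, risk = 0.012, 0.002
w_star = ambivalence_weight(benefit, risk)
print(w_star)                             # ≈ 6: risk must be weighted 6x
print(net_qaly(benefit, risk, w_star))    # ≈ 0 at the ambivalence point
```

A weight above `w_star` makes the net impact negative (avoid fish), below it positive (consume fish), which is exactly how the 6x and 250x thresholds in the abstract are read.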
912.
Based on the perspective of knowing in practice, this paper investigates the everyday work of engineers in a semiconductor plant. Qualitative data analysis is used to examine the active role of objects and how engineers rely on them to perform troubleshooting in the manufacturing processes. Three active roles emerged from the analysis: (a) objects activating interpretations, (b) objects stimulating collaborative practices, and (c) objects sparking experimental activities. Based on these three findings, we propose that objects have three triggering roles. First, objects trigger meaning-making. The meaning-making process provides a condition for better understanding the situation, making inferences, and developing possible diagnostic logics. Second, objects trigger spontaneous relationships. This role promotes social interaction and encourages members to cooperate and negotiate within the organization. Third, objects trigger real-time exploratory actions. These triggering roles of objects enable prioritization and execution of troubleshooting practice based on the available information, actionable knowledge, and the situation at hand. Finally, these findings have important theoretical implications and indicate interesting future research directions related to the active role of objects in work practice.
913.
Shelley D. Dionne, Alka Gupta, Kristin Lee Sotak, Kristie A. Shirreffs, Andra Serban, Chanyu Hao, Dong Ha Kim & Francis J. Yammarino, The Leadership Quarterly, 2014, 25(1): 6-35
The purpose of this article is to present a comprehensive 25-year review of the incorporation of levels of analysis into conceptual and empirical leadership research published in Leadership Quarterly throughout its history. We assessed the population of Leadership Quarterly's research (790 research articles) on four key levels-of-analysis issues: (1) explicit statement of the focal level(s) of analysis; (2) appropriate measurement given the level of constructs; (3) use of a multi-level data analysis technique; and (4) alignment of theory and data. Prior reviews of how levels of analysis have been incorporated into leadership research have been limited to major research domains. Results revealed that although conceptual and empirical articles explicitly state the focal level of analysis in only approximately one-third of cases, appropriate levels-based measurement and alignment between theory and data are relatively strong areas of achievement for the articles within Leadership Quarterly. Multi-level data analysis techniques are used in less than one-fifth of all articles. Although there is room for improvement, there is evidence that Leadership Quarterly is a premier outlet for levels-based leadership research. Given the increasing complexity of organizational science with regard to groups, teams, and collectives, Leadership Quarterly has an opportunity to model for organizational research how to build and test complicated multi-level theories and models.
914.
Xiao Zhang, Haosheng Fan, Victor C. S. Lee, Minming Li, Yingchao Zhao & Chuang Liu, Journal of Combinatorial Optimization, 2018, 36(2): 434-457
Barrier coverage, one of the most important applications of wireless sensor networks (WSNs), provides coverage for the boundary of a target region. We study the barrier coverage problem using a set of n sensors with adjustable coverage radii deployed along a line interval or circle. Our goal is to determine a range assignment \(\mathbf {R}=({r_{1}},{r_{2}}, \ldots , {r_{n}})\) of sensors such that the line interval or circle is fully covered and the total cost \(C(\mathbf {R})=\sum _{i=1}^n {r_{i}}^\alpha \) is minimized. For the line interval case, we formulate the barrier coverage problem of line-based offsets deployment and present two approximation algorithms to solve it. One is a 4/3-approximation algorithm that runs in \(O(n^{2})\) time; the other is a fully polynomial-time approximation scheme (FPTAS) with computational complexity \(O(\frac{n^{2}}{\epsilon })\). For the circle case, we solve the problem optimally when \(\alpha = 1\) and present a \(2(\frac{\pi }{2})^\alpha \)-approximation algorithm when \(\alpha > 1\). In addition, we propose an integer linear program (ILP) to minimize the total cost of the barrier coverage problem such that each point of the line interval is covered by at least k sensors.
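The feasibility side of this problem is easy to make concrete: a range assignment covers the interval [0, L] when the union of the sensed intervals [x_i - r_i, x_i + r_i] has no gap, and its quality is the cost C(R). The sketch below is not the paper's 4/3-approximation or FPTAS; it is a hedged checker and cost evaluator with illustrative positions and radii.

```python
# Minimal sketch: verify full coverage of a line interval [0, L] by a range
# assignment, and evaluate its total cost C(R) = sum_i r_i^alpha.
# Positions and radii are illustrative.

def covers_interval(positions, radii, L):
    """True if the union of [x_i - r_i, x_i + r_i] covers [0, L]."""
    intervals = sorted((x - r, x + r) for x, r in zip(positions, radii))
    reach = 0.0
    for lo, hi in intervals:
        if lo > reach:          # gap before the next sensed interval
            return False
        reach = max(reach, hi)
        if reach >= L:
            return True
    return reach >= L

def total_cost(radii, alpha):
    """C(R) = sum of r_i^alpha; alpha models the energy-cost exponent."""
    return sum(r ** alpha for r in radii)

# Three sensors on [0, 6]; radius 1 each yields intervals [0,2],[2,4],[4,6].
xs, rs = [1.0, 3.0, 5.0], [1.0, 1.0, 1.0]
print(covers_interval(xs, rs, 6.0))   # True: no gaps
print(total_cost(rs, 2))              # 3.0 when alpha = 2
```

Shrinking any one radius below 1 here opens a gap, so this assignment is tight; an optimizer (like the paper's algorithms) searches over such feasible assignments for minimum cost.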
915.
916.
Brandon Whitcher, Thomas C. M. Lee, Jeffrey B. Weiss, Timothy J. Hoar & Douglas W. Nychka, Journal of the Royal Statistical Society: Series C (Applied Statistics), 2008, 57(3): 293-312
Summary. The fundamental equations that model turbulent flow do not provide much insight into the size and shape of observed turbulent structures. We investigate the efficient and accurate representation of structures in two-dimensional turbulence by applying statistical models directly to the simulated vorticity field. Rather than extract the coherent portion of the image from the background variation, as in the classical signal-plus-noise model, we present a model for individual vortices using the non-decimated discrete wavelet transform. A template image, which is supplied by the user, provides the features to be extracted from the vorticity field. By transforming the vortex template into the wavelet domain, specific characteristics that are present in the template, such as size and symmetry, are broken down into components that are associated with spatial frequencies. Multivariate multiple linear regression is used to fit the vortex template to the vorticity field in the wavelet domain. Since all levels of the template decomposition may be used to model each level in the field decomposition, the resulting model need not be identical to the template. Application to a vortex census algorithm that records quantities of interest (such as size, peak amplitude and circulation) as the vorticity field evolves is given. The multiresolution census algorithm extracts coherent structures of all shapes and sizes in simulated vorticity fields and can reproduce known physical scaling laws when processing a set of vorticity fields that evolve over time.
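A heavily simplified, one-dimensional toy version conveys why fitting happens in the wavelet domain: transforming both template and field separates coarse and fine features, which regression can then rescale independently. The paper uses the non-decimated 2-D transform and multivariate multiple regression; the sketch below substitutes a one-level decimated Haar transform and per-band least squares, and all signals are made up.

```python
import math

# Toy illustration (NOT the paper's method): fit a 1-D template to a 1-D
# "vorticity" signal by least squares, band by band, after a one-level Haar
# transform. Signals must have even length.

def haar(signal):
    """One-level Haar transform -> (approximation, detail) coefficients."""
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    return approx, detail

def fit_band(template_band, field_band):
    """Least-squares coefficient regressing the field on the template."""
    num = sum(t * f for t, f in zip(template_band, field_band))
    den = sum(t * t for t in template_band)
    return num / den if den else 0.0

template = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0]
field = [3 * t for t in template]    # a "vortex" 3x the template amplitude
coeffs = [fit_band(tb, fb) for tb, fb in zip(haar(template), haar(field))]
print(coeffs)   # both bands recover the amplitude factor 3
```

In the paper's setting the bands can be fitted with different coefficients, which is how the fitted vortex can differ in size or shape from the supplied template.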
917.
The aim of this work is to discuss and investigate measures of divergence and model selection criteria. A recently introduced measure of divergence, the so-called BHHJ measure (Basu, A., Harris, I.R., Hjort, N.L., Jones, M.C., 1998. Robust and efficient estimation by minimising a density power divergence. Biometrika 85, 549–559), is investigated, and a new model selection criterion based on this measure, the divergence information criterion (DIC), is proposed. Simulations are performed to check the appropriateness of the proposed criterion.
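The BHHJ (density power) divergence that underlies the proposed criterion leads to a concrete estimation objective. As a hedged illustration of minimum density power divergence estimation (not the DIC formula itself), the sketch below estimates a normal mean with known variance by grid search; the robustness parameter alpha downweights outliers, and the data are illustrative.

```python
import math

# Sketch of minimum density power divergence (BHHJ) estimation of a normal
# mean with KNOWN sigma. alpha > 0 trades efficiency for robustness
# (alpha -> 0 recovers maximum likelihood). Grid search keeps the example
# dependency-free; data are illustrative.

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def dpd_objective(mu, data, sigma, alpha):
    """H_n(mu) = integral of f^(1+alpha) - (1 + 1/alpha) * mean(f(X_i)^alpha)."""
    # For N(mu, sigma^2): integral of f^(1+alpha) = (2*pi*sigma^2)^(-alpha/2) / sqrt(1+alpha)
    integral = (2 * math.pi * sigma ** 2) ** (-alpha / 2) / math.sqrt(1 + alpha)
    avg = sum(normal_pdf(x, mu, sigma) ** alpha for x in data) / len(data)
    return integral - (1 + 1 / alpha) * avg

def mdpd_mean(data, sigma, alpha, grid):
    """Grid-search minimizer of the DPD objective over candidate means."""
    return min(grid, key=lambda mu: dpd_objective(mu, data, sigma, alpha))

data = [0.1, -0.2, 0.05, 0.15, -0.1, 8.0]    # one gross outlier at 8.0
grid = [i / 100 for i in range(-300, 301)]
sample_mean = sum(data) / len(data)          # pulled toward the outlier
robust = mdpd_mean(data, sigma=1.0, alpha=0.5, grid=grid)
print(round(sample_mean, 2))   # ≈ 1.33
print(robust)                  # stays near 0, largely ignoring the outlier
```

The contrast between the two estimates is the practical point of the divergence: the outlier's contribution enters through f(x)^alpha, which is essentially zero far from the bulk of the data.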
918.
Jing Cao, J. Jack Lee & Susan Alber, Journal of Statistical Planning and Inference, 2009, 139(12): 4111-4122
A challenge in implementing performance-based Bayesian sample size determination is selecting which of several methods to use. We compare three Bayesian sample size criteria: the average coverage criterion (ACC), which controls the coverage rate of fixed-length credible intervals over the predictive distribution of the data; the average length criterion (ALC), which controls the length of credible intervals with a fixed coverage rate; and the worst outcome criterion (WOC), which ensures the desired coverage rate and interval length over all (or a subset of) possible datasets. For most models, the WOC produces the largest sample size among the three criteria, and the sample sizes obtained by the ACC and the ALC are not the same. For Bayesian sample size determination for normal means and differences between normal means, we investigate, for the first time, the direction and magnitude of differences between the ACC and ALC sample sizes. For fixed hyperparameter values, we show that the difference between the ACC and ALC sample sizes depends on the nominal coverage, not on the nominal interval length. There exists a threshold value of the nominal coverage level such that below the threshold the ALC sample size is larger than the ACC sample size, and above the threshold the ACC sample size is larger. Furthermore, the ACC sample size is more sensitive to changes in the nominal coverage. We also show that for fixed hyperparameter values, there exists an asymptotically constant ratio between the WOC sample size and the ALC (ACC) sample size. Simulation studies show that similar relationships among the ACC, ALC, and WOC may hold for estimating binomial proportions. We provide a heuristic argument that the results can be generalized to a larger class of models.
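The ACC/ALC logic is easiest to see in the degenerate corner where everything is closed-form: a normal mean with known data variance and a conjugate normal prior. There the posterior standard deviation does not depend on the observed data, so fixed-coverage length control (ALC) and fixed-length coverage control (ACC) reduce to the same inequality and give the same n; the differences the paper analyzes arise when the posterior length is data-dependent (e.g., unknown variance). All hyperparameter values below are illustrative.

```python
import math
from statistics import NormalDist

# Sketch of ALC and ACC sample sizes for a normal mean with KNOWN sigma^2
# and a conjugate N(mu0, tau^2) prior. Values are illustrative.

def posterior_sd(n, sigma, tau):
    """Posterior sd of the mean after n observations (known-variance model)."""
    return 1.0 / math.sqrt(1.0 / tau ** 2 + n / sigma ** 2)

def alc_n(target_len, coverage, sigma, tau, n_max=10**6):
    """ALC: smallest n whose `coverage` credible interval has length <= target_len."""
    z = NormalDist().inv_cdf(0.5 + coverage / 2)
    for n in range(1, n_max):
        if 2 * z * posterior_sd(n, sigma, tau) <= target_len:
            return n

def acc_n(fixed_len, target_cov, sigma, tau, n_max=10**6):
    """ACC: smallest n whose fixed-length interval reaches coverage >= target_cov."""
    for n in range(1, n_max):
        half = (fixed_len / 2) / posterior_sd(n, sigma, tau)
        if 2 * NormalDist().cdf(half) - 1 >= target_cov:
            return n

n_alc = alc_n(0.5, 0.95, sigma=2.0, tau=5.0)
n_acc = acc_n(0.5, 0.95, sigma=2.0, tau=5.0)
print(n_alc, n_acc)   # equal here because the posterior sd is deterministic
```

When the posterior length varies with the data, each criterion averages a different quantity over the predictive distribution, which is the source of the threshold behavior described in the abstract.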
919.
This paper addresses the problem of identifying groups that satisfy specific conditions on the means of feature variables. In this study, we refer to the identified groups as “target clusters” (TCs). To identify TCs, we propose a method based on the normal mixture model (NMM) restricted by a linear combination of means. We provide an expectation–maximization (EM) algorithm to fit the restricted NMM by the maximum-likelihood method. The convergence property of the EM algorithm and a reasonable set of initial estimates are presented. We demonstrate the method's usefulness and validity through a simulation study and two well-known data sets. The proposed method provides several types of useful clusters that would be difficult to obtain with conventional clustering or exploratory data analysis methods based on the ordinary NMM. A simple comparison with another target clustering approach shows that the proposed method is promising for identifying TCs.
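A simplified version of the restricted-EM idea can be written out for a univariate two-component mixture with a known common variance, where the means must satisfy a linear constraint c1*mu1 + c2*mu2 = d. This is a hedged sketch, not the paper's algorithm: the constrained M-step here solves a weighted least-squares problem via a Lagrange multiplier, and the data, constraint, and starting values are all illustrative.

```python
import math

# Sketch: EM for a 1-D two-component normal mixture with KNOWN common sigma,
# subject to the linear mean constraint c[0]*mu[0] + c[1]*mu[1] == d.
# Illustrative data and settings; no convergence check for brevity.

def em_restricted(data, c, d, sigma, mus, pis, iters=200):
    for _ in range(iters):
        # E-step: responsibilities of each component for each point.
        resp = []
        for x in data:
            dens = [p * math.exp(-0.5 * ((x - m) / sigma) ** 2)
                    for p, m in zip(pis, mus)]
            s = sum(dens)
            resp.append([w / s for w in dens])
        N = [sum(r[k] for r in resp) for k in range(2)]
        m_hat = [sum(r[k] * x for r, x in zip(resp, data)) / N[k]
                 for k in range(2)]
        # Constrained M-step: project the unconstrained means m_hat onto
        # {c . mu = d} via a Lagrange multiplier (weighted least squares).
        lam = (sum(ck * mk for ck, mk in zip(c, m_hat)) - d) \
              / (sigma ** 2 * sum(ck ** 2 / Nk for ck, Nk in zip(c, N)))
        mus = [mk - lam * ck * sigma ** 2 / Nk
               for mk, ck, Nk in zip(m_hat, c, N)]
        pis = [Nk / len(data) for Nk in N]
    return mus, pis

data = [-1.1, -0.9, -1.0, -1.2, 0.9, 1.1, 1.0, 1.2]   # two clusters near -1, +1
mus, pis = em_restricted(data, c=[1.0, 1.0], d=0.0, sigma=0.3,
                         mus=[-0.5, 0.5], pis=[0.5, 0.5])
print(mus[0] + mus[1])   # ≈ 0: the fitted means satisfy mu1 + mu2 = 0
```

The constraint is satisfied exactly after every M-step by construction, which mirrors how a restriction on a linear combination of means can steer the fitted components toward target clusters.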
920.
We consider a fully Bayesian analysis of road casualty data at 56 designated mobile safety camera sites in the Northumbria Police Force area in the UK. It is well documented that regression to the mean (RTM) can exaggerate the effectiveness of road safety measures and, since the 1980s, an empirical Bayes (EB) estimation framework has become the gold standard for separating real treatment effects from those of RTM. In this paper we suggest some diagnostics to check the assumptions underpinning the standard estimation framework. We also show that, relative to a fully Bayesian treatment, the EB method is over-optimistic when quantifying the variability of estimates of casualty frequency. Implementing a fully Bayesian analysis via Markov chain Monte Carlo also provides a more flexible and complete inferential procedure. We assess the sensitivity of estimates of treatment effectiveness, as well as the expected monetary value of prevention owing to the implementation of the safety cameras, to different model specifications, which include the estimation of trend and the construction of informative priors for some parameters.
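The EB baseline being compared against can be sketched with the standard Poisson-gamma model used in road safety work: site counts x_i ~ Poisson(lambda_i) with lambda_i ~ Gamma(a, b), so the EB posterior mean (a + x_i)/(b + 1) shrinks an extreme observed count toward the area-wide mean, which is exactly the RTM correction. The moment-based prior fit and the counts below are illustrative, not the paper's data or model.

```python
# Hedged sketch of empirical Bayes Poisson-gamma shrinkage for RTM.
# Counts are illustrative baseline casualties per site.

def fit_gamma_moments(counts):
    """Method-of-moments Gamma(a, b) prior (mean = a/b, var = a/b^2).
    Assumes overdispersion: sample variance must exceed the sample mean."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((x - mean) ** 2 for x in counts) / (n - 1)
    excess = var - mean          # Poisson variation accounts for `mean`
    b = mean / excess
    a = mean * b
    return a, b

def eb_estimate(x, a, b):
    """EB posterior mean casualty frequency for a site with observed count x."""
    return (a + x) / (b + 1)

counts = [3, 7, 2, 9, 5, 4, 8, 6, 1, 5]
a, b = fit_gamma_moments(counts)
worst = max(counts)                      # selected BECAUSE it is extreme
print(eb_estimate(worst, a, b) < worst)  # True: shrunk toward the mean
```

The paper's point is that this plug-in treats (a, b) as known; a fully Bayesian MCMC analysis propagates their uncertainty, so the EB interval for each site's frequency is over-optimistic by comparison.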