991.
Air pollution is a current and growing concern for Canadians, and there is evidence that ambient levels meeting current exposure standards may be associated with mortality and morbidity in Toronto, Canada. Evaluating exposure is an important step in understanding the relationship between particulate matter (PM) exposure and health outcomes. This report describes the PEARLS model (Particulate Exposure from Ambient to Regional Lung by Subgroup), which predicts PM2.5 (PM with a median aerodynamic diameter of 2.5 μm or less) exposure distributions for 11 age–gender population subgroups in Toronto using Monte Carlo simulation techniques. The model uses physiological and activity-pattern characteristics of each subgroup to determine region-specific lung exposure to PM2.5, defined as the mass of PM2.5 deposited per unit time in each of five lung regions (two extrathoracic, bronchial, bronchiolar, and alveolar). The modeling results predict that children, toddlers, and infants have the broadest distributions of exposure and the greatest chance of experiencing extreme exposures in the alveolar region of the lung. Importance analysis indicates that the most influential model variables are the air exchange rate into indoor environments, time spent outdoors, and time spent at high activity levels. Additionally, a "critical point" was defined and introduced into the PEARLS model to investigate the effects of possible threshold-pathogenic phenomena on subgroup exposure patterns. The analysis indicates that the subgroups initially predicted to be most highly exposed were also likely to have the highest proportion of their population exposed above the critical point. Substantial exposures above the critical point were predicted in all subgroups for ambient PM2.5 concentrations commonly observed in Toronto after continuous exposure of 24 hours or more.
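The Monte Carlo approach described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the PEARLS model itself: every distribution and parameter value below (ambient concentration, infiltration ratio, ventilation rate, alveolar deposition fraction) is invented for the sketch.

```python
import random

def simulate_alveolar_dose(n=10_000, seed=0):
    """Sample a daily alveolar PM2.5 dose (ug/day) for one hypothetical subgroup."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        ambient = rng.lognormvariate(2.3, 0.5)   # ambient PM2.5, ug/m^3 (invented)
        infiltration = rng.uniform(0.3, 0.9)     # indoor/outdoor ratio, driven by air exchange
        hours_out = rng.uniform(0.5, 4.0)        # time spent outdoors, hours
        ventilation = rng.uniform(0.4, 1.2)      # breathing rate, m^3/hour
        dep_frac = 0.15                          # alveolar deposition fraction (invented)
        # time-weighted exposure concentration over a 24-hour day
        conc = (ambient * hours_out + ambient * infiltration * (24 - hours_out)) / 24
        doses.append(conc * ventilation * 24 * dep_frac)
    return doses

doses = simulate_alveolar_dose()
print(f"mean simulated alveolar dose: {sum(doses) / len(doses):.1f} ug/day")
```

Varying the activity-pattern inputs per subgroup would reproduce the kind of between-subgroup comparison the report makes; the broader the input distributions, the broader the resulting exposure distribution.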
992.
Using a survey, this paper provides information about the current state of performance management (appraisal) in a sample of UK‐based EFQM‐affiliated organizations. It focuses on several critical issues of performance management in the context of TQM, including: the effectiveness of TQM programmes; the rationale for performance management; the degree of internal consistency between TQM assumptions and performance management systems; and the relationship among performance management, effectiveness of TQM programmes, employee satisfaction and overall organization performance. Although the fundamental precepts advocated by the founders of TQM appear to be in conflict with performance management practices, the article argues that, rather than being contradictory, each can add value to the operations of the other in the interest of the organization as a whole. More precisely, the paper explains how a successful TQM strategy requires rethinking and changing the organization's performance management system; otherwise, failure is highly likely. To conclude, the survey evidence is combined with previous literature to discuss the implications of these results for designing a contextually appropriate performance management system for TQM, and for future research on TQM and HRM.
993.
The problem of no‐shows (patients who do not arrive for scheduled appointments) is particularly significant for health care clinics, with reported no‐show rates varying widely from 3% to 80%. No‐shows reduce revenues and provider productivity, increase costs, and limit patient access by reducing effective clinic capacity. In this article, we construct a flexible appointment scheduling model to mitigate the detrimental effects of patient no‐shows, and develop a fast and effective solution procedure that constructs near‐optimal overbooked appointment schedules that balance the benefits of serving additional patients with the potential costs of patient waiting and clinic overtime. Computational results demonstrate the efficacy of our model and solution procedure, and connect our work to prior research in health care appointment scheduling.
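As a rough illustration of the trade-off such a model balances, the toy simulation below scores one overbooked schedule: each booked patient shows with probability p, served patients earn revenue, queueing incurs a waiting cost, and work past the session end incurs an overtime cost. This is not the paper's model or solution procedure; every parameter value is invented.

```python
import random

def expected_profit(slots=8, booked=10, p_show=0.7, slot_min=20,
                    revenue=100, wait_cost=1.0, ot_cost=2.0,
                    reps=5000, seed=1):
    """Monte Carlo estimate of profit for a crude overbooked schedule."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        # assign each booked patient to a random slot (a crude schedule),
        # keeping only those who actually show up
        arrivals = sorted(rng.randrange(slots) * slot_min
                          for _ in range(booked) if rng.random() < p_show)
        clinic_free = 0.0
        profit = 0.0
        for t in arrivals:
            start = max(t, clinic_free)              # wait if the clinic is busy
            profit += revenue - wait_cost * (start - t)
            clinic_free = start + slot_min
        profit -= ot_cost * max(0.0, clinic_free - slots * slot_min)
        total += profit
    return total / reps

print(f"expected profit, 10 bookings for 8 slots: {expected_profit():.0f}")
```

Sweeping `booked` over a range and picking the maximizer is the brute-force analogue of the near-optimal overbooking the paper computes efficiently.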
994.
J. A. John & E. R. Williams, Journal of the Royal Statistical Society, Series C (Applied Statistics), 1997, 46(2): 207-214
Two-replicate row–column designs are often used for field trials in multisite tree or plant breeding programmes. With only two replicates for each trial, it is important to use designs with optimal or near-optimal efficiency factors. This paper presents an algorithm for generating such designs. The method extends the contraction approach of Bailey and Patterson to any set of parameters and uses the factorial design construction algorithm of Williams and John to generate designs. In our experience, the algorithm produces designs at least as good as those obtained by other recent computer algorithms, and often much better and more quickly generated.
995.
Louise Choo & Stephen G. Walker, Journal of the Royal Statistical Society, Series A (Statistics in Society), 2008, 171(2): 395-405
Summary. For rare diseases the observed disease count may exhibit extra Poisson variability, particularly in areas with low or sparse populations. Hence the variance of the estimates of disease risk, the standardized mortality ratios, may be highly unstable. This overdispersion must be taken into account, otherwise subsequent maps based on standardized mortality ratios will be misleading and, rather than displaying the true spatial pattern of disease risk, will highlight the most extreme values. Neighbouring areas tend to exhibit spatial correlation, as they may share more similarities than non-neighbouring areas. The need to address overdispersion and spatial correlation has led to the proposal of Bayesian approaches for smoothing estimates of disease risk. We propose a new model for investigating the spatial variation of disease risks, in conjunction with an alternative specification for estimates of disease risk in geographical areas: the multivariate Poisson–gamma model. The main advantages of this new model lie in its simplicity and its ability to account naturally for overdispersion and spatial autocorrelation. Exact expressions for important quantities such as expectations, variances and covariances can be easily derived.
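The shrinkage idea behind gamma smoothing of SMRs is easiest to see in the univariate conjugate case: with relative risk theta_i ~ Gamma(a, b) and counts O_i ~ Poisson(E_i * theta_i), the posterior mean of theta_i is (a + O_i) / (b + E_i), which pulls unstable SMRs from sparsely populated areas toward the prior mean a/b. The sketch below illustrates only this univariate case, with invented numbers; the paper's multivariate Poisson–gamma model additionally accounts for spatial correlation between neighbouring areas.

```python
import math
import random

def smoothed_risk(observed, expected, a=2.0, b=2.0):
    """Posterior mean relative risk under the conjugate Poisson-gamma model."""
    return (a + observed) / (b + expected)

def poisson(rng, lam):
    """Knuth's multiplication method; adequate for small lam."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(7)
a, b = 2.0, 2.0                          # prior mean risk a/b = 1.0 (invented)
for expected in (0.8, 4.0, 25.0):        # sparse areas give the least stable SMRs
    theta = rng.gammavariate(a, 1.0 / b) # true relative risk
    observed = poisson(rng, expected * theta)
    print(f"E={expected:5.1f}  O={observed:3d}  SMR={observed / expected:5.2f}  "
          f"smoothed={smoothed_risk(observed, expected):5.2f}")
```

Because (a + O)/(b + E) always lies between a/b and O/E, the smaller the expected count E, the more strongly the estimate is shrunk toward the prior mean.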
996.
Frank Rijmen, Edward H. Ip, Stephen Rapp & Edward G. Shaw, Journal of the Royal Statistical Society, Series A (Statistics in Society), 2008, 171(3): 739-753
Summary. Primary and metastatic brain tumour patients are treated with surgery, radiation therapy and chemotherapy. Such treatments often result in short- and long-term symptoms that impact cognitive, emotional and physical function. Therefore, understanding the transition of symptom burden over time is important for guiding treatment and follow-up of brain tumour patients with symptom-specific interventions. We describe the use of a hidden Markov model with person-specific random effects for the temporal pattern of symptom burden. Clinically relevant covariates are also incorporated in the analysis through the use of generalized linear models.
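The backbone of such an analysis is the hidden Markov machinery itself. The sketch below runs the forward algorithm for a hypothetical two-state burden process ("low" vs "high" symptom burden) emitting "mild" or "severe" symptom reports at each visit; all probabilities are invented, and the paper's model further adds person-specific random effects and GLM-style covariates on top of this structure.

```python
states = ("low", "high")
start = {"low": 0.7, "high": 0.3}
trans = {"low": {"low": 0.8, "high": 0.2},
         "high": {"low": 0.3, "high": 0.7}}
# emission: probability of each symptom report given the hidden burden state
emit = {"low": {"mild": 0.9, "severe": 0.1},
        "high": {"mild": 0.2, "severe": 0.8}}

def forward_likelihood(obs):
    """P(observation sequence) via the forward algorithm."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit[s][o] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

print(forward_likelihood(("mild", "severe", "severe")))
```

In the paper's setting, the transition and emission probabilities would themselves depend on covariates and a person-specific random effect, so each patient effectively gets their own version of these matrices.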
997.
Stephen Senn, Significance, 2008, 5(1): 37-39
One hundred years ago, an author writing under the pseudonym of Student published a paper that was to become famous. It was entitled 'The probable error of a mean'. But what we now know as Student's t-test attracted little attention at first. It took another statistician of genius, R. A. Fisher, to amend it, publicise it and make it ubiquitous. But both Student's and Fisher's published versions were based upon faulty data. Stephen Senn reminds us of the third dedicated researcher, and of the quarter-century delay before the story behind Student's t-test emerged.
998.
In astronomy, multiple images are frequently obtained at the same position of the sky for subsequent coaddition, as this helps one go deeper and look for fainter objects. With large-scale panchromatic synoptic surveys becoming more common, image coaddition has become even more necessary, as new observations are now compared with a coadded fiducial sky in real time. Standard coaddition techniques have included straight averages, variance-weighted averages, medians, etc. A more sophisticated nonlinear-response chi-square method is also used when it is known that the data are background-noise limited and the point spread function is homogenized in all channels. A more robust object detection technique is described, capable of detecting faint sources, even those not seen at all epochs, which would normally be smoothed out by traditional methods. The analysis at each pixel level is based on a formula similar to the Mahalanobis distance.
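The contrast between linear coaddition and a chi-square-style statistic can be seen in a toy per-pixel example: a source bright in only one epoch is diluted by a straight mean, but stands out in a sum of squared, noise-normalised deviations. The numbers below are invented, and this is only a schematic of the general idea, not the specific method of the paper.

```python
def straight_mean(stack):
    """Plain average of one pixel's values across epochs."""
    return sum(stack) / len(stack)

def weighted_mean(stack, variances):
    """Inverse-variance weighted average across epochs."""
    w = [1.0 / v for v in variances]
    return sum(wi * x for wi, x in zip(w, stack)) / sum(w)

def chi_square(stack, variances, background=0.0):
    """Sum of squared, noise-normalised deviations from the background;
    a large value flags a detection at any epoch."""
    return sum((x - background) ** 2 / v for x, v in zip(stack, variances))

faint_transient = [0.1, 0.0, 5.0, 0.2]   # bright in one epoch only (invented)
noise_var = [1.0, 1.0, 1.0, 1.0]
print(straight_mean(faint_transient))        # diluted by the quiet epochs
print(chi_square(faint_transient, noise_var))  # dominated by the bright epoch
```

With equal variances the weighted mean reduces to the straight mean, which is why neither linear scheme can recover a source present at a single epoch as well as the quadratic statistic does.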
999.
We present an uncertainty analysis conducted using CETA-R, a model in which the costs of climate change are specified as risks of large losses. In this analysis, we assume that three key parameters may each take on "high" or "low" values, leading to eight possible states of the world. We then explore optimal policies when the state of the world is known, and under uncertainty. We also estimate the benefits of resolving uncertainty earlier. We find that the optimal policy under uncertainty is similar to the policy that is optimal when each of the key parameters is at its low value. We also find that the value of immediate uncertainty resolution rises sharply as the alternative to immediate resolution is increasingly delayed resolution.
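The structure of the analysis (three binary parameters, hence 2 x 2 x 2 = 8 equally weighted states) lends itself to a small enumeration. The sketch below contrasts committing to one policy before learning the state with choosing per-state after resolution; the gap between the two expected costs is the value of resolving uncertainty. The cost function and all numbers are invented for illustration and bear no relation to CETA-R itself.

```python
from itertools import product

def cost(policy, climate_sens, damage, growth):
    # toy loss: abatement cost grows with the policy level, while residual
    # damage falls with it and rises with how many parameters are "high"
    severity = climate_sens + damage + growth   # number of "high" parameters, 0..3
    return policy ** 2 + (3 - policy) * severity

states = list(product((0, 1), repeat=3))        # low/high for the 3 parameters
policies = [0, 1, 2, 3]

# commit now, before learning the state of the world
best_now = min(policies,
               key=lambda p: sum(cost(p, *s) for s in states) / len(states))
cost_now = sum(cost(best_now, *s) for s in states) / len(states)

# learn the state first, then pick the best policy in each state
cost_learn = sum(min(cost(p, *s) for p in policies) for s in states) / len(states)

print(f"expected cost without learning:      {cost_now:.3f}")
print(f"expected cost with early resolution: {cost_learn:.3f}")
print(f"value of resolving uncertainty:      {cost_now - cost_learn:.3f}")
```

The first expected cost can never be lower than the second, since committing early is always an option after learning; the difference is the (non-negative) value of information.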
1000.
A new type of PERT/CPM methodology is introduced whereby the individual activities within a project management network are endowed with resistive and capacitive elements. This methodology will enable us to mathematically define a completion time for the various activities and to relate the completion time to relevant cost in the completion of a particular task. It will also allow us to define the work done, and the rate at which work is being done, in an activity as a function of the applied effort and resource outlay. As a consequence, both the resource expenditure and the work done can be tracked within a network as a function of time.
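One plausible reading of the electrical analogy, offered purely as a guess for illustration and not as the paper's actual formulation: treat applied effort as a voltage driving work into an activity through a resistance R into a capacitance C, so that work accumulates like charge on a capacitor, W(t) = W_max * (1 - e^(-t/RC)). The R, C, and 95% completion threshold below are invented.

```python
import math

def work_done(t, w_max, r, c):
    """Work accumulated by time t under the RC charging analogy."""
    return w_max * (1.0 - math.exp(-t / (r * c)))

def completion_time(r, c, fraction=0.95):
    """Invert W(t) = fraction * w_max  ->  t = -RC * ln(1 - fraction)."""
    return -r * c * math.log(1.0 - fraction)

r, c, w_max = 2.0, 1.5, 100.0
t_done = completion_time(r, c)
print(f"95% completion at t = {t_done:.2f} time units")
print(f"work done at that time: {work_done(t_done, w_max, r, c):.1f}")
```

Under this reading, the rate of work dW/dt = (w_max / RC) * e^(-t/RC) is highest at the start and decays as the activity nears completion, which is one way completion time could be tied to effort and resource outlay as the abstract describes.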