A total of 8,063 results were found (search time: 15 ms).
61.
Factor analytic variance models have been widely considered for the analysis of multivariate data, particularly in psychometrics. Recently, Smith, Cullis & Thompson (2001) considered their use in the analysis of multi-environment data arising from plant improvement programs. For these data, the size of the problem and the complexity of the variance models chosen to account for spatial heterogeneity within trials imply that standard algorithms for fitting factor analytic models can be computationally expensive. This paper presents a sparse implementation of the average information algorithm (Gilmour, Thompson & Cullis, 1995) for fitting factor analytic and reduced rank variance models.
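These models constrain the between-environment covariance matrix to the reduced-rank form Σ = ΛΛ' + Ψ. The sketch below illustrates that structure on simulated genotype-by-environment data using ordinary maximum-likelihood factor analysis from scikit-learn; it is only an illustration of the covariance structure, not the sparse average information REML algorithm the paper develops, and the dimensions and data are hypothetical.

```python
# A minimal sketch (hypothetical data and dimensions) of the reduced-rank covariance
# structure Sigma = Lambda Lambda' + Psi that factor analytic variance models impose.
# Fitted here by ordinary maximum likelihood with scikit-learn, not by the sparse
# average information REML algorithm developed in the paper.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulated genotype-by-environment table: 200 genotypes scored in 10 environments.
n_genotypes, n_envs, n_factors = 200, 10, 2
true_loadings = rng.normal(size=(n_envs, n_factors))
scores = rng.normal(size=(n_genotypes, n_factors))
Y = scores @ true_loadings.T + rng.normal(scale=0.5, size=(n_genotypes, n_envs))

fa = FactorAnalysis(n_components=n_factors).fit(Y)

Lambda = fa.components_.T            # (n_envs, n_factors) estimated loadings
Psi = np.diag(fa.noise_variance_)    # environment-specific residual variances
Sigma_hat = Lambda @ Lambda.T + Psi  # implied between-environment covariance

print(np.allclose(Sigma_hat, fa.get_covariance()))  # True: same reduced-rank structure
```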
62.
The authors provide an overview of optimal scaling results for the Metropolis algorithm with a Gaussian proposal distribution. They address in more depth the case of high-dimensional target distributions formed of independent but not identically distributed components. They give an intuitive explanation of why the well-known optimal acceptance rate of 0.234 is not always suitable, show how to find the asymptotically optimal acceptance rate when needed, and explain why it is sometimes necessary to turn to inhomogeneous proposal distributions. Their results are illustrated with a simple example.
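For context, the sketch below runs a plain random-walk Metropolis sampler with a Gaussian proposal on a simulated high-dimensional target whose independent components have unequal scales, and reports the empirical acceptance rate. The component scales and the classical 2.38/sqrt(d) starting value for the proposal scale are assumptions for illustration, not the authors' setup.

```python
# A minimal random-walk Metropolis sketch: Gaussian proposal, high-dimensional target
# with independent but not identically distributed Gaussian components (scales assumed
# here for illustration). Reports the empirical acceptance rate for a given proposal scale.
import numpy as np

rng = np.random.default_rng(1)

d = 100
sigmas = np.linspace(0.5, 2.0, d)  # unequal component scales (hypothetical)

def log_target(x):
    return -0.5 * np.sum((x / sigmas) ** 2)

def acceptance_rate(n_iter, proposal_scale):
    x = np.zeros(d)
    lp = log_target(x)
    accepted = 0
    for _ in range(n_iter):
        proposal = x + proposal_scale * rng.normal(size=d)  # Gaussian random walk
        lp_prop = log_target(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:            # Metropolis accept/reject
            x, lp = proposal, lp_prop
            accepted += 1
    return accepted / n_iter

# The classical 2.38 / sqrt(d) scaling targets an acceptance rate near 0.234 for
# i.i.d. unit-scale components; with heterogeneous scales the realised rate can drift,
# which is the kind of situation the overview above discusses.
print(acceptance_rate(20_000, 2.38 / np.sqrt(d)))
```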
63.
This study examines the relationships among personal coping resources, social support, external coping resources, job stressors, and job strains in a sample of 110 American Telephone and Telegraph employees undergoing a major organizational restructuring. The study expanded on a model suggested by Ashford (1988) by defining another category of coping resources that employees may draw upon to deal with the stressors and strains that occur during major organizational changes. External coping resources were defined as those that provide employees with a sense of 'vicarious control' in stressful situations. Results indicated that personal coping resources, social support, and external coping resources had a direct effect on job stressor and strain levels. No 'buffering' effect of these coping resources was found. Hierarchical regression analyses indicated that external coping resources added to the prediction of job stressors and strains even when personal coping resources and social support were entered first into the prediction equations.
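A minimal sketch of this kind of hierarchical (blockwise) regression is given below, using simulated data and hypothetical variable names rather than the study's actual measures: personal resources and social support are entered first, and the increment in R² from adding external coping resources is then inspected.

```python
# A minimal sketch of hierarchical (blockwise) regression with simulated data and
# hypothetical variable names, not the study's actual measures: personal resources and
# social support enter first, then external coping resources, and the R^2 increment
# attributable to the added block is inspected.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 110  # sample size matching the abstract; data otherwise simulated

personal = rng.normal(size=n)   # hypothetical "personal coping resources" score
support = rng.normal(size=n)    # hypothetical "social support" score
external = rng.normal(size=n)   # hypothetical "external coping resources" score
strain = -0.3 * personal - 0.2 * support - 0.25 * external + rng.normal(size=n)

step1 = sm.OLS(strain, sm.add_constant(np.column_stack([personal, support]))).fit()
step2 = sm.OLS(strain, sm.add_constant(np.column_stack([personal, support, external]))).fit()

print(f"R^2, step 1 (personal + support):          {step1.rsquared:.3f}")
print(f"R^2, step 2 (+ external coping resources): {step2.rsquared:.3f}")
print(f"Increment in R^2 from the added block:     {step2.rsquared - step1.rsquared:.3f}")
```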
64.
Making ends meet: perceptions of poverty in Sweden
Since the Second World War, Sweden has built a welfare system based on labor market participation and income maintenance. Low unemployment and decent wages are supposed to guarantee people a labor market income or income maintenance, which in turn should provide a proper standard of living for everyone. However, a rapid increase in unemployment and economic problems have made the future of the Swedish welfare state more uncertain than ever. These circumstances have, among other things, led to the suggestion that Sweden should abandon the income maintenance policy and create a social policy system with the more limited ambition of guaranteeing everyone a minimum income. In that case, one central question must be answered: what constitutes a decent minimum income in today's Sweden? Where should we draw the poverty line below which people should not be forced to live? These questions are central to the current debate. The consensual poverty line method is used in this article to derive a poverty line relevant for today's Sweden. The results show that more than one in five households has an income below the consensual poverty line; that is, an income that most Swedes would argue is too low to make ends meet. The level of the consensual poverty line was compared with the National Board of Health and Welfare's guidelines for social assistance: the consensual poverty line was more generous to small households, while the norm for social assistance was more generous to larger households. Finally, the expenditure required to guarantee all Swedish households a minimum income equal to the consensual poverty line was estimated at more than SEK 25 billion per year. The results cast serious doubt on the ability of the Swedish welfare state to secure a decent income for all citizens.
65.
The goal of Louisiana's 1990–1991 comparative risk project, also called the Louisiana Environmental Action Plan (LEAP), was to incorporate risk assessment into state environmental planning and policymaking. Scientists, government officials, and citizens were brought together to estimate the relative risk to human health, natural resources, and quality of life posed by 33 selected environmental issues. The issues were then ranked according to their relative estimated risks. It was hoped that this ranking of "comparative risks" would enable state policymakers to target the most important environmental problems and allocate scarce public resources more rationally and efficiently. As a result of the project, the governor issued an Executive Order forming a permanent Public Advisory Committee to continue this type of comparative risk assessment in Louisiana.
66.
To help design vaccines for acquired immune deficiency syndrome that protect broadly against many genetic variants of the human immunodeficiency virus, the mutation rates at 118 positions in HIV amino-acid sequences of subtype C were compared with those of subtype B. The false discovery rate (FDR) multiple-comparisons procedure can be used to determine statistical significance. When the test statistics have discrete distributions, the FDR procedure can be made more powerful by a simple modification. The paper develops a modified FDR procedure for discrete data and applies it to the human immunodeficiency virus data. The new procedure detects 15 positions with significantly different mutation rates, compared with 11 detected by the original FDR method. Simulations delineate the conditions under which the modified FDR procedure confers large gains in power over the original technique. In general, FDR adjustment methods can be improved for discrete data by incorporating the proposed modification.
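For reference, the sketch below implements the standard (continuous-data) Benjamini-Hochberg step-up procedure on a vector of p-values; the paper's less conservative modification for discrete test statistics is not reproduced, and the p-values are simulated rather than taken from the HIV data.

```python
# A minimal sketch of the standard Benjamini-Hochberg step-up FDR procedure on a
# vector of p-values (simulated here, not the HIV data). The paper's modification
# for discrete test statistics is not reproduced.
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of rejections controlling the FDR at level q."""
    pvals = np.asarray(pvals, dtype=float)
    m = len(pvals)
    order = np.argsort(pvals)
    thresholds = q * np.arange(1, m + 1) / m      # step-up thresholds i*q/m
    below = pvals[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()            # largest i with p_(i) <= i*q/m
        reject[order[: k + 1]] = True
    return reject

rng = np.random.default_rng(3)
# 118 "positions", as in the comparison above: most null, a few with tiny p-values.
pvals = np.concatenate([rng.uniform(size=103), rng.uniform(0.0, 0.001, size=15)])
print(int(benjamini_hochberg(pvals).sum()), "positions flagged at q = 0.05")
```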
67.
68.
69.
70.
Recent studies demonstrating a concentration dependence of elimination of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) suggest that previous estimates of exposure for occupationally exposed cohorts may have underestimated actual exposure, resulting in a potential overestimate of the carcinogenic potency of TCDD in humans based on the mortality data for these cohorts. Using a database on U.S. chemical manufacturing workers potentially exposed to TCDD compiled by the National Institute for Occupational Safety and Health (NIOSH), we evaluated the impact of using a concentration- and age-dependent elimination model (CADM) (Aylward et al., 2005) on estimates of serum lipid area under the curve (AUC) for the NIOSH cohort. These data were used previously by Steenland et al. (2001), in combination with a first-order elimination model with an 8.7-year half-life, to estimate cumulative serum lipid concentration (equivalent to AUC) for these workers for use in cancer dose-response assessment. Serum lipid TCDD measurements taken in 1988 for a subset of the cohort were combined with the NIOSH job exposure matrix and work histories to estimate dose rates per unit of exposure score. We evaluated the effect of choices in regression model (regression on untransformed vs. ln-transformed data and inclusion of a nonzero regression intercept) as well as the impact of choices of elimination models and parameters on estimated AUCs for the cohort. Central estimates for dose rate parameters derived from the serum-sampled subcohort were applied with the elimination models to time-specific exposure scores for the entire cohort to generate AUC estimates for all cohort members. Use of the CADM resulted in improved model fits to the serum sampling data compared to the first-order models. Dose rates varied by a factor of 50 among different combinations of elimination model, parameter sets, and regression models. Use of the CADM resulted in increases of up to five-fold in AUC estimates for the more highly exposed members of the cohort compared to estimates obtained using the first-order model with an 8.7-year half-life. This degree of variation in the AUC estimates would substantially affect the cancer potency estimates derived from the cohort's mortality data. Such variability and uncertainty in the reconstructed serum lipid AUC estimates, depending on elimination model, parameter set, and regression model, have not been described previously and are critical components in evaluating the dose-response data from occupationally exposed populations.
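The simpler of the two elimination assumptions described above can be sketched directly: the code below superposes first-order decay with an 8.7-year half-life over a hypothetical exposure history and integrates the resulting serum-lipid concentration curve to obtain an AUC. The concentration- and age-dependent model (CADM) of Aylward et al. (2005) is not reproduced, and all exposure values are illustrative.

```python
# A minimal sketch of the simpler elimination assumption above: first-order decay with
# an 8.7-year half-life applied to a hypothetical exposure history, with serum-lipid
# AUC obtained by trapezoidal integration. The concentration- and age-dependent model
# (CADM) of Aylward et al. (2005) is not reproduced here.
import numpy as np

HALF_LIFE_YEARS = 8.7
k_elim = np.log(2) / HALF_LIFE_YEARS              # first-order elimination rate (1/yr)

def serum_concentration(times, dose_times, dose_increments):
    """Superpose first-order decay of each exposure increment (arbitrary units)."""
    conc = np.zeros_like(times, dtype=float)
    for t0, dose in zip(dose_times, dose_increments):
        later = times >= t0
        conc[later] += dose * np.exp(-k_elim * (times[later] - t0))
    return conc

times = np.linspace(0.0, 40.0, 2001)              # 40 years of follow-up
dose_times = [0, 1, 2, 3, 4]                      # hypothetical yearly exposure events
dose_increments = [50.0] * 5                      # hypothetical serum-lipid increments

conc = serum_concentration(times, dose_times, dose_increments)
auc = np.sum(0.5 * (conc[1:] + conc[:-1]) * np.diff(times))  # trapezoidal AUC
print(f"Serum-lipid AUC under the first-order model: {auc:.1f} (units x years)")
```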