21.
Compliance Versus Risk in Assessing Occupational Exposures   (cited 1 time: 0 self-citations, 1 by others)
Assessments of occupational exposures to chemicals are generally based upon the practice of compliance testing in which the probability of compliance is related to the exceedance [γ, the likelihood that any measurement would exceed an occupational exposure limit (OEL)] and the number of measurements obtained. On the other hand, workers’ chronic health risks generally depend upon cumulative lifetime exposures which are not directly related to the probability of compliance. In this paper we define the probability of “overexposure” (θ) as the likelihood that individual risk (a function of cumulative exposure) exceeds the risk inherent in the OEL (a function of the OEL and duration of exposure). We regard θ as a relevant measure of individual risk for chemicals, such as carcinogens, which produce chronic effects after long-term exposures but not necessarily for acutely-toxic substances which can produce effects relatively quickly. We apply a random-effects model to data from 179 groups of workers, exposed to a variety of chemical agents, and obtain parameter estimates for the group mean exposure and the within- and between-worker components of variance. These estimates are then combined with OELs to generate estimates of γ and θ. We show that compliance testing can significantly underestimate the health risk when sample sizes are small. That is, there can be large probabilities of compliance with typical sample sizes, despite the fact that large proportions of the working population have individual risks greater than the risk inherent in the OEL. We demonstrate further that, because the relationship between θ and γ depends upon the within- and between-worker components of variance, it cannot be assumed a priori that exceedance is a conservative surrogate for overexposure. 
Thus, we conclude that assessment practices which focus upon either compliance or exceedance are problematic and recommend that employers evaluate exposures relative to the probabilities of overexposure.
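The exceedance γ can be made concrete with a small simulation. The sketch below assumes a lognormal exposure model with between- and within-worker variance components; all parameter values are hypothetical and not taken from the paper's 179 worker groups. Under that model, γ is simply the probability that a single measurement exceeds the OEL.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Hypothetical lognormal exposure model (illustrative values only):
mu_y     = 0.0   # log of the group geometric-mean exposure
sigma_b2 = 0.3   # between-worker variance component (log scale)
sigma_w2 = 0.8   # within-worker variance component (log scale)
oel      = 5.0   # occupational exposure limit, same units as exposure

# Exceedance gamma: P(X > OEL) when log X ~ N(mu_y, sigma_b2 + sigma_w2).
def normal_sf(z):
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

gamma = normal_sf((np.log(oel) - mu_y) / np.sqrt(sigma_b2 + sigma_w2))

# Monte Carlo check: draw worker means from the between-worker distribution,
# then repeated shift measurements around each worker's mean.
n_workers, n_shifts = 2000, 50
worker_means = rng.normal(mu_y, np.sqrt(sigma_b2), size=n_workers)
log_x = rng.normal(worker_means[:, None], np.sqrt(sigma_w2),
                   size=(n_workers, n_shifts))
gamma_mc = (log_x > np.log(oel)).mean()
print(round(gamma, 4), round(gamma_mc, 4))
```

The analytic and Monte Carlo values agree; the paper's point is that this single-measurement probability need not track the proportion of workers whose cumulative exposure exceeds the risk inherent in the OEL.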
22.
23.
We show that, in the context of double-bootstrap confidence intervals, linear interpolation at the second level of the double bootstrap can reduce the simulation error component of coverage error by an order of magnitude. Intervals that are indistinguishable in terms of coverage error with theoretical, infinite simulation, double-bootstrap confidence intervals may be obtained at substantially less computational expense than by using the standard Monte Carlo approximation method. The intervals retain the simplicity of uniform bootstrap sampling and require no special analysis or computational techniques. Interpolation at the first level of the double bootstrap is shown to have a relatively minor effect on the simulation error.
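A minimal sketch of the kind of computation involved: a generic double-bootstrap calibration of a percentile interval for the mean (not the authors' exact algorithm), where the second level evaluates the empirical CDF by linear interpolation between order statistics rather than by raw counting.

```python
import numpy as np

rng = np.random.default_rng(1)

def double_boot_calibrated_ci(x, alpha=0.05, B1=500, B2=100):
    """Double-bootstrap calibration of a percentile CI for the mean.

    Illustrative sketch: the second level estimates, via linear
    interpolation of the empirical CDF, where the original estimate
    falls within each first-level resample's bootstrap distribution,
    and those positions calibrate the percentile levels used.
    """
    n = len(x)
    theta_hat = x.mean()
    first_level = np.empty(B1)
    u = np.empty(B1)  # interpolated second-level CDF value at theta_hat
    for b in range(B1):
        xb = rng.choice(x, n, replace=True)
        first_level[b] = xb.mean()
        second = np.sort([rng.choice(xb, n, replace=True).mean()
                          for _ in range(B2)])
        # Linear interpolation of the empirical CDF (vs. a step function).
        u[b] = np.interp(theta_hat, second,
                         (np.arange(1, B2 + 1) - 0.5) / B2)
    # Calibrated quantile levels replace the nominal alpha/2, 1 - alpha/2.
    lo = np.quantile(u, alpha / 2)
    hi = np.quantile(u, 1 - alpha / 2)
    return (np.quantile(first_level, lo), np.quantile(first_level, hi))

x = rng.normal(10.0, 2.0, size=40)
ci = double_boot_calibrated_ci(x)
print(ci)
```

The abstract's claim is that the interpolation step lets a modest second-level budget `B2` behave, in simulation-error terms, like a far larger one.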
24.
This paper considers the estimation of Cobb-Douglas production functions using panel data covering a large sample of companies observed for a small number of time periods. GMM estimators have been found to produce large finite-sample biases when using the standard first-differenced estimator. These biases can be dramatically reduced by exploiting reasonable stationarity restrictions on the initial conditions process. Using data for a panel of R&D-performing US manufacturing companies we find that the additional instruments used in our extended GMM estimator yield much more reasonable parameter estimates.
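The mechanism can be illustrated on a toy AR(1) panel with fixed effects (a deliberate simplification of the production-function setting; the data and moment conditions below are illustrative, not the paper's). When the series is persistent, lagged levels are weak instruments for first differences, while the extra "levels" moment, valid under stationary initial conditions, identifies the parameter much more strongly.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated AR(1) panel with fixed effects: y_it = rho*y_{i,t-1} + eta_i + v_it.
N, T, rho = 2000, 6, 0.9
eta = rng.normal(0.0, 1.0, N)
y = np.zeros((N, T))
y[:, 0] = eta / (1 - rho) + rng.normal(0.0, 1.0, N)  # stationary initial conditions
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + eta + rng.normal(0.0, 1.0, N)

dy = np.diff(y, axis=1)  # dy[:, k] = Delta y_{i,k+1}

# First-differenced IV (Arellano-Bond style, single instrument):
# Delta y_t = rho * Delta y_{t-1} + Delta v_t, instrumented by y_{t-2}.
z = y[:, :-2].ravel()    # y_{t-2},       t = 2..T-1
x = dy[:, :-1].ravel()   # Delta y_{t-1}
g = dy[:, 1:].ravel()    # Delta y_t
rho_diff = (z * g).sum() / (z * x).sum()

# Additional levels moment (Blundell-Bond style "system" instrument):
# y_t = rho * y_{t-1} + (eta_i + v_t), instrumenting y_{t-1} with
# Delta y_{t-1}, valid under the stationarity restriction on y_{i,0}.
zl = dy[:, :-1].ravel()  # Delta y_{t-1}
xl = y[:, 1:-1].ravel()  # y_{t-1}
gl = y[:, 2:].ravel()    # y_t
rho_lev = (zl * gl).sum() / (zl * xl).sum()

print(round(rho_diff, 3), round(rho_lev, 3))
```

With `rho = 0.9` the differenced estimator is erratic while the levels-moment estimator sits close to the truth, mirroring the finite-sample bias reduction the abstract reports.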
25.
In the Brazilian Amazon during the 1980s, urban population growth outstripped rural growth, and by 1991, most of the region's population resided in urban areas. Much of this urban growth involved establishment of unplanned housing with inadequate infrastructure, which resulted in rising pollution. This paper compares indicators of environmental quality in urban populations of the Amazon in 1980 and 1991, and among different kinds of urban populations in 1991. The results show that environmental quality in the region deteriorated during the 1980s as the production of and exposure to environmental hazards rose while resources to ward off hazards eroded. The findings also show that environmental quality was particularly poor in more rapidly growing urban centers. The urban Amazon may not afford an adequate standard of living, and this may generate out-migration from the region.
26.
This paper proposes an interpretation of the pure capital rationing problem as it is faced by many managers in decentralized firms in which budgets serve as the principal means of control. It is argued that the appropriate objective for situations such as these is the maximization of either undiscounted earnings over the planning horizon or total value of the investments at the horizon. When either objective function is used in conjunction with the frequently encountered linear programming models for the capital rationing problem, shadow prices result which give rise to discount rates that will reproduce the optimal solution using discounted cash flow as a criterion. These results are then used as a means for clarifying several confusing and misleading statements that have appeared in the literature.
27.
This article seeks an understanding of sleeping and waking life in bed in relation to the use of handheld screen devices. The bedtime use of these devices has come to public attention through the involvement of this practice in the construction of two contemporary crises of sleep: sleep science's crisis of chronic sleep deprivation and the wider cultural crisis of the invasion of sleep by fast capitalism, as proposed by Jonathan Crary. Visual art is employed to gain insights into the two related understandings of sleep: one concerned with individual sleep self-regulation and the other with the corporeal commonality of sleep. The sociology of sleep, based on Michel Foucault's concept of 'biopower', is augmented by philosophical insights from Jean-Luc Nancy and Gilles Deleuze to reconfigure the sleeping body beyond the bounds of disciplinary regimes and permit a reassessment of the affective potential of sleep. Works of digital media art, together with ethnographic and cultural studies on the use of mobile devices, are employed to tease out the entanglement of these screen devices with waking and sleeping bedroom life. An ontology of sleep as a pre-individual affective state is proposed as an alternative basis for resisting the appropriation of sleep by the always-waking world accessed through the use of smartphones and tablets.
28.
This paper argues that public services should shift from a product-dominant logic to a service approach. By adopting a service orientation, the experiential, inter-organizational, and systemic character of public service delivery, together with the role of service users as co-producers, can be considered jointly. The paper explains how co-production operates through the application of service blueprinting, and presents a case from higher education in which the creation of a blueprint brought staff and students together to focus on the design of student enrolment, thereby improving the student experience and supporting co-production.
29.
In nonregular problems where the conventional \(n\) out of \(n\) bootstrap is inconsistent, the \(m\) out of \(n\) bootstrap provides a useful remedy to restore consistency. Conventionally, optimal choice of the bootstrap sample size \(m\) is taken to be the minimiser of a frequentist error measure, estimation of which has posed a major difficulty hindering practical application of the \(m\) out of \(n\) bootstrap method. Relatively little attention has been paid to a stronger, stochastic, version of the optimal bootstrap sample size, defined as the minimiser of an error measure calculated directly from the observed sample. Motivated by this stronger notion of optimality, we develop procedures for calculating the stochastically optimal value of \(m\). Our procedures are shown to work under special forms of Edgeworth-type expansions which are typically satisfied by statistics of the shrinkage type. Theoretical and empirical properties of our methods are illustrated with three examples, namely the James–Stein estimator, the ridge regression estimator and the post-model-selection regression estimator.
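The inconsistency that motivates the \(m\) out of \(n\) remedy is easy to see in a classic toy case (not one of the paper's three examples): estimating \(\theta\) for \(X \sim \mathrm{Uniform}(0, \theta)\) by the sample maximum. With \(m = n\), the bootstrap puts a spurious point mass at zero; taking \(m = o(n)\) (here, illustratively, \(m = n^{2/3}\)) removes it. The stochastic choice of \(m\) studied in the paper is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(2)

def m_out_of_n_boot(x, m, B=2000):
    """Bootstrap distribution of m * (theta_hat - max of an m-resample).

    Toy nonregular example: theta_hat = max(x) for Uniform(0, theta) data,
    where the usual n-out-of-n bootstrap is inconsistent.
    """
    theta_hat = x.max()
    stats = np.empty(B)
    for b in range(B):
        xb = rng.choice(x, size=m, replace=True)
        stats[b] = m * (theta_hat - xb.max())
    return stats

n, theta = 1000, 1.0
x = rng.uniform(0, theta, size=n)

# With m = n, the resample maximum equals the sample maximum with
# probability -> 1 - 1/e ~ 0.632: a point mass at 0 that the limiting
# exponential law of n*(theta - max) does not have.
full = m_out_of_n_boot(x, m=n)
small = m_out_of_n_boot(x, m=int(n ** (2 / 3)))  # m = o(n) restores consistency
print(round((full == 0).mean(), 3), round((small == 0).mean(), 3))
```

The first printed proportion (the mass at zero) is large for \(m = n\) and small for \(m = n^{2/3}\), which is the consistency failure and its repair in miniature.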
30.
This study examined whether the efficacy of keepin' it REAL, a model program for substance use prevention in schools, was moderated by gender, ethnicity, and acculturation. Gender differences in program efficacy may arise through boys' higher risk of drug use, inadequate attention to girls' developmental issues, or cultural factors like polarized gender expectations. Data came from a randomized trial in 35 Phoenix, Arizona, middle schools involving 4,622 mostly Latino 7th graders. Using multilevel mixed models and multiple-imputation techniques for missing data, results for the total sample showed no gender differences in program effects on recent substance use, but the program was more effective in fostering boys' than girls' anti-drug norms. Subgroup analyses demonstrated several more beneficial program effects for boys than girls (less alcohol and cigarette use and stronger anti-drug norms), but only among less acculturated Latinos. There were no gender differences in program effects among more acculturated Latinos, nor among non-Latino whites.

Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号