941.
A supersaturated design is a design whose run size is too small to estimate all the main effects. It is commonly used in screening experiments, where the goal is to identify a small number of dominant active factors at low cost. In this paper, we study a variable selection method via the Dantzig selector, proposed by Candes and Tao [2007. The Dantzig selector: statistical estimation when p is much larger than n. Annals of Statistics 35, 2313–2351], to screen important effects. A graphical procedure and an automated procedure are suggested to accompany the method. Simulations show that this method performs well compared to existing methods in the literature and is more efficient at estimating the model size.
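For concreteness, the Dantzig selector solves min ‖β‖₁ subject to ‖Xᵀ(y − Xβ)‖∞ ≤ λ, which is a linear program after splitting β into positive and negative parts. Below is a minimal sketch (illustrative names, toy data, and an arbitrary λ; not the authors' code, and without the paper's graphical or automated screening procedures) solved with scipy.optimize.linprog.

```python
# Dantzig selector as a linear program: z = [b_plus, b_minus], beta = b_plus - b_minus.
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    n, p = X.shape
    G = X.T @ X                               # Gram matrix
    c = X.T @ y                               # correlations with the response
    obj = np.ones(2 * p)                      # minimise sum(b_plus + b_minus) = ||beta||_1
    A_ub = np.vstack([np.hstack([ G, -G]),    #  X'X beta <=  X'y + lam
                      np.hstack([-G,  G])])   # -X'X beta <= -X'y + lam
    b_ub = np.concatenate([c + lam, -c + lam])
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    return z[:p] - z[p:]

# Toy screening example: 8 runs, 14 factors, 2 truly active effects.
rng = np.random.default_rng(0)
X = rng.choice([-1.0, 1.0], size=(8, 14))
beta_true = np.zeros(14); beta_true[[1, 5]] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.standard_normal(8)
print(np.round(dantzig_selector(X, y, lam=1.0), 2))
```

In a screening context the nonzero entries of the returned β flag candidate active factors; λ controls how aggressively small effects are zeroed out.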
942.
Dynamic programming (DP) is a fast, elegant method for solving many one-dimensional optimisation problems, but unfortunately most problems in image analysis, such as restoration and warping, are two-dimensional. We consider three generalisations of DP. The first is iterated dynamic programming (IDP), in which DP is used to solve a sequence of one-dimensional problems recursively, one at a time, to find a local optimum. The second algorithm is an empirical stochastic optimiser, implemented by adding progressively less noise to IDP. The final approach replaces DP by a more computationally intensive forward–backward Gibbs sampler and uses a simulated-annealing cooling schedule. Results are compared with the existing pixel-by-pixel methods of iterated conditional modes (ICM) and simulated annealing in two applications: restoring a synthetic aperture radar (SAR) image, and warping a pulsed-field electrophoresis gel into alignment with a reference image. We find that IDP and its stochastic variant outperform the other algorithms.
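A minimal sketch of the IDP idea under an assumed Potts-style restoration model (squared-error data term plus a penalty β for each unequal 4-neighbour pair); all names are illustrative and the paper's SAR and warping applications are not reproduced. Each row or column, with the rest of the image held fixed, is an exact one-dimensional problem solvable by Viterbi-type DP, and sweeping over rows and columns descends to a local optimum.

```python
import numpy as np

def viterbi(unary, beta):
    """Exact minimiser of sum_i unary[i, l_i] + beta * 1[l_i != l_{i-1}] by DP."""
    n, K = unary.shape
    M = unary[0].copy()                        # best prefix cost ending in each label
    back = np.zeros((n, K), dtype=int)
    for i in range(1, n):
        j = int(np.argmin(M))                  # cheapest label to switch from
        switch = M[j] + beta
        back[i] = np.where(M <= switch, np.arange(K), j)
        M = np.minimum(M, switch) + unary[i]
    path = np.empty(n, dtype=int)
    path[-1] = int(np.argmin(M))
    for i in range(n - 1, 0, -1):
        path[i - 1] = back[i, path[i]]
    return path

def iterated_dp(y, labels, beta, sweeps=5):
    """IDP: exact 1-D solves along rows, then columns, with the rest held fixed."""
    z = np.argmin((y[..., None] - labels) ** 2, axis=-1)   # start at nearest label
    ids = np.arange(len(labels))
    for _ in range(sweeps):
        for img, lab in ((y, z), (y.T, z.T)):              # row sweep, then column sweep
            for r in range(img.shape[0]):
                unary = (img[r][:, None] - labels[None, :]) ** 2
                if r > 0:                                  # interaction with fixed line above
                    unary = unary + beta * (ids[None, :] != lab[r - 1][:, None])
                if r + 1 < img.shape[0]:                   # ... and with fixed line below
                    unary = unary + beta * (ids[None, :] != lab[r + 1][:, None])
                lab[r] = viterbi(unary, beta)              # exact 1-D solve for this line
    return labels[z]

# Toy restoration: noisy two-level image.
rng = np.random.default_rng(0)
truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0
noisy = truth + 0.6 * rng.standard_normal(truth.shape)
restored = iterated_dp(noisy, labels=np.array([0.0, 1.0]), beta=1.0)
print(np.mean((restored > 0.5) == (truth > 0.5)))          # fraction of pixels recovered
```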
943.
Nonparametric density estimation in the presence of measurement error is considered. The usual kernel deconvolution estimator accounts for the contamination in the data by employing a modified kernel. In this paper a new approach based on a weighted kernel density estimator is proposed. Theoretical motivation is provided by the existence of a weight vector that perfectly counteracts the bias in density estimation without generating an excessive increase in variance. In practice, a data-driven method of weight selection is required. Our strategy is to minimize the discrepancy between a standard kernel estimate computed from the contaminated data, on the one hand, and the convolution of the weighted deconvolution estimate with the measurement-error density, on the other. We consider a direct implementation of this approach, in which the weights are optimized subject to sum and non-negativity constraints, and a regularized version in which the objective function includes a ridge-type penalty. Numerical tests suggest that weighted kernel estimation can lead to tangible improvements in performance over the usual kernel deconvolution estimator. Furthermore, weighted kernel estimates are free from the problem of negative estimation in the tails that can occur with modified kernels. The weighted kernel approach generalizes very straightforwardly to multivariate deconvolution density estimation.
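To make the idea concrete, here is a minimal sketch (illustrative names, toy data) under the special case of a Gaussian kernel and Gaussian measurement error, where the convolution of kernel and error density has a closed form: the weights of a weighted KDE at the contaminated observations are optimized, subject to sum-to-one and non-negativity constraints, so that the convolved weighted estimate matches a standard KDE of the contaminated data on a grid. The paper's regularized (ridge-penalized) variant is omitted.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def weighted_deconv_kde(W, sigma_u, h, grid):
    """Weights w >= 0, sum(w) = 1, chosen so (f_w * f_U) matches the ordinary
    KDE of the contaminated sample W on the grid; Gaussian kernel, N(0, sigma_u^2) error."""
    n = len(W)
    f_std = norm.pdf(grid[:, None], loc=W[None, :], scale=h).mean(axis=1)
    # A Gaussian kernel convolved with Gaussian error is again Gaussian:
    G = norm.pdf(grid[:, None], loc=W[None, :], scale=np.hypot(h, sigma_u))
    loss = lambda w: np.sum((G @ w - f_std) ** 2)
    res = minimize(loss, np.full(n, 1.0 / n), method="SLSQP",
                   bounds=[(0.0, None)] * n,
                   constraints=({"type": "eq", "fun": lambda w: w.sum() - 1.0},))
    w = res.x
    return lambda x: norm.pdf(np.asarray(x)[..., None], loc=W, scale=h) @ w

# Toy example: X ~ N(0, 1) observed with N(0, 0.5^2) measurement error.
rng = np.random.default_rng(1)
X = rng.standard_normal(60)
W = X + 0.5 * rng.standard_normal(60)
f_hat = weighted_deconv_kde(W, sigma_u=0.5, h=0.35, grid=np.linspace(-4, 4, 81))
print(f_hat(0.0))   # estimate of the latent density of X at 0
```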
944.
The reversible jump Markov chain Monte Carlo (MCMC) sampler (Green, Biometrika 82:711–732, 1995) has become an invaluable device for Bayesian practitioners. However, the primary difficulty with the sampler lies in the efficient construction of transitions between competing models of possibly differing dimensionality and interpretation. We propose the use of a marginal density estimator to construct between-model proposal distributions. This provides both a step towards black-box simulation for reversible jump samplers and a tool for examining the utility of common between-model mapping strategies. We compare the performance of our approach to well-established alternatives in both time series and mixture model examples.
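The flavour of the approach can be sketched in a toy setting (illustrative code, not the authors'): two nested models for a normal mean, where pilot within-model samples are turned into a density estimate q that serves as the independent between-model proposal. With a jump from zero to one parameter the Jacobian is 1, so the acceptance ratio involves only likelihood, prior, and q; equal prior model probabilities are assumed and cancel. The conjugacy here (which keeps the sketch short) is an assumption of the toy example.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(2)
y = rng.normal(0.4, 1.0, size=30)               # data; true mean 0.4
tau = 1.0                                        # prior: mu ~ N(0, tau^2) under model 2

loglik = lambda mu: norm.logpdf(y, loc=mu).sum()
ll0 = loglik(0.0)                                # model 1 fixes mu = 0

# Pilot within-model run: the posterior of mu is conjugate here, so sample it
# directly and fit a marginal density estimate to use as the between-model proposal.
n = len(y)
v = 1.0 / (n + 1.0 / tau ** 2)                   # posterior variance of mu
m = v * n * y.mean()                             # posterior mean of mu
q = gaussian_kde(rng.normal(m, np.sqrt(v), size=500))

k, mu, visits = 1, 0.0, {1: 0, 2: 0}
for _ in range(20000):
    if k == 1:                                   # birth: propose mu from q (Jacobian = 1)
        cand = q.resample(1)[0, 0]
        log_a = loglik(cand) + norm.logpdf(cand, 0, tau) - ll0 - q.logpdf(cand)[0]
        if np.log(rng.uniform()) < log_a:
            k, mu = 2, cand
    else:                                        # death: drop mu (reverse of the birth move)
        log_a = ll0 + q.logpdf(mu)[0] - loglik(mu) - norm.logpdf(mu, 0, tau)
        if np.log(rng.uniform()) < log_a:
            k, mu = 1, 0.0
    if k == 2:                                   # within-model refresh (exact Gibbs step here)
        mu = rng.normal(m, np.sqrt(v))
    visits[k] += 1

print(visits)  # visit frequencies approximate the posterior model probabilities
```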
945.
In cultural studies of the cemetery as a locus, an important aspect is understanding the cemetery as a tool for forming the socio-cultural identity of the living. I take the view that the cemetery has always produced, and continues to produce, a variety of identities. Whereas in the pre-Modern period the cemetery was a necessary element of individual self-understanding as a member of a certain community, in the Modern era the cemetery produces more particularistic identities. Modernity generates universal and abstract schemes of identification, and through the repression of death from public consciousness, cemeteries lose their role as a focal point of social communication. I draw attention to the radical utopian ideas of the Russian philosopher Nikolai Fedorov, who not only considered the cemetery a locus of memory, but also proclaimed the task of transforming cemeteries into a base for universal work on the resurrection of dead ancestors and the restoration of brotherly relations among all mankind.
946.
I suggest an extension of the semiparametric transformation model that specifies a time-varying regression structure for the transformation and thus allows time-varying structure in the data. Special cases include a stratified version of the usual semiparametric transformation model. The model can be thought of as specifying a first-order Taylor expansion of a completely flexible baseline. Large-sample properties are derived, and estimators of the asymptotic variances of the regression coefficients are given. The method is illustrated by a worked example and a small simulation study. A goodness-of-fit procedure for testing whether the regression effects lead to a satisfactory fit is also suggested.
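As background (standard notation, not necessarily the paper's), the usual semiparametric linear transformation model and the schematic form of the time-varying extension described in the abstract:

```latex
% H is an unspecified increasing function; \varepsilon has a known
% distribution (extreme value gives proportional hazards; standard
% logistic gives proportional odds).
\[
  H(T) = -\beta^{\top} Z + \varepsilon ,
  \qquad \text{time-varying extension (schematic): } \beta \mapsto \beta(t),
\]
% under which a stratified version of the model arises as a special case.
```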
947.
We investigate the effect of unobserved heterogeneity in the context of the linear transformation model for censored survival data in the clinical-trials setting. The unobserved heterogeneity is represented by a frailty term, with unknown distribution, in the linear transformation model. We obtain the bias of the estimate made under the assumption of no unobserved heterogeneity when heterogeneity is truly present. We also derive the asymptotic relative efficiency of the estimate of the treatment effect under the incorrect assumption of no unobserved heterogeneity. Additionally, we investigate the loss of power for clinical trials designed assuming the model without frailty when, in fact, the model with frailty is true. Numerical studies under a proportional odds model show that the loss of efficiency and the loss of power can be substantial when the heterogeneity, as embodied by a frailty, is ignored. An erratum to this article is available.
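In the transformation-model notation above, a standard way to write the frailty extension (the abstract leaves the frailty distribution unspecified; the independence assumption below is added for illustration):

```latex
\[
  H(T) = -\beta^{\top} Z + b + \varepsilon ,
  \qquad b \ \text{(frailty)} \perp (Z, \varepsilon).
\]
% Fitting the model with b forced to 0 when in truth Var(b) > 0 biases the
% estimated treatment effect and costs efficiency and power, as the
% numerical studies described above quantify.
```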
948.
Treatment efficacy and safety are typically both primary endpoints in Phase II, and even in some Phase III, clinical trials. Efficacy is frequently measured by time to response, death, or some other milestone event, and is thus a continuous, possibly censored, outcome. Safety, however, is frequently measured on a discrete scale; in Eastern Cooperative Oncology Group clinical trial E2290, it was measured as the number of weekly rounds of chemotherapy that were tolerable to colorectal cancer patients. For the joint analysis of efficacy and safety, we propose a nonparametric, computationally simple estimator of the bivariate survival function when one time-to-event is continuous, one is discrete, and both are subject to right censoring. The bivariate censoring times may depend on each other, but they are assumed to be independent of both event times. We derive a closed-form covariance estimator for the survivor function, which allows inference to be based on any of several possible statistics of interest. In addition, we derive its covariance with respect to calendar time of analysis, allowing its use in sequential studies.
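The estimand is the joint survivor function of the continuous efficacy time T₁ and the discrete safety time T₂ (definition only; the paper's closed-form estimator and its covariance are not reproduced here):

```latex
\[
  S(t_1, t_2) = P(T_1 > t_1,\; T_2 > t_2),
  \qquad t_1 \in [0, \infty),\; t_2 \in \{0, 1, 2, \dots\},
\]
% estimated from right-censored pairs whose censoring times may be mutually
% dependent but are assumed independent of (T_1, T_2).
```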
949.
Is the Study of Happiness a Worthy Scientific Pursuit?
This paper critiques the view that the study of happiness is not a worthy scientific pursuit. The happiness set-point and hedonic treadmill theories point to the difficulty of increasing happiness levels, owing to genetic limitations and adaptation; however, there is mounting evidence that, with the use of appropriate measures and specific interventions aimed at fostering strengths and virtues, happiness can be increased. Furthermore, the benefits of investigating methods for increasing happiness include improvements in physical, psychological and social health and well-being. It is concluded that approaching human needs from a top-down or holistic standpoint, in which individuals can use their strengths to overcome life's challenges, is beneficial to health and well-being. Hence, the study of happiness is a worthy scientific pursuit.
950.
This paper characterizes vulnerable workers in Canada and the federal jurisdiction on the basis of employment status, demographic characteristics, and job characteristics, and identifies areas in which labour standards may have a role. Based on this analysis, the paper evaluates the potential for labour standards to address economic vulnerability, focusing on policies aimed at wages and benefits, hours, and employment arrangements. In addition, the analysis considers the extent to which labour standards are likely to reach vulnerable workers. The results suggest several potential roles for labour standards and highlight the policy implications.