991.
We study indefinite incomplete factorization preconditioning algorithms for solving symmetric indefinite linear systems Ax = b, where A is a sparse symmetric indefinite matrix. A suitable pivoting strategy is the key to successfully factorizing an indefinite matrix. To speed up pivot selection, a relaxed bounded Bunch–Kaufman (RBBK) symmetric pivoting algorithm is given, and its stability and the admissible range of its parameter are analysed. Combining the RBBK algorithm with incomplete Cholesky factorization yields a class of modified incomplete Cholesky preconditioners with improved stability. Numerical examples in MATLAB show that the proposed preconditioner, when used with the SQMR iterative method, leads to fast convergence.
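A minimal sketch of the general setting, not of the paper's method: SciPy provides neither RBBK pivoting nor an SQMR solver, so the snippet below stands in with a generic incomplete LU preconditioner and GMRES on a simulated sparse symmetric indefinite system, purely to illustrate where an incomplete-factorization preconditioner plugs into a Krylov iteration.

```python
# Sketch only: generic ILU + GMRES as stand-ins for the paper's modified
# incomplete Cholesky (RBBK pivoting) preconditioner and SQMR iteration.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 500
# Build a sparse symmetric indefinite test matrix: small random off-diagonal
# entries plus a diagonal with both positive and negative values.
K = sp.random(n, n, density=0.01, format="csr", random_state=0)
A = (K + K.T) + sp.diags(np.r_[np.ones(n // 2), -np.ones(n - n // 2)])
A = A.tocsc()
b = np.ones(n)

# Incomplete LU factorization used as a preconditioner.
ilu = spla.spilu(A, drop_tol=1e-3, fill_factor=10)
M = spla.LinearOperator((n, n), matvec=ilu.solve)

x, info = spla.gmres(A, b, M=M, maxiter=500)
print("converged" if info == 0 else f"info = {info}",
      "| residual:", np.linalg.norm(A @ x - b))
```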
992.
As a pioneer and trailblazer of modern Chinese indexing, Lin Yutang, working from the vantage point of the New Culture Movement, carried out wide-ranging and creative explorations of Chinese character retrieval methods. His work had a positive influence on contemporary and later research into retrieval methods, index theory, and compilation practice, and its significance extends well beyond character retrieval methods themselves.
993.
Recently, various studies have used the Poisson Pseudo-Maximum Likelihood (PML) estimator to estimate gravity specifications of trade flows and, more generally, non-count data models. Some papers also report results based on the Negative Binomial Quasi-Generalised Pseudo-Maximum Likelihood (NB QGPML) estimator, which encompasses the Poisson assumption as a special case. This note shows that the NB QGPML estimators used so far are unappealing when applied to a continuous dependent variable whose unit of measurement is arbitrary, because the estimates artificially depend on that choice of unit. A new NB QGPML estimator is introduced to overcome this shortcoming.
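The note concerns estimator properties rather than code, but a short sketch may help place it: the snippet below runs a plain Poisson pseudo-maximum likelihood (PPML) regression on simulated, continuous gravity-style data using statsmodels; all variable names and coefficients are made up for illustration. Under PPML, rescaling the dependent variable only shifts the intercept, which is exactly the unit-invariance the note argues the existing NB QGPML estimators lack.

```python
# Sketch: PPML on simulated continuous "trade flows" (hypothetical variables).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "log_gdp_o": rng.normal(10, 1, n),   # exporter size (hypothetical)
    "log_gdp_d": rng.normal(10, 1, n),   # importer size (hypothetical)
    "log_dist":  rng.normal(7, 0.5, n),  # bilateral distance (hypothetical)
})
mu = np.exp(0.5 + df.log_gdp_o + df.log_gdp_d - df.log_dist - 10)
df["trade"] = mu * rng.lognormal(0, 0.5, n)   # continuous, non-count outcome

# PPML: a Poisson GLM applied to a continuous outcome, with robust (HC1) errors.
ppml = smf.glm("trade ~ log_gdp_o + log_gdp_d + log_dist",
               data=df, family=sm.families.Poisson()).fit(cov_type="HC1")
print(ppml.params)
```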
994.
Child welfare practice is temporally structured and includes a variety of follow-up activities. Practice-based follow-up has not, however, been much explored when studying children's paths through the child welfare system. This paper is based on a study of 103 children who were taken into care in 2006 in 10 Finnish municipalities and their paths in care until 2011. The social workers' institutional knowledge of their 'own' clients comprises the core of the research design. The paper reflects on the nature of these data from the point of view of notions of temporality. The analysis highlights four temporal trajectories used in retrospective analysis of children's paths in care: the linear time trajectory of decisions and changes in institutional positions, the temporally fragmentary trajectory of childhood and youth, the circular time trajectory of professional understanding of the child's path, and the silent time trajectory. Each one documents the children's paths differently; the linear trajectory tends to be the 'natural' and most readily available one, and consequently the children's paths are documented via decisions and institutional positions. The analysis suggests that more attention should be given to the complexity of time and temporality when studying children's paths in child welfare.
995.
The fitting of Lévy processes is an important field of interest in both option pricing and risk management. The literature offers a large number of fitting methods that require adequate initial values at the start of the optimization procedure. A so-called simplified method of moments (SMoM) generates these initial values for the Variance Gamma process by assuming a symmetric distribution, and the idea behind it can easily be transferred to the Normal Inverse Gaussian process. The characteristics of the Generalized Hyperbolic process, however, prevent such an easy adaptation. We therefore provide a SMoM for the Generalized Hyperbolic distribution by applying a Taylor series approximation for the modified Bessel function of the third kind, a Tschirnhaus transformation, and a symmetric distribution assumption. Our simulation study compares the results of our SMoM with those of maximum likelihood estimation. The results show that our proposed approach is an appropriate and useful way of estimating Generalized Hyperbolic process parameters and significantly reduces estimation time.
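A minimal sketch of the flavor of such a simplified method of moments, shown for the symmetric Variance Gamma case that the abstract cites as the origin of the idea, not the paper's Generalized Hyperbolic derivation: under one common parameterization, a symmetric VG distribution has mean mu, variance sigma^2 and excess kurtosis 3*nu, so starting values for a subsequent likelihood fit follow directly from sample moments.

```python
# Sketch: moment-based starting values for a symmetric Variance Gamma fit.
import numpy as np
from scipy import stats

def vg_smom_start(x):
    """Return (mu, sigma, nu) starting values under a symmetry assumption."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = x.std(ddof=1)
    excess_kurt = stats.kurtosis(x, fisher=True, bias=False)
    nu = max(excess_kurt / 3.0, 1e-6)   # guard against light-tailed samples
    return mu, sigma, nu

# Example on simulated heavy-tailed data (Student-t as a stand-in):
x = stats.t.rvs(df=5, size=10_000, random_state=0)
print(vg_smom_start(x))
```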
996.
The use of relevance vector machines to flexibly model hazard rate functions is explored. This technique is adapted to survival analysis problems through the partial logistic approach. The method exploits the Bayesian automatic relevance determination procedure to obtain sparse solutions, and it incorporates the flexibility of kernel-based models. Example results are presented on literature data from a head-and-neck cancer survival study using Gaussian and spline kernels. A sensitivity analysis is conducted to assess the influence of hyperprior distribution parameters. The proposed method is then contrasted with other flexible hazard regression methods, in particular the HARE model proposed by Kooperberg et al. [16], in a simulation study. The model developed in this paper exhibited good performance in predicting the hazard rate. The application of this sparse Bayesian technique to a real cancer data set demonstrated that the proposed method can potentially reveal characteristics of the hazards, associated with the dynamics of the studied diseases, which may be missed by existing modeling approaches based on different perspectives on the bias vs. variance balance.
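A minimal sketch of the partial logistic idea only, with simulated data and a plain logistic classifier standing in for the relevance vector machine: follow-up time is discretized, each subject is expanded into one record per interval at risk, and the discrete-time hazard is estimated as a classification probability.

```python
# Sketch: discrete-time hazard via person-period expansion (simulated data).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, n_intervals = 300, 10
x = rng.normal(size=n)                            # one covariate (hypothetical)
t = rng.exponential(scale=np.exp(-0.7 * x) * 5)   # event times depending on x
time_bin = np.minimum(t.astype(int), n_intervals - 1)
event = (t < n_intervals).astype(int)             # administrative censoring at 10

# One row per subject per interval at risk; the target is 1 only in the
# interval where the event occurs.
rows = []
for i in range(n):
    last = time_bin[i]
    for k in range(last + 1):
        rows.append({"x": x[i], "interval": k,
                     "y": int(event[i] == 1 and k == last)})
pp = pd.DataFrame(rows)

# Discrete-time hazard h(k | x) estimated by a classifier on the expanded data.
clf = LogisticRegression().fit(pp[["x", "interval"]], pp["y"])
hazard = clf.predict_proba(pd.DataFrame({"x": [0.0] * n_intervals,
                                         "interval": range(n_intervals)}))[:, 1]
print(np.round(hazard, 3))
```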
997.
Progression-free survival (PFS) is a frequently used endpoint in oncological clinical studies. In the case of PFS, the potential events are progression and death. Progression is usually observed with a delay, since it cannot be diagnosed before the next study visit. For this reason, potential bias of treatment effect estimates for progression-free survival is a concern. In randomized trials, and for relative treatment effect measures such as hazard ratios, bias-correcting methods are not necessarily required or have been proposed before. However, less is known about cross-trial comparisons of absolute outcome measures such as median survival times. This paper proposes a new method for correcting the assessment-time bias of progression-free survival estimates to allow a fair cross-trial comparison of median PFS. Using median PFS as an example, the presented method approximates the unknown posterior distribution by a Bayesian approach based on simulations. It is shown that the proposed method leads to a substantial reduction of bias compared with estimates derived from maximum likelihood or Kaplan–Meier estimates. Bias could be reduced by more than 90% over a broad range of considered situations differing in assessment times and underlying distributions. With coverage probabilities of at least 94% based on the credibility interval of the posterior distribution, the resulting parameters hold common confidence levels. In summary, the proposed approach is shown to be useful for a cross-trial comparison of median PFS.
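A minimal sketch of the assessment-time bias itself, not of the paper's Bayesian correction: with hypothetical numbers (true median 5 months, assessments every 2 months), recording progression only at the next scheduled visit visibly inflates the naive median PFS, which is the distortion the proposed simulation-based posterior approximation is designed to remove before cross-trial comparison.

```python
# Sketch: how visit-based detection of progression inflates the naive median PFS.
import numpy as np

rng = np.random.default_rng(2)
true_median = 5.0                              # months (hypothetical)
visit_interval = 2.0                           # assessments every 2 months
t_true = rng.exponential(true_median / np.log(2), size=100_000)

# Progression is recorded at the first assessment after it actually happened.
t_observed = np.ceil(t_true / visit_interval) * visit_interval

print("true median:    ", np.median(t_true).round(2))
print("observed median:", np.median(t_observed).round(2))
```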
998.
The underlying statistical concept that animates empirical strategies for extracting causal inferences from observational data is that observational data may be adjusted to resemble data that might have originated from a randomized experiment. This idea has driven the literature on matching methods. We explore an unmined idea for making causal inferences with observational data: that any given observational study may contain a large number of indistinguishably balanced matched designs. We demonstrate how the absence of a unique best solution presents an opportunity for greater information retrieval in causal inference analysis, based on the principle that many solutions teach us more about a given scientific hypothesis than a single study and improve our discernment with observational studies. The implementation can be achieved by integrating the statistical theories and models within a computational optimization framework that embodies the statistical foundations and reasoning.
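A minimal sketch of the core observation on simulated data: two greedy one-to-one propensity-score matchings built in different random orders select partly different control units yet achieve essentially the same covariate balance, so neither design is uniquely "best". The matching routine and data below are illustrative stand-ins, not the authors' optimization framework.

```python
# Sketch: two greedy matched designs with different controls but similar balance.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 3))
treat = rng.binomial(1, 1 / (1 + np.exp(-(-1.5 + 0.8 * X[:, 0] - 0.5 * X[:, 1]))))

pscore = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
treated, controls = np.where(treat == 1)[0], np.where(treat == 0)[0]

def greedy_match(order, caliper=0.02):
    """Match each treated unit (in the given order) to the nearest unused control."""
    unused, pairs = set(controls), []
    for t in order:
        if not unused:
            break
        pool = np.array(sorted(unused))
        j = pool[np.argmin(np.abs(pscore[pool] - pscore[t]))]
        if abs(pscore[j] - pscore[t]) <= caliper:
            pairs.append((t, j))
            unused.remove(j)
    return pairs

def smd(pairs):
    """Crude standardized mean differences of covariates in the matched sample."""
    ti, ci = [p[0] for p in pairs], [p[1] for p in pairs]
    return (X[ti].mean(0) - X[ci].mean(0)) / X.std(0)

design_a = greedy_match(rng.permutation(treated))
design_b = greedy_match(rng.permutation(treated))
overlap = len({p[1] for p in design_a} & {p[1] for p in design_b})
print("controls shared by the two designs:", overlap, "of", len(design_a))
print("balance (SMD) design A:", np.round(smd(design_a), 3))
print("balance (SMD) design B:", np.round(smd(design_b), 3))
```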
999.
Introduction: Glaucoma is a leading cause of vision loss and blindness in the U.S. Risk factors include African American race, older age, family history of glaucoma, and diabetes. This paper describes the evaluation of a mobile eye health program and a telemedicine program designed to improve access to eye care among people at high risk for glaucoma.
Methods: The RE-AIM (reach, efficacy, adoption, implementation, and maintenance) evaluation framework was used to harmonize indicators. Both programs provided community-based eye health education and eye services related to glaucoma detection and care. Each program reported data on participants and community partners. An external evaluator conducted site-visit interviews with program staff and community partners. Quantitative and qualitative data were integrated and analyzed using the RE-AIM dimensions.
Discussion: By targeting high-risk populations and providing comprehensive eye exams, both programs detected a large proportion of new glaucoma-related cases (17–19%), a much larger proportion than that found in the general population (<2%). The educational intervention increased glaucoma knowledge; evidence that it led people to seek eye care was inconclusive.
Conclusions: Evaluation findings from the mobile eye health program and the telemedicine program may provide useful information for wider implementation in public health clinics and in optometrist clinics located in retail outlets.
1000.
Modelling time-varying and frequency-specific relationships between two brain signals is becoming an essential methodological tool for answering theoretical questions in experimental neuroscience. In this article, we propose to estimate a frequency-specific Granger causality statistic that may vary in time in order to evaluate the functional connections between two brain regions during a task. For that purpose we use an adaptive Kalman filter type of estimator of a linear Gaussian vector autoregressive model with coefficients evolving over time. The estimation procedure is achieved through variational Bayesian approximation and is extended to multiple trials. This Bayesian State Space (BSS) model provides a dynamical Granger causality statistic that is quite natural. We propose to extend the BSS model to include the à trous Haar decomposition. This wavelet-based forecasting method rests on a multiscale resolution decomposition of the signal using the redundant à trous wavelet transform, which allows us to capture short- and long-range dependencies between signals. Equally importantly, it allows us to derive the desired dynamical and frequency-specific Granger causality statistic. The application of these models to intracranial local field potential data recorded during a psychological experimental task shows the complex frequency-based cross-talk between the amygdala and the medial orbito-frontal cortex.
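A minimal sketch of the redundant à trous Haar decomposition the abstract builds on, implemented directly in NumPy (the state-space estimation and the Granger causality statistic are not reproduced here): at each scale the smooth is the average of the current smooth and a copy shifted by 2^j samples, and the detail is their difference, so the signal is recovered exactly by summing all details and the final smooth.

```python
# Sketch: causal, undecimated ("a trous") Haar multiscale decomposition.
import numpy as np

def atrous_haar(x, n_scales):
    """Return (details, smooth); all arrays have the same length as x."""
    x = np.asarray(x, dtype=float)
    smooth, details = x.copy(), []
    for j in range(n_scales):
        shift = 2 ** j
        shifted = np.concatenate([np.repeat(smooth[0], shift), smooth[:-shift]])
        next_smooth = 0.5 * (smooth + shifted)   # causal Haar low-pass "with holes"
        details.append(smooth - next_smooth)     # wavelet coefficients at scale j
        smooth = next_smooth
    return details, smooth

# The signal is exactly recovered by summing all details and the final smooth.
t = np.linspace(0, 1, 512)
sig = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)
details, smooth = atrous_haar(sig, n_scales=4)
print(np.allclose(sig, sum(details) + smooth))   # True
```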