961.
Response-adaptive (RA) allocation designs can skew the allocation of incoming subjects toward the better-performing treatment group based on the previously accrued responses. While unstable estimators and increased variability can adversely affect adaptation in early trial stages, Bayesian methods can be implemented with decreasingly informative priors (DIP) to overcome these difficulties. DIPs have previously been used for binary outcomes to constrain adaptation early in the trial, yet gradually increase adaptation as subjects accrue. We extend the DIP approach to RA designs for continuous outcomes, primarily in the normal conjugate family, by functionalizing the prior effective sample size so that it equals the unobserved sample size. We compare this effective-sample-size DIP approach to other DIP formulations. Further, we consider various allocation equations and assess their behavior under DIPs. Simulated clinical trials comparing these approaches with traditional frequentist and Bayesian RA designs, as well as balanced designs, show that the natural lead-in approaches maintain improved treatment allocation with lower variability and greater power.
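A minimal sketch of the decreasingly informative prior idea for a two-arm continuous-outcome trial, assuming a normal conjugate model with known variance: the prior effective sample size is set to the number of still-unobserved subjects, so the prior dominates early and washes out as responses accrue. The allocation rule (assign to arm A with the posterior probability that A has the larger mean), the planned sample sizes, and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.stats import norm

def dip_posterior(x, mu0, sigma2, n_planned):
    """Normal-normal posterior with a decreasingly informative prior:
    the prior effective sample size equals the number of still-unobserved
    subjects, n0 = n_planned - n, so the prior washes out as data accrue."""
    n = len(x)
    n0 = max(n_planned - n, 0)                # unobserved sample size
    xbar = np.mean(x) if n > 0 else 0.0
    post_mean = (n0 * mu0 + n * xbar) / (n0 + n)
    post_var = sigma2 / (n0 + n)
    return post_mean, post_var

def allocation_prob(xA, xB, mu0=0.0, sigma2=1.0, n_planned_per_arm=50):
    """P(arm A has the larger mean | data), used to skew the next assignment."""
    mA, vA = dip_posterior(xA, mu0, sigma2, n_planned_per_arm)
    mB, vB = dip_posterior(xB, mu0, sigma2, n_planned_per_arm)
    return norm.cdf((mA - mB) / np.sqrt(vA + vB))

# Toy trial: arm A truly better (mean 0.5 vs 0.0), unit variance, 100 subjects.
rng = np.random.default_rng(1)
xA, xB = [], []
for _ in range(100):
    p = allocation_prob(np.array(xA), np.array(xB))
    if rng.random() < p:
        xA.append(rng.normal(0.5, 1.0))
    else:
        xB.append(rng.normal(0.0, 1.0))
print(f"allocated to A: {len(xA)}, to B: {len(xB)}")
```

Early on the prior keeps the allocation probability near 0.5; as responses accumulate the data dominate and the assignment probability drifts toward the better arm.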
962.
Vanadium dioxide is a functional material with great application potential, playing a very important role in many fields such as smart windows, laser-radiation protection, and temperature-controlled switches. This paper describes the structure and phase-transition characteristics of vanadium dioxide, reviews research progress on fabrication processes for vanadium dioxide thin films and on their thermally induced phase-transition properties, and concludes with an outlook on their application prospects.
963.
A large literature provides strong empirical support for the influence of parenting on child outcomes. The current study addresses enduring research questions testing the importance of early parenting behavior to children's adjustment. Specifically, we developed and tested a novel multi‐method observational measure of parental positive behavior support at age 2. Next, we tested whether early parental positive behavior support was related to child adjustment at school age, within a multi‐agent and multi‐method measurement approach and design. Observational and parent‐reported data from mother–child dyads (N = 731; 49 percent female) were collected from a high‐risk sample at age 2. Follow‐up data were collected via teacher report and child assessment at age 7.5. The results supported combining three different observational methods to assess positive behavior support at age 2 within a latent factor. Further, parents' observed positive behavior support at age 2 predicted multiple types of teacher‐reported and child‐assessed problem behavior and competencies at 7.5 years old. Results supported the validity and predictive capability of a multi‐method observational measure of parenting and the importance of a continued focus on the early years within preventive interventions.
964.
This paper studies indefinite incomplete-factorization preconditioning algorithms for solving the symmetric indefinite linear system Ax = b, where A is a sparse symmetric indefinite matrix. A suitable pivoting strategy is the key to successfully factorizing an indefinite matrix. To speed up pivot selection, a relaxed bounded Bunch-Kaufman (RBBK) symmetric pivoting algorithm is presented, and its stability and the admissible range of its parameter are analyzed. Combining the RBBK algorithm with incomplete Cholesky factorization yields a class of modified incomplete Cholesky preconditioners with improved stability. Numerical examples in MATLAB show that the proposed preconditioner, when used with the SQMR iterative method, leads to faster convergence.
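The RBBK pivoting routine and the SQMR solver described above are not available in standard Python libraries, so the sketch below only illustrates the general pattern the abstract describes: build a sparse symmetric indefinite system, compute an incomplete factorization, wrap it as a preconditioner, and hand it to a Krylov solver. SciPy's ILU and GMRES stand in for the modified incomplete Cholesky and SQMR; the test matrix, drop tolerance, and sizes are arbitrary.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Build a sparse symmetric indefinite test matrix: a mixed-sign diagonal
# plus a small symmetric sparse perturbation keeps it well conditioned.
n = 500
rng = np.random.default_rng(0)
d = np.concatenate([rng.uniform(1, 2, 400), -rng.uniform(1, 2, 100)])
R = sp.random(n, n, density=0.005, random_state=1)
A = (sp.diags(d) + 0.1 * (R + R.T)).tocsc()      # symmetric indefinite
b = rng.standard_normal(n)

# Incomplete factorization used as a preconditioner (stand-in for the
# RBBK-pivoted modified incomplete Cholesky of the paper).
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

# GMRES stands in for SQMR, which SciPy does not provide.
x, info = spla.gmres(A, b, M=M, maxiter=500)
print("converged" if info == 0 else f"info={info}",
      "residual =", np.linalg.norm(A @ x - b))
```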
965.
As a pioneer and trailblazer of modern indexing in China, Lin Yutang, working from the vantage point of the New Culture Movement, carried out wide-ranging and creative explorations of Chinese character retrieval methods. His work had a positive influence on contemporary and later research on character retrieval, on indexing theory, and on compilation practice, and its significance extends well beyond character retrieval methods themselves.
966.
Recently, various studies have used Poisson Pseudo-Maximum Likelihood (PML) to estimate gravity specifications of trade flows and, more generally, non-count data models. Some papers also report results based on the Negative Binomial Quasi-Generalised Pseudo-Maximum Likelihood (NB QGPML) estimator, which encompasses the Poisson assumption as a special case. This note shows that the NB QGPML estimators that have been used so far are unappealing when applied to a continuous dependent variable whose unit of measurement is arbitrary, because the estimates artificially depend on that choice. A new NB QGPML estimator is introduced to overcome this shortcoming.
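A small illustration of the unit-dependence point, using simulated data and statsmodels as a stand-in for whatever estimation software the note uses: rescaling the dependent variable only shifts the intercept of the Poisson PML fit, whereas the NB QGPML slope estimates change with the unit choice. The data-generating process, regressor names, and the fixed NB dispersion parameter are assumptions made purely for the demonstration.

```python
import numpy as np
import statsmodels.api as sm

# Simulate a toy "gravity" data set: continuous, non-negative trade flows.
rng = np.random.default_rng(0)
n = 2000
log_gdp = rng.normal(0, 1, n)          # hypothetical exporter-size regressor
log_dist = rng.normal(0, 1, n)         # hypothetical distance regressor
mu = np.exp(1.0 + 0.8 * log_gdp - 0.6 * log_dist)
y = mu * rng.gamma(shape=2.0, scale=0.5, size=n)   # continuous, heteroskedastic flows

X = sm.add_constant(np.column_stack([log_gdp, log_dist]))

# Poisson PML: valid for continuous y; rescaling y (e.g., dollars vs.
# thousands of dollars) changes only the intercept, not the slopes.
ppml_1 = sm.GLM(y, X, family=sm.families.Poisson()).fit()
ppml_k = sm.GLM(1000 * y, X, family=sm.families.Poisson()).fit()
print(ppml_1.params[1:], ppml_k.params[1:])        # identical slope estimates

# NB QGPML with a fixed dispersion: the slope estimates are NOT invariant
# to the rescaling of y, which is the shortcoming the note points out.
nb_1 = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
nb_k = sm.GLM(1000 * y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb_1.params[1:], nb_k.params[1:])            # slopes differ across units
```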
967.
Child welfare practice is temporally structured and includes a variety of follow‐up activities. Practice‐based follow‐up has not, however, been much explored when studying children's paths in the child welfare system. This paper is based on a study of 103 children who were taken into care in 2006 in 10 Finnish municipalities, following their paths in care until 2011. The social workers' institutional knowledge of their ‘own’ clients comprises the core of the research design. The paper reflects on the nature of these data from the point of view of temporality. The analysis highlights four temporal trajectories used in the retrospective analysis of children's paths in care: the linear time trajectory of decisions and changes in institutional positions, the temporally fragmentary trajectory of childhood and youth, the circular time trajectory of professional understanding of the child's path, and the silent time trajectory. Each one documents the children's paths differently; the linear trajectory tends to be the ‘natural’ and most readily available one, and consequently the children's paths are documented via decisions and institutional positions. The analysis suggests that more attention should be given to the complexity of time and temporality when studying children's paths in child welfare.
968.
The fitting of Lévy processes is an important field of interest in both option pricing and risk management. In the literature, a large number of fitting methods exist that require adequate initial values at the start of the optimization procedure. A so-called simplified method of moments (SMoM) generates these initial values for the Variance Gamma process by assuming a symmetric distribution, and the idea can easily be transferred to the Normal Inverse Gaussian process. However, the characteristics of the Generalized Hyperbolic process prevent such an easy adaptation. Therefore, by applying a Taylor series approximation for the modified Bessel function of the third kind, a Tschirnhaus transformation, and a symmetric-distribution assumption, we provide an SMoM for the Generalized Hyperbolic distribution. Our simulation study compares the results of our SMoM with those of maximum likelihood estimation. The results show that our proposed approach is an appropriate and useful way of estimating Generalized Hyperbolic process parameters and significantly reduces estimation time.
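As a sketch of the starting-value idea, here is the symmetric-case simplified method of moments for the Variance Gamma process that the abstract refers to (under near-symmetry, variance ≈ σ², excess kurtosis ≈ 3ν, skewness ≈ 3θν/σ); the paper's new SMoM formulas for the Generalized Hyperbolic case are not reproduced here, and the simulated data are purely illustrative.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def vg_smom(x):
    """Simplified method of moments for the Variance Gamma law:
    treat the sample as (nearly) symmetric, so that
        Var(X) ~ sigma^2, excess kurtosis ~ 3*nu, skewness ~ 3*theta*nu/sigma.
    Returns rough (c, sigma, theta, nu) suitable as optimizer starting values."""
    m, s2 = np.mean(x), np.var(x)
    sk, ek = skew(x), kurtosis(x)        # kurtosis() returns *excess* kurtosis
    nu = max(ek / 3.0, 1e-6)
    sigma = np.sqrt(s2)
    theta = sk * sigma / (3.0 * nu)
    c = m - theta
    return c, sigma, theta, nu

# Check on simulated symmetric VG data: X = c + sigma * sqrt(G) * Z,
# with G ~ Gamma(shape=1/nu, scale=nu) so that E[G] = 1 and Var[G] = nu.
rng = np.random.default_rng(7)
c_true, sigma_true, nu_true = 0.1, 0.2, 0.5
G = rng.gamma(shape=1.0 / nu_true, scale=nu_true, size=100_000)
x = c_true + sigma_true * np.sqrt(G) * rng.standard_normal(100_000)
print(vg_smom(x))   # roughly (0.1, 0.2, ~0, 0.5)
```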
969.
The use of relevance vector machines to flexibly model hazard rate functions is explored. This technique is adapted to survival analysis problems through the partial logistic approach. The method exploits the Bayesian automatic relevance determination procedure to obtain sparse solutions and incorporates the flexibility of kernel-based models. Example results are presented on literature data from a head-and-neck cancer survival study, using Gaussian and spline kernels. Sensitivity analysis is conducted to assess the influence of the hyperprior distribution parameters. The proposed method is then contrasted with other flexible hazard regression methods, in particular the HARE model proposed by Kooperberg et al. [16]; a simulation study is conducted to carry out the comparison. The model developed in this paper exhibited good performance in predicting the hazard rate. The application of this sparse Bayesian technique to a real cancer data set demonstrated that the proposed method can potentially reveal characteristics of the hazard, associated with the dynamics of the studied diseases, that may be missed by existing modeling approaches based on a different perspective on the bias vs. variance balance.
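A rough sketch of the partial logistic approach, assuming the usual discrete-time construction: follow-up is cut into intervals, each subject contributes one record per interval at risk with a 0/1 event indicator, and a logistic model of the hazard is fit on kernel features. An ordinary logistic regression stands in for the sparse Bayesian relevance vector machine, which scikit-learn does not provide; the toy data, interval grid, kernel centers, and bandwidth are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel

# Toy survival data: one covariate, exponential-ish times, random censoring.
rng = np.random.default_rng(3)
n = 400
z = rng.normal(size=n)                              # covariate
t_event = rng.exponential(scale=np.exp(-0.7 * z))
t_cens = rng.exponential(scale=1.5, size=n)
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(int)

# Partial logistic approach: discretize follow-up into intervals and expand
# each subject into one record per interval at risk, with an event indicator.
edges = np.quantile(time, np.linspace(0, 1, 11))    # 10 intervals
edges[0] = 0.0                                      # open the first interval at zero
rows, labels = [], []
for ti, di, zi in zip(time, event, z):
    for k in range(len(edges) - 1):
        if ti <= edges[k]:
            break
        rows.append([0.5 * (edges[k] + edges[k + 1]), zi])   # interval midpoint, covariate
        labels.append(int(di == 1 and ti <= edges[k + 1]))   # event in this interval?
X = np.asarray(rows)
y = np.asarray(labels)

# Kernel-based flexibility: RBF features on (time, covariate); a plain logistic
# model stands in for the sparse Bayesian (RVM) fit used in the paper.
centers = X[rng.choice(len(X), size=50, replace=False)]
Phi = rbf_kernel(X, centers, gamma=1.0)
clf = LogisticRegression(max_iter=2000).fit(Phi, y)
hazard = clf.predict_proba(Phi)[:, 1]               # discrete-time hazard per record
print("mean estimated interval hazard:", hazard.mean().round(3))
```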
970.
The underlying statistical concept that animates empirical strategies for extracting causal inferences from observational data is that observational data may be adjusted to resemble data that might have originated from a randomized experiment. This idea has driven the literature on matching methods. We explore a largely un-mined idea for making causal inferences with observational data: that any given observational study may contain a large number of indistinguishably balanced matched designs. We demonstrate how the absence of a unique best solution presents an opportunity for greater information retrieval in causal inference analysis, based on the principle that many solutions teach us more about a given scientific hypothesis than a single study and improve our discernment in observational studies. The implementation can be achieved by integrating the statistical theories and models within a computational optimization framework that embodies the statistical foundations and reasoning.
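A toy illustration of the "many indistinguishably balanced designs" point, under assumptions of our own choosing: greedy 1:1 propensity-score matching with a caliper is run under different random orderings of the treated units, and every solution whose worst standardized mean difference stays below 0.1 is retained as an acceptable matched design. The simulated data, caliper, and balance threshold are arbitrary and not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy observational data: two covariates drive treatment assignment.
rng = np.random.default_rng(11)
n = 1000
X = rng.normal(size=(n, 2))
treat = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1]))))

# Propensity scores from a simple logistic model.
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
treated, controls = np.where(treat == 1)[0], np.where(treat == 0)[0]

def smd(a, b):
    """Standardized mean difference for one covariate."""
    return abs(a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

def greedy_match(order_rng):
    """Greedy 1:1 caliper matching; the treated-unit ordering is randomized,
    so different orderings can produce different matched designs."""
    available = set(controls)
    pairs = []
    for i in order_rng.permutation(treated):
        pool = np.array(sorted(available))
        if len(pool) == 0:
            break
        j = pool[np.argmin(np.abs(ps[pool] - ps[i]))]
        if abs(ps[j] - ps[i]) < 0.05:               # caliper on the propensity score
            pairs.append((i, j))
            available.remove(j)
    return pairs

# Collect several indistinguishably balanced designs (max SMD below 0.1).
balanced_designs = []
for seed in range(20):
    pairs = greedy_match(np.random.default_rng(seed))
    t_idx = np.array([p[0] for p in pairs])
    c_idx = np.array([p[1] for p in pairs])
    worst = max(smd(X[t_idx, k], X[c_idx, k]) for k in range(X.shape[1]))
    if worst < 0.1:
        balanced_designs.append(pairs)
print(f"{len(balanced_designs)} balanced matched designs out of 20 orderings")
```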