1,565 results in total.
61.
Plotting log-log survival functions against time for different categories, or combinations of categories, of covariates is perhaps the easiest and most commonly used graphical tool for checking the proportional hazards (PH) assumption. One limitation of the technique is that the covariates must be categorical, or be made categorical through appropriate grouping of continuous covariates. Other limitations include the subjectivity of decisions based on visual judgment of the plots, and the frequent inconclusiveness that arises as the number of categories and/or covariates grows. This paper proposes a non-graphical (numerical) test of the PH assumption that makes use of the log-log survival function. The test permits checking proportionality for categorical as well as continuous covariates and overcomes the other limitations of the graphical method. The observed power and size of the test are compared with those of comparable tests through simulation experiments. The simulations demonstrate that the proposed test is more powerful than some of the most sensitive tests in the literature across a wide range of survival situations. The test is illustrated on the widely used gastric cancer data.
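The diagnostic this abstract builds on can be sketched numerically: under PH, S1(t) = S0(t)^c, so the log(-log) survival curves are vertically parallel, separated by the constant log c (the log hazard ratio). A minimal illustration, assuming a hypothetical Weibull baseline and a hazard ratio of 2 (neither taken from the paper):

```python
import math

def cloglog(s):
    """log(-log S(t)) transform used in PH diagnostics."""
    return math.log(-math.log(s))

# Hypothetical baseline: Weibull survival S0(t) = exp(-t**1.5)
def S0(t):
    return math.exp(-t ** 1.5)

# Under proportional hazards with hazard ratio 2: S1(t) = S0(t)**2
def S1(t):
    return S0(t) ** 2

times = [0.3, 0.6, 1.0, 1.5]
gaps = [cloglog(S1(t)) - cloglog(S0(t)) for t in times]
# Under PH every gap equals log(2), whatever the time point
```

The numerical test proposed in the paper essentially formalizes "are these gaps constant?" without requiring a visual judgment.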
62.
Since the beginning of reform and opening up, the State Council has undergone six relatively major institutional reforms. Of these six, the 1998 and 2008 reforms were the largest in scale, the broadest in impact, and the most emblematic. Both took the transformation of government functions as the core of institutional reform, yet they differed in important respects. Through a study of the "Three Fixes" provisions (三定规定) produced in these two reforms, and from the perspective of the transformation of government functions, this paper analyzes the progress and achievements of China's administrative management system, identifies its shortcomings against the requirements of building a service-oriented government, and clarifies the direction of the next stage of reform. This is of great significance for further deepening the reform of the administrative management system and for building a service-oriented government that satisfies the people.
63.
The advent of the knowledge economy has pushed institutions of higher education toward the center of society and, at the same time, brought them new vitality. With their concentration of talent, universities should become a wellspring of the knowledge economy. This requires them to exercise the academic functions of basic research, comprehensive research, teaching research, applied research, and research on their own development; only then can they fulfill their historical mission.
64.
Over the past five years, the Artificial Intelligence Center at SRI has been developing a new technology to address the problem of automated information management within real-world contexts. The result of this work is a body of techniques for automated reasoning from evidence that we call evidential reasoning. The techniques are based upon the mathematics of belief functions developed by Dempster and Shafer and have been successfully applied to a variety of problems, including computer vision, multisensor integration, and intelligence analysis.

We have developed both a formal basis and a framework for implementing automated reasoning systems based upon these techniques. Both the formal and the practical approach can be divided into four parts: (1) specifying a set of distinct propositional spaces, (2) specifying the interrelationships among these spaces, (3) representing bodies of evidence as belief distributions, and (4) establishing paths for the bodies of evidence to move through these spaces by means of evidential operations, eventually converging on spaces where the target questions can be answered. These steps specify a means for arguing from multiple bodies of evidence toward a particular (probabilistic) conclusion. Argument construction is the process by which such evidential analyses are constructed; it is the analogue of constructing proof trees in a logical context.

This technology features the ability to reason from uncertain, incomplete, and occasionally inaccurate information based upon seven evidential operations: fusion, discounting, translation, projection, summarization, interpretation, and gisting. These operations are theoretically sound but have intuitive appeal as well.

In implementing this formal approach, we have found that evidential arguments can be represented as graphs. To support the construction, modification, and interrogation of evidential arguments, we have developed Gister. Gister provides an interactive, menu-driven, graphical interface that allows these graphical structures to be easily manipulated.

Our goal is to provide effective automated aids to domain experts for argument construction. Gister represents our first attempt at such an aid.
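The fusion operation named above is, in Dempster-Shafer theory, Dempster's rule of combination: masses of intersecting focal elements are multiplied and pooled, and mass falling on the empty set (conflict) is renormalized away. A minimal sketch, with an invented two-element frame and invented mass values (this is not SRI's Gister code):

```python
from itertools import product

def dempster_fuse(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets over a common frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb  # mass that lands on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be fused")
    # Renormalize by 1 - K, where K is the total conflict
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two hypothetical bodies of evidence over the frame {a, b}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.5, frozenset({"a", "b"}): 0.5}
fused = dempster_fuse(m1, m2)
# K = 0.3, so fused masses are {a}: 3/7, {b}: 2/7, {a,b}: 2/7
```

The other six operations (discounting, translation, projection, and so on) manipulate such belief distributions between propositional spaces rather than within one.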

65.
In this article, we propose a weighted simulated integrated conditional moment (WSICM) test of the validity of parametric specifications of conditional distribution models for stationary time series data, by combining the weighted integrated conditional moment (ICM) test of Bierens (1984) [Model specification testing of time series regressions. Journal of Econometrics 26:323-353] for time series regression models with the simulated ICM test of Bierens and Wang (2012) [Integrated conditional moment tests for parametric conditional distributions. Econometric Theory 28:328-362] for conditional distribution models for cross-section data. To the best of our knowledge, no other consistent test for parametric conditional time series distributions has yet been proposed in the literature, despite consistency claims made by some authors.
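The general shape of an ICM-type statistic can be sketched in a few lines: integrate the squared modulus of a weighted empirical process of residuals over the weighting parameter. The sketch below uses a correctly specified linear regression, the weight exp(i*xi*x), and crude Monte Carlo integration over xi ~ N(0,1); all of these choices are illustrative assumptions, not the WSICM construction of the paper:

```python
import math
import random

random.seed(1)
n = 200
x = [random.gauss(0, 1) for _ in range(n)]
y = [1.0 + 2.0 * xv + random.gauss(0, 1) for xv in x]  # truly linear DGP

# OLS fit of the (correct) linear specification
mx, my = sum(x) / n, sum(y) / n
b1 = (sum((xv - mx) * (yv - my) for xv, yv in zip(x, y))
      / sum((xv - mx) ** 2 for xv in x))
b0 = my - b1 * mx
e = [yv - (b0 + b1 * xv) for xv, yv in zip(x, y)]

# ICM-type statistic: integral over xi of
# |n^{-1/2} sum_i e_i exp(i*xi*x_i)|^2, approximated by MC draws
draws = [random.gauss(0, 1) for _ in range(300)]
T = 0.0
for w in draws:
    re = sum(ei * math.cos(w * xv) for ei, xv in zip(e, x)) / math.sqrt(n)
    im = sum(ei * math.sin(w * xv) for ei, xv in zip(e, x)) / math.sqrt(n)
    T += (re * re + im * im) / len(draws)
```

Under misspecification the residual process does not center at zero for almost all xi, which is what drives the consistency of tests in this family; critical values in practice come from asymptotic bounds or simulation, not shown here.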
66.
Several researchers have proposed solutions to control the type I error rate in sequential designs. The use of Bayesian sequential designs is becoming more common; however, these designs are subject to inflation of the type I error rate. We propose a Bayesian sequential design for a binary outcome that uses an alpha-spending function to control the overall type I error rate. Algorithms are presented for calculating critical values and power for the proposed designs. We also propose a new stopping rule for futility. A sensitivity analysis assesses the effects of varying the prior-distribution parameters and the maximum total sample size on the critical values. Alpha-spending functions are compared in terms of power and actual sample size through simulations. Further simulations show that, when the total sample size is fixed, the proposed design has greater power than the traditional Bayesian sequential design, which sets equal stopping bounds at all interim analyses. We also find that, with all other conditions held constant, the proposed design with the new futility stopping rule achieves greater power and can stop earlier with a smaller actual sample size than under the traditional futility stopping rule. Finally, we apply the proposed method to a real data set and compare the results with traditional designs.
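An alpha-spending function of the kind compared here allocates the overall significance level across interim looks as a function of the information fraction t. A minimal sketch of the standard O'Brien-Fleming-type spending function alpha*(t) = 2(1 - Phi(z_{alpha/2}/sqrt(t))), with alpha = 0.05 hard-coded (the paper's own spending functions and design parameters are not reproduced here):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def obf_spending(t):
    """O'Brien-Fleming-type spending: alpha*(t) = 2*(1 - Phi(z/sqrt(t))).
    z = Phi^{-1}(1 - 0.05/2), hard-coded for overall alpha = 0.05."""
    z = 1.959963985
    return 2.0 * (1.0 - norm_cdf(z / math.sqrt(t)))

# Cumulative alpha spent at four equally spaced interim looks
looks = [0.25, 0.5, 0.75, 1.0]
cum = [obf_spending(t) for t in looks]
# Incremental alpha available at each look
incr = [cum[0]] + [b - a for a, b in zip(cum, cum[1:])]
```

This shape spends almost no alpha early (making early stopping hard) and saves nearly the full 0.05 for the final analysis, which is why it is a common benchmark when comparing spending functions by power and expected sample size.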
67.
In this exploratory study, we examine whether organizational mission statement attributes make a difference to the performance of nonprofit performing arts organizations. We use text analysis to measure two semantic attributes of mission statements: activity and commonality. We examine whether these attributes are associated with improved performance of the instrumental and expressive functions of nonprofit performing arts organizations. Our findings indicate that the attribute activity is associated with improved performance of both the instrumental and the expressive function. Our analysis of the null findings for the attribute commonality suggests a need to develop and use content analysis tools tailored to nonprofit contexts.
68.
In this paper, we consider a generalisation of the backward simulation method of Duch et al. [New approaches to operational risk modeling. IBM J Res Develop. 2014;58:1-9] to build bivariate Poisson processes with flexible time correlation structures and to simulate the arrival times of the processes. The proposed backward construction uses the Marshall-Olkin bivariate binomial distribution for the conditional law, and well-known families of bivariate copulas for the joint success probability, in lieu of the typical conditional independence assumption. The resulting bivariate Poisson process can exhibit the various time correlation structures commonly observed in real data.
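The paper's backward construction is more elaborate than what fits here, but the kind of object it generalizes, a bivariate Poisson pair with positive dependence, can be illustrated with the classical common-shock construction N1 = X1 + Z, N2 = X2 + Z, for which Cov(N1, N2) = lambda12. All rate values below are invented for illustration:

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Poisson sampler (Knuth's multiplication method; fine for small lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Common shock Z ~ Poi(l12) induces the dependence between the margins
l1, l2, l12 = 2.0, 3.0, 1.5
pairs = []
for _ in range(5000):
    z = poisson(l12)
    pairs.append((poisson(l1) + z, poisson(l2) + z))

n = len(pairs)
m1 = sum(a for a, _ in pairs) / n            # E[N1] = l1 + l12 = 3.5
m2 = sum(b for _, b in pairs) / n            # E[N2] = l2 + l12 = 4.5
cov = sum((a - m1) * (b - m2) for a, b in pairs) / n  # Cov = l12 = 1.5
```

The common-shock model only ever produces one fixed dependence structure; the copula-driven backward construction in the paper is precisely what allows richer, time-varying correlation patterns.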
69.
The first-order product autoregressive (PAR(1)) model introduced by McKenzie (1982) [Product autoregression: a time series characterization of the gamma distribution. Journal of Applied Probability 19:463-468] did not attract the attention of practitioners, owing to the unavailability of a proper estimation method. This article proposes an estimating function (EF) method to fill the gap. In particular, we suggest an optimal combination of linear and quadratic EFs to overcome the problem of parameter identification. The procedure is applied to Weibull and Gamma PAR(1) models. Simulation and data analysis show that the proposed method performs better than the existing methods.
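The flavor of combining a linear and a quadratic estimating function can be shown on an i.i.d. Gamma sample, a deliberate simplification: the paper works with dependent PAR(1) data and an optimally weighted combination. For a Gamma(shape a, rate r) sample, solving the linear EF sum(x_i - a/r) = 0 jointly with the quadratic EF sum((x_i - a/r)^2 - a/r^2) = 0 reduces to matching the sample mean and variance:

```python
import random

random.seed(3)
# Simulated Gamma sample: shape = 2, scale = 2 (so rate = 0.5),
# giving mean a/r = 4 and variance a/r**2 = 8
data = [random.gammavariate(2.0, 2.0) for _ in range(4000)]

n = len(data)
xbar = sum(data) / n
s2 = sum((x - xbar) ** 2 for x in data) / n

# Jointly solving the two EFs gives a/r = xbar and a/r**2 = s2, hence:
rate_hat = xbar / s2
shape_hat = xbar ** 2 / s2
```

In the dependent PAR(1) setting neither EF identifies the parameters on its own, which is exactly why the paper's optimal combination of the two is needed.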
70.
In this article, statistical inference for the Gompertz distribution based on Type-II progressively hybrid censored data is discussed. The parameters of the Gompertz distribution are estimated by the maximum likelihood (ML) method and by a Bayesian method under three different loss functions. We also prove the existence and uniqueness of the maximum likelihood estimate (MLE). One-sample Bayesian prediction intervals are obtained. The work is carried out for different values of the parameters. We apply Monte Carlo simulation to compare the proposed methods, and an example is discussed to illustrate the construction of the prediction intervals.
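For complete (uncensored) Gompertz data with hazard h(t) = b*exp(c*t), the ML fit reduces to a one-dimensional search, since for fixed c the MLE of b has the closed form b_hat(c) = n*c / sum(exp(c*t_i) - 1). A sketch under that simplification, with simulated data and a crude grid search (the paper's progressively hybrid censored setting, and its Bayesian side, are more involved):

```python
import math
import random

random.seed(5)
b_true, c_true = 0.5, 1.0

def rgompertz(b, c):
    """Inverse-CDF draw from Gompertz with S(t) = exp(-(b/c)(e^{ct}-1))."""
    u = random.random()
    return math.log(1.0 - (c / b) * math.log(u)) / c

data = [rgompertz(b_true, c_true) for _ in range(3000)]
n = len(data)

def profile_loglik(c):
    """Profile log-likelihood in c, with b replaced by its closed-form MLE."""
    s = sum(math.exp(c * t) - 1.0 for t in data)
    b = n * c / s
    return sum(math.log(b) + c * t for t in data) - (b / c) * s

# Crude grid search over c; a real analysis would use Newton's method
grid = [0.2 + 0.02 * i for i in range(150)]
c_hat = max(grid, key=profile_loglik)
b_hat = n * c_hat / sum(math.exp(c_hat * t) - 1.0 for t in data)
```

Under censoring, the likelihood gains survival-function factors for the censored units and the closed form for b_hat changes accordingly, which is where the existence and uniqueness argument in the paper does its work.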
Copyright © 北京勤云科技发展有限公司 · 京ICP备09084417号