81.
Power is, alongside technique and the other elements of the sport, a crucial component of Sanda (Chinese free sparring). In competition it appears in two forms: localized power and whole-body power. The two alternate throughout a bout, and it is precisely their continual interchange, together with the joint action of different kinds of power in different situations, that allows an athlete to win. The scientific application and training of localized and whole-body power not only shapes a well-conditioned athletic physique but also cultivates a distinctive athletic bearing, making Sanda a well-rounded sport.
82.
In this article, the least squares (LS) estimates of the parameters of periodic autoregressive (PAR) models are investigated for various error distributions via Monte Carlo simulation. Besides the Gaussian distribution, the study covers the exponential, gamma, Student-t, and Cauchy distributions. The estimates are compared across distributions using bias and MSE criteria. The effects of other factors are also examined: non-constant model orders, non-constant variances of the seasonal white noise, the period length, and the length of the time series. The simulation results indicate that the method is, in general, robust for estimating the AR parameters with respect to the error distribution and the other factors; however, the estimates were in some cases noticeably poor for the Cauchy distribution. It is also observed that the variances of the estimated white noise variances are strongly affected by the skewness of the error distribution.
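As a concrete illustration of the setting, the following is a minimal sketch (not the authors' code) of simulating a PAR(1) process with Student-t errors and recovering the season-specific AR coefficients by least squares; the period length and coefficient values are arbitrary.

```python
# Illustrative sketch (not the authors' code): simulate a PAR(1) process with
# period S and recover the seasonal AR coefficients by ordinary least squares.
# Model assumed here: X_t = phi_{s(t)} * X_{t-1} + e_t, where s(t) = t mod S.
import numpy as np

rng = np.random.default_rng(0)
S = 4                                   # period length (assumption)
phi = np.array([0.5, -0.3, 0.8, 0.1])   # one AR(1) coefficient per season
n = S * 200                             # 200 full periods

# simulate with Student-t errors to mimic one of the heavy-tailed cases
e = rng.standard_t(df=3, size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t % S] * x[t - 1] + e[t]

# least squares per season: regress X_t on X_{t-1} within each season
phi_hat = np.empty(S)
for s in range(S):
    t_idx = np.arange(1, n)[np.arange(1, n) % S == s]
    y, z = x[t_idx], x[t_idx - 1]
    phi_hat[s] = (z @ y) / (z @ z)      # closed-form OLS slope, no intercept

print("true:", phi)
print("LS  :", phi_hat.round(3))
```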
83.
Likelihood ratios (LRs) are used to characterize the efficiency of diagnostic tests. In this paper, we use the classical weighted least squares (CWLS) test procedure, originally proposed for testing the homogeneity of relative risks, to compare the LRs of two or more binary diagnostic tests. We compare the performance of this method with the relative diagnostic likelihood ratio (rDLR) method and the diagnostic likelihood ratio regression (DLRReg) approach in terms of size and power, and we observe that CWLS and rDLR perform identically when comparing two diagnostic tests, while the DLRReg method has higher type I error rates and higher power. We also examine the performance of the CWLS and DLRReg methods for comparing three diagnostic tests across various combinations of sample size and prevalence. On the basis of Monte Carlo simulations, we conclude that all of the tests are generally conservative and have low power, especially for small sample sizes and low prevalence.
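The following is a minimal sketch of a CWLS-style homogeneity test applied to log positive likelihood ratios; the delta-method variance, the inverse-variance weights, and the chi-square reference are standard choices assumed here, not necessarily the paper's exact specification, and the function name and data are hypothetical.

```python
# Illustrative sketch: a weighted least squares style homogeneity test on
# log positive likelihood ratios, LR+ = Se / (1 - Sp). The variance of log LR+
# is approximated by the delta method; the statistic
# Q = sum_i w_i (theta_i - theta_bar)^2 is referred to chi-square with k-1 df.
import numpy as np
from scipy.stats import chi2

def cwls_logLR_test(se, sp, n_dis, n_nondis):
    """se, sp: sensitivities/specificities of k tests; n_dis, n_nondis: group sizes."""
    se, sp = np.asarray(se, float), np.asarray(sp, float)
    n_dis, n_nondis = np.asarray(n_dis, float), np.asarray(n_nondis, float)
    theta = np.log(se / (1.0 - sp))                                   # log LR+
    var = (1 - se) / (n_dis * se) + sp / (n_nondis * (1 - sp))        # delta method
    w = 1.0 / var                                                     # inverse-variance weights
    theta_bar = np.sum(w * theta) / np.sum(w)
    Q = np.sum(w * (theta - theta_bar) ** 2)
    return Q, chi2.sf(Q, df=len(theta) - 1)

# two hypothetical tests, each evaluated on 100 diseased and 200 non-diseased subjects
Q, p = cwls_logLR_test(se=[0.90, 0.82], sp=[0.85, 0.88],
                       n_dis=[100, 100], n_nondis=[200, 200])
print(f"Q = {Q:.3f}, p = {p:.3f}")
```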
84.
Louis Anthony Cox, Jr. Risk Analysis. 2012, 32(11): 1919-1934
Extreme and catastrophic events pose challenges for normative models of risk management decision making. They invite development of new methods and principles to complement existing normative decision and risk analysis. Because such events are rare, it is difficult to learn about them from experience. They can prompt both too little concern before the fact and too much after. Emotionally charged and vivid outcomes promote probability neglect and distort risk perceptions. Aversion to acting on uncertain probabilities saps precautionary action; moral hazard distorts incentives to take care; and imperfect learning and social adaptation (e.g., herd-following, group-think) complicate forecasting and coordination of individual behaviors and undermine prediction, preparation, and insurance against catastrophic events. Such difficulties raise substantial challenges for normative decision theories prescribing how catastrophe risks should be managed. This article summarizes challenges for catastrophic hazards with uncertain or unpredictable frequencies and severities, hard-to-envision and incompletely described decision alternatives and consequences, and individual responses that influence each other. Conceptual models and examples clarify where and why new methods are needed to complement traditional normative decision theories for individuals and groups. For example, prospective and retrospective preferences for risk management alternatives may conflict; procedures for combining individual beliefs or preferences can produce collective decisions that no one favors; and individual choices or behaviors in preparing for possible disasters may have no equilibrium. Recent ideas for building "disaster-resilient" communities can complement traditional normative decision theories, helping to meet the practical need for better ways to manage risks of extreme and catastrophic events.
85.
The paper studies five entropy-based tests of exponentiality, with statistics built on different entropy estimators. Critical values for various sample sizes, determined by Monte Carlo simulation, are presented for each test statistic. By simulation, we compare the power of the five tests for various alternatives and sample sizes.
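As one classical construction in this spirit (not necessarily one of the paper's five statistics), the sketch below computes Vasicek's spacing-based entropy estimate and rejects exponentiality for small values of exp(H)/x̄, with the critical value obtained by Monte Carlo as described above; the window size m and the significance level are arbitrary choices.

```python
# Illustrative sketch: Vasicek's entropy estimator and an entropy-based test of
# exponentiality. Since the exponential maximizes entropy among nonnegative
# distributions with a given mean, small values of T = exp(H_mn)/mean(x) speak
# against exponentiality; the critical value is simulated under the null.
import numpy as np

rng = np.random.default_rng(1)

def vasicek_entropy(x, m):
    """Vasicek spacing-based entropy estimate with window size m."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    upper = x[np.minimum(np.arange(n) + m, n - 1)]   # x_(i+m), truncated at the ends
    lower = x[np.maximum(np.arange(n) - m, 0)]       # x_(i-m), truncated at the ends
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

def exp_test_stat(x, m):
    return np.exp(vasicek_entropy(x, m)) / np.mean(x)

n, m, alpha, n_mc = 30, 4, 0.05, 10000
# Monte Carlo critical value under H0 (scale-free, so rate 1 suffices)
null_stats = np.array([exp_test_stat(rng.exponential(size=n), m) for _ in range(n_mc)])
crit = np.quantile(null_stats, alpha)                # reject for small T

sample = rng.weibull(2.0, size=n)                    # a Weibull(2) alternative
T = exp_test_stat(sample, m)
print(f"T = {T:.3f}, critical value = {crit:.3f}, reject: {T < crit}")
```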
86.
For the two-sample location and scale problem we propose an adaptive test based on so-called Lepage-type tests. The well-known test of Lepage (1971) is a combination of the Wilcoxon test for location alternatives and the Ansari-Bradley test for scale alternatives, and it behaves well for symmetric and medium-tailed distributions. For short-, medium-, and long-tailed distributions we replace the Wilcoxon test and the Ansari-Bradley test by other suitable two-sample tests for location and scale, respectively, in order to obtain higher power than the classical Lepage test for such distributions. These tests are called Lepage-type tests here. In practice, however, we generally have no clear idea about the distribution that generated our data, so an adaptive test should be applied, one that takes the given data set into consideration. The proposed adaptive test is based on the concept of Hogg (1974): first, classify the unknown symmetric distribution function with respect to a measure of tailweight, and second, apply an appropriate Lepage-type test for the classified type of distribution. We compare the adaptive test with the three Lepage-type tests in the adaptive scheme, with the classical Lepage test, and with other parametric and nonparametric tests. The power comparison is carried out via Monte Carlo simulation. It is shown that the adaptive test is the best one for the broad class of distributions considered.
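The sketch below illustrates the two-stage idea with a Hogg-style tailweight selector and a Lepage-type sum of squared standardized rank statistics calibrated by permutation; the classification cutoffs and the Wilcoxon/Ansari-Bradley pair are placeholders (in the full scheme the selected class would switch which location and scale scores are combined), not the paper's exact specification.

```python
# Illustrative sketch (selector cutoffs and component statistics are placeholders):
# classify tailweight with a Hogg-style measure, then combine a location statistic
# (Wilcoxon rank sum) and a scale statistic (Ansari-Bradley score sum) into a
# Lepage-type sum of squared standardized components, calibrated by permutation.
import numpy as np

rng = np.random.default_rng(2)

def tailweight(z, frac=0.05):
    """Hogg-style tailweight: spread of the extreme 5% means over the central 50% means."""
    z = np.sort(z)
    k, h = max(1, int(frac * len(z))), max(1, len(z) // 2)
    return (z[-k:].mean() - z[:k].mean()) / (z[-h:].mean() - z[:h].mean())

def lepage_type_test(x, y, n_perm=5000):
    m, N = len(x), len(x) + len(y)
    pooled = np.concatenate([x, y])
    ranks = pooled.argsort().argsort() + 1           # ranks 1..N (continuous data, no ties)
    ab = np.minimum(ranks, N + 1 - ranks)            # Ansari-Bradley scores

    def components(idx):                             # rank sum and score sum of the "x" group
        return ranks[idx].sum(), ab[idx].sum()

    W_obs, A_obs = components(np.arange(m))
    perm = np.array([components(rng.permutation(N)[:m]) for _ in range(n_perm)])
    muW, sdW = perm[:, 0].mean(), perm[:, 0].std()
    muA, sdA = perm[:, 1].mean(), perm[:, 1].std()
    L_obs = ((W_obs - muW) / sdW) ** 2 + ((A_obs - muA) / sdA) ** 2
    L_perm = ((perm[:, 0] - muW) / sdW) ** 2 + ((perm[:, 1] - muA) / sdA) ** 2
    return L_obs, (L_perm >= L_obs).mean()           # permutation p-value

x = rng.normal(0.0, 1.0, 30)
y = rng.normal(0.5, 2.0, 30)                         # location and scale both differ
tw = tailweight(np.concatenate([x, y]))
label = "short" if tw < 2.1 else ("medium" if tw < 3.0 else "long")  # illustrative cutoffs
L, p = lepage_type_test(x, y)
print(f"tailweight = {tw:.2f} ({label}-tailed), L = {L:.2f}, permutation p = {p:.4f}")
```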
87.
Recently, Ristić and Nadarajah [A new lifetime distribution. J Stat Comput Simul. 2014;84:135-150] introduced the Poisson generated family of distributions and investigated the properties of a special case named the exponentiated-exponential Poisson distribution. In this paper, we study general mathematical properties of the Poisson-X family in the context of the T-X family of distributions pioneered by Alzaatreh et al. [A new method for generating families of continuous distributions. Metron. 2013;71:63-79], which include the quantile function, shapes of the density and hazard rate functions, asymptotics, and Shannon entropy. We obtain a useful linear representation of the family density and explicit expressions for the ordinary and incomplete moments, mean deviations, and generating function. One special lifetime model, called the Poisson power-Cauchy, is defined and some of its properties are investigated. This model can have flexible hazard rate shapes such as increasing, decreasing, bathtub, and upside-down bathtub. The method of maximum likelihood is used to estimate the model parameters. We illustrate the flexibility of the new distribution by means of three applications to real-life data sets.
88.
Communications in Statistics - Theory and Methods. 2012, 41(16-17): 3233-3243
In the literature there are several studies on the performance of Bayesian network structure learning algorithms. The focus of these studies is almost always the heuristics the learning algorithms are based on, i.e., the maximization algorithms (in score-based algorithms) or the techniques for learning the dependencies of each variable (in constraint-based algorithms). In this article, we investigate how the use of permutation tests instead of parametric ones affects the performance of Bayesian network structure learning from discrete data. Shrinkage tests are also covered to provide a broad overview of the techniques developed in the current literature.
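For concreteness, the following is a generic sketch (not the article's implementation) of a permutation version of the G² independence test for two discrete variables, the kind of test a constraint-based structure learner repeatedly calls; the parametric counterpart would refer G² to a chi-square distribution.

```python
# Illustrative sketch: a permutation version of the G-squared independence test
# for two discrete variables. The permutation null is built by shuffling one
# variable, which preserves both marginals while breaking any dependence.
import numpy as np

rng = np.random.default_rng(3)

def g2_statistic(x, y):
    """G2 = 2 * sum O * log(O / E) over the contingency table of x and y."""
    xs, ys = np.unique(x), np.unique(y)
    obs = np.array([[np.sum((x == a) & (y == b)) for b in ys] for a in xs], float)
    exp = obs.sum(axis=1, keepdims=True) * obs.sum(axis=0, keepdims=True) / obs.sum()
    mask = obs > 0                                    # 0 * log(0) = 0 by convention
    return 2.0 * np.sum(obs[mask] * np.log(obs[mask] / exp[mask]))

def permutation_g2_test(x, y, n_perm=2000):
    g2_obs = g2_statistic(x, y)
    g2_perm = np.array([g2_statistic(x, rng.permutation(y)) for _ in range(n_perm)])
    return g2_obs, (g2_perm >= g2_obs).mean()         # permutation p-value

# weakly dependent binary variables, small sample: the regime where permutation
# and asymptotic chi-square calibrations are expected to differ most
x = rng.integers(0, 2, 40)
y = np.where(rng.random(40) < 0.7, x, rng.integers(0, 2, 40))
g2, p = permutation_g2_test(x, y)
print(f"G2 = {g2:.2f}, permutation p = {p:.4f}")
```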
89.
A statistical test can be seen as a procedure that produces a decision based on observed data, where some decisions consist of rejecting a hypothesis (yielding a significant result) and some do not, and where one controls the probability of making a wrong rejection at some prespecified significance level. Whereas traditional hypothesis testing involves only two possible decisions (to reject or not reject a null hypothesis), Kaiser's directional two-sided test, as well as the more recently introduced testing procedure of Jones and Tukey, each equivalent to running two one-sided tests, involves three possible decisions to infer the value of a unidimensional parameter. The latter procedure assumes that a point null hypothesis is impossible (e.g., that two treatments cannot have exactly the same effect), allowing a gain of statistical power. There are, however, situations where a point hypothesis is indeed plausible, for example, when considering hypotheses derived from Einstein's theories. In this article, we introduce a five-decision-rule testing procedure, equivalent to running a traditional two-sided test in addition to two one-sided tests, which combines the advantages of the testing procedures of Kaiser (no assumption that a point hypothesis is impossible) and Jones and Tukey (higher power), allowing for a nonnegligible (typically 20%) reduction of the sample size needed to reach a given statistical power for a significant result, compared to the traditional approach.
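One plausible way to operationalize such a rule for a one-sample t-test is sketched below; the mapping from the three tests to the five decisions follows the description above but is a reading of it, not the authors' exact specification.

```python
# Illustrative sketch (a plausible reading of the rule described above, not the
# authors' exact specification): combine a two-sided one-sample t-test with the
# two corresponding one-sided tests, all at level alpha, into five decisions
# about mu relative to mu0.
import numpy as np
from scipy.stats import ttest_1samp

def five_decision_rule(x, mu0=0.0, alpha=0.05):
    t, p_two = ttest_1samp(x, mu0)                        # two-sided test
    p_greater = ttest_1samp(x, mu0, alternative="greater").pvalue
    p_less = ttest_1samp(x, mu0, alternative="less").pvalue
    if p_two <= alpha:                                    # two-sided rejection: strict direction
        return "conclude mu > mu0" if t > 0 else "conclude mu < mu0"
    if p_greater <= alpha:                                # only the one-sided test rejects
        return "conclude mu >= mu0"
    if p_less <= alpha:
        return "conclude mu <= mu0"
    return "no decision"

rng = np.random.default_rng(4)
print(five_decision_rule(rng.normal(0.15, 1.0, size=120)))
```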
90.
Building on the advantages and the definition of the Kendall's τ rank correlation coefficient, this paper proposes new dynamic conditional correlation copula models with a clear economic interpretation. The commonly used Gaussian, Clayton, and Gumbel copulas are made dynamic in a unified way through the same evolution equation for Kendall's τ, yielding three dynamic conditional Kendall's τ copula models that can capture different dependence patterns. These models have few parameters and are easy to estimate; they avoid the drawback that existing dynamic conditional copula models are constructed in heterogeneous ways, which hampers empirical comparison; and they allow multi-step-ahead forecasting, substantially reducing the computational burden of out-of-sample prediction. They thus provide a new approach to describing complex dynamic dependence patterns such as time variation, nonlinearity, asymmetry, and tail dependence.
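To illustrate the unification through Kendall's τ, the sketch below drives a single τ_t process and translates it at each step into the parameter of a Gaussian, Clayton, or Gumbel copula using the standard τ inversions; the evolution equation is a placeholder, not the paper's specification.

```python
# Illustrative sketch of the unification idea (the evolution equation below is a
# placeholder; the tau-to-parameter maps are the standard ones): drive a single
# Kendall's tau process tau_t, then translate it into the parameter of a
# Gaussian, Clayton, or Gumbel copula at each step.
import numpy as np

def tau_to_param(tau, family):
    """Standard Kendall's-tau inversions for three copula families."""
    if family == "gaussian":
        return np.sin(np.pi * tau / 2.0)          # rho = sin(pi * tau / 2)
    if family == "clayton":
        return 2.0 * tau / (1.0 - tau)            # theta = 2 * tau / (1 - tau), tau in (0, 1)
    if family == "gumbel":
        return 1.0 / (1.0 - tau)                  # theta = 1 / (1 - tau), tau in [0, 1)
    raise ValueError(family)

def simulate_tau_path(T, omega=0.05, beta=0.9, seed=0):
    """Placeholder evolution equation: a noisy mean-reverting recursion mapped into (0, 1)."""
    rng = np.random.default_rng(seed)
    z = np.zeros(T)
    for t in range(1, T):
        z[t] = omega + beta * z[t - 1] + 0.1 * rng.standard_normal()
    return 1.0 / (1.0 + np.exp(-z))               # logistic map keeps tau_t in (0, 1)

tau_path = simulate_tau_path(5)
for fam in ("gaussian", "clayton", "gumbel"):
    print(fam, np.round(tau_to_param(tau_path, fam), 3))
```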