3258 results found (search time: 31 ms)
51.
Minimum information bivariate distributions with uniform marginals and a specified rank correlation are studied in this paper. These distributions play an important role in a particular way of modeling dependent random variables which has been used in the computer code UNICORN for carrying out uncertainty analyses. It is shown that these minimum information distributions have a particular form which makes simulation of conditional distributions very simple. Approximations to the continuous distributions are discussed and explicit formulae are determined. Finally, a relation to DAD theorems is discussed, and a numerical algorithm with a geometric rate of convergence is given for determining the minimum information distributions.
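A rough sketch of how such a distribution might be computed numerically: discretize the unit square, scale a kernel of the form exp(θuv) to uniform marginals by alternating (DAD/Sinkhorn-type) row and column normalizations, and adjust θ by bisection until the copula reaches a target Spearman correlation. The kernel form, grid size, and bisection bounds are illustrative assumptions, not the paper's algorithm or the UNICORN implementation.

```python
import numpy as np

def min_info_copula(theta, n=80, iters=300):
    """Discretized minimum-information-style copula mass matrix with uniform
    marginals, obtained by DAD/Sinkhorn scaling of the kernel exp(theta*u*v)."""
    u = (np.arange(n) + 0.5) / n                 # cell midpoints on (0, 1)
    P = np.exp(theta * np.outer(u, u))
    P /= P.sum()
    for _ in range(iters):
        P *= (1.0 / n) / P.sum(axis=1, keepdims=True)   # force row marginals to 1/n
        P *= (1.0 / n) / P.sum(axis=0, keepdims=True)   # force column marginals to 1/n
    return u, P

def spearman_rho(u, P):
    # For a copula with uniform marginals, rho_S = 12 E[UV] - 3 (discrete approximation)
    return 12.0 * (P * np.outer(u, u)).sum() - 3.0

def solve_theta(target_rho, lo=-50.0, hi=50.0, tol=1e-3):
    """Bisection on theta: Spearman's rho is increasing in theta."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        u, P = min_info_copula(mid)
        if spearman_rho(u, P) < target_rho:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta = solve_theta(0.7)
u, P = min_info_copula(theta)
print(theta, spearman_rho(u, P))
```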
52.
In this paper the Bayesian analysis of incomplete categorical data under informative general censoring proposed by Paulino and Pereira (1995) is revisited. That analysis is based on Dirichlet priors and can be applied to any missing-data pattern. However, the known properties of the posterior distributions are scarce, and therefore severe limitations on the posterior computations remain. It is shown here how a Monte Carlo simulation approach based on an alternative parameterisation can be used to overcome these computational difficulties. The proposed simulation approach makes approximate estimation of general parametric functions available and can be implemented in a very straightforward way.
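The generic Monte Carlo idea — draw from a Dirichlet posterior and push the draws through arbitrary parametric functions — can be sketched as follows. The 2x2 table, the flat Dirichlet(1,…,1) prior, and the odds ratio are hypothetical placeholders; the paper's censoring structure and alternative parameterisation are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 table of fully classified counts with a Dirichlet(1,...,1) prior;
# the posterior of the cell probabilities is then Dirichlet(counts + 1).
counts = np.array([30, 10, 15, 45])              # cells (1,1), (1,2), (2,1), (2,2)
posterior_draws = rng.dirichlet(counts + 1.0, size=100_000)

# Approximate the posterior of a general parametric function, e.g. the odds ratio.
p11, p12, p21, p22 = posterior_draws.T
odds_ratio = (p11 * p22) / (p12 * p21)

print("posterior mean:", odds_ratio.mean())
print("95% credible interval:", np.percentile(odds_ratio, [2.5, 97.5]))
```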
53.
In this article, the least squares (LS) estimates of the parameters of periodic autoregressive (PAR) models are investigated for various distributions of the error terms via Monte Carlo simulation. Besides the Gaussian distribution, this study covers the exponential, gamma, Student-t, and Cauchy distributions. The estimates are compared across distributions using bias and MSE criteria. The effects of other factors are also examined, namely the non-constancy of the model orders, the non-constancy of the variances of the seasonal white noise, the period length, and the length of the time series. The simulation results indicate that this method is in general robust for the estimation of the AR parameters with respect to the distribution of the error terms and the other factors. However, the estimates of those parameters were, in some cases, noticeably poor for the Cauchy distribution. It is also noticed that the variances of the estimates of the white noise variances are highly affected by the degree of skewness of the distribution of the error terms.
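A minimal Monte Carlo set-up in the spirit of such a study might look like the sketch below: simulate a PAR(1) process with period 4, estimate the seasonal AR coefficients by season-wise least squares, and summarize bias and MSE over replications. The coefficients, error law (Student-t with 3 df), and sample sizes are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_par1(phi, n_years, error_sampler, burn=50):
    """Simulate a PAR(1) series x_t = phi[t mod S] * x_{t-1} + e_t (illustrative model)."""
    S = len(phi)
    n = (n_years + burn) * S
    x = np.zeros(n)
    e = error_sampler(n)
    for t in range(1, n):
        x[t] = phi[t % S] * x[t - 1] + e[t]
    return x[burn * S:]                     # drop the burn-in years

def ls_estimates(x, S):
    """Season-by-season least squares: regress x_t on x_{t-1} within each season."""
    phi_hat = np.empty(S)
    for s in range(S):
        idx = np.arange(s, len(x), S)
        idx = idx[idx >= 1]
        y, z = x[idx], x[idx - 1]
        phi_hat[s] = (z @ y) / (z @ z)
    return phi_hat

phi_true = np.array([0.5, -0.3, 0.8, 0.2])  # hypothetical seasonal AR(1) coefficients
estimates = np.array([
    ls_estimates(simulate_par1(phi_true, n_years=50,
                               error_sampler=lambda n: rng.standard_t(df=3, size=n)), S=4)
    for _ in range(1000)
])
print("bias:", estimates.mean(axis=0) - phi_true)
print("MSE :", ((estimates - phi_true) ** 2).mean(axis=0))
```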
54.
This contribution deals with the Monte Carlo simulation of generalized Gaussian random variables. This parametric family of distributions has been proposed in many applications in science and engineering to describe physical phenomena, and it also seems to be useful for modelling economic and financial data. For values of the shape parameter α within a certain range, the distribution presents heavy tails. In particular, the cases α=1/3 and α=1/2 are considered. For such values of the shape parameter, different simulation methods are assessed.
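One standard way to simulate generalized Gaussian variates with density proportional to exp(-|x/σ|^α) is the gamma transform |X/σ|^α ~ Gamma(1/α, 1); the sketch below applies it to the heavy-tailed cases α = 1/3 and α = 1/2 mentioned above. Whether this is among the methods assessed in the paper is not stated here, so treat it only as a reference generator.

```python
import numpy as np

rng = np.random.default_rng(2)

def rgg(n, alpha, sigma=1.0):
    """Generalized Gaussian draws with density proportional to exp(-|x/sigma|**alpha).

    Uses the gamma transform: |X/sigma|**alpha ~ Gamma(1/alpha, 1),
    so X = sigma * sign * G**(1/alpha). (scipy.stats.gennorm uses the same
    exponent parameterization, with beta playing the role of alpha.)
    """
    g = rng.gamma(shape=1.0 / alpha, scale=1.0, size=n)
    sign = rng.choice([-1.0, 1.0], size=n)
    return sigma * sign * g ** (1.0 / alpha)

# Heavy-tailed cases discussed in the abstract
x_third = rgg(1_000_000, alpha=1.0 / 3.0)
x_half = rgg(1_000_000, alpha=0.5)
print(np.mean(np.abs(x_half) > 10.0))    # empirical tail mass beyond 10 sigma
```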
55.
Likelihood ratios (LRs) are used to characterize the efficiency of diagnostic tests. In this paper, we use the classical weighted least squares (CWLS) test procedure, originally used for testing the homogeneity of relative risks, to compare the LRs of two or more binary diagnostic tests. We compare the performance of this method with the relative diagnostic likelihood ratio (rDLR) method and the diagnostic likelihood ratio regression (DLRReg) approach in terms of size and power, and we observe that the performances of CWLS and rDLR are the same when comparing two diagnostic tests, while the DLRReg method has higher type I error rates and power. We also examine the performances of the CWLS and DLRReg methods for comparing three diagnostic tests across various sample size and prevalence combinations. On the basis of Monte Carlo simulations, we conclude that all of the tests are generally conservative and have low power, especially in settings of small sample size and low prevalence.
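A sketch of the CWLS homogeneity test applied to positive likelihood ratios, under the simplifying (and here assumed) setting of independent samples: the log LR+ of each test is weighted by the inverse of its delta-method variance, and the weighted sum of squared deviations from the pooled estimate is referred to a chi-square distribution with k−1 degrees of freedom. The counts are hypothetical; paired designs would need covariance terms that this sketch omits.

```python
import numpy as np
from scipy.stats import chi2

def cwls_lr_homogeneity(tp, fn, fp, tn):
    """Classical weighted least squares test of homogeneity of positive
    likelihood ratios across k binary diagnostic tests (independent samples)."""
    tp, fn, fp, tn = map(np.asarray, (tp, fn, fp, tn))
    se = tp / (tp + fn)                        # sensitivity
    sp = tn / (tn + fp)                        # specificity
    log_lr = np.log(se / (1.0 - sp))           # log positive likelihood ratio
    # Delta-method variance of log LR+
    var = (1.0 - se) / (se * (tp + fn)) + sp / ((1.0 - sp) * (fp + tn))
    w = 1.0 / var
    pooled = np.sum(w * log_lr) / np.sum(w)    # CWLS pooled estimate
    q = np.sum(w * (log_lr - pooled) ** 2)     # homogeneity statistic
    df = len(tp) - 1
    return q, chi2.sf(q, df)

# Hypothetical 2x2 counts (TP, FN, FP, TN) for two diagnostic tests
q, p = cwls_lr_homogeneity(tp=[80, 70], fn=[20, 30], fp=[15, 10], tn=[85, 90])
print(f"Q = {q:.3f}, p = {p:.4f}")
```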
56.
Complex models can only be realized a limited number of times due to large computational requirements. Methods exist for generating input parameters for model realizations including Monte Carlo simulation (MCS) and Latin hypercube sampling (LHS). Recent algorithms such as maximinLHS seek to maximize the minimum distance between model inputs in the multivariate space. A novel extension of Latin hypercube sampling (LHSMDU) for multivariate models is developed here that increases the multidimensional uniformity of the input parameters through sequential realization elimination. Correlations are considered in the LHSMDU sampling matrix using a Cholesky decomposition of the correlation matrix. Computer code implementing the proposed algorithm supplements this article. A simulation study comparing MCS, LHS, maximinLHS and LHSMDU demonstrates that increased multidimensional uniformity can significantly improve realization efficiency and that LHSMDU is effective for large multivariate problems.
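The correlation-handling step mentioned above (a Cholesky factor of the correlation matrix) can be illustrated with ordinary Latin hypercube sampling from scipy. This is not the LHSMDU algorithm itself, and rotating the normal scores slightly degrades the one-dimensional stratification (Iman-Conover rank reordering is the usual remedy); it only shows how a target correlation can be induced in LHS-type inputs.

```python
import numpy as np
from scipy.stats import qmc, norm

rng = np.random.default_rng(3)

def correlated_lhs(n, corr):
    """Latin hypercube sample pushed through a Gaussian copula defined by `corr`."""
    d = corr.shape[0]
    u = qmc.LatinHypercube(d=d, seed=rng).random(n)   # stratified uniforms in (0, 1)^d
    z = norm.ppf(u)                                   # map to normal scores
    L = np.linalg.cholesky(corr)                      # Cholesky factor of the correlation
    z_corr = z @ L.T                                  # induce the target correlation
    return norm.cdf(z_corr)                           # back to (0, 1) marginals

corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
u = correlated_lhs(1000, corr)
print(np.corrcoef(norm.ppf(u).T))   # should be close to the target correlation
```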
57.
A large-scale study, in which two million random Voronoi polygons (with respect to a homogeneous Poisson point process) were generated and measured, is described. The polygon characteristics recorded are the number of sides (or vertices), perimeter, area and interior angles. A feature is the efficient "quantile" method of replicating Poisson-type random structures, which it is hoped may find useful application elsewhere.
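A much smaller-scale version of such an experiment, using scipy's Voronoi routine rather than the paper's "quantile" replication method, might look like this: generate a homogeneous Poisson pattern, keep the bounded cells away from the window boundary, and record vertex counts, areas and perimeters. The intensity, window size and edge margin are illustrative choices.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(4)

# Homogeneous Poisson point process with intensity lam on an L x L window.
lam, L = 1.0, 60.0
pts = rng.uniform(0.0, L, size=(rng.poisson(lam * L * L), 2))
vor = Voronoi(pts)

def cell_stats(vertices):
    """Vertex count, shoelace area and perimeter of a convex Voronoi cell."""
    c = vertices.mean(axis=0)                     # order vertices around the centroid
    ang = np.arctan2(vertices[:, 1] - c[1], vertices[:, 0] - c[0])
    v = vertices[np.argsort(ang)]
    x, y = v[:, 0], v[:, 1]
    area = 0.5 * abs(x @ np.roll(y, -1) - y @ np.roll(x, -1))
    perim = np.linalg.norm(v - np.roll(v, -1, axis=0), axis=1).sum()
    return len(v), area, perim

stats = []
for i, reg_idx in enumerate(vor.point_region):
    region = vor.regions[reg_idx]
    if not region or -1 in region:                # unbounded cell
        continue
    if not ((pts[i] > 10.0) & (pts[i] < L - 10.0)).all():
        continue                                  # crude guard against edge effects
    stats.append(cell_stats(vor.vertices[region]))

sides, areas, perims = map(np.array, zip(*stats))
print("mean number of sides:", sides.mean())      # exact theory: 6
print("mean cell area      :", areas.mean())      # exact theory: 1 / lam
```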
58.
For the two-sample location and scale problem we propose an adaptive test which is based on so-called Lepage type tests. The well-known test of Lepage (1971) is a combination of the Wilcoxon test for location alternatives and the Ansari-Bradley test for scale alternatives, and it behaves well for symmetric and medium-tailed distributions. For the case of short-, medium- and long-tailed distributions we replace the Wilcoxon test and the Ansari-Bradley test by suitable other two-sample tests for location and scale, respectively, in order to get higher power than the classical Lepage test for such distributions. These tests are here called Lepage type tests. In practice, however, we generally have no clear idea about the distribution having generated our data. Thus, an adaptive test should be applied which takes the given data set into consideration. The proposed adaptive test is based on the concept of Hogg (1974), i.e., first, to classify the unknown symmetric distribution function with respect to a measure of tailweight and, second, to apply an appropriate Lepage type test for this classified type of distribution. We compare the adaptive test with the three Lepage type tests in the adaptive scheme and with the classical Lepage test as well as with other parametric and nonparametric tests. The power comparison is carried out via Monte Carlo simulation. It is shown that the adaptive test is the best one for the broad class of distributions considered.
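For reference, a sketch of the classical Lepage statistic that the adaptive scheme builds on: the Wilcoxon rank-sum and Ansari-Bradley statistics are each standardized with their exact null moments (assuming no ties) and the squares are summed, giving an asymptotic chi-square(2) statistic. The adaptive step itself — classifying the distribution by a tailweight measure before choosing the component tests — is not implemented here.

```python
import numpy as np
from scipy.stats import rankdata, chi2

def lepage_test(x, y):
    """Classical Lepage (1971) statistic: standardized Wilcoxon rank-sum plus
    standardized Ansari-Bradley statistic; asymptotically chi-square(2) under H0
    (continuous data, no ties assumed)."""
    m, n = len(x), len(y)
    N = m + n
    r = rankdata(np.concatenate([x, y]))

    # Wilcoxon rank-sum part (location)
    W = r[:m].sum()
    EW = m * (N + 1) / 2.0
    VW = m * n * (N + 1) / 12.0

    # Ansari-Bradley part (scale); scores min(rank, N + 1 - rank)
    a = np.minimum(r, N + 1 - r)
    A = a[:m].sum()
    if N % 2 == 0:
        EA = m * (N + 2) / 4.0
        VA = m * n * (N + 2) * (N - 2) / (48.0 * (N - 1))
    else:
        EA = m * (N + 1) ** 2 / (4.0 * N)
        VA = m * n * (N + 1) * (N ** 2 + 3) / (48.0 * N ** 2)

    D = (W - EW) ** 2 / VW + (A - EA) ** 2 / VA
    return D, chi2.sf(D, df=2)

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, size=30)
y = rng.normal(0.5, 2.0, size=35)     # shifted location and inflated scale
print(lepage_test(x, y))
```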
59.
The prediction of earthquake occurrences in seismic areas is a challenging problem in seismology and earthquake engineering. Indeed, the prevention and the quantification of the possible damage provoked by destructive earthquakes are directly linked to this kind of prediction. In our paper, we adopt a parametric semi-Markov approach. This model assumes that a sequence of earthquakes can be seen as a Markov process and, in addition, it allows the more realistic assumption of dependence of the events in space and time to be taken into consideration. The elapsed time between two consecutive events is modeled by a general Weibull distribution. We then determine the transition probabilities and the so-called crossing-state probabilities. We conclude with a Monte Carlo simulation, and the model is validated using a large database containing real data.
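A semi-Markov trajectory of the kind described can be simulated directly: draw the next magnitude class from a transition matrix and the waiting time from a Weibull law attached to that transition. The state classes, transition matrix and Weibull parameters below are purely illustrative placeholders, not values estimated from any catalogue.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical semi-Markov ingredients (illustrative only):
# P[i, j]           - probability that an event in class i is followed by one in class j
# shape/scale[i, j] - Weibull parameters of the waiting time for that transition (years)
states = ["M<5", "5<=M<6", "M>=6"]
P = np.array([[0.6, 0.3, 0.1],
              [0.5, 0.4, 0.1],
              [0.4, 0.4, 0.2]])
shape = np.full((3, 3), 1.2)
scale = np.array([[1.0, 2.0, 5.0],
                  [1.5, 2.5, 6.0],
                  [2.0, 3.0, 8.0]])

def simulate_semi_markov(n_events, start=0):
    """Simulate a semi-Markov sequence: next state from P, holding time from
    the Weibull law attached to the realized transition."""
    s, t = start, 0.0
    history = [(t, states[s])]
    for _ in range(n_events):
        nxt = rng.choice(len(states), p=P[s])
        t += scale[s, nxt] * rng.weibull(shape[s, nxt])   # Weibull(shape) scaled by scale
        history.append((t, states[nxt]))
        s = nxt
    return history

for time, state in simulate_semi_markov(10)[:5]:
    print(f"t = {time:6.2f} yr  state = {state}")
```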
60.
Based on multiphase transient flow theory, a gas-liquid two-phase flow model for the wellbore is established, and the effect of the gas injection rate on wellbore pressure in low-pressure underbalanced drilling is calculated and analyzed. The results show that the effect of the gas injection rate on wellbore pressure depends not only on the injection rate itself but is also closely related to well depth, the geometry of the wellbore and drill string, and the flow rate and properties of the liquid phase in the well. Simply increasing the gas injection rate does not necessarily lower the wellbore pressure; the outcome depends on the balance, under the given conditions, between the hydrostatic pressure and the flow resistance of the gas-liquid two-phase fluid in the wellbore. The model and method presented provide guidance for determining the relevant parameters in low-pressure underbalanced drilling, configuring the surface compressor units, and preparing design schemes.
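A deliberately simplified, steady-state illustration of the trade-off described above (lower hydrostatic head versus higher flow resistance as more gas is injected) can be written as a downward pressure march along the annulus with a no-slip homogeneous mixture. It is not the transient multiphase model of the paper, and every parameter value is an assumption chosen only to make the sketch run.

```python
import numpy as np

# Toy steady-state, no-slip homogeneous model of the annulus return flow:
# bottomhole pressure = surface pressure + hydrostatic head + friction loss,
# integrated downward in small depth steps. All parameters are illustrative.
g = 9.81                 # m/s^2
R_GAS = 287.0            # J/(kg K), air-like injection gas
T = 330.0                # K, assumed constant average temperature
rho_l = 1100.0           # kg/m^3, drilling liquid density
area = 0.012             # m^2, annulus flow area
d_hyd = 0.06             # m, hydraulic diameter
f = 0.02                 # Darcy friction factor (assumed constant)

def bottomhole_pressure(q_liq, q_gas_std, depth, p_surface=0.2e6, dz=10.0):
    """March from surface to `depth` (m), accumulating the hydrostatic and
    frictional pressure gradients of the no-slip gas-liquid mixture."""
    m_l = rho_l * q_liq                          # liquid mass rate, kg/s
    m_g = 1.2 * q_gas_std                        # gas mass rate (std density ~1.2 kg/m^3)
    p = p_surface
    for _ in range(int(depth / dz)):
        rho_g = p / (R_GAS * T)                  # ideal-gas density at local pressure
        q_mix = q_liq + m_g / rho_g              # local volumetric flow rate
        rho_mix = (m_l + m_g) / q_mix            # no-slip mixture density
        v = q_mix / area
        dp_dz = rho_mix * g + f * rho_mix * v**2 / (2.0 * d_hyd)
        p += dp_dz * dz
    return p

# More injection lightens the column but raises friction, so bottomhole
# pressure need not keep falling as the gas rate grows.
for q_g in [0.05, 0.1, 0.2, 0.5, 1.0]:           # std m^3/s of injected gas
    p_bh = bottomhole_pressure(q_liq=0.02, q_gas_std=q_g, depth=2500.0)
    print(f"q_gas = {q_g:4.2f} m3/s  ->  p_bottomhole = {p_bh / 1e6:6.2f} MPa")
```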