41.
In this article, the least squares (LS) estimates of the parameters of periodic autoregressive (PAR) models are investigated for various distributions of the error terms via Monte Carlo simulation. Besides the Gaussian distribution, this study covers the exponential, gamma, Student's t, and Cauchy distributions. The estimates are compared across distributions using the bias and MSE criteria. The effects of other factors are also examined: non-constancy of the model orders, non-constancy of the variances of the seasonal white noise, the period length, and the length of the time series. The simulation results indicate that the method is in general robust for the estimation of the AR parameters, with respect to both the distribution of the error terms and the other factors. However, the estimates of those parameters were in some cases noticeably poor for the Cauchy distribution. It is also noticed that the variances of the estimates of the white noise variances are strongly affected by the degree of skewness of the distribution of the error terms.
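As a rough illustration of this kind of experimental design, the sketch below (a hypothetical Python setup, not the authors' code) simulates a PAR(1) process with period 4 and records the bias and MSE of the seasonal LS estimates under Gaussian, Student's t and Cauchy errors.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_par1(phi, n_cycles, noise):
    """Simulate a PAR(1) series X_t = phi[s] * X_{t-1} + e_t, with s = t mod S."""
    S = len(phi)
    x = np.zeros(n_cycles * S)
    for t in range(1, len(x)):
        x[t] = phi[t % S] * x[t - 1] + noise()
    return x

def ls_estimates(x, S):
    """Seasonal LS: regress X_t on X_{t-1} separately within each season."""
    est = np.empty(S)
    t = np.arange(1, len(x))
    for s in range(S):
        idx = t[t % S == s]
        est[s] = np.sum(x[idx] * x[idx - 1]) / np.sum(x[idx - 1] ** 2)
    return est

phi_true = np.array([0.5, -0.3, 0.8, 0.2])   # hypothetical seasonal AR(1) coefficients
errors = {"gaussian": lambda: rng.standard_normal(),
          "student_t3": lambda: rng.standard_t(3),
          "cauchy": lambda: rng.standard_cauchy()}

for name, noise in errors.items():
    reps = np.array([ls_estimates(simulate_par1(phi_true, 250, noise), 4)
                     for _ in range(500)])
    bias = reps.mean(axis=0) - phi_true
    mse = ((reps - phi_true) ** 2).mean(axis=0)
    print(name, "bias:", np.round(bias, 3), "MSE:", np.round(mse, 3))
```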
42.
This contribution deals with the Monte Carlo simulation of generalized Gaussian random variables. This parametric family of distributions has been proposed in many applications in science and engineering to describe physical phenomena, and it also appears useful for modelling economic and financial data. For values of the shape parameter α within a certain range, the distribution has heavy tails. In particular, the cases α=1/3 and α=1/2 are considered. For these values of the shape parameter, different simulation methods are assessed.
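One standard sampling route for this family, assuming the standardized density f(x) proportional to exp(-|x|^alpha), transforms a gamma variate; whether this is among the methods assessed in the paper is not stated here, so treat the sketch as illustrative.

```python
import numpy as np

# If f(x) is proportional to exp(-|x|**alpha), then |X|**alpha ~ Gamma(1/alpha, 1),
# so a draw is a gamma variate raised to 1/alpha with a random sign.
rng = np.random.default_rng(1)

def rgg(alpha, size):
    g = rng.gamma(shape=1.0 / alpha, scale=1.0, size=size)
    sign = rng.choice([-1.0, 1.0], size=size)
    return sign * g ** (1.0 / alpha)

for alpha in (1 / 3, 1 / 2):     # the heavy-tailed cases considered in the paper
    x = rgg(alpha, 100_000)
    # sanity check: E|X|^alpha equals 1/alpha under this parameterization
    print(f"alpha={alpha:.3f}: mean|X|^alpha = {np.mean(np.abs(x)**alpha):.3f}",
          f"(theory {1/alpha:.3f})")
```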
43.
Likelihood ratios (LRs) are used to characterize the efficiency of diagnostic tests. In this paper, we use the classical weighted least squares (CWLS) test procedure, originally developed for testing the homogeneity of relative risks, to compare the LRs of two or more binary diagnostic tests. We compare the performance of this method with the relative diagnostic likelihood ratio (rDLR) method and the diagnostic likelihood ratio regression (DLRReg) approach in terms of size and power, and we observe that the performances of CWLS and rDLR are the same when comparing two diagnostic tests, while the DLRReg method has higher type I error rates and power. We also examine the performances of the CWLS and DLRReg methods for comparing three diagnostic tests in various sample size and prevalence combinations. On the basis of Monte Carlo simulations, we conclude that all of the tests are generally conservative and have low power, especially in settings of small sample size and low prevalence.
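A minimal sketch of a CWLS-style homogeneity test for positive likelihood ratios is given below; the data and the exact weighting scheme are hypothetical and may differ from the paper's.

```python
import numpy as np
from scipy.stats import chi2

# For each test: tp diseased positives out of n1 diseased subjects,
# fp non-diseased positives out of n0 non-diseased subjects.
def cwls_lr_test(tp, n1, fp, n0):
    tp, n1, fp, n0 = map(np.asarray, (tp, n1, fp, n0))
    log_lr = np.log((tp / n1) / (fp / n0))       # log positive LR per test
    var = 1/tp - 1/n1 + 1/fp - 1/n0              # delta-method variance of log LR
    w = 1 / var                                  # inverse-variance (CWLS) weights
    pooled = np.sum(w * log_lr) / np.sum(w)
    q = np.sum(w * (log_lr - pooled) ** 2)       # homogeneity chi-square statistic
    df = len(log_lr) - 1
    return q, chi2.sf(q, df)

# two hypothetical diagnostic tests on 100 diseased / 200 healthy subjects
q, p = cwls_lr_test(tp=[85, 78], n1=[100, 100], fp=[20, 30], n0=[200, 200])
print(f"Q = {q:.3f}, p-value = {p:.4f}")
```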
44.
Complex models can only be realized a limited number of times due to large computational requirements. Methods exist for generating input parameters for model realizations, including Monte Carlo simulation (MCS) and Latin hypercube sampling (LHS). Recent algorithms such as maximinLHS seek to maximize the minimum distance between model inputs in the multivariate space. A novel extension of Latin hypercube sampling (LHSMDU) for multivariate models is developed here that increases the multidimensional uniformity of the input parameters through sequential realization elimination. Correlations are incorporated into the LHSMDU sampling matrix using a Cholesky decomposition of the correlation matrix. Computer code implementing the proposed algorithm supplements this article. A simulation study comparing MCS, LHS, maximinLHS and LHSMDU demonstrates that increased multidimensional uniformity can significantly improve realization efficiency and that LHSMDU is effective for large multivariate problems.
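The sketch below gives one simplified reading of the LHSMDU idea (oversample, sequentially eliminate the realization closest to its neighbours, then rank each column to restore the Latin structure); it is not the code supplementing the article, and the exact elimination criterion is an assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

def lhsmdu(n, d, oversample=5):
    m = oversample * n
    pts = rng.uniform(size=(m, d))               # oversampled Monte Carlo cloud
    keep = list(range(m))
    while len(keep) > n:
        sub = pts[keep]
        dist = np.linalg.norm(sub[:, None, :] - sub[None, :, :], axis=-1)
        np.fill_diagonal(dist, np.inf)
        # drop the point whose two nearest neighbours are closest on average
        nearest2 = np.sort(dist, axis=1)[:, :2].mean(axis=1)
        keep.pop(int(np.argmin(nearest2)))
    sub = pts[keep]
    # restore Latin structure: ranks per column -> jittered strata in [0, 1)
    ranks = np.argsort(np.argsort(sub, axis=0), axis=0)
    return (ranks + rng.uniform(size=(n, d))) / n

x = lhsmdu(n=20, d=3)
print(x.min(axis=0), x.max(axis=0))   # each column fills [0, 1) in 20 strata
```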
45.
Measuring and improving the efficiency of the Chinese commercial banking system has recently attracted increasing interest. Few studies, however, have adopted the two-stage network DEA to explore this issue in the Chinese context. Because the entire operational process of the banking system can be divided into two sub-processes (deposit producing and profit earning), evaluating the sub-process efficiencies helps identify the sources of the inefficiency of the entire banking system. In this study, we utilize the network DEA approach to disaggregate, evaluate and test the efficiencies of 16 major Chinese commercial banks during the third round of the Chinese banking reform period (2003–2011) under a variable-returns-to-scale setting and with consideration of undesirable (bad) outputs. The main findings of this study are as follows: (i) the two-stage DEA model is more effective than the conventional black-box DEA model in identifying the inefficiency of the banking system, and the inefficiency of the Chinese banking system primarily results from the inefficiency of its deposit-producing sub-process; (ii) the overall efficiency of the Chinese banking system improves over the study period because of the reform; (iii) the state-owned commercial banks (SOBs) appear to be more efficient overall than the joint-stock commercial banks (JSBs) only in the pre-reform period, and the efficiency difference between the SOBs and the JSBs narrows over the post-reform period; (iv) the disposal of non-performing loans (NPLs) from the Chinese banking system in general explains its efficiency improvement, and the joint-equity reform of the SOBs specifically increases their efficiencies.
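For orientation, the sketch below implements only the standard single-stage, input-oriented DEA model under variable returns to scale, a building block of such network models; the two-stage decomposition and the undesirable-output treatment of the paper are not reproduced, and the bank data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented BCC (VRS) envelopment model for DMU o.
# X: inputs (n_dmu x m), Y: outputs (n_dmu x s).
def bcc_efficiency(X, Y, o):
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # variables: [theta, lambda_1..n]
    # inputs:  sum_j lambda_j x_ij - theta x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # outputs: -sum_j lambda_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    A_eq = np.hstack([[0.0], np.ones(n)]).reshape(1, -1)   # VRS: sum lambda = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                               # efficiency score theta

# three hypothetical bank-like DMUs: inputs (staff, assets), output (profit)
X = np.array([[5.0, 10.0], [8.0, 12.0], [6.0, 8.0]])
Y = np.array([[3.0], [4.0], [3.5]])
print([round(bcc_efficiency(X, Y, o), 3) for o in range(len(X))])
```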
46.
A large-scale study, in which two million random Voronoi polygons (with respect to a homogeneous Poisson point process) were generated and mensurated, is described. The polygon characteristics recorded are number of sides (or vertices), perimeter, area and interior angles. A feature is the efficient “quantile” method of replicating Poisson-type random structures, which it is hoped may find useful application elsewhere.
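The measurement step can be sketched as follows (illustrative only; the paper's “quantile” replication scheme is not reproduced): generate a homogeneous Poisson pattern, build the tessellation, and record sides, perimeter and area of bounded cells away from the boundary.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(3)
n_pts = rng.poisson(2000)                        # Poisson count of points...
pts = rng.uniform(0.0, 10.0, size=(n_pts, 2))    # ...uniform locations: intensity 20
vor = Voronoi(pts)

records = []
for region_idx in vor.point_region:
    region = vor.regions[region_idx]
    if -1 in region or len(region) < 3:
        continue                                 # skip unbounded cells
    v = vor.vertices[region]
    if v.min() < 1.0 or v.max() > 9.0:
        continue                                 # crude guard against edge effects
    edges = np.roll(v, -1, axis=0) - v
    perimeter = np.linalg.norm(edges, axis=1).sum()
    x, y = v[:, 0], v[:, 1]                      # shoelace formula for the area
    area = 0.5 * abs(np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y))
    records.append((len(region), perimeter, area))

stats = np.array(records)
print("cells measured:", len(stats))
print("mean sides, perimeter, area:", stats.mean(axis=0).round(3))
```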
47.
Asymptotic confidence (delta) intervals and intervals based upon the use of Fieller's theorem are alternative methods for constructing intervals for the $\gamma$% effective doses (ED$_\gamma$). Sitter and Wu (1993) provided a comparison of the two approaches for the ED$_{50}$, for the case in which a logistic dose-response curve is assumed. They showed that the Fieller intervals are generally superior. In this paper, we introduce two new families of intervals, both of which include the delta and Fieller intervals as special cases. In addition, we consider interval estimation of the ED$_{90}$ as well as the ED$_{50}$. We provide a comparison of the various methods for the problem of constructing a confidence interval for the ED$_\gamma$.
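A sketch of the two basic constructions (delta and Fieller) for the ED$_\gamma$ of a fitted logistic model is given below; the coefficient estimates and covariance matrix are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import norm

# ED_gamma solves b0 + b1*x = logit(gamma), i.e. ED = (logit(gamma) - b0) / b1.
def ed_intervals(b0, b1, cov, gamma=0.5, level=0.95):
    z = norm.ppf(0.5 + level / 2)
    c = np.log(gamma / (1 - gamma))              # logit(gamma); 0 for the ED50
    ed = (c - b0) / b1
    # delta method: gradient of ED with respect to (b0, b1)
    g = np.array([-1 / b1, -(c - b0) / b1**2])
    se = np.sqrt(g @ cov @ g)
    delta = (ed - z * se, ed + z * se)
    # Fieller: roots of the quadratic in rho from (a - rho*b1)^2 = z^2 Var(a - rho*b1)
    a = c - b0
    vaa, vab, vbb = cov[0, 0], -cov[0, 1], cov[1, 1]   # covariance of (a, b1)
    A = b1**2 - z**2 * vbb
    B = -2 * (a * b1 - z**2 * vab)
    C = a**2 - z**2 * vaa
    disc = B**2 - 4 * A * C
    fieller = tuple(np.sort(np.roots([A, B, C]))) if disc > 0 and A > 0 else None
    return ed, delta, fieller

cov = np.array([[0.04, -0.01], [-0.01, 0.02]])   # hypothetical covariance matrix
print(ed_intervals(b0=-2.0, b1=0.8, cov=cov))
```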
48.
We consider a problem of evaluating efficiency of Decision Making Units (DMUs) based on their deterministic performance on multiple consumed inputs and multiple produced outputs. We apply a ratio-based efficiency measure, and account for the Decision Maker's preference information representable with linear constraints involving input/output weights. We analyze the set of all feasible weights to answer various robustness concerns by deriving: (1) extreme efficiency scores and (2) extreme efficiency ranks for each DMU, (3) possible and necessary efficiency preference relations for pairs of DMUs, (4) efficiency distribution, (5) efficiency rank acceptability indices, and (6) pairwise efficiency outranking indices. The proposed hybrid approach combines and extends previous results from Ratio-based Efficiency Analysis and the SMAA-D method. The practical managerial implications are derived from the complementary character of the accounted perspectives on DMUs' efficiencies. We present an innovative open-source software implementing an integrated framework for robustness analysis using a ratio-based efficiency model on the diviz platform. The proposed approach is applied to a real-world problem of evaluating efficiency of Polish airports. We consider four inputs related to the capacities of a terminal, runways, and an apron, and to the airport's catchment area, and two outputs concerning passenger traffic and number of aircraft movements. We present how the results can be affected by integrating the weight constraints and eliminating outlier DMUs.
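A simplified reading of the Monte Carlo step behind such rank acceptability indices is sketched below (uniform weight sampling over the simplex; the paper's linear weight constraints and the diviz implementation are not reproduced, and the DMU data are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(4)

def rank_acceptability(X, Y, n_samples=10_000):
    n = X.shape[0]
    counts = np.zeros((n, n))                    # counts[k, r]: DMU k attains rank r
    for _ in range(n_samples):
        v = rng.dirichlet(np.ones(X.shape[1]))   # input weights, uniform on simplex
        u = rng.dirichlet(np.ones(Y.shape[1]))   # output weights
        ratio = (Y @ u) / (X @ v)
        eff = ratio / ratio.max()                # ratio-based efficiency in (0, 1]
        ranks = np.argsort(np.argsort(-eff))     # rank 0 = most efficient
        counts[np.arange(n), ranks] += 1
    return counts / n_samples                    # rank acceptability indices

# four hypothetical airport-like DMUs: 2 inputs, 2 outputs
X = np.array([[3.0, 2.0], [4.0, 3.0], [2.0, 4.0], [5.0, 5.0]])
Y = np.array([[10.0, 4.0], [12.0, 5.0], [9.0, 6.0], [11.0, 5.0]])
print(np.round(rank_acceptability(X, Y), 3))
```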
49.
For the two-sample location and scale problem we propose an adaptive test based on so-called Lepage-type tests. The well-known test of Lepage (1971) is a combination of the Wilcoxon test for location alternatives and the Ansari-Bradley test for scale alternatives, and it behaves well for symmetric and medium-tailed distributions. For the case of short-, medium- and long-tailed distributions we replace the Wilcoxon test and the Ansari-Bradley test by other suitable two-sample tests for location and scale, respectively, in order to obtain higher power than the classical Lepage test for such distributions. These tests are here called Lepage-type tests. In practice, however, we generally have no clear idea about the distribution that generated our data. Thus, an adaptive test should be applied which takes the given data set into consideration. The proposed adaptive test is based on the concept of Hogg (1974): first, classify the unknown symmetric distribution function with respect to a measure of tailweight, and second, apply an appropriate Lepage-type test for this classified type of distribution. We compare the adaptive test with the three Lepage-type tests in the adaptive scheme, with the classical Lepage test, and with other parametric and nonparametric tests. The power comparison is carried out via Monte Carlo simulation. It is shown that the adaptive test is the best one for the broad class of distributions considered.
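The classical Lepage statistic, the building block of the Lepage-type tests above, can be sketched as follows; the Hogg-style tailweight selector is omitted here.

```python
import numpy as np
from scipy.stats import rankdata, chi2

# Lepage = squared standardized Wilcoxon + squared standardized Ansari-Bradley,
# referred to a chi-square distribution with 2 degrees of freedom.
def lepage(x, y):
    m, n = len(x), len(y)
    N = m + n
    r = rankdata(np.concatenate([x, y]))         # pooled midranks
    scores = {"wilcoxon": r, "ansari": np.minimum(r, N + 1 - r)}
    stat = 0.0
    for s in scores.values():
        t = s[:m].sum()                          # linear rank statistic for sample x
        # moments of a sum of m scores drawn without replacement from N scores
        mean = m * s.mean()
        var = m * n / (N - 1) * s.var()          # s.var() is the population variance
        stat += (t - mean) ** 2 / var
    return stat, chi2.sf(stat, df=2)

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, 30)
y = rng.normal(0.5, 2.0, 30)                     # shifted location, inflated scale
print(lepage(x, y))
```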
50.
Predicting the occurrence of earthquakes in seismic areas is a challenging problem in seismology and earthquake engineering. Indeed, the prevention and quantification of possible damage caused by destructive earthquakes are directly linked to this kind of prediction. In this paper, we adopt a parametric semi-Markov approach. The model assumes that a sequence of earthquakes behaves as a Markov process and, in addition, permits the more realistic assumption of dependence between events in space and time. The elapsed time between two consecutive events is modeled by a general Weibull distribution. We then determine the transition probabilities and the so-called crossing-state probabilities. We conclude with a Monte Carlo simulation, and the model is validated on a large database of real data.
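The simulation step of such a model can be sketched as follows, with hypothetical magnitude classes and parameters rather than the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(6)

states = ["low", "medium", "high"]               # hypothetical magnitude classes
P = np.array([[0.6, 0.3, 0.1],                   # Markov transition probabilities
              [0.4, 0.4, 0.2],
              [0.3, 0.4, 0.3]])
shape = np.full((3, 3), 0.9)                     # Weibull shape per transition (hypothetical)
scale = np.array([[200., 300., 500.],            # Weibull scale in days (hypothetical)
                  [150., 250., 400.],
                  [100., 200., 350.]])

def simulate(n_events, start=0):
    """Yield (event time, state) pairs from the semi-Markov chain."""
    s, t = start, 0.0
    for _ in range(n_events):
        nxt = rng.choice(3, p=P[s])
        t += scale[s, nxt] * rng.weibull(shape[s, nxt])   # Weibull sojourn time
        yield t, states[nxt]
        s = nxt

for time, state in simulate(5):
    print(f"day {time:7.1f}: {state} event")
```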