101.
In this paper, we develop Bayes-factor-based testing procedures for the presence of a correlation or a partial correlation. The proposed Bayesian tests are obtained by restricting the class of alternative hypotheses so as to maximize the probability of rejecting the null hypothesis when the Bayes factor exceeds a specified threshold. It turns out that they depend only on the frequentist t-statistics and the associated critical values, and can therefore be calculated easily in an Excel spreadsheet, in fact by adding just one more step after the frequentist correlation tests have been performed. In addition, they yield decisions identical to those of the frequentist paradigm, provided that the evidence threshold of the Bayesian tests is determined by the significance level of the frequentist paradigm. We illustrate the performance of the proposed procedures through simulated and real-data examples.
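The abstract does not give the exact calibration between the t-statistic and the Bayes factor, so the sketch below only illustrates the workflow it describes: run the frequentist correlation test, then add one step that converts the result into an approximate Bayes factor and a threshold decision. The conversion used here is the generic BIC-based approximation BF10 ~ (1 - r^2)^(-n/2) / sqrt(n), a stand-in rather than the paper's procedure, and the threshold of 3 is an arbitrary illustrative choice.

```python
import numpy as np
from scipy import stats

def correlation_bayes_decision(x, y, threshold=3.0):
    """Frequentist correlation t-test plus one extra Bayes-factor step.

    The BIC-based approximation BF10 ~ (1 - r^2)^(-n/2) / sqrt(n) is a
    generic stand-in, not the calibration derived in the paper.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    r, p_value = stats.pearsonr(x, y)                 # frequentist test
    t_stat = r * np.sqrt((n - 2) / (1.0 - r**2))      # associated t-statistic
    # The "one more step": an approximate Bayes factor from r and n.
    bf10 = (1.0 - r**2) ** (-n / 2.0) / np.sqrt(n)
    return {"r": r, "t": t_stat, "p": p_value,
            "BF10_approx": bf10, "reject_H0": bf10 > threshold}

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 0.4 * x + rng.normal(size=50)
print(correlation_bayes_decision(x, y))
```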
102.
This paper addresses frequentist and Bayesian estimation of the unknown parameters of the generalized Lindley distribution based on lower record values. We first derive exact explicit expressions for the single and product moments of lower record values, and then use these results to compute the means, variances and covariance between two lower record values. We next obtain the maximum likelihood estimators and the associated asymptotic confidence intervals. Furthermore, we obtain Bayes estimators under the assumption of gamma priors on both the shape and the scale parameters of the generalized Lindley distribution, together with the associated highest posterior density interval estimates. The Bayesian estimation is studied with respect to both symmetric (squared error) and asymmetric (linear-exponential, LINEX) loss functions. Finally, we compute Bayesian predictive estimates and predictive interval estimates for future record values. To illustrate the findings, one real data set is analyzed, and Monte Carlo simulations are performed to compare the performances of the proposed methods of estimation and prediction.
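As a rough illustration of estimation from lower record values (not the paper's derivations), the sketch below extracts the lower records from a sample and maximizes the record-value likelihood numerically. It assumes the exponentiated (Nadarajah-type) parameterization of the generalized Lindley distribution, with cdf F(x) = [1 - (1 + theta + theta x)/(1 + theta) exp(-theta x)]^alpha, and the standard joint density of the first m lower records, f(r_m) * prod_{i<m} f(r_i)/F(r_i); the closed-form moments, Bayes estimators and predictive intervals are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def lower_records(sample):
    """Extract the sequence of lower record values from a sample."""
    records = [sample[0]]
    for x in sample[1:]:
        if x < records[-1]:
            records.append(x)
    return np.array(records)

def gl_logpdf(x, alpha, theta):
    # Generalized (exponentiated) Lindley density; this parameterization
    # is an assumption taken from the usual literature form.
    g = 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)
    return (np.log(alpha) + 2 * np.log(theta) + np.log1p(x)
            - np.log1p(theta) - theta * x + (alpha - 1.0) * np.log(g))

def gl_logcdf(x, alpha, theta):
    g = 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)
    return alpha * np.log(g)

def neg_loglik_records(params, records):
    alpha, theta = params
    if alpha <= 0 or theta <= 0:
        return np.inf
    r = records
    # Joint density of the first m lower records: f(r_m) * prod f(r_i)/F(r_i).
    ll = gl_logpdf(r[-1], alpha, theta)
    ll += np.sum(gl_logpdf(r[:-1], alpha, theta) - gl_logcdf(r[:-1], alpha, theta))
    return -ll

rng = np.random.default_rng(1)
sample = rng.gamma(shape=2.0, scale=1.0, size=200)   # placeholder data
rec = lower_records(sample)
fit = minimize(neg_loglik_records, x0=[1.0, 1.0], args=(rec,), method="Nelder-Mead")
print("lower records:", rec)
print("approximate MLE (alpha, theta):", fit.x)
```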
103.
The recursive least squares technique is often extended with exponential forgetting as a tool for parameter estimation in time-varying systems. The distribution of the resulting parameter estimates is, however, unknown when the forgetting factor is less than one. In this paper an approximate expression for the bias of the recursively obtained parameter estimates in a time-invariant AR(na) process with arbitrary noise is given, showing that the bias is non-zero and giving bounds on the approximation errors. Simulations confirm the approximate expressions.
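For concreteness, here is a minimal sketch of the standard recursive least squares recursion with exponential forgetting applied to an AR(na) model; the bias approximation derived in the paper is not reproduced, and the simulated process and forgetting factor are illustrative choices.

```python
import numpy as np

def rls_ar(y, na, lam=0.98, delta=100.0):
    """Recursive least squares with exponential forgetting for an AR(na) model.

    Model: y[t] = theta_1*y[t-1] + ... + theta_na*y[t-na] + e[t].
    lam is the forgetting factor (lam = 1 recovers ordinary RLS).
    """
    theta = np.zeros(na)
    P = delta * np.eye(na)              # large initial covariance
    for t in range(na, len(y)):
        phi = y[t - na:t][::-1]         # regressor: [y[t-1], ..., y[t-na]]
        denom = lam + phi @ P @ phi
        K = (P @ phi) / denom           # gain vector
        err = y[t] - phi @ theta        # one-step-ahead prediction error
        theta = theta + K * err
        P = (P - np.outer(K, phi @ P)) / lam
    return theta

rng = np.random.default_rng(2)
n = 2000
y = np.zeros(n)
for t in range(2, n):                   # simulate a stationary AR(2) process
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()
print("RLS estimate of (0.6, -0.3):", rls_ar(y, na=2, lam=0.98))
```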
104.
The authors consider the correlation between two arbitrary functions of the data and a parameter when the parameter is regarded as a random variable with a given prior distribution. They show how to compute such a correlation and use closed-form expressions to assess the dependence between parameters and various classical or robust estimators thereof, as well as between p-values and posterior probabilities of the null hypothesis in the one-sided testing problem. Other applications involve the Dirichlet process and stationary Gaussian processes. Using this approach, the authors also derive a general nonparametric upper bound on Bayes risks.
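The closed-form expressions are not given in the abstract, but the quantity itself is easy to illustrate by Monte Carlo: draw the parameter from its prior, draw data given the parameter, evaluate an estimator, and correlate the two. The normal prior, normal likelihood and sample-mean estimator below are illustrative choices; in this particular setup the correlation also has the simple closed form sqrt(n/(n+1)), against which the simulation can be checked.

```python
import numpy as np

def prior_correlation(n_obs=20, n_sim=50_000, seed=3):
    """Monte Carlo estimate of corr(g(X), theta) when theta is random.

    Prior: theta ~ N(0, 1); data: X_1..n | theta ~ N(theta, 1);
    g is the sample mean.  All choices are illustrative, not the paper's.
    """
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=n_sim)                       # draws from the prior
    x = rng.normal(loc=theta[:, None], scale=1.0, size=(n_sim, n_obs))
    g = x.mean(axis=1)                                   # classical estimator
    return np.corrcoef(g, theta)[0, 1]

# Closed form for this normal-normal setup: sqrt(n / (n + 1)).
print("Monte Carlo:", prior_correlation())
print("Closed form:", np.sqrt(20 / 21))
```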
105.
In 1960 Levene suggested a potentially robust test of homogeneity of variance based on an ordinary least squares analysis of variance of the absolute values of mean-based residuals. Levene's test has since been shown to have inflated levels of significance when based on the F-distribution, and to test a hypothesis other than homogeneity of variance when treatments are unequally replicated, yet the incorrect formulation is now standard output in several statistical packages. This paper develops a weighted least squares analysis of variance of the absolute values of both mean-based and median-based residuals. It shows how to adjust the residuals so that tests using the F-statistic focus on homogeneity of variance for both balanced and unbalanced designs. It shows how to modify the F-statistics currently produced by statistical packages so that the distribution of the resultant test statistic is closer to an F-distribution than is currently the case. The weighted least squares approach also produces component mean squares that are unbiased irrespective of which variable is used in Levene's test. To complete this aspect of the investigation, the paper derives exact second-order moments of the component sums of squares used in the calculation of the mean-based test statistic. It shows that, for large samples, both ordinary and weighted least squares test statistics are equivalent; however, they are over-dispersed compared to an F variable.
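As background for the modification the paper proposes, the sketch below runs the classical mean-based (Levene 1960) and median-based (Brown-Forsythe) versions of the test on unequally replicated groups, i.e. the ordinary least squares ANOVA of absolute residuals that packages report; the weighted least squares adjustment developed in the paper is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Three unequally replicated groups with unequal variances.
groups = [rng.normal(0, 1.0, size=12),
          rng.normal(0, 2.0, size=20),
          rng.normal(0, 1.5, size=8)]

# Mean-based residuals (Levene's original 1960 proposal).
w_mean, p_mean = stats.levene(*groups, center='mean')
# Median-based residuals (the Brown-Forsythe variant).
w_med, p_med = stats.levene(*groups, center='median')
print(f"mean-based:   W = {w_mean:.3f}, p = {p_mean:.3f}")
print(f"median-based: W = {w_med:.3f}, p = {p_med:.3f}")

# Equivalent "by hand": one-way ANOVA F on the absolute deviations,
# which is what statistical packages report for Levene's test.
abs_dev = [np.abs(g - np.mean(g)) for g in groups]
print("manual ANOVA on |residuals|:", stats.f_oneway(*abs_dev))
```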
106.
The main objective of this work is to evaluate the performance of confidence intervals, built using the deviance statistic, for the hyperparameters of state space models. The first procedure is a marginal approximation to confidence regions based on the likelihood ratio test, and the second is based on the signed root deviance profile. These methods are computationally efficient and are not affected by problems such as intervals with limits outside the parameter space, which can occur when the focus is on the variances of the errors. The procedures are compared with the usual approaches in the literature, which include the method based on the asymptotic distribution of the maximum likelihood estimator as well as bootstrap confidence intervals. The comparison is performed via a Monte Carlo study in order to establish empirically the advantages and disadvantages of each method. The results show that the methods based on the deviance statistic have better coverage rates than the asymptotic and bootstrap procedures.
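The deviance-based interval is the inversion of the likelihood ratio statistic: {theta : 2[l(theta_hat) - l(theta)] <= chi2(1; 0.95)}. The sketch below applies that construction to a single variance parameter in a toy i.i.d. normal model rather than to a state space hyperparameter, just to make the mechanics explicit; the signed-root variant and the Monte Carlo comparisons are not reproduced.

```python
import numpy as np
from scipy import stats, optimize

def deviance_ci_variance(y, level=0.95):
    """Deviance-based CI for sigma^2 in the model y_i ~ N(0, sigma^2).

    The interval is {s2 : 2 * (l(s2_hat) - l(s2)) <= chi2_{1, level}},
    the same construction one would apply to a state space
    hyperparameter via its profile log-likelihood.
    """
    n = len(y)
    s2_hat = np.mean(y**2)                                    # MLE of sigma^2
    def loglik(s2):
        return -0.5 * n * np.log(2 * np.pi * s2) - 0.5 * np.sum(y**2) / s2
    cutoff = loglik(s2_hat) - 0.5 * stats.chi2.ppf(level, df=1)
    f = lambda s2: loglik(s2) - cutoff                        # roots are the CI limits
    lower = optimize.brentq(f, 1e-8 * s2_hat, s2_hat)
    upper = optimize.brentq(f, s2_hat, 100 * s2_hat)
    return s2_hat, (lower, upper)

rng = np.random.default_rng(5)
y = rng.normal(scale=1.3, size=60)
print(deviance_ci_variance(y))
```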
107.
Doubly robust estimators have double the chance of consistently estimating a causal effect in the binary treatment case. In this paper, we propose an estimator of a causal effect for general treatment regimes based on covariate balancing. Under a parametric specification, our estimator is doubly robust.
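The abstract does not spell out the covariate-balancing construction, so the sketch below shows only the standard binary-treatment baseline it generalizes: the augmented IPW (doubly robust) estimator, which is consistent if either the propensity model or the outcome model is correct. The logistic/linear working models and the simulated data are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

def aipw_ate(X, t, y):
    """Augmented IPW (doubly robust) estimate of the average treatment effect.

    This is the binary-treatment baseline, not the covariate-balancing
    estimator proposed in the paper.
    """
    e = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    mu1 = LinearRegression().fit(X[t == 1], y[t == 1]).predict(X)
    mu0 = LinearRegression().fit(X[t == 0], y[t == 0]).predict(X)
    psi1 = mu1 + t * (y - mu1) / e
    psi0 = mu0 + (1 - t) * (y - mu0) / (1 - e)
    return np.mean(psi1 - psi0)

rng = np.random.default_rng(6)
n = 5000
X = rng.normal(size=(n, 3))
p = 1 / (1 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))       # true propensity
t = rng.binomial(1, p)
y = 2.0 * t + X @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=n)
print("AIPW estimate of the true effect 2.0:", aipw_ate(X, t, y))
```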
108.
In this article, we propose a new class of semiparametric instrumental variable models with partially varying coefficients, in which the structural function has a partially linear form and the impact of endogenous structural variables can vary over different levels of some exogenous variables. We propose a three-step estimation procedure to estimate both the functional and the constant coefficients. The consistency and asymptotic normality of the proposed estimators are established. Moreover, a generalized F-test is developed to test whether the functional coefficients have particular parametric forms with underlying economic intuition, and the limiting distribution of the proposed generalized F-test statistic under the null hypothesis is established. Finally, we illustrate the finite sample performance of our approach with simulations and two real data examples in economics.
109.
Negative binomial regression (NBR) and Poisson regression (PR) have become very popular for the analysis of count data in recent years. However, if there is a high degree of correlation among the independent variables, the problem of multicollinearity arises in these models. We introduce new two-parameter estimators (TPEs) for the NBR and PR models by unifying the two-parameter estimator (TPE) of Özkale and Kaçıranlar [The restricted and unrestricted two-parameter estimators. Commun Stat Theory Methods. 2007;36:2707–2725]. These new estimators are general estimators that include the maximum likelihood (ML) estimator, the ridge estimator (RE), the Liu estimator (LE) and the contraction estimator (CE) as special cases. Furthermore, biasing parameters for these estimators are given, and a Monte Carlo simulation is carried out to evaluate their performance using the mean square error (MSE) criterion. The benefits of the new TPEs are also illustrated in an empirical application. The results show that the new proposed TPEs for the NBR and PR models are better than the ML estimator, the RE and the LE.
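The exact form of the new TPEs is not given in the abstract. The sketch below shows one common way such ridge/Liu-type estimators are carried over from the linear model to Poisson regression: take the linear two-parameter form beta(k, d) = (X'X + kI)^{-1}(X'X + k d I) beta_hat and replace X'X by the weighted cross-product X'WX from the maximum likelihood fit. This is an assumption about the construction, not the paper's definition, and the biasing parameters k and d are set arbitrarily here.

```python
import numpy as np
import statsmodels.api as sm

def poisson_two_parameter(X, y, k=0.5, d=0.5):
    """Ridge/Liu-type two-parameter estimator for Poisson regression.

    Sketch of the usual construction: in the linear TPE
        beta(k, d) = (X'X + kI)^{-1} (X'X + k*d*I) beta_OLS
    replace X'X by the weighted cross-product X'WX from the ML fit.
    The paper's exact estimator and its choices of k and d may differ.
    """
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    beta_ml = fit.params
    W = np.diag(fit.mu)                       # Poisson IRLS weights: mu_i
    XtWX = X.T @ W @ X
    p = X.shape[1]
    return np.linalg.solve(XtWX + k * np.eye(p),
                           (XtWX + k * d * np.eye(p)) @ beta_ml)

rng = np.random.default_rng(7)
n = 300
z = rng.normal(size=n)
X = sm.add_constant(np.column_stack([z, z + 0.05 * rng.normal(size=n)]))  # collinear design
y = rng.poisson(np.exp(0.3 + 0.5 * X[:, 1] + 0.2 * X[:, 2]))
print("ML:", sm.GLM(y, X, family=sm.families.Poisson()).fit().params)
print("TPE(k=0.5, d=0.5):", poisson_two_parameter(X, y, k=0.5, d=0.5))
```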
110.
In this paper, we analytically derive the exact formula for the mean squared error (MSE) of two weighted average (WA) estimators of each individual regression coefficient. Further, we carry out numerical evaluations to investigate the small sample properties of the WA estimators, and compare their MSE performance with that of other shrinkage estimators and the usual OLS estimator. Our numerical results show that (1) the WA estimators have smaller MSE than the other shrinkage estimators and the OLS estimator over a wide region of the parameter space; and (2) the range over which the relative MSE of the WA estimator is smaller than that of the OLS estimator becomes narrower as the number of explanatory variables k increases.