51.
Numerous tests have been proposed to determine whether the exponential model is suitable for a given data set. In this article, we propose a new test statistic based on spacings to test whether general progressive Type-II censored samples come from an exponential distribution. The null distribution of the test statistic is discussed and can be approximated by the standard normal distribution. We also propose an approximate method for calculating the expectation and variance of the samples under the null hypothesis, and the corresponding power function is given. A simulation study is then conducted: we calculate the normal approximation of the power and compare it with Monte Carlo results under alternatives with distinct types of hazard function. The simulations show that the Monte Carlo power properties of the statistic are better for alternatives with a monotone increasing hazard function, while the normal approximation performs relatively better otherwise. Finally, two illustrative examples are presented.
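The normalized-spacings idea behind such tests can be illustrated in the simplest setting of a complete (uncensored) sample. The statistic below is a hypothetical toy, not the paper's statistic for progressive Type-II censoring: it measures the share of total spacing mass in the later half of the ordered sample, which is exactly Beta-distributed under the exponential null and is standardized here to an approximate normal score.

```python
import math
import random

def normalized_spacings(sample):
    """Normalized spacings of a sample of size n: under the exponential
    null, D_i = (n - i + 1) * (X_(i) - X_(i-1)) are i.i.d. exponential."""
    x = sorted(sample)
    n = len(x)
    d, prev = [], 0.0
    for i, xi in enumerate(x, start=1):
        d.append((n - i + 1) * (xi - prev))
        prev = xi
    return d

def spacings_test_z(sample):
    """Toy statistic (illustrative only): the later spacings' share of the
    total is Beta(n2, n1) under the null; standardize by its exact moments."""
    d = normalized_spacings(sample)
    n = len(d)
    n2 = n // 2
    n1 = n - n2
    t = sum(d[n1:]) / sum(d)
    mean = n2 / n
    var = n1 * n2 / (n * n * (n + 1))
    return (t - mean) / math.sqrt(var)
```

Under an increasing-hazard alternative the later spacings shrink, pushing the score negative; a two-sided normal cutoff then gives an approximate test.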
52.
We consider estimation of the parameters of the two-parameter inverse Weibull distribution. We establish existence and uniqueness of the maximum likelihood estimators of the scale and shape parameters, and derive Bayes estimators of the parameters under the entropy loss function. A hierarchical Bayes estimator, an equivariant estimator, and a class of minimax estimators are derived when the shape parameter is known. Ordered Bayes estimators that use information about a second population are also derived. We investigate the reliability of a multi-component stress-strength model using classical and Bayesian approaches, compare the risks of the classical and Bayes estimators with Monte Carlo simulations, and illustrate the proposed estimators on real data sets.
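As a minimal sketch of maximum likelihood in this model (not the paper's Bayes estimators), take the inverse Weibull cdf F(t) = exp(-(alpha/t)^beta). For fixed shape beta, the score equation for the scale has the closed form alpha^beta = n / sum(x_i^(-beta)), so a crude profile over a beta grid suffices for illustration; the grid is an assumption for simplicity.

```python
import math
import random

def inv_weibull_loglik(x, alpha, beta):
    """Log-likelihood for the inverse Weibull cdf F(t) = exp(-(alpha/t)^beta)."""
    n = len(x)
    return (n * math.log(beta) + n * beta * math.log(alpha)
            - (beta + 1) * sum(math.log(xi) for xi in x)
            - sum((alpha / xi) ** beta for xi in x))

def inv_weibull_mle(x):
    """Profile-likelihood MLE: for fixed beta, alpha_hat^beta = n / sum(x_i^(-beta));
    scan a coarse beta grid and keep the best pair."""
    n = len(x)
    best = None
    for k in range(1, 200):                      # beta in 0.05 .. 9.95
        beta = 0.05 * k
        alpha = (n / sum(xi ** (-beta) for xi in x)) ** (1.0 / beta)
        ll = inv_weibull_loglik(x, alpha, beta)
        if best is None or ll > best[0]:
            best = (ll, alpha, beta)
    return best[1], best[2]
```

Data can be simulated by inversion, x = alpha * (-ln U)^(-1/beta), which is how the sketch is exercised below.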
53.
With the increasing salience of foundations in many policy fields, and recent changes in market conditions, policies toward foundations designed decades ago seem outdated. In this article we suggest reassessing foundation payout minimums. To examine the impact of payout rates on grantmaking foundations' lifespans and performance under "new normal" economics, we simulate multiple foundation lifecycles using Monte Carlo methods in diverse capital market conditions, with varied investment and payout strategies. We find that while perpetuity seemed a given under the past market regime, under more probable future scenarios foundations may face increasingly early mortality and endowment depletion, limiting their potential impact. Furthermore, lower payout rates allow for higher lifetime grantmaking, higher mean annual grantmaking, and lower giving volatility. Accordingly, we suggest a tiered payout policy, in line with foundations' missions and proper financial planning.
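A stripped-down version of one such lifecycle path can be sketched as follows; the Gaussian annual returns, the depletion threshold, and all parameter values are hypothetical, and the paper's simulations are considerably richer.

```python
import random

def simulate_foundation(endowment, payout_rate, years=50,
                        mu=0.04, sigma=0.12, rng=None):
    """One lifecycle path (all parameter values hypothetical): apply a
    stochastic annual return, then grant a fixed share of current assets.
    Returns (total_grants, years_survived)."""
    rng = rng or random.Random()
    total_grants = 0.0
    for year in range(1, years + 1):
        endowment *= 1.0 + rng.gauss(mu, sigma)   # market return
        grant = payout_rate * endowment            # proportional payout rule
        endowment -= grant
        total_grants += grant
        if endowment < 0.01:                       # effectively depleted
            return total_grants, year
    return total_grants, years
```

Running many such paths per payout rate and summarizing lifespan, total grantmaking, and giving volatility reproduces the kind of comparison the article describes.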
54.
This paper employs advanced time series methods to identify the dynamic properties of three hostage-taking series. The immediate and long-run multipliers of three covariates (successful past negotiations, violent ends, and deaths) are identified. Each hostage series responds differently to the covariates. Past concessions have the strongest impact on generating future kidnapping events, supporting the conventional wisdom of abiding by a stated no-concession policy. Each hostage series has different changepoints, caused by a variety of circumstances. Skyjackings and kidnappings are negatively correlated, while skyjackings and other hostage events are positively correlated. Policy recommendations are offered.
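The notions of immediate and long-run multipliers can be made concrete in a textbook autoregressive distributed-lag model; the ADL(1,1) below is purely illustrative, not the models estimated in the paper.

```python
def adl_multipliers(rho, beta0, beta1):
    """ADL(1,1) model y_t = rho*y_{t-1} + beta0*x_t + beta1*x_{t-1} + e_t:
    the immediate multiplier of x is beta0; the long-run multiplier, found
    by setting y_t = y_{t-1} and x_t = x_{t-1} at the steady state, is
    (beta0 + beta1) / (1 - rho)."""
    if abs(rho) >= 1.0:
        raise ValueError("stationarity requires |rho| < 1")
    return beta0, (beta0 + beta1) / (1.0 - rho)
```

With persistence rho = 0.5, a unit shock to x moves y by beta0 on impact but by three times as much in the long run when beta0 = 1 and beta1 = 0.5.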
55.
We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed-acceptance (DA) pseudo-marginal/particle MCMC and has many advantages over DA, including straightforward parallelization and additional flexibility in MCMC implementation. We detail minimal conditions that ensure strong consistency of the suggested estimators and provide central limit theorems with expressions for asymptotic variances. We demonstrate how our method can make use of SMC in the state space model context, using Laplace approximations and time-discretized diffusions. Our experimental results are promising and show that the IS-type approach can provide substantial gains relative to an analogous DA scheme, and is often competitive even without parallelization.
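A minimal sketch of the weighting idea, assuming a simple one-dimensional Gaussian example in place of the paper's latent variable models: run a Metropolis chain targeting an approximate density, then correct it with self-normalized importance weights proportional to target/approximation.

```python
import math
import random

def metropolis(logdens, x0, n, step, rng):
    """Random-walk Metropolis chain targeting exp(logdens)."""
    x, lx = x0, logdens(x0)
    chain = []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        ly = logdens(y)
        if math.log(rng.random()) < ly - lx:   # accept/reject step
            x, lx = y, ly
        chain.append(x)
    return chain

def is_weighted_mean(f, log_target, log_approx, chain):
    """Self-normalized IS correction: the chain targets the approximate
    density, and weights w = target/approx correct the discrepancy."""
    logw = [log_target(x) - log_approx(x) for x in chain]
    m = max(logw)                               # stabilize the exponentials
    w = [math.exp(lw - m) for lw in logw]
    return sum(wi * f(xi) for wi, xi in zip(w, chain)) / sum(w)
```

Unlike delayed acceptance, the weighting step touches each chain state independently, which is what makes the approach straightforward to parallelize.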
56.
This article considers misclassification of categorical covariates in the context of regression analysis; if unaccounted for, such errors usually result in mis-estimated model parameters. In the presence of additional covariates, we exploit the fact that explicitly modelling misclassification that is non-differential with respect to the response leads to a mixture regression representation. Under the mixture-of-experts framework, we let the reclassification probabilities vary with other covariates, a situation commonly caused by misclassification that is differential on certain covariates and/or by dependence between the misclassified and additional covariates. Using Bayesian inference, the mixture approach combines learning from data with external information on the magnitude of errors when it is available. In addition to proving the theoretical identifiability of the mixture-of-experts approach, we study the efficiency loss resulting from covariate misclassification and the usefulness of external information in mitigating that loss. The method is applied to adjust for misclassification of self-reported cocaine use in the Longitudinal Studies of HIV-Associated Lung Infections and Complications.
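The mixture representation can be sketched as follows (names and the two-category example are hypothetical): the observed, possibly misclassified category w induces reclassification probabilities over the true category z, and the regression function becomes a mixture over z, with weights that may themselves depend on other covariates.

```python
def mixture_regression_mean(w, x, reclass_prob, component_means):
    """Mixture representation of regression with a misclassified covariate:
    E[y | w, x] = sum_z P(z | w, x) * m_z(x), where m_z is the regression
    function for true category z.  Letting reclass_prob depend on x is the
    mixture-of-experts extension."""
    return sum(reclass_prob(z, w, x) * component_means[z](x)
               for z in range(len(component_means)))
```

External information on error magnitudes enters, in the Bayesian version, as priors on the reclassification probabilities rather than the fixed function used here.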
57.
Social scientists often estimate models from correlational data, where the independent variable has not been exogenously manipulated, yet they make implicit or explicit causal claims based on these models. When can such claims be made? We answer this question by first discussing design and estimation conditions under which model estimates can be interpreted, using the randomized experiment as the gold standard. We show how endogeneity – which includes omitted variables, omitted selection, simultaneity, common-method variance, and measurement error – renders estimates causally uninterpretable. Second, we present methods that allow researchers to test causal claims when randomization is not possible or causal interpretation could be confounded; these methods include fixed-effects panel, sample selection, instrumental variable, regression discontinuity, and difference-in-differences models. Third, we take stock of the methodological rigor with which causal claims are made in a social science discipline by reviewing a representative sample of 110 articles on leadership published in top-tier journals over the previous 10 years. Our key finding is that researchers fail to address at least 66% and up to 90% of the design and estimation conditions that make causal claims invalid. We conclude by offering 10 suggestions on how to improve non-experimental research.
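As one concrete example of the designs mentioned, the two-group, two-period difference-in-differences estimator is just the treated group's pre/post change minus the control group's change; a minimal sketch:

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Two-group, two-period DiD: the treated group's mean pre/post change
    minus the control group's change, which nets out the common time trend
    under the parallel-trends assumption."""
    mean = lambda v: sum(v) / len(v)
    return ((mean(treat_post) - mean(treat_pre))
            - (mean(ctrl_post) - mean(ctrl_pre)))
```

The estimate is causally interpretable only under parallel trends, which is exactly the kind of design condition the review audits.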
58.
Building on an analysis of the factors that drive MBS pricing and a comparison of structural and reduced-form pricing approaches, and with model robustness and tractability in mind, this paper adopts the Schwartz and Torous reduced-form pricing model, taking the Jianyuan 2007-1 RMBS as the study object. We simulate the term structure of interest rates under the BDT interest rate model, determine loan cash flows with the PSA prepayment method, and then determine the option-adjusted spread (OAS), constructing an MBS pricing model suited to China.
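The PSA assumption used in the cash-flow step can be sketched as follows. This is a toy pool with level-payment amortization plus PSA prepayments only; the BDT rate lattice, the Schwartz-Torous hazard, and the OAS search are not shown, and all inputs are hypothetical.

```python
def psa_cpr(month, psa=100):
    """PSA benchmark: annual CPR ramps linearly from 0.2% to 6% over the
    first 30 months, then stays flat; `psa` scales the whole curve."""
    return min(month, 30) / 30.0 * 0.06 * (psa / 100.0)

def pool_cash_flows(balance, annual_rate, n_months, psa=100):
    """Monthly cash flows of a level-payment mortgage pool with PSA
    prepayments (no default, servicing, or discounting)."""
    r = annual_rate / 12.0
    flows = []
    for m in range(1, n_months + 1):
        remaining = n_months - m + 1
        payment = balance * r / (1.0 - (1.0 + r) ** (-remaining))
        interest = balance * r
        scheduled = payment - interest
        smm = 1.0 - (1.0 - psa_cpr(m, psa)) ** (1.0 / 12.0)
        prepay = (balance - scheduled) * smm
        flows.append(interest + scheduled + prepay)
        balance -= scheduled + prepay
    return flows
```

In a full OAS calculation, these flows would be regenerated along each simulated BDT rate path and discounted at path rates plus a constant spread, which is solved so that model price matches market price.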
59.
The zero-inefficiency stochastic frontier model (ZISF) nests a stochastic frontier model and an ordinary regression model, each occurring with some probability, and suits settings where technically inefficient and fully efficient production units coexist. This paper introduces spatial effects and nonparametric functions into the production function of the ZISF and lets the occurrence probability of the regression component be a nonparametric function, yielding a semiparametric spatial ZISF. The model avoids the biased and inconsistent estimators that result from ignoring spatial effects, as well as the underfitting of linear models. We approximate the nonparametric functions by B-splines, and estimate the parameters and technical efficiency by maximum likelihood and the JLMS method, respectively. Monte Carlo results show that: (1) the proposed method achieves high estimation and classification accuracy, which improves as the sample size grows; (2) ignoring either the spatial effects or the nonparametric effects lowers both estimation and classification accuracy, so the proposed model is needed; (3) ignoring the nonparametric effect in the occurrence probability degrades estimation and classification accuracy severely, far more than ignoring the nonparametric effect in the production function.
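The ZISF mixture density of the composed error can be sketched under the common normal/half-normal specification; this hedged illustration omits the spatial and nonparametric components that the paper adds.

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def zisf_density(eps, p, sigma_v, sigma_u):
    """ZISF mixture density of the composed error eps = v - u: with
    probability p the unit is fully efficient (u = 0, so eps ~ N(0, sigma_v^2));
    otherwise eps follows the normal/half-normal frontier density
    (2/sigma) * phi(eps/sigma) * Phi(-eps*lambda/sigma)."""
    sigma = math.hypot(sigma_v, sigma_u)
    lam = sigma_u / sigma_v
    f_eff = norm_pdf(eps / sigma_v) / sigma_v
    f_sf = (2.0 / sigma) * norm_pdf(eps / sigma) * norm_cdf(-eps * lam / sigma)
    return p * f_eff + (1.0 - p) * f_sf
```

Maximizing the sum of log mixture densities gives the ML estimates; posterior component probabilities then classify units, and JLMS-type conditional expectations recover technical efficiency.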
60.
Bayesian Monte Carlo (BMC) decision analysis adopts a sampling procedure to estimate likelihoods and distributions of outcomes, and then uses that information to calculate the expected performance of alternative strategies, the value of information, and the value of including uncertainty. These decision-analysis outputs are therefore subject to sampling error. The standard error of each estimate, and its bias if any, can be estimated by the bootstrap procedure, which resamples (with replacement) from the original BMC sample and redoes the decision analysis; repeating this procedure yields a distribution of decision-analysis outputs. The bootstrap approach to estimating the effect of sampling error on BMC analysis is illustrated with a simple value-of-information calculation, along with an analysis of a proposed control structure for Lake Erie. The examples show that the outputs of BMC decision analysis can have high levels of sampling error and bias.
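A minimal sketch of the procedure, with a hypothetical two-action payoff in place of the paper's applications: estimate the expected value of perfect information (EVPI) from a Monte Carlo sample, then bootstrap that estimate by resampling the sample with replacement.

```python
import random

def evpi(samples, payoffs, n_actions=2):
    """Expected value of perfect information from a Monte Carlo sample:
    E[max_a payoff(a, theta)] - max_a E[payoff(a, theta)]."""
    n = len(samples)
    per_action = [sum(payoffs(a, s) for s in samples) / n
                  for a in range(n_actions)]
    perfect = sum(max(payoffs(a, s) for a in range(n_actions))
                  for s in samples) / n
    return perfect - max(per_action)

def bootstrap_se(stat, samples, reps=500, rng=None):
    """Bootstrap standard error: resample with replacement from the original
    sample, recompute the statistic, and take the spread of the replicates."""
    rng = rng or random.Random()
    n = len(samples)
    vals = [stat([samples[rng.randrange(n)] for _ in range(n)])
            for _ in range(reps)]
    m = sum(vals) / reps
    return (sum((v - m) ** 2 for v in vals) / (reps - 1)) ** 0.5
```

The mean of the bootstrap replicates minus the original estimate also gives the bias estimate the article discusses.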