991.
Makram Talih & Nicolas Hengartner 《Journal of the Royal Statistical Society. Series B, Statistical methodology》2005,67(3):321-341
When modelling multivariate financial data, the problem of structural learning is compounded by the fact that the covariance structure changes with time. Previous work has focused on modelling those changes by using multivariate stochastic volatility models. We present an alternative to these models that focuses instead on the latent graphical structure that is related to the precision matrix. We develop a graphical model for sequences of Gaussian random vectors when changes in the underlying graph occur at random times, and a new block of data is created with the addition or deletion of an edge. We show how a Bayesian hierarchical model incorporates both the uncertainty about that graph and the time variation thereof.
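The link between the graph and the precision matrix can be made concrete with a small simulation. The sketch below is only an illustration of the data-generating idea described above, not the authors' Bayesian hierarchical model: one off-diagonal entry of a 3x3 precision matrix is switched on at a random time, so the added edge starts a new block of Gaussian data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_block(precision, n):
    """Draw n Gaussian vectors with the given precision (inverse covariance) matrix."""
    cov = np.linalg.inv(precision)
    return rng.multivariate_normal(np.zeros(len(precision)), cov, size=n)

# Precision matrix before the change: variables 0 and 1 are conditionally independent
K0 = np.eye(3)
# After the change the edge (0, 1) is added, i.e. a nonzero off-diagonal precision entry
K1 = K0.copy()
K1[0, 1] = K1[1, 0] = 0.4

change_time = rng.integers(20, 80)            # random time at which the graph changes
data = np.vstack([sample_block(K0, change_time),
                  sample_block(K1, 100 - change_time)])
print(data.shape, "graph change at t =", change_time)
```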
992.
Uffe Kjærulff 《Statistics and Computing》1992,2(1):7-17
This paper investigates the applicability of a Monte Carlo technique known as simulated annealing to achieve optimum or sub-optimum decompositions of probabilistic networks under bounded resources. High-quality decompositions are essential for performing efficient inference in probabilistic networks. Optimum decomposition of probabilistic networks is known to be NP-hard (Wen, 1990). The paper proves that cost-function changes can be computed locally, which is essential to the efficiency of the annealing algorithm. Pragmatic control schedules which reduce the running time of the annealing algorithm are presented and evaluated. Apart from the conventional temperature parameter, these schedules involve the radius of the search space as a new control parameter. The evaluation suggests that the inclusion of this new parameter is important for the success of the annealing algorithm for the present problem.
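For readers unfamiliar with the technique, the block below is a generic simulated-annealing loop, not Kjærulff's decomposition-specific algorithm: the move proposal, its locally computed cost change, and the acceptance step are separated so that only the change in the cost function needs to be evaluated at each step.

```python
import math
import random

def simulated_annealing(state, local_cost_delta, propose_move, apply_move,
                        t0=1.0, alpha=0.95, steps_per_temp=100, t_min=1e-3):
    """Generic simulated annealing with a geometric cooling schedule.

    local_cost_delta(state, move) must return the change in cost caused by the
    move alone, so the full cost never has to be recomputed.
    """
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            move = propose_move(state)
            delta = local_cost_delta(state, move)
            # Metropolis criterion: always accept improvements, sometimes accept worse moves
            if delta <= 0 or random.random() < math.exp(-delta / t):
                apply_move(state, move)
        t *= alpha
    return state
```

The pragmatic schedules discussed in the paper additionally shrink the radius of the search space as the temperature falls; that extra control parameter is omitted from this sketch.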
993.
This paper discusses the application of Monte Carlo simulation to structural reliability analysis, and derives the formula for the failure probability under direct sampling together with its confidence interval.
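Under direct (crude) Monte Carlo sampling these quantities take a standard form; the sketch below uses a hypothetical two-variable limit-state function g, which is an assumption for illustration rather than an example from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    # Hypothetical limit-state function: failure occurs when g(x) < 0
    return 3.0 - x.sum(axis=1)

N = 100_000
x = rng.normal(size=(N, 2))                  # random input variables
failures = g(x) < 0.0

p_hat = failures.mean()                      # failure probability estimate: N_f / N
se = np.sqrt(p_hat * (1.0 - p_hat) / N)      # standard error (binomial approximation)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)  # approximate 95% confidence interval
print(p_hat, ci)
```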
994.
Gavin J. Gibson 《Journal of the Royal Statistical Society. Series C, Applied statistics》1997,46(2):215-233
Strategies for controlling plant epidemics are investigated by fitting continuous time spatiotemporal stochastic models to data consisting of maps of disease incidence observed at discrete times. Markov chain Monte Carlo methods are used for fitting two such models to data describing the spread of citrus tristeza virus (CTV) in an orchard. The approach overcomes some of the difficulties encountered when fitting stochastic models to infrequent observations of a continuous process. The results of the analysis cast doubt on the effectiveness of a strategy identified from a previous spatial analysis of the CTV data. Extensions of the approaches to more general models and other problems are also considered.
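A minimal sketch of what a continuous-time spatiotemporal stochastic model looks like in practice is given below: a Gillespie-style susceptible-infective simulation with an exponentially decaying spatial kernel. The kernel, rates and parameter values are assumptions made for illustration only; they are not the CTV models fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_spatial_si(coords, beta=0.02, alpha=0.5, t_max=50.0, first=0):
    """Gillespie-style simulation of a continuous-time spatial SI epidemic.

    The infection pressure on a susceptible j is beta * sum_i exp(-alpha * d(i, j))
    over the currently infective plants i; this kernel is assumed for illustration.
    """
    n = len(coords)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    infected = np.zeros(n, dtype=bool)
    infected[first] = True
    t, events = 0.0, [(0.0, first)]
    while not infected.all():
        rates = beta * np.exp(-alpha * dist[infected][:, ~infected]).sum(axis=0)
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)        # waiting time to the next infection
        if t > t_max:
            break
        j = np.flatnonzero(~infected)[rng.choice(len(rates), p=rates / total)]
        infected[j] = True
        events.append((t, j))
    return events

events = simulate_spatial_si(rng.uniform(0.0, 10.0, size=(30, 2)))
print(len(events), "infections, last at time", round(events[-1][0], 2))
```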
995.
D. A. Stephens & A. F. M. Smith 《Journal of the Royal Statistical Society. Series C, Applied statistics》1997,46(4):477-492
To assess radiation damage in steel for reactor pressure vessels in the nuclear industry, specimens are subjected to the Charpy test, which measures how much energy a specimen can absorb at a given test temperature before cracking. The resulting Charpy impact energy data are well represented by a three-parameter Burr curve as a function of test temperature, in which the parameters of the Burr curve are themselves dependent on irradiation dose. The resulting non-linear model function, combined with heteroscedastic random errors, gives rise to complicated likelihood surfaces that make conventional statistical techniques difficult to implement. To compute estimates of parameters of practical interest, Markov chain Monte Carlo sampling-based techniques are implemented. The approach is applied to 40 data sets from specimens subjected to no irradiation or one or two doses of irradiation. The influence of irradiation dose on the amount of energy absorbed is investigated.
996.
In this paper we discuss new adaptive proposal strategies for sequential Monte Carlo algorithms (also known as particle filters) relying on criteria evaluating the quality of the proposed particles. The choice of the proposal distribution is a major concern and can dramatically influence the quality of the estimates. Thus, we show how the long-used coefficient of variation (suggested by Kong et al., J. Am. Stat. Assoc. 89:278–288, 1994) of the weights can be used for estimating the chi-square distance between the target and instrumental distributions of the auxiliary particle filter. As a by-product of this analysis we obtain an auxiliary adjustment multiplier weight type for which this chi-square distance is minimal. Moreover, we establish an empirical estimate of linear complexity of the Kullback-Leibler divergence between the involved distributions. Guided by these results, we discuss adaptive design of the particle filter proposal distribution and illustrate the methods on a numerical example. This work was partly supported by the National Research Agency (ANR) under the program "ANR-05-BLAN-0299".
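The relation exploited here is easy to state: for importance weights w, the squared coefficient of variation Var(w)/E(w)^2 is an empirical estimate of the chi-square divergence between the target and the instrumental (proposal) distribution, and it also drives the familiar effective sample size. The sketch below illustrates this on a toy Gaussian importance-sampling step; it is not the auxiliary particle filter of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Toy setting: target N(1, 1), proposal (instrumental) N(0, 2)
n = 5000
x = rng.normal(0.0, np.sqrt(2.0), size=n)
w = stats.norm.pdf(x, 1.0, 1.0) / stats.norm.pdf(x, 0.0, np.sqrt(2.0))

cv2 = w.var() / w.mean() ** 2   # squared coefficient of variation of the weights
ess = n / (1.0 + cv2)           # the familiar effective sample size
print("CV^2 (empirical chi-square distance):", cv2, "ESS:", ess)
```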
997.
Based on recent developments in the field of operations research, we propose two adaptive resampling algorithms for estimating bootstrap distributions. One algorithm applies the principle of the recently proposed cross-entropy (CE) method for rare event simulation, and does not require calculation of the resampling probability weights via numerical optimization methods (e.g., Newton's method), whereas the other algorithm can be viewed as a multi-stage extension of the classical two-step variance minimization approach. The two algorithms can be easily used as part of a general algorithm for Monte Carlo calculation of bootstrap confidence intervals and tests, and are especially useful in estimating rare event probabilities. We analyze theoretical properties of both algorithms in an idealized setting and carry out simulation studies to demonstrate their performance. Empirical results on both one-sample and two-sample problems as well as a real survival data set show that the proposed algorithms are not only superior to traditional approaches, but may also provide more than an order of magnitude of computational efficiency gains.
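As context for what these algorithms accelerate, the block below sketches the plain (non-adaptive) Monte Carlo bootstrap percentile interval; the CE-based and variance-minimization algorithms of the paper replace the uniform resampling weights with adaptively tilted ones, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(3)

def bootstrap_percentile_ci(data, stat, B=5000, alpha=0.05):
    """Plain Monte Carlo bootstrap percentile interval for a statistic."""
    n = len(data)
    boot = np.array([stat(data[rng.integers(0, n, size=n)]) for _ in range(B)])
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

sample = rng.exponential(scale=2.0, size=40)     # hypothetical data
print(bootstrap_percentile_ci(sample, np.median))
```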
998.
The performance of different information criteria – namely Akaike, corrected Akaike (AICC), Schwarz–Bayesian (SBC), and Hannan–Quinn – is investigated so as to choose the optimal lag length in stable and unstable vector autoregressive (VAR) models both when autoregressive conditional heteroscedasticity (ARCH) is present and when it is not. The investigation covers both large and small sample sizes. The Monte Carlo simulation results show that SBC has relatively better performance in lag-choice accuracy in many situations. It is also generally the least sensitive to ARCH regardless of stability or instability of the VAR model, especially in large sample sizes. These appealing properties of SBC make it the optimal criterion for choosing lag length in many situations, especially in the case of financial data, which are usually characterized by occasional periods of high volatility. SBC also has the best forecasting abilities in the majority of situations in which we vary sample size, stability, variance structure (ARCH or not), and forecast horizon (one period or five). Frequently, AICC also has good lag-choosing and forecasting properties. However, when ARCH is present, the five-period forecast performance of all criteria in all situations worsens.
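One common parameterization of these criteria for a k-variable VAR(p) with T usable observations is log|Sigma_hat(p)| plus a penalty on the m = k(kp + 1) estimated coefficients: 2m/T for AIC, m log(T)/T for SBC, and 2m log(log T)/T for Hannan–Quinn. The sketch below computes them by equation-by-equation OLS; it is a generic illustration under these assumptions, not the Monte Carlo design of the study.

```python
import numpy as np

def var_lag_criteria(y, max_lag):
    """AIC, SBC and HQ for VAR(p), p = 1..max_lag (one common parameterization).

    y is a (T_full, k) array; each criterion is log|Sigma_hat| plus a penalty on
    the m = k * (k * p + 1) estimated coefficients (intercept included).
    """
    T_full, k = y.shape
    out = {}
    for p in range(1, max_lag + 1):
        Y = y[p:]                                               # responses, shape (T, k)
        X = np.hstack([y[p - i - 1:T_full - i - 1] for i in range(p)])
        X = np.hstack([np.ones((len(Y), 1)), X])                # intercept plus p lags
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)               # OLS, equation by equation
        resid = Y - X @ B
        T = len(Y)
        sigma = resid.T @ resid / T
        logdet = np.linalg.slogdet(sigma)[1]
        m = k * (k * p + 1)
        out[p] = {"AIC": logdet + 2 * m / T,
                  "SBC": logdet + m * np.log(T) / T,
                  "HQ":  logdet + 2 * m * np.log(np.log(T)) / T}
    return out
```

The selected lag is simply the p that minimizes the chosen criterion, e.g. min(out, key=lambda p: out[p]["SBC"]).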
999.
Kasper K. Berthelsen & Jesper Møller 《Australian & New Zealand Journal of Statistics》2008,50(3):257-272
With reference to a specific dataset, we consider how to perform a flexible non‐parametric Bayesian analysis of an inhomogeneous point pattern modelled by a Markov point process, with a location‐dependent first‐order term and pairwise interaction only. A priori we assume that the first‐order term is a shot noise process, and that the interaction function for a pair of points depends only on the distance between the two points and is a piecewise linear function modelled by a marked Poisson process. Simulation of the resulting posterior distribution using a Metropolis–Hastings algorithm in the ‘conventional’ way involves evaluating ratios of unknown normalizing constants. We avoid this problem by applying a recently introduced auxiliary variable technique. In the present setting, the auxiliary variable used is an example of a partially ordered Markov point process model.
1000.
One linear and two nonlinear adaptive robust procedures have been developed in which preliminary statistics, based on tail lengths, attempt to identify distributions from which the samples arise so that a suitable robust estimator based on trimmed means can be used to estimate the location parameter. The efficiencies of the estimators based on the three proposed adaptive robust procedures have been obtained using Monte Carlo methods involving eight distributions and these efficiencies are compared with the efficiencies of nineteen other robust estimators.
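A toy version of this adaptive idea is sketched below: a Hogg-type tail-weight statistic classifies the sample as short-, medium- or long-tailed, and the trimming proportion of the trimmed mean is chosen accordingly. The cut-offs and trimming levels are illustrative assumptions, not the three procedures developed in the paper.

```python
import numpy as np
from scipy import stats

def tail_weight_Q(x):
    """Hogg-type tail-weight statistic: spread of the extreme 5% relative to the middle."""
    x = np.sort(x)
    n = len(x)
    m = max(1, int(0.05 * n))
    u05, l05 = x[-m:].mean(), x[:m].mean()
    u50, l50 = x[n // 2:].mean(), x[:n // 2].mean()
    return (u05 - l05) / (u50 - l50)

def adaptive_trimmed_mean(x):
    """Pick the trimming proportion from the tail-weight statistic (illustrative cut-offs)."""
    q = tail_weight_Q(x)
    if q < 2.0:
        trim = 0.0     # short tails: ordinary mean
    elif q < 2.6:
        trim = 0.1     # roughly normal tails: light trimming
    else:
        trim = 0.25    # heavy tails: heavy trimming
    return stats.trim_mean(x, trim)

rng = np.random.default_rng(5)
print(adaptive_trimmed_mean(rng.standard_t(df=2, size=200)))   # heavy-tailed sample
```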