1,704 results in total; search time 295 ms.
71.
Statistical inference procedures based on transforms such as the characteristic function and the probability generating function have been examined by many researchers because these transforms are much simpler than probability density functions. Here, a probability generating function based Jeffreys divergence measure is proposed for parameter estimation and goodness-of-fit testing. Being a member of the class of M-estimators, the proposed estimator is consistent, and the proposed goodness-of-fit test has good statistical power. The proposed divergence measure shows improved performance over existing probability generating function based measures. Real data examples illustrate the proposed parameter estimation method and goodness-of-fit test.
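The estimation idea can be sketched for a Poisson model: minimise a Jeffreys-type symmetrised divergence between the empirical probability generating function and the model PGF over a grid on (0, 1). The grid, the quadrature, and the Poisson target below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.poisson(3.0, size=500)            # data with true rate 3.0

t = np.linspace(0.01, 0.99, 99)           # grid on (0, 1)
dt = t[1] - t[0]
G_emp = np.mean(t[:, None] ** x, axis=1)  # empirical PGF: mean of t**X

def jeffreys_pgf(lam):
    """Jeffreys-type symmetrised divergence between the empirical PGF and
    the Poisson PGF exp(lam*(t-1)), approximated by a Riemann sum."""
    G = np.exp(lam * (t - 1.0))
    return np.sum((G_emp - G) * (np.log(G_emp) - np.log(G))) * dt

res = minimize_scalar(jeffreys_pgf, bounds=(0.1, 10.0), method="bounded")
```

The integrand (a - b)(log a - log b) is non-negative termwise, so the divergence is zero only when the two PGFs coincide on the grid; the minimiser is then a consistent M-type estimate of the rate.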
72.
73.
In this article, we develop the theory of the k-factor Gegenbauer Autoregressive Moving Average (GARMA) process with infinite-variance innovations, which generalizes the stable seasonal fractional Autoregressive Integrated Moving Average (ARIMA) model introduced by Diongue and Guégan (2008, Communications in Statistics - Simulation and Computation 37:2037-2049). Stationarity and invertibility conditions of this new model are derived. Conditional Sum of Squares (CSS) and Markov chain Monte Carlo (MCMC) Whittle methods are investigated for parameter estimation. Monte Carlo simulations are used to evaluate the finite-sample performance of these estimation techniques. Finally, the usefulness of the model is demonstrated with an application to streamflow data for the Senegal River at Bakel.
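A one-factor version of such a process can be simulated from a truncated MA(infinity) representation: the weights are the coefficients of (1 - 2uz + z^2)^(-d), generated by the standard Gegenbauer recursion, applied to symmetric alpha-stable innovations. The parameter values and truncation level below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import levy_stable

def gegenbauer_weights(d, u, n):
    """First n coefficients of (1 - 2*u*z + z**2)**(-d), via the
    standard Gegenbauer three-term recursion."""
    c = np.zeros(n)
    c[0] = 1.0
    if n > 1:
        c[1] = 2.0 * d * u
    for j in range(2, n):
        c[j] = (2.0 * u * ((d - 1.0) / j + 1.0) * c[j - 1]
                - (2.0 * (d - 1.0) / j + 1.0) * c[j - 2])
    return c

# one-factor Gegenbauer process with symmetric alpha-stable innovations
alpha, d, u = 1.7, 0.2, np.cos(np.pi / 6.0)
n_obs, trunc = 300, 200
eps = levy_stable.rvs(alpha, 0.0, size=n_obs + trunc, random_state=7)
psi = gegenbauer_weights(d, u, trunc)
x = np.convolve(eps, psi, mode="full")[trunc - 1:trunc - 1 + n_obs]
```

A quick sanity check on the recursion: with u = 1 the filter reduces to (1 - z)^(-2d), whose coefficients are the usual fractional-differencing binomial weights.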
74.
This article reviews symmetrical global sensitivity analysis based on the analysis of variance of a high-dimensional model representation. To overcome the computational difficulties, two methods based on symmetrical design of experiments (SDOE) are presented. If the form of the objective function f is known, we use an SDOE to estimate the symmetrical global sensitivity indices instead of Monte Carlo or quasi-Monte Carlo simulation; otherwise, we use the observed values of the experiment to carry out the symmetrical global sensitivity analysis. Both methods are easy to implement and reduce the computational cost. An example based on a symmetrical design of experiment is given.
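For reference, the plain Monte Carlo baseline that the design-based methods aim to improve on is the pick-freeze estimator of first-order variance-based sensitivity indices. The test function and sample size below are illustrative assumptions.

```python
import numpy as np

def first_order_indices(f, d, n=200_000, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order sensitivity
    indices S_i = Var(E[f | X_i]) / Var(f) on the unit hypercube."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    fA, fB = f(A), f(B)
    S = np.empty(d)
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]              # freeze coordinate i at its A value
        S[i] = np.mean(fA * (f(ABi) - fB)) / fA.var()
    return S

# additive test function: exact first-order indices are 1/14, 4/14, 9/14
f = lambda X: X[:, 0] + 2.0 * X[:, 1] + 3.0 * X[:, 2]
S = first_order_indices(f, 3)
```

Each index requires n extra evaluations of f, which is exactly the cost that a carefully chosen symmetrical design can cut down.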
75.
Remote sensing of the earth with satellites yields datasets that can be massive in size, nonstationary in space, and non-Gaussian in distribution. To overcome computational challenges, we use the reduced-rank spatial random effects (SRE) model in a statistical analysis of cloud-mask data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on board the Terra satellite. Parameterisations of cloud processes are the biggest source of uncertainty and sensitivity in different climate models' projections of Earth's future climate. An accurate quantification of the spatial distribution of clouds, as well as a rigorously estimated pixel-scale clear-sky-probability process, is needed to establish reliable estimates of cloud-distributional changes and trends caused by climate change. Here we give a hierarchical spatial-statistical modelling approach for a very large spatial dataset of 2.75 million pixels, corresponding to a granule of MODIS cloud-mask data, and we use spatial change-of-support relationships to estimate cloud fraction at coarser resolutions. Our model is non-Gaussian; it postulates a hidden clear-sky-probability process and makes use of the SRE model, EM estimation, and optimal (empirical Bayes) spatial prediction of that process. Measures of prediction uncertainty are also given.
76.
The memory-type adaptive and non-adaptive control charts are among the best control charts for detecting small-to-moderate changes in the process parameter(s). In this paper, we propose Crosier CUSUM (CCUSUM), EWMA, adaptive CCUSUM (ACCUSUM), and adaptive EWMA (AEWMA) charts for efficiently monitoring changes in the covariance matrix of a multivariate normal process without subgrouping. Using extensive Monte Carlo simulations, the run-length characteristics of these control charts are computed. The ACCUSUM and AEWMA charts perform uniformly and substantially better than the CCUSUM and EWMA charts in detecting a range of shift sizes in the covariance matrix, and the AEWMA chart outperforms the ACCUSUM chart. A real dataset is used to explain the implementation of the proposed control charts.
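The basic building block of these charts is the EWMA recursion with its exact time-varying control limits; the univariate mean-monitoring sketch below (standard textbook form, not the paper's multivariate covariance statistic) shows the mechanics, with illustrative parameters lam = 0.2 and L = 3.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0, mu0=0.0, sigma0=1.0):
    """EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}, z_0 = mu0,
    with exact time-varying control limits."""
    z = np.empty(len(x))
    prev = mu0
    for t, xt in enumerate(x):
        prev = lam * xt + (1.0 - lam) * prev
        z[t] = prev
    i = np.arange(1, len(x) + 1)
    half_width = L * sigma0 * np.sqrt(
        lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i)))
    return z, mu0 - half_width, mu0 + half_width

# 20 in-control samples followed by a mean shift of 2 standard deviations
x = np.concatenate([np.zeros(20), np.full(30, 2.0)])
z, lcl, ucl = ewma_chart(x)
signal = int(np.argmax(z > ucl))   # index of the first out-of-control sample
```

The adaptive variants in the paper additionally let lam depend on an estimate of the current shift size, so that small and large shifts are both detected quickly.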
77.
The Finnish common toad data of Heikkinen and Hogmander are reanalysed using an alternative fully Bayesian model that does not require a pseudolikelihood approximation, together with an alternative prior distribution for the true presence or absence status of toads in each 10 km × 10 km square. Markov chain Monte Carlo methods are used to obtain posterior probability estimates of the square-specific presence of the common toad, and these are presented as a map. The results differ from those of Heikkinen and Hogmander, and we offer an explanation in terms of the prior used for square-specific presence of the toads. We suggest that our approach is more faithful to the data and avoids unnecessary confounding of effects. We demonstrate how to extend our model efficiently with square-specific covariates and illustrate this by introducing deterministic spatial changes.
78.
We consider the competing risks set-up. In many practical situations, the conditional probability of the cause of failure given the failure time is of direct interest. We propose to model competing risks through the overall hazard rate and these conditional probabilities, rather than through the cause-specific hazards, and adopt a Bayesian smoothing approach for both quantities of interest. Illustrations are given at the end.
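The reparameterisation itself is elementary: each cause-specific hazard is the overall hazard times the conditional probability of that cause given failure at that time, and the cause-specific hazards sum back to the overall hazard. A minimal sketch with made-up piecewise-constant quantities:

```python
import numpy as np

# overall hazard lambda(t) on a time grid, and P(cause 1 | failure at t);
# all numbers are made up for illustration
overall_hazard = np.array([0.10, 0.15, 0.20])
p_cause1 = np.array([0.7, 0.6, 0.5])

h1 = overall_hazard * p_cause1            # cause-1 hazard
h2 = overall_hazard * (1.0 - p_cause1)    # cause-2 hazard
```

Smoothing the overall hazard and the conditional probabilities separately, as the paper proposes, keeps each component on an interpretable scale (a rate and a probability, respectively).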
79.
We show that, in the context of double-bootstrap confidence intervals, linear interpolation at the second level of the double bootstrap can reduce the simulation-error component of coverage error by an order of magnitude. Intervals indistinguishable, in terms of coverage error, from theoretical (infinite-simulation) double-bootstrap confidence intervals may be obtained at substantially less computational expense than with the standard Monte Carlo approximation. The intervals retain the simplicity of uniform bootstrap sampling and require no special analysis or computational techniques. Interpolation at the first level of the double bootstrap is shown to have a relatively minor effect on the simulation error.
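The second-level interpolation idea can be sketched as follows: instead of estimating each inner bootstrap probability by the crude step-function ECDF of the B2 second-level replicates, interpolate linearly between their order statistics. This is a sketch of that single step only, with illustrative sample and resample sizes, not the paper's full calibrated interval.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=40)
theta_hat = x.mean()

B1, B2 = 200, 50
u = np.empty(B1)
for b in range(B1):
    xb = rng.choice(x, size=x.size, replace=True)           # level 1
    inner = np.sort([rng.choice(xb, size=x.size, replace=True).mean()
                     for _ in range(B2)])                   # level 2
    # crude estimate: step-function ECDF of the inner replicates
    crude = np.mean(inner <= theta_hat)
    # interpolated estimate: piecewise-linear ECDF through the
    # order statistics, evaluated at theta_hat
    u[b] = np.interp(theta_hat, inner,
                     (np.arange(1, B2 + 1) - 0.5) / B2)
```

The interpolated values u change smoothly with theta_hat, whereas the crude proportions jump in steps of 1/B2; it is this discreteness that inflates the simulation-error component of coverage error.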
80.
In this paper, we consider parametric Bayesian inference for stochastic differential equations driven by a pure-jump stable Lévy process observed at high frequency. In most cases of practical interest, the likelihood function is not available; hence, we use a quasi-likelihood and place an associated prior on the unknown parameters. It is shown under regularity conditions that a Bernstein–von Mises theorem holds for the posterior. We then develop a Markov chain Monte Carlo algorithm for Bayesian inference and, assisted by the theoretical results, show how to scale Metropolis–Hastings proposals as the frequency of the data grows, in order to prevent the acceptance ratio from going to zero in the large-data limit. Our algorithm is illustrated on numerical examples that help verify the theoretical findings.
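The proposal-scaling point can be illustrated on a toy Gaussian posterior standing in for the quasi-posterior: when the posterior contracts at rate n^(-1/2), a random-walk Metropolis proposal scaled the same way keeps the acceptance rate bounded away from zero as n grows. The step constant 2.4 and the toy target below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def rwm(logpost, theta0, step, n_iter=4000, seed=3):
    """Random-walk Metropolis; returns the chain and the acceptance rate."""
    rng = np.random.default_rng(seed)
    theta, lp = theta0, logpost(theta0)
    chain = np.empty(n_iter)
    accepted = 0
    for k in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
            accepted += 1
        chain[k] = theta
    return chain, accepted / n_iter

# toy posterior concentrating at rate n^{-1/2}:
# scale the proposal the same way
n = 1000
rng = np.random.default_rng(0)
xbar = float(rng.normal(1.0, 1.0 / np.sqrt(n)))
logpost = lambda mu: -0.5 * n * (mu - xbar) ** 2
chain, acc = rwm(logpost, theta0=xbar, step=2.4 / np.sqrt(n))
```

With a fixed step size, by contrast, the acceptance rate would collapse towards zero as n increases, which is precisely the failure mode the paper's scaling result rules out.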