Full text (paid): 1,430 | Free: 46 | Free (domestic): 18
By subject: Management 193 | Ethnology 1 | Demography 23 | Collected works 22 | Theory & methodology 77 | General 231 | Sociology 10 | Statistics 937
By year: 2023 (6), 2022 (6), 2021 (10), 2020 (32), 2019 (45), 2018 (47), 2017 (87), 2016 (37), 2015 (44), 2014 (40), 2013 (357), 2012 (116), 2011 (40), 2010 (46), 2009 (43), 2008 (56), 2007 (57), 2006 (59), 2005 (38), 2004 (28), 2003 (35), 2002 (35), 2001 (27), 2000 (18), 1999 (20), 1998 (13), 1997 (14), 1996 (12), 1995 (12), 1994 (14), 1993 (13), 1992 (18), 1991 (11), 1990 (4), 1989 (5), 1988 (10), 1987 (6), 1986 (5), 1985 (6), 1984 (5), 1983 (3), 1982 (2), 1981 (4), 1980 (3), 1979 (1), 1978 (1), 1977 (2), 1975 (1)
Sort order: relevance. 1,494 results found (search time: 437 ms)
881.
With the rapid development of computing technology, Bayesian statistics has gained increasing attention in many areas of public health. However, the full potential of Bayesian sequential methods applied to vaccine safety surveillance has not yet been realized, despite the acknowledged practical benefits and philosophical advantages of Bayesian statistics. In this paper, we describe how sequential analysis can be performed in a Bayesian paradigm in the field of vaccine safety. We compare the performance of a frequentist sequential method, specifically the Maximized Sequential Probability Ratio Test (MaxSPRT), and a Bayesian sequential method using simulations and a real-world vaccine safety example. Performance is evaluated using three metrics: false positive rate, false negative rate, and average earliest time to signal. Depending on the background rate of adverse events, the Bayesian sequential method can significantly improve the false negative rate and decrease the earliest time to signal. We consider the proposed Bayesian sequential approach a promising alternative for vaccine safety surveillance.
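A minimal sketch of the kind of Bayesian sequential monitoring the abstract describes, assuming a conjugate Gamma–Poisson model for adverse-event counts against expected background counts. The prior parameters, signal threshold, and stopping rule below are illustrative choices, not the paper's actual specification:

```python
import numpy as np

def bayes_sequential_signal(counts, expected, a0=1.0, b0=1.0,
                            threshold=0.95, n_draws=5000, seed=0):
    """Sequentially update a Gamma(a0, b0) prior on the relative risk of an
    adverse event and signal at the first look where the posterior
    probability that RR > 1 exceeds `threshold`. Returns the earliest time
    to signal (1-based look index), or None if no signal occurs."""
    rng = np.random.default_rng(seed)
    c_cum, e_cum = 0.0, 0.0
    for look, (c, e) in enumerate(zip(counts, expected), start=1):
        c_cum += c            # cumulative observed adverse events
        e_cum += e            # cumulative expected background events
        # Posterior for RR is Gamma(a0 + sum c, rate b0 + sum e)
        draws = rng.gamma(a0 + c_cum, 1.0 / (b0 + e_cum), n_draws)
        if np.mean(draws > 1.0) >= threshold:
            return look
    return None
```

With strongly elevated counts the rule signals immediately; when counts track the background rate it never signals, which is the trade-off against MaxSPRT that the paper's metrics (false positive/negative rates, time to signal) quantify.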
882.
In this paper, a jump–diffusion Omega model with a two-step premium rate is studied. In this model, the surplus process is a compound Poisson process perturbed by a Brownian motion. First, using the strong Markov property, integro-differential equations for the Gerber–Shiu expected discounted penalty function and the bankruptcy probability are derived. Second, for a constant bankruptcy rate function, renewal equations satisfied by the Gerber–Shiu expected discounted penalty function are obtained and, by iteration, closed-form solutions are given. Explicit solutions of the Gerber–Shiu function are then obtained when the individual claim sizes follow an exponential distribution. Finally, a numerical example illustrates some properties of the model.
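A Monte Carlo sketch of such a surplus process can make the model concrete. All parameter values below (premium levels, claim rate, diffusion volatility, constant bankruptcy rate while the surplus is negative) are illustrative placeholders, not values from the paper:

```python
import numpy as np

def omega_bankruptcy_prob(u=5.0, c1=1.5, c2=1.0, b=3.0, lam=1.0, mu=1.0,
                          sigma=0.5, omega=0.2, horizon=50.0, dt=0.01,
                          n_paths=2000, seed=1):
    """Finite-horizon Monte Carlo estimate of the bankruptcy probability in
    a jump-diffusion Omega model with a two-step premium rate: premium c1
    while the surplus is below level b and c2 above it; claims are
    exponential(mu) arriving at Poisson rate lam; while the surplus is
    negative, bankruptcy occurs at constant rate omega (Euler scheme)."""
    rng = np.random.default_rng(seed)
    n_steps = int(horizon / dt)
    bankrupt = 0
    for _ in range(n_paths):
        x = u
        for _ in range(n_steps):
            prem = c1 if x < b else c2                    # two-step premium
            x += prem * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            if rng.random() < lam * dt:                   # claim arrival
                x -= rng.exponential(mu)
            if x < 0 and rng.random() < omega * dt:       # Omega bankruptcy
                bankrupt += 1
                break
    return bankrupt / n_paths
```

Unlike classical ruin, a negative surplus here is not immediately terminal; the path only dies at the bankruptcy-rate events, which is the defining feature of Omega models.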
883.
This paper proposes an algorithm to replace the current manual model-selection practice at energy companies, where a dedicated department maintains a library of electric load models and manually chooses one best model each day for the daily forecast. The proposed algorithm combines change-point estimation, a fast ECM algorithm based on the empirical probability function, and hidden Markov chain methods. We train the algorithm's parameters on a historical dataset consisting of loads, exogenous information such as temperature, and the daily recommended best model, which is sometimes unavailable. Simulations and a test on a real-world dataset show that, compared with other state-of-the-art algorithms, the proposed algorithm is fast and efficient for short-term electric load forecasting. An implementation of the proposed algorithm, written in Matlab, is provided in the supplementary file.
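The hidden-Markov-chain idea can be sketched as follows: treat the identity of the day's best model as a hidden state and forward-filter it from each model's forecast errors. This is a bare-bones illustration with Gaussian emissions and a symmetric sticky transition matrix, not the paper's combined change-point/ECM algorithm:

```python
import numpy as np

def filter_best_model(errors, trans_stay=0.9, sigma=1.0):
    """Forward-filter a hidden Markov chain whose state is the index of the
    best forecasting model. `errors` is an (n_days, n_models) array of
    forecast errors; emissions are Gaussian in the error. Returns the most
    likely model index for each day."""
    n_days, n_models = errors.shape
    trans = np.full((n_models, n_models), (1 - trans_stay) / (n_models - 1))
    np.fill_diagonal(trans, trans_stay)      # sticky: best model persists
    alpha = np.full(n_models, 1.0 / n_models)
    best = []
    for t in range(n_days):
        emit = np.exp(-0.5 * (errors[t] / sigma) ** 2)   # Gaussian likelihood
        alpha = emit * (trans.T @ alpha)                  # predict + update
        alpha /= alpha.sum()
        best.append(int(np.argmax(alpha)))
    return best
```

The stickiness parameter plays the role of the prior belief that yesterday's best model tends to remain best today.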
884.
Currently, a binary alarm system is used in the United States to issue deterministic warning polygons for tornado events. To enhance the effectiveness of the weather information, a likelihood alarm system, which uses a tool called probabilistic hazard information (PHI), is being developed at the National Severe Storms Laboratory to issue probabilistic information about the threat. This study investigates the effects of providing uncertainty information about a tornado occurrence through the PHI's graphical swath on laypeople's concern, fear, and protective action, as compared with providing warning information with the deterministic polygon. Displays of color-coded swaths and deterministic polygons were shown to subjects. Some displays had a blue background denoting the probability of any tornado formation in the general area. Participants were asked to report their levels of concern, fear, and protective action at randomly chosen locations within each of seven designated levels on each display. Analysis of a three-stage nested design showed that providing uncertainty information via the PHI appropriately increased recipients' levels of concern, fear, and protective action in highly dangerous scenarios (those with a more than 60% chance of being affected by the threat), as compared with deterministic polygons. The blue background and the type of color coding did not have a significant effect on people's perception of the threat or their reaction to it. This study shows that a likelihood alarm system leads to more conscious decision making by weather information recipients and enhances system safety.
885.
Motivated by McShane, Adrian, Bradlow, and Fader [Journal of Business and Economic Statistics, 26, 2008, 369–378], we introduce thirteen discrete distributions. We give explicit expressions for their probability mass functions. We analyze two football data sets and show that some of the proposed distributions provide better fits than the distribution due to McShane, Adrian, Bradlow, and Fader.
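As a loose illustration of working with a discrete distribution through its probability mass function, here is the type-I discrete Weibull pmf with a crude grid-search maximum-likelihood fit. The parameterization and fitting method are our own simplification for illustration; they are not the paper's thirteen distributions or its estimation procedure:

```python
import numpy as np

def discrete_weibull_pmf(x, q, beta):
    """Type-I discrete Weibull: P(X = x) = q**(x**beta) - q**((x+1)**beta),
    for x = 0, 1, 2, ... with 0 < q < 1 and beta > 0."""
    x = np.asarray(x, dtype=float)
    return q ** (x ** beta) - q ** ((x + 1) ** beta)

def fit_discrete_weibull(data):
    """Crude grid-search maximum likelihood over (q, beta)."""
    best, best_nll = (None, None), np.inf
    for q in np.linspace(0.05, 0.95, 19):
        for beta in np.linspace(0.2, 3.0, 15):
            p = np.clip(discrete_weibull_pmf(data, q, beta), 1e-12, None)
            nll = -np.sum(np.log(p))        # negative log-likelihood
            if nll < best_nll:
                best, best_nll = (q, beta), nll
    return best
```

The pmf telescopes, so it sums to one over the nonnegative integers, which is a convenient sanity check when defining new discrete families.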
886.
This paper is concerned with a model averaging procedure for varying-coefficient partially linear models with missing responses. The profile least-squares estimation process and the inverse probability weighting method are employed to estimate the regression coefficients of the partially restricted models, in which the propensity score is estimated by the covariate balancing propensity score method. The estimators of the linear parameters are shown to be asymptotically normal. We then develop the focused information criterion, formulate the frequentist model averaging estimators, and construct the corresponding confidence intervals. Simulation studies examine the finite-sample performance of the proposed methods. We find that the covariate balancing propensity score improves the performance of the inverse probability weighted estimator. We also demonstrate the superiority of the proposed model averaging estimators over existing strategies in terms of mean squared error and coverage probability. Finally, our approach is applied to a real data example.
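The inverse probability weighting step can be sketched in a simplified linear-model setting with a known propensity. This is only the IPW building block; it is not the paper's covariate balancing propensity score estimator, profile least squares, or FIC averaging:

```python
import numpy as np

def ipw_least_squares(X, y, observed, prop):
    """Inverse-probability-weighted least squares: keep the observed cases,
    weight each by 1/propensity, and solve the weighted normal equations.
    Unbiased for the regression coefficients when responses are missing at
    random with known propensities `prop`."""
    idx = observed.astype(bool)
    w = 1.0 / prop[idx]
    Xo, yo = X[idx], y[idx]
    A = (Xo * w[:, None]).T @ Xo
    b = (Xo * w[:, None]).T @ yo
    return np.linalg.solve(A, b)
```

In the paper's setting the propensity itself is estimated (by covariate balancing), which is what drives the reported efficiency gains over plain IPW.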
887.
ABSTRACT

Expert opinion and judgment enter into the practice of statistical inference and decision-making in numerous ways. Indeed, there is essentially no aspect of scientific investigation in which judgment is not required. Judgment is necessarily subjective, but should be made as carefully, as objectively, and as scientifically as possible.

Elicitation of expert knowledge concerning an uncertain quantity expresses that knowledge in the form of a (subjective) probability distribution for the quantity. Such distributions play an important role in statistical inference (for example as prior distributions in a Bayesian analysis) and in evidence-based decision-making (for example as expressions of uncertainty regarding inputs to a decision model). This article sets out a number of practices through which elicitation can be made as rigorous and scientific as possible.

One such practice is to follow a recognized protocol that is designed to address and minimize the cognitive biases that experts are prone to when making probabilistic judgments. We review the leading protocols in the field, and contrast their different approaches to dealing with these biases through the medium of a detailed case study employing the SHELF protocol.

The article ends with a discussion of how to elicit a joint probability distribution for multiple uncertain quantities, which is a challenge for all the leading protocols. Supplementary materials for this article are available online.
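As a tiny illustration of turning elicited judgements into a probability distribution, here is a normal fit to an expert's median and 90th percentile. Real protocols such as SHELF elicit more judgements, fit richer families, and reconcile multiple experts; this two-number fit is only a sketch:

```python
def normal_from_elicited(median, p90):
    """Fit a normal distribution to an expert's elicited median and 90th
    percentile: the median gives the mean, and the gap to the 90th
    percentile gives the standard deviation via the standard normal
    quantile z_0.90 ~= 1.2816."""
    mu = median
    sigma = (p90 - median) / 1.2816
    return mu, sigma
```

For example, an expert who judges the quantity's median to be 10 and is 90% sure it is below 14 implies a Normal(10, (4/1.2816)^2) distribution under this fit.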
888.
Bootstrap-smoothed (bagged) parameter estimators have been proposed as an improvement on estimators found after preliminary data-based model selection. Efron (2014) gives a very convenient and widely applicable formula for a delta-method approximation to the standard deviation of the bootstrap-smoothed estimator; this approximation provides an easily computed guide to the estimator's accuracy. Efron also considered a confidence interval centred on the bootstrap-smoothed estimator, with width proportional to the estimate of this approximation to the standard deviation. We evaluate this confidence interval in the scenario of two nested linear regression models, a full model and a simpler model, with a preliminary test of the null hypothesis that the simpler model is correct. We derive computationally convenient expressions for the ideal bootstrap-smoothed estimator and for the coverage probability and expected length of this confidence interval. In terms of coverage probability, this confidence interval outperforms the post-model-selection confidence interval with the same nominal coverage and based on the same preliminary test. We also compare the confidence interval centred on the bootstrap-smoothed estimator, in terms of expected length, with the usual confidence interval, with the same minimum coverage probability, based on the full model.
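Efron's delta-method approximation can be computed directly from the bootstrap replications, for a generic estimator. This reproduces only the smoothing and standard-deviation formula, not the paper's nested-regression analysis or its ideal (infinite-B) expressions:

```python
import numpy as np

def bagged_estimator_sd(data, estimator, B=400, seed=0):
    """Bootstrap-smoothed (bagged) estimator and Efron's (2014) delta-method
    approximation to its standard deviation:
        sd = sqrt(sum_j cov_j**2),
    where cov_j is the bootstrap covariance between the replication t*_b and
    Y*_bj, the number of times observation j appears in bootstrap sample b."""
    rng = np.random.default_rng(seed)
    n = len(data)
    counts = np.zeros((B, n))
    t = np.zeros(B)
    for b in range(B):
        idx = rng.integers(0, n, n)      # resample with replacement
        np.add.at(counts[b], idx, 1)     # occurrence counts Y*_bj
        t[b] = estimator(data[idx])      # bootstrap replication t*_b
    smoothed = t.mean()                  # bagged (smoothed) estimator
    cov = ((counts - counts.mean(axis=0)) * (t - smoothed)[:, None]).mean(axis=0)
    return smoothed, float(np.sqrt(np.sum(cov ** 2)))
```

The appeal of the formula is that it reuses the same B replications that produce the smoothed estimator, so no second level of bootstrapping is needed.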
889.
In this paper, confidence intervals (CIs) for the product of powers of the generalized variances of k multivariate normal populations with possibly different dimensions are proposed. The performance of these CIs, in terms of coverage probabilities and average lengths, was evaluated via a Monte Carlo simulation study, with satisfactory results. To demonstrate the utility of the proposed CIs, applications to three real data sets are provided.
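The paper's intervals are for products of powers of several generalized variances; as a baseline sketch, here is only a percentile-bootstrap interval for a single generalized variance, the determinant of the covariance matrix. This is an illustrative alternative, not the paper's proposed construction:

```python
import numpy as np

def gv_bootstrap_ci(X, alpha=0.05, B=500, seed=0):
    """Percentile-bootstrap confidence interval for the generalized
    variance |Sigma| of a multivariate sample X (rows = observations):
    resample rows, recompute det(cov), and take empirical quantiles."""
    rng = np.random.default_rng(seed)
    n = len(X)
    stats = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)
        stats[b] = np.linalg.det(np.cov(X[idx].T))
    return np.quantile(stats, alpha / 2), np.quantile(stats, 1 - alpha / 2)
```

A Monte Carlo study like the paper's would repeat this over many simulated data sets and record how often the interval covers the true determinant, together with its average length.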
890.
To characterize the dependence of a response on covariates of interest, a monotonic structure is linked to a multivariate polynomial transformation of the central subspace (CS) directions with unknown structural degree and dimension. Under a very general semiparametric model formulation, such a sufficient dimension reduction (SDR) score is shown to enjoy existence, optimality, and uniqueness up to scale and location in the defined concordance probability function. In light of these properties and its single-index representation, two types of concordance-based generalized Bayesian information criteria are constructed to estimate the optimal SDR score and the maximum concordance index. The estimation criteria are implemented through effective computational procedures. Generally speaking, the outer-product-of-gradients estimation in the first approach has an advantage in computational efficiency, while the parameterization system in the second approach greatly reduces the number of parameters to estimate. Unlike most existing SDR approaches, only one CS direction is required to be continuous in our proposals. Moreover, the consistency of the structural degree and dimension estimators and the asymptotic normality of the optimal SDR score and maximum concordance index estimators are established under suitable conditions. The performance and practicality of our methodology are investigated through simulations and empirical illustrations.
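The concordance probability at the heart of such criteria can be computed empirically as the fraction of response-ordered pairs whose scores agree in order. A brute-force O(n^2) version (ties in the score counted as half, a common convention):

```python
def concordance_index(score, y):
    """Empirical concordance probability: among all pairs with y_i > y_j,
    the fraction for which score_i > score_j (score ties count 0.5)."""
    n = len(y)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            if y[i] > y[j]:
                den += 1
                if score[i] > score[j]:
                    num += 1
                elif score[i] == score[j]:
                    num += 0.5
    return num / den
```

A perfectly monotone score attains concordance 1, which is why maximizing this index is a natural target for estimating a single-index SDR score.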
Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)