101.
Social networks describe the relationships and interactions among a group of individuals. In many peer relationships, individuals tend to associate more often with some members than others, forming subgroups or clusters. Subgroup structure varies across networks: subgroups may be insular, appearing distinct and isolated from one another, or so integrated that subgroup structure is not visually apparent, and there are numerous ways of quantifying these types of structure. We propose a new model that relates the amount of subgroup integration to network attributes, building on the mixed membership stochastic blockmodel (Airoldi et al., 2008) and subsequent work by Sweet and Zheng (2017) and Sweet et al. (2014). We explore some of the operating characteristics of this model with simulated data and apply it to determine the relationship between teachers' instructional practices and the subgroup structure of their classrooms' peer networks.
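The contrast between insular and integrated subgroups can be made concrete with a deliberately simple (non-mixed-membership) stochastic blockmodel: ties form with one probability within blocks and another between them. This is a minimal sketch of the generative idea, not the paper's model; all names and parameter values are illustrative.

```python
import random

def simulate_sbm(block_sizes, p_within, p_between, seed=0):
    """Simulate an undirected network from a simple stochastic blockmodel.

    Raising p_between toward p_within yields more integrated
    (less insular) subgroups."""
    rng = random.Random(seed)
    # Assign each node a block label.
    labels = [b for b, size in enumerate(block_sizes) for _ in range(size)]
    n = len(labels)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            p = p_within if labels[i] == labels[j] else p_between
            if rng.random() < p:
                edges.add((i, j))
    return labels, edges

labels, edges = simulate_sbm([20, 20], p_within=0.5, p_between=0.05)
within = sum(1 for i, j in edges if labels[i] == labels[j])
print(f"{within} within-block edges out of {len(edges)} total")
```

With these settings, within-block ties heavily dominate, so the two subgroups would appear visually distinct; pushing `p_between` up toward `p_within` erases that structure.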
102.
Summary. Structured additive regression models are perhaps the most commonly used class of models in statistical applications. The class includes, among others, (generalized) linear models, (generalized) additive models, smoothing spline models, state space models, semiparametric regression, spatial and spatiotemporal models, log-Gaussian Cox processes, and geostatistical and geoadditive models. We consider approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters, and the response variables are non-Gaussian. The posterior marginals are not available in closed form owing to the non-Gaussian response variables. For such models, Markov chain Monte Carlo methods can be implemented, but they are not without problems, in terms of both convergence and computational time. In some practical applications the extent of these problems is such that Markov chain Monte Carlo sampling is simply not an appropriate tool for routine analysis. We show that, by using an integrated nested Laplace approximation and its simplified version, we can directly compute very accurate approximations to the posterior marginals. The main benefit of these approximations is computational: where Markov chain Monte Carlo algorithms need hours or days to run, our approximations provide more precise estimates in seconds or minutes. Another advantage of our approach is its generality, which makes it possible to perform Bayesian analysis in an automatic, streamlined way, and to compute model comparison criteria and various predictive measures so that models can be compared and the model under study can be challenged.
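The basic ingredient of this approach, a Laplace (Gaussian) approximation to a posterior, can be sketched on a toy problem far simpler than a latent Gaussian model: a binomial proportion under a uniform prior, where the exact posterior is known and the approximation can be checked. This illustrates only the building block, not INLA's nested scheme.

```python
import math

def laplace_posterior(s, n):
    """Gaussian (Laplace) approximation to the posterior of a binomial
    proportion under a uniform prior: expand the log-posterior
    s*log(p) + (n-s)*log(1-p) around its mode and match the curvature."""
    p_hat = s / n                    # posterior mode
    var = p_hat * (1 - p_hat) / n    # -1 / (second derivative at the mode)
    return p_hat, math.sqrt(var)

mode, sd = laplace_posterior(s=30, n=100)
# The exact posterior is Beta(31, 71); its mean 31/102 is close to the mode.
print(mode, sd)
```

INLA applies approximations of this flavour marginal-by-marginal over a latent field, with numerical integration over the hyperparameters, which is why it can replace hours of sampling with seconds of optimization.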
103.
Several models for studies related to the tensile strength of materials have been proposed in the literature, where the size or length component is taken to be an important factor in studying the specimens' failure behaviour. An important model, developed on the basis of a cumulative damage approach, is the three-parameter extension of the Birnbaum–Saunders fatigue model that incorporates the size of the specimen as an additional variable. This model is a strong competitor of the commonly used Weibull model and performs better than the traditional models, which do not incorporate the size effect. The paper considers two such cumulative damage models, checks their compatibility with a real dataset, compares them with some recent toolkits, and finally recommends the model that appears most appropriate. Throughout, the study is Bayesian, based on Markov chain Monte Carlo simulation.
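The two-parameter core of the Birnbaum–Saunders family can be simulated through its normal representation: if Z is standard normal, then T = β(αZ/2 + √((αZ/2)² + 1))² follows BS(α, β), whose median is β. A minimal sampler, assuming this standard parameterization (not the three-parameter size-extended model the paper studies):

```python
import math
import random

def rbs(alpha, beta, size, seed=0):
    """Draw from the two-parameter Birnbaum-Saunders fatigue-life
    distribution via its normal representation:
    T = beta * (a*Z/2 + sqrt((a*Z/2)^2 + 1))^2 with Z ~ N(0, 1)."""
    rng = random.Random(seed)
    out = []
    for _ in range(size):
        w = alpha * rng.gauss(0.0, 1.0) / 2.0
        out.append(beta * (w + math.sqrt(w * w + 1.0)) ** 2)
    return out

sample = rbs(alpha=0.5, beta=2.0, size=20000)
median = sorted(sample)[len(sample) // 2]
print(round(median, 2))  # the median of a BS(alpha, beta) law is beta
```

Because the map from Z to T is monotone and Z = 0 gives T = β, the sample median should sit close to β, which makes this representation convenient both for simulation and for MCMC model checking.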
104.
Semiparametric regression models that use spline basis functions with penalization have graphical model representations. This link is more powerful than previously established mixed model representations of semiparametric regression, as a larger class of models can be accommodated. Complications such as missingness and measurement error are handled more naturally within the graphical model architecture. Directed acyclic graphs, also known as Bayesian networks, play a prominent role. Graphical model-based Bayesian "inference engines", such as BUGS and VIBES, facilitate fitting and inference. Underlying these are Markov chain Monte Carlo schemes and recent developments in variational approximation theory and methodology.
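The Markov chain Monte Carlo machinery underlying BUGS-style engines can be illustrated with a hand-rolled Gibbs sampler for the simplest conjugate setting, a normal model with unknown mean and precision. This is a hypothetical toy, not BUGS itself; the priors (mu approximately flat, tau ~ Gamma(1, 1)) are illustrative choices.

```python
import random

def gibbs_normal(y, iters=5000, seed=1):
    """Toy Gibbs sampler for y_i ~ N(mu, 1/tau) with conjugate priors
    mu ~ N(0, 1/0.01) and tau ~ Gamma(1, rate=1): alternate draws from
    the two full conditionals, as a graphical-model engine would."""
    rng = random.Random(seed)
    n, ybar = len(y), sum(y) / len(y)
    mu, tau = ybar, 1.0
    draws = []
    for _ in range(iters):
        # mu | tau, y is normal with precision n*tau + 0.01.
        prec = n * tau + 0.01
        mu = rng.gauss(n * tau * ybar / prec, prec ** -0.5)
        # tau | mu, y is Gamma(1 + n/2, rate = 1 + RSS/2).
        rss = sum((v - mu) ** 2 for v in y)
        tau = rng.gammavariate(1.0 + n / 2.0, 1.0 / (1.0 + rss / 2.0))
        draws.append(mu)
    return sum(draws) / len(draws)

data = [4.2, 5.1, 4.8, 5.6, 4.9, 5.3, 4.7, 5.0]
print(round(gibbs_normal(data), 2))  # posterior mean of mu, near the sample mean
```

In a penalized-spline regression the same alternation runs over spline coefficients and smoothing (variance) parameters; the DAG structure determines exactly which full conditionals are needed.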
105.
Using generalized linear models (GLMs), Jalaludin et al. (2006; J. Exposure Analysis and Epidemiology 16, 225–237) studied the association between the daily number of visits to emergency departments for cardiovascular disease by the elderly (65+) and five measures of ambient air pollution. Bayesian methods provide an alternative approach to classical time series modelling and are starting to be more widely used. This paper applies Bayesian methods to the dataset used by Jalaludin et al. (2006) and compares the results with those obtained by Jalaludin et al. (2006) using GLM methods.
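The classical GLM side of such a comparison is a Poisson log-linear model for daily counts, fitted by iteratively reweighted least squares (IRLS). A minimal sketch with one covariate and invented counts (not the Jalaludin et al. data, which are not reproduced here):

```python
import math

def poisson_irls(x, y, iters=25):
    """Fit a Poisson log-linear model E[y] = exp(b0 + b1*x) by
    iteratively reweighted least squares, the standard GLM algorithm."""
    b0, b1 = math.log(sum(y) / len(y)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # Working response z and weights w for the log link.
        z = [b0 + b1 * xi + (yi - mi) / mi for xi, yi, mi in zip(x, y, mu)]
        w = mu
        # Solve the 2x2 weighted least-squares normal equations.
        s0 = sum(w)
        s1 = sum(wi * xi for wi, xi in zip(w, x))
        s2 = sum(wi * xi * xi for wi, xi in zip(w, x))
        t0 = sum(wi * zi for wi, zi in zip(w, z))
        t1 = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
        det = s0 * s2 - s1 * s1
        b0, b1 = (s2 * t0 - s1 * t1) / det, (s0 * t1 - s1 * t0) / det
    return b0, b1

b0, b1 = poisson_irls([0, 1, 2, 3], [2, 3, 7, 13])
print(round(b0, 3), round(b1, 3))
```

At convergence the fitted means reproduce the total count and satisfy the score equations, which is a useful correctness check; the Bayesian alternative replaces this point estimate with a full posterior over (b0, b1).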
106.
The variational approach to Bayesian inference enables simultaneous estimation of model parameters and model complexity. An interesting feature of this approach is that it also leads to an automatic choice of model complexity. Empirical results from the analysis of hidden Markov models with Gaussian observation densities illustrate this. If the variational algorithm is initialized with a large number of hidden states, redundant states are eliminated as the method converges to a solution, thereby leading to a selection of the number of hidden states. In addition, through the use of a variational approximation, the deviance information criterion for Bayesian model selection can be extended to the hidden Markov model framework. Calculation of the deviance information criterion provides a further tool for model selection, which can be used in conjunction with the variational approach.
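The deviance information criterion itself is easiest to see on a model far simpler than an HMM. A sketch, assuming a normal-mean model with known variance and an (approximately) flat prior, where the effective number of parameters pD should come out close to 1, since there is one free parameter:

```python
import math
import random

def dic_normal_mean(y, sigma=1.0, draws=20000, seed=0):
    """DIC from posterior draws for y_i ~ N(mu, sigma^2), sigma known,
    flat prior on mu: DIC = Dbar + pD with pD = Dbar - D(posterior mean),
    where D(mu) = -2 * log-likelihood."""
    rng = random.Random(seed)
    n, ybar = len(y), sum(y) / len(y)

    def deviance(mu):
        ll = sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
                 - (v - mu) ** 2 / (2 * sigma ** 2) for v in y)
        return -2.0 * ll

    # Under a flat prior the posterior of mu is N(ybar, sigma^2 / n).
    mus = [rng.gauss(ybar, sigma / math.sqrt(n)) for _ in range(draws)]
    dbar = sum(deviance(m) for m in mus) / draws
    pd = dbar - deviance(sum(mus) / draws)
    return dbar + pd, pd

dic, pd = dic_normal_mean([0.3, -0.1, 0.8, 0.4, 0.2])
print(round(pd, 2))
```

The variational extension discussed in the abstract replaces the intractable posterior draws of an HMM with draws (or expectations) under the fitted variational approximation; the bookkeeping of Dbar and pD is unchanged.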
107.
We propose a modification of the variant of link-tracing sampling suggested by Félix-Medina and Thompson [M.H. Félix-Medina, S.K. Thompson, Combining cluster sampling and link-tracing sampling to estimate the size of hidden populations, Journal of Official Statistics 20 (2004) 19–38] that allows the researcher a degree of control over the final sample size, the precision of the estimates, or other characteristics of the sample that the researcher is interested in controlling. We achieve this goal by selecting an initial sequential sample of sites instead of an initial simple random sample of sites, as those authors suggested. We estimate the population size by means of the maximum likelihood estimators suggested by the above-mentioned authors or by the Bayesian estimators proposed by Félix-Medina and Monjardin [M.H. Félix-Medina, P.E. Monjardin, Combining link-tracing sampling and cluster sampling to estimate the size of hidden populations: A Bayesian-assisted approach, Survey Methodology 32 (2006) 187–195]. Variances are estimated by means of jackknife and bootstrap estimators as well as by the delta estimators proposed in the two above-mentioned papers. Interval estimates of the population size are obtained by means of Wald and bootstrap confidence intervals. The results of an exploratory simulation study indicate good performance of the proposed sampling strategy.
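The simplest relative of these population-size estimators is Chapman's bias-corrected two-sample capture-recapture estimator, shown here with its usual variance estimate and a 95% Wald interval echoing the interval estimates above. This is an illustrative stand-in, not the Félix-Medina and Thompson estimator itself; the counts are invented.

```python
import math

def chapman_estimate(n1, n2, m):
    """Chapman's bias-corrected two-sample capture-recapture estimator
    of a population size N, from first-sample size n1, second-sample
    size n2, and m recaptures, with variance estimate and Wald CI."""
    n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)) / ((m + 1) ** 2 * (m + 2))
    half = 1.96 * math.sqrt(var)
    return n_hat, (n_hat - half, n_hat + half)

n_hat, ci = chapman_estimate(n1=200, n2=150, m=30)
print(round(n_hat), [round(v) for v in ci])
```

Link-tracing designs generalize this idea to hidden populations reached through network links, which is why the likelihood and Bayesian machinery of the cited papers is needed in place of this closed form.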
108.
The distribution of the aggregate claims in one year plays an important role in actuarial statistics for computing, for example, insurance premiums when both the number and the size of the claims must be incorporated into the model. When the number of claims follows a Poisson distribution, the aggregate distribution is called the compound Poisson distribution. In this article we assume that the claim size follows an exponential distribution, and we then study this model extensively by assuming a bidimensional prior distribution, with gamma marginals, for the parameters of the Poisson and exponential distributions. This study yields expressions for net premiums and for marginal and posterior distributions in terms of some well-known special functions used in statistics. A Bayesian robustness study of the model is then carried out. Bayesian robustness for bidimensional models was treated in depth in the 1990s, producing numerous results, but few applications dealing with this problem can be found in the literature.
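For the compound Poisson model with exponential claims, the net premium is E[S] = E[N]·E[X] = λ/rate, which a Monte Carlo simulation can verify. A minimal sketch with illustrative parameter values (λ = 2 claims per year, exponential rate 0.5, so the exact premium is 4):

```python
import math
import random

def rpois(rng, lam):
    """Poisson sampler via Knuth's product-of-uniforms method."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < limit:
            return k
        k += 1

def net_premium_mc(lam, rate, sims=50000, seed=0):
    """Monte Carlo net premium E[S] for aggregate annual claims
    S = X_1 + ... + X_N with N ~ Poisson(lam) and X_i ~ Exp(rate).
    The exact value is lam / rate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(sims):
        n = rpois(rng, lam)
        total += sum(rng.expovariate(rate) for _ in range(n))
    return total / sims

print(round(net_premium_mc(lam=2.0, rate=0.5), 2))
```

The Bayesian version in the paper additionally averages λ and the exponential rate over their bidimensional gamma-marginal prior, which is what produces the special-function expressions for the premium.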
109.
Abstract. One of the main research areas in Bayesian nonparametrics is the proposal and study of priors which generalize the Dirichlet process. In this paper, we provide a comprehensive Bayesian non-parametric analysis of random probabilities which are obtained by normalizing random measures with independent increments (NRMIs). Special cases of these priors have already been shown to be useful for statistical applications such as mixture models and species sampling problems. However, in order to fully exploit these priors, the derivation of the posterior distribution of NRMIs is crucial: here we achieve this goal and, indeed, provide explicit and tractable expressions suitable for practical implementation. The posterior distribution of an NRMI turns out to be a mixture with respect to the distribution of a specific latent variable. The analysis is completed by the derivation of the corresponding predictive distributions and by a thorough investigation of the marginal structure. These results allow us to derive a generalized Blackwell–MacQueen sampling scheme, which is then adapted to cover also mixture models driven by general NRMIs.
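The Blackwell–MacQueen scheme being generalized here is easiest to see in its original Dirichlet-process form: each new draw either repeats an earlier value, with probability proportional to its count, or is a fresh draw from the base measure, with probability proportional to the concentration parameter α. A minimal sketch of that special case:

```python
import random

def blackwell_macqueen(n, alpha, base_draw, seed=0):
    """Blackwell-MacQueen urn scheme for the Dirichlet process: the
    i-th draw is fresh from the base measure with probability
    alpha / (alpha + i), else it repeats a uniformly chosen past draw."""
    rng = random.Random(seed)
    draws = []
    for i in range(n):
        if rng.random() < alpha / (alpha + i):
            draws.append(base_draw(rng))     # new value from base measure
        else:
            draws.append(rng.choice(draws))  # repeat a previous draw
    return draws

sample = blackwell_macqueen(500, alpha=2.0,
                            base_draw=lambda r: r.gauss(0, 1))
print(len(set(sample)), "distinct values among", len(sample))
```

The ties among draws induce the random partition used in species sampling and mixture models; the paper's generalized scheme replaces these Dirichlet-process predictive weights with the NRMI predictive distributions, mixed over the latent variable.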
110.
The Bayesian analysis based on the partial likelihood for Cox's proportional hazards model is frequently used because of its simplicity. The Bayesian partial likelihood approach is often justified by showing that it approximates the full Bayesian posterior of the regression coefficients under a diffuse prior on the baseline hazard function. This, however, may not be appropriate when ties exist among the uncensored observations: in that case, the full Bayesian and Bayesian partial likelihood posteriors can differ substantially. In this paper, we propose a new Bayesian partial likelihood approach for data with many tied observations and justify its use.
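To make the role of ties concrete, here is the log partial likelihood under Breslow's standard tie handling, in which all events tied at a time share the same full risk-set sum; this is the conventional quantity whose Bayesian use the abstract questions, not the paper's new approach. Data and covariates are invented for illustration.

```python
import math

def breslow_log_pl(times, events, x, beta):
    """Cox log partial likelihood with Breslow's handling of tied event
    times, for a single covariate x: at each distinct event time, every
    tied event contributes against the same full risk-set sum."""
    distinct = sorted({t for t, e in zip(times, events) if e})
    ll = 0.0
    for td in distinct:
        deaths = [i for i, (t, e) in enumerate(zip(times, events))
                  if t == td and e]
        risk = [i for i, t in enumerate(times) if t >= td]
        ll += sum(x[i] * beta for i in deaths)
        ll -= len(deaths) * math.log(sum(math.exp(x[i] * beta) for i in risk))
    return ll

ll0 = breslow_log_pl([1, 1, 2, 3], [1, 1, 1, 1],
                     [0.5, -0.2, 0.3, 0.1], beta=0.0)
print(round(ll0, 4))  # at beta = 0 this equals -(2*log 4 + log 2)
```

At beta = 0 each distinct event time contributes -(number of tied events) * log(risk-set size), which makes the tie adjustment easy to verify by hand; with heavy ties, approximations of this kind drift away from the full Bayesian posterior, motivating the paper's proposal.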