242 results found (search time: 15 ms)
121.
This paper outlines the workflow of genetic algorithms and issues in their application. It mainly discusses the basic concepts of genetic algorithms, the encoding of data, and the role of the fitness function, and uses code examples to illustrate the basic use of the selection and crossover operations.
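The selection and crossover operators described above can be sketched as follows (an illustrative Python sketch, not the paper's own code; fitness-proportionate selection and single-point crossover on bit strings are assumed):

```python
import random

def roulette_select(population, fitnesses):
    """Fitness-proportionate (roulette-wheel) selection: pick one individual
    with probability proportional to its fitness."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for individual, fit in zip(population, fitnesses):
        acc += fit
        if r <= acc:
            return individual
    return population[-1]

def one_point_crossover(parent_a, parent_b):
    """Single-point crossover on two equal-length bit strings: swap the
    tails after a randomly chosen cut point, producing two children."""
    point = random.randint(1, len(parent_a) - 1)
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])
```

For example, crossing `"00000"` with `"11111"` yields two children that together contain exactly the parents' genes, with a single switch point in each.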
122.
When available data comprise a number of sampled households in each of a number of income classes, the likelihood function is obtained from a multinomial distribution with the income class population proportions as the unknown parameters. Two methods for going from this likelihood function to a posterior distribution on the Gini coefficient are investigated. In the first method, two alternative assumptions about the underlying income distribution are considered, namely a lognormal distribution and the Singh–Maddala (1976) income distribution. In these cases the likelihood function is reparameterized and the Gini coefficient is a nonlinear function of the income distribution parameters. The Metropolis algorithm is used to find the corresponding posterior distributions of the Gini coefficient from a sample of Bangkok households. The second method does not require an assumption about the nature of the income distribution, but uses (a) triangular prior distributions, and (b) beta prior distributions, on the location of mean income within each income class. By sampling from these distributions, and the Dirichlet posterior distribution of the income class proportions, alternative posterior distributions of the Gini coefficient are calculated.
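The first method can be sketched as follows, assuming the lognormal income model, for which the Gini coefficient has the closed form G = 2Φ(σ/√2) − 1; the grouped-data likelihood and the random-walk Metropolis step below are illustrative, not the paper's implementation:

```python
import math, random

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def log_lik(mu, sigma, bounds, counts):
    """Multinomial log-likelihood of grouped incomes under a lognormal model.
    bounds: class edges [b0, ..., bK] (b0 > 0, bK may be inf); counts: K class counts."""
    ll = 0.0
    for k in range(len(counts)):
        lo = Phi((math.log(bounds[k]) - mu) / sigma)
        hi = Phi((math.log(bounds[k + 1]) - mu) / sigma) \
            if math.isfinite(bounds[k + 1]) else 1.0
        ll += counts[k] * math.log(max(hi - lo, 1e-300))
    return ll

def metropolis_gini(bounds, counts, n_iter=5000, step=0.05, seed=1):
    """Random-walk Metropolis on (mu, log sigma) with a flat prior; returns
    posterior draws of the lognormal Gini coefficient 2*Phi(sigma/sqrt(2)) - 1."""
    rng = random.Random(seed)
    mu, log_sigma = math.log(bounds[1]), 0.0
    ll = log_lik(mu, math.exp(log_sigma), bounds, counts)
    ginis = []
    for _ in range(n_iter):
        mu_p = mu + rng.gauss(0, step)
        ls_p = log_sigma + rng.gauss(0, step)
        ll_p = log_lik(mu_p, math.exp(ls_p), bounds, counts)
        if math.log(rng.random()) < ll_p - ll:    # accept/reject
            mu, log_sigma, ll = mu_p, ls_p, ll_p
        ginis.append(2.0 * Phi(math.exp(log_sigma) / math.sqrt(2.0)) - 1.0)
    return ginis
```

The posterior sample of Gini values can then be summarized by its mean and quantiles; every draw lies in (0, 1) by construction.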
123.
It is now possible to carry out Bayesian image segmentation from a continuum parametric model with an unknown number of regions. However, few suitable parametric models exist. We set out to model processes whose realizations are naturally described by coloured planar triangulations. Triangulations are already used to represent image structure in machine vision and, in finite element analysis, for domain decomposition. However, no normalizable parametric model with realizations that are coloured triangulations has been specified to date. We show how this must be done, and in particular we prove that a normalizable measure on the space of triangulations in the interior of a fixed simple polygon derives from a Poisson point process of vertices. We show how such models may be analysed by using Markov chain Monte Carlo methods and we present two case studies, including convergence analysis.
124.
A number of market changes are impacting the way financial institutions are managing their automated teller machines (ATMs). We propose a new class of adaptive data‐driven policies for a stochastic inventory control problem faced by a large financial institution that manages cash at several ATMs. Senior management were concerned that their current cash supply system to manage ATMs was inefficient and outdated, and suspected that using improved cash management could reduce overall system cost. Our task was to provide a robust procedure to tackle the ATM's cash deployment strategies. Current industry practice uses a periodic review system with infrequent parameter updates for cash management based on the assumption that demand is normally distributed during the review period. This assumption did not hold during our investigation, warranting a new and robust analysis. Moreover, we discovered that forecast errors are often not normally distributed and that these error distributions change dramatically over time. Our approach finds the optimal time series forecaster and the best‐fitting weekly forecast error distribution. The guaranteed optimal target cash inventory level and time between orders could only be obtained through an optimization module that was embedded in a simulation routine that we built for the institution. We employed an exploratory case study methodology to collect cash withdrawal data at 21 ATMs owned and operated by the financial institution. Our new approach shows a 4.6% overall cost reduction. This reflects an annual cost savings of over $250,000 for the 2,500 ATM units that are operated by the bank.
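A minimal, distribution-free sketch of the target-level idea described above: the target cash level is the point forecast plus an empirical quantile of past forecast errors, which replaces the failed normality assumption. The function name and interface are hypothetical, not the institution's module:

```python
def target_cash_level(forecast, past_errors, service_level=0.95):
    """Set an ATM target cash level as the point forecast plus an empirical
    quantile of past forecast errors (actual minus forecast), so the
    stock-out probability is roughly 1 - service_level.  Distribution-free:
    only the observed error sample is used."""
    errs = sorted(past_errors)
    idx = min(int(service_level * len(errs)), len(errs) - 1)
    return forecast + max(errs[idx], 0.0)   # never stock below the forecast
```

For example, with a forecast of 100 units and errors ranging over −10..10, a 95% service level adds the 95th-percentile error (9) to the forecast.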
125.
This paper describes a Bayesian approach to mixture modelling and a method based on the predictive distribution for determining the number of components in the mixture. The implementation is done through the use of the Gibbs sampler. The method is illustrated with mixtures of normal and gamma distributions. Analyses are presented for one simulated and one real data example, and the Bayesian results are then compared with the likelihood approach for the two examples.
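A stripped-down sketch of the Gibbs sampler for the normal-mixture case (two components, unit variances and equal weights assumed for brevity; an illustration of the general scheme, not the paper's implementation):

```python
import math, random

def gibbs_normal_mixture(data, n_iter=2000, seed=7):
    """Gibbs sampler for a two-component normal mixture with unit variances
    and equal weights: alternately sample component allocations given the
    means, then the means given the allocations (flat prior on the means,
    so the full conditional of each mean is normal)."""
    rng = random.Random(seed)
    mu = [min(data), max(data)]       # crude initial means
    draws = []
    for _ in range(n_iter):
        # 1. allocations given means
        z = []
        for x in data:
            w0 = math.exp(-0.5 * (x - mu[0]) ** 2)
            w1 = math.exp(-0.5 * (x - mu[1]) ** 2)
            z.append(0 if rng.random() < w0 / (w0 + w1) else 1)
        # 2. means given allocations
        for j in (0, 1):
            xs = [x for x, zi in zip(data, z) if zi == j]
            if xs:
                m = sum(xs) / len(xs)
                mu[j] = rng.gauss(m, 1.0 / math.sqrt(len(xs)))
        draws.append(tuple(mu))
    return draws
```

After a burn-in, the retained draws approximate the joint posterior of the two means; the gamma-mixture case follows the same alternating scheme with different full conditionals.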
126.
127.
In this paper, we consider a Bayesian mixture model that allows us to integrate out the weights of the mixture in order to obtain a procedure in which the number of clusters is an unknown quantity. To determine clusters and estimate parameters of interest, we develop an MCMC algorithm termed the sequential data-driven allocation sampler. In this algorithm, a single observation has a non-null probability of creating a new cluster, and a set of observations may create a new cluster through split-merge moves. The split-merge moves use a sequential allocation procedure based on allocation probabilities computed from the Kullback–Leibler divergence between the posterior distribution using the observations previously allocated and the posterior distribution including a 'new' observation. We verify the performance of the proposed algorithm on simulated data and then illustrate its use on three publicly available real data sets.
128.
This study focuses on the classical and Bayesian analysis of a k-component load-sharing parallel system in which the components have time-dependent failure rates. In the classical setup, the maximum likelihood estimates of the load-share parameters are obtained together with their standard errors (SEs). (1 − γ)100% simultaneous confidence intervals and two bootstrap confidence intervals for the parameters and for the system reliability and hazard functions are constructed. Further, recognizing that life-testing experiments are very time consuming, the parameters involved in the failure time distribution of the system are expected to exhibit some random variation. Therefore, Bayes estimates, along with their posterior SEs, of the parameters and of the system reliability and hazard functions are obtained by assuming gamma and Jeffreys priors for the unknown parameters. A Markov chain Monte Carlo technique, the Gibbs sampler, is used to obtain the Bayes estimates and highest posterior density credible intervals.
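The bootstrap confidence intervals mentioned above can be illustrated with a generic nonparametric percentile bootstrap (the estimator passed in is a stand-in; the paper's estimators come from the load-sharing likelihood):

```python
import random

def bootstrap_ci(data, estimator, n_boot=2000, level=0.95, seed=3):
    """Nonparametric percentile bootstrap CI for an arbitrary estimator:
    resample the data with replacement n_boot times, re-estimate each time,
    and read off the (1-level)/2 and (1+level)/2 empirical quantiles."""
    rng = random.Random(seed)
    stats = sorted(
        estimator([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    lo = stats[int((1 - level) / 2 * n_boot)]
    hi = stats[int((1 + level) / 2 * n_boot) - 1]
    return lo, hi
```

In the paper's setting, `estimator` would refit the load-share MLE on each resampled set of failure times; here any scalar statistic (e.g. the mean) demonstrates the mechanics.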
129.
This paper evaluates different aspects of the Monte Carlo expectation–maximization (MCEM) algorithm for estimating heavy-tailed mixed logistic regression (MLR) models. As a novelty, it also proposes a multiple-chain Gibbs sampler to generate from the distributions of the latent variables, thus obtaining independent samples. In heavy-tailed MLR models, the analytical forms of the full conditional distributions of the random effects are unknown, so four different Metropolis–Hastings algorithms are used to generate from them. We also discuss stopping rules in order to obtain more efficient algorithms for heavy-tailed MLR models. The algorithms are compared through the analysis of simulated data and Ascaris suum data.
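A generic random-walk Metropolis–Hastings sampler of the kind used to draw from an intractable full conditional can be sketched as follows (illustrative; the paper compares four specific variants, none of which is reproduced here):

```python
import math, random

def rw_metropolis(log_target, x0, n_iter=5000, step=0.5, seed=11):
    """Random-walk Metropolis–Hastings: propose x + N(0, step^2) and accept
    with probability min(1, target(y)/target(x)), given only an unnormalized
    log-density log_target."""
    rng = random.Random(seed)
    x, lt = x0, log_target(x0)
    out = []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)
        lt_y = log_target(y)
        if math.log(rng.random()) < lt_y - lt:   # accept
            x, lt = y, lt_y
        out.append(x)
    return out
```

In the MCEM context, `log_target` would be the unnormalized log full conditional of one random effect; as a sanity check the sampler recovers the moments of a standard normal.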
130.
In this paper, maximum likelihood and Bayes estimators of the parameters, reliability and hazard functions are obtained for the two-parameter bathtub-shaped lifetime distribution when a sample is available under a progressive Type-II censoring scheme. The Markov chain Monte Carlo (MCMC) method is used to compute the Bayes estimates of the model parameters, which are assumed to have independent gamma priors. A Gibbs-within-Metropolis–Hastings algorithm is applied to generate MCMC samples from the posterior density function. Based on the generated samples, the Bayes estimates and highest posterior density credible intervals of the unknown parameters, as well as of the reliability and hazard functions, are computed. The Bayes estimators are obtained under both the balanced squared-error loss and the balanced linear-exponential (BLINEX) loss. Moreover, approximate confidence intervals (CIs) based on the asymptotic normality of the maximum likelihood estimators are obtained. To construct asymptotic CIs for the reliability and hazard functions, their variances are needed, and these are approximated by the delta method and the bootstrap. Two real data sets are analyzed to demonstrate how the proposed methods can be used in practice.
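Assuming the two-parameter bathtub-shaped distribution of Chen (2000), F(t) = 1 − exp{λ(1 − exp(t^β))} (a common reading of "two-parameter bathtub-shaped lifetime distribution", but an assumption here), the reliability and hazard functions and the progressive Type-II censored log-likelihood can be sketched as:

```python
import math

def reliability(t, lam, beta):
    """R(t) = exp(lam * (1 - exp(t**beta))) for the assumed Chen (2000) model."""
    return math.exp(lam * (1.0 - math.exp(t ** beta)))

def hazard(t, lam, beta):
    """h(t) = lam * beta * t**(beta-1) * exp(t**beta); bathtub-shaped for beta < 1."""
    return lam * beta * t ** (beta - 1.0) * math.exp(t ** beta)

def log_lik_progressive(times, removals, lam, beta):
    """Log-likelihood under progressive Type-II censoring: at the i-th observed
    failure time, removals[i] surviving units are withdrawn; each observed
    failure contributes log f(t) = log(h(t) * R(t)), and each withdrawn unit
    contributes log R(t)."""
    ll = 0.0
    for t, r in zip(times, removals):
        ll += math.log(hazard(t, lam, beta) * reliability(t, lam, beta))
        ll += r * math.log(reliability(t, lam, beta))
    return ll
```

This log-likelihood is what both the maximum likelihood routine and the Metropolis–Hastings acceptance ratio would evaluate; the gamma priors would simply add their log-densities to it.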