31.
To reduce nonresponse bias in sample surveys, a method of nonresponse weighting adjustment is often used which consists of multiplying the sampling weight of the respondent by the inverse of the estimated response probability. The authors examine the asymptotic properties of this estimator. They prove that it is generally more efficient than an estimator which uses the true response probability, provided that the parameters which govern this probability are estimated by maximum likelihood. The authors discuss variance estimation methods that account for the effect of using the estimated response probability; they compare their performances in a small simulation study. They also discuss extensions to the regression estimator.
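The adjustment described above can be sketched with a toy weighting-class version: response probabilities are estimated within classes and each respondent is weighted by the inverse of the estimated probability. This is a minimal illustration, not the authors' maximum-likelihood estimator, and the population, groups, and probabilities below are all invented.

```python
import random

random.seed(0)

# Hypothetical population: two groups with different means; group 1 responds less often.
N = 100_000
pop = [(g, 10.0 + 5.0 * g + random.gauss(0, 1)) for g in (0, 1) for _ in range(N // 2)]

# Simulate nonresponse with group-specific true response probabilities.
true_p = {0: 0.9, 1: 0.5}
sample = [(g, y, random.random() < true_p[g]) for g, y in pop]
respondents = [(g, y) for g, y, r in sample if r]

# Estimate the response probability within each weighting class (group).
n_class = {0: 0, 1: 0}
r_class = {0: 0, 1: 0}
for g, _, r in sample:
    n_class[g] += 1
    r_class[g] += r
p_hat = {g: r_class[g] / n_class[g] for g in (0, 1)}

# Adjusted estimator: weight each respondent by 1 / estimated response probability.
adj_mean = sum(y / p_hat[g] for g, y in respondents) / sum(1 / p_hat[g] for g, _ in respondents)

# The naive respondent mean is biased toward the high-response group.
naive_mean = sum(y for _, y in respondents) / len(respondents)
true_mean = sum(y for _, y in pop) / len(pop)
```

With the group means 10 and 15 and response rates 0.9 and 0.5, the naive mean over-represents group 0, while the weighted estimator recovers the population mean of about 12.5.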
32.
A novel framework is proposed for the estimation of multiple sinusoids from irregularly sampled time series. This spectral analysis problem is addressed as an under-determined inverse problem, where the spectrum is discretized on an arbitrarily thin frequency grid. As we focus on line spectra estimation, the solution must be sparse, i.e. the amplitude of the spectrum must be zero almost everywhere. Such prior information is taken into account within the Bayesian framework. Two models are used to account for the prior sparseness of the solution, namely a Laplace prior and a Bernoulli–Gaussian prior, associated with optimization and stochastic sampling algorithms, respectively. Such approaches are efficient alternatives to the usual sequential prewhitening methods, especially when strong sampling aliases perturb the Fourier spectrum. Both methods should be intensively tested on real data sets by physicists.
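The sparsity mechanism behind the Laplace prior can be illustrated in a drastically simplified setting: for an (approximately) orthogonal dictionary, the MAP estimate under a Laplace prior reduces to soft-thresholding the least-squares amplitudes, which zeroes the spectrum almost everywhere. This is only a toy approximation, not the paper's optimization or sampling algorithms; the signal, grid, and threshold are invented.

```python
import math
import random

random.seed(0)

# Hypothetical irregularly sampled signal: one sinusoid at 1.3 Hz plus noise.
n = 200
t = sorted(random.uniform(0.0, 50.0) for _ in range(n))
f_true = 1.3
y = [2.0 * math.cos(2 * math.pi * f_true * ti) + random.gauss(0, 0.1) for ti in t]

# Thin frequency grid on which the line spectrum is discretized.
freqs = [0.1 * k for k in range(1, 31)]

# Least-squares amplitude at each grid frequency (treating atoms as orthogonal).
amps = []
for f in freqs:
    a = (2.0 / n) * sum(yi * math.cos(2 * math.pi * f * ti) for yi, ti in zip(y, t))
    b = (2.0 / n) * sum(yi * math.sin(2 * math.pi * f * ti) for yi, ti in zip(y, t))
    amps.append(math.hypot(a, b))

# Soft thresholding = Laplace-prior (L1) MAP for an orthogonal design:
# small leakage/noise amplitudes are set exactly to zero -> sparse line spectrum.
lam = 0.5
sparse = [max(s - lam, 0.0) for s in amps]

k_hat = max(range(len(sparse)), key=sparse.__getitem__)
f_hat = freqs[k_hat]
```

Most grid amplitudes are exactly zero after thresholding, and the surviving peak sits at the true frequency.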
33.
We consider the problem of density estimation when the data arrive as a continuous stream with no fixed length. In this setting, implementations of the usual methods of density estimation, such as kernel density estimation, are problematic. We propose a method of density estimation for massive datasets that is based upon taking the derivative of a smooth curve that has been fit through a set of quantile estimates. To achieve this, a low-storage, single-pass, sequential method is proposed for the simultaneous estimation of multiple quantiles of massive datasets; it forms the basis of this method of density estimation. For comparison, we also consider a sequential kernel density estimator. The proposed methods are shown through a simulation study to perform well and to have several distinct advantages over existing methods.
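The single-pass idea can be sketched with a stochastic-approximation quantile tracker: each quantile estimate uses O(1) memory and is nudged by every new observation, and a crude density value then comes from finite differences of the quantile curve. This is a stand-in for the authors' method, with an invented N(0, 1) stream and gain sequence.

```python
import random

random.seed(0)

# Track several quantiles of a stream in one pass, O(1) memory per quantile
# (Robbins-Monro stochastic approximation).
probs = [0.25, 0.50, 0.75]
q = [0.0, 0.0, 0.0]           # running quantile estimates
n = 0
for _ in range(200_000):      # simulated N(0, 1) data stream
    x = random.gauss(0.0, 1.0)
    n += 1
    step = 5.0 / n            # decreasing gain
    for j, p in enumerate(probs):
        # Move up if an observation falls above the estimate too often, down otherwise.
        q[j] += step * (p - (x <= q[j]))

# Density from the quantile curve: the derivative of the distribution function,
# here a crude finite difference f ~ delta_p / delta_q around the median.
f_median = (probs[2] - probs[0]) / (q[2] - q[0])
```

For N(0, 1) the tracked quantiles approach (-0.674, 0, 0.674), and the finite-difference value near the median is close to the true density f(0) ≈ 0.399 (it is a slight underestimate, being an average over the interquartile range).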
34.
Summary. We propose a simple estimation procedure for a proportional hazards frailty regression model for clustered survival data in which the dependence is generated by a positive stable distribution. Inferences for the frailty parameter can be obtained by using output from Cox regression analyses. The computational burden is substantially less than that of the other approaches to estimation. The large sample behaviour of the estimator is studied and simulations show that the approximations are appropriate for use with realistic sample sizes. The methods are motivated by studies of familial associations in the natural history of diseases. Their practical utility is illustrated with sib pair data from Beaver Dam, Wisconsin.
35.
Abstract. We consider the problem of estimating a compactly supported density, taking a Bayesian nonparametric approach. We define a Dirichlet mixture prior that, while selecting piecewise constant densities, has full support on the Hellinger metric space of all commonly dominated probability measures on a known bounded interval. We derive pointwise rates of convergence for the posterior expected density by studying the speed at which the posterior mass accumulates on shrinking Hellinger neighbourhoods of the sampling density. If the data are sampled from a strictly positive, α-Hölder density with α ∈ (0, 1], then the optimal convergence rate n^(−α/(2α+1)) is obtained up to a logarithmic factor. Smoothing the histograms into polygons yields a continuous piecewise linear estimator that, for twice continuously differentiable, strictly positive densities satisfying boundary conditions, attains a rate comparable, up to a logarithmic factor, to the convergence rate n^(−4/5) for the integrated mean squared error of kernel-type density estimators.
36.
The ability to infer parameters of gene regulatory networks is emerging as a key problem in systems biology. The biochemical data are intrinsically stochastic and tend to be observed by means of discrete-time sampling systems, which are often limited in their completeness. In this paper we explore how to make Bayesian inference for the kinetic rate constants of regulatory networks, using the stochastic kinetic Lotka-Volterra system as a model. This simple model describes behaviour typical of many biochemical networks which exhibit auto-regulatory behaviour. Various MCMC algorithms are described and their performance evaluated in several data-poor scenarios. An algorithm based on an approximating process is shown to be particularly efficient.
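The forward model behind this inference problem can be simulated exactly with the Gillespie algorithm. The sketch below generates a trajectory of the stochastic kinetic Lotka-Volterra system; discrete-time snapshots of such trajectories would be the data for MCMC over the rate constants. The rate constants and initial counts here are invented for illustration.

```python
import random

random.seed(42)

# Gillespie simulation of stochastic kinetic Lotka-Volterra:
#   prey birth      X -> 2X     at hazard c1 * x
#   predation   X + Y -> 2Y     at hazard c2 * x * y
#   predator death  Y -> 0      at hazard c3 * y
c1, c2, c3 = 0.5, 0.0025, 0.3   # hypothetical rate constants
x, y, t = 100, 100, 0.0
traj = [(t, x, y)]
while t < 30.0 and len(traj) < 50_000:
    h1, h2, h3 = c1 * x, c2 * x * y, c3 * y
    h0 = h1 + h2 + h3
    if h0 == 0:
        break                    # both species extinct
    t += random.expovariate(h0)  # exponential waiting time to the next reaction
    u = random.uniform(0.0, h0)  # pick a reaction with probability proportional to its hazard
    if u < h1:
        x += 1                   # prey birth
    elif u < h1 + h2:
        x, y = x - 1, y + 1      # predation converts a prey into a predator
    else:
        y -= 1                   # predator death
    traj.append((t, x, y))
```

The event-count cap is only a safeguard against prey explosion after predator extinction; the trajectory exhibits the noisy predator-prey oscillations typical of auto-regulatory networks.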
37.
If the unknown mean of a univariate population is sufficiently close to the value of an initial guess then an appropriate shrinkage estimator has smaller average squared error than the sample mean. This principle has been known for some time, but it does not appear to have found extension to problems of interval estimation. The author presents valid two‐sided 95% and 99% “shrinkage” confidence intervals for the mean of a normal distribution. These intervals are narrower than the usual interval based on the Student distribution when the population mean lies in such an “effective interval.” A reduction of 20% in the mean width of the interval is possible when the population mean is sufficiently close to the value of the guess. The author also describes a modification to existing shrinkage point estimators of the general univariate mean that enables the effective interval to be enlarged.
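The point-estimation principle that motivates these intervals is easy to demonstrate by simulation: when the true mean equals the guess, pulling the sample mean toward the guess strictly reduces squared error. The fixed shrinkage factor below is made up for illustration and is not the paper's rule, and the paper's actual contribution, the shrinkage confidence intervals, is not reproduced here.

```python
import random
import statistics

random.seed(1)

guess = 0.0                  # initial guess for the mean
mu, sigma, n = 0.0, 1.0, 25  # population mean happens to equal the guess
lam = 0.5                    # fixed shrinkage factor (invented for this sketch)

se_mean, se_shrink = [], []
for _ in range(2000):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(x)
    shrunk = guess + lam * (xbar - guess)  # pull the sample mean toward the guess
    se_mean.append((xbar - mu) ** 2)
    se_shrink.append((shrunk - mu) ** 2)

mse_mean = statistics.fmean(se_mean)
mse_shrink = statistics.fmean(se_shrink)
```

With mu equal to the guess, each replication's shrunken error is exactly lam² times the sample-mean error, so the MSE ratio is 0.25; as mu moves away from the guess the advantage shrinks and eventually reverses, which is the "effective interval" phenomenon.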
38.
In this paper, we present a general formulation of an algorithm, the adaptive independent chain (AIC), that was introduced in a special context in Gåsemyr et al. [Methodol. Comput. Appl. Probab. 3 (2001)]. The algorithm aims at producing samples from a specific target distribution Π, and is an adaptive, non-Markovian version of the Metropolis–Hastings independent chain. A certain parametric class of possible proposal distributions is fixed, and the parameters of the proposal distribution are updated periodically on the basis of the recent history of the chain, thereby obtaining proposals that get ever closer to Π. We show that under certain conditions, the algorithm produces an exact sample from Π in a finite number of iterations, and hence that it converges to Π. We also present another adaptive algorithm, the componentwise adaptive independent chain (CAIC), which may be an alternative in particular in high dimensions. The CAIC may be regarded as an adaptive approximation to the Gibbs sampler updating parametric approximations to the conditionals of Π.
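The adaptive idea can be sketched in one dimension: run a Metropolis–Hastings independent chain with a Gaussian proposal, and periodically refit the proposal's mean and standard deviation from the chain's history so the proposal drifts toward the target. This toy ignores the paper's conditions guaranteeing validity of the adaptation; the target, schedule, and safeguards are invented.

```python
import math
import random

random.seed(7)

def log_pi(x):
    # Toy target: N(3, 1), up to an additive constant.
    return -0.5 * (x - 3.0) ** 2

def log_q(x, m, s):
    # Log density of the Gaussian proposal N(m, s^2), up to a constant.
    return -0.5 * ((x - m) / s) ** 2 - math.log(s)

m, s = 0.0, 5.0          # initial (deliberately poor) proposal parameters
x = 0.0
hist = []
for i in range(1, 20_001):
    y = random.gauss(m, s)
    # Independence-sampler acceptance ratio: pi(y) q(x) / (pi(x) q(y)).
    log_a = log_pi(y) + log_q(x, m, s) - log_pi(x) - log_q(y, m, s)
    if log_a >= 0 or random.random() < math.exp(log_a):
        x = y
    hist.append(x)
    if i % 1000 == 0:    # periodic adaptation from the chain's history
        mu = sum(hist) / len(hist)
        var = sum((h - mu) ** 2 for h in hist) / len(hist)
        m, s = mu, max(0.5, math.sqrt(var))   # floor on s to keep the proposal proper

post = hist[5000:]
post_mean = sum(post) / len(post)
```

After adaptation the proposal nearly matches the target, so acceptance is high and the post-burn-in sample mean is close to 3.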
39.
Summary. The paper considers the problem of estimating the entire temperature field for every location on the globe from scattered surface air temperatures observed by a network of weather-stations. Classical methods such as spherical harmonics and spherical smoothing splines are not efficient in representing data that have inherent multiscale structures. The paper presents an estimation method that can adapt to the multiscale characteristics of the data. The method is based on a spherical wavelet approach that has recently been developed for a multiscale representation and analysis of scattered data. Spatially adaptive estimators are obtained by coupling the spherical wavelets with different thresholding (selective reconstruction) techniques. These estimators are compared for their spatial adaptability and extrapolation performance by using the surface air temperature data.
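Spherical wavelets for scattered data are too involved for a short sketch, but the coupling of a wavelet transform with thresholding (selective reconstruction) can be illustrated in one dimension with the Haar transform: transform, shrink small detail coefficients toward zero, and invert. The signal and threshold below are invented.

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_forward(x):
    # One level of the orthonormal Haar wavelet transform (len(x) must be even).
    approx = [(a + b) / SQRT2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / SQRT2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    # Exact inverse of haar_forward.
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / SQRT2, (a - d) / SQRT2]
    return out

def soft(c, t):
    # Soft thresholding: shrink coefficients toward zero, kill the small (noisy) ones.
    return math.copysign(max(abs(c) - t, 0.0), c)

signal = [1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.9, 5.1]  # made-up piecewise-flat signal
a, d = haar_forward(signal)
denoised = haar_inverse(a, [soft(c, 0.2) for c in d])
```

Thresholding adapts spatially: detail coefficients survive only where the signal genuinely varies, which is the mechanism the spherical-wavelet estimators exploit on the globe.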
40.
Modeling for Risk Assessment of Neurotoxic Effects
The regulation of noncancer toxicants, including neurotoxicants, has usually been based upon a reference dose (allowable daily intake). A reference dose is obtained by dividing a no-observed-effect level by uncertainty (safety) factors to account for intraspecies and interspecies sensitivities to a chemical. It is assumed that the risk at the reference dose is negligible, but generally no attempt is made to estimate the risk at the reference dose. A procedure is outlined that provides estimates of risk as a function of dose. The first step is to establish a mathematical relationship between a biological effect and the dose of a chemical. Knowledge of biological mechanisms and/or pharmacokinetics can assist in the choice of plausible mathematical models. The mathematical model provides estimates of average responses as a function of dose. Secondly, estimates of risk require selection of a distribution of individual responses about the average response given by the mathematical model. In the case of a normal or lognormal distribution, only an estimate of the standard deviation is needed. The third step is to define an adverse level for a response so that the probability (risk) of exceeding that level can be estimated as a function of dose. Because a firm response level often cannot be established at which adverse biological effects occur, it may be necessary to at least establish an abnormal response level that only a small proportion of individuals would exceed in an unexposed group. That is, if a normal range of responses can be established, then the probability (risk) of abnormal responses can be estimated. In order to illustrate this process, measures of the neurotransmitter serotonin and its metabolite 5-hydroxyindoleacetic acid in specific areas of the brain of rats and monkeys are analyzed after exposure to the neurotoxicant methylenedioxymethamphetamine. These risk estimates are compared with risk estimates from the quantal approach in which animals are classified as either abnormal or not depending upon abnormal serotonin levels.
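The three-step procedure can be sketched with invented numbers: a dose-response model for the mean effect, a normal distribution of individual responses around it, and risk defined as the probability of crossing an abnormal level that only 1% of the unexposed group would exceed. The linear model, variability, and direction of abnormality below are all hypothetical (the paper's serotonin endpoint is a decrease).

```python
from statistics import NormalDist

# Step 1: mathematical model for the average response as a function of dose.
def mean_response(dose):
    # Hypothetical linear dose-response model.
    return 100.0 + 3.0 * dose

# Step 2: distribution of individual responses about the average
# (normal, so only a standard deviation is needed).
sigma = 10.0
std_norm = NormalDist()

# Step 3: abnormal level set so that only 1% of the unexposed group exceeds it.
cutoff = mean_response(0.0) + std_norm.inv_cdf(0.99) * sigma

def risk(dose):
    # Probability (risk) that an individual's response exceeds the abnormal level.
    return 1.0 - NormalDist(mean_response(dose), sigma).cdf(cutoff)
```

By construction the risk at dose zero is 0.01, and it rises monotonically with dose, which is the dose-dependent risk curve that the reference-dose approach leaves unestimated.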