71.
It is often of interest to find the maximum or near maxima among a set of vector‐valued parameters in a statistical model; in the case of disease mapping, for example, these correspond to relative‐risk “hotspots” where public‐health intervention may be needed. The general problem is one of estimating nonlinear functions of the ensemble of relative risks, but biased estimates result if posterior means are simply substituted into these nonlinear functions. The authors obtain better estimates of extrema from a new, weighted ranks squared error loss function. The derivation of these Bayes estimators assumes a hidden‐Markov random‐field model for relative risks, and their behaviour is illustrated with real and simulated data.
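To make the plug-in bias concrete, the sketch below contrasts two summaries of simulated posterior draws: the maximum of the area-level posterior means versus the posterior mean of the ensemble maximum. The draws are synthetic, and the authors' hidden-Markov random-field model and weighted ranks squared error loss are not reproduced; this only illustrates why substituting posterior means into a nonlinear functional gives biased answers.

```python
import numpy as np

# Toy illustration of the plug-in bias described in the abstract: for a
# nonlinear functional such as the ensemble maximum, max(posterior means)
# generally differs from the posterior mean of the maximum. Synthetic draws
# stand in for MCMC output; the paper's model is not reproduced here.
rng = np.random.default_rng(0)
n_areas, n_draws = 50, 4000
true_log_rr = rng.normal(0.0, 0.3, size=n_areas)                      # hypothetical log relative risks
draws = true_log_rr + rng.normal(0.0, 0.2, size=(n_draws, n_areas))   # fake posterior draws

plug_in_max = np.exp(draws.mean(axis=0)).max()          # max of posterior means
posterior_mean_max = np.exp(draws).max(axis=1).mean()   # posterior mean of the maximum

print(f"plug-in max of means:      {plug_in_max:.3f}")
print(f"posterior mean of maximum: {posterior_mean_max:.3f}")  # typically larger
```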
72.
We consider the problem of estimating a compactly supported density, taking a Bayesian nonparametric approach. We define a Dirichlet mixture prior that, while selecting piecewise constant densities, has full support on the Hellinger metric space of all commonly dominated probability measures on a known bounded interval. We derive pointwise rates of convergence for the posterior expected density by studying the speed at which the posterior mass accumulates on shrinking Hellinger neighbourhoods of the sampling density. If the data are sampled from a strictly positive, α-Hölderian density with α ∈ (0,1], then the optimal convergence rate n^(-α/(2α+1)) is obtained up to a logarithmic factor. Smoothing histograms by polygons, a continuous piecewise linear estimator is obtained that, for twice continuously differentiable, strictly positive densities satisfying boundary conditions, attains a rate comparable, up to a logarithmic factor, to the convergence rate n^(-4/5) for the integrated mean squared error of kernel-type density estimators.
73.
Classroom teaching is made up of teachers and students, and a tacit rapport between teacher and students is an important factor in arousing students' enthusiasm for learning. Rich teaching content, a lively and engaging lecturing rhythm, an active classroom atmosphere, and scientifically designed examination methods all affect students' interest in attending class.
74.
In this paper, we present a general formulation of an algorithm, the adaptive independent chain (AIC), that was introduced in a special context in Gåsemyr et al. [Methodol. Comput. Appl. Probab. 3 (2001)]. The algorithm aims at producing samples from a specific target distribution Π, and is an adaptive, non-Markovian version of the Metropolis–Hastings independent chain. A certain parametric class of possible proposal distributions is fixed, and the parameters of the proposal distribution are updated periodically on the basis of the recent history of the chain, thereby obtaining proposals that get ever closer to Π. We show that under certain conditions, the algorithm produces an exact sample from Π in a finite number of iterations, and hence that it converges to Π. We also present another adaptive algorithm, the componentwise adaptive independent chain (CAIC), which may be an alternative, in particular in high dimensions. The CAIC may be regarded as an adaptive approximation to the Gibbs sampler, updating parametric approximations to the conditionals of Π.
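As a rough illustration of the idea (not the paper's algorithm or its convergence conditions), the sketch below runs a one-dimensional independence Metropolis–Hastings sampler whose Gaussian proposal mean and standard deviation are refit periodically from the recent history of the chain; the target and all tuning constants are made up.

```python
import numpy as np
from scipy import stats

# Simplified one-dimensional sketch of an adaptive independent chain: an
# independence Metropolis-Hastings sampler whose Gaussian proposal parameters
# are refit periodically from the recent history of the chain. The paper's
# AIC/CAIC allows a general parametric proposal class and gives conditions for
# exact sampling, none of which is shown here.

def log_target(x):
    # Hypothetical target: a two-component Gaussian mixture.
    return np.logaddexp(np.log(0.4) + stats.norm.logpdf(x, -2.0, 0.7),
                        np.log(0.6) + stats.norm.logpdf(x, 1.5, 1.0))

rng = np.random.default_rng(1)
n_iter, adapt_every = 20_000, 500
mu, sigma = 0.0, 5.0            # initial (deliberately broad) proposal parameters
x = 0.0
chain = np.empty(n_iter)
for t in range(n_iter):
    y = rng.normal(mu, sigma)   # independence proposal: does not depend on x
    log_alpha = (log_target(y) - log_target(x)
                 + stats.norm.logpdf(x, mu, sigma) - stats.norm.logpdf(y, mu, sigma))
    if np.log(rng.uniform()) < log_alpha:
        x = y
    chain[t] = x
    if (t + 1) % adapt_every == 0:              # adapt from recent history
        recent = chain[max(0, t - 2000):t + 1]
        mu, sigma = recent.mean(), max(recent.std(), 0.1)

print(f"final proposal: mu={mu:.2f}, sigma={sigma:.2f}")
```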
75.
To assess the viability of a project, allocate resources, take part in bidding processes, and support other related decisions, modern project management requires techniques for forecasting a project's cost, duration, and performance, not only under normal circumstances but also under external events that might abruptly change the status quo. We present a Bayesian framework that delivers a global forecast of a project's performance. We aim at predicting the probabilities and impacts of a set of potential scenarios caused by combinations of disruptive events, and at using this information to deal with project management issues. To introduce the methodology, we focus on a project's cost, but the ideas apply equally to forecasting project duration or performance. We illustrate our approach with an example based on a real case study involving estimation of the uncertainty in project cost while bidding for a contract.
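A minimal Monte Carlo sketch of the kind of global cost forecast described above: a baseline cost distribution is perturbed by a few disruptive events, each with an occurrence probability and an impact distribution. Every number is hypothetical, and the full Bayesian scenario structure of the paper is not reproduced.

```python
import numpy as np

# Minimal Monte Carlo sketch: baseline project cost plus a handful of
# disruptive events, each with an occurrence probability and a cost impact.
# All figures are made up for illustration only.
rng = np.random.default_rng(2)
n_sim = 100_000

baseline = rng.lognormal(mean=np.log(10.0), sigma=0.15, size=n_sim)   # baseline cost, hypothetical units

events = [                    # (occurrence prob, impact mean, impact sd), all hypothetical
    (0.10, 2.0, 0.5),         # e.g. supplier failure
    (0.05, 4.0, 1.0),         # e.g. regulatory change
    (0.20, 1.0, 0.3),         # e.g. weather delays
]
total = baseline.copy()
for p, m, s in events:
    occurs = rng.uniform(size=n_sim) < p
    total += occurs * rng.normal(m, s, size=n_sim)

print(f"mean cost:        {total.mean():.2f}")
print(f"95th percentile:  {np.percentile(total, 95):.2f}")
print(f"P(cost > 14):     {(total > 14).mean():.3f}")
```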
76.
Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas where environmental pollutants are present. Contaminants were first ranked by applying a self-organizing map (SOM) to their persistence, bioaccumulation, and toxicity characteristics in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved to allow the use of probabilistic data. A neuroprobabilistic HI was developed by combining the SOM with Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants with the highest and lowest HI values, respectively. However, the bioaccumulation value for heavy metals increased notably once a probability density function was used to describe the bioaccumulation factor. To check its applicability, a case study was investigated: the probabilistic integral risk was calculated for the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The change in risk between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risks of the chemicals under study did not follow a homogeneous tendency; however, the current levels of pollution do not constitute a relevant source of health risk for the local population. Moreover, the neuroprobabilistic HI appears to be an adequate tool for use in risk assessment processes.
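The toy sketch below illustrates only the probabilistic ingredient of the neuroprobabilistic HI: uncertainty in hypothetical persistence, bioaccumulation, and toxicity scores is propagated by Monte Carlo into a distribution for an aggregate index. The SOM-based ranking used in the paper is replaced by a plain average of normalized scores, so the "neural" part is not represented.

```python
import numpy as np

# Crude Monte Carlo stand-in for a probabilistic hazard index: uncertainty in a
# pollutant's persistence (P), bioaccumulation (B), and toxicity (T) scores is
# propagated into a distribution of an aggregate index. The paper's SOM-based
# ranking is replaced by a simple mean of normalized scores.
rng = np.random.default_rng(5)
n_sim = 50_000

# Hypothetical score distributions on a 0-1 scale for one pollutant.
P = rng.beta(6, 2, n_sim)                                 # persistence: fairly high
B = rng.lognormal(np.log(0.3), 0.5, n_sim).clip(0, 1)     # bioaccumulation: right-skewed
T = rng.beta(3, 3, n_sim)                                 # toxicity: intermediate

hi = (P + B + T) / 3                                      # placeholder aggregation
print(f"HI median:        {np.median(hi):.2f}")
print(f"HI 5th-95th pct.: {np.percentile(hi, 5):.2f} - {np.percentile(hi, 95):.2f}")
```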
77.
This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI‐L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk‐reducing effectiveness of WHTI‐L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and in how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI‐L, and a range of casualty cost estimates based on the willingness‐to‐pay approach, our estimate of the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI‐L ranges from 7% to 13%. Basing the results on a lower risk level that halves the annualized terrorism loss would double the critical risk reduction (14–26%), while basing them on a higher risk level that doubles the annualized terrorism loss would cut the critical risk reduction in half (3.5–6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit‐cost analyses in which estimated benefits are compared with costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage, but continued research to develop additional tools and data is necessary to support application of these approaches. This includes refinement of models and simulations, engagement of subject-matter experts, implementation of program evaluation, and estimation of the costs of casualties from terrorism events.
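The critical risk reduction can be read as the ratio of annualized regulatory cost to expected annualized terrorism loss, which is why halving the loss doubles it. The snippet below reproduces that arithmetic; the cost figure is a hypothetical placeholder chosen only so the output falls near the 7–13% range reported above.

```python
# Back-of-the-envelope view of the critical risk reduction: the fractional
# reduction in expected annualized terrorism loss at which the regulation's
# benefit just equals its cost, i.e. r* = annualized cost / annualized loss.
# The cost figure below is a hypothetical placeholder; the abstract reports
# only the resulting 7-13% range for the RMS standard risk level.
annual_cost = 0.36e9                       # assumed annualized regulatory cost, USD
for loss in (2.7e9, 5.2e9):                # annualized loss range from the abstract
    r_star = annual_cost / loss
    print(f"loss ${loss/1e9:.1f}B -> critical risk reduction {100*r_star:.1f}%")
# Halving the loss doubles r*; doubling it halves r*, matching the scaling noted above.
```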
78.
Addressing such shortcomings of current primary-school composition teaching as insincere emotion, hollow content, and pale language, this article proposes guiding students toward "writing true to their own personality" in the practice of reforming composition teaching, and explores its operational strategies and teaching models in practice.
79.
The benchmark dose (BMD) is an exposure level that would induce a small risk increase (the BMR level) above the background. The BMD approach to deriving a reference dose for risk assessment of noncancer effects is advantageous in that the estimate of the BMD is not restricted to the experimental doses and uses most of the available dose-response information. To quantify the statistical uncertainty of a BMD estimate, we often calculate and report its lower confidence limit (the BMDL), and may even consider it a more conservative alternative to the BMD itself. Computation of the BMDL may involve normal confidence limits for the BMD in conjunction with the delta method; therefore, factors such as small sample size and nonlinearity in the model parameters can affect the performance of the delta-method BMDL, and alternative methods are useful. In this article, we propose a bootstrap method for estimating the BMDL using a scheme that consists of resampling residuals after model fitting and a one-step formula for parameter estimation. We illustrate the method with clustered binary data from developmental toxicity experiments. Our analysis shows that with moderately elevated dose-response data, the distribution of the BMD estimator tends to be left-skewed and bootstrap BMDLs are smaller than delta-method BMDLs on average, hence quantifying risk more conservatively. Statistically, the bootstrap BMDL quantifies the uncertainty of the true BMD more honestly than the delta-method BMDL, as its coverage probability is closer to the nominal level. We find that BMD and BMDL estimates are generally insensitive to model choice provided that the models fit the data comparably well near the region of the BMD. Our analysis also suggests that, in the presence of a significant and moderately strong dose-response relationship, developmental toxicity experiments under the standard protocol support dose-response assessment at a 5% BMR for the BMD and a 95% confidence level for the BMDL.
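A rough sketch of a bootstrap BMDL computation is given below for a logistic dose-response model with binomial counts. For brevity it uses a parametric bootstrap with a full refit in each replicate rather than the residual-resampling and one-step estimation scheme of the paper, and it ignores litter clustering; the data and the 5% BMR are illustrative.

```python
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.special import expit

# Sketch of a bootstrap BMDL for a logistic dose-response model with binomial
# counts (extra-risk definition of the BMR). Hypothetical data; this is not the
# residual-resampling / one-step scheme of the paper.
doses = np.array([0.0, 25.0, 50.0, 100.0, 200.0])
n = np.array([50, 50, 50, 50, 50])
y = np.array([2, 4, 7, 14, 30])                    # affected counts, made up
bmr = 0.05                                         # 5% extra risk

def nll(theta, y, n):
    a, b = theta
    p = np.clip(expit(a + b * doses), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (n - y) * np.log(1 - p))

def fit(y, n):
    return minimize(nll, x0=[-2.0, 0.01], args=(y, n), method="Nelder-Mead").x

def bmd(theta):
    a, b = theta
    p0 = expit(a)                                  # background risk
    target = p0 + bmr * (1 - p0)                   # extra-risk definition of the BMR
    return brentq(lambda d: expit(a + b * d) - target, 1e-6, 1e4)

theta_hat = fit(y, n)
rng = np.random.default_rng(3)
boot = []
for _ in range(1000):                              # parametric bootstrap with full refit
    y_star = rng.binomial(n, expit(theta_hat[0] + theta_hat[1] * doses))
    boot.append(bmd(fit(y_star, n)))

print(f"BMD  = {bmd(theta_hat):.1f}")
print(f"BMDL = {np.percentile(boot, 5):.1f}   (one-sided 95% lower bound)")
```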
80.
The paper considers simultaneous estimation of finite population means for several strata. A model-based approach is taken, in which the covariates in the super-population model are subject to measurement errors. Empirical Bayes (EB) estimators of the strata means are developed, and an asymptotic expression for the MSE of the EB estimators is provided. It is shown that the proposed EB estimators are “first order optimal” in the sense of Robbins [1956. An empirical Bayes approach to statistics. In: Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, University of California Press, Berkeley, pp. 157–164], whereas the regular EB estimators that ignore the measurement error are not.
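For orientation, the sketch below applies a bare-bones normal-normal empirical Bayes shrinkage to simulated strata means. It deliberately omits the covariates-with-measurement-error structure that is the paper's actual contribution, so it only fixes the general EB idea of shrinking strata means toward a common value.

```python
import numpy as np

# Bare-bones normal-normal empirical Bayes shrinkage of strata sample means.
# Simulated data; the measurement-error structure studied in the paper is omitted.
rng = np.random.default_rng(4)
k = 12                                         # number of strata
true_means = rng.normal(50.0, 5.0, size=k)
n_i = rng.integers(10, 40, size=k)             # sample size per stratum
sigma2 = 25.0                                  # known within-stratum variance
ybar = true_means + rng.normal(0.0, np.sqrt(sigma2 / n_i))

# Method-of-moments estimates of the prior mean and between-strata variance.
mu_hat = ybar.mean()
tau2_hat = max(ybar.var(ddof=1) - np.mean(sigma2 / n_i), 0.0)

# EB estimator: shrink each stratum mean toward the overall mean.
B = (sigma2 / n_i) / (sigma2 / n_i + tau2_hat)
eb = (1 - B) * ybar + B * mu_hat

print("direct:", np.round(ybar, 1))
print("EB:    ", np.round(eb, 1))
```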