111.
Price determinants, as well as strategies, can be studied by simulation, particularly if cost and price relationships can be related to market activity [1] [9] [11]. Through dynamic programming, however, the analysis can be extended, given the market conditions, to include an optimal strategy. This paper describes a dynamic programming approach to studying price strategy. A model is developed to show that, in a market characterized by cost/volume and price/volume relationships, profitability can be extended beyond that resulting from a dominant market strategy to an optimal maximizing strategy. Extensions of the model are suggested for studying (a) the sensitivity of a strategy (solution) to price-level and cost changes, (b) the optimal timing of withdrawal, and (c) present-value analysis.
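The abstract gives no equations, so the fragment below is only a minimal sketch of the kind of backward-induction model it describes: a finite-horizon pricing problem with an assumed price/volume demand curve and a cost/volume learning curve. Every function, grid, and constant here is an illustrative assumption, not the paper's model.

```python
# Hypothetical finite-horizon pricing DP (illustrative, not the paper's model).
# State: cumulative volume sold (drives an assumed cost/volume learning curve).
# Decision each period: price level (drives an assumed price/volume demand curve).

import numpy as np

T = 8                                  # planning horizon (periods)
prices = np.linspace(4.0, 10.0, 13)    # candidate price levels (assumed grid)
vol_states = np.arange(0, 2001, 50)    # discretized cumulative-volume states

def demand(p):                         # assumed price/volume relationship
    return max(0.0, 400.0 - 35.0 * p)

def unit_cost(cum_vol):                # assumed cost/volume learning curve
    return 6.0 * (1.0 + cum_vol) ** -0.15

def nearest_state(v):                  # map a volume to the closest grid state
    return int(np.argmin(np.abs(vol_states - v)))

V = np.zeros((T + 1, len(vol_states)))      # profit-to-go value function
policy = np.zeros((T, len(vol_states)))     # best price per period and state

for t in range(T - 1, -1, -1):              # backward induction over periods
    for s, cum in enumerate(vol_states):
        best, best_p = -np.inf, prices[0]
        for p in prices:
            q = demand(p)
            profit = (p - unit_cost(cum)) * q
            val = profit + V[t + 1, nearest_state(cum + q)]
            if val > best:
                best, best_p = val, p
        V[t, s], policy[t, s] = best, best_p

print("optimal first-period price:", policy[0, 0])
print("maximal total profit      :", round(V[0, 0], 1))
```

The state carries the learning-curve cost reduction between periods, so the recursion trades off current profit against the cheaper future costs that higher volume buys, which is what lets the optimal strategy beat a single dominant price.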
112.
Based upon Decision Field Theory (Busemeyer and Townsend 1993), we tested a model of dynamic reasoning to predict the effect of time pressure on analytical and experiential processing during decision-making. Forty-six participants made investment decisions under four levels of time pressure. In each decision, participants were presented with experiential cues that were either congruent or incongruent with the analytical information. The congruent/incongruent conditions allowed us to examine how many decisions were based upon the experiential versus the analytical information, and whether this was affected by the degree of time pressure. As expected, overall accuracy fell as time pressure increased, and accuracy was higher when the experiential and analytical cues were congruent than when they were incongruent. Of particular interest, under high time pressure participants relied on experiential cues more than under lower time pressures. We suggest that the dynamic reasoning paradigm has potential for predicting the effects of experiential biases in general, and under time pressure specifically.
113.
114.
The United States is entering a new era, a period marked by some important demographic changes in the composition of the population, most especially significant increases in the Latino and Latino immigrant segments of society. These population shifts require corresponding interpersonal, organizational, and structural changes. The present issue bridges research and theory across disciplines and includes studies incorporating a variety of methodologies to examine these important areas. These articles begin to fill some of the voids where a systematic and robust corpus of knowledge is lacking. The contributions address topics ranging from issues of identity and interpersonal relations to pressing matters of educational significance to general approaches to navigating the cultural transitions that mark fluid transnational adaptations. Finally, each contribution delineates the policy implications resulting from the processes and literatures that are examined.
115.
Differential Evolution (DE) is a simple genetic algorithm for numerical optimization in real parameter spaces. In a statistical context one would not just want the optimum but also its uncertainty. The uncertainty distribution can be obtained by a Bayesian analysis (after specifying prior and likelihood) using Markov Chain Monte Carlo (MCMC) simulation. This paper integrates the essential ideas of DE and MCMC, resulting in Differential Evolution Markov Chain (DE-MC). DE-MC is a population MCMC algorithm, in which multiple chains are run in parallel. DE-MC solves an important problem in MCMC, namely that of choosing an appropriate scale and orientation for the jumping distribution. In DE-MC the jumps are simply a fixed multiple of the differences of two random parameter vectors that are currently in the population. The selection process of DE-MC works via the usual Metropolis ratio, which defines the probability with which a proposal is accepted. In tests with known uncertainty distributions, the efficiency of DE-MC with respect to random-walk Metropolis with optimal multivariate Normal jumps ranged from 68% for small population sizes to 100% for large population sizes, and even to 500% for the 97.5% point of a variable from a 50-dimensional Student distribution. Two Bayesian examples illustrate the potential of DE-MC in practice. DE-MC is shown to facilitate multidimensional updates in a multi-chain “Metropolis-within-Gibbs” sampling approach. The advantages of DE-MC over conventional MCMC are simplicity, speed of calculation and convergence, even for nearly collinear parameters and multimodal densities.
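The abstract specifies the update rule precisely enough to sketch: each chain proposes its current state plus a fixed multiple of the difference between two other randomly chosen chains' states (plus a little jitter), accepted via the usual Metropolis ratio. The sketch below targets an assumed standard bivariate normal; the population size, jitter scale, and iteration count are illustrative assumptions, not the paper's examples.

```python
# Minimal DE-MC sketch: N parallel chains; proposals are built from the
# difference of two other chains' states, accepted by a Metropolis step.

import numpy as np

rng = np.random.default_rng(0)

def log_post(x):                       # assumed target: standard bivariate normal
    return -0.5 * np.sum(x * x)

d, N, iters = 2, 10, 5000              # dimension, population size, iterations
gamma = 2.38 / np.sqrt(2 * d)          # commonly used near-optimal jump multiplier
pop = rng.normal(size=(N, d))          # initial population of chain states
logp = np.array([log_post(x) for x in pop])

samples = []
for _ in range(iters):
    for i in range(N):
        # pick two distinct other chains r1 != r2, both != i
        r1, r2 = rng.choice([j for j in range(N) if j != i], 2, replace=False)
        proposal = pop[i] + gamma * (pop[r1] - pop[r2]) \
                   + rng.normal(scale=1e-4, size=d)      # small jitter term
        lp = log_post(proposal)
        if np.log(rng.uniform()) < lp - logp[i]:         # Metropolis ratio
            pop[i], logp[i] = proposal, lp
    samples.append(pop.copy())

draws = np.concatenate(samples[iters // 2:])             # discard burn-in
print("posterior mean ~", draws.mean(axis=0))
print("posterior var  ~", draws.var(axis=0))
```

Because the jump direction and scale come from the population itself, the proposal automatically adapts to the scale and orientation of the target, which is the problem the paper says DE-MC solves.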
116.
Statistics and stained glass may seem an odd combination, but the windows of Gonville & Caius College, Cambridge, say otherwise. Anthony Edwards explains.
117.
Bayesian palaeoclimate reconstruction
Summary.  We consider the problem of reconstructing prehistoric climates by using fossil data that have been extracted from lake sediment cores. Such reconstructions promise to provide one of the few ways to validate modern models of climate change. A hierarchical Bayesian modelling approach is presented and its use, inversely, is demonstrated in a relatively small but statistically challenging exercise: the reconstruction of prehistoric climate at Glendalough in Ireland from fossil pollen. This computationally intensive method extends current approaches by explicitly modelling uncertainty and reconstructing entire climate histories. The statistical issues that are raised relate to the use of compositional data (pollen) with covariates (climate) which are available at many modern sites but are missing for the fossil data. The compositional data arise as mixtures and the missing covariates have a temporal structure. Novel aspects of the analysis include a spatial process model for compositional data, local modelling of lattice data, the use, as a prior, of a random walk with long-tailed increments, a two-stage implementation of the Markov chain Monte Carlo approach and a fast approximate procedure for cross-validation in inverse problems. We present some details, contrasting its reconstructions with those which have been generated by a method in use in the palaeoclimatology literature. We suggest that the method provides a basis for resolving important challenging issues in palaeoclimate research. We draw attention to several challenging statistical issues that need to be overcome.
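The full hierarchical model cannot be reconstructed from a summary, but one novel ingredient it names, a random-walk prior with long-tailed increments, is simple to visualize. The sketch below draws prior climate histories with Student-t(3) steps; the step scale, history length, and degrees of freedom are illustrative assumptions, not values from the paper.

```python
# Illustrative prior draws from a random walk with long-tailed (Student-t)
# increments; heavy tails let the prior accommodate abrupt climate shifts.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
T = 200                                        # time slices in the history
for _ in range(5):                             # five prior draws
    steps = rng.standard_t(df=3, size=T) * 0.1 # long-tailed increments
    plt.plot(np.cumsum(steps), alpha=0.7)
plt.xlabel("time slice")
plt.ylabel("climate variable (prior draw)")
plt.title("Random-walk prior with Student-t increments (illustrative)")
plt.show()
```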
118.
In 1997 intense media coverage raised public concerns about germ warfare simulation experiments conducted by the Ministry of Defence during the 1960s, which included the release of bacteria over Dorset. Families in East Lulworth, Dorset, have linked this with allegedly high rates of miscarriages, still-births, congenital malformations, learning and other neurodevelopmental disabilities in their village. The response of the Dorset Health Authority (DHA) included the examination of background information from the Ministry of Defence, national data on congenital malformations in Dorset, health information collected by campaigners and a systematic health survey conducted by the DHA among former and current residents of East Lulworth. The investigation did not confirm the presence of a cluster. It is debatable whether the DHA should have proceeded with their survey when none of the other more immediately available results indicated the presence of a cluster.
119.
Many companies are trying to identify their core objectives and what their business should be doing. The new Six Sigma approach concentrates on clarifying business strategy and making sure that everything relates to company objectives. It is vital to clarify each part of the business in such a way that everyone can understand the causes of variation that can lead to improvements in processes and performance. This paper describes a situation where the full implementation of SPC methodology has made possible a visual and widely appreciated summary of the performance of one important aspect of the business. The major part of the work was identifying the core objectives and deciding how to encapsulate each of them in one or more suitable measurements. The next step was to review the practicalities of obtaining the measurements, together with their reliability and representativeness. Finally, the measurements were presented in chart form and the more traditional steps of SPC analysis were commenced. Data from fast-changing business environments are prone to many problems, such as short data histories, unusual distributions and other uncertainties. These issues, and the eventual extraction of a meaningful set of information, are discussed in the paper. The measurement framework has proved very useful and, from an initial circulation of a handful of people, it now forms an important part of an information process that provides responsible managers with valuable control information. The framework is kept fresh and vital by constant review and modification. Improved electronic data collection and dissemination of the report have also proved very important.
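The paper does not say which charts were used, so as a minimal illustration of the "more traditional steps of SPC analysis" it mentions, the sketch below builds an individuals (Shewhart) chart with moving-range control limits. The twelve-point data series is an assumption standing in for one of the business measurements.

```python
# Individuals (Shewhart) control chart with moving-range limits,
# on an assumed illustrative series of monthly business measurements.

import numpy as np
import matplotlib.pyplot as plt

x = np.array([52, 49, 55, 51, 48, 53, 50, 61, 54, 52, 47, 56], float)

moving_range = np.abs(np.diff(x))           # successive absolute differences
sigma_hat = moving_range.mean() / 1.128     # d2 constant for subgroups of 2
center = x.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

plt.plot(x, marker="o")
plt.axhline(center, linestyle="-")
plt.axhline(ucl, linestyle="--")
plt.axhline(lcl, linestyle="--")
plt.xlabel("period")
plt.ylabel("measurement")
plt.title("Individuals chart (illustrative data)")
plt.show()

print("points outside control limits:", np.where((x > ucl) | (x < lcl))[0])
```

Charts like this give the "visual and widely appreciated summary" the abstract describes: common-cause variation stays inside the limits, and points beyond them flag special causes worth investigating.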
120.
Model checking with discrete data regressions can be difficult because the usual methods such as residual plots have complicated reference distributions that depend on the parameters in the model. Posterior predictive checks have been proposed as a Bayesian way to average the results of goodness-of-fit tests in the presence of uncertainty in estimation of the parameters. We try this approach using a variety of discrepancy variables for generalized linear models fitted to a historical data set on behavioural learning. We then discuss the general applicability of our findings in the context of a recent applied example on which we have worked. We find that the following discrepancy variables work well, in the sense of being easy to interpret and sensitive to important model failures: structured displays of the entire data set, general discrepancy variables based on plots of binned or smoothed residuals versus predictors and specific discrepancy variables created on the basis of the particular concerns arising in an application. Plots of binned residuals are especially easy to use because their predictive distributions under the model are sufficiently simple that model checks can often be made implicitly. The following discrepancy variables did not work well: scatterplots of latent residuals defined from an underlying continuous model and quantile–quantile plots of these residuals.
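As a concrete illustration of the binned-residual display the abstract recommends, the sketch below fits a logistic regression to simulated data, bins observations by fitted probability, and plots the mean residual per bin against an approximate ±2 standard-error band. The simulated data and model are assumptions standing in for the paper's behavioural-learning data set.

```python
# Binned residual plot for a logistic regression (illustrative data).

import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=(n, 1))
p_true = 1 / (1 + np.exp(-(0.5 + 1.2 * x[:, 0])))
y = rng.binomial(1, p_true)

fit = LogisticRegression().fit(x, y)
p_hat = fit.predict_proba(x)[:, 1]
resid = y - p_hat                               # raw residuals

# Bin by fitted probability and average residuals within each bin.
order = np.argsort(p_hat)
bins = np.array_split(order, 40)
bin_fit = [p_hat[b].mean() for b in bins]
bin_res = [resid[b].mean() for b in bins]
# Approximate +/- 2 SE band for each bin mean under the fitted model.
se = [2 * np.sqrt((p_hat[b] * (1 - p_hat[b])).mean() / len(b)) for b in bins]

plt.scatter(bin_fit, bin_res, s=12)
plt.plot(bin_fit, se, "--", color="gray")
plt.plot(bin_fit, [-s for s in se], "--", color="gray")
plt.axhline(0.0)
plt.xlabel("mean fitted probability per bin")
plt.ylabel("mean residual per bin")
plt.title("Binned residual plot (illustrative)")
plt.show()
```

If the model fits, roughly 95% of the bin means should fall inside the band; a systematic pattern outside it is the kind of implicit model check the abstract refers to.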