Full-text access: 5,503 paid full-text articles, 139 free, and 55 free within China.
By discipline: Management (415), Ethnology (10), Talent studies (7), Demography (28), Collected works and series (362), Theory and methodology (63), General (3,251), Sociology (222), Statistics (1,339).
By publication year: 2024 (8), 2023 (47), 2022 (20), 2021 (64), 2020 (80), 2019 (94), 2018 (127), 2017 (174), 2016 (109), 2015 (161), 2014 (260), 2013 (707), 2012 (337), 2011 (383), 2010 (293), 2009 (310), 2008 (323), 2007 (336), 2006 (292), 2005 (297), 2004 (223), 2003 (231), 2002 (209), 2001 (162), 2000 (111), 1999 (56), 1998 (36), 1997 (38), 1996 (23), 1995 (17), 1994 (21), 1993 (14), 1992 (16), 1991 (14), 1990 (11), 1989 (11), 1988 (5), 1987 (10), 1986 (3), 1985 (18), 1984 (18), 1983 (7), 1982 (8), 1981 (2), 1980 (2), 1979 (2), 1978 (1), 1977 (4), 1976 (1), 1975 (1).
A total of 5,697 matching records were found (search time: 15 ms); results 101–110 are shown below.
101.
An approach to the analysis of time-dependent ordinal quality score data from robust design experiments is developed and applied to an experiment from commercial horticultural research, using concepts of product robustness and longevity that are familiar to analysts in engineering research. A two-stage analysis is used to develop models describing the effects of a number of experimental treatments on the rate of post-sales product quality decline. The first stage uses a polynomial function on a transformed scale to approximate the quality decline for an individual experimental unit, yielding a set of derived coefficients; the second stage uses a joint mean and dispersion model to investigate the effects of the experimental treatments on these derived coefficients. The approach, developed specifically for an application in horticulture, is exemplified with data from a trial testing ornamental plants that are subjected to a range of treatments during production and home-life. The results of the analysis show how a number of control and noise factors affect the rate of post-production quality decline. Although the model is used to analyse quality data from a trial on ornamental plants, the approach is expected to be more generally applicable to a wide range of other complex production systems.
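As a rough illustration of the two-stage idea, the sketch below fits a per-unit polynomial decline rate and then regresses those derived coefficients on a treatment factor. It is a minimal sketch on simulated data: the scores, the single binary treatment and the use of ordinary least squares in place of the paper's joint mean and dispersion model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 0: simulate quality scores for 40 units over 8 weeks, with a binary
# treatment (hypothetical control factor) that changes the rate of decline.
n_units, weeks = 40, np.arange(8)
treatment = rng.integers(0, 2, n_units)
slopes = -0.3 - 0.25 * treatment + 0.05 * rng.normal(size=n_units)
quality = np.clip(9 + slopes[:, None] * weeks
                  + rng.normal(0, 0.4, (n_units, weeks.size)), 1, 9)

# Stage 1: per-unit polynomial fit; the derived coefficient of interest here
# is simply the fitted linear decline rate.
decline_rate = np.array([np.polyfit(weeks, q, deg=1)[0] for q in quality])

# Stage 2: model the derived coefficients as a function of the treatment
# (ordinary least squares stands in for the joint mean/dispersion model).
X = np.column_stack([np.ones(n_units), treatment])
beta, *_ = np.linalg.lstsq(X, decline_rate, rcond=None)
print(f"mean decline rate (control): {beta[0]:.3f}, treatment effect: {beta[1]:.3f}")
```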
102.
Most studies of quality improvement deal with ordered categorical data from industrial experiments. Accounting for the ordering of such data plays an important role in effectively determining the optimal factor-level combination. This paper uses correspondence analysis to develop a procedure, based on Taguchi's statistic, for improving an ordered categorical response in a multifactor system. The proposed procedure should be attractive to practitioners because it relies on a simple and widely used statistical tool for graphically identifying the truly important factors and determining the factor levels that improve process quality. A case study on optimizing the polysilicon deposition process in very large-scale integrated (VLSI) circuit fabrication is provided to demonstrate the effectiveness of the proposed procedure.
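A minimal sketch of the underlying graphical tool, assuming a small hypothetical table of factor levels by ordered quality categories; plain correspondence analysis via the SVD of standardized residuals is shown here, not the paper's Taguchi-statistic-based procedure.

```python
import numpy as np

# Hypothetical counts of ordered quality categories (poor, fair, good)
# at three levels of a single process factor.
N = np.array([[20, 30, 10],
              [10, 25, 25],
              [ 5, 15, 40]], dtype=float)

P = N / N.sum()                        # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)    # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals

U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = (U * sv) / np.sqrt(r)[:, None]     # principal coords of factor levels
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]  # principal coords of categories

print("inertia explained by first axis:", sv[0] ** 2 / (sv ** 2).sum())
print("factor-level coordinates (first two axes):\n", row_coords[:, :2])
```

Plotting row and column coordinates on the same axes gives the kind of map used to spot which factor levels pull the response toward the better-ordered categories.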
103.
Networks of ambient monitoring stations are used to monitor environmental pollution fields such as those for acid rain and air pollution. Such stations provide regular measurements of pollutant concentrations. Because the networks are established for a variety of purposes and at various times, several stations measuring different subsets of pollutant concentrations can often be found within a compact geographical region. The problem of statistically combining these disparate information sources into a single 'network' then arises. Capitalizing on the efficiencies so achieved leads to the secondary problem of extending this network. The subject of this paper is a set of 31 air pollution monitoring stations in southern Ontario. Each of these regularly measures a particular subset of ionic sulphate, sulphite, nitrite and ozone, but the subset varies from station to station: only two stations measure all four, and some measure just one. We describe a Bayesian framework for integrating the measurements of these stations to yield a spatial predictive distribution for unmonitored sites and for unmeasured concentrations at existing stations. Furthermore, we show how this network can be extended by using an entropy maximization criterion. The methods assume that the multivariate response field being measured has a joint Gaussian distribution conditional on its mean and covariance function. A conjugate prior is used for these parameters, some of its hyperparameters being fitted empirically.
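Under the Gaussian assumption stated above, the entropy of a set of monitored sites is, up to additive constants, the log-determinant of their covariance matrix, so extending the network by entropy maximization becomes a subset-selection problem. The sketch below greedily adds hypothetical candidate sites under an assumed squared-exponential covariance; it illustrates the criterion only, not the paper's hierarchical Bayesian treatment of the hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(1)
sites = rng.uniform(0, 100, size=(20, 2))   # hypothetical candidate coordinates (km)

def sq_exp_cov(x, scale=30.0, sigma2=1.0):
    """Squared-exponential covariance between all pairs of sites."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return sigma2 * np.exp(-(d / scale) ** 2)

def greedy_entropy_design(K, k):
    """Greedily pick k sites maximizing log det of their covariance submatrix
    (the Gaussian entropy criterion, up to additive constants)."""
    chosen = []
    for _ in range(k):
        best, best_val = None, -np.inf
        for j in range(K.shape[0]):
            if j in chosen:
                continue
            idx = chosen + [j]
            val = np.linalg.slogdet(K[np.ix_(idx, idx)])[1]
            if val > best_val:
                best, best_val = j, val
        chosen.append(best)
    return chosen

K = sq_exp_cov(sites)
print("sites added by the entropy criterion:", greedy_entropy_design(K, k=5))
```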
104.
Model checking with discrete data regressions can be difficult because the usual methods, such as residual plots, have complicated reference distributions that depend on the parameters in the model. Posterior predictive checks have been proposed as a Bayesian way to average the results of goodness-of-fit tests over the uncertainty in the estimated parameters. We try this approach using a variety of discrepancy variables for generalized linear models fitted to a historical data set on behavioural learning. We then discuss the general applicability of our findings in the context of a recent applied example on which we have worked. We find that the following discrepancy variables work well, in the sense of being easy to interpret and sensitive to important model failures: structured displays of the entire data set, general discrepancy variables based on plots of binned or smoothed residuals versus predictors, and specific discrepancy variables created on the basis of the particular concerns arising in an application. Plots of binned residuals are especially easy to use because their predictive distributions under the model are sufficiently simple that model checks can often be made implicitly. The following discrepancy variables did not work well: scatterplots of latent residuals defined from an underlying continuous model, and quantile–quantile plots of these residuals.
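Binned residual checks of the kind recommended above are simple to compute: group observations by fitted probability and compare each bin's mean residual with rough two-standard-error bands. The sketch below does this for simulated binary data; the data, the bin count and the error bands are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated binary data with known probabilities; a real check would use the
# fitted probabilities from the model actually estimated on the data.
x = rng.normal(size=500)
p_hat = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))   # stand-in for fitted probabilities
y = rng.binomial(1, p_hat)
resid = y - p_hat

# Binned residuals: average residual within bins of the fitted probability.
n_bins = 20
order = np.argsort(p_hat)
for b in np.array_split(order, n_bins):
    mean_p = p_hat[b].mean()
    mean_r = resid[b].mean()
    band = 2 * np.sqrt((p_hat[b] * (1 - p_hat[b])).mean() / len(b))
    flag = "!" if abs(mean_r) > band else ""
    print(f"fitted ~{mean_p:.2f}  mean residual {mean_r:+.3f}  (±{band:.3f}) {flag}")
```

Bins flagged with "!" fall outside the rough bands and would prompt a closer look at the corresponding region of the predictor space.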
105.
The problem of estimating the integral of a squared regression function, and of its squared derivatives, has been addressed in a number of papers. For the case of a heteroscedastic model in which the smoothness of the underlying regression function, the design density and the variance of the errors are known, the asymptotically sharp minimax lower bound and a sharp estimator were found by Pastuchova & Khasminski (1989). However, there are apparently no results on rate-optimal or sharp-optimal adaptive (data-driven) estimation when neither the degree of smoothness of the regression function nor the design density, scale function and error distribution are known. After a brief review of the main developments in non-parametric estimation of non-linear functionals, we suggest a simple adaptive estimator for the integral of a squared regression function and its derivatives, and prove that it is sharp-optimal whenever the estimated derivative is sufficiently smooth.
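To make the target functional concrete: for an equispaced, homoscedastic design a naive plug-in estimate of the integral of the squared regression function subtracts a difference-based variance estimate from the mean squared response. The sketch below is only that simple plug-in illustration, with an assumed test function and noise level; it is not the sharp adaptive estimator analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x = np.linspace(0, 1, n)
f = np.sin(2 * np.pi * x)            # true regression function; integral of f^2 is 0.5
y = f + 0.3 * rng.normal(size=n)     # homoscedastic errors for simplicity

# Plug-in estimate: mean squared response minus a difference-based (Rice-type)
# estimate of the error variance.
sigma2_hat = np.mean(np.diff(y) ** 2) / 2
est = np.mean(y ** 2) - sigma2_hat

print(f"estimate of integral f^2: {est:.4f}  (true value 0.5, sigma2_hat = {sigma2_hat:.4f})")
```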
106.
When Shannon entropy is used as a criterion in the optimal design of experiments, advantage can be taken of the classical identity representing the joint entropy of parameters and observations as the sum of the marginal entropy of the observations and the preposterior conditional entropy of the parameters. Following previous work in which this idea was used in spatial sampling, the method is applied to standard parameterized Bayesian optimal experimental design. Under suitable conditions, which include non-linear as well as linear regression models, it is shown in a few steps that maximizing the marginal entropy of the sample is equivalent to minimizing the preposterior entropy, the usual Bayesian criterion, thus avoiding the use of conditional distributions. Using this marginal formulation, it is shown that under normality assumptions every standard model with a two-point prior distribution on the parameters gives an optimal design supported on a single point. Other results include a new asymptotic formula that applies as the error variance becomes large, and bounds on the support size.
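The identity relied on above can be written out explicitly. The sketch below states it in generic notation (θ for the parameters, y for the observations, ξ for the design); the step assuming that the noise entropy H(y | θ, ξ) does not depend on the design is the usual sufficient condition for the equivalence, not necessarily the paper's exact assumption.

```latex
% Notation: \theta parameters, y observations, \xi the design.
% The joint entropy decomposes in two ways:
\begin{align*}
  H(\theta, y \mid \xi)
    &= H(y \mid \xi) + \mathbb{E}_{y}\!\left[ H(\theta \mid y, \xi) \right] \\
    &= H(\theta) + \mathbb{E}_{\theta}\!\left[ H(y \mid \theta, \xi) \right].
\end{align*}
% If H(y \mid \theta, \xi) does not depend on \xi (e.g. additive errors with a
% fixed variance), equating the two lines shows that
\[
  \arg\max_{\xi} \, H(y \mid \xi)
  \;=\;
  \arg\min_{\xi} \, \mathbb{E}_{y}\!\left[ H(\theta \mid y, \xi) \right],
\]
% i.e. maximizing the marginal entropy of the sample minimizes the usual
% preposterior (expected posterior) entropy criterion.
```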
107.
In this paper, we present an access network design problem with end-to-end quality of service (QoS) requirements. The problem can be conceptualized as a two-level hierarchical location-allocation problem on a tree topology with nonlinear side constraints. The objective function of the nonlinear mixed integer programming model minimizes the total cost of switches and fiber cable while satisfying demand within the prescribed level of QoS. By exploiting the inherent structure of the nonlinear QoS constraints, we develop linearization techniques for finding an optimal solution. We also devise an effective exact algorithm based on disjunctive constraint generation. We present promising computational results that demonstrate the effectiveness of the proposed solution procedure.
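One standard way to linearize a convex QoS (delay) constraint in a mixed-integer model is to replace it with a family of tangent cuts, i.e. a piecewise-linear outer approximation. The sketch below builds such cuts for an M/M/1-style delay function; the delay model, the capacity and the breakpoints are illustrative assumptions and not the formulation developed in the paper.

```python
import numpy as np

def mm1_delay(load, capacity):
    """Average delay of an M/M/1-style link as a function of offered load."""
    return 1.0 / (capacity - load)

def tangent_cuts(capacity, breakpoints):
    """Linear under-estimators  delay >= a*load + b  of the convex delay curve,
    taken at the given breakpoints (a standard outer-approximation device)."""
    cuts = []
    for x0 in breakpoints:
        slope = 1.0 / (capacity - x0) ** 2          # derivative of 1/(c - x)
        intercept = mm1_delay(x0, capacity) - slope * x0
        cuts.append((slope, intercept))
    return cuts

capacity = 10.0
cuts = tangent_cuts(capacity, breakpoints=np.linspace(1.0, 9.0, 5))

# The piecewise-linear approximation is the maximum over the cuts; compare it
# with the true nonlinear delay at a trial load level.
load = 7.5
approx = max(a * load + b for a, b in cuts)
print(f"true delay {mm1_delay(load, capacity):.3f}  vs  PL approximation {approx:.3f}")
```

In a mixed-integer model these cuts appear as ordinary linear constraints, which is what keeps the linearized formulation solvable by standard branch-and-bound machinery.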
108.
The three classic pillars of risk analysis are risk assessment (how big is the risk, and how sure can we be?), risk management (what shall we do about it?), and risk communication (what shall we say about it, to whom, when, and how?). We propose two complements to these pillars: risk attribution (who or what addressable conditions actually caused an accident or loss?) and learning from experience about risk reduction (what works, and how well?). Failures in complex systems usually evoke blame, often with insufficient attention to the root causes of failure, including aspects of the situation, design decisions, or social norms and culture. Focusing on blame, however, can inhibit effective learning, instead eliciting excuses that deflect attention and perceived culpability. Productive understanding of what went wrong, and how to do better, thus requires moving past recrimination and excuses. This article identifies common blame-shifting "lame excuses" for poor risk management. These generally contribute little to effective improvement and may leave real risks and preventable causes unaddressed. We propose principles from the risk and decision sciences and from organizational design to improve results. These start with organizational leadership. More specifically, they include: deliberate testing and learning, especially from near-misses and accident precursors; careful causal analysis of accidents; risk quantification; candid expression of uncertainties about the costs and benefits of risk-reduction options; optimization of trade-offs between gathering additional information and acting immediately; promotion of a safety culture; and mindful allocation of people, responsibilities, and resources to reduce risks. We propose that these principles provide sound foundations for improving the success of risk management.
109.
This article examines the use and interpretation of the terms "touch", "reach" and "movement" in Ministry of Education (later Department of Education) official publications known as Building Bulletins, published between 1949 and 1972. A close critical reading has been carried out of the Building Bulletins concerned primarily with school design for young children (infant and primary schools) in the English context, and the results of this exercise are discussed in the wider context of the close relationships established between architects designing schools and leading progressive educationalists in Britain. The wider international context, particularly progressive educational design in the USA, is used to further the understanding of how these terms were used and interpreted. The article contributes to a current interest among historians of education in exploring material and sensory histories of schooling.
110.
Does uncertainty about an outcome influence decisions? The sure-thing principle (Savage, 1954) posits that it should not, but Tversky and Shafir (1992) found that people regularly violate it in hypothetical gambling and vacation decisions, a phenomenon they termed the "disjunction effect". Very close replications and extensions of Tversky and Shafir (1992) were conducted in this paper (N = 890, MTurk). The target article demonstrated the effect using two paradigms in a between-subject design; here, an extension was added that also tests a within-subject design, with design type randomly assigned. The results were consistent with the original findings for the "paying to know" problem (original: Cramer's V = 0.22, 95% CI [0.14, 0.32]; replication: Cramer's V = 0.30, 95% CI [0.24, 0.37]), yet not for the "choice under risk" problem (original: Cramer's V = 0.26, 95% CI [0.14, 0.39]; replication: Cramer's V = 0.11, 95% CI [−0.07, 0.20]). The within-subject extension showed very similar results. Implications for the disjunction effect and for judgment and decision-making theory are discussed, and a call is made for improved statistical understanding of comparisons between between-subject and within-subject designs. All materials, data, and code are available at https://osf.io/gu58m/.
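For reference, the effect size reported throughout, Cramer's V, is derived from a contingency table's chi-square statistic. The sketch below computes it for a hypothetical 2×2 table of condition by choice; the counts are made up and are not the replication data.

```python
import numpy as np

# Hypothetical condition-by-choice counts (not the actual replication data).
table = np.array([[120.0,  80.0],    # e.g. "uncertain outcome" condition: buy / don't buy
                  [ 75.0, 125.0]])   # e.g. "outcome known" condition:     buy / don't buy

n = table.sum()
expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n   # independence model
chi2 = ((table - expected) ** 2 / expected).sum()               # Pearson chi-square
k = min(table.shape) - 1
cramers_v = np.sqrt(chi2 / (n * k))                             # Cramer's V

print(f"chi2 = {chi2:.2f}, Cramer's V = {cramers_v:.3f}")
```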