101.
Most studies of quality improvement deal with ordered categorical data from industrial experiments. Accounting for the ordering of such data plays an important role in effectively determining the optimal factor-level combination. This paper uses correspondence analysis to develop a procedure, based on Taguchi's statistic, for improving an ordered categorical response in a multifactor system. The proposed procedure should be attractive to users because it relies on a simple and popular statistical tool for graphically identifying the truly important factors and determining the levels that improve process quality. A case study on optimizing the polysilicon deposition process in a very large-scale integrated circuit demonstrates the effectiveness of the proposed procedure.
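Correspondence analysis itself reduces to a singular value decomposition of the table of standardized residuals. The sketch below, run on a hypothetical factor-level-by-response-category count table, shows that core computation; it is not the paper's Taguchi-statistic-based procedure, only the graphical tool it builds on.

```python
import numpy as np

def correspondence_analysis(N):
    """Row and column principal coordinates of a contingency table N."""
    P = N / N.sum()                       # correspondence matrix
    r = P.sum(axis=1)                     # row masses
    c = P.sum(axis=0)                     # column masses
    # matrix of standardized residuals
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    # principal coordinates for biplot axes
    F = (U * s) / np.sqrt(r)[:, None]     # rows (factor levels)
    G = (Vt.T * s) / np.sqrt(c)[:, None]  # columns (response categories)
    return F, G, s

# hypothetical counts: factor levels (rows) x ordered response categories (cols)
N = np.array([[20.0, 30.0, 10.0],
              [10.0, 25.0, 25.0],
              [ 5.0, 15.0, 40.0]])
F, G, s = correspondence_analysis(N)
```

Plotting the first two columns of `F` and `G` on common axes gives the usual biplot on which influential factor levels can be read off.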
102.
Networks of ambient monitoring stations are used to monitor environmental pollution fields such as those for acid rain and air pollution. Such stations provide regular measurements of pollutant concentrations. Because the networks were established at various times and for a variety of purposes, several stations measuring different subsets of pollutant concentrations can often be found in compact geographical regions. The problem then arises of statistically combining these disparate information sources into a single 'network'. Capitalizing on the efficiencies so achieved leads to the secondary problem of extending this network. The subject of this paper is a set of 31 air pollution monitoring stations in southern Ontario, each of which regularly measures a particular subset of ionic sulphate, sulphite, nitrite and ozone. This subset varies from station to station: only two stations measure all four, and some measure just one. We describe a Bayesian framework for integrating the measurements of these stations to yield a spatial predictive distribution for unmonitored sites and for unmeasured concentrations at existing stations. Furthermore, we show how this network can be extended by using an entropy maximization criterion. The methods assume that the multivariate response field being measured has a joint Gaussian distribution conditional on its mean and covariance function. A conjugate prior is used for these parameters, some of its hyperparameters being fitted empirically.
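Under the Gaussian assumption, the entropy of a candidate set of stations is, up to constants, the log-determinant of the corresponding covariance submatrix, so a network can be extended greedily by repeatedly adding the site that most increases that log-determinant. The sketch below uses a hypothetical exponential covariance over six sites on a line; the paper's hierarchical Bayesian treatment is considerably richer.

```python
import numpy as np

def greedy_entropy_extension(Sigma, current, k):
    """Greedily add k sites to `current`, each time choosing the candidate
    that maximizes the log-determinant (entropy) of the monitored subset."""
    chosen = list(current)
    candidates = [i for i in range(Sigma.shape[0]) if i not in chosen]
    for _ in range(k):
        best, best_val = None, -np.inf
        for i in candidates:
            idx = chosen + [i]
            val = np.linalg.slogdet(Sigma[np.ix_(idx, idx)])[1]
            if val > best_val:
                best, best_val = i, val
        chosen.append(best)
        candidates.remove(best)
    return chosen

# hypothetical exponential covariance over 6 equally spaced sites
x = np.arange(6.0)
Sigma = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
sites = greedy_entropy_extension(Sigma, current=[0, 5], k=2)
```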
103.
Model checking with discrete data regressions can be difficult because the usual methods such as residual plots have complicated reference distributions that depend on the parameters in the model. Posterior predictive checks have been proposed as a Bayesian way to average the results of goodness-of-fit tests in the presence of uncertainty in estimation of the parameters. We try this approach using a variety of discrepancy variables for generalized linear models fitted to a historical data set on behavioural learning. We then discuss the general applicability of our findings in the context of a recent applied example on which we have worked. We find that the following discrepancy variables work well, in the sense of being easy to interpret and sensitive to important model failures: structured displays of the entire data set, general discrepancy variables based on plots of binned or smoothed residuals versus predictors and specific discrepancy variables created on the basis of the particular concerns arising in an application. Plots of binned residuals are especially easy to use because their predictive distributions under the model are sufficiently simple that model checks can often be made implicitly. The following discrepancy variables did not work well: scatterplots of latent residuals defined from an underlying continuous model and quantile–quantile plots of these residuals.
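A binned residual plot can be sketched in a few lines: sort observations by fitted value, split them into bins, and compare each bin's mean residual with rough ±2 standard error bands. The data here are simulated Bernoulli outcomes with known probabilities, not the behavioural learning data set used in the paper.

```python
import numpy as np

def binned_residuals(fitted, resid, n_bins=10):
    """Average residuals within bins of the fitted values; under the model,
    each bin mean should lie within roughly +/- 2*sd/sqrt(bin size)."""
    order = np.argsort(fitted)
    bins = np.array_split(order, n_bins)
    centers = np.array([fitted[b].mean() for b in bins])
    means = np.array([resid[b].mean() for b in bins])
    ses = np.array([2 * resid[b].std(ddof=1) / np.sqrt(len(b)) for b in bins])
    return centers, means, ses

rng = np.random.default_rng(0)
n = 500
p = rng.uniform(0.2, 0.8, n)   # fitted probabilities from a (hypothetical) model
y = rng.binomial(1, p)         # binary outcomes
centers, means, ses = binned_residuals(p, y - p)
```

Plotting `means` against `centers` with the `ses` bands gives the check described above: bin means straying well outside the bands flag model failure.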
104.
The problem of estimating the integral of a squared regression function, and of its squared derivatives, has been addressed in a number of papers. For a heteroscedastic model in which the smoothness of the underlying regression function, the design density and the variance of the errors are known, the asymptotically sharp minimax lower bound and a sharp estimator were found by Pastuchova & Khasminski (1989). However, there are apparently no results on either rate-optimal or sharp-optimal adaptive (data-driven) estimation when neither the degree of smoothness of the regression function nor the design density, scale function and error distribution are known. After a brief review of the main developments in non-parametric estimation of non-linear functionals, we suggest a simple adaptive estimator for the integral of a squared regression function and its derivatives, and prove that it is sharp-optimal whenever the estimated derivative is sufficiently smooth.
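A simple non-adaptive version of such an estimator, assuming a uniform design and known noise variance, expands the regression function in an orthonormal cosine basis and sums the squared empirical coefficients, subtracting the upward noise bias of roughly σ²/n per coefficient. This is only an illustrative plug-in sketch, not the sharp adaptive estimator proposed in the paper.

```python
import numpy as np

def sq_integral_estimate(x, y, J, sigma2):
    """Estimate int_0^1 f(t)^2 dt from y_i = f(x_i) + e_i on a uniform design,
    via a cosine-series plug-in with a bias correction for the noise."""
    n = len(x)
    est = 0.0
    for j in range(J + 1):
        phi = np.ones(n) if j == 0 else np.sqrt(2) * np.cos(np.pi * j * x)
        theta_hat = np.mean(y * phi)            # empirical Fourier coefficient
        est += theta_hat**2 - sigma2 / n        # subtract approx. noise bias
    return est

rng = np.random.default_rng(1)
n = 2000
x = rng.uniform(0, 1, n)
f = lambda t: 1 + np.cos(np.pi * t)             # true int f^2 = 1.5
y = f(x) + rng.normal(0, 0.5, n)
est = sq_integral_estimate(x, y, J=5, sigma2=0.25)
```

The adaptive question studied in the paper is, in essence, how to choose the cutoff `J` (and handle unknown `sigma2` and design) from the data alone.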
105.
When Shannon entropy is used as a criterion in the optimal design of experiments, advantage can be taken of the classical identity representing the joint entropy of parameters and observations as the sum of the marginal entropy of the observations and the preposterior conditional entropy of the parameters. Following previous work in which this idea was used in spatial sampling, the method is applied to standard parameterized Bayesian optimal experimental design. Under suitable conditions, which include non-linear as well as linear regression models, it is shown in a few steps that maximizing the marginal entropy of the sample is equivalent to minimizing the preposterior entropy, the usual Bayesian criterion, thus avoiding the use of conditional distributions. It is shown using this marginal formulation that under normality assumptions every standard model which has a two-point prior distribution on the parameters gives an optimal design supported on a single point. Other results include a new asymptotic formula which applies as the error variance is large and bounds on support size.
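The classical identity behind this argument can be written out as

```latex
H(\theta, Y) \;=\; H(Y) + \mathbb{E}_{Y}\!\left[\, H(\theta \mid Y) \,\right]
           \;=\; H(\theta) + \mathbb{E}_{\theta}\!\left[\, H(Y \mid \theta) \,\right].
```

When the prior entropy $H(\theta)$ and the conditional entropy $\mathbb{E}_{\theta}[H(Y \mid \theta)]$ do not depend on the design (as with additive noise whose distribution is fixed), the right-hand side is constant across designs, so maximizing the marginal entropy $H(Y)$ is equivalent to minimizing the preposterior entropy $\mathbb{E}_{Y}[H(\theta \mid Y)]$, exactly as the abstract states.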
106.
In this paper, we present an access network design problem with an end-to-end quality of service (QoS) requirement. The problem can be conceptualized as a two-level hierarchical location-allocation problem on a tree topology with nonlinear side constraints. The objective function of the nonlinear mixed-integer programming model minimizes the total cost of switches and fiber cable while satisfying demand within the prescribed level of QoS. By exploiting the inherent structure of the nonlinear QoS constraints, we develop linearization techniques for finding an optimal solution. We also devise an effective exact optimization algorithm within the context of disjunctive constraint generation. We present promising computational results that demonstrate the effectiveness of the proposed solution procedure.
107.
The three classic pillars of risk analysis are risk assessment (how big is the risk and how sure can we be?), risk management (what shall we do about it?), and risk communication (what shall we say about it, to whom, when, and how?). We propose two complements as important parts of these three bases: risk attribution (who or what addressable conditions actually caused an accident or loss?) and learning from experience about risk reduction (what works, and how well?). Failures in complex systems usually evoke blame, often with insufficient attention to root causes of failure, including some aspects of the situation, design decisions, or social norms and culture. Focusing on blame, however, can inhibit effective learning, instead eliciting excuses to deflect attention and perceived culpability. Productive understanding of what went wrong, and how to do better, thus requires moving past recrimination and excuses. This article identifies common blame-shifting "lame excuses" for poor risk management. These generally contribute little to effective improvements and may leave real risks and preventable causes unaddressed. We propose principles from risk and decision sciences and organizational design to improve results. These start with organizational leadership. More specifically, they include: deliberate testing and learning, especially from near-misses and accident precursors; careful causal analysis of accidents; risk quantification; candid expression of uncertainties about costs and benefits of risk-reduction options; optimization of tradeoffs between gathering additional information and immediate action; promotion of safety culture; and mindful allocation of people, responsibilities, and resources to reduce risks. We propose that these principles provide sound foundations for improving successful risk management.
108.
This article examines the use and interpretation of the terms "touch", "reach" and "movement" in Ministry of Education (later, Department of Education) official publications known as Building Bulletins between the years 1949–1972. A close critical reading of Building Bulletins concerned primarily with school design for young children (infant and primary schools) in the English context has been carried out, and the results of this exercise are discussed in the wider context of the close relationships established between architects designing schools and leading progressive educationalists in Britain. The wider international context, particularly progressive educational design in the USA, is used to further understand the use and interpretation of these terms. The article contributes to a current interest among historians of education in exploring material and sensory histories of schooling.
109.
Does uncertainty about an outcome influence decisions? The sure-thing principle (Savage, 1954) posits that it should not, but Tversky and Shafir (1992) found that people regularly violate it in hypothetical gambling and vacation decisions, a phenomenon they termed the "disjunction effect". Very close replications and extensions of Tversky and Shafir (1992) were conducted in this paper (N = 890, MTurk). The target article demonstrated the effect using two paradigms in a between-subject design; here, an extension was added that also tested a within-subject design, with design randomly assigned. The results were consistent with the original findings for the "paying to know" problem (original: Cramér's V = 0.22, 95% CI [0.14, 0.32]; replication: Cramér's V = 0.30, 95% CI [0.24, 0.37]), yet not for the "choice under risk" problem (original: Cramér's V = 0.26, 95% CI [0.14, 0.39]; replication: Cramér's V = 0.11, 95% CI [−0.07, 0.20]). The within-subject extension showed very similar results. Implications for the disjunction effect and for judgment and decision-making theory are discussed, and a call is made for improvements in the statistical understanding of comparisons between between-subject and within-subject designs. All materials, data, and code are available at https://osf.io/gu58m/.
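Cramér's V, the effect size reported above, is the square root of the χ² statistic scaled by the sample size and the table dimension. A minimal sketch on hypothetical 2 × 2 counts (not the replication data):

```python
import numpy as np

def cramers_v(table):
    """Cramér's V for an r x c contingency table of counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    # expected counts under independence of rows and columns
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = ((table - expected) ** 2 / expected).sum()
    k = min(table.shape) - 1
    return np.sqrt(chi2 / (n * k))

# hypothetical counts: condition (rows) x choice (cols)
table = [[60, 40],
         [35, 65]]
v = cramers_v(table)   # about 0.25
```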
110.
We consider the blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for the evaluation of the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
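A minimal sketch of the re-estimation step: take the blinded (one-sample) standard deviation of the pooled interim data, ignoring treatment labels, and plug it into the usual normal-approximation sample size formula for a two-group comparison. The numbers below are hypothetical, and the paper's exact-distribution results and type I error adjustments are not reproduced here.

```python
import math
from statistics import NormalDist

def reestimated_n(sd, delta, alpha=0.05, power=0.9):
    """Per-group sample size from the normal approximation:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * sd^2 / delta^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * (z_a + z_b) ** 2 * sd ** 2 / delta ** 2)

# hypothetical blinded SD estimate from the internal pilot study
interim_sd = 1.2
n_new = reestimated_n(interim_sd, delta=0.5)
```

Because the one-sample SD lumps both groups together, it is inflated by the treatment effect; the paper's analysis quantifies the consequences of this for type I error and power.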