1.
Proportional hazards are a common assumption when designing confirmatory clinical trials in oncology. This assumption affects not only the analysis but also the sample size calculation. The presence of delayed effects causes a change in the hazard ratio while the trial is ongoing, since at the beginning no difference between treatment arms is observed and only after some unknown time point do differences between treatment arms start to appear. Hence, the proportional hazards assumption no longer holds, and both the sample size calculation and the analysis methods should be reconsidered. The weighted log-rank test allows a weighting for early, middle, and late differences through the Fleming and Harrington class of weights and is proven to be more efficient when the proportional hazards assumption does not hold. The Fleming and Harrington class of weights, along with the estimated delay, can be incorporated into the sample size calculation in order to maintain the desired power once the treatment arm differences start to appear. In this article, we explore the impact of delayed effects in group sequential and adaptive group sequential designs and make an empirical evaluation, in terms of power and type-I error rate, of the weighted log-rank test in a simulated scenario with fixed values of the Fleming and Harrington class of weights. We also give some practical recommendations regarding which methodology should be used in the presence of delayed effects, depending on certain characteristics of the trial.
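As a concrete illustration of the Fleming and Harrington class of weights, the sketch below computes the weighted log-rank statistic with weight w(t) = S(t−)^ρ (1 − S(t−))^γ, where S is the pooled Kaplan–Meier estimate; setting ρ = 0 and γ > 0 up-weights late differences, matching the delayed-effect setting. This is a minimal sketch, not the authors' simulation code; the function name and any toy data are illustrative assumptions.

```python
import numpy as np

def fh_weighted_logrank(time, event, group, rho=0.0, gamma=0.0):
    """Z-statistic of the Fleming-Harrington G(rho, gamma) weighted
    log-rank test; rho = gamma = 0 gives the ordinary log-rank test."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    group = np.asarray(group, int)
    U, V, S = 0.0, 0.0, 1.0            # score, variance, pooled KM S(t-)
    for t in np.unique(time[event == 1]):   # distinct event times, ascending
        at_risk = time >= t
        n_t = at_risk.sum()                       # at risk, both arms
        n1_t = (at_risk & (group == 1)).sum()     # at risk, arm 1
        d_t = ((time == t) & (event == 1)).sum()  # events at t, both arms
        d1_t = ((time == t) & (event == 1) & (group == 1)).sum()
        w = S**rho * (1.0 - S)**gamma    # weight uses KM estimate just before t
        U += w * (d1_t - d_t * n1_t / n_t)
        if n_t > 1:
            V += w**2 * d_t * (n1_t / n_t) * (1 - n1_t / n_t) * (n_t - d_t) / (n_t - 1)
        S *= 1.0 - d_t / n_t             # Kaplan-Meier update after time t
    return U / np.sqrt(V)
```

With γ > 0 the early event times, where a delayed effect has not yet emerged, contribute little to the statistic, which is why this class regains power under non-proportional hazards.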
2.
Bioequivalence (BE) studies are designed to show that two formulations of one drug are equivalent, and they play an important role in drug development. At the design stage there may be a high degree of uncertainty about the variability of the formulations and the actual performance of the test versus the reference formulation. Therefore, an interim look may be desirable in order to stop the study if there is no chance of claiming BE at the end (futility), to claim BE if evidence is sufficient (efficacy), or to adjust the sample size. Sequential design approaches specifically for BE studies have been proposed in previous publications. We modify the existing methods, focusing on simplified multiplicity adjustment and futility stopping. We name our method modified sequential design for BE studies (MSDBE). Simulation results demonstrate comparable performance between MSDBE and the original published methods, while MSDBE offers more transparency and better applicability. The R package MSDBE is available at https://sites.google.com/site/modsdbe/ . Copyright © 2015 John Wiley & Sons, Ltd.
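The BE decision at a final or interim efficacy analysis conventionally rests on two one-sided tests (TOST) on the log scale. The abstract does not detail MSDBE's internals, so the sketch below shows only this generic TOST core under a normal approximation; the function name is an illustrative assumption, and the default margin is the conventional 80–125% limit, i.e. log 1.25.

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def tost_be(diff, se, margin=math.log(1.25), alpha=0.05):
    """Two one-sided tests for average bioequivalence on the log scale.
    diff = estimated mean log(test/reference), se = its standard error.
    BE is claimed when both one-sided nulls are rejected at level alpha."""
    p_lower = 1.0 - phi((diff + margin) / se)  # H0: diff <= -margin
    p_upper = phi((diff - margin) / se)        # H0: diff >= +margin
    return p_lower, p_upper, max(p_lower, p_upper) < alpha
```

In a sequential version, the same statistic would be evaluated at the interim with adjusted critical values; that adjustment is exactly the part the authors simplify, and is not reproduced here.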
3.
Comparison of Four New General Classes of Search Designs
A factor screening experiment identifies a few important factors from a large list of factors that potentially influence the response. If a list consists of m factors each at three levels, a design is a subset of all possible 3^m runs. This paper considers the problem of finding designs with small numbers of runs, using the search linear model introduced in Srivastava (1975). The paper presents four new general classes of these 'search designs', each with 2m − 1 runs, which permit, at most, two important factors out of m factors to be searched for and identified. The paper compares the designs for 4 ≤ m ≤ 10, using arithmetic and geometric means of the determinants, traces and maximum characteristic roots of particular matrices. Two of the designs are found to be superior in all six criteria studied. The four designs are identical for m = 3, and this design is an optimal design in the class of all search designs under the six criteria. The four designs are also identical for m = 4 under some row and column permutations.
4.
The problem considered is that of finding an optimum measurement schedule to estimate population parameters in a nonlinear model when the patient effects are random. The paper presents examples of the use of sensitivity functions, derived from the General Equivalence Theorem for D-optimality, in the construction of optimum population designs for such schedules. With independent observations, the theorem applies to the potential inclusion of a single observation. However, in population designs the observations are correlated and the theorem applies to the inclusion of an additional measurement schedule. In one example, three groups of patients of differing size are subject to distinct schedules. Numerical, as opposed to analytical, calculation of the sensitivity function is advocated. The required covariances of the observations are found by simulation.
5.
For a wide variety of applications, experiments are based on units ordered over time or space. Models for these experiments generally include one or more of: correlations, systematic trends, carryover effects and interference effects. Since the standard optimal block designs may not be efficient in these situations, orthogonal arrays of type I and type II, introduced by C.R. Rao [Combinatorial arrangements analogous to orthogonal arrays, Sankhya A 23 (1961) 283–286], have recently been used to construct optimal and efficient designs for many of these experiments. Results in this area are unified and the salient features are outlined.
6.
Complete and partial diallel cross designs are examined as to their construction and robustness against the loss of a block of observations. A simple generalized inverse is found for the information matrix of the line effects, which allows evaluation of expressions for the variances of the line-effect differences with and without the missing block. A-efficiencies, based on average variances of the elementary contrasts of the line-effects, suggest that these designs are fairly robust. The loss of efficiency is generally less than 10%, but it is shown that specific comparisons might suffer a loss of efficiency of as much as 40%.
7.
Philosophical Reflections on Combat Theory Based on Complex Adaptive Systems
Traditional combat theories and methods can no longer cope with complex systems such as modern informatized warfare systems, which are full of "live" individuals and changing factors, so theoretical innovation is needed. Complex adaptive systems theory is a recent development in systems science and promises to be a breakthrough point for innovating combat theory. Based on an analysis and comparison of combat systems, this paper argues that a combat system is in essence a complex adaptive system: both sides in combat strive to win by strengthening their own adaptability and complexity while weakening those of the adversary.
8.
Parameter design or robust parameter design (RPD) is an engineering methodology intended as a cost-effective approach for improving the quality of products and processes. The goal of parameter design is to choose the levels of the control variables that optimize a defined quality characteristic. An essential component of RPD involves the assumption of well estimated models for the process mean and variance. Traditionally, the modeling of the mean and variance has been done parametrically. It is often the case, particularly when modeling the variance, that nonparametric techniques are more appropriate due to the nature of the curvature in the underlying function. Most response surface experiments involve sparse data. In sparse data situations with unusual curvature in the underlying function, nonparametric techniques often result in estimates with problematic variation whereas their parametric counterparts may result in estimates with problematic bias. We propose the use of semi-parametric modeling within the robust design setting, combining parametric and nonparametric functions to improve the quality of both mean and variance model estimation. The proposed method will be illustrated with an example and simulations.
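A minimal sketch of the general idea of pairing a parametric mean model with a nonparametric variance estimate built from squared residuals. The quadratic mean model, the Gaussian kernel, the bandwidth, and the function names are illustrative assumptions, not the authors' estimators.

```python
import numpy as np

def semiparametric_fit(x, y, bandwidth=0.5):
    """Fit a parametric (quadratic OLS) model for the mean, then smooth
    the squared residuals with a Gaussian kernel to estimate the
    variance function nonparametrically."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    X = np.vander(x, 3)                         # columns: x^2, x, 1
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    mean_hat = X @ beta                         # parametric mean estimate
    resid2 = (y - mean_hat) ** 2                # raw variance information

    def var_hat(x0):
        """Nadaraya-Watson smooth of squared residuals at x0."""
        w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
        return float(w @ resid2 / w.sum())

    return mean_hat, var_hat
```

In an RPD workflow, the fitted mean and variance surfaces would then be combined into a single objective (e.g. a squared-error loss about the target) and minimized over the control variables.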
9.
This paper describes the main factors affecting the performance of a digital beamforming (DBF) system, studies the influence of mutual coupling between array elements on the sidelobes and null depths of the adaptive pattern together with methods for its correction, and discusses techniques for correcting the amplitude and phase errors of the receiving channels and the quadrature errors of the I/Q branches in a DBF array. Computer simulation and measurement show that the corrections described yield satisfactory results. In addition, to reduce the quadrature errors introduced by the I/Q branches, a receiver scheme with direct IF sampling and digitization is recommended.
10.
For technological applications it can be useful to identify some simple physical mechanisms which, on the basis of the available knowledge of the production process, may suggest the most appropriate approach to statistical control of the random quantities of interest. To this end, the notion of a rupture point is first introduced. A rupture point is characterized by m randomly arising out-of-control states, assumed to be mutually exclusive and stochastically independent. Shewhart's control charts seem to represent the natural statistical tool for controlling a rupture point; however, it is shown that they are fully justified only when the hazard rates attached to the causes of failure are constant. Otherwise, typically in the presence of time-increasing hazard rates, Shewhart's control charts should be complemented by a preventive intervention rule (preventive maintenance). Secondly, the notion of a dynamic instability point is introduced, characterized by the assumption that the random quantity of interest is governed by a stochastic differential equation with constant coefficients. By discretization, developed according to a possibly new approach, it is shown that this model reduces to an equation-error model, which is among the simplest used in adaptive control, and is thus particularly easy to deal with in regard to parameter estimation and the definition of the optimum control rule.
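A minimal sketch of the Shewhart-chart mechanics referred to above, for an individuals chart: the process level and short-term sigma are estimated from an in-control baseline (average moving range divided by the d2 constant 1.128 for subgroups of two), and monitored points outside the k-sigma limits are flagged. The function name and data are illustrative assumptions; the paper's preventive-intervention rule for increasing hazard rates is not reproduced here.

```python
import numpy as np

def individuals_chart(baseline, monitor, k=3.0):
    """Shewhart individuals chart: estimate center and sigma from an
    in-control baseline, then flag monitored points beyond k-sigma limits."""
    baseline = np.asarray(baseline, float)
    # Short-term sigma from the average moving range (d2 = 1.128 for n = 2).
    sigma = np.abs(np.diff(baseline)).mean() / 1.128
    center = baseline.mean()
    lcl, ucl = center - k * sigma, center + k * sigma
    flags = [bool(x < lcl or x > ucl) for x in np.asarray(monitor, float)]
    return center, lcl, ucl, flags
```

Under a constant hazard rate, waiting for such a signal is optimal; under an increasing hazard rate, the paper argues, this monitoring should be paired with scheduled preventive maintenance.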