1.
In this article, we propose a novel approach for testing the equality of two log-normal populations using a computational approach test (CAT) that does not require explicit knowledge of the sampling distribution of the test statistic. Simulation studies demonstrate that the proposed approach maintains a satisfactory actual size (type I error rate) even at small sample sizes, and overall it outperforms existing methods. A CAT is also proposed for testing the reliability of two log-normal populations when the means are equal; simulations show that its actual size is close to the nominal level and better than that of the score test. Finally, the proposed methods are illustrated with two examples.
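To make the idea concrete, here is a minimal sketch of a simulation-based test in the CAT spirit, assuming a simple pooling of the estimated log-scale means under the null rather than the authors' numerically obtained restricted MLEs; the function name and defaults are illustrative.

```python
import numpy as np

def cat_equal_means(x, y, n_sim=10_000, seed=0):
    """Simulation-based (CAT-style) test of H0: the two log-normal
    means are equal, i.e. mu1 + s1^2/2 == mu2 + s2^2/2 on the log scale."""
    rng = np.random.default_rng(seed)
    lx, ly = np.log(x), np.log(y)
    m1, v1 = lx.mean(), lx.var()            # unrestricted MLEs (log scale)
    m2, v2 = ly.mean(), ly.var()
    t_obs = (m1 + v1 / 2) - (m2 + v2 / 2)   # observed statistic
    # Restricted fit under H0: pool the two estimated log-scale means
    # (a simple stand-in for numerically obtained restricted MLEs).
    pooled = ((m1 + v1 / 2) * len(x) + (m2 + v2 / 2) * len(y)) / (len(x) + len(y))
    t_sim = np.empty(n_sim)
    for b in range(n_sim):                  # resample under the null
        xb = rng.normal(pooled - v1 / 2, np.sqrt(v1), len(x))
        yb = rng.normal(pooled - v2 / 2, np.sqrt(v2), len(y))
        t_sim[b] = (xb.mean() + xb.var() / 2) - (yb.mean() + yb.var() / 2)
    return np.mean(np.abs(t_sim) >= abs(t_obs))   # two-sided p-value
```

Rejecting H0 whenever the returned p-value falls below the nominal level mirrors the CAT logic of replacing an intractable sampling distribution with a simulated one.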
2.
Managing risk in infrastructure systems implies dealing with interdependent physical networks and their relationships with the natural and societal contexts. Computational tools are often used to support operational decisions aimed at improving resilience, whereas economics-related tools tend to be used to address broader societal and policy issues in infrastructure management. We propose an optimization-based framework for infrastructure resilience analysis that incorporates organizational and socioeconomic aspects into operational problems, making it possible to understand the relationships between decisions at the policy level (e.g., regulation) and the technical level (e.g., optimal infrastructure restoration). We focus on three issues that arise when integrating these levels. First, optimal restoration strategies driven by financial and operational factors evolve differently from those driven by socioeconomic and humanitarian factors. Second, regulatory aspects have a significant impact on recovery dynamics (e.g., effective recovery is most challenging in societies with weak institutions and regulation, where individual interests may compromise societal well-being). Third, the decision space (i.e., available actions) in postdisaster phases is strongly determined by predisaster decisions (e.g., resource allocation). The proposed optimization framework addresses these issues by using (1) parametric analyses to test the influence of operational and socioeconomic factors on optimization outcomes, (2) regulatory constraints to model and assess the cost and benefit (for a variety of actors) of enforcing specific policy-related conditions on the recovery process, and (3) sensitivity analyses to capture the effect of predisaster decisions on recovery. We illustrate the methodology with an example concerning the recovery of interdependent water, power, and gas networks in Shelby County, TN (USA), with exposure to natural hazards.
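A toy illustration of the first finding — that financially driven and socially driven restoration priorities diverge — is sketched below; the components, numbers, and greedy priority rule are hypothetical and far simpler than the paper's optimization framework.

```python
# Rank damaged components by impact per repair-day, where `weight`
# trades off financial against socioeconomic impact. All names and
# numbers below are hypothetical.
damaged = [
    # (name, repair_days, financial_impact, socioeconomic_impact)
    ("power_substation", 5, 9.0, 8.0),
    ("water_pump",       2, 3.0, 9.0),
    ("gas_regulator",    3, 6.0, 2.0),
]

def schedule(weight):
    """weight=0: purely financial priorities; weight=1: purely socioeconomic."""
    def priority(comp):
        _, days, fin, soc = comp
        return ((1 - weight) * fin + weight * soc) / days  # impact per repair-day
    return [c[0] for c in sorted(damaged, key=priority, reverse=True)]

print(schedule(weight=0.0))  # ['gas_regulator', 'power_substation', 'water_pump']
print(schedule(weight=1.0))  # ['water_pump', 'power_substation', 'gas_regulator']
```

Even in this three-component toy, the restoration order reverses as the objective shifts from financial to socioeconomic impact.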
3.
When a candidate predictive marker is available but evidence on its predictive ability is not sufficiently reliable, all-comers trials with marker stratification are frequently conducted. We propose a framework for planning and evaluating prospective testing strategies in confirmatory, phase III marker-stratified clinical trials based on a natural assumption about the heterogeneity of treatment effects across marker-defined subpopulations, where weak rather than strong control is permitted for multiple population tests. In phase III marker-stratified trials, treatment efficacy is expected to be established in a particular patient population, possibly a marker-defined subpopulation, and the marker's accuracy is assessed when the marker is used to restrict the indication or labelling of the treatment to a marker-based subpopulation, i.e., assessment of the clinical validity of the marker. In this paper, we develop statistical testing strategies based on criteria explicitly designed for marker assessment, including criteria that examine treatment effects in marker-negative patients. Because both existing and newly developed testing strategies can assert treatment efficacy for either the overall patient population or the marker-positive subpopulation, we also develop criteria for evaluating the operating characteristics of the testing strategies based on the probabilities of asserting treatment efficacy across marker subpopulations. Numerical evaluations comparing the statistical testing strategies under the developed criteria are provided.
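As a hedged illustration only, the sketch below encodes one plausible alpha-splitting strategy of this general kind; the specific split, the fixed-weight combined z-statistic, and the crude marker-negative consistency check are assumptions, not the strategies developed in the paper.

```python
from scipy.stats import norm

def testing_strategy(z_pos, z_neg, n_pos, n_neg, alpha=0.025):
    """One illustrative alpha-splitting strategy (assumed, not the
    paper's): test the overall population and the marker-positive
    subpopulation each at alpha/2, and require a crude consistency
    check in marker-negative patients before a restricted claim."""
    f = n_pos / (n_pos + n_neg)
    z_all = f ** 0.5 * z_pos + (1 - f) ** 0.5 * z_neg  # fixed-weight combination
    if norm.sf(z_all) < alpha / 2:
        return "assert efficacy in the overall population"
    if norm.sf(z_pos) < alpha / 2 and z_neg > norm.ppf(0.2):
        return "assert efficacy in the marker-positive subpopulation"
    return "no efficacy claim"

# Example: strong marker-positive effect, flat marker-negative effect.
print(testing_strategy(z_pos=3.0, z_neg=-0.2, n_pos=150, n_neg=250))
```

The example prints the marker-positive claim: the overall test misses alpha/2 while the subgroup test succeeds and the marker-negative z-statistic shows no qualitative harm.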
4.
In studies with recurrent event endpoints, misspecified assumptions about event rates or dispersion can lead to underpowered trials or overexposure of patients. Specifying overdispersion is often a particular problem, as it is usually not reported in clinical trial publications. Changing event rates over the years have been described for some diseases, adding to the uncertainty in planning. To mitigate the risk of an inadequate sample size, internal pilot study designs have been proposed, with a preference for blinded sample size reestimation procedures, as they generally do not affect the type I error rate and maintain trial integrity. Blinded sample size reestimation procedures are available for trials with recurrent event endpoints. However, the variance in the reestimated sample size can be considerable, in particular with early sample size reviews. Motivated by a randomized controlled trial in paediatric multiple sclerosis, a rare neurological condition in children, we apply the concept of blinded continuous monitoring of information, which is known to reduce the variance in the resulting sample size. Assuming negative binomial distributions for the counts of recurrent relapses, we derive information criteria and propose blinded continuous monitoring procedures. Their operating characteristics are assessed in Monte Carlo trial simulations, demonstrating favourable properties with regard to type I error rate, power, and stopping time, i.e., sample size.
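The sketch below illustrates blinded information monitoring under a textbook-style approximation for negative binomial counts; the information formula, the 1:1 randomisation, and all parameter values are assumptions rather than the paper's derived criteria.

```python
import numpy as np
from scipy.stats import norm

def blinded_information(total_events, n_patients, shape_k):
    """Approximate information for the log rate ratio of negative
    binomial counts from blinded (pooled) data: per-arm
    Var(log rate) ~ (1/mu + 1/k)/n_arm, with mu the blinded mean
    count per patient (textbook-style approximation, 1:1 allocation)."""
    mu = total_events / n_patients
    n_arm = n_patients / 2
    return 1 / (2 * (1 / mu + 1 / shape_k) / n_arm)

def target_information(log_rr_alt, alpha=0.025, power=0.9):
    """Information required by a one-sided level-alpha Wald test."""
    return ((norm.ppf(1 - alpha) + norm.ppf(power)) / log_rr_alt) ** 2

# Blinded continuous monitoring: stop recruitment once the observed
# information reaches the target (numbers are hypothetical).
info_now = blinded_information(total_events=480, n_patients=300, shape_k=0.8)
print(info_now >= target_information(np.log(0.7)))   # False -> keep recruiting
```

Because only pooled event counts enter the calculation, treatment allocation stays blinded while the stopping time adapts to the observed event rate and dispersion.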
5.
Bioequivalence (BE) studies are designed to show that two formulations of one drug are equivalent, and they play an important role in drug development. At the design stage, there may be considerable uncertainty about the variability of the formulations and the actual performance of the test formulation relative to the reference. An interim look may therefore be desirable to stop the study if there is no chance of claiming BE at the end (futility), to claim BE if the evidence is sufficient (efficacy), or to adjust the sample size. Sequential design approaches specifically for BE studies have been proposed in the literature. We modify the existing methods, focusing on a simplified multiplicity adjustment and futility stopping, and name our method the modified sequential design for BE studies (MSDBE). Simulation results demonstrate comparable performance between MSDBE and the originally published methods, while MSDBE offers more transparency and better applicability. The R package MSDBE is available at https://sites.google.com/site/modsdbe/ . Copyright © 2015 John Wiley & Sons, Ltd.
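A minimal sketch of two-stage BE logic in this spirit is given below, assuming a Pocock-like interim alpha and a confidence-interval futility rule; MSDBE's actual multiplicity adjustment and futility boundary may differ.

```python
import numpy as np
from scipy import stats

def tost(log_ratio, se, df, theta=np.log(1.25), alpha=0.05):
    """Two one-sided tests for average BE on the log scale."""
    p_lower = stats.t.sf((log_ratio + theta) / se, df)   # H0: ratio <= 0.8
    p_upper = stats.t.cdf((log_ratio - theta) / se, df)  # H0: ratio >= 1.25
    return max(p_lower, p_upper) < alpha

def interim_decision(log_ratio, se, df, alpha_interim=0.0294):
    """Two-stage sketch: claim BE early at a reduced (Pocock-like)
    alpha, stop for futility if the 90% CI lies entirely outside
    the BE limits, otherwise continue to stage 2."""
    if tost(log_ratio, se, df, alpha=alpha_interim):
        return "stop: claim BE"
    half = stats.t.ppf(0.95, df) * se
    if log_ratio + half < -np.log(1.25) or log_ratio - half > np.log(1.25):
        return "stop: futility"
    return "continue to stage 2 (possibly re-estimate sample size)"

print(interim_decision(log_ratio=0.05, se=0.12, df=22))  # -> continue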
6.
Although widely used in practice, latent variable models raise challenging estimation problems due to the intractability of their likelihood. Monte Carlo maximum likelihood (MCML), as proposed by Geyer & Thompson (1992), is a simulation-based approach to maximum likelihood approximation applicable to general latent variable models. MCML can be described as an importance sampling method in which the likelihood ratio is approximated by Monte Carlo averages of importance ratios simulated from the complete-data model corresponding to an arbitrary value of the unknown parameter. This paper studies the asymptotic (in the number of observations) performance of the MCML method for latent variable models with independent observations. This is in contrast with previous works on the same topic, which only considered conditional convergence to the maximum likelihood estimator for a fixed set of observations. A first important result is that when the importance-sampling parameter value is held fixed, the MCML method can only be consistent if the number of simulations grows exponentially fast with the number of observations. If, on the other hand, this value is obtained from a consistent sequence of estimates of the unknown parameter, the requirements on the number of simulations are shown to be much weaker.
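A minimal sketch of the MCML likelihood-ratio approximation for a toy conjugate random-effects model (not a model from the paper) is shown below; note how the estimator averages a product of n importance ratios, which is precisely why the number of simulations must grow quickly with n when the importance parameter stays fixed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy conjugate latent variable model (illustrative only):
# y_i = b_i + e_i with b_i ~ N(0, tau2) latent, e_i ~ N(0, 1).
n, tau2_true = 200, 2.0
y = rng.normal(0.0, np.sqrt(tau2_true), n) + rng.normal(0.0, 1.0, n)

def mcml_loglik_ratio(tau2, tau2_star, y, m=2000):
    """Monte Carlo estimate of l(tau2) - l(tau2_star): average the
    complete-data likelihood ratio over latent draws
    b ~ p(b | y; tau2_star) (exact here thanks to conjugacy)."""
    post_var = tau2_star / (tau2_star + 1.0)
    post_mean = post_var * y
    b = rng.normal(post_mean, np.sqrt(post_var), (m, len(y)))
    # log ratio of N(b; 0, tau2) to N(b; 0, tau2_star); the
    # N(y - b; 0, 1) factor cancels between numerator and denominator.
    log_w = (-0.5 * np.log(tau2) - b ** 2 / (2 * tau2)) \
          - (-0.5 * np.log(tau2_star) - b ** 2 / (2 * tau2_star))
    s = log_w.sum(axis=1)   # product of n importance ratios per draw;
                            # its variance explodes with n unless
                            # tau2_star is near the MLE (the paper's point)
    return s.max() + np.log(np.mean(np.exp(s - s.max())))

grid = np.linspace(0.5, 4.0, 36)
est = grid[np.argmax([mcml_loglik_ratio(t, 1.0, y) for t in grid])]
print(est)   # rough MCML estimate of tau2
```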
7.
This paper presents the optimal control of transverse vibrations for a class of non-uniform strings, generalizing a result of P. C. Park. The result applies equally to vibration problems of the same type.
8.
Applying flow theory to college English teaching makes it possible to design foreign-language learning activities so that task difficulty matches the learner's level, to cultivate interest in language learning, to strengthen teacher-student interaction, and to ensure that learners take the initiative in their own learning. Learners can then experience great enjoyment in the learning process and enter a state of flow, characterized by highly concentrated attention and full immersion, which improves learning efficiency.
9.
An asynchronous parallel quasi-Newton method is proposed for solving equality-constrained optimization problems. Under the assumptions that the objective function f and the constraint function h are at least three times continuously differentiable and that ∇h(x) has full rank for every x ∈ R^n, the proposed asynchronous parallel algorithm is shown to be q-superlinearly convergent.
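For orientation, a serial (non-parallel) solve of a small equality-constrained problem with SciPy's quasi-Newton-type SLSQP solver is sketched below; the paper's contribution is to distribute such quasi-Newton updates asynchronously across processors, which this sketch does not attempt.

```python
import numpy as np
from scipy.optimize import minimize

# Smooth objective and a single full-rank equality constraint h(x) = 0.
f = lambda x: (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2
h = {"type": "eq", "fun": lambda x: x[0] + x[1] - 3}

res = minimize(f, x0=np.zeros(2), method="SLSQP", constraints=[h])
print(res.x)   # -> approximately [0.75, 2.25]
```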
10.
This paper proposes a tabular method for the project time-cost optimization problem: a table is constructed and then modified step by step until the optimal solution is reached. The method is intuitive and easy to apply.
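A minimal greedy "crashing" sketch of the underlying time-cost trade-off logic is shown below; the two-path network and all costs are hypothetical, and the paper's tabular procedure performs comparable stepwise modifications by hand.

```python
# While the project is too long, shorten whichever critical activity is
# cheapest to crash. activity: [normal_days, min_days, crash_cost_per_day];
# two parallel paths, project duration = longest path.
acts = {"A": [4, 2, 100], "B": [6, 4, 150], "C": [9, 6, 80]}
paths = [["A", "B"], ["C"]]

def duration():
    return max(sum(acts[a][0] for a in p) for p in paths)

def crash_to(target):
    cost = 0
    while duration() > target:
        critical = max(paths, key=lambda p: sum(acts[a][0] for a in p))
        options = [a for a in critical if acts[a][0] > acts[a][1]]
        if not options:
            raise ValueError("cannot crash further")
        a = min(options, key=lambda a: acts[a][2])  # cheapest per day
        acts[a][0] -= 1                             # crash one day
        cost += acts[a][2]
    return cost

print(crash_to(8), duration())  # -> 280 8 (extra cost, achieved duration)
```

The greedy rule is a simplification: it may overpay when several paths are critical at once, which is exactly the situation the stepwise tabular refinements are designed to handle.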