1.
Since the implementation of the International Conference on Harmonization (ICH) E14 guideline in 2005, regulators have required a “thorough QTc” (TQT) study for evaluating the effects of investigational drugs on delayed cardiac repolarization as manifested by a prolonged QTc interval. However, TQT studies have increasingly been viewed unfavorably because of their low cost effectiveness. Several researchers have noted that a robust drug concentration‐QTc (conc‐QTc) modeling assessment in early phase development should, in most cases, obviate the need for a subsequent TQT study. In December 2015, ICH released an “E14 Q&As (R3)” document supporting the use of conc‐QTc modeling for regulatory decisions. In this article, we propose a simple improvement of two popular conc‐QTc assessment methods for typical first‐in‐human crossover‐like single ascending dose clinical pharmacology trials. The improvement is achieved, in part, by leveraging the intrasubject correlation patterns routinely encountered (and expected) in such trials. A real example involving a single ascending dose and corresponding TQT trial, along with results from a simulation study, illustrates the strong performance of the proposed method. The improved conc‐QTc assessment will further enable highly reliable go/no‐go decisions in early phase clinical development and deliver results that support subsequent TQT study waivers by regulators.
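A minimal sketch of a standard linear mixed-effects conc‐QTc analysis of this kind is given below. It is not the authors' specific improvement; the column names (subject, conc, dqtcf), the reference concentration, and the 10 ms threshold of regulatory interest are illustrative assumptions.

```python
# Minimal sketch (not the authors' improved method): baseline-adjusted QTc
# change (dqtcf) regressed on plasma concentration with a subject-level random
# intercept and slope, so intrasubject correlation is modeled explicitly.
# Column names subject/conc/dqtcf are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def conc_qtc_upper_bound(df: pd.DataFrame, c_ref: float) -> float:
    """Upper bound of a two-sided 90% CI for predicted dQTcF at concentration c_ref."""
    model = smf.mixedlm("dqtcf ~ conc", df, groups=df["subject"], re_formula="~conc")
    fit = model.fit(reml=True)
    beta = fit.fe_params                                 # fixed effects: intercept, slope
    cov = fit.cov_params().loc[beta.index, beta.index]   # their covariance matrix
    x = np.array([1.0, c_ref])                           # design vector at c_ref
    pred = float(x @ beta.values)
    se = float(np.sqrt(x @ cov.values @ x))
    return pred + 1.645 * se                             # compare against the 10 ms threshold
```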
2.
The present study deals with the estimation of the parameters of a k-component load-sharing parallel system model in which each component's failure time distribution is assumed to be geometric. The maximum likelihood estimates of the load-share parameters, together with their standard errors, are obtained. (1 − γ)100% joint, Bonferroni simultaneous, and two bootstrap confidence intervals for the parameters are constructed. Further, recognizing that life testing experiments are time consuming, it seems realistic to treat the load-share parameters as random variables. Bayes estimates of the parameters, along with their standard errors, are therefore obtained by assuming Jeffreys' invariant and gamma priors for the unknown parameters. Since the Bayes estimators cannot be found in closed form, Tierney and Kadane's approximation method is used to compute the Bayes estimates and their standard errors. A Markov chain Monte Carlo technique, the Gibbs sampler, is also used to obtain Bayes estimates and highest posterior density credible intervals of the load-share parameters, and the Metropolis–Hastings algorithm is used to generate samples from the posterior distributions of the unknown parameters.
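For illustration only, the sketch below shows the Metropolis–Hastings machinery on a deliberately simplified problem: sampling the failure probability of a single geometric component under a flat prior. The paper's joint load-sharing likelihood and Jeffreys/gamma priors would replace the log-posterior here, and the data are invented.

```python
# Toy random-walk Metropolis-Hastings sketch for the failure probability p of a
# single geometric component, P(T = k) = (1 - p)**(k - 1) * p, under a flat
# prior. The paper's joint load-sharing likelihood and Jeffreys/gamma priors
# would replace log_post; only the sampling machinery is illustrated here.
import numpy as np

def metropolis_geometric(data, n_iter=20_000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)

    def log_post(p):
        if not 0.0 < p < 1.0:
            return -np.inf                              # prior support (0, 1)
        return np.sum((data - 1) * np.log(1 - p) + np.log(p))

    p_cur, lp_cur = 0.5, log_post(0.5)
    draws = np.empty(n_iter)
    for i in range(n_iter):
        p_prop = p_cur + rng.normal(0.0, step)          # symmetric random-walk proposal
        lp_prop = log_post(p_prop)
        if np.log(rng.uniform()) < lp_prop - lp_cur:    # accept with prob min(1, ratio)
            p_cur, lp_cur = p_prop, lp_prop
        draws[i] = p_cur
    return draws[n_iter // 2:]                          # discard burn-in

# Posterior mean and an equal-tail 95% credible interval:
samples = metropolis_geometric(np.array([3, 1, 7, 2, 4]))
print(samples.mean(), np.quantile(samples, [0.025, 0.975]))
```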
3.
For nearly all call centers, agent schedules are typically created several days or weeks before the time that agents report to work. After schedules are created, call center resource managers receive additional information that can affect forecasted workload and resource availability. In particular, there is significant evidence, both among practitioners and in the research literature, suggesting that actual call arrival volumes early in a scheduling period (typically an individual day or week) can provide valuable information about the call arrival pattern later in the same scheduling period. In this paper, we develop a flexible and powerful heuristic framework for managers to make intra‐day resource adjustment decisions that take into account updated call forecasts, updated agent requirements, existing agent schedules, agents' schedule flexibility, and associated incremental labor costs. We demonstrate the value of this methodology in managing the trade‐off between labor costs and service levels to best meet variable rates of demand for service, using data from an actual call center.
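The sketch below illustrates one simple way early-day actuals can revise the remainder-of-day forecast; it is a generic ratio adjustment, not the paper's heuristic framework, and all volumes are hypothetical.

```python
# Generic intraday ratio update (not the paper's heuristic): rescale the
# remaining-day call forecast by how far the morning's actual volume ran above
# or below the volume the original forecast expected by the same time.
import numpy as np

def update_intraday_forecast(forecast, actual_so_far):
    """forecast: planned calls per period for the day; actual_so_far: elapsed periods."""
    t = len(actual_so_far)
    expected_so_far = forecast[:t].sum()
    ratio = actual_so_far.sum() / expected_so_far if expected_so_far > 0 else 1.0
    revised = forecast.astype(float).copy()
    revised[:t] = actual_so_far                         # keep actuals for elapsed periods
    revised[t:] = forecast[t:] * ratio                  # rescale the rest of the day
    return revised

# A morning 20% above plan lifts the remaining periods' forecast by 20%.
plan = np.array([100, 120, 150, 160, 140, 110])
obs = np.array([120, 144, 180])
print(update_intraday_forecast(plan, obs))
```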
4.
We present a novel methodology for a comprehensive statistical analysis of approximately periodic biosignal data. There are two main challenges in such analysis: (1) the automatic extraction (segmentation) of cycles from long, cyclostationary biosignals and (2) the subsequent statistical analysis, which in many cases involves the separation of temporal and amplitude variabilities. The proposed framework provides a principled approach for statistical analysis of such signals, which in turn allows for an efficient cycle segmentation algorithm. This is achieved using a convenient representation of functions called the square-root velocity function (SRVF). The segmented cycles, represented by SRVFs, are temporally aligned using the notion of the Karcher mean, which in turn allows for more efficient statistical summaries of signals. We show the strengths of this method through various disease classification experiments. In the case of myocardial infarction detection and localization, we show that our method compares favorably to methods described in the current literature.
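As a rough illustration of the representation, the sketch below computes the SRVF, q(t) = f'(t)/√|f'(t)|, on a sampled grid and checks that its squared L2 norm (the total variation of f) is unchanged by a time warp. The grid, functions, and warp are made up for the example.

```python
# Sketch of the square-root velocity function, q(t) = f'(t) / sqrt(|f'(t)|),
# computed with finite differences on a uniform grid.
import numpy as np

def srvf(f, t):
    df = np.gradient(f, t)                              # numerical derivative f'(t)
    return np.sign(df) * np.sqrt(np.abs(df))            # equals f'/sqrt(|f'|), with 0 -> 0

t = np.linspace(0.0, 1.0, 201)
f1 = np.sin(2 * np.pi * t)                              # one cycle
f2 = np.sin(2 * np.pi * t**1.5)                         # the same cycle, time-warped
q1, q2 = srvf(f1, t), srvf(f2, t)
dt = t[1] - t[0]
print(np.sum(q1**2) * dt, np.sum(q2**2) * dt)           # both close to 4 (total variation of sin)
```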
5.
A randomized trial allows estimation of the causal effect of an intervention compared to a control in the overall population and in subpopulations defined by baseline characteristics. Often, however, clinical questions also arise regarding the treatment effect in subpopulations of patients who would experience clinical or disease-related events post-randomization. Events that occur after treatment initiation and potentially affect the interpretation or the existence of the measurements are called intercurrent events in the ICH E9(R1) guideline. If the intercurrent event is a consequence of treatment, randomization alone is no longer sufficient to meaningfully estimate the treatment effect. Analyses comparing the subgroups of patients without the intercurrent event in the intervention and control arms will not estimate a causal effect. This is well known, but post-hoc analyses of this kind are commonly performed in drug development. An alternative approach is the principal stratum strategy, which classifies subjects according to their potential occurrence of an intercurrent event on both study arms. We illustrate with examples that questions formulated through principal strata occur naturally in drug development and argue that approaching these questions with the ICH E9(R1) estimand framework has the potential to lead to more transparent assumptions as well as more adequate analyses and conclusions. In addition, we provide an overview of assumptions required for estimation of effects in principal strata. Most of these assumptions are unverifiable and should hence be based on solid scientific understanding. Sensitivity analyses are needed to assess robustness of conclusions.
6.
The use of surrogate variables has been proposed as a means to capture, for a given observed set of data, sources driving the dependency structure among high-dimensional sets of features and remove the effects of those sources and their potential negative impact on simultaneous inference. In this article we illustrate the potential effects of latent variables on testing dependence and the resulting impact on multiple inference. We briefly review the method of surrogate variable analysis proposed by Leek and Storey (PNAS 2008; 105:18718-18723) and assess that method via simulations intended to mimic the complexity of feature dependence observed in real-world microarray data. The method is also assessed via application to a recent Merck microarray data set. Both simulation and case study results indicate that surrogate variable analysis can offer a viable strategy for tackling the multiple testing dependence problem when the features follow a potentially complex correlation structure, yielding improvements in the variability of false positive rates and increases in power.
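A stripped-down sketch of the core idea (not the full Leek and Storey algorithm) is given below: regress the primary variable out of every feature and take leading singular vectors of the residual matrix as candidate surrogate variables to add to the design.

```python
# Stripped-down surrogate-variable sketch (not the full Leek-Storey algorithm):
# regress the primary design out of every feature, then take leading left
# singular vectors of the sample-by-feature residual matrix as candidate
# surrogate variables to include in subsequent feature-wise tests.
import numpy as np

def surrogate_variables(Y, X, n_sv=2):
    """Y: features x samples data matrix; X: samples x covariates primary design."""
    beta, *_ = np.linalg.lstsq(X, Y.T, rcond=None)      # least-squares fit per feature
    resid = Y.T - X @ beta                              # samples x features residuals
    U, s, Vt = np.linalg.svd(resid, full_matrices=False)
    return U[:, :n_sv]                                  # samples x n_sv surrogate variables

# Usage: augment the design and refit each feature, e.g.
# sv = surrogate_variables(Y, X); X_aug = np.hstack([X, sv])
```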
7.
Registration of temporal observations is a fundamental problem in functional data analysis. Various frameworks have been developed over the past two decades where registrations are conducted based on optimal time warping between functions. Comparison of functions solely based on time warping, however, may have limited application, in particular when certain constraints are desired in the registration. In this paper, we study registration with a norm-preserving constraint. A closely related problem is signal estimation, where the goal is to estimate the ground-truth template given random observations with both compositional and additive noises. We propose to adopt the Fisher–Rao framework to compute the underlying template, and mathematically prove that this framework leads to a consistent estimator. We then illustrate the constrained Fisher–Rao registration using simulations as well as two real data sets. It is found that the constrained method is robust with respect to additive noise and achieves better alignment and classification performance than conventional, unconstrained registration methods.
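The norm-preserving property comes from the way warping acts on SRVFs, (q, γ) ↦ (q∘γ)√(γ'), which is an isometry of the L2 norm. The toy sketch below, with a made-up SRVF and warp, checks this numerically; it is not the paper's registration algorithm.

```python
# Numerical check of the norm-preserving warping action on an SRVF,
# (q, gamma) -> (q o gamma) * sqrt(gamma'); the SRVF and warp are made up.
import numpy as np

def warp_srvf(q, t, gamma):
    """Apply a warp gamma (monotone, gamma(0)=0, gamma(1)=1) to an SRVF q on grid t."""
    dgamma = np.gradient(gamma, t)
    q_warped = np.interp(gamma, t, q)                   # q(gamma(t)) by interpolation
    return q_warped * np.sqrt(dgamma)                   # isometric group action

t = np.linspace(0.0, 1.0, 201)
q = np.cos(2 * np.pi * t)                               # a toy SRVF
gamma = t**2                                            # a valid warp of [0, 1]
qg = warp_srvf(q, t, gamma)
dt = t[1] - t[0]
print(np.sum(q**2) * dt, np.sum(qg**2) * dt)            # L2 norms agree up to discretization
```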
8.
The overuse of the Federal Reserve's (Fed's) currency processing operations by depository institutions (DIs) has motivated the Fed to propose new currency recirculation guidelines. The Fed believes that DIs should play a more active role in recirculating fit (i.e., usable) currency so that the societal cost of providing currency to the public is minimized. The Fed characterizes the overuse by the extent of cross shipping, a practice in which the same DI deposits and withdraws currency of the same denomination within five business days in the same geographic region. The Fed's proposal encourages DIs to fit sort and reuse deposited currency through two components: a custodial inventory program and a recirculation fee that would be charged on withdrawals of cross‐shipped currency. Given the geographical network of the various branches of a DI, the extent of its participation in the proposed programs depends on a variety of factors: the nature of demand and supply of currency, the number and locations of the processing centers, and the resulting fit‐sorting, holding, and transportation costs. The interrelated nature of these decisions motivates the need for an integrated model that captures the flow of currency in the entire network of the DI. Based on our work with Brink's Inc., a leading secure‐logistics provider, we develop a mixed‐integer linear programming (MILP) model to provide managers of DIs with a decision‐making tool under the Fed's new guidelines. Broadly, we analyze the following questions: (i) Over all typical practical realizations of the demand for currency that a DI may face, and over all reasonable cost implications, is there a menu of “good” operating policies? (ii) What is the monetary impact of fit‐sorting and custodial inventories on a DI? and (iii) To what extent will the Fed's new guidelines address its main goal, namely, a reduction in the practice of cross shipping by encouraging DIs to recirculate currency?
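To make the flavor of such a formulation concrete, the toy MILP below trades off fit-sorting (a variable cost plus a fixed sorter-setup cost) against Fed withdrawals that carry a recirculation fee, for a single denomination over a few days. It is a hypothetical illustration built with the PuLP solver interface, not the authors' model, and all costs and volumes are invented.

```python
# Toy MILP (not the authors' model; all volumes and costs invented): each day,
# withdrawals can be met by fit-sorting that day's deposits (variable cost plus
# a fixed sorter-setup cost) or by withdrawing from the Fed, which carries a
# recirculation fee because the same denomination was deposited in the window.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

deposits = [80, 120, 60]                    # bundles deposited per day
demand = [100, 90, 110]                     # bundles withdrawn per day
c_sort, c_fee, c_setup = 1.0, 3.0, 40.0     # unit sort cost, unit fee, sorter setup cost

days = range(len(demand))
prob = LpProblem("currency_recirculation", LpMinimize)
s = {t: LpVariable(f"sort_{t}", lowBound=0) for t in days}        # fit-sorted bundles
w = {t: LpVariable(f"fed_{t}", lowBound=0) for t in days}         # cross-shipped bundles
y = {t: LpVariable(f"run_sorter_{t}", cat=LpBinary) for t in days}

prob += lpSum(c_sort * s[t] + c_fee * w[t] + c_setup * y[t] for t in days)
for t in days:
    prob += s[t] + w[t] >= demand[t]        # meet the day's withdrawals
    prob += s[t] <= deposits[t] * y[t]      # sort only deposits, and only if the sorter runs

prob.solve()
print({t: (value(s[t]), value(w[t])) for t in days}, "cost:", value(prob.objective))
```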
9.
Sample sizes of Phase 2 dose-finding studies, usually determined based on a power requirement to detect a significant dose–response relationship, will generally not provide adequate precision for Phase 3 target dose selection. We propose to calculate the sample size of a dose-finding study based on the probability of successfully identifying the target dose within an acceptable range (e.g., 80%–120% of the target) using the multiple comparison and modeling procedure (MCP-Mod). With the proposed approach, different design options for the Phase 2 dose-finding study can also be compared. Due to inherent uncertainty around an assumed true dose–response relationship, sensitivity analyses to assess the robustness of the sample size calculations to deviations from modeling assumptions are recommended. Planning for a hypothetical Phase 2 dose-finding study is used to illustrate the main points. Code for the proposed approach is available at https://github.com/happysundae/posMCPMod.
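A simplified simulation of this success-probability criterion is sketched below. It fits a single Emax model by nonlinear least squares rather than running the full MCP-Mod procedure, and the doses, effect sizes, variability, and target effect are illustrative design assumptions.

```python
# Simplified simulation of the success-probability criterion (single Emax fit,
# not the full MCP-Mod procedure). Doses, effect sizes, variability, and the
# target effect delta are all hypothetical design assumptions.
import numpy as np
from scipy.optimize import curve_fit

def emax_model(d, e0, emax, ed50):
    return e0 + emax * d / (ed50 + d)

def prob_target_dose_in_range(n_per_arm, doses, e0=0.0, emax_true=1.0, ed50=0.5,
                              delta=0.6, sd=1.0, n_sim=1000, seed=1):
    rng = np.random.default_rng(seed)
    d_target = delta * ed50 / (emax_true - delta)       # true dose giving effect delta
    hits = 0
    for _ in range(n_sim):
        d = np.repeat(doses, n_per_arm)
        y = emax_model(d, e0, emax_true, ed50) + rng.normal(0.0, sd, d.size)
        try:
            (e0_h, emax_h, ed50_h), _ = curve_fit(emax_model, d, y,
                                                  p0=[0.0, 1.0, 0.5], maxfev=5000)
        except RuntimeError:
            continue                                    # failed fit counts as a miss
        if emax_h <= delta or ed50_h <= 0:
            continue                                    # target effect not estimable
        d_hat = delta * ed50_h / (emax_h - delta)
        hits += 0.8 * d_target <= d_hat <= 1.2 * d_target
    return hits / n_sim

# Compare two candidate per-arm sample sizes for a hypothetical 5-arm design.
doses = np.array([0.0, 0.25, 0.5, 1.0, 2.0])
print(prob_target_dose_in_range(30, doses), prob_target_dose_in_range(60, doses))
```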
10.
Theory-based, scientific research examining sexual behaviors of young adults is sparse in India, even though pre-marital sex among unmarried young people has been rising in recent years. At the same time, young people aged 15 to 24 are disproportionately affected by HIV/AIDS. This has been attributed in part to rising pre-marital sexual behaviors, coupled with a lack of sex education. The objective of this study was to advance an understanding of the determinants of sexual behavior among unmarried young adults in northern India. An adaptation of a comprehensive model of health behavior, the Multiple Domain Model, was employed to study the effects of environmental/cultural influences (parental and media), structural determinants (sex, socioeconomic status, age, caste, and place of residence), personality factors (sensation-seeking and impulsive decision making), gender role identity, psychosocial variables (attitudes, norms, and self-efficacy), contextual influences (relationship status and alcohol/drug use) and preparatory behaviors (frequency of being in sexual situations) on adolescents' sexual behaviors. Results of path analysis indicated that key predictors of ever having had vaginal sex included preparatory behaviors, masculine gender role identity, attitudes toward having sex and peer norms regarding sex. Implications of these findings for future research and intervention are discussed.