1.
This paper reviews recent developments in the stochastic comparison of order statistics. The results discussed are chiefly: (1) stochastic comparisons of linear combinations of order statistics from distributions F and G, where G⁻¹F is convex or star-shaped; (2) stochastic comparisons of individual order statistics, and of vectors of order statistics, from underlying heterogeneous distributions, using majorization and Schur function theory; and (3) stochastic comparisons of random processes. Applications to reliability problems are presented, illustrating the use and value of the theoretical results described.
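The majorization-based comparisons in (2) are easy to probe numerically. Below is a minimal, hypothetical Python sketch (not from the paper) that simulates an order statistic from two heterogeneous exponential samples whose rate vectors are ordered by majorization, then compares the empirical survival functions; the rate vectors and variable names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def order_stat_samples(rates, k, n_sim=100_000):
    """Simulate the k-th order statistic (1-indexed) of independent
    exponential variables with the given heterogeneous rates."""
    x = rng.exponential(1.0 / np.asarray(rates), size=(n_sim, len(rates)))
    return np.sort(x, axis=1)[:, k - 1]

# Two rate vectors with equal totals; rates_a majorizes rates_b.
rates_a = [3.0, 1.0]   # more spread out
rates_b = [2.0, 2.0]   # perfectly balanced

k = 2  # compare the sample maximum
xa = order_stat_samples(rates_a, k)
xb = order_stat_samples(rates_b, k)

# Empirical survival functions on a common grid: if one curve dominates
# the other everywhere, that is evidence of a stochastic ordering.
grid = np.linspace(0, 5, 51)
surv_a = (xa[:, None] > grid).mean(axis=0)
surv_b = (xb[:, None] > grid).mean(axis=0)
print(np.all(surv_a >= surv_b), np.all(surv_b >= surv_a))
```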
2.
3.

Motivated by the study of traffic accidents on a road network, we discuss the estimation of the relative risk, the ratio of rates of occurrence of different types of events occurring on a network of lines. Methods developed for two-dimensional spatial point patterns can be adapted to a linear network, but their requirements and performance are very different on a network. Computation is slow, and we introduce new techniques to accelerate it. Intensities (occurrence rates) are estimated by kernel smoothing using the heat kernel on the network. The main methodological problem is bandwidth selection. Bandwidth selectors based on binary regression, such as likelihood cross-validation and least-squares cross-validation, perform tolerably well in our simulation experiments, but the Kelsall–Diggle density-ratio cross-validation method does not. We find a theoretical explanation, and propose a modification of the Kelsall–Diggle method which has better performance. The methods are applied to traffic accidents in a regional city, and to protrusions on the dendritic tree of a neuron.
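As a rough illustration of the binary-regression view of relative risk, here is a hypothetical Python sketch on a single line segment (a drastic simplification of a true linear network, ignoring the heat-kernel machinery): events of two types are pooled, the probability of type 1 at each location is smoothed with a Gaussian kernel, and the bandwidth is chosen by leave-one-out likelihood cross-validation. All names and the synthetic data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic event locations of two types on a line segment [0, 10].
x1 = rng.uniform(0, 10, 80)            # e.g. accidents
x0 = rng.uniform(0, 10, 200)           # e.g. background events
x = np.concatenate([x1, x0])
y = np.concatenate([np.ones_like(x1), np.zeros_like(x0)])  # type labels

def kernel_regression(x_train, y_train, x_eval, bw):
    """Nadaraya-Watson estimate of p(type=1 | location), Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bw) ** 2)
    return w @ y_train / w.sum(axis=1)

def loo_log_likelihood(bw):
    """Leave-one-out Bernoulli log-likelihood for bandwidth selection."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bw) ** 2)
    np.fill_diagonal(w, 0.0)           # leave each point out of its own fit
    p = np.clip(w @ y / w.sum(axis=1), 1e-12, 1 - 1e-12)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

bandwidths = np.linspace(0.2, 3.0, 15)
best_bw = bandwidths[np.argmax([loo_log_likelihood(b) for b in bandwidths])]

grid = np.linspace(0, 10, 101)
p = kernel_regression(x, y, grid, best_bw)
relative_risk = p / (1 - p)            # odds of type 1 vs type 0 by location
print(best_bw, relative_risk.max())
```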

4.
5.
Extremal problems in large deviations of the F-statistic are considered. It is shown that the slowest rate of convergence of the F-statistic over a specified class of distributions is slower than exponential, and that the Bahadur efficiency of the F-statistic with respect to some distribution-free competitors is identically zero.
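For context, the link between the two statements can be made explicit via the standard definition of the Bahadur exact slope (a textbook definition, not quoted from the paper). If L_n(θ) denotes the p-value of a statistic T_n under the alternative θ, the exact slope of T and the Bahadur efficiency of T relative to a competitor S are

```latex
c_T(\theta) \;=\; -\lim_{n\to\infty} \frac{2}{n}\,\log L_n(\theta),
\qquad
e_{T,S}(\theta) \;=\; \frac{c_T(\theta)}{c_S(\theta)} .
```

A slower-than-exponential large-deviation rate forces c_T(θ) = 0 for the F-statistic, so its efficiency relative to any competitor with a positive slope vanishes.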
6.
Journal of Family and Economic Issues - Flexibility is crucial when employees manage their work and family demands and their commute between home and work. The current study examined...
7.
OBJECTIVES: The purpose of this article is to highlight the benefits of collaboration in child-focused mental health services research. METHOD: Three unique research projects are described. These projects address the mental health needs of vulnerable, urban, minority children and their families. In each, service delivery was co-designed, interventions were co-delivered, and a team of stakeholders collaboratively tested their impact. RESULTS: The results indicate that the three interventions designed, delivered, and tested are associated with reductions in youth mental health symptoms. CONCLUSION: These interventions are feasible alternatives to traditional individualized outpatient treatment.
8.
Randomized controlled trials (RCTs) are the gold standard for evaluation of the efficacy and safety of investigational interventions. If every patient in an RCT were to adhere to the randomized treatment, one could simply analyze the complete data to infer the treatment effect. However, intercurrent events (ICEs), including the use of concomitant medication for unsatisfactory efficacy, treatment discontinuation due to adverse events, or lack of efficacy, may lead to interventions that deviate from the original treatment assignment. Therefore, defining the appropriate estimand (the parameter to be estimated) based on the primary objective of the study is critical before determining the statistical analysis method and analyzing the data. The International Council for Harmonisation (ICH) E9 (R1), adopted on November 20, 2019, provided five strategies to define the estimand: treatment policy, hypothetical, composite variable, while on treatment, and principal stratum. In this article, we propose an estimand using a mix of strategies in handling ICEs. This estimand is an average of the “null” treatment difference for those with ICEs potentially related to safety and the treatment difference for the other patients if they were to complete the assigned treatments. Two examples from clinical trials evaluating antidiabetes treatments are provided to illustrate the estimation of this proposed estimand and to compare it with the estimates for estimands using hypothetical and treatment policy strategies in handling ICEs.
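To make the proposed composite estimand concrete, here is a hypothetical Python sketch (illustrative only, not the authors' code): patients with safety-related ICEs contribute a treatment difference of zero, while the remaining patients contribute a hypothetical "as-if-completed" difference. The column names, the toy data, and the assumption that completed-treatment outcomes are already available (e.g. imputed) are all assumptions of the sketch.

```python
import numpy as np
import pandas as pd

def composite_estimand(df):
    """Estimate a composite treatment difference: zero for patients with
    safety-related ICEs, and the hypothetical completed-treatment
    difference for everyone else.

    Expects columns (illustrative names):
      arm        - 0 = control, 1 = treatment
      y_complete - outcome under hypothetical completion (e.g. imputed)
      safety_ice - True if the patient had a safety-related ICE
    """
    p_ice = df["safety_ice"].mean()          # weight of the "null" component
    others = df[~df["safety_ice"]]
    delta = (others.loc[others["arm"] == 1, "y_complete"].mean()
             - others.loc[others["arm"] == 0, "y_complete"].mean())
    # Weighted average of a zero difference (ICE stratum) and delta (others).
    return p_ice * 0.0 + (1 - p_ice) * delta

# Toy data: treatment lowers HbA1c by ~0.5 among non-ICE patients.
rng = np.random.default_rng(2)
n = 500
arm = rng.integers(0, 2, n)
safety_ice = rng.random(n) < 0.10
y = 8.0 - 0.5 * arm + rng.normal(0, 0.8, n)
df = pd.DataFrame({"arm": arm, "y_complete": y, "safety_ice": safety_ice})
print(composite_estimand(df))
```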
9.
Kernel Density Estimation on a Linear Network
This paper develops a statistically principled approach to kernel density estimation on a network of lines, such as a road network. Existing heuristic techniques are reviewed, and their weaknesses are identified. The correct analogue of the Gaussian kernel is the ‘heat kernel’, the occupation density of Brownian motion on the network. The corresponding kernel estimator satisfies the classical time‐dependent heat equation on the network. This ‘diffusion estimator’ has good statistical properties that follow from the heat equation. It is mathematically similar to an existing heuristic technique, in that both can be expressed as sums over paths in the network. However, the diffusion estimate is an infinite sum, which cannot be evaluated using existing algorithms. Instead, the diffusion estimate can be computed rapidly by numerically solving the time‐dependent heat equation on the network. This also enables bandwidth selection using cross‐validation. The diffusion estimate with automatically selected bandwidth is demonstrated on road accident data.  相似文献   
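The key computational idea, solving the time-dependent heat equation rather than summing over paths, can be sketched in a few lines. The following hypothetical Python example uses a one-dimensional interval with reflecting ends as a stand-in for a full network (which would also need vertex conditions); it evolves point masses at the data locations under an explicit finite-difference scheme for u_t = ½u_xx, for which running to time t = σ² corresponds to a Gaussian kernel with bandwidth σ. The grid sizes and data are illustrative.

```python
import numpy as np

def diffusion_kde(points, a=0.0, b=1.0, n_grid=401, bandwidth=0.05):
    """Kernel density estimate on [a, b] by solving the heat equation
    u_t = 0.5 * u_xx with reflecting boundaries.  Running to time
    t = bandwidth**2 matches a Gaussian kernel with s.d. `bandwidth`."""
    x = np.linspace(a, b, n_grid)
    dx = x[1] - x[0]
    u = np.zeros(n_grid)
    # Initial condition: a unit point mass at each data point.
    idx = np.clip(np.round((np.asarray(points) - a) / dx).astype(int),
                  0, n_grid - 1)
    np.add.at(u, idx, 1.0 / (len(points) * dx))   # integrates to 1

    t_end = bandwidth ** 2
    dt = 0.5 * dx ** 2                  # within explicit-scheme stability
    n_steps = int(np.ceil(t_end / dt))
    dt = t_end / n_steps
    r = 0.5 * dt / dx ** 2              # diffusion number for u_t = 0.5*u_xx
    for _ in range(n_steps):
        u_pad = np.pad(u, 1, mode="edge")   # reflecting (Neumann) ends
        u = u + r * (u_pad[2:] - 2 * u + u_pad[:-2])
    return x, u

x, f = diffusion_kde(np.array([0.2, 0.25, 0.7]), bandwidth=0.08)
print(np.sum(f) * (x[1] - x[0]))   # ~1: mass is conserved by the scheme
```

Because the scheme only diffuses mass, the estimate remains a probability density throughout, which mirrors the mass-conservation property of the heat kernel on the network.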
10.
The current guidelines, ICH E14, for the evaluation of non-antiarrhythmic compounds require a 'thorough' QT (TQT) study conducted during clinical development (ICH Guidance for Industry E14, 2005). Owing to the regulatory choice of margin (10 ms), TQT studies must be conducted to rigorous standards to ensure that variability is minimized. Some of the key sources of variation can be controlled by use of randomization, a crossover design, standardization of electrocardiogram (ECG) recording conditions, and collection of replicate ECGs at each time point. However, one of the key factors in these studies is the baseline measurement, which, if not controlled and consistent across studies, could lead to significant misinterpretation. In this article, we examine three types of baseline methods widely used in TQT studies to derive a change from baseline in QTc (time-matched, time-averaged, and pre-dose-averaged baselines). We discuss the impact of the baseline values on the guidance-recommended 'largest time-matched' analyses. Using simulation, we show the impact of these baseline approaches on the type I error and power for both crossover and parallel-group designs, and we show that the power of the study decreases as the number of time points tested in a TQT study increases. A time-matched baseline method is recommended by several authors (Drug Saf. 2005; 28(2):115-125; Health Canada guidance document: guide for the analysis and review of QT/QTc interval data, 2006) because of the circadian rhythm in QT. However, the impact of the time-matched baseline method on statistical inference and sample size should be considered carefully during the design of a TQT study. The time-averaged baseline had the highest power in comparison with the other baseline approaches.
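The three baseline definitions are simple to state in code. Below is a hypothetical Python sketch (illustrative, not from the article) computing change-from-baseline QTc under each definition from matrices of measurements, followed by a 'largest time-matched'-style summary; the array layout and the synthetic data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative QTc data (ms): rows = subjects, columns = time points.
qtc_baseline_day = 400 + rng.normal(0, 8, size=(40, 8))  # same clock times as post-dose
qtc_predose      = 400 + rng.normal(0, 8, size=(40, 3))  # replicates before dosing
qtc_postdose     = 404 + rng.normal(0, 8, size=(40, 8))

# 1. Time-matched baseline: subtract the baseline-day value at the same clock time.
change_tm = qtc_postdose - qtc_baseline_day

# 2. Time-averaged baseline: subtract the mean over the whole baseline day.
change_ta = qtc_postdose - qtc_baseline_day.mean(axis=1, keepdims=True)

# 3. Pre-dose-averaged baseline: subtract the mean of the pre-dose replicates.
change_pd = qtc_postdose - qtc_predose.mean(axis=1, keepdims=True)

# 'Largest time-matched'-style summary: for each definition, the time point
# with the largest mean change from baseline across subjects.
for name, ch in [("time-matched", change_tm),
                 ("time-averaged", change_ta),
                 ("pre-dose-averaged", change_pd)]:
    means = ch.mean(axis=0)
    print(f"{name:18s} largest mean change = {means.max():5.2f} ms "
          f"at time point {means.argmax()}")
```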