Similar articles
20 similar articles found (search time: 31 ms)
1.
Increased transcranial Doppler ultrasound (TCD) velocity is an indicator of cerebral infarction in children with sickle cell disease (SCD). In this article, the parallel genetic algorithm (PGA) is used to select a stroke risk model with TCD velocity as the response variable. Development of such a stroke risk model leads to the identification of children with SCD who are at higher risk of stroke, so that they can be treated early. Using blood velocity data from SCD patients, it is shown that the PGA is an easy-to-use computational variable selection tool. The results of the PGA are also compared with those obtained from the stochastic search variable selection method, the Dantzig selector and conventional techniques such as stepwise selection and best subset selection.
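The abstract does not reproduce the PGA itself, but the core idea of genetic-algorithm variable selection can be sketched. Below is a minimal serial GA that searches over variable-inclusion masks for an ordinary least-squares model, scoring each candidate subset by AIC. All names (`aic`, `ga_select`) and tuning choices (population size, mutation rate, elitism) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def aic(y, X, mask):
    # AIC of an OLS fit using only the columns selected by `mask`:
    # n*log(RSS/n) + 2k, with k counting the intercept as well.
    Xs = X[:, mask.astype(bool)]
    n = len(y)
    Xd = np.column_stack([np.ones(n)] + ([Xs] if Xs.shape[1] else []))
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    return n * np.log(rss / n) + 2 * Xd.shape[1]

def ga_select(y, X, pop=30, gens=40, mut=0.05, rng=None):
    # Genetic search over 0/1 inclusion masks, minimizing AIC.
    rng = np.random.default_rng(rng)
    p = X.shape[1]
    popu = rng.integers(0, 2, size=(pop, p))
    for _ in range(gens):
        fit = np.array([aic(y, X, m) for m in popu])
        popu = popu[np.argsort(fit)]          # best (lowest AIC) first
        children = []
        for _ in range(pop // 2):
            a, b = popu[rng.integers(0, pop // 2, 2)]  # parents from top half
            cut = rng.integers(1, p)                   # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(p) < mut                 # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        popu = np.vstack([popu[:pop - len(children)], children])  # elitism
    fit = np.array([aic(y, X, m) for m in popu])
    return popu[np.argmin(fit)]
```

A parallel variant would distribute the fitness evaluations of the population across workers; the search logic is unchanged.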

2.
When evaluating potential interventions for cancer prevention, it is necessary to compare benefits and harms. With new study designs, new statistical approaches may be needed to facilitate this comparison. A case in point arose in a proposed genetic substudy of a randomized trial of tamoxifen versus placebo in asymptomatic women who were at high risk for breast cancer. Although the randomized trial showed that tamoxifen substantially reduced the risk of breast cancer, the harms from tamoxifen were serious and some were life threatening. In hopes of finding a subset of women with inherited risk genes who derive greater benefits from tamoxifen, we proposed a nested case–control study to test some trial subjects for various genes and new statistical methods to extrapolate benefits and harms to the general population. An important design question is whether or not the study should target common low penetrance genes. Our calculations show that useful results are only likely with rare high penetrance genes.

3.
A new class of estimators is introduced for the problem of estimating the mean of the selected population. These estimators are found by subtracting from the largest sample mean an estimator of its bias. The new estimators are compared with those introduced by Cohen and Sackrowitz (1982) and in terms of frequentist risk they are found to perform quite similarly.

4.
A new minimax multiple shrinkage estimator is constructed. This estimator, which can adaptively shrink towards many subspace targets, is formal Bayes with respect to a mixture of harmonic priors. Unbiased estimates of risk and simulation results suggest that the risk properties of this estimator are very similar to those of the multiple shrinkage Stein estimator proposed by George (1986a). A special case is seen to be admissible.
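As background for shrinkage toward a target, here is a sketch of the positive-part James-Stein estimator shrinking toward a single fixed target vector; George's multiple shrinkage estimator adaptively mixes several such single-target shrinkers, which this simplified sketch does not attempt. The function name and the unit-variance assumption are mine, not from the paper.

```python
import numpy as np

def james_stein(x, target=None):
    # Positive-part James-Stein estimate of a p-dimensional normal mean
    # (p >= 3), shrinking x toward `target` (default: the origin).
    # Assumes unit-variance, independent coordinates.
    x = np.asarray(x, dtype=float)
    t = np.zeros_like(x) if target is None else np.asarray(target, dtype=float)
    d = x - t
    p = len(x)
    shrink = max(0.0, 1.0 - (p - 2) / np.dot(d, d))  # positive part
    return t + shrink * d
```

Shrinkage is strongest when the observation lies close to the target, which is why mixing over several plausible targets (as in multiple shrinkage) can pay off.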

5.
There is debate within the osteoporosis research community about the relationship between the risk of osteoporotic fracture and the surrogate measures of fracture risk. Meta‐regression analyses based on summary data have shown a linear relationship between fracture risk and surrogate measures, whereas analyses based on individual patient data (IPD) have shown a nonlinear relationship. We investigated the association between changes in a surrogate measure of fracture incidence, in this case a bone turnover marker for resorption assessed in the three risedronate phase III clinical programmes, and incident osteoporosis‐related fracture risk using regression models based on patient‐level and trial‐level information. The relationship between osteoporosis‐related fracture risk and changes in bone resorption was different when analysed on the basis of IPD than when analysed on the basis of a meta‐analytic approach (i.e., meta‐regression) using summary data (e.g., treatment effect based on treatment group estimates). This inconsistency in our findings was consistent with those in the published literature. Meta‐regression based on summary statistics at the trial level is not expected to reflect causal relationships between a clinical outcome and surrogate measures. Analyses based on IPD make possible a more comprehensive analysis since all relevant data on a patient level are available. Copyright © 2004 John Wiley & Sons Ltd.

6.
The problem of estimating the ratio of the variances of two independent normal populations is considered under both quadratic and entropy losses, when the means are unknown. New classes of improved estimators are obtained with the following properties. They are smooth, improve on the risk of the best affine equivariant estimator at every parameter point, have very simple form and are based on all the available data. In the case of entropy loss, the estimators of one of these classes are, additionally, generalized Bayes. Our approach for constructing these improved estimators is based on Strawderman’s (1974) [18] technique. As a preliminary result of independent interest, new classes of dominating generalized Bayes procedures for a normal precision are also given.

7.
Value at risk (VaR) is the standard measure of market risk used by financial institutions. Interpreting the VaR as the quantile of future portfolio values conditional on current information, the conditional autoregressive value at risk (CAViaR) model specifies the evolution of the quantile over time using an autoregressive process and estimates the parameters with regression quantiles. Utilizing the criterion that each period the probability of exceeding the VaR must be independent of all the past information, we introduce a new test of model adequacy, the dynamic quantile test. Applications to real data provide empirical support to this methodology.
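The CAViaR model itself requires quantile-regression estimation, but the underlying notion of VaR as a conditional quantile, and the exceedance-frequency idea behind the dynamic quantile test, can be illustrated with a simple historical-simulation baseline. Function names and the window length are illustrative assumptions.

```python
import numpy as np

def rolling_var(returns, window=250, alpha=0.05):
    # Historical-simulation VaR: the empirical alpha-quantile of the
    # previous `window` returns serves as the one-step-ahead forecast.
    returns = np.asarray(returns, dtype=float)
    var = np.full(len(returns), np.nan)
    for t in range(window, len(returns)):
        var[t] = np.quantile(returns[t - window:t], alpha)
    return var

def hit_rate(returns, var):
    # Fraction of days the realized return falls below the VaR forecast.
    # For a well-specified model this should be close to alpha, and the
    # hit sequence should be serially independent; checking both jointly
    # is the intuition behind the dynamic quantile test.
    mask = ~np.isnan(var)
    return float(np.mean(np.asarray(returns)[mask] < var[mask]))
```

CAViaR replaces the rolling empirical quantile with an autoregressive equation for the quantile itself, fitted by minimizing the quantile-regression loss.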

8.

In this article, we introduce three new classes of multivariate risk statistics, which can be considered as data-based versions of multivariate risk measures: multivariate convex risk statistics, multivariate comonotonic convex risk statistics and multivariate empirical-law-invariant convex risk statistics. Representation results are provided, with proofs developed largely from first principles. It turns out that all the relevant existing results in the literature are special cases of those obtained in this article.

9.
In this paper, a modified exponentially weighted moving average (EWMA) statistic is proposed. The approximate distribution of the proposed modified EWMA statistic is derived. A variable acceptance sampling plan is designed using the proposed EWMA statistic. The plan parameters of the proposed sampling plan are determined such that the given producer's risk and consumer's risk are satisfied. The efficiency of the proposed plan based on the new EWMA statistic is compared with the existing EWMA plan in terms of the sample size required. The application of the proposed plan is given with the help of an example.
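The modified statistic is not specified in the abstract, but the classic EWMA recursion it builds on is Z_t = λX_t + (1 − λ)Z_{t−1}. A minimal sketch follows; the function name and the start-at-the-mean convention are assumptions of this sketch, not the paper's.

```python
import numpy as np

def ewma(x, lam=0.2, z0=None):
    # Classic EWMA recursion: Z_t = lam * X_t + (1 - lam) * Z_{t-1}.
    # Small lam gives heavy smoothing; lam = 1 reproduces the raw series.
    x = np.asarray(x, dtype=float)
    z = np.empty_like(x)
    prev = x.mean() if z0 is None else z0  # common choice: start at the mean/target
    for t, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev
        z[t] = prev
    return z
```

For i.i.d. observations with variance σ², the steady-state variance of Z_t is σ²·λ/(2 − λ), which is what makes EWMA-based plans economical: the statistic pools quality history across successive lots.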

10.
The problem of nonparametric minimum risk invariant estimation has engaged a good deal of attention in the literature, and minimum risk invariant estimators (MRIE's) have been constructed for some special statistical models. We present a new and simple method of obtaining the MRIE's of a continuous cumulative distribution function (cdf) under a general invariant loss function. All the MRIE's known from the literature can be constructed by the method presented in the article, in particular under the weighted quadratic, LINEX and entropy loss functions. This method also enables the construction of MRIE's in nonparametric statistical models which have not been considered until now. In particular, considering a family of nonparametric precautionary loss functions, a new class of MRIE's of the cdf has been found. We also give some general remarks on obtaining MRIE's and a review concerning minimaxity and admissibility of MRIE's.

11.
This article proposes a new class of copula-based dynamic models for high-dimensional conditional distributions, facilitating the estimation of a wide variety of measures of systemic risk. Our proposed models draw on successful ideas from the literature on modeling high-dimensional covariance matrices and on recent work on models for general time-varying distributions. Our use of copula-based models enables the estimation of the joint model in stages, greatly reducing the computational burden. We use the proposed new models to study a collection of daily credit default swap (CDS) spreads on 100 U.S. firms over the period 2006 to 2012. We find that while the probability of distress for individual firms has greatly reduced since the financial crisis of 2008–2009, the joint probability of distress (a measure of systemic risk) is substantially higher now than in the precrisis period. Supplementary materials for this article are available online.

12.
Let X1, X2, ..., Xn be a random sample from a normal population with mean μ and variance σ². In many real-life situations, especially in lifetime or reliability estimation, the parameter μ is known a priori to lie in an interval [a, ∞). This makes the usual maximum likelihood estimator (MLE) X̄ an inadmissible estimator of μ with respect to the squared error loss, since it may take values outside the parameter space. Katz (1961) and Gupta and Rohatgi (1980) proposed estimators which lie completely in the given interval. In this paper we derive some new estimators for μ and present a comparative study of the risk performance of these estimators. Both the known and unknown variance cases are explored. The new estimators are shown to have superior risk performance over the existing ones over large portions of the parameter space.
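A natural member of the class of interval-respecting estimators is the projection max(a, X̄). The Monte Carlo sketch below (all names illustrative, not from the paper) compares its squared-error risk with the unrestricted sample mean at the boundary μ = a, where the projection helps most.

```python
import numpy as np

def truncated_mean(x, a):
    # Project the sample mean onto the restricted parameter space [a, inf).
    return max(a, float(np.mean(x)))

def mc_mse(estimator, mu, a, n=10, sigma=1.0, reps=5000, seed=0):
    # Monte Carlo estimate of the mean squared error of `estimator` at mu.
    rng = np.random.default_rng(seed)
    errs = np.empty(reps)
    for i in range(reps):
        x = rng.normal(mu, sigma, n)
        errs[i] = (estimator(x, a) - mu) ** 2
    return errs.mean()
```

At μ = a the projection roughly halves the MSE of X̄ (negative deviations of the sample mean are clipped to zero error); far inside the interval the two estimators coincide with high probability.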

13.
The paper is concerned with an application of the information inequality for the Bayes risk (the global Cramér-Rao inequality) to nonexponential estimation problems. A new methodology for proving minimaxity is presented through the example of estimating the scale or location parameter under one-sided truncation of the parameter space.

14.
In this paper, we propose a new three-parameter long-term lifetime distribution, the long-term complementary exponential geometric distribution, induced by a latent complementary risk framework and exhibiting decreasing, increasing and unimodal hazard functions. The new distribution arises from latent complementary risk scenarios in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks, together with the presence of long-term survival. The properties of the proposed distribution are discussed, including its probability density function and explicit algebraic formulas for its reliability, hazard and quantile functions and order statistics. Parameter estimation is based on the usual maximum-likelihood approach. A simulation study assesses the performance of the estimation procedure. We compare the new distribution with its particular cases, as well as with the long-term Weibull distribution, on three real data sets, observing its potential and competitiveness in comparison with some usual long-term lifetime distributions.

15.
Acceptance sampling plans based on process yield indices provide a proven resource for the lot-sentencing problem when the required fraction defective is very low. In this study, a new sampling plan based on the exponentially weighted moving average (EWMA) model with yield index is proposed for lot sentencing under autocorrelation between polynomial profiles. The advantage of the EWMA statistic is the accumulation of quality history from previous lots. In addition, the number of profiles required for lot sentencing is more economical than in the traditional single sampling plan. Considering the acceptable quality level (AQL) at the producer's risk and the lot tolerance percent defective (LTPD) at the consumer's risk, we propose a new search algorithm to determine the optimal plan parameters. The plan parameters are tabulated for various combinations of the smoothing constant of the EWMA statistic, AQL, LTPD, and the two risks. A comparison study and two numerical examples are provided to show the applicability of the proposed sampling plan.

16.
We used two statistical methods to identify prognostic factors: a log-linear model (logistic and Cox regression, based on the notions of linearity and multiplicative relative risk), and the CORICO method (ICOnography of CORrelations) based on the geometric significance of the correlation coefficient. We applied the methods to two different situations (a "case-control study" and a "historical cohort"). We show that the geometric exploratory tool is particularly suited to the analysis of small samples with a large number of variables. It could save time when setting up new study protocols. In this instance, the geometric approach highlighted, without preconceived ideas, the potential role of multihormonality in the course of pituitary adenoma and the unexpected influence of the date of tumour excision on the risk attached to haemorrhage.

17.
A new flexible cure rate survival model is developed where the initial number of competing causes of the event of interest (say lesions or altered cells) follows a compound negative binomial (NB) distribution. This model provides a realistic interpretation of the biological mechanism of the event of interest, as it models a destructive process of the initial competing risk factors and records only the damaged portion of the original number of risk factors. It also accounts for the underlying mechanisms that lead to cure through various latent activation schemes. Our method of estimation exploits maximum likelihood (ML) tools. The methodology is illustrated on a real data set on malignant melanoma, and the finite sample behavior of the parameter estimates is explored through simulation studies.

18.
Value at risk (VaR) and expected shortfall (ES) are widely used measures of the risk of loss on a specific portfolio of financial assets. Adjusted empirical likelihood (AEL) is an important nonparametric likelihood method developed from empirical likelihood (EL); it overcomes the convex hull limitation of EL. In this paper, we use the AEL method to estimate confidence regions for VaR and ES. Theoretically, we find that AEL has the same large-sample statistical properties as EL, and it guarantees a solution to the estimating equations in EL. In addition, simulation results indicate that the coverage probabilities of the new confidence regions are higher than those of the original EL at the same nominal level. These results show that the AEL estimation of VaR and ES can be recommended for real applications.
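For intuition, the plain empirical point estimates of VaR and ES that the AEL confidence regions are built around can be computed directly. This sketch works on losses (positive values = losses) and uses illustrative names; the AEL machinery itself is not reproduced.

```python
import numpy as np

def var_es(losses, alpha=0.95):
    # Empirical VaR: the alpha-quantile of the loss distribution.
    # Empirical ES: the average of losses at or beyond the VaR,
    # i.e. the expected loss given that the VaR level is breached.
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es
```

By construction ES ≥ VaR, which is why ES is the more conservative (and tail-sensitive) of the two measures.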

19.
This paper presents a set of new tables and procedures for the selection of the following three types of Quick Switching (QS) systems for a given Acceptable Quality Level (AQL), Limiting Quality Level (LQL), producer's risk and consumer's risk.

(1) A Single sampling QS system with equal sample sizes but with different acceptance numbers

(2) A Single sampling QS system with two different sample sizes but with the same acceptance number, and

(3) A QS system with double sampling normal inspection and single sampling tightened inspection

The third type of QS system is newly presented in this paper. The tables provide unique plans for a given set of conditions, as well as a smaller sample size or a smaller sum of Average Sample Numbers (ASN) at the AQL and LQL.

20.
A multiple state repetitive group sampling (MSRGS) plan is developed on the basis of the coefficient of variation (CV) of the quality characteristic, which follows a normal distribution with unknown mean and variance. The optimal plan parameters of the proposed plan are solved by a nonlinear optimization model, which satisfies the given producer's risk and consumer's risk at the same time and minimizes the average sample number required for inspection. The advantages of the proposed MSRGS plan over the existing sampling plans are discussed. Finally, an example is given to illustrate the proposed plan.
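The CV statistic at the heart of the plan is straightforward to compute. The sketch below shows the sample CV and a deliberately simplified single-stage accept/reject rule; the actual MSRGS plan uses a repetitive (resampling) zone and optimized limits, which this sketch does not reproduce, and all names are illustrative.

```python
import numpy as np

def cv(sample):
    # Sample coefficient of variation: s / x_bar (requires a positive mean).
    # Uses the unbiased-variance convention (ddof=1) for s.
    sample = np.asarray(sample, dtype=float)
    return float(sample.std(ddof=1) / sample.mean())

def accept_lot(sample, cv_limit):
    # Hypothetical single-stage decision: accept the lot if the
    # sample CV does not exceed the specified limit.
    return cv(sample) <= cv_limit
```

In the repetitive-group version, a CV falling between an acceptance limit and a rejection limit triggers resampling rather than an immediate decision, which is what lowers the average sample number.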


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号