Similar Articles
20 similar articles found; search time 31 ms.
1.
A mixture of the MANOVA and GMANOVA models is presented. The expected value of the response matrix in this model is the sum of two matrix components. The first component represents the GMANOVA portion and the second component represents the MANOVA portion. Maximum likelihood estimators are derived for the parameters in this model, and goodness-of-fit tests are constructed for fuller models via the likelihood ratio criterion. Finally, likelihood ratio tests for general linear hypotheses are developed and a numerical example is presented.

2.
3.
This article is concerned with the problem of controlling a simple immigration-birth-death process, which represents a pest population, by the introduction of a predator in the habitat of the pests. The optimization criterion is the minimization of the expected long-run average cost per unit time. It is possible to construct an appropriate semi-Markov decision model with a finite set of states if and only if the difference between the per capita birth rate and the per capita death rate of the pests is smaller than half of the rate at which the predator is introduced in the habitat.

4.
One of the fundamental issues in analyzing microarray data is to determine which genes are expressed and which ones are not for a given group of subjects. In datasets where many genes are expressed and many are not expressed (i.e., underexpressed), a bimodal distribution for the gene expression levels often results, where one mode of the distribution represents the expressed genes and the other mode represents the underexpressed genes. To model this bimodality, we propose a new class of mixture models that utilize a random threshold value for accommodating bimodality in the gene expression distribution. Theoretical properties of the proposed model are carefully examined. We use this new model to examine the problem of differential gene expression between two groups of subjects, develop prior distributions, and derive a new criterion for determining which genes are differentially expressed between the two groups. Prior elicitation is carried out using empirical Bayes methodology in order to estimate the threshold value as well as elicit the hyperparameters for the two component mixture model. The new gene selection criterion is demonstrated via several simulations to have excellent false positive rate and false negative rate properties. A gastric cancer dataset is used to motivate and illustrate the proposed methodology.
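The bimodal expressed/underexpressed picture described in this abstract can be illustrated with an ordinary two-component normal mixture fitted by EM. This is a deliberately simplified stand-in for the paper's random-threshold mixture; the function names, initialisation scheme, and parameter values below are illustrative, not taken from the paper.

```python
import math
import random

def norm_pdf(x, mu, sd):
    """Density of N(mu, sd^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_normal(x, iters=100):
    """Fit a two-component univariate normal mixture by EM.
    Returns (weight of component 1, mu1, sd1, mu2, sd2)."""
    xs = sorted(x)
    n = len(xs)
    # crude initialisation at the lower and upper quartiles
    mu1, mu2 = xs[n // 4], xs[3 * n // 4]
    sd1 = sd2 = (xs[-1] - xs[0]) / 4 or 1.0
    w = 0.5
    for _ in range(iters):
        # E-step: posterior probability that each point belongs to component 1
        r = []
        for xi in x:
            p1 = w * norm_pdf(xi, mu1, sd1)
            p2 = (1 - w) * norm_pdf(xi, mu2, sd2)
            r.append(p1 / (p1 + p2))
        # M-step: responsibility-weighted moment estimates
        n1 = sum(r)
        w = n1 / n
        mu1 = sum(ri * xi for ri, xi in zip(r, x)) / n1
        mu2 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / (n - n1)
        sd1 = math.sqrt(sum(ri * (xi - mu1) ** 2 for ri, xi in zip(r, x)) / n1)
        sd2 = math.sqrt(sum((1 - ri) * (xi - mu2) ** 2 for ri, xi in zip(r, x)) / (n - n1))
    return w, mu1, sd1, mu2, sd2
```

On well-separated simulated "expression levels" (e.g. an underexpressed mode near 0 and an expressed mode near 4), the EM estimates recover the two modes and the mixing weight.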

5.
The development of models and methods for cure rate estimation has recently burgeoned into an important subfield of survival analysis. Much of the literature focuses on the standard mixture model. Recently, process-based models have been suggested. We focus on several models based on first passage times for Wiener processes. Whitmore and others have studied these models in a variety of contexts. Lee and Whitmore (Stat Sci 21(4):501–513, 2006) give a comprehensive review of a variety of first hitting time models and briefly discuss their potential as cure rate models. In this paper, we study the Wiener process with negative drift as a possible cure rate model, but the resulting defective inverse Gaussian model is found to provide a poor fit in some cases. Several possible modifications are then suggested, which improve on the defective inverse Gaussian model. These modifications include: the inverse Gaussian cure rate mixture model; a mixture of two inverse Gaussian models; incorporation of heterogeneity in the drift parameter; and the addition of a second absorbing barrier to the Wiener process, representing an immunity threshold. This class of process-based models is a useful alternative to the standard model and provides an improved fit compared to the standard model when applied to many of the datasets that we have studied. Implementation of this class of models is facilitated using expectation-maximization (EM) algorithms and variants thereof, including the gradient EM algorithm. Parameter estimates for each of these EM algorithms are given and the proposed models are applied to both real and simulated data, where they perform well.
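The "defective" feature mentioned above has a simple closed form: a Wiener process drifting away from an absorbing barrier reaches it with probability less than one, and the shortfall is the cure fraction. The sketch below states that formula and checks it by Monte Carlo with an Euler scheme; the orientation (process started at 0, barrier above, negative drift) and all parameter values are illustrative assumptions, not the paper's specification.

```python
import math
import random

def cure_fraction(drift, barrier, sigma=1.0):
    """P(the process drift*t + sigma*W(t), started at 0, never reaches
    barrier > 0). Requires drift < 0; with drift >= 0 the barrier is hit
    with probability one and the cure fraction is zero."""
    return 1.0 - math.exp(2.0 * drift * barrier / sigma ** 2)

def hit_fraction(drift, barrier, sigma=1.0, n_paths=1500,
                 horizon=15.0, dt=0.01, rng=random):
    """Monte Carlo estimate of the hitting probability via an Euler scheme.
    Discrete monitoring slightly underestimates crossings, so agreement with
    the closed form is only approximate."""
    hits = 0
    steps = int(horizon / dt)
    for _ in range(n_paths):
        x = 0.0
        for _ in range(steps):
            x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
            if x >= barrier:
                hits += 1
                break
    return hits / n_paths
```

With drift -0.5, barrier 1 and unit volatility, the hitting probability is exp(-1), so roughly 63% of subjects are "cured" under this toy parameterisation.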

6.
We consider a stochastic differential equation involving standard and fractional Brownian motion with an unknown drift parameter to be estimated. We investigate the standard maximum likelihood estimate of the drift parameter, two non-standard estimates and three estimates for sequential estimation. Strong consistency and some other properties of these estimators are proved. The linear model and the Ornstein–Uhlenbeck model are studied in detail. As an auxiliary result, the asymptotic behaviour of the fractional derivative of the fractional Brownian motion is established.

7.
We present a hierarchical frailty model based on distributions derived from non-negative Lévy processes. The model may be applied to data with several levels of dependence, such as family data or other general clusters, and is an alternative to additive frailty models. We present several parametric examples of the model, and properties such as expected values, variance and covariance. The model is applied to a case-cohort sample of age at onset for melanoma from the Swedish Multi-Generation Register, organized in nuclear families of parents and one or two children. We compare the genetic component of the total frailty variance to the common environmental term, and estimate the effect of birth cohort and gender.

8.
We present a Bayesian model selection approach to estimate the intrinsic dimensionality of a high-dimensional dataset. To this end, we introduce a novel formulation of the probabilistic principal component analysis model based on a normal-gamma prior distribution. In this context, we exhibit a closed-form expression of the marginal likelihood which allows us to infer an optimal number of components. We also propose a heuristic based on the expected shape of the marginal-likelihood curve in order to choose the hyperparameters. In nonasymptotic frameworks, we show on simulated data that this exact dimensionality selection approach is competitive with both Bayesian and frequentist state-of-the-art methods.

9.
In this paper, we investigate the price of the zero-coupon defaultable bond under a structural-form credit risk model with regime switching. We model the value of a firm and the default threshold by two dependent regime-switching jump-diffusion processes, in which the Markov chain represents the states of an economy. The price is associated with the Laplace transform of the first passage time and the expected discounted ratio of the firm value to the default threshold at default. Closed-form results used for calculating the price are derived when the jump sizes follow a regime-switching double exponential distribution. We present some numerical results for the price of the zero-coupon defaultable bond via the Gaver–Stehfest algorithm.
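The Gaver–Stehfest algorithm used in the final step is a standard numerical Laplace inversion scheme, and it is short enough to sketch in full. The implementation below uses the classical Stehfest weights with an even number of terms; the test transforms (1/(s+1) and 1/s²) are textbook examples, not quantities from the paper.

```python
import math

def stehfest_coeffs(N):
    """Stehfest weights V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)
                  / (math.factorial(N // 2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V.append((-1) ** (N // 2 + k) * s)
    return V

def invert_laplace(F, t, N=12):
    """Gaver-Stehfest inversion: evaluate the inverse transform of F at t > 0.
    Works well for smooth targets; N around 12-16 in double precision."""
    V = stehfest_coeffs(N)
    ln2 = math.log(2.0)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))
```

In a bond-pricing application one would pass the model's Laplace transform of the first passage time as `F`; here it is verified on transforms with known inverses.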

10.
We develop Bayesian inference methods for a recently emerging type of epigenetic data to study the transmission fidelity of DNA methylation patterns over cell divisions. The data consist of parent-daughter double-stranded DNA methylation patterns with each pattern coming from a single cell and represented as an unordered pair of binary strings. The data are technically difficult and time-consuming to collect, putting a premium on an efficient inference method. Our aim is to estimate rates for the maintenance and de novo methylation events that gave rise to the observed patterns, while accounting for measurement error. We model data at multiple sites jointly, thus using whole-strand information, and considerably reduce confounding between parameters. We also adopt a hierarchical structure that allows for variation in rates across sites without an explosion in the effective number of parameters. Our context-specific priors capture the expected stationarity, or near-stationarity, of the stochastic process that generated the data analyzed here. This expected stationarity is shown to greatly increase the precision of the estimation. Applying our model to a data set collected at the human FMR1 locus, we find that measurement errors, generally ignored in similar studies, occur at a non-trivial rate (inappropriate bisulfite conversion error: 1.6% with 80% CI: 0.9-2.3%). Accounting for these errors has a substantial impact on estimates of key biological parameters. The estimated average failure of maintenance rate and daughter de novo rate decline from 0.04 to 0.024 and from 0.14 to 0.07, respectively, when errors are accounted for. Our results also provide evidence that de novo events may occur on both parent and daughter strands: the median parent and daughter de novo rates are 0.08 (80% CI: 0.04-0.13) and 0.07 (80% CI: 0.04-0.11), respectively.
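The maintenance / de novo mechanism behind these data can be sketched as a toy forward model: each parent-strand site is copied to the daughter strand with the maintenance probability if methylated, or methylated anew with the de novo probability if not. This sketch deliberately ignores the measurement error, hierarchy, and strand symmetry that the paper handles; all names and rate values are illustrative.

```python
import random

def transmit(parent, maintenance, de_novo, rng=random):
    """One replication round: daughter-strand methylation states (0/1)
    given the parent strand. Toy model with no measurement error."""
    return [
        (1 if rng.random() < maintenance else 0) if site == 1
        else (1 if rng.random() < de_novo else 0)
        for site in parent
    ]

def estimate_rates(pairs):
    """Moment estimates: maintained fraction among parent-methylated sites
    and de novo fraction among parent-unmethylated sites."""
    kept = meth = novo = unmeth = 0
    for parent, daughter in pairs:
        for p, d in zip(parent, daughter):
            if p == 1:
                meth += 1
                kept += d
            else:
                unmeth += 1
                novo += d
    return kept / meth, novo / unmeth
```

With enough simulated parent-daughter pairs, the simple counting estimates recover the rates used in the simulation; the paper's Bayesian machinery is needed precisely because real data add error and site-to-site variation on top of this.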

11.
The main characteristic of a load sharing system is that after the failure of one component the surviving components have to shoulder extra load and hence are prone to failure at an earlier time than what is expected under the original model. In other systems, the failure of one component may release extra resources to the survivors, thus delaying system failure. In this paper we consider such m-component systems and some observation schemes and identifiability issues under them. Then we construct a general semiparametric multivariate family of distributions which explicitly models this phenomenon through proportional conditional hazards. We suggest estimators for the constant of proportionality. We propose a nonparametric test of the hypothesis that the failures take place independently according to a common distribution against the alternative that the second failure takes place earlier than warranted, study its properties, and illustrate its use.
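The proportional-conditional-hazards idea has a particularly clean special case worth sketching: a two-component parallel system with exponential lifetimes, where the survivor's hazard is multiplied by a constant after the first failure. This is a minimal illustration under assumed exponential lifetimes, not the paper's general semiparametric family.

```python
import random

def system_lifetime(lam, gamma, rng=random):
    """Two-component parallel system with load sharing: both components
    start with exponential(lam) lifetimes; after the first failure the
    survivor's hazard is multiplied by gamma (gamma > 1 means extra load
    accelerates failure, gamma < 1 means released resources delay it)."""
    first = min(rng.expovariate(lam), rng.expovariate(lam))
    # by memorylessness, the survivor's residual life is exponential(gamma*lam)
    return first + rng.expovariate(gamma * lam)
```

In this special case the mean system lifetime is 1/(2*lam) + 1/(gamma*lam), which the simulation reproduces; gamma = 1 recovers independent failures.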

12.
13.
We present two stochastic models that describe the relationship between biomarker process values at random time points, event times, and a vector of covariates. In both models the biomarker processes are degradation processes that represent the decay of systems over time. In the first model the biomarker process is a Wiener process whose drift is a function of the covariate vector. In the second model the biomarker process is taken to be the difference between a stationary Gaussian process and a time drift whose drift parameter is a function of the covariates. For both models we present statistical methods for estimation of the regression coefficients. The first model is useful for predicting the residual time from study entry to the time a critical boundary is reached while the second model is useful for predicting the latency time from the infection until the time the presence of the infection is detected. We present our methods principally in the context of conducting inference in a population of HIV infected individuals.

14.
Two problems need to be solved before proper advice can be given to couples undergoing in vitro fertilization therapy. Firstly, does the long-run success rate really converge to 100%? Secondly, what success rate can be expected within a reasonable finite number of cycles? We propose a model based on a Weibull distribution. Data on 23,520 couples were used to calculate the cumulative pregnancy rate.
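A Weibull model for the cumulative success rate by cycle n is simple to write down, and answering the first question above amounts to asking whether the long-run ceiling is 1. The sketch below includes an optional ceiling parameter below 100% (a common cure-fraction style modification); the parameterisation and all numerical values are illustrative, not fitted to the paper's 23,520 couples.

```python
import math

def cumulative_success(n_cycles, scale, shape, ceiling=1.0):
    """Cumulative probability of success within n_cycles treatment cycles
    under a Weibull model. A `ceiling` below 1 caps the long-run success
    rate below 100% (some couples may never succeed)."""
    return ceiling * (1.0 - math.exp(-(n_cycles / scale) ** shape))
```

For example, with scale 3, shape 1.1 and a 70% ceiling, the curve rises steeply over the first few cycles and then flattens toward 0.7 rather than toward 1.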

15.
In this article, we study a robust optimal investment and reinsurance problem for a general insurance company which holds shares of an insurance company and a reinsurance company. Assuming that the claim process is described by a Brownian motion with drift, the insurer can purchase proportional reinsurance, and both the insurer and the reinsurer can invest in a risk-free asset and a risky asset. Moreover, the general insurance company's manager is an ambiguity-averse manager (AAM) who worries about uncertainty in model parameters. The AAM's objective is to maximize the minimal expected exponential utility of the weighted sum surplus process of the insurer and the reinsurer. Using techniques of stochastic control theory, we first derive closed-form expressions for the optimal strategies and the corresponding value function, and then give the verification theorem. Finally, we present numerical examples to illustrate the effects of model parameters on the optimal investment and reinsurance strategies, and analyze utility losses from ignoring model uncertainty.

16.
In this paper, we consider an ergodic diffusion process with jumps whose drift coefficient depends on an unknown parameter. We suppose that the process is discretely observed. We introduce an estimator based on a contrast function, which is efficient without requiring any conditions on the rate at which the step discretization goes to zero, and where we allow the observed process to have nonsummable jumps. This extends earlier results where the condition on the step discretization was needed and where the process was supposed to have summable jumps. In general situations, our contrast function is not explicit and one has to resort to some approximation. In the case of a finite jump activity, we propose explicit approximations of the contrast function such that the efficient estimation of the drift parameter is feasible. This extends the results obtained by Kessler in the case of continuous processes.
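The flavour of contrast-based drift estimation from discrete observations can be shown on the simplest continuous case, an Ornstein-Uhlenbeck process with no jumps: minimising the quadratic contrast over the drift parameter gives a closed-form least-squares estimator. This is a toy version of the setting above (no jumps, known volatility); the parameter values are illustrative.

```python
import math
import random

def simulate_ou(theta, sigma, dt, n, x0=0.0, rng=random):
    """Euler scheme for dX = -theta*X dt + sigma dW (no jumps in this sketch)."""
    x = [x0]
    for _ in range(n):
        x.append(x[-1] - theta * x[-1] * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1))
    return x

def drift_estimate(x, dt):
    """Least-squares contrast estimator of theta from discrete observations:
    regress the increments X_{i+1} - X_i on -X_i * dt."""
    num = sum(xi * (xj - xi) for xi, xj in zip(x[:-1], x[1:]))
    den = dt * sum(xi * xi for xi in x[:-1])
    return -num / den
```

The estimator concentrates around the true theta as the observation horizon n*dt grows; the paper's contribution is making this kind of estimation efficient when jumps (even nonsummable ones) are present and the contrast is no longer explicit.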

17.
The two-component mixture cure rate model is popular in cure rate data analysis, with the proportional hazards and accelerated failure time (AFT) models being the major competitors for modelling the latency component. [Wang, L., Du, P., and Liang, H. (2012), ‘Two-Component Mixture Cure Rate Model with Spline Estimated Nonparametric Components’, Biometrics, 68, 726–735] first proposed a nonparametric mixture cure rate model where the latency component assumes proportional hazards with nonparametric covariate effects in the relative risk. Here we consider a mixture cure rate model where the latency component assumes AFTs with nonparametric covariate effects in the acceleration factor. Besides having a more direct physical interpretation than proportional hazards, our model has an additional scalar parameter which adds more complication to the computational algorithm as well as the asymptotic theory. We develop a penalised EM algorithm for estimation together with confidence intervals derived from the Louis formula. Asymptotic convergence rates of the parameter estimates are established. Simulations and the application to a melanoma study show the advantages of our new method.

18.
We statistically analyze a multivariate Heath-Jarrow-Morton diffusion model with stochastic volatility. The volatility process of the first factor is left totally unspecified while the volatility of the second factor is the product of an unknown process and an exponential function of time to maturity. This exponential term includes some real parameter measuring the rate of increase of the second factor as time goes to maturity. From historical data, we efficiently estimate the time to maturity parameter in the sense of constructing an estimator that achieves an optimal information bound in a semiparametric setting. We also nonparametrically identify the paths of the volatility processes and achieve minimax bounds. We address the problem of degeneracy that occurs when the dimension of the process is greater than two, and give in particular optimal limit theorems under suitable regularity assumptions on the drift process. We consistently analyze the numerical behavior of our estimators on simulated and real datasets of prices of forward contracts on electricity markets.

19.
In a load-sharing system, the failure of a component affects the residual lifetime of the surviving components. We propose a model for the load-sharing phenomenon in k-out-of-m systems. The model is based on exponentiated conditional distributions of the order statistics formed by the failure times of the components. For an illustration, we consider two component parallel systems with the initial lifetimes of the components having Weibull and linear failure rate distributions. We analyze one data set to show that the proposed model may be a better fit than the model based on sequential order statistics.

20.
We propose an evidence synthesis approach through a degradation model to estimate causal influences of physiological factors on myocardial infarction (MI) and coronary heart disease (CHD). For instance, several studies give incidences of MI and CHD for different age strata, while other studies give relative or absolute risks for strata of main risk factors of MI or CHD. Evidence synthesis allows incorporating these disparate pieces of information into a single model. To do this, we need to develop a sufficiently general dynamical model; we also need to estimate the distribution of explanatory factors in the population. We develop a degradation model for both MI and CHD using a Brownian motion with drift, and the drift is modeled as a function of indicators of obesity, lipid profile, inflammation and blood pressure. Conditionally on these factors, the times to MI or CHD have inverse Gaussian ( ${\mathcal{IG}}$ ) distributions. The results we want to fit are generally not conditional on all the factors, and thus we need marginal distributions of the time of occurrence of MI and CHD; this leads us to manipulate the inverse Gaussian normal distribution ( ${\mathcal{IGN}}$ ), an ${\mathcal{IG}}$ whose drift parameter has a normal distribution. Another possible model arises if a factor modifies the threshold; this leads us to define an extension of ${\mathcal{IGN}}$ obtained when both drift and threshold parameters have normal distributions. We applied the model to results published in five important studies of MI and CHD and their risk factors. The fit of the model using the evidence synthesis approach was satisfactory and the effects of the four risk factors were highly significant.
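The IGN manipulation described above (an inverse Gaussian first-passage density whose drift is itself normally distributed) can be sketched numerically: write the first-passage-time density of a Brownian motion with drift, then integrate it over a normal distribution of the drift by quadrature. The unit threshold, unit volatility, and grid settings below are illustrative assumptions, not the paper's parameterisation.

```python
import math

def ig_fpt_density(t, nu, a=1.0, sigma=1.0):
    """First-passage-time density of nu*t + sigma*W(t) to the level a > 0.
    For nu <= 0 this density is defective (integrates to less than one)."""
    return (a / (sigma * math.sqrt(2.0 * math.pi * t ** 3))
            * math.exp(-(a - nu * t) ** 2 / (2.0 * sigma ** 2 * t)))

def ign_density(t, m, s, a=1.0, sigma=1.0, ngrid=400):
    """Marginal (IGN-style) density when the drift is N(m, s^2):
    trapezoidal quadrature over a +/- 6 standard deviation grid."""
    lo, hi = m - 6.0 * s, m + 6.0 * s
    h = (hi - lo) / ngrid
    total = 0.0
    for i in range(ngrid + 1):
        nu = lo + i * h
        phi = math.exp(-0.5 * ((nu - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))
        wt = 0.5 if i in (0, ngrid) else 1.0
        total += wt * ig_fpt_density(t, nu, a, sigma) * phi
    return total * h
```

As the drift's standard deviation s shrinks to zero, the marginal collapses back to the plain inverse Gaussian density at the mean drift, which gives a convenient sanity check.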
