Similar Articles
 20 similar articles found (search time: 442 ms)
1.
Recurrence data are collected to study the recurrent events in biological, physical, and other systems. Quantities of interest include the mean cumulative number of events and the mean cumulative cost of the events. The mean cumulative function (MCF) can be estimated using non-parametric (NP) methods or by fitting parametric models, and many procedures have been suggested to construct the confidence intervals (CIs) for the MCF. This paper summarizes the results of a large simulation study that was designed to compare five CI procedures for both NP and parametric estimation. When performing parametric estimation, we assume the power law non-homogeneous Poisson process (NHPP) model. Our results include the evaluation of these procedures when they are used for window-observation recurrence data where recurrence histories of some systems are available only in observation windows with gaps in between.
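As a rough illustration of the nonparametric side of this comparison, the sketch below computes the standard NP estimate of the MCF (events at each recurrence time divided by the number of systems still under observation). It assumes every system is observed from time zero with no gaps, so it does not cover the window-observation case or any of the five CI procedures; the function name and data layout are hypothetical.

```python
import numpy as np

def mcf_nonparametric(event_times, censor_times):
    """Nonparametric estimate of the mean cumulative function (MCF).

    event_times: list of arrays, recurrence times for each system
    censor_times: end-of-observation time for each system
    Assumes every system is observed from time 0 with no observation gaps.
    """
    censor_times = np.asarray(censor_times, dtype=float)
    all_events = np.sort(np.concatenate(event_times))
    mcf_t, mcf_val, cum = [], [], 0.0
    for t in np.unique(all_events):
        at_risk = np.sum(censor_times >= t)   # systems still under observation at t
        d = np.sum(all_events == t)           # events occurring at time t
        if at_risk > 0:
            cum += d / at_risk
        mcf_t.append(t)
        mcf_val.append(cum)
    return np.array(mcf_t), np.array(mcf_val)

# Example: three systems with different observation lengths (illustrative data)
events = [np.array([5.0, 12.0, 20.0]), np.array([8.0, 15.0]), np.array([3.0])]
ends = [25.0, 18.0, 10.0]
t, mcf = mcf_nonparametric(events, ends)
```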

2.
Flood events can be caused by several different meteorological circumstances. For example, heavy rain events often lead to short flood events with high peaks, whereas snowmelt normally results in events of very long duration with a high volume. Both event types have to be considered in the design of flood protection systems. Unfortunately, all these different event types are often included in annual maximum series (AMS) leading to inhomogeneous samples. Moreover, certain event types are underrepresented in the AMS. This is especially unsatisfactory if the most extreme events result from such an event type. Therefore, monthly maximum data are used to enlarge the information spectrum on the different event types. Of course, not all events can be included in the flood statistics because not every monthly maximum can be declared as a flood. To take this into account, a mixture Peak-over-threshold model is applied, with thresholds specifying flood events of several types that occur in a season of the year. This model is then extended to cover the seasonal type of the data. The applicability is shown in a German case study, where the impact of the single event types in different parts of a year is evaluated.

3.
Rejoinder     
Kaplan–Meier graphs of survival and other timed events are an extremely effective way to summarize outcome data from some types of clinical studies, but information on dates and the interaction among events is lost. A new mode for presenting such data is proposed, an Eventchart, that aids in looking at the study as a whole by simultaneously displaying the timing of 2–3 different types of events, including censoring and time-dependent covariate events. The chart helps to monitor accrual and can be used to “guesstimate” the amount of information resulting from additional enrollment and/or follow-up. Eventcharts are designed to supplement, not replace, quantitative methods of analysis or conventional survival graphs. Data from bone marrow transplantation and AIDS studies are used to illustrate the method. Terminology is suggested for describing the concepts and distinguishing among the complex of events studied in modern clinical trials, including timed events, loss and date censoring, and time-braided events.

4.
We present a new inverse sampling design for surveys of rare events, Gap-Based Inverse Sampling. In the design, sampling stops if, after a predetermined interval, or gap, no new rare events are found. The length of the gap that follows after finding a rare event is used as a way of limiting sample effort. We present stopping rules using decisions based on the gap length, the total number of rare events found, and a fixed upper limit of survey effort. We illustrate the use of the design with stratified sampling of two biological populations. The design mirrors the intuitive behavior of a field biologist in stratified sampling: if nothing is found in a stratum after a long search, the surveyor would like to consider the stratum empty and stop searching. Our design has appeal for surveying rare events (for example, a rare species) with stratified sampling where there are likely to be some completely empty strata.
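A minimal sketch of how a stopping rule of this kind could be coded is given below; the gap length, event cap, and effort cap are hypothetical parameters, and the function is an illustration rather than the authors' exact design.

```python
def gap_based_sample(units, is_rare, gap=20, max_events=10, max_effort=200):
    """Sequentially inspect sampling units in a stratum and stop when either
    (a) `gap` consecutive units since the last rare event contain no new rare event,
    (b) `max_events` rare events have been found, or
    (c) `max_effort` units have been inspected."""
    found, effort, since_last = 0, 0, 0
    for u in units:
        effort += 1
        if is_rare(u):
            found += 1
            since_last = 0
        else:
            since_last += 1
        if since_last >= gap or found >= max_events or effort >= max_effort:
            break
    return found, effort
```

The gap condition is what limits sample effort in empty or near-empty strata: a long unproductive run ends the search in that stratum.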

5.
Excess zeros are encountered in many empirical count data applications. We provide a new explanation of extra zeros, related to the underlying stochastic process that generates events. The process has two rates: a lower rate until the first event and a higher one thereafter. We derive the corresponding distribution of the number of events during a fixed period and extend it to account for observed and unobserved heterogeneity. An application to the socioeconomic determinants of the individual number of doctor visits in Germany illustrates the usefulness of the new approach.
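To make the two-rate mechanism concrete, here is a small simulation sketch (not taken from the paper, and without the observed/unobserved heterogeneity extensions): events arrive at a lower rate lam1 until the first event occurs and at a higher rate lam2 thereafter, which inflates the probability of zero counts relative to a Poisson model with the same mean. Parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_counts(lam1, lam2, T, n_sim=100_000):
    """Number of events in [0, T] when events arrive at rate lam1 until the
    first event and at rate lam2 thereafter (a hurdle-type process)."""
    first = rng.exponential(1.0 / lam1, n_sim)          # waiting time to first event
    counts = np.zeros(n_sim, dtype=int)
    after = first < T
    counts[after] = 1 + rng.poisson(lam2 * (T - first[after]))
    return counts

counts = simulate_counts(lam1=0.3, lam2=1.5, T=1.0)
print("P(N = 0) ~", np.mean(counts == 0))   # extra zeros relative to a Poisson with the same mean
print("mean count ~", counts.mean())
```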

6.
A multiplicative seasonal forecasting model for cumulative events is presented in which, conditional on the end-of-season total being given and the seasonal shape being known, events occurring within the season are shown to be multinomially distributed. The model uses the information contained in the arrival of new events to obtain a posterior distribution for end-of-season totals. Bayesian forecasts are obtained recursively in two stages: first, by predicting the expected number and variance of event counts in future intervals within the remaining season, and then by predicting revised means and variances for end-of-season totals based on the most recent forecast error.

7.
The paper investigates parameter estimation problems in special Markov modulated counting processes. The events occurring at any state of an underlying Markov chain can be equipped with marks carrying additional information about the events. Specializing the model to the case of two-state Markov chain modulation, the so-called switched counting process, several statistical problems are studied: maximum likelihood estimators, Rao-Blackwell optimal estimators, a test of equality of the counting intensities of the two states, and minimax estimation procedures. The considerations can be applied to various practical problems, in particular in queueing and reliability models, for example failure-repair processes with alternately operating repair systems.

8.
Diagnostics for dependence within time series extremes
The analysis of extreme values within a stationary time series entails various assumptions concerning its long- and short-range dependence. We present a range of new diagnostic tools for assessing whether these assumptions are appropriate and for identifying structure within extreme events. These tools are based on tail characteristics of joint survivor functions but can be implemented by using existing estimation methods for extremes of univariate independent and identically distributed variables. Our diagnostic aids are illustrated through theoretical examples, simulation studies and by application to rainfall and exchange rate data. On the basis of these diagnostics we can explain characteristics that are found in the observed extreme events of these series and also gain insight into the properties of events that are more extreme than those observed.
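The abstract does not spell out the diagnostics themselves. As a loosely related illustration of a tail diagnostic built from joint survivor functions, the sketch below estimates the empirical quantity chi(u) = P(X > u-quantile of X | Y > u-quantile of Y), a standard measure of extremal dependence that is not necessarily one of the tools proposed in the paper; the simulated data and quantile levels are arbitrary.

```python
import numpy as np

def empirical_chi(x, y, quantiles):
    """Empirical chi(u) = P(X > q_u(X) | Y > q_u(Y)) at the given quantile levels.
    Values staying well above zero as u -> 1 suggest asymptotic dependence in the tails."""
    x, y = np.asarray(x), np.asarray(y)
    out = []
    for u in quantiles:
        qx, qy = np.quantile(x, u), np.quantile(y, u)
        denom = np.mean(y > qy)
        out.append(np.mean((x > qx) & (y > qy)) / denom if denom > 0 else np.nan)
    return np.array(out)

# Example with simulated correlated data (hypothetical, for illustration only)
rng = np.random.default_rng(1)
z = rng.standard_normal((5000, 2)) @ np.linalg.cholesky([[1, 0.7], [0.7, 1]]).T
chi = empirical_chi(z[:, 0], z[:, 1], quantiles=[0.90, 0.95, 0.99])
```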

9.
Multivariate panel count data often occur when there exist several related recurrent events or response variables defined by occurrences of related events. For univariate panel count data, several nonparametric treatment comparison procedures have been developed. However, no nonparametric procedure seems to exist for the multivariate case. Based on differences between estimated mean functions, this article proposes a class of nonparametric test procedures for multivariate panel count data. The asymptotic distribution of the new test statistics is established and a simulation study is conducted. Moreover, the new procedures are applied to a skin cancer problem that motivated this study.

10.
This paper describes a new graphical method for comparing trends in two or more series of events. The method is applicable, e.g., to observations such as the successive times to failure of two or more devices and the arrival times on succeeding days at a service facility. The graphs provide information about the relative trend and can be used to order several series with respect to rates of occurrence of events. The series are assumed independent of one another and are assumed to follow nonhomogeneous Poisson processes with mean functions and rate functions unspecified.

11.
Simulation and extremal analysis of hurricane events
In regions affected by tropical storms the damage caused by hurricane winds can be catastrophic. Consequently, accurate estimates of hurricane activity in such regions are vital. Unfortunately, the severity of events means that wind speed data are scarce and unreliable, even by standards which are usual for extreme value analysis. In contrast, records of atmospheric pressures are more complete. This suggests a two-stage approach: the development of a model describing spatiotemporal patterns of wind field behaviour for hurricane events; then the simulation of such events, using meteorological climate models, to obtain a realization of associated wind speeds whose extremal characteristics are summarized. This is not a new idea, but we apply careful statistical modelling for each aspect of the model development and simulation, taking the Gulf and Atlantic coastlines of the USA as our study area. Moreover, we address for the first time the issue of spatial dependence in extremes of hurricane events, which we find to have substantial implications for regional risk assessments.

12.
We propose a multivariate extension of the univariate chi-squared normality test. Using a known result for the distribution of quadratic forms in normal variables, we show that the proposed test statistic has an approximated chi-squared distribution under the null hypothesis of multivariate normality. As in the univariate case, the new test statistic is based on a comparison of observed and expected frequencies for specified events in sample space. In the univariate case, these events are the standard class intervals, but in the multivariate extension we propose these become hyper-ellipsoidal annuli in multivariate sample space. We assess the performance of the new test using Monte Carlo simulation. Keeping the type I error rate fixed, we show that the new test has power that compares favourably with other standard normality tests, though no uniformly most powerful test has been found. We recommend the new test due to its competitive advantages.
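A rough sketch of a statistic of this form is given below: squared Mahalanobis distances are grouped into equal-probability hyper-ellipsoidal annuli defined by chi-square quantiles and compared with their expected frequencies. The number of annuli and the equal-probability binning are assumptions made here, not the authors' prescription.

```python
import numpy as np
from scipy import stats

def ellipsoidal_chi2_stat(X, n_bins=10):
    """Chi-squared-type statistic for multivariate normality: squared Mahalanobis
    distances of the rows of X are grouped into annuli with equal probability
    under the chi-square(p) reference distribution."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', X - mu, S_inv, X - mu)   # squared Mahalanobis distances
    edges = stats.chi2.ppf(np.linspace(0, 1, n_bins + 1), df=p)
    observed, _ = np.histogram(d2, bins=edges)
    expected = np.full(n_bins, n / n_bins)                  # equal-probability annuli
    return np.sum((observed - expected) ** 2 / expected)

# Example on simulated trivariate normal data
rng = np.random.default_rng(2)
stat = ellipsoidal_chi2_stat(rng.standard_normal((500, 3)))
```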

13.
Standard unit-root and cointegration tests are sensitive to atypical events such as outliers and structural breaks. In this article, we use outlier-robust estimation techniques to examine the impact of these events on cointegration analysis. Our outlier-robust cointegration test provides a new diagnostic tool for signaling when standard cointegration results might be driven by a few aberrant observations. A main feature of our approach is that the proposed robust estimator can be used to compute weights for all observations, which in turn can be used to identify the approximate dates of atypical events. We evaluate our method using simulated data and a Monte Carlo experiment. We also present an empirical example showing the usefulness of the proposed analysis.

14.
Dual-record system estimation has been widely used in the past to estimate vital events. Because of the weakness of the statistical assumptions of the model, as well as the biases involved in the estimators, its use became limited. The proposed estimators for dual-record systems are based on further division of the cells of the original table. The results show that they reduce the underestimation of the total counts when compared with the classical Chandra Sekar-Deming estimator.
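For reference, the classical Chandra Sekar-Deming (capture-recapture) estimate of the total count from two recording systems is n1*n2/m, where n1 and n2 are the events recorded by each system and m is the number matched in both. The sketch below computes it; the cell-splitting refinement proposed in the abstract is not shown, and the example numbers are invented.

```python
def chandra_sekar_deming(n1, n2, m):
    """Classical dual-record estimate of total events: n1 * n2 / m,
    where m is the number of events matched in both systems."""
    if m == 0:
        raise ValueError("No matched events; estimate is undefined.")
    return n1 * n2 / m

# Example: 900 and 850 events recorded, 800 matched -> about 956 events estimated in total
print(chandra_sekar_deming(900, 850, 800))
```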

15.
A new analytic statistical technique for predictive event modeling in ongoing multicenter clinical trials with waiting time to response is developed. It allows for the predictive mean and predictive bounds for the number of events to be constructed over time, accounting for the newly recruited patients and patients already at risk in the trial, and for different recruitment scenarios. For modeling patient recruitment, an advanced Poisson-gamma model is used, which accounts for the variation in recruitment over time, the variation in recruitment rates between different centers and the opening or closing of some centers in the future. A few models for event appearance allowing for 'recurrence', 'death' and 'lost-to-follow-up' events and using finite Markov chains in continuous time are considered. To predict the number of future events over time for an ongoing trial at some interim time, the parameters of the recruitment and event models are estimated using current data and then the predictive recruitment rates in each center are adjusted using individual data and Bayesian re-estimation. For a typical scenario (continue to recruit during some time interval, then stop recruitment and wait until a particular number of events happens), the closed-form expressions for the predictive mean and predictive bounds of the number of events at any future time point are derived under the assumptions of Markovian behavior of the event progression. The technique is efficiently applied to modeling different scenarios for some ongoing oncology trials. Case studies are considered.
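The full machinery (interim re-estimation, Markov event models, closed-form predictive bounds) is beyond a short sketch, but the Poisson-gamma recruitment component can be illustrated roughly: each centre's recruitment rate is drawn from a gamma distribution and, given the rate, the number recruited over a horizon is Poisson. The parameter values and the plain Monte Carlo approach below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)

def predictive_recruitment(alpha, beta, n_centers, horizon, n_sim=10_000):
    """Simulate total recruitment over `horizon` when each center recruits as a
    Poisson process with a Gamma(alpha, rate=beta)-distributed rate
    (so the per-center total is negative binomial)."""
    rates = rng.gamma(alpha, 1.0 / beta, size=(n_sim, n_centers))
    totals = rng.poisson(rates * horizon).sum(axis=1)
    return totals.mean(), np.percentile(totals, [5, 95])

mean_total, bounds = predictive_recruitment(alpha=2.0, beta=1.0, n_centers=20, horizon=3.0)
```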

16.
Current design practice is usually to produce a safety system that meets a target level of performance deemed acceptable by the regulators. Safety systems are designed to prevent or alleviate the consequences of potentially hazardous events. In many modern industries the failure of such systems can lead to whole-system breakdown. In reliability analysis of complex systems involving multiple components, it is assumed that the components have different failure rates with certain probabilities. This leads to extensive computational effort when the commonly employed generating function (GF) and recursive algorithm are used to obtain the reliability of systems consisting of a large number of components. Moreover, when system failure results in fatalities it is desirable for the system to achieve an optimal rather than merely adequate level of performance given the limitations placed on available resources. This paper is concerned with developing a modified branching process, combined with the generating function, to handle reliability evaluation of a multi-robot complex system. The availability of the system is modeled to compute the failure probability of the whole system as a performance measure. The results help decision-makers in maintenance departments to analyze critical components of the system in different time periods to prevent system breakdowns.

17.
Time-to-event data are common in clinical trials evaluating the survival benefit of a new drug, biological product, or device. The commonly used parametric models, including the exponential, Weibull, Gompertz, log-logistic, and log-normal, are simply not flexible enough to capture the complex survival curves observed in clinical and medical research studies. On the other hand, the nonparametric Kaplan-Meier (KM) method is very flexible and successful at capturing the various shapes of survival curves, but it lacks the ability to predict future events, such as the time to a given number of events or the number of events at a given time, and to predict the risk of events (e.g., death) over time beyond the span of the available data from clinical trials. It is clear that neither the nonparametric KM method nor the current parametric distributions can fulfill the need to fit survival curves while retaining useful characteristics for prediction. In this paper, a fully parametric distribution constructed as a mixture of three Weibull components is explored and recommended for fitting survival data; it is as flexible as KM for the observed data but has desirable features beyond the trial period, such as predicting future events, survival probability, and the hazard function.
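A minimal sketch of the kind of model advocated above, a three-component Weibull mixture fitted by maximum likelihood to right-censored data, is given below; the parameterization (softmax weights, log shapes and scales), the optimizer, and the simulated data are illustrative choices rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def mixture_survival(t, w, shape, scale):
    """S(t) = sum_i w_i * exp(-(t/scale_i)^shape_i) for a 3-component Weibull mixture."""
    t = np.asarray(t, dtype=float)[:, None]
    return np.sum(w * np.exp(-(t / scale) ** shape), axis=1)

def neg_loglik(params, t, event):
    # params: 2 free logits for the weights + 3 log-shapes + 3 log-scales
    logits = np.concatenate([params[:2], [0.0]])
    w = np.exp(logits) / np.exp(logits).sum()
    shape, scale = np.exp(params[2:5]), np.exp(params[5:8])
    S = mixture_survival(t, w, shape, scale)
    dens = np.sum(w * (shape / scale) * (t[:, None] / scale) ** (shape - 1)
                  * np.exp(-(t[:, None] / scale) ** shape), axis=1)
    # event = 1 contributes the density, censored observations contribute the survival
    return -np.sum(np.where(event == 1, np.log(dens), np.log(S)))

# Example fit to simulated right-censored data (illustrative only)
rng = np.random.default_rng(3)
t = rng.weibull(1.5, 300) * 10
event = (t < 12).astype(int); t = np.minimum(t, 12)
fit = minimize(neg_loglik, x0=np.zeros(8), args=(t, event), method="Nelder-Mead")
```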

18.
Statistical models for recurrent events are of great interest in repairable systems reliability and maintenance. The adopted model under minimal repair maintenance is frequently a nonhomogeneous Poisson process with the power law process (PLP) intensity function. Although inference for the PLP is generally based on maximum likelihood theory, some advantages of the Bayesian approach have been reported in the literature. In this paper it is proposed that the PLP intensity be reparametrized in terms of (β,η), where β is the elasticity of the mean number of events with respect to time and η is the mean number of events for the period in which the system was actually observed. It is shown that β and η are orthogonal and that the likelihood becomes proportional to a product of gamma densities. Therefore, the family of natural conjugate priors is also a product of gammas. The idea is extended to the case that several realizations of the same PLP are observed along overlapping periods of time. Some Monte Carlo simulations are provided to study the frequentist behavior of the Bayesian estimates and to compare them with the maximum likelihood estimates. The results are applied to a real problem concerning the determination of the optimal periodicity of preventive maintenance for a set of power transformers. Prior distributions are elicited for β and η based on their operational interpretation and engineering expertise.
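To make the reparametrization concrete, here is a short derivation for a single PLP realization observed on (0, T] with failure times t_1 < ... < t_n; the notation is chosen here, and the derivation is a sketch consistent with the abstract rather than a reproduction of the paper.

```latex
% PLP intensity and its (beta, eta) reparametrization (notation assumed here)
\lambda(t) = \frac{\beta}{\theta}\left(\frac{t}{\theta}\right)^{\beta-1},
\qquad
\eta \equiv \Lambda(T) = \left(\frac{T}{\theta}\right)^{\beta}
\quad\Longrightarrow\quad
\lambda(t) = \frac{\beta\,\eta\,t^{\beta-1}}{T^{\beta}} .

% The NHPP likelihood then factorizes into separate kernels in eta and beta
L(\beta,\eta)
= \Bigl[\prod_{i=1}^{n}\lambda(t_i)\Bigr] e^{-\Lambda(T)}
\;\propto\;
\underbrace{\eta^{\,n} e^{-\eta}}_{\text{gamma kernel in } \eta}
\;\times\;
\underbrace{\beta^{\,n} \exp\!\Bigl(-\beta \sum_{i=1}^{n}\log\tfrac{T}{t_i}\Bigr)}_{\text{gamma kernel in } \beta}
```

Since the likelihood separates into a gamma-shaped factor in η and another in β, independent gamma priors on the two parameters are conjugate, which matches the product-of-gammas structure described in the abstract.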

19.
Recurrent event data arise in longitudinal studies where each study subject may experience multiple events during the follow-up. In many situations in survival studies, pairs of individuals can potentially experience recurrent events. The analysis of such data is not straightforward as it involves two kinds of dependence, namely, dependence between the individuals in the same pair and dependence among a sequence of pairs. In the present paper, we introduce a new stochastic model for the analysis of such recurrent event data. Nonparametric estimators for a bivariate survivor function are developed. Asymptotic properties of the estimators are discussed. Simulation studies are carried out to assess the finite sample properties of the estimator. We illustrate the procedure with real-life data on eye disease.

20.
The signature-based mixture representations for coherent systems are a good way to obtain distribution-free comparisons of systems. Unfortunately, these representations only hold for systems whose component lifetimes are independent and identically distributed (IID) or exchangeable (i.e., their joint distribution is invariant under permutations). In this paper we obtain comparison results for generalized mixtures, that is, for reliability functions that can be written as linear combinations of some baseline reliability functions with positive and negative coefficients. These results are based on some concepts in Graph Theory. We apply these results to obtain new comparison results for coherent systems without the IID or exchangeability assumptions by using their generalized mixture representations based on the minimal path sets.

