Similar Articles
1.
Data is rapidly increasing in volume and velocity, and the Internet of Things (IoT) is one important source of this data. The IoT is a collection of connected devices (things) which constantly record data from their surroundings using on-board sensors. These devices can record and stream data to the cloud at a very high rate, leading to high storage and analysis costs. To ameliorate these costs, the data is modelled as a stream and analysed online to learn about the underlying process, perform interpolation and smoothing, and make forecasts and predictions. Conventional state space modelling tools assume the observations occur on a fixed regular time grid. However, many sensors change their sampling frequency, sometimes adaptively, get interrupted and re-started out of sync with the previous sampling grid, or simply generate event data at irregular times. It is therefore desirable to model the system as a partially and irregularly observed Markov process which evolves in continuous time. Both the process and the observation model are potentially non-linear, so particle filters represent the simplest approach to online analysis. A functional Scala library of composable continuous time Markov process models has been developed in order to model the wide variety of data captured in the IoT.
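A minimal sketch of the kind of analysis described above: a bootstrap particle filter for a latent continuous-time Ornstein-Uhlenbeck process observed at irregular times through Poisson counts. The sketch is in Python rather than the paper's Scala library, and the dynamics, observation model, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma = 0.5, 1.0, 0.4              # assumed OU parameters

def propagate(x, dt):
    """Exact OU transition over an irregular time step dt."""
    mean = mu + (x - mu) * np.exp(-theta * dt)
    var = sigma**2 / (2 * theta) * (1 - np.exp(-2 * theta * dt))
    return mean + np.sqrt(var) * rng.standard_normal(x.shape)

def particle_filter(times, counts, n_particles=1000):
    x = rng.normal(mu, 0.5, n_particles)       # initial particle cloud
    t_prev, means = 0.0, []
    for t, y in zip(times, counts):
        x = propagate(x, t - t_prev)           # evolve in continuous time
        lam = np.exp(x)                        # log-intensity observation link
        w = np.exp(y * np.log(lam) - lam)      # Poisson likelihood weights
        w /= w.sum()
        means.append(np.sum(w * x))            # filtering mean
        x = rng.choice(x, n_particles, p=w)    # multinomial resampling
        t_prev = t
    return np.array(means)

times = np.cumsum(rng.exponential(0.3, 50))    # irregular sampling times
counts = rng.poisson(np.exp(mu), len(times))   # stand-in sensor stream
print(particle_filter(times, counts)[:5])
```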

2.
Panel count data occur in many fields, and a number of approaches have been developed for their analysis. However, most of these approaches address situations where there is no terminal event and the observation process is independent of the underlying recurrent event process, either unconditionally or conditional on the covariates. In this paper, we discuss a more general situation where the observation process is informative and there exists a terminal event which precludes further occurrence of the recurrent events of interest. For the analysis, a semiparametric transformation model is presented for the mean function of the underlying recurrent event process among survivors. To estimate the regression parameters, an estimating equation approach is proposed in which an inverse survival probability weighting technique is used. The asymptotic distribution of the proposed estimates is provided. Simulation studies are conducted and suggest that the proposed approach works well in practical situations. An illustrative example is provided. The Canadian Journal of Statistics 41: 174–191; 2013 © 2012 Statistical Society of Canada
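The inverse survival probability weighting ingredient can be sketched as follows: a panel observation at time t is up-weighted by the reciprocal of the estimated probability of remaining free of the terminal event up to t. The Kaplan-Meier estimator, the simulated data, and this simple weight form are illustrative assumptions; the paper's estimating equations are not reproduced.

```python
import numpy as np

def km_surv(time, event):
    """Kaplan-Meier survival curve (no tie handling, for illustration)."""
    order = np.argsort(time)
    t_sorted, e_sorted = time[order], event[order]
    n = len(time)
    s, cur = np.ones(n), 1.0
    for i, d in enumerate(e_sorted):
        if d:
            cur *= 1.0 - 1.0 / (n - i)   # n - i subjects still at risk
        s[i] = cur
    def S(t):
        idx = np.searchsorted(t_sorted, t, side="right") - 1
        return 1.0 if idx < 0 else s[idx]
    return S

rng = np.random.default_rng(0)
death = rng.exponential(5.0, 200)          # terminal event times
censor = rng.uniform(0, 8, 200)            # independent censoring
time = np.minimum(death, censor)
event = (death <= censor).astype(int)
S_hat = km_surv(time, event)
obs_t = 2.0                                # a panel observation time
print(1.0 / S_hat(obs_t))                  # inverse survival probability weight
```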

3.
In this paper we consider a Markovian perfect debugging model in which software failure is caused by two types of faults, one that is easily detected and another that is difficult to detect. When a failure occurs, perfect debugging is immediately performed, reducing the fault content by one. We also treat the debugging time as a variable to develop a new debugging model. Based on the perfect debugging model, we propose an optimal software release policy that satisfies requirements on both software reliability and the expected number of faults to be removed before the software is released. Several measures, including the distribution of the first passage time to a specified number of removed faults, are also obtained using the proposed debugging model.
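The two-fault-type failure mechanism can be illustrated with a small Monte Carlo sketch: failures arrive at rate n1*l1 + n2*l2 for the easy and hard fault counts, each perfect (here instantaneous) debug removes one fault, and we simulate the first passage time to k removed faults. All rates are illustrative assumptions, and the variable debugging time of the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def first_passage_time(n1, n2, l1, l2, k):
    """Time until k faults are removed (perfect, instantaneous debugging)."""
    t, removed = 0.0, 0
    while removed < k and (n1 + n2) > 0:
        rate = n1 * l1 + n2 * l2
        t += rng.exponential(1.0 / rate)       # time to the next failure
        if rng.random() < n1 * l1 / rate:      # which fault type failed?
            n1 -= 1                            # easy-to-detect fault removed
        else:
            n2 -= 1                            # hard-to-detect fault removed
        removed += 1
    return t

samples = [first_passage_time(20, 10, 0.10, 0.02, 15) for _ in range(5000)]
print(np.mean(samples), np.std(samples))       # first-passage-time summary
```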

4.
Fractals and their dimensions have been a subject of great mathematical interest since the publication of Mandelbrot's manifestoes (1977, 1982). This paper discusses some empirical results indicating the potential usefulness of the estimated fractal dimension in testing for white noise. These tests are applied for model identification in time series, and results for previously analyzed data are provided. A method for fractal interpolation of a continuous process from a finite number of observations is discussed, as well as some future research directions.
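A hedged sketch of how an estimated fractal dimension can flag white noise, using Higuchi's estimator as an assumed stand-in for the paper's method: the graph of an i.i.d. series is maximally rough (dimension near 2), while a random walk gives a value near 1.5.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi fractal dimension of a 1-d series (illustrative version)."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)           # subsampled path
            if len(idx) < 2:
                continue
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / (len(idx) - 1) / k  # Higuchi normalisation
            lengths.append(diff * norm / k)
        lk.append(np.mean(lengths))
    k = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k), np.log(lk), 1)
    return slope                                # dimension estimate

rng = np.random.default_rng(3)
print(higuchi_fd(rng.standard_normal(2000)))             # ~2 for white noise
print(higuchi_fd(np.cumsum(rng.standard_normal(2000))))  # ~1.5 for a random walk
```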

5.
The log-Gaussian Cox process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly stochastic structure: it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes, and we compare them with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data.
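The doubly stochastic structure can be made concrete with a short simulation sketch on a grid: a latent Gaussian field (second level) drives a log-intensity, and cell counts are Poisson given that field (first level). The exponential covariance and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
g = 30                                     # grid side; cells of area (1/g)^2
xy = np.array([(i, j) for i in range(g) for j in range(g)]) / g
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
cov = 1.0 * np.exp(-d / 0.2)               # exponential covariance (level 2)
L = np.linalg.cholesky(cov + 1e-8 * np.eye(g * g))
field = 4.0 + L @ rng.standard_normal(g * g)   # latent Gaussian field
intensity = np.exp(field) / g**2               # intensity integrated per cell
counts = rng.poisson(intensity)                # Poisson counts (level 1)
print(counts.sum(), "points realised from the LGCP")
```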

6.
State-space models are widely used in ecology. However, it is well known that in practice it can be difficult to estimate both the process and observation variances that occur in such models. We consider this issue for integrated population models, which incorporate state-space models for population dynamics. To some extent, the mechanism of integrated population models protects against this problem, but it can still arise, and two illustrations are provided, in each of which the observation variance is estimated as zero. In the context of an extended case study involving data on British grey herons, we consider alternative approaches for dealing with the problem when it occurs. In particular, we consider penalised likelihood, a method based on fitting splines, and a method of pseudo-replication undertaken via a simple bootstrap procedure. For the case study of the paper, it is shown that when a zero estimate of the observation variance occurs, it is unimportant for inference on the model parameters of primary interest. This unexpected finding is supported by a simulation study.
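A stripped-down sketch of the boundary problem and one remedy, for a local-level state-space model: the likelihood can be maximised with the observation variance at or near zero, and a penalty on the log-variance keeps the estimate off the boundary. The quadratic penalty used here is an illustrative choice, not the paper's penalised-likelihood formulation, and the model is far simpler than an integrated population model.

```python
import numpy as np
from scipy.optimize import minimize

def kalman_negloglik(params, y, penalty=0.0):
    log_q, log_r = params                  # process / observation log-variances
    q, r = np.exp(log_q), np.exp(log_r)
    m, p, nll = y[0], 1.0, 0.0
    for yt in y[1:]:
        p = p + q                          # predict the level
        f = p + r                          # innovation variance
        nll += 0.5 * (np.log(2 * np.pi * f) + (yt - m) ** 2 / f)
        k = p / f                          # Kalman gain
        m, p = m + k * (yt - m), (1 - k) * p
    return nll + penalty * log_r**2        # illustrative penalty on log r

rng = np.random.default_rng(5)
x = np.cumsum(rng.normal(0, 0.3, 300))     # latent level (random walk)
y = x + rng.normal(0, 0.1, 300)            # small observation noise
for pen in (0.0, 1.0):
    fit = minimize(kalman_negloglik, x0=[0.0, 0.0], args=(y, pen),
                   method="Nelder-Mead")
    print(pen, np.exp(fit.x))              # estimated (q, r)
```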

7.
Joint modeling of degradation and failure time data
This paper surveys some approaches to modeling the relationship between failure time data and covariate data such as internal degradation and external environmental processes. These models, which reflect the dependency between system state and system reliability, include threshold models and hazard-based models. In particular, we consider the class of degradation–threshold–shock models (DTS models), in which failure is due to the competing causes of degradation and trauma. For this class of reliability models we express the failure time in terms of degradation and covariates. We compute the survival function of the resulting failure time and derive the likelihood function for the joint observation of failure times and degradation data at discrete times. We consider a special class of DTS models where degradation is modeled by a process with stationary independent increments and related to external covariates through a random time scale, and we extend this model class to repairable items by a marked point process approach. The proposed model class provides a rich conceptual framework for the study of degradation–failure issues.
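An illustrative simulation of a simple DTS mechanism (all functional forms and parameters below are assumptions, not the paper's specification): degradation is a Wiener process with drift, and the item fails either when degradation first crosses a threshold or when a trauma shock arrives, with a trauma hazard that increases with the current degradation level.

```python
import numpy as np

rng = np.random.default_rng(11)

def dts_failure(mu=0.5, sigma=0.3, threshold=5.0, kappa=0.02, dt=0.01, tmax=50):
    """Return (failure time, cause) for one item (Euler discretisation)."""
    x, t = 0.0, 0.0
    while t < tmax:
        t += dt
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if x >= threshold:
            return t, "degradation"               # threshold crossing
        if rng.random() < kappa * max(x, 0.0) * dt:  # trauma hazard ~ kappa * x
            return t, "trauma"
    return tmax, "censored"

sims = [dts_failure() for _ in range(2000)]
times = np.array([s[0] for s in sims])
causes = [s[1] for s in sims]
print(np.mean(times), causes.count("degradation") / len(sims))
```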

8.
Reliability is a major concern in software development because unreliable software can cause hazardous failures in the computer system. One way to enhance the reliability of software is to detect and remove faults during the testing phase, which begins with module testing, wherein modules are tested independently to remove a substantial number of faults within a limited resource. The available resource must therefore be allocated among the modules in such a way that as many faults as possible are removed from each module, to achieve higher software reliability. In this article, we discuss the problem of optimally allocating the testing resource for a modular software system so as to maximize the number of faults removed, subject to the conditions that the amount of testing-effort is fixed, a certain percentage of faults is to be removed, and a desired level of reliability is to be achieved. The problem is formulated as a nonlinear programming problem (NLPP), modeled by an inflection S-shaped software reliability growth model (SRGM) based on a nonhomogeneous Poisson process (NHPP) that incorporates an exponentiated Weibull (EW) testing-effort function. A solution procedure using a dynamic programming technique is then developed to solve the NLPP. Furthermore, three special cases of optimum resource allocation are also discussed. Finally, numerical examples using three sets of software failure data are presented to illustrate the procedure and to validate the performance of the proposed strategies. Experimental results indicate that the proposed strategies may help software project managers make better decisions in allocating the testing resource. In addition, the results are compared with those of Kapur et al. (2004), Huang and Lyu (2005), and Jha et al. (2010), which address similar problems; the comparison reveals that the proposed dynamic programming method for the testing-resource allocation problem yields a gain in efficiency over the other methods.
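The model ingredients can be sketched numerically as follows, using the usual inflection S-shaped NHPP mean value function driven by a cumulative exponentiated Weibull testing-effort function. The exact parameterisations and all values here are illustrative assumptions, and the dynamic programming allocation itself is not reproduced.

```python
import numpy as np

def ew_effort(t, alpha, beta, gamma, theta):
    """Cumulative exponentiated Weibull testing-effort W(t) (assumed form)."""
    return alpha * (1.0 - np.exp(-beta * t**gamma)) ** theta

def mean_faults(t, a, b, psi, alpha, beta, gamma, theta):
    """Inflection S-shaped NHPP mean value function driven by W(t)."""
    w = ew_effort(t, alpha, beta, gamma, theta)
    return a * (1.0 - np.exp(-b * w)) / (1.0 + psi * np.exp(-b * w))

t = np.linspace(0, 30, 7)
# a: total faults, b: detection rate, psi: inflection factor; rest: W(t)
print(mean_faults(t, 100.0, 0.05, 2.0, 50.0, 0.1, 1.2, 1.5))
```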

9.
This paper discusses a class of Markov zero-inflated Poisson regression models for a time series of counts with excess zeros relative to a Poisson distribution, in which the frequency distribution changes according to an underlying two-state Markov chain. Features of the proposed model, an estimation method based on the EM and quasi-Newton algorithms, and other implementation issues are discussed. A Monte Carlo study shows that the estimation method is accurate and reliable as long as the sample size is reasonably large, and that the choice of starting probabilities for the Markov process has little impact on the parameter estimates. The methodology is illustrated using daily numbers of phone calls reporting faults for a mainframe computer system.
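The likelihood machinery can be sketched with a forward recursion: a two-state Markov chain switches between a degenerate-at-zero state and a Poisson state. Covariates are omitted here for brevity, and the paper's EM and quasi-Newton estimation is not reproduced; this sketch only evaluates the likelihood.

```python
import numpy as np
from scipy.stats import poisson

def mzip_loglik(y, p01, p10, lam):
    """Forward algorithm for a two-state Markov zero-inflated Poisson model."""
    P = np.array([[1 - p01, p01], [p10, 1 - p10]])  # 0 = zero, 1 = Poisson
    emit = np.vstack([(y == 0).astype(float),       # zero state emits only 0
                      poisson.pmf(y, lam)])         # Poisson state
    alpha = np.array([0.5, 0.5]) * emit[:, 0]
    c = alpha.sum()
    ll, alpha = np.log(c), alpha / c
    for t in range(1, len(y)):
        alpha = (alpha @ P) * emit[:, t]
        c = alpha.sum()                             # scaling for stability
        ll, alpha = ll + np.log(c), alpha / c
    return ll

rng = np.random.default_rng(6)
p01, p10, lam = 0.2, 0.3, 3.0
P_true = np.array([[1 - p01, p01], [p10, 1 - p10]])
s, ys = 0, []
for _ in range(500):
    s = rng.choice(2, p=P_true[s])
    ys.append(0 if s == 0 else rng.poisson(lam))
y = np.array(ys)
print(mzip_loglik(y, p01, p10, lam))   # higher at the true parameters
print(mzip_loglik(y, 0.5, 0.5, 1.0))   # than at wrong ones
```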

10.
Bayesian calibration of computer models
We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.
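A deliberately stripped-down sketch of the calibration step only: a grid posterior over an unknown input theta of a cheap stand-in simulator, given noisy field observations. The full method's Gaussian-process emulator and model-discrepancy term, which are central to the paper, are omitted, and all functions and values below are illustrative.

```python
import numpy as np

def simulator(x, theta):                 # stand-in computer code
    return theta * np.sin(x)

rng = np.random.default_rng(7)
x_obs = np.linspace(0, 3, 15)
y_obs = simulator(x_obs, 1.7) + rng.normal(0, 0.1, 15)   # field data

theta_grid = np.linspace(0.5, 3.0, 400)
loglik = np.array([-0.5 * np.sum((y_obs - simulator(x_obs, th)) ** 2) / 0.1**2
                   for th in theta_grid])
post = np.exp(loglik - loglik.max())     # flat prior on theta
post /= post.sum() * (theta_grid[1] - theta_grid[0])     # normalise
print(theta_grid[np.argmax(post)])       # posterior mode near the true 1.7
```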

11.
For the recapture debugging design introduced by Nayak (1988), we consider the problem of estimating the hitting rates of the faults remaining in a system. In the context of a conditional likelihood, moment estimators are derived and shown to be asymptotically normal and fully efficient. Fixed-sample properties of the moment estimators are compared, through simulation, with those of the conditional maximum likelihood estimators. Also considered is a procedure for testing the assumption that faults have identical hitting rates; this provides a test of fit of the Jelinski-Moranda (1972) model. It is assumed that the residual hitting rates follow a log-linear rate model and that the testing process is truncated when the gaps between detections of new errors exceed a fixed amount of time.
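As context, the Jelinski-Moranda model whose fit is being tested can be fitted in a few lines: inter-failure gap i is exponential with rate (N - i + 1) * phi, phi is profiled out, and N is found by grid search. The simulated data and this simple fitting route are illustrative; the paper's moment estimators and log-linear residual rates are not reproduced.

```python
import numpy as np

def jm_profile_loglik(N, gaps):
    """Profile log-likelihood of the Jelinski-Moranda model at integer N."""
    n = len(gaps)
    k = N - np.arange(n)                  # faults remaining before each failure
    phi = n / np.sum(k * gaps)            # profile MLE of phi given N
    return np.sum(np.log(k * phi) - k * phi * gaps), phi

rng = np.random.default_rng(8)
N_true, phi_true, n = 40, 0.02, 30
gaps = rng.exponential(1.0 / ((N_true - np.arange(n)) * phi_true))
best = max((jm_profile_loglik(N, gaps) + (N,) for N in range(n, 200)),
           key=lambda t: t[0])
print("loglik %.2f  phi_hat %.4f  N_hat %d" % best)
```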

12.
In this article, two new approaches are introduced for designing attributes single sampling plans, and the corresponding models are constructed separately. For Approach I, an algorithm is proposed that designs sampling plans by setting a goal function to fulfil the two-point conditions on the operating characteristic (OC) curve. For Approach II, the plan parameters are solved by a nonlinear optimization model which minimizes the integral of the probability of acceptance over the interval from the producer's risk quality to the consumer's risk quality. Numerical examples and discussion based on the computational results are then given to illustrate the approaches, and tables of the designed plans under various conditions are provided. Moreover, a relation between the conventional design and the new approaches is established.
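For reference, the classical two-point design that both new approaches build on can be computed directly: the smallest single sampling plan (n, c) whose OC curve meets the producer's and consumer's risk points. The quality levels and risk values below are illustrative.

```python
from scipy.stats import binom

def design_plan(p1, p2, alpha, beta, n_max=2000):
    """Smallest (n, c) with Pa(p1) >= 1 - alpha and Pa(p2) <= beta."""
    for n in range(1, n_max):
        for c in range(n):
            if (binom.cdf(c, n, p1) >= 1 - alpha and
                    binom.cdf(c, n, p2) <= beta):
                return n, c
    raise ValueError("no plan found within n_max")

n, c = design_plan(p1=0.01, p2=0.05, alpha=0.05, beta=0.10)
print(n, c)   # accept the lot if at most c defectives in a sample of n
```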

13.
Patient dropout is a common problem in studies that collect repeated binary measurements. Generalized estimating equations (GEE) are often used to analyze such data. The dropout mechanism may be plausibly missing at random (MAR), i.e. unrelated to future measurements given covariates and past measurements. In this case, various authors have recommended weighted GEE with weights based on an assumed dropout model, an imputation approach, or a doubly robust approach based on weighting and imputation. These approaches provide asymptotically unbiased inference, provided the dropout or imputation model (as appropriate) is correctly specified. Other authors have suggested that, provided the working correlation structure is correctly specified, GEE using an improved estimator of the correlation parameters ('modified GEE') show minimal bias. These modified GEE have not been thoroughly examined. In this paper, we study the asymptotic bias under MAR dropout of these modified GEE, the standard GEE, and GEE using the true correlation. We demonstrate that all three methods are biased in general. The modified GEE may be preferred to the standard GEE and are subject to only minimal bias in many MAR scenarios, but in others they are substantially biased. Hence, we recommend that the modified GEE be used with caution.

14.
In this article, we propose an additive-multiplicative rates model for recurrent event data in the presence of a terminal event such as death. The association between the recurrent and terminal events is modeled nonparametrically. For inference on the model parameters, estimating equation approaches are developed, and the asymptotic properties of the resulting estimators are established. The finite-sample behavior of the proposed estimators is evaluated through simulation studies, and an application to a bladder cancer study is provided.

15.
There are many approaches to the estimation of a spectral density. Among parametric approaches, different divergences have been proposed for fitting a given parametric family of spectral densities; nonparametric approaches are also common when the model of the process cannot be specified. In this paper, we develop a local Whittle likelihood approach based on a general score function, special cases of which extend the method to a wider range of applications. The paper highlights the asymptotics of the general local Whittle estimator and presents a comparison with other estimators. Additionally, for a special case, we construct a one-step-ahead predictor based on the form of the score function and show that it has a smaller prediction error than the classical exponentially weighted linear predictor. Numerical studies illustrate some interesting features of the local Whittle estimator.
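A minimal sketch of plain (global) Whittle estimation for an AR(1) spectrum, as a baseline for the idea; the paper's local Whittle approach with a general score function is more elaborate. The innovation variance is fixed at its true value of 1 for brevity.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_ar1(x):
    n = len(x)
    j = np.arange(1, (n - 1) // 2 + 1)             # Fourier frequency indices
    w = 2 * np.pi * j / n
    I = np.abs(np.fft.fft(x)[j]) ** 2 / (2 * np.pi * n)   # periodogram
    def obj(phi):                                  # Whittle objective
        f = 1.0 / (2 * np.pi * (1 - 2 * phi * np.cos(w) + phi**2))
        return np.sum(np.log(f) + I / f)
    return minimize_scalar(obj, bounds=(-0.99, 0.99), method="bounded").x

rng = np.random.default_rng(9)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()  # AR(1) with phi = 0.6
print(whittle_ar1(x))                              # estimate close to 0.6
```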

16.
DEXPERT is an expert system, built using KEE, for the design and analysis of experiments. From a mathematical model, expected mean squares are computed, tests are determined, and the power of the tests is computed. Comparisons between designs are aided by suggestions and verbal interpretations provided by DEXPERT. DEXPERT provides a layout sheet for the collection of the data and then analyzes and interprets the results using analytical and graphical methods.
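One of the computations mentioned, the power of a test, can be sketched for a fixed-effects one-way ANOVA via the noncentral F distribution. The expert-system machinery itself is not reproduced, and the effect-size parameterisation below is an assumption.

```python
from scipy.stats import f, ncf

def anova_power(k, n, effect_var, sigma2, alpha=0.05):
    """Power of the one-way ANOVA F test with k groups of size n.

    effect_var is taken as the mean squared treatment effect (1/k) * sum(tau_i^2),
    so the noncentrality parameter is n * k * effect_var / sigma2.
    """
    df1, df2 = k - 1, k * (n - 1)
    lam = n * k * effect_var / sigma2
    crit = f.ppf(1 - alpha, df1, df2)        # central-F critical value
    return 1 - ncf.cdf(crit, df1, df2, lam)  # noncentral-F tail probability

print(anova_power(k=4, n=10, effect_var=0.25, sigma2=1.0))
```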

17.
This article reviews semiparametric estimators for limited dependent variable (LDV) models with endogenous regressors, where nonlinearity and nonseparability pose difficulties. We first introduce six main approaches in the linear equation system literature for handling endogenous regressors with linear projections: (i) 'substitution', replacing the endogenous regressors with their projections on the system exogenous regressors x; (ii) the instrumental variable estimator (IVE) based on E{(error) × x} = 0; (iii) 'model-projection', turning the original model into a model in terms of only x-projected variables; (iv) 'system reduced form (RF)', finding the RF parameters first and then the structural form (SF) parameters; (v) 'artificial instrumental regressor', using instruments as artificial regressors with zero coefficients; and (vi) 'control function', adding an extra term as a regressor to control for the endogeneity source. We then check whether these approaches are applicable to LDV models using conditional means/quantiles instead of linear projection. The six approaches provide a convenient framework within which semiparametric estimators in the literature can be categorized, although there are a few exceptions. The pros and cons of the approaches are discussed, and a small-scale simulation study is provided for some of the reviewed estimators.
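Approach (vi), the control function, is easy to sketch. For brevity the illustration below uses a linear model (the reviewed estimators extend the idea to LDV models): regress the endogenous regressor on the instrument, then add the first-stage residual to the outcome equation to absorb the endogeneity.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 5000
z = rng.standard_normal(n)                        # instrument
u = rng.standard_normal(n)                        # endogeneity source
x = 0.8 * z + u + 0.3 * rng.standard_normal(n)    # endogenous regressor
y = 1.0 + 2.0 * x + u + rng.standard_normal(n)    # true coefficient is 2

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
Z = np.column_stack([ones, z])
v_hat = x - Z @ ols(Z, x)                          # first-stage residual
naive = ols(np.column_stack([ones, x]), y)         # biased upward by u
cf = ols(np.column_stack([ones, x, v_hat]), y)     # control function fit
print(naive[1], cf[1])                             # cf[1] recovers ~2
```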

18.
With increased focus on reducing costs in the healthcare industry, the economic aspects of quality control for clinical laboratories must be addressed. To evaluate the economic performance of statistical quality control approaches used in the clinical setting, an economic model is developed. Although the economic model is applied specifically to the clinical laboratory in this research, it is easily generalized to a wide variety of industrial applications. Use of the economic model is illustrated through a comparison of traditional approaches to clinical quality control, and recommendations concerning the performance of these traditional approaches are provided.

19.
In many medical studies, patients are followed longitudinally, and interest lies in assessing the relationship between longitudinal measurements and time to an event. Recently, various authors have proposed joint modeling approaches for longitudinal and time-to-event data with a single longitudinal variable. These joint modeling approaches become intractable with even a few longitudinal variables. In this paper we propose a regression calibration approach for jointly modeling multiple longitudinal measurements and discrete time-to-event data. Ideally, a two-stage modeling approach could be applied, in which the multiple longitudinal measurements are modeled in the first stage and the longitudinal model is related to the time-to-event data in the second stage; however, biased parameter estimation due to informative dropout makes this direct two-stage approach problematic. We propose a regression calibration approach which appropriately accounts for informative dropout. We approximate the conditional distribution of the multiple longitudinal measurements given the event time by modeling all pairwise combinations of the longitudinal measurements with bivariate linear mixed models that condition on the event time. Complete data are then simulated based on estimates from these pairwise conditional models, and regression calibration is used to estimate the relationship between the longitudinal data and the time-to-event data from the completed data. We show that this approach performs well in estimating the relationship between multivariate longitudinal measurements and time-to-event data and in estimating the parameters of the multiple longitudinal process subject to informative dropout. We illustrate the methodology with simulations and with an analysis of primary biliary cirrhosis (PBC) data.
