Similar Literature
20 similar documents found.
1.
A Gaussian random function is a functional version of the normal distribution. This paper proposes a statistical hypothesis test of whether or not a random function is Gaussian. A parameter that equals 0 when the random function is Gaussian is considered, and an unbiased estimator of it is given. The asymptotic distribution of the estimator is studied and used to construct a test statistic and to discuss its asymptotic power. The performance of the proposed test is investigated through several numerical simulations, and an illustrative example is also presented.
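The authors' exact statistic is not reproduced here, but a minimal sketch of the underlying idea — under a Gaussian random function every linear functional of the sample paths is normally distributed, so projections can be tested for univariate normality — might look as follows. The Brownian-motion example, the projection direction, and the use of scipy.stats.normaltest are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated functional sample: n curves observed on a common grid.
n, m = 200, 50
grid = np.linspace(0.0, 1.0, m)

# Under H0 the curves are a Gaussian process (Brownian motion here).
curves = np.cumsum(rng.normal(scale=1.0 / np.sqrt(m), size=(n, m)), axis=1)

# Project each curve onto a fixed direction (illustrative choice: the
# grid itself, i.e. a weighted integral of each curve).
projections = curves @ grid / m

# If the random function is Gaussian, every linear functional of it is
# normally distributed, so a univariate normality test applies.
stat, pvalue = stats.normaltest(projections)
print(f"normality test on projections: stat={stat:.3f}, p={pvalue:.3f}")
```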

2.
Consider a machine that can start production off-target, where the initial offset is unknown and unobservable. The goal is to determine the optimal series of machine adjustments that minimizes the expected value of the sum of quadratic off-target costs and fixed adjustment costs. Apart from the unknown initial offset, the process is assumed to be in a state of statistical control, so the model applies to discrete-part production processes. The process variance is also assumed unknown. Using a dynamic programming formulation based on Bayesian estimation of all unknown process parameters, we show that the optimal adjustment policy has a deadband form in which the width of the deadband is time-varying and U-shaped. Computational results and implementation details are presented. The simpler case of a known process variance is also solved by dynamic programming, and its solution is shown to be a good approximation to the first case when the variance is actually unknown. The unknown-variance solution, however, is the most robust with respect to variation in the process parameters.
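A minimal simulation sketch of a deadband adjustment policy, assuming a known variance, a constant band half-width, and a simple running-mean estimate of the offset (the paper's optimal band is time-varying and U-shaped, and its estimation is fully Bayesian):

```python
import numpy as np

rng = np.random.default_rng(1)

sigma = 1.0          # process standard deviation (known, simpler case)
true_offset = 2.5    # unknown initial offset
fixed_cost = 4.0     # cost of one adjustment
n_parts = 50
band = 0.8           # constant deadband half-width (the paper's is time-varying)

offset = true_offset
adjust_total = 0.0
quad_total = 0.0
dev_sum, n_obs = 0.0, 0

for t in range(n_parts):
    y = offset + rng.normal(scale=sigma)   # observed deviation from target
    quad_total += y * y                    # quadratic off-target cost
    dev_sum += y
    n_obs += 1
    est = dev_sum / n_obs                  # running estimate of current offset
    if abs(est) > band:                    # adjust only outside the deadband
        offset -= est                      # move the estimated offset to zero
        dev_sum, n_obs = 0.0, 0            # restart estimation after adjusting
        adjust_total += fixed_cost

print(f"quadratic cost: {quad_total:.1f}, adjustment cost: {adjust_total:.1f}")
```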

3.
In this paper, two tests based on weighted CUSUMs of the least squares residuals are studied for detecting a change-point in a nonlinear model in real time. The first test statistic extends a method already used in the literature for linear models: at each new observation, the null hypothesis that there is no change in the model is tested against the presence of a change. The asymptotic distribution of the test statistic under the null hypothesis is given, and its convergence in probability to infinity when a change occurs is proved. These results allow an asymptotic critical region to be constructed. Next, in order to decrease the type I error probability, a bootstrapped critical value is proposed and a modified test is studied in a similar way. A generalization of the Hájek–Rényi inequality is established.
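A hedged sketch of the sequential monitoring idea — cumulating standardized residuals from a model fitted to a historical sample and signalling when the normalized CUSUM crosses a critical constant. The linear working model, the unweighted normalization, and the critical constant 3.0 are illustrative assumptions; the paper treats nonlinear models and derives asymptotic and bootstrap critical values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Historical (training) sample from y = 2x + noise.
m = 200
x_hist = rng.uniform(0, 1, m)
y_hist = 2.0 * x_hist + rng.normal(scale=0.5, size=m)
beta = np.polyfit(x_hist, y_hist, 1)        # least squares fit
sigma_hat = np.std(y_hist - np.polyval(beta, x_hist), ddof=2)

# Monitoring phase: the slope changes at observation 100.
cusum, critical = 0.0, 3.0                  # illustrative critical constant
for t in range(1, 201):
    x = rng.uniform(0, 1)
    slope = 2.0 if t < 100 else 3.0
    y = slope * x + rng.normal(scale=0.5)
    resid = y - np.polyval(beta, x)
    cusum += resid / sigma_hat
    # Standardize the CUSUM by sqrt(t): a simple (unweighted) choice.
    if abs(cusum) / np.sqrt(t) > critical:
        print(f"change signalled at monitoring time {t}")
        break
else:
    print("no change signalled over the monitoring horizon")
```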

4.
In this paper, we present a maximum likelihood estimation procedure for the Box-Cox model when a lagged dependent variable is included among the explanatory variables and the first observation of the dependent variable is random. A numerical example shows that a test on the coefficient of the lagged dependent variable is sensitive to whether or not the first observation of the dependent variable is treated as random.
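A sketch of the Box-Cox profile log-likelihood for a regression with a lagged dependent variable, conditioning on (i.e., treating as fixed) the first observation — the simpler of the two treatments the paper contrasts. The data-generating model and the grid search are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def boxcox(y, lam):
    return np.log(y) if abs(lam) < 1e-8 else (y**lam - 1.0) / lam

# Simulated positive series with a lagged dependent variable.
n = 300
y = np.empty(n)
y[0] = 5.0
for t in range(1, n):
    # True model (illustrative): log(y_t) = 1 + 0.5*log(y_{t-1}) + eps.
    y[t] = np.exp(1.0 + 0.5 * np.log(y[t - 1]) + rng.normal(scale=0.2))

def profile_loglik(lam):
    """Conditional log-likelihood given the first observation, including
    the Jacobian term (lam - 1) * sum(log y) of the Box-Cox transform."""
    z = boxcox(y, lam)
    X = np.column_stack([np.ones(n - 1), z[:-1]])   # intercept + lagged term
    resid = z[1:] - X @ np.linalg.lstsq(X, z[1:], rcond=None)[0]
    sigma2 = np.mean(resid**2)
    return -0.5 * (n - 1) * np.log(sigma2) + (lam - 1.0) * np.log(y[1:]).sum()

grid = np.linspace(-1.0, 1.5, 51)
lam_hat = grid[np.argmax([profile_loglik(l) for l in grid])]
print(f"profile-MLE of the Box-Cox parameter: {lam_hat:.2f}")
```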

5.
A reliability acceptance sampling plan (RASP) is a variables sampling plan used for lot sentencing based on the lifetime of the product under consideration. If a good lot is rejected, sales are lost; if a bad lot is accepted, post-sale costs increase and the brand image of the product suffers. Since cost is an important decision-making factor, adopting an economically optimal RASP is indispensable. This work considers the determination of an asymptotically optimal RASP under a progressive type-I interval censoring scheme with random removal (PICR-I). We formulate a decision model for lot sentencing and propose a cost function that quantifies the losses: it includes the cost of conducting the life test and the warranty cost when the lot is accepted, and the cost of batch disposition when it is rejected. The asymptotically optimal RASP is obtained by minimizing the Bayes risk within a set of decision rules based on the maximum likelihood estimator of the mean lifetime of the items in the lot. For numerical illustration, we consider lifetimes that follow exponential or Weibull distributions.

6.
When the differences between survival functions occur early in time, the Wilcoxon test is the best test; when they occur late, the log-rank test is better. A researcher therefore needs a test that is stable across both situations. In this paper, a new distribution-free two-sample test is proposed that is useful when choosing between the log-rank and Wilcoxon tests is difficult: its power is roughly the maximal power of the two.
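The paper's statistic is not reproduced; a common max-type construction in the same spirit — taking the larger of the standardized log-rank and Gehan–Wilcoxon statistics computed from the pooled risk sets — is sketched below. The exponential data are illustrative, and the null distribution of the maximum (needed for an actual test) is not derived here:

```python
import numpy as np

def weighted_logrank(time, event, group, wilcoxon=False):
    """Standardized weighted log-rank statistic for two groups.
    wilcoxon=True uses Gehan weights (the number at risk), which
    emphasize early differences; weight 1 gives the log-rank test."""
    order = np.argsort(time)
    time, event, group = time[order], event[order], group[order]
    num, den = 0.0, 0.0
    for t in np.unique(time[event]):
        at_risk = time >= t
        n, n1 = at_risk.sum(), (at_risk & (group == 1)).sum()
        d = ((time == t) & event).sum()              # deaths at t (pooled)
        d1 = ((time == t) & event & (group == 1)).sum()
        w = n if wilcoxon else 1.0                   # Gehan vs log-rank weight
        num += w * (d1 - d * n1 / n)
        if n > 1:
            den += w**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return num / np.sqrt(den)

rng = np.random.default_rng(4)
n = 100
group = np.repeat([0, 1], n)
lifet = np.concatenate([rng.exponential(1.0, n), rng.exponential(1.5, n)])
cens = rng.exponential(3.0, 2 * n)
time = np.minimum(lifet, cens)
event = lifet <= cens

z_lr = weighted_logrank(time, event, group)
z_w = weighted_logrank(time, event, group, wilcoxon=True)
# A max-type statistic is roughly as powerful as the better of the two;
# its critical value must account for taking the maximum.
print(f"log-rank z = {z_lr:.2f}, Wilcoxon z = {z_w:.2f}, "
      f"max |z| = {max(abs(z_lr), abs(z_w)):.2f}")
```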

7.
In systems for online detection of regime shifts, a process is continually observed and, based on the available data, an alarm is given when there is enough evidence of a change. There is a risk of a false alarm, and here two different ways of controlling false alarms are compared: a fixed average run length until the first false alarm, and a fixed probability of any false alarm (fixed size). The two approaches are evaluated in terms of the timeliness of alarms. A system with a fixed size is found to have a drawback: the ability to detect a change deteriorates with the time of the change. Consequently, the probability of successful detection tends to zero and the expected delay of a motivated alarm tends to infinity. This drawback is present even when the size is set very large (close to one). Utility measures expressing the costs of a false or a too-late alarm are used in the comparison, and it is demonstrated how the choice of the best approach can be guided by the parameters of the process and the different costs of alarms. The technique is illustrated with financial transaction data from the Hang Seng Index.

8.
A test for a hypothesized parameter is generalized by replacing the indicator function of the test's critical region with a function (a 'weight of evidence for the alternative') taking values in [0,1] and estimating the value 1 when the alternative is true and 0 otherwise. It is a 'guarded' weight of evidence if a bound is placed on the Type I risk. The focus of this paper is a guarded weight of evidence that is a function of the likelihood ratio of the sign statistic, for a two-sided alternative to a point hypothesis regarding the centre of a symmetric distribution. Inverting a family of such guarded weights of evidence yields an 'acceptability profile' for the median which is more informative than the traditional confidence interval for the median. The main results, with the exception of the comparison of the Type II risks with an envelope risk, are based entirely on permutation arguments.

9.
This paper presents a new parametric model for recurrent events in which the time of each recurrence is associated with one or more latent causes, and no information is provided about the cause responsible for the event. The model is characterized by a rate function and is based on the Poisson-exponential distribution, namely the distribution of the maximum of a zero-truncated Poisson number of exponential times; the time of each recurrence is thus the maximum lifetime among all latent causes. Inference is based on a maximum likelihood approach. A simulation study is performed to observe the frequentist properties of the estimation procedure for small and moderate sample sizes, and likelihood-based test procedures are also investigated. A real example from a gastroenterology study of small bowel motility during the fasting state is used to illustrate the methodology, and the proposed model is compared with the classical homogeneous Poisson model, which it contains as a particular case.
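Sampling from the Poisson-exponential distribution exactly as described — the maximum of a zero-truncated Poisson number of exponential lifetimes — with the closed-form CDF used as a check; the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def rpoisexp(size, lam=2.0, rate=1.0):
    """Draw from the Poisson-exponential distribution: the maximum of a
    zero-truncated Poisson(lam) number of Exponential(rate) lifetimes."""
    out = np.empty(size)
    for i in range(size):
        m = 0
        while m == 0:                  # zero-truncation: at least one cause
            m = rng.poisson(lam)
        out[i] = rng.exponential(1.0 / rate, size=m).max()
    return out

sample = rpoisexp(10_000)
# Closed-form CDF: F(t) = (exp(-lam*exp(-rate*t)) - exp(-lam)) / (1 - exp(-lam)).
lam, rate, t = 2.0, 1.0, 1.5
F = (np.exp(-lam * np.exp(-rate * t)) - np.exp(-lam)) / (1 - np.exp(-lam))
print(f"empirical P(T <= {t}) = {(sample <= t).mean():.3f}, theoretical = {F:.3f}")
```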

10.
As a measure of association between two nominal categorical variables, the lambda coefficient, or Goodman–Kruskal's lambda, has become one of the most popular measures. Its popularity is primarily due to its simple and meaningful definition and interpretation in terms of the proportional reduction in error when predicting a random observation's category for one variable given (versus not knowing) its category for the other variable. It is an asymmetric measure, although a symmetric version is available. The lambda coefficient does, however, have a widely recognized limitation: it can equal zero even when the variables are not independent and all other measures take on positive values. To mitigate this problem, an alternative lambda coefficient is introduced in this paper as a slight modification of the Goodman–Kruskal lambda. The properties of the new measure are discussed and a symmetric form is introduced. A statistical inference procedure is developed and a numerical example is provided.
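The classical asymmetric Goodman–Kruskal lambda, computed from a contingency table of counts; the paper's modified coefficient is not reproduced. The second table illustrates the limitation mentioned above: lambda is zero although the variables are not independent.

```python
import numpy as np

def gk_lambda(table):
    """Goodman-Kruskal lambda for predicting the column variable from the
    row variable: the proportional reduction in prediction error."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    # Error when ignoring rows: predict the modal column category.
    e_ignore = n - table.sum(axis=0).max()
    # Error when knowing the row: predict each row's modal column.
    e_given = n - table.max(axis=1).sum()
    return (e_ignore - e_given) / e_ignore

# Illustrative 3x3 table of counts.
table = [[40, 10, 5],
         [8, 30, 12],
         [6, 9, 35]]
print(f"lambda (column | row) = {gk_lambda(table):.3f}")

# The known limitation: lambda can be 0 even when the variables are not
# independent, e.g. when one column dominates every row.
degenerate = [[20, 5, 3],
              [30, 2, 8]]
print(f"lambda for the degenerate table = {gk_lambda(degenerate):.3f}")
```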

11.
We study the Bayesian solution of a linear inverse problem in a separable Hilbert space setting with Gaussian prior and noise distributions. Our contribution is a new Bayes estimator which is linear and continuous on the whole space, unlike the mean of the exact Gaussian posterior distribution, which is defined only as a measurable linear transformation. Our estimator is the mean of a slightly modified posterior distribution, called the regularized posterior distribution. Frequentist consistency of the estimator and of the regularized posterior distribution is proved. A Monte Carlo study and an application to real data confirm the good small-sample properties of the procedure.
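A finite-dimensional sketch of the setting, assuming a discretized smoothing operator: with a Gaussian prior the posterior mean is a Tikhonov-type estimator, and the extra ridge term alpha stands in loosely for the paper's regularization of the posterior (alpha -> 0 recovers the exact mean). All parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Discretized linear inverse problem y = K x + noise, with a smoothing
# (hence ill-conditioned) operator K.
m = 100
s = np.linspace(0, 1, m)
K = np.exp(-50.0 * (s[:, None] - s[None, :])**2) / m   # Gaussian kernel
x_true = np.sin(2 * np.pi * s)
noise_var = 1e-6
y = K @ x_true + rng.normal(scale=np.sqrt(noise_var), size=m)

# Gaussian prior x ~ N(0, tau2 I); the posterior mean solves a
# Tikhonov-type normal equation, here with an extra ridge term alpha.
tau2, alpha = 1.0, 1e-8
A = K.T @ K + (noise_var / tau2 + alpha) * np.eye(m)
x_hat = np.linalg.solve(A, K.T @ y)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```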

12.
Communications in Statistics - Theory and Methods, 2012, 41(13-14): 2524-2544
A calibrated small area predictor based on an area-level linear mixed model with restrictions is proposed. It is shown that this restricted predictor, which guarantees concordance between the small area estimates and a known estimate at the aggregate level, is the best linear unbiased predictor. The mean squared prediction error of the calibrated predictor is discussed, and a restricted predictor under a particular time-series and cross-sectional model is presented. In a simulation study based on real data from a longitudinal survey conducted by a national statistical office, the proposed predictor is compared with other competing restricted and non-restricted predictors.
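A hedged sketch of the calibration restriction itself — adjusting model-based small area predictions so that their weighted aggregate matches a known aggregate-level estimate. The uniform-shift adjustment used here is one simple restriction-satisfying choice, not the paper's best linear unbiased restricted predictor:

```python
import numpy as np

rng = np.random.default_rng(7)

# Model-based predictions for 10 small areas and their population shares.
pred = rng.normal(100.0, 10.0, size=10)
share = rng.dirichlet(np.ones(10))           # weights summing to 1
agg_known = 103.0                            # reliable aggregate-level estimate

# Calibrate: add the aggregate discrepancy uniformly to every area
# (one simple way to satisfy the restriction).
gap = agg_known - share @ pred
calibrated = pred + gap
assert np.isclose(share @ calibrated, agg_known)

print(f"aggregate before: {share @ pred:.2f}, after: {share @ calibrated:.2f}")
```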

13.
The author presents a robust F-test for comparing nested linear models. The approach should be attractive to practitioners because it is based on the familiar F-statistic and corresponds to the common practice of reporting F-statistics after removing obvious outliers. It is calibrated in terms of a real parameter that can be directly interpreted as the data analyst's willingness to remove observations, and the sensitivity of the F-statistic to this parameter is easily examined. The procedure is evaluated in a simulation study in which a scale mixture distribution is used to generate outliers. It is also applied to data where the occurrence of an outlier is confounded with the significance of a regression term, providing a comparison of two competing models: one removing an outlier and the other including an additional regression term instead.
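Not the author's calibrated procedure, but a sketch of the informal practice it formalizes: comparing nested linear models with the F-statistic after removing a fraction of the most outlying observations, the fraction playing the role of the willingness-to-remove parameter. Note that the naive F reference distribution after trimming is not properly calibrated, which is precisely the problem the paper addresses:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

n = 80
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(scale=1.0, size=n)
y[0] += 12.0                                  # plant one gross outlier

def f_test(y, X_full, X_red, trim=0.0):
    """F-test of the reduced model against the full model, optionally
    removing the ceil(trim*n) largest absolute residuals first."""
    if trim > 0:
        beta = np.linalg.lstsq(X_full, y, rcond=None)[0]
        resid = np.abs(y - X_full @ beta)
        keep = np.argsort(resid)[: len(y) - int(np.ceil(trim * len(y)))]
        y, X_full, X_red = y[keep], X_full[keep], X_red[keep]
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0])**2)
    df1 = X_full.shape[1] - X_red.shape[1]
    df2 = len(y) - X_full.shape[1]
    F = (rss(X_red) - rss(X_full)) / df1 / (rss(X_full) / df2)
    return F, stats.f.sf(F, df1, df2)

X_full = np.column_stack([np.ones(n), x1, x2])
X_red = np.column_stack([np.ones(n), x1])
for trim in (0.0, 0.05):
    F, p = f_test(y, X_full, X_red, trim)
    print(f"trim={trim:.2f}: F={F:.2f}, p={p:.3f}")
```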

14.
We study integrals of arbitrary Borel-measurable functions with respect to a semiparametric estimator of the distribution function in the random censorship model. Based on a representation of these integrals, similar to the one given by Stute for Kaplan–Meier integrals, a central limit theorem is established which generalizes a corresponding result for the Cheng and Lin estimator. It is shown that the semiparametric integral estimator is at least as efficient as the corresponding Kaplan–Meier integral estimator in terms of asymptotic variance when the correct semiparametric model is used, and a necessary and sufficient condition for a strict gain in efficiency is stated. Finally, the asymptotic result is confirmed in a small simulation study with moderate sample sizes.
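A sketch of the baseline object, the Kaplan–Meier integral: phi evaluated at the uncensored observations and weighted by the jumps of the product-limit estimator (the semiparametric competitor studied in the paper is not reproduced). Continuous lifetimes without ties are assumed:

```python
import numpy as np

def km_integral(time, event, phi):
    """Kaplan-Meier integral: sum of phi at the uncensored observations
    weighted by the jumps of the product-limit estimator (no ties assumed)."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    n = len(time)
    surv_prev, total = 1.0, 0.0
    for i in range(n):
        at_risk = n - i
        surv = surv_prev * (1.0 - event[i] / at_risk)
        if event[i]:
            total += phi(time[i]) * (surv_prev - surv)   # KM jump at time[i]
        surv_prev = surv
    return total

rng = np.random.default_rng(9)
n = 2000
lifet = rng.exponential(1.0, n)
cens = rng.exponential(2.0, n)
time, event = np.minimum(lifet, cens), (lifet <= cens).astype(float)

# Estimate E[T^2] = 2 for Exponential(1); censoring is handled by the weights.
est = km_integral(time, event, lambda t: t**2)
print(f"KM-integral estimate of E[T^2]: {est:.3f} (true value 2)")
```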

15.
A common problem with laboratory assays is that a measurement of a substance in a test sample becomes relatively imprecise as the concentration decreases. A standard solution is to establish lower limits for reliable measurement: a quantitation limit is a level above which a measurement has sufficient precision to be reliably reported. The paper proposes a new approach to defining the limit of quantitation for the case where a linear calibration curve is used to estimate actual concentrations from measured values. The approach is based on the relative precision of the estimated concentration, using the delta method to approximate that precision. A graphical display is proposed for assessing estimated concentrations as well as the overall reliability of the calibration curve. The research is motivated by a clinical inhalation experiment, and the proposed approach is compared with two standard methods using both real and simulated data.
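A sketch of the delta-method calculation, assuming a single new measurement read off a fitted straight-line calibration: the relative standard error of the estimated concentration x_hat = (y0 - a)/b is approximated from its gradient, and a quantitation limit is taken as the lowest level keeping it below a chosen bound (10% here, an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(10)

# Calibration data: measured response y at known concentrations x.
x = np.repeat([0.5, 1, 2, 5, 10, 20], 3).astype(float)
sigma = 0.4
y = 1.0 + 2.0 * x + rng.normal(scale=sigma, size=x.size)

# Fit the calibration line and get the coefficient covariance matrix.
X = np.column_stack([np.ones_like(x), x])
beta, rss = np.linalg.lstsq(X, y, rcond=None)[:2]
s2 = rss[0] / (len(y) - 2)
cov = s2 * np.linalg.inv(X.T @ X)
a, b = beta

def rel_precision(x0):
    """Delta-method relative standard error of the estimated concentration
    x_hat = (y0 - a)/b for a new single measurement at true level x0."""
    grad = np.array([-1.0 / b, -x0 / b])     # d x_hat / d(a, b)
    var = s2 / b**2 + grad @ cov @ grad      # measurement + calibration error
    return np.sqrt(var) / x0

# Quantitation limit: smallest level with relative SE below, say, 10%.
levels = np.linspace(0.1, 5.0, 50)
loq = next(x0 for x0 in levels if rel_precision(x0) < 0.10)
print(f"approximate limit of quantitation: {loq:.2f}")
```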

16.
A Bayesian model consists of two elements: a sampling model and a prior density. The problem of selecting a prior density is nothing but the problem of selecting a Bayesian model in which the sampling model is fixed. A predictive approach is taken through a decision problem in which the loss function is the squared L2 distance between the sampling density and the posterior predictive density, since the aim of the method is to choose the prior that provides as good a posterior predictive density as possible. An algorithm based on Lavine's linearization technique is developed for solving the problem.

17.
We consider a device designed to perform missions consisting of a random sequence of phases or stages with random durations. The mission process is described by a Markov renewal process, and the system is a complex one consisting of a number of components whose lifetimes depend on the phases of the mission. We discuss models and tools for computing system, mission, and phase reliabilities using Markov renewal theory. A simplified model involving a mission-based system with maximal repair is analyzed first, and the results are then extended, using intrinsic aging concepts, to the case where there is no repair. Our objective is to focus on the computation of system reliability for these two extreme cases.

18.
This article studies the dispatch of consolidated shipments. Orders, following a batch Markovian arrival process, are received in discrete quantities by a depot at discrete time epochs. Instead of being dispatched immediately, all outstanding orders are consolidated and shipped together at a later time. The decision of when to send out the consolidated shipment is made according to a “dispatch policy,” which is a function of the system state and/or the costs associated with that state. First, a tree-structured Markov chain is constructed to record specific information about the consolidation process; the effectiveness of any dispatch policy can then be assessed through a set of long-run performance measures. Next, the effect of varying the order-arrival process on shipment consolidation is demonstrated through numerical examples and proved mathematically under some conditions. Finally, a heuristic algorithm is developed to determine a favorable parameter for a special set of dispatch policies, and the algorithm is proved to yield the overall optimal policy under certain conditions.
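A simulation sketch of one simple member of the policy family — a quantity policy that dispatches once outstanding orders reach a level q — with i.i.d. Poisson batch arrivals standing in for the paper's batch Markovian arrival process; all costs and parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)

def avg_cost(q, horizon=100_000, hold=1.0, dispatch=50.0):
    """Long-run average cost per period of a quantity policy: dispatch the
    consolidated load once outstanding orders reach q units."""
    outstanding, total = 0, 0.0
    for _ in range(horizon):
        outstanding += rng.poisson(2.0)      # i.i.d. batch orders per period
        if outstanding >= q:
            total += dispatch                # fixed shipment cost
            outstanding = 0
        total += hold * outstanding          # holding cost on waiting orders
    return total / horizon

for q in (5, 10, 20, 40):
    print(f"q={q:>2}: average cost per period = {avg_cost(q):.2f}")
```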

19.
The pairwise comparison matrix (PCM) is a popular technique in multi-criteria decision making, and the abelian linearly ordered group (alo-group) is a powerful tool for analyzing PCMs. In this article, a criterion for the acceptable consistency of a PCM is introduced which is independent of the scale and can be intuitively interpreted, and its relation to weak consistency is investigated. A multiplicative alo-group based hierarchical decision model is then proposed, comprising the following steps: (1) the introduced criterion for acceptable consistency is used to check whether or not a PCM is acceptable; (2) the row geometric mean method is used to derive the local priorities of a multiplicative PCM; (3) a Hierarchy Composition Rule derived from the weighted mean is used to compute the criterion/subcriterion weights with regard to the overall goal; and (4) the weighted geometric mean is used as the aggregation rule, where the alternatives' local priorities are min-normalized. The proposed model preserves rank and has counterparts in the additive case. Finally, the model is applied to a layout planning problem for an aircraft maintenance base using computer-based software.
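A sketch of step (2), the row geometric mean method for deriving local priorities from a multiplicative PCM, together with a naive triple-wise consistency diagnostic; the sample matrix is illustrative, and the diagnostic is not the paper's scale-independent criterion:

```python
import numpy as np

def rgm_priorities(A):
    """Row geometric mean priorities of a multiplicative PCM,
    normalized to sum to one."""
    A = np.asarray(A, dtype=float)
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return g / g.sum()

# Illustrative reciprocal 3x3 comparison matrix (A[i, j] ~ w_i / w_j).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w = rgm_priorities(A)
print("local priorities:", np.round(w, 3))

# Naive consistency diagnostic: for a fully consistent matrix,
# A[i, j] * A[j, k] == A[i, k] for all triples of indices.
err = max(abs(np.log(A[i, j] * A[j, k] / A[i, k]))
          for i in range(3) for j in range(3) for k in range(3))
print(f"max log inconsistency over triples: {err:.3f}")
```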

20.
In this article, a warm standby n-unit system is studied. The system is operational as long as at least one unit is functioning normally. The online unit, whose lifetime is governed by a phase-type distribution, is also subject to shocks from external causes. Shocks arrive according to a Poisson process, and whenever a shock interarrival time is less than a threshold, the online unit fails. The lifetimes of the units in warm standby are exponentially distributed. A repairman who can take multiple vacations repairs the failed units on a “first-in-first-out” basis; the repair times and the vacation times of the repairman are governed by different phase-type distributions. The Markov process governing the system is constructed, and the system is studied in transient and stationary regimes: the availability, the reliability, the rates of occurrence of the different types of failure, and the working probability of the repairman are calculated. A numerical application illustrates the calculations.
