20 similar documents retrieved; search took 31 ms.
1.
In a recent paper (J. Statist. Comput. Simul., 1995, Vol. 53, pp. 195–203), P. A. Wright proposed a new process capability index Cs, which generalizes the Pearn-Kotz-Johnson index Cpmk by taking skewness into account (in addition to the deviation of the mean from the target already incorporated in Cpmk). The purpose of this article is to study the consistency and asymptotics of an estimate Ĉs of Cs. The asymptotic distribution provides insight into some desirable properties of the estimate that are not apparent from its original definition.
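For orientation, the index Cpmk that Wright's Cs generalizes has a standard closed form; the sketch below computes it from assumed process parameters (the function name and example numbers are illustrative, not from the paper).

```python
import math

def cpmk(mean, sigma, lsl, usl, target):
    """Pearn-Kotz-Johnson index Cpmk: penalizes both process spread and
    deviation of the process mean from the target value."""
    denom = 3.0 * math.sqrt(sigma ** 2 + (mean - target) ** 2)
    return min(usl - mean, mean - lsl) / denom

# On-target process (mean == target): the penalty term vanishes.
print(cpmk(mean=10.0, sigma=1.0, lsl=7.0, usl=13.0, target=10.0))  # 1.0
```

Any shift of the mean away from the target lowers the index, which is the behaviour Cs extends by also accounting for skewness.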
2.
The analysis highlights the relationship between the territorial distribution of foreigners in Lombardy and the economic space they occupy, territorially defined by the Local Labour Systems. In particular, using these units, the study assesses the importance of ethnic networks and local labour market conditions in explaining the varying concentration of foreign nationalities living in the territory. The research was partially supported by grants from CNR (99.01522.CT10). Preliminary findings were presented at the SIS (Società Italiana di Statistica) Annual Meeting, Milano Bicocca, 2002.
3.
Dementia caused by Alzheimer’s disease (AD) is one of the main medical and social challenges worldwide for the coming years and decades. An automated analysis of changes in the electroencephalogram (EEG) of patients with AD may help improve the quality of medical diagnoses. In this paper, measures based on uni- and multivariate spectral densities are studied in order to measure slowing and, in greater detail, reduced synchrony in EEG signals. Here, an EEG segment is interpreted as a sample from a (weakly) stationary stochastic process. The spectral density was computed using an indirect estimator. Slowing was assessed by calculating the spectral power in predefined frequency bands. As measures of synchrony between single EEG signals, we analyzed coherences, partial coherences, and bivariate and conditional Granger causality; for measuring synchrony between groups of EEG signals, we considered coherences, partial coherences, and bivariate and conditional Granger causality between the respective first principal components of each group, as well as dynamic canonical correlations. As a measure of local synchrony within a group, the amount of variance explained by the respective first principal component of static and dynamic principal component analysis was investigated. These measures were computed, by way of example, for resting-state EEG recordings from 83 subjects diagnosed with probable AD. Here, the severity of AD is quantified by the Mini-Mental State Examination score.
5.
This article provides a concise overview of the main mathematical theory of Benford’s law in a form accessible to scientists and students who have had first courses in calculus and probability. In particular, one of the main objectives here is to aid researchers who are interested in applying Benford’s law and need to understand general principles clarifying when to expect the appearance of Benford’s law in real-life data and when not to expect it. A second main target audience is students of statistics or mathematics, at all levels, who are curious about the mathematics underlying this surprising and robust phenomenon and may wish to delve more deeply into the subject. This survey of the fundamental principles behind Benford’s law includes many basic examples and theorems, but does not include the proofs or the most general statements of the theorems; rather, it provides precise references where both may be found.
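The first-digit form of Benford's law assigns digit d the probability log10(1 + 1/d). As a minimal illustration (not from the survey itself), the sketch below checks the classic Benford-conforming sequence of powers of 2 against these probabilities.

```python
import math
from collections import Counter

def benford_pmf(d):
    """Benford first-digit probability: P(d) = log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

# Leading digits of 2**0 .. 2**999, a sequence known to conform to the law.
digits = [int(str(2 ** k)[0]) for k in range(1000)]
freq = Counter(digits)
for d in range(1, 10):
    print(d, freq[d] / 1000, round(benford_pmf(d), 3))
```

The empirical frequency of leading digit 1 lands near log10(2) ≈ 0.301, far from the naive uniform value 1/9.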
6.
We have observations from a t distribution with unknown mean, variance, and degrees of freedom, each of which we wish to estimate. The main difficulty lies in estimating the degrees of freedom. We show that a relatively efficient yet very simple estimator is a given function of the ratio of percentile estimates. We derive the appropriate estimator, provide equations for transformation and standard errors, contrast it with other estimators, and give examples.
7.
In this article, a novel hybrid method for forecasting stock prices is proposed. The method is based on the wavelet transform, wavelet denoising, linear models (the autoregressive integrated moving average (ARIMA) model and the exponential smoothing (ES) model), and nonlinear models (the BP neural network and the RBF neural network). The wavelet transform provides a set of better-behaved constitutive series than the raw stock series for prediction. Wavelet denoising is used to eliminate slight random fluctuations in the stock series. The ARIMA and ES models forecast the linear component of the denoised stock series, and the BP and RBF neural networks are then used as nonlinear pattern-recognition tools to correct the estimation error of the linear models’ predictions. The proposed method is evaluated on the Shanghai and Shenzhen stock markets, and the results are compared with some of the most recent stock price forecasting methods. The results show that the proposed hybrid method yields a considerable improvement in forecasting accuracy. The method can also be applied to analyze and forecast the reliability of products or systems, improving the accuracy of reliability engineering.
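One of the linear building blocks mentioned, simple exponential smoothing, can be sketched in a few lines. This is a generic textbook version for illustration, not the authors' full hybrid pipeline; the smoothing constant and sample prices are made up.

```python
def exp_smooth_forecast(series, alpha=0.3):
    """Simple exponential smoothing: level_t = alpha*y_t + (1-alpha)*level_{t-1}.
    Returns the one-step-ahead forecast after the last observation."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

prices = [10.0, 10.2, 10.1, 10.4, 10.3, 10.6]
print(exp_smooth_forecast(prices))
```

In the hybrid scheme described above, the residuals of such a linear forecast would then be passed to a nonlinear model (BP or RBF network) for correction.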
8.
Based on Stein’s famous shrinkage estimation of a multivariate normal distribution, we propose a new type of estimators of the distribution function of a random variable in a nonparametric setup. The proposed estimators are then compared with the empirical distribution function, which is the best equivariant estimator under a well-known loss function. Our extensive simulation study shows that our proposed estimators can perform better for moderate to large sample sizes.
10.
Friedman’s (1937, 1940) S-statistic is designed to test the hypothesis that there is no treatment effect in a randomized-block design with k treatments and n blocks. In this paper we give tables of the null distribution of S for k = 5, n = 6(1)8, and for k = 6, n = 2(1)6. Computational details are discussed.
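Friedman's S is the sum of squared deviations of the treatment rank sums from their common null expectation n(k+1)/2. A minimal sketch (assuming no ties within a block; the function name and example data are illustrative):

```python
def friedman_S(blocks):
    """Friedman's S: sum of squared deviations of treatment rank sums
    from their null expectation n(k+1)/2.  `blocks` is a list of n
    blocks, each a list of k treatment responses (no ties assumed)."""
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0] * k
    for block in blocks:
        # Rank the k responses within the block, 1 = smallest.
        order = sorted(range(k), key=lambda j: block[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    expected = n * (k + 1) / 2
    return sum((r - expected) ** 2 for r in rank_sums)

# Treatment order identical in every block: S attains its maximum.
print(friedman_S([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 18.0
```

Tabulating the exact null distribution, as the paper does, amounts to evaluating S over all (k!)^n equally likely rank configurations.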
11.
We assume that x_1, ..., x_{n+r} can be treated as the sample values of a Markov chain of order r or less (a chain in which the dependence extends over r + 1 consecutive variables only), and consider the problem of testing the hypothesis H_0 that a chain of order r − 1 is sufficient, using the tools of statistical information theory: φ-divergences. More precisely, if p_{a_1, ..., a_r; a_{r+1}} denotes the transition probability for an r-th order Markov chain, the hypothesis to be tested is H_0: p_{a_1, ..., a_r; a_{r+1}} = p_{a_2, ..., a_r; a_{r+1}}, for a_i ∈ {1, ..., s}, i = 1, ..., r + 1. The tests given here for the first time include as particular cases the likelihood ratio test and the test based on the chi-squared statistic. Received: August 3, 1998; revised version: November 25, 1999.
12.
In a special paired-sample setting, Hotelling’s T² test based on the differences of the paired random vectors is the likelihood ratio test for the hypothesis that the paired random vectors have the same mean; with respect to a special group of affine linear transformations, it is the uniformly most powerful invariant test against the general alternative of a difference in means. We present an elementary, straightforward proof of this result. The likelihood ratio test for the hypothesis that the covariance structure is of the assumed special form is derived and discussed. Applications to real data are given.
14.
The analysis in this paper employs a methodology for dating structural breaks in tests with non-standard asymptotic distributions. The application examines whether changes in the rules of a game and major social and political events during the past century had significant effects upon various outcomes of this game. The statistical methodology first applied here proves successful, as most breaks can be traced to specific events and rule changes. Dating these breaks allows us to obtain useful insights into production and competition processes in this industry. As such, using empirical tests we illustrate the utility of a valuable statistical technique not applied previously. Ignacio Palacios-Huerta: I am grateful to Gary S. Becker, Tony Lancaster, Robin Lumsdaine, Kevin M. Murphy, Gabriel Perez-Quiros, Ana I. Saracho, Amy Serrano, an associate editor, and a referee for useful suggestions. I am also indebted to Barry Blake, Vicki Bogan, Salwa Hammami, and Karen Wong for able research assistance, to Tony Brown at the Association of Football Statisticians for the data, to the Hoover Institution for its hospitality, and to the Spanish Ministerio de Ciencia y Tecnología for financial support (grant BEC 2003-08182). The Gauss programs used in this paper were kindly provided by Bruce E. Hansen. Any errors are mine alone.
15.
Statistical Methods & Applications - A seemingly unrelated regression model has been commonly used for describing a set of different regression models with correlations. This paper discusses...
16.
Comonotonicity and countermonotonicity provide intuitive upper and lower dependence relationships between random variables. This paper constructs shuffle-of-min random variable approximations for a given Uniform[0, 1] random vector. We find the two optimal orders under which the resulting shuffle-of-min approximations are shown to be extensions of comonotonicity and countermonotonicity. We also provide the rate of convergence of these random vector approximations and apply them to computing value-at-risk.
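A standard consequence of comonotonicity relevant to the value-at-risk application is that quantiles of a sum of comonotonic risks are additive. A minimal sketch (the quantile functions and confidence level are made-up examples, not from the paper):

```python
import math

def var_comonotonic(q_x, q_y, alpha):
    """Under comonotonicity, quantiles (and hence VaR) are additive:
    VaR_alpha(X + Y) = F_X^{-1}(alpha) + F_Y^{-1}(alpha)."""
    return q_x(alpha) + q_y(alpha)

# Two exponential losses, rates 1 and 2, given by their quantile functions.
q_exp1 = lambda a: -math.log(1 - a)        # Exp(1) quantile
q_exp2 = lambda a: -math.log(1 - a) / 2.0  # Exp(2) quantile
print(var_comonotonic(q_exp1, q_exp2, 0.99))
```

This additivity makes the comonotonic coupling a convenient conservative benchmark, which the paper's shuffle-of-min approximations extend.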
17.
In dose-response studies, Wadley’s problem occurs when the number of organisms that survive exposure to varying doses of a treatment is observed but the number initially present is unknown. The unknown number of organisms initially treated has traditionally been modelled by a Poisson distribution, resulting in a Poisson distribution for the number of survivors with parameter proportional to the probability of survival. Data in this setting are often overdispersed. This study revisits the beta-Poisson distribution and considers its effectiveness in modelling overdispersed data from a Wadley’s problem setting.
18.
The simultaneous estimation of Cronbach’s alpha coefficients from q populations under the compound symmetry assumption is considered. In a multi-sample scenario, it is suspected that all the Cronbach’s alpha coefficients are identical. Consequently, incorporating non-sample information (NSI) on the homogeneity of the Cronbach’s alpha coefficients into the estimation process may improve precision. We propose improved estimators based on linear shrinkage, preliminary test, and Stein-type shrinkage strategies to incorporate the available NSI into the estimation. Their asymptotic properties are derived and discussed using the concepts of bias and risk. Extensive Monte Carlo simulations were conducted to investigate the performance of the estimators.
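For reference, the single-sample quantity being shrunk is the usual Cronbach's alpha, computed from item variances and the variance of the total score. A minimal sketch using (biased, population-style) variances; the function name and toy item scores are illustrative:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Two identical items: perfectly consistent scale, alpha = 1.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

The paper's shrinkage estimators pool q such sample alphas toward a common value when homogeneity is plausible.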
19.
Many different algorithms have been proposed to solve penalized variable selection problems, in particular the lasso and its variants, including the group lasso and the fused lasso. Loss functions other than the quadratic loss also pose significant challenges for efficient solvers. Here, we note that Nesterov’s method can be used to transform an optimization problem with a general smooth convex loss into one with quadratic loss and identity covariate matrix at each iteration. After this reduction, the problem becomes much easier to solve, and in some cases can even be solved in closed form. We perform simulations and apply our implementation to phoneme discrimination.
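The closed form alluded to is well known for the lasso: with quadratic loss and identity design, each coordinate decouples and the solution is soft-thresholding. A minimal sketch (illustrative of this reduction, not the authors' implementation):

```python
def soft_threshold(z, lam):
    """Closed-form lasso solution for quadratic loss with identity design:
    argmin_b 0.5*(z - b)**2 + lam*|b|."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print([soft_threshold(z, 1.0) for z in [-3.0, -0.5, 0.0, 0.5, 2.5]])
# [-2.0, 0.0, 0.0, 0.0, 1.5]
```

Small coefficients are set exactly to zero, which is what makes the iteration-by-iteration quadratic reduction attractive for variable selection.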
20.
This study uses systematic random sampling to compare the content of “Beall’s List of Predatory Journals and Publishers” and “Cabell’s Blacklist” of journals. The Beall’s List data were generated from its new site, which maintains a new list alongside the original list. It was found that 28.5% of the sampled Beall’s List publishers are out of business and that some Cabell’s Blacklist journals have ceased publication. The main takeaway is that, among the sampled Beall’s List publishers with a working journal-publishing website, only 31.8% can be found on Cabell’s Blacklist.