30 search results found (search time: 10 ms)
21.
The power of several rank tests for testing the hypothesis of shift is derived when the underlying distributions contain outliers. The outliers are assumed to occur as the result of mixing two normal distributions with common variance. A small-sample case shows how the scores for the rank tests are found and how the exact power is computed for each of these rank tests. A Monte Carlo study provides an estimate of the power of the usual two-sample t-test.
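A minimal sketch of the kind of Monte Carlo power estimate mentioned above: the two-sample t-test applied to contaminated-normal data, where outliers arise from mixing two normals with common variance but shifted means. Sample sizes, shift, and mixing fraction below are illustrative choices, not the paper's.

```python
# Monte Carlo estimate of two-sample t-test power under a contaminated-normal
# mixture (illustrative parameters; not the paper's exact setup).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def contaminated_normal(n, shift, p_outlier=0.1, outlier_shift=5.0):
    # mixture of N(shift, 1) and N(shift + outlier_shift, 1): common variance
    base = rng.normal(shift, 1.0, n)
    mask = rng.random(n) < p_outlier
    return np.where(mask, base + outlier_shift, base)

reps, n, shift, alpha = 10_000, 10, 1.0, 0.05
rejections = 0
for _ in range(reps):
    x = contaminated_normal(n, 0.0)
    y = contaminated_normal(n, shift)
    if stats.ttest_ind(x, y).pvalue < alpha:
        rejections += 1
print("estimated power:", rejections / reps)
```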
22.
An Investigation of Uncertainty and Sensitivity Analysis Techniques for Computer Models
Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight some of the problem areas that must be addressed in actual applications. The following approaches to uncertainty and sensitivity analysis are considered: (1) response surface methodology based on input determined from a fractional factorial design; (2) Latin hypercube sampling with and without regression analysis; and (3) differential analysis. These techniques are investigated with respect to (1) ease of implementation, (2) flexibility, (3) estimation of the cumulative distribution function of the output, and (4) adaptability to different methods of sensitivity analysis. With respect to these criteria, the technique using Latin hypercube sampling and regression analysis had the best overall performance. The models used in the investigation are well documented, thus making it possible for researchers to make comparisons of other techniques with the results in this study.
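Of the three techniques compared, Latin hypercube sampling is the simplest to sketch. Below is a minimal, hypothetical Python illustration (not the study's code): each input dimension is split into n equal-probability strata, one point is drawn per stratum, the strata are paired across dimensions at random, and inverse CDFs map the unit-cube sample onto illustrative marginal distributions.

```python
# Minimal Latin hypercube sampling sketch (illustrative marginals).
import numpy as np
from scipy import stats

def latin_hypercube(n, k, rng):
    # one uniform draw inside each of n equal-probability strata, per dimension;
    # each column gets an independent random permutation of the strata
    strata = rng.permuted(np.tile(np.arange(n), (k, 1)), axis=1).T
    return (strata + rng.random((n, k))) / n

rng = np.random.default_rng(4)
u = latin_hypercube(100, 2, rng)
# map the unit-cube sample to the input variables' marginals via inverse CDFs
x1 = stats.norm(10, 2).ppf(u[:, 0])
x2 = stats.lognorm(s=0.5).ppf(u[:, 1])
print(x1.mean(), x2.mean())
```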
23.
Consider an undirected graph with a source node s and a sink node t. The anti-risk path problem is defined as the problem of finding a path from node s to node t with the least risk, under the assumption that at most one edge of each path may be blocked. Xiao et al. (J Comb Optim 17:235–246, 2009) defined the problem and presented an \(O(mn+n^2 \log n)\) time algorithm to find an anti-risk path, where n and m are the number of nodes and edges, respectively. Recently, Mahadeokar and Saxena (J Comb Optim 27:798–807, 2014) solved the problem in \(O(m+n \log n)\) time. In this paper, first, a new version of the anti-risk path (called the contra-risk path) is defined, which is more effective than an anti-risk path in many networks. Then, an algorithm to find a contra-risk path is presented, which runs in \(O(m+n \log n)\) time.
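The cited algorithms are involved; the sketch below is only a brute-force toy rendering of the path-risk notion, purely to make the definition concrete. It assumes a simplified convention (our assumption, not necessarily the papers' exact model): a blocked edge (u, v) on the chosen path is discovered on reaching u, and the traveler then takes the shortest u–t detour avoiding (u, v). It enumerates all simple paths, so it is exponential-time and suitable only for toy graphs.

```python
# Brute-force illustration of a worst-case ("risk") path objective.
import networkx as nx

def path_risk(G, path, t):
    worst = nx.path_weight(G, path, weight="weight")   # cost if nothing is blocked
    cost_so_far = 0.0
    for u, v in zip(path, path[1:]):
        H = G.copy()
        H.remove_edge(u, v)                            # suppose (u, v) is blocked
        try:
            detour = nx.shortest_path_length(H, u, t, weight="weight")
            worst = max(worst, cost_so_far + detour)
        except nx.NetworkXNoPath:
            worst = float("inf")                       # blocking this edge strands us
        cost_so_far += G[u][v]["weight"]
    return worst

def anti_risk_path_bruteforce(G, s, t):
    # exponential in general; fine only for toy graphs
    return min(nx.all_simple_paths(G, s, t), key=lambda p: path_risk(G, p, t))

G = nx.Graph()
G.add_weighted_edges_from([("s", "a", 1), ("a", "t", 1), ("s", "b", 2),
                           ("b", "t", 2), ("a", "b", 1)])
p = anti_risk_path_bruteforce(G, "s", "t")
print(p, path_risk(G, p, "t"))
```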
24.
25.
Iman Mersal, Globalizations, 2013, 10(5): 669–674
In this short narrative, an Egyptian writer living in Canada follows Egypt's 2011 revolution from afar. Being away from home, seeking the spirit of resistance not captured by factual reporting from traditional television news organizations, she takes solace in Tahrir Square protestors' sense of humor as circulated through new Internet social media. This humor, expressed in jokes, chants, graffiti, songs, and images, presents a new sense of community and solidarity, providing a means of connecting with the undefeated spirit behind unfolding events.
26.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from $2 to $3 billion in losses late on the 12th to a brief peak of $50 billion as the storm appeared to be headed for the Tampa Bay area. The storm struck the resort areas of Charlotte Harbor and moved across the densely populated central part of the state, with early poststorm estimates in the $28 to $31 billion range, and final estimates converging at $15 billion as the actual intensity at landfall became apparent. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has a great appreciation for the role of computer models in projecting losses from hurricanes. The FCHLPM contracts with a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a sophisticated computer model based on the Holland wind field. Sensitivity analyses presented in this article utilize standardized regression coefficients to quantify the contribution of the computer input variables to the magnitude of the wind speed.
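Standardized regression coefficients are simple to compute: regress the z-scored output on the z-scored inputs, and read the coefficients as each input's contribution in output standard deviations. A minimal sketch with synthetic data (variable names and values are illustrative, not the FCHLPM model):

```python
# Computing standardized regression coefficients (SRCs) on synthetic data.
import numpy as np

def standardized_regression_coefficients(X, y):
    """Fit y ~ X by least squares on z-scored data; the coefficients (SRCs)
    rank the inputs by their contribution to the output's variability."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    yz = (y - y.mean()) / y.std(ddof=1)
    coef, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return coef

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))   # stand-ins, e.g. pressure deficit, radius, speed
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(scale=0.5, size=500)
print(standardized_regression_coefficients(X, y))  # roughly [0.94, 0.31, 0.0]
```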
27.
28.
The procedure of statistical discrimination is simple in theory but not so simple in practice. An observation X0, possibly multivariate, is to be classified into one of several populations π1, …, πk which have, respectively, the density functions f1(x), …, fk(x). The decision procedure is to evaluate each density function at X0 to see which function gives the largest value fi(X0), and then to declare that X0 belongs to the population corresponding to the largest value. If these densities can be assumed to be normal with equal covariance matrices, then the decision procedure is known as Fisher's linear discriminant function (LDF) method. In the case of unequal covariance matrices the procedure is called the quadratic discriminant function (QDF) method. If the densities cannot be assumed to be normal, then the LDF and QDF might not perform well. Several different procedures have appeared in the literature which offer discriminant procedures for nonnormal data. However, these procedures are generally difficult to use and are not readily available as canned statistical programs. Another approach to discriminant analysis is to apply some sort of mathematical transformation to the samples so that their distribution is approximately normal, and then use the convenient LDF and QDF methods. One transformation that applies to all distributions equally well is the rank transformation. The result of this transformation is that a very simple and easy-to-use procedure is made available. This procedure is quite robust, as is evidenced by comparisons of the rank transform results with several published simulation studies.
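A minimal sketch of the rank-transform idea described above, using synthetic heavy-tailed data: replace each variable's values by their ranks across the combined sample, then apply the ordinary LDF. (To classify a genuinely new observation, it would first be ranked within the combined training sample; that step is omitted here.)

```python
# Rank transformation followed by the ordinary linear discriminant function.
import numpy as np
from scipy.stats import rankdata
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# heavy-tailed (lognormal) classes, where a plain LDF can perform poorly
X = np.vstack([rng.lognormal(0.0, 1.0, (100, 2)),
               rng.lognormal(1.0, 1.0, (100, 2))])
y = np.repeat([0, 1], 100)

R = np.apply_along_axis(rankdata, 0, X)   # column-wise rank transformation
lda = LinearDiscriminantAnalysis().fit(R, y)
print("training accuracy on ranks:", lda.score(R, y))
```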
29.
As modeling efforts expand to a broader spectrum of areas, the amount of computer time required to exercise the corresponding computer codes has become quite costly (several hours for a single run is not uncommon). This cost can be tied directly to the complexity of the modeling and to the large number of input variables (often numbering in the hundreds). Further, the complexity of the modeling (usually involving systems of differential equations) makes the relationships among the input variables not mathematically tractable. In this setting it is desired to perform sensitivity studies of the input-output relationships. Hence, a judicious selection procedure for the choice of values of input variables is required. Latin hypercube sampling has been shown to work well on this type of problem. However, a variety of situations require that decisions and judgments be made in the face of uncertainty. The source of this uncertainty may be lack of knowledge about probability distributions associated with input variables, or about different hypothesized future conditions, or may be present as a result of different strategies associated with a decision-making process. In this paper a generalization of Latin hypercube sampling is given that allows these areas to be investigated without making additional computer runs. In particular, it is shown how weights associated with Latin hypercube input vectors may be changed to reflect different probability distribution assumptions on key input variables and yet provide an unbiased estimate of the cumulative distribution function of the output variable. This allows different distribution assumptions on input variables to be studied without additional computer runs and without fitting a response surface. In addition, these same weights can be used in a modified nonparametric Friedman test to compare treatments. Sample size requirements needed to apply the results of the work are also considered. The procedures presented in this paper are illustrated using a model associated with the risk assessment of geologic disposal of radioactive waste.
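The paper's generalization applies to Latin hypercube samples; for clarity, the sketch below uses simple random sampling and a toy stand-in model, but the weighting idea is the same: attach likelihood-ratio weights (new input density over original input density) to the existing runs and re-estimate the output CDF without rerunning the model.

```python
# Reusing one set of model runs under a different input distribution via
# likelihood-ratio weights (illustrative distributions and toy model).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 1000
x = stats.norm(0, 1).rvs(n, random_state=rng)   # original sampling distribution
y = x**2 + 1.0                                   # stand-in for an expensive model

# New assumption on the input: N(0.5, 1). Weight each old run by the density
# ratio new/old instead of rerunning the model.
w = stats.norm(0.5, 1).pdf(x) / stats.norm(0, 1).pdf(x)

def weighted_cdf(y, w, threshold):
    # (1/n) * sum of weights over runs with output <= threshold:
    # an unbiased estimate of P(Y <= threshold) under the new input law
    return np.mean(w * (y <= threshold))

print(weighted_cdf(y, w, 2.0))   # P(model output <= 2) under the new assumption
```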
30.
A method for inducing a desired rank correlation matrix on a multivariate input random variable for use in a simulation study is introduced in this paper. This method is simple to use, is distribution free, preserves the exact form of the marginal distributions on the input variables, and may be used with any type of sampling scheme for which correlation of input variables is a meaningful concept. A Monte Carlo study provides an estimate of the bias and variability associated with the method. Input variables used in a model for study of geologic disposal of radioactive waste provide an example of the usefulness of this procedure. A textbook example shows how the output may be affected by the method presented in this paper.
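A simplified sketch in the spirit of this rank-correlation induction: generate reference scores with the target correlation structure, then reorder each input column so its ranks match the corresponding score column. Marginals are preserved exactly because values are only reordered. (The full method includes a correction based on the scores' sample correlation, which this sketch omits.)

```python
# Simplified rank-correlation induction by reordering (illustrative only).
import numpy as np
from scipy import stats

def induce_rank_correlation(X, target_corr, rng):
    n, k = X.shape
    # reference scores carrying the target correlation structure
    L = np.linalg.cholesky(target_corr)
    scores = rng.standard_normal((n, k)) @ L.T
    out = np.empty_like(X)
    for j in range(k):
        ranks = stats.rankdata(scores[:, j]).astype(int) - 1
        out[:, j] = np.sort(X[:, j])[ranks]   # reorder column j to match score ranks
    return out

rng = np.random.default_rng(3)
X = np.column_stack([rng.exponential(size=500), rng.uniform(size=500)])
target = np.array([[1.0, 0.7], [0.7, 1.0]])
Y = induce_rank_correlation(X, target, rng)
print(stats.spearmanr(Y).statistic)   # close to 0.7; marginals unchanged
```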