Similar Documents
20 similar documents found.
1.
ABSTRACT

This installment of The Balance Point column delves into the ways in which libraries create and store open educational resources (OER) in institutional repositories (IRs), addressing issues such as preservation and versioning of OER content, copyright and licensing, funding, and staffing. Drawing on interviews and the literature, programs at institutions such as the University of Minnesota, the University of Kansas, and Grand Valley State University are highlighted.

2.
Summary. We model daily catches of fishing boats in the Grand Bank fishing grounds. We use data on catches per species for a number of vessels collected by the European Union in the context of the Northwest Atlantic Fisheries Organization. Many variables can be thought to influence the amount caught: a number of ship characteristics (such as the size of the ship, the fishing technique used and the mesh size of the nets) are obvious candidates, but one can also consider the season or the actual location of the catch. Our database leads to 28 possible regressors (arising from six continuous variables and four categorical variables, whose 22 levels are treated separately), resulting in a set of 177 million possible linear regression models for the log-catch. Zero observations are modelled separately through a probit model. Inference is based on Bayesian model averaging, using a Markov chain Monte Carlo approach. Particular attention is paid to the prediction of catches for single and aggregated ships.
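
The flavour of averaging over a large model space can be shown on a toy version of the same idea. The sketch below uses simulated data and only three hypothetical candidate regressors, and it weights the enumerated models by a BIC approximation to posterior model probabilities rather than the paper's MCMC-based Bayesian model averaging; all names and values here are illustrative, not the fisheries data.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# toy stand-in for log-catch with three candidate regressors
n = 200
X = rng.normal(size=(n, 3))
y = 1.0 + 0.8 * X[:, 0] + rng.normal(scale=0.5, size=n)

def bic(y, Xd):
    # Gaussian linear-model BIC (up to an additive constant)
    n, k = Xd.shape
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = ((y - Xd @ beta) ** 2).sum()
    return n * np.log(rss / n) + k * np.log(n)

ones = np.ones((n, 1))
models, scores = [], []
for r in range(4):                       # all 2^3 = 8 regressor subsets
    for s in combinations(range(3), r):
        models.append(s)
        scores.append(bic(y, np.hstack([ones, X[:, list(s)]])))

b = np.array(scores)
w = np.exp(-0.5 * (b - b.min()))
w /= w.sum()                             # approximate posterior model weights
for s, wt in zip(models, w):
    print(s, round(float(wt), 3))        # BMA predictions average over models
```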

3.
Doug Way, Serials Review, 2013, 39(4): 214-220
Abstract

Grand Valley State University Libraries implemented Serials Solutions' web-scale discovery tool, Summon, during the fall of 2009. This case study explores whether Summon had an impact on the use of the library's resources during its first semester of implementation. An examination of usage statistics showed a dramatic decrease in the use of traditional abstracting and indexing databases and an equally dramatic increase in the use of full-text resources from full-text databases and online journal collections. The author concludes that the increase in full-text use is linked to the implementation of the web-scale discovery tool.

4.
The problem of comparing, contrasting and combining information from different sets of data is an enduring one in many practical applications of statistics. A specific problem of combining information from different sources arose in integrating information from three sets of data generated by three different sampling campaigns, at the input stage as well as at the output stage of a grey-water treatment process. For each stage, a common process trend function needs to be estimated to describe the input and output material process behaviours. Once the common input and output process models are established, the efficiency of the grey-water treatment method can be estimated. A synthesized tool for modelling the different sets of process data is created by assembling and organizing a number of existing techniques: (i) a mixed model of fixed and random effects, extended to allow for a nonlinear fixed effect; (ii) variogram modelling, a geostatistical technique; (iii) a weighted least squares regression embedded in an iterative maximum-likelihood technique to handle linear/nonlinear fixed and random effects; and (iv) a transfer-function model for the input and output processes, together with a corresponding nonlinear maximum-likelihood method for estimating the transfer function. The synthesized tool is demonstrated, in a new case study, to contrast and combine information from connected process models and to determine the change in one quality characteristic, namely pH, between the input and output materials of a grey-water filtering process.
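
Of the techniques assembled above, the variogram step (ii) is the easiest to make concrete. Below is a minimal sketch of the classical Matheron estimator of the empirical semivariogram for a one-dimensional process; the sampling times and data are simulated placeholders, not the campaign data, and the binning tolerance is an illustrative choice.

```python
import numpy as np

def empirical_semivariogram(t, z, lags, tol=0.5):
    """Matheron's classical estimator for a process z observed at times t:
    gamma(h) = average of 0.5*(z_i - z_j)**2 over pairs with |t_i - t_j| near h."""
    t, z = np.asarray(t, float), np.asarray(z, float)
    d = np.abs(t[:, None] - t[None, :])           # all pairwise time lags
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2     # half squared differences
    return np.array([sq[(d > h - tol) & (d <= h + tol)].mean() for h in lags])

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 50, 120))              # irregular sampling times
z = np.sin(t / 8.0) + rng.normal(scale=0.3, size=t.size)
print(empirical_semivariogram(t, z, lags=[1, 2, 5, 10]))
```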

5.
Process capability indices (PCIs) provide numerical measures of whether a process conforms to a defined manufacturing capability prerequisite. Companies have applied them successfully to evaluate quality and productivity and thereby compete in, and lead, high-profit markets. The PCI Cp compares the output of a process to the specification limits (SLs) by forming the ratio of the width between the SLs to the width of the natural tolerance limits, measured as six process standard deviations. Another common PCI, Cpm, incorporates two variation components: variation about the process mean and deviation of the process mean from the target. This paper introduces a meaningful generalized version of these PCIs that can operate in a fuzzy environment. The generalized PCIs can measure the capability of a fuzzy-valued process to produce products on the basis of a fuzzy quality. Fast computing formulas for the generalized PCIs are derived for normal and symmetric triangular fuzzy observations, where the fuzzy quality is defined by linear and exponential fuzzy SLs. A practical example is presented to show the performance of the proposed indices.
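
For reference, the two crisp indices being generalized have simple closed forms. The sketch below computes their sample estimates from simulated data; the fuzzy-valued extension of the paper is not reproduced, and the specification limits and target are illustrative.

```python
import numpy as np

def cp(usl, lsl, sigma):
    # Cp: specification width over the natural tolerance (6 sigma)
    return (usl - lsl) / (6.0 * sigma)

def cpm(usl, lsl, sigma, mu, target):
    # Cpm: adds the penalty for deviation of the mean from the target
    return (usl - lsl) / (6.0 * np.sqrt(sigma**2 + (mu - target) ** 2))

x = np.random.default_rng(2).normal(10.2, 0.5, size=500)  # simulated output
mu, sigma = x.mean(), x.std(ddof=1)
print(cp(12.0, 8.0, sigma), cpm(12.0, 8.0, sigma, mu, target=10.0))
```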

6.
Consider a non-homogeneous Poisson process, N(t), with mean value function Λ(t) and intensity function λ(t). A conditional test of the hypothesis that the process is homogeneous, versus alternatives for which Λ(t) is superadditive, was proposed by Hollander and Proschan (1974). A new test for superadditivity of Λ(t), based on a linear combination of the occurrence times of the process N(t), is suggested in this paper. Although this test has the same Pitman efficiency as the Hollander-Proschan test, Monte-Carlo simulation shows that it has more power against many important alternatives. Tables for the exact null distribution of the test statistic are provided.
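
The paper's statistic is not reproduced here, but a closely related classical example of a test built from a linear combination of occurrence times is the Laplace test: conditional on n events in (0, τ], the occurrence times are i.i.d. uniform under homogeneity, so their standardized sum is approximately standard normal. A minimal sketch, with toy event times:

```python
import numpy as np
from scipy.stats import norm

def laplace_trend_test(times, tau):
    """Laplace test for a Poisson process observed on (0, tau]: under H0
    (homogeneous), the standardized sum of occurrence times is approximately
    N(0, 1); large values indicate an increasing intensity."""
    t = np.asarray(times, float)
    n = t.size
    z = (t.sum() - n * tau / 2.0) / (tau * np.sqrt(n / 12.0))
    return z, norm.sf(z)          # statistic and one-sided p-value

events = [0.9, 2.1, 4.8, 5.5, 6.7, 7.2, 8.8, 9.1, 9.6]  # toy event times
print(laplace_trend_test(events, tau=10.0))
```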

7.
Given a general homogeneous non-stationary autoregressive integrated moving average process ARIMA(p,d,q), the corresponding model for the subseries obtained by systematic sampling is derived. The article then shows that the sampled subseries is approximately an integrated moving average process IMA(d,l), l ≤ d-1, regardless of the autoregressive and moving average structures in the original series. In particular, the sampled subseries from an ARIMA(p,1,q) process is approximately a simple random walk model.
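
This limiting behaviour is easy to check by simulation. The sketch below (parameter values are illustrative) simulates an ARIMA(1,1,1) series, samples every k-th observation, and shows the lag-1 autocorrelation of the differenced subseries shrinking toward zero as k grows, consistent with the subseries approaching a simple random walk.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate an ARIMA(1,1,1): the first difference w_t is ARMA(1,1)
n = 100_000
e = rng.normal(size=n)
w = np.zeros(n)
for t in range(1, n):
    w[t] = 0.6 * w[t - 1] + e[t] + 0.4 * e[t - 1]
x = np.cumsum(w)                        # integrate once (d = 1)

for k in (1, 5, 20, 50):
    dy = np.diff(x[::k])                # difference of the sampled subseries
    r1 = np.corrcoef(dy[:-1], dy[1:])[0, 1]
    print(k, round(float(r1), 3))       # lag-1 autocorrelation -> 0 as k grows
```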

8.
Optimal statistical process control (SPC) requires models of both in-control and out-of-control process states. Whereas a normal distribution is the generally accepted model for the in-control state, there is doubt as to the existence of reliable models for out-of-control cases. Various process models in the literature for discrete manufacturing systems (the parts industry) can be treated as bounded discrete-space Markov chains, completely characterized by the original in-control state and a transition matrix for shifts to an out-of-control state. The present work extends these models by using a continuous-state Markov chain that incorporates non-random corrective actions. These actions are carried out according to the SPC technique and substantially affect the model. The developed stochastic model yields a Laplace distribution for the process mean. An alternative approach, based on information theory, also results in a Laplace distribution. Real-data tests confirm the applicability of a Laplace distribution for the parts industry and show that the distribution parameter is mainly controlled by the SPC sample size.
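
As a small companion to the claimed Laplace behaviour, the sketch below fits a Laplace distribution to data by maximum likelihood, using the standard closed-form estimates; the data here are simulated, not the real-data tests of the paper.

```python
import numpy as np

def fit_laplace(x):
    # ML estimates for Laplace(mu, b): location = sample median,
    # scale = mean absolute deviation about the median
    x = np.asarray(x, float)
    mu = np.median(x)
    return mu, np.abs(x - mu).mean()

sample = np.random.default_rng(5).laplace(loc=0.0, scale=0.8, size=2000)
print(fit_laplace(sample))    # should be near (0.0, 0.8)
```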

9.
An integrated process control (IPC) procedure is a scheme that combines the engineering process control (EPC) and statistical process control (SPC) procedures for a process in which both noise and a special cause are present. The most efficient way of reducing the effect of the noise is to adjust the process by its forecast, which is done by the EPC procedure. The special cause, which produces significant deviations of the process level from the target, can be detected by the monitoring scheme, which is done by the SPC procedure. The effects of special causes can be eliminated by a rectifying action. The performance of the IPC procedure is evaluated in terms of the average run length (ARL) or the expected cost per unit time (ECU). In designing the IPC procedure for practical use, it is essential to derive the properties constituting the ARL or ECU based on the proposed process model. The process is usually assumed to start with noise only, with special causes occurring at random times afterwards. The special cause is assumed to change the mean as well as all the parameters of the in-control model. Linear filter models for the process level, the controller, and the observed deviations of the IPC procedure are developed here.

10.
Based on mixed cumulants up to order six, this paper provides a four-moment approximation to the distribution of a ratio of two general quadratic forms in normal variables. The approximation is applied to calculate the percentile points of modified F-test statistics for testing treatment effects when the standard F-ratio test is misleading because of dependence among observations. For the special case where the data are generated by an AR(1) process, the approximation is evaluated by a simulation study. For the general SARMA(p,q)(P,Q)_s process, a modified F-test statistic is given, and its distribution for the (0,1)(0,1)_12 process is approximated by the moment approximation technique.

11.
A representation of the innovation random variable for a gamma distributed first-order autoregressive process was found by Lawrance (1982) in the form of a compound Poisson distribution, connected with a shot-noise process. In this note we simplify Lawrance's representation by providing a direct representation in terms of density functions.
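
Lawrance's compound Poisson representation, as it is commonly stated, can be sampled directly. The sketch below assumes the form eps = sum of E_i * rho**U_i with N ~ Poisson(-k*log(rho)), U_i uniform and E_i exponential (an assumption of this sketch, stated here rather than quoted from the paper), and checks that the resulting AR(1) preserves the Gamma marginal; parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def gar1_innovation(k, beta, rho):
    """One innovation of the gamma AR(1) X_t = rho*X_{t-1} + eps_t with
    Gamma(k, scale=1/beta) marginals, via the compound Poisson form
    eps = sum_{i<=N} E_i * rho**U_i, N ~ Poisson(-k*log(rho)) (assumed)."""
    n = rng.poisson(-k * np.log(rho))
    return (rng.exponential(1.0 / beta, size=n) * rho ** rng.uniform(size=n)).sum()

k, beta, rho = 2.0, 1.0, 0.5
x, xs = rng.gamma(k, 1.0 / beta), []
for _ in range(50_000):
    x = rho * x + gar1_innovation(k, beta, rho)
    xs.append(x)
print(np.mean(xs), np.var(xs))   # compare with k/beta = 2.0, k/beta**2 = 2.0
```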

12.
Rapid response to customer and market requirements promotes the concurrent engineering (CE) technique in product and process design. Decisions about the process quality target, the SPC method, the sampling plan, and control chart parameters can be made at the process quality planning stage, based on historical data and a process knowledge database. It is therefore a reasonable trend to introduce concepts and results from process quality evaluation and process capability analysis, CE, and SPC into process planning and tolerance design. A new systematic method for the concurrent design of process quality, statistical tolerance (ST), and control charts is presented, based on an NSFC research program. A set of standardized process quality indices (PQIs) for variables is introduced to measure and evaluate process yield, process centering, and quality loss. This index system, which has relatively strong compatibility and adaptability, is based on reasoned grading using the series of preferred numbers and arithmetic progressions. The expected process quality under this system can be assured by a standardized interface between PQIs and SPC, that is, a quality-oriented statistical tolerance zone. A quality-oriented ST and SPC approach that quantitatively specifies what a desired process is, and how to assure it, realizes optimal control of a process toward a predetermined quality target.

13.
Traditional control charts assume independence of observations obtained from the monitored process. However, if the observations are autocorrelated, these charts often do not perform as intended by the design requirements. Recently, several control charts have been proposed to deal with autocorrelated observations. The residual chart, modified Shewhart chart, EWMAST chart, and ARMA chart are such charts widely used for monitoring the occurrence of assignable causes in a process when the process exhibits inherent autocorrelation. Besides autocorrelation, one other issue is the unknown values of true process parameters to be used in the control chart design, which are often estimated from a reference sample of in-control observations. Performances of the above-mentioned control charts for autocorrelated processes are significantly affected by the sample size used in a Phase I study to estimate the control chart parameters. In this study, we investigate the effect of Phase I sample size on the run length performance of these four charts for monitoring the changes in the mean of an autocorrelated process, namely an AR(1) process. A discussion of the practical implications of the results and suggestions on the sample size requirements for effective process monitoring are provided.
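
To make the Phase I dependence concrete, here is a minimal sketch of the residual-chart idea for an AR(1) process: parameters estimated from a Phase I sample define the 3-sigma limits that Phase II residuals are checked against. The estimation method, sample size, and shift used here are illustrative choices, not the paper's study design.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_ar1(n, phi, mu=0.0):
    x = np.empty(n)
    x[0] = mu + rng.normal()
    for t in range(1, n):
        x[t] = mu + phi * (x[t - 1] - mu) + rng.normal()
    return x

# Phase I: estimate mu, phi and the residual sd from an in-control sample
m = 200                                   # Phase I sample size under study
p1 = simulate_ar1(m, phi=0.5)
xc = p1 - p1.mean()
phi_hat = (xc[1:] @ xc[:-1]) / (xc @ xc)  # Yule-Walker-type estimate
resid = xc[1:] - phi_hat * xc[:-1]
mu_hat, s_hat = p1.mean(), resid.std(ddof=1)

# Phase II: chart the one-step residuals with Shewhart 3-sigma limits
def signals(x):
    r = (x[1:] - mu_hat) - phi_hat * (x[:-1] - mu_hat)
    return np.abs(r) > 3.0 * s_hat

print(signals(simulate_ar1(100, phi=0.5, mu=1.5)).any())  # shifted mean
```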

14.
This paper provides the pseudo-maximum-likelihood estimator (PMLE) and asymptotic theory for the GARCH(1,1) process. Strong consistency of the PMLE is established by appealing to conditions given in Jeantheau (1998) concerning the existence of a stationary and ergodic solution to the multivariate GARCH(p, q) process. Asymptotic normality of the PMLE is proved by appealing to martingale techniques.
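
The object the PMLE maximizes is the Gaussian quasi-log-likelihood built from the GARCH(1,1) variance recursion. A minimal sketch follows; initializing the recursion at the sample variance is one common convention, assumed here.

```python
import numpy as np

def garch11_quasi_loglik(params, eps):
    """Gaussian quasi-log-likelihood for eps_t = sigma_t * z_t with
    sigma2_t = omega + alpha*eps_{t-1}**2 + beta*sigma2_{t-1}."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -np.inf                      # outside the stationary region
    sigma2 = np.empty(eps.size)
    sigma2[0] = eps.var()                   # common initialization convention
    for t in range(1, eps.size):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * sigma2) + eps**2 / sigma2)
```

The PMLE is then the maximizer of this function over (omega, alpha, beta), for example by applying a numerical optimizer to its negative.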

15.
Processes of serially dependent counts with deflation or inflation of zeros are commonly observed in many applications. This paper investigates the monitoring of such a process, the first-order zero-modified geometric integer-valued autoregressive process (ZMGINAR(1)). In particular, two control charts, the upper-sided and lower-sided CUSUM charts, are developed to detect shifts in the mean of the ZMGINAR(1) process. Both the average run length and the standard deviation of the run length of these two charts are investigated using Markov chain approaches. An extensive simulation is also conducted to assess the performance of the charts, and the presented methods are applied to two sets of real data arising from a study of drug use.
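
The upper-sided chart follows the standard CUSUM recursion for counts. A minimal sketch is below; the reference value k and decision limit h are illustrative, and in practice would be tuned via the Markov chain run-length computations described above.

```python
def upper_cusum(counts, k, h):
    """Upper-sided CUSUM for a count series: C_t = max(0, C_{t-1} + x_t - k),
    signalling an upward shift in the mean the first time C_t > h."""
    c = 0.0
    for t, x in enumerate(counts):
        c = max(0.0, c + x - k)
        if c > h:
            return t                 # index of the first signal
    return None                      # no signal

print(upper_cusum([1, 0, 2, 1, 5, 6, 7, 4], k=2.0, h=5.0))  # -> 5
```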

16.
Recently, several new applications of control chart procedures for short production runs have been introduced. Bothe (1989) and Burr (1989) proposed the use of control chart statistics obtained by scaling the quality characteristic by target values or process estimates of a location and scale parameter. The performance of these control charts can be significantly affected by the use of incorrect scaling parameters, resulting in either an excessive "false alarm rate" or insensitivity to the detection of moderate shifts in the process. To correct these deficiencies, Quesenberry (1990, 1991) developed the Q-chart, which is formed from running process estimates of the sample mean and variance. For the case where both the process mean and variance are unknown, the Q-chart statistic is formed from the standard inverse Z-transformation of a t-statistic. Q-charts do not perform correctly, however, in the presence of special cause disturbances at process startup. This has recently been supported by results published by Del Castillo and Montgomery (1992), who recommend an alternative control chart procedure based on a first-order adaptive Kalman filter model. Consistent with the recommendations of Del Castillo and Montgomery, we propose an alternative short run control chart procedure based on the second-order dynamic linear model (DLM). The control chart is shown to be useful for the early detection of unwanted process trends. Model and control chart parameters are updated sequentially in a Bayesian estimation framework, providing the greatest degree of flexibility in the level of prior information incorporated into the model. The result is a weighted moving average control chart statistic that can be used to provide running estimates of process capability. The average run length performance of the control chart is compared to the optimal performance of the exponentially weighted moving average (EWMA) chart, as reported by Gan (1991). Using a simulation approach, the second-order DLM control chart is shown to provide better overall performance than the EWMA for short production run applications.
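
A minimal sketch of the filtering recursion behind a second-order (linear-growth) DLM follows; the state is (level, slope), and the standardized one-step forecast error is a natural chart statistic. The variances below are illustrative placeholders, and the paper's full Bayesian sequential updating is reduced here to the plain Kalman step.

```python
import numpy as np

G = np.array([[1.0, 1.0], [0.0, 1.0]])   # linear-growth state transition
F = np.array([1.0, 0.0])                 # observe the level only
W = np.diag([0.05, 0.01])                # state evolution variances (assumed)
V = 1.0                                  # observation variance (assumed)

def dlm_step(m, C, y):
    """One Kalman step for the second-order DLM; returns the posterior
    mean/covariance and the standardized one-step forecast error."""
    a = G @ m                             # prior state mean
    R = G @ C @ G.T + W                   # prior state covariance
    f, Q = F @ a, F @ R @ F + V           # one-step forecast and its variance
    e = y - f                             # forecast error (monitored)
    A = R @ F / Q                         # adaptive (Kalman) gain
    return a + A * e, R - np.outer(A, A) * Q, e / np.sqrt(Q)

m, C = np.zeros(2), np.eye(2) * 10.0      # vague prior for a short run
for y in [0.1, 0.3, 0.2, 0.9, 1.4, 2.2]:  # early trend detection example
    m, C, z = dlm_step(m, C, y)
    print(round(float(z), 2))
```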

17.
Process capability indices (PCIs) are extensively used in the manufacturing industries to confirm whether manufactured products meet their specifications. PCIs can be used to judge process precision, process accuracy, and process performance. Developing sampling plans based on PCIs is therefore inevitable, and such plans are very useful for maintaining and improving product quality in the manufacturing industries. In view of this, we propose a variables sampling system based on the process capability index Cpmk, which takes into account both process yield and process loss, for the case where the quality characteristic under study has double specification limits. The proposed sampling system is effective in compliance testing. The advantages of this system over existing sampling plans are also discussed. To determine the optimal parameters, tables are constructed by formulating the problem as a nonlinear program in which the average sample number is minimized subject to the producer and consumer risks.
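
For reference, the index underlying the plan combines the yield-oriented Cpk numerator with the loss-oriented Cpm denominator. A sketch of its usual sample estimate follows; the specification limits, target, and data are illustrative.

```python
import numpy as np

def cpmk_hat(x, usl, lsl, target):
    """Sample estimate of Cpmk:
        min(USL - mu, mu - LSL) / (3 * sqrt(sigma**2 + (mu - T)**2))."""
    x = np.asarray(x, float)
    mu, sigma = x.mean(), x.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * np.hypot(sigma, mu - target))

x = np.random.default_rng(9).normal(10.1, 0.4, size=100)
print(cpmk_hat(x, usl=12.0, lsl=8.0, target=10.0))
```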

18.
In this paper, we propose a new test for coefficient stability of an AR(1) model against the random coefficient autoregressive model of order 1, assuming neither a stationary nor a non-stationary process under the null hypothesis of a constant coefficient. The proposed test is obtained as a modification of the locally best invariant (LBI) test of Lee [(1998). Coefficient constancy test in a random coefficient autoregressive model. J. Statist. Plann. Inference 74, 93–101]. We examine finite sample properties of the proposed test by Monte Carlo experiments, comparing it with other existing tests, in particular the LBI test of McCabe and Tremayne [(1995). Testing a time series for difference stationarity. Ann. Statist. 23 (3), 1015–1028], which is for the null of a unit root process against the alternative of a stochastic unit root process.
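
The null and alternative are easy to visualize by simulation. The sketch below generates a first-order random coefficient autoregressive path x_t = (phi + b_t) x_{t-1} + e_t, where omega = Var(b_t) = 0 recovers the constant-coefficient AR(1) of the null; parameter values are illustrative, and the LBI-type statistic itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def rca1(n, phi, omega):
    """Simulate x_t = (phi + b_t) x_{t-1} + e_t with b_t ~ N(0, omega);
    omega = 0 gives the constant-coefficient AR(1) of the null hypothesis."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = (phi + rng.normal(0.0, np.sqrt(omega))) * x[t - 1] + rng.normal()
    return x

null_path = rca1(500, phi=0.8, omega=0.0)   # H0: constant coefficient
alt_path = rca1(500, phi=0.8, omega=0.05)   # H1: random coefficient
print(null_path.var(), alt_path.var())      # extra variability under H1
```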

19.
This article conducts a Bayesian analysis of bivariate degradation models based on the inverse Gaussian (IG) process. Assume that a product has two quality characteristics (QCs) and that each QC is governed by an IG process. The dependence between the QCs is described by a copula function. A bivariate simple IG process model and three bivariate IG process models with random effects are investigated using Bayesian methods. In addition, a simulation example is given to illustrate the effectiveness of the proposed methods. Finally, an example involving heavy machine tools is presented to validate the proposed models.
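
A minimal sketch of one marginal degradation path under a common IG-process parameterization, assumed here: independent increments Y(t) - Y(s) that are inverse Gaussian with mean mu*dL and shape lam*dL**2, where dL = Lambda(t) - Lambda(s) for a monotone time scale Lambda. The copula coupling of the two QCs and the random effects are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)

def ig_process_path(times, mu, lam, Lambda=lambda t: t):
    """One path of an IG process: increments over [s, t] are inverse Gaussian
    with mean mu*dL and shape lam*dL**2, where dL = Lambda(t) - Lambda(s)."""
    dL = np.diff(Lambda(np.asarray(times, float)))
    inc = rng.wald(mu * dL, lam * dL**2)   # numpy's Wald = inverse Gaussian
    return np.concatenate([[0.0], np.cumsum(inc)])

path = ig_process_path(np.linspace(0.0, 10.0, 101), mu=1.0, lam=4.0)
print(path[-1])   # degradation level at t = 10; mean approx mu*Lambda(10) = 10
```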

20.
The last 20 years have seen an increasing emphasis on statistical process control (SPC) as a practical approach to reducing variability in industrial applications. Control charts are used to detect problems, such as outliers or excess variability in subgroup means, that may have a special cause. We describe an approach to the computation of control limits for exponentially weighted moving average (EWMA) control charts in which the usual statistics of classical charts are replaced by linear combinations of order statistics; in particular, the trimmed mean and Gini's mean difference replace the mean and the range, respectively. Control limits are derived, and simulated average run length experiments show the trimmed control charts to be less influenced by extreme observations than their classical counterparts, leading to tighter control limits. An example is given that illustrates the benefits of the proposed charts.
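
The building blocks of the proposed chart are simple to state. The sketch below computes the two robust subgroup statistics and runs the EWMA recursion over them; the trimming proportion, smoothing constant, and data are illustrative, and the control-limit calibration would come from the average run length analysis.

```python
import numpy as np

def trimmed_mean(x, p=0.1):
    # symmetric trimmed mean: drop the lowest and highest 100p% of the subgroup
    x = np.sort(np.asarray(x, float))
    g = int(np.floor(p * x.size))
    return x[g:x.size - g].mean()

def gini_mean_difference(x):
    # Gini's mean difference: mean absolute difference over all pairs
    x = np.asarray(x, float)
    n = x.size
    return np.abs(x[:, None] - x[None, :]).sum() / (n * (n - 1))

def ewma(stats, lam=0.2):
    # z_t = lam*s_t + (1 - lam)*z_{t-1}, started at the first statistic
    z, out = stats[0], []
    for s in stats:
        z = lam * s + (1.0 - lam) * z
        out.append(z)
    return out

rng = np.random.default_rng(13)
subgroups = rng.normal(size=(20, 5))
subgroups[7, 0] += 8.0                    # a single gross outlier
print(ewma([trimmed_mean(g, p=0.2) for g in subgroups]))
print(gini_mean_difference(subgroups[7]))  # robust spread, contaminated group
```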
