Similar documents
19 similar documents found.
1.
Research on the application of process control charts to the analysis of stock return volatility (Cited by: 1; self-citations: 0; by others: 1)
An AR-type residual control chart is applied to handle autocorrelation in the process, and the conditional standard deviation of the in-control process replaces the unconditional standard deviation in constructing the control limits, to handle volatility clustering. In the empirical study, the weekly return series of stocks in the SSE 50 Index are taken as the sample processes and grouped into four classes according to their statistical properties; an appropriate control chart is selected and illustrated for monitoring each class. The out-of-control points identified by the charts are then verified and analysed with examples, in order to assess the effectiveness of applying control charts to the monitoring of stock return series.
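A minimal sketch of the residual-chart idea described above, assuming an AR(1) model fitted by ordinary least squares and conventional 3-sigma limits on the residuals; the data and parameters are illustrative stand-ins, not taken from the paper.

```python
import numpy as np

def ar1_residual_chart(returns, k=3.0):
    """Fit an AR(1) model by OLS and flag residuals outside +/- k sigma."""
    y, x = returns[1:], returns[:-1]
    X = np.column_stack([np.ones_like(x), x])            # intercept + lagged return
    coef = np.linalg.lstsq(X, y, rcond=None)[0]          # [c, phi1]
    resid = y - X @ coef                                 # one-step-ahead residuals
    sigma = resid.std(ddof=2)                            # unconditional residual std
    ucl, lcl = k * sigma, -k * sigma
    signals = np.where((resid > ucl) | (resid < lcl))[0] + 1  # positions in the original series
    return resid, (lcl, ucl), signals

rng = np.random.default_rng(0)
weekly_returns = rng.normal(0, 0.03, 200)                # stand-in for an SSE 50 stock
resid, limits, signals = ar1_residual_chart(weekly_returns)
print("control limits:", limits, "signals at:", signals)
```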

2.
Application of the X̄–R control chart in multi-variety, small-batch production (Cited by: 1; self-citations: 0; by others: 1)
Zhao Tao. 《管理工程学报》 (Journal of Industrial Engineering and Engineering Management), 1998, 12(3): 35-40, 45
This paper focuses on the theoretical basis of the X̄–R chart design parameters A2, D3 and D4 in multi-variety, small-batch production, and modifies and transforms the traditional control chart so that on-line statistical process control can be implemented quickly.
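As a reminder of how these design constants enter the chart, here is a small sketch of conventional X̄–R limits for subgroups of size 5; the constants are the standard textbook values, and the modified/transformed chart the paper develops is not reproduced here.

```python
import numpy as np

# Standard constants for subgroup size n = 5 (textbook values).
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """subgroups: array of shape (m, n) with m samples of size n."""
    xbar = subgroups.mean(axis=1)                        # subgroup means
    r = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges
    xbarbar, rbar = xbar.mean(), r.mean()
    limits = {
        "xbar": (xbarbar - A2 * rbar, xbarbar + A2 * rbar),
        "R": (D3 * rbar, D4 * rbar),
    }
    return xbar, r, limits

rng = np.random.default_rng(1)
data = rng.normal(10.0, 0.2, size=(25, 5))   # 25 subgroups of 5 measurements
print(xbar_r_limits(data)[2])
```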

3.
Processes under control often exhibit autocorrelation and volatility clustering at the same time. To monitor such processes, this paper proposes an AR-GARCH-type residual control chart: an autoregressive model combined with a residual chart handles the autocorrelation, while the conditional standard deviation of the in-control process replaces the unconditional standard deviation in constructing the control limits, handling the volatility clustering. The effectiveness of the chart is analysed in an application to monitoring weekly stock return series.
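A rough sketch of the time-varying limits part of this idea, using a GARCH(1,1) recursion with assumed (not estimated) parameters applied to a residual series; in practice the AR-GARCH model would be fitted to the data, for example by maximum likelihood.

```python
import numpy as np

def garch_limits(resid, omega=1e-5, alpha=0.08, beta=0.9, k=3.0):
    """Conditional-sigma limits +/- k * sigma_t from a GARCH(1,1) recursion.
    omega/alpha/beta are illustrative values, not fitted parameters."""
    var = np.empty_like(resid)
    var[0] = resid.var()                       # initialize at the sample variance
    for t in range(1, len(resid)):
        var[t] = omega + alpha * resid[t - 1] ** 2 + beta * var[t - 1]
    sigma = np.sqrt(var)
    signals = np.where(np.abs(resid) > k * sigma)[0]
    return sigma, signals

rng = np.random.default_rng(2)
resid = rng.normal(0, 0.02, 300)               # stand-in for AR residuals of weekly returns
sigma_t, signals = garch_limits(resid)
print("out-of-control points:", signals)
```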

4.
Qu Hui, Ji Ping. 《管理科学》 (Journal of Management Science), 2016, 29(6): 28-38
The time-varying covariance matrix of financial assets is a key input for portfolio allocation, risk management and other practical activities. Early covariance forecasting models used daily or lower-frequency data and mostly suffered from estimation difficulties and the curse of dimensionality. Intraday high-frequency data allow nonparametric ex-post estimators of the covariance matrix to be constructed, turning it from a latent variable into an observable variable that can be modelled directly, which reduces the complexity of covariance model estimation and improves applicability in high dimensions. High-frequency data can further be used to identify jumps that occur in the prices of several assets within the same intraday sampling interval, i.e., co-jumps. Since co-jumps are mostly triggered by macroeconomic news announcements and policy releases, and this information is eventually absorbed into the covariance matrix, co-jumps may carry information useful for covariance forecasting; they are therefore identified and introduced into the forecasting models. The heterogeneous autoregressive model is extended to a multivariate form as the baseline model for the nonparametric covariance estimator, and three extended models are built by adding a 0/1 co-jump indicator, the co-jump intensity estimated with a Hawkes model, or both. Using the mean squared error and mean absolute error loss functions together with the Diebold-Mariano test, the out-of-sample forecasting performance of each extended model is compared with the baseline, and the model confidence set test is used to select the best extended model. The models are also compared when used in a global minimum variance portfolio strategy. An empirical study based on one-minute high-frequency prices of five highly liquid stocks from different industries among the SSE 50 constituents shows that: (1) co-jump intensity contributes more significantly to covariance forecasting than the co-jump indicator; (2) introducing co-jump intensity significantly improves the goodness of fit and out-of-sample forecast accuracy for the covariance; (3) the extended multivariate model that includes both co-jump intensity and the co-jump indicator and applies a matrix logarithm transformation to ensure positive definiteness is the best model in both statistical and economic terms. The findings confirm the value of introducing co-jumps into covariance forecasting models, reveal the contribution of macro information to covariance forecasting, and offer practical guidance for financial managers and investors in risk management and asset allocation.
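To make the HAR-with-co-jumps idea concrete, here is a heavily simplified univariate sketch: an HAR regression of a realized (co)variance series on its daily, weekly and monthly averages plus a lagged 0/1 co-jump indicator, estimated by OLS. The multivariate form, the Hawkes-based co-jump intensity and the matrix-log transform used in the paper are not reproduced, and the simulated data are placeholders.

```python
import numpy as np

def har_cojump_fit(rv, cojump):
    """OLS fit of rv_t on daily/weekly/monthly lags and a lagged co-jump dummy."""
    def lag_mean(x, t, h):                   # mean of the last h observations before t
        return x[t - h:t].mean()
    rows, y = [], []
    for t in range(22, len(rv)):
        rows.append([1.0, rv[t - 1], lag_mean(rv, t, 5), lag_mean(rv, t, 22), cojump[t - 1]])
        y.append(rv[t])
    X, y = np.asarray(rows), np.asarray(y)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return beta                              # [const, daily, weekly, monthly, co-jump]

rng = np.random.default_rng(3)
rv = np.abs(rng.normal(1e-4, 2e-5, 500))     # stand-in realized variance series
cj = (rng.random(500) < 0.05).astype(float)  # stand-in co-jump indicator
print(har_cojump_fit(rv, cj))
```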

5.
This paper analyses economic benefits, long-term corporate development, check sheets on the causes of transmission-loss problems, and monitoring charts for the measurement process, and on that basis discusses the application and important role of statistics in the management of a natural gas company.

6.
Factor analysis is an important topic in statistics. It is a method for studying the internal dependence structure of a covariance or correlation matrix, explaining the observed variables in terms of a small number of weakly correlated, unobservable latent variables or common factors. Its basic idea is to group the observed variables so that highly correlated, closely related variables fall into the same group while the correlation between variables in different groups is low. This paper discusses the factor analysis problem in statistics.
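A minimal illustration of this grouping idea, using scikit-learn's FactorAnalysis on simulated data; the number of variables, factors and loadings below are arbitrary choices, not from the paper.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
# Two latent common factors drive six observed variables plus noise.
factors = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
X = factors @ loadings.T + 0.3 * rng.normal(size=(300, 6))

fa = FactorAnalysis(n_components=2).fit(X)
print(np.round(fa.components_, 2))   # estimated loadings group correlated variables together
```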

7.
Interval principal component analysis based on the empirical correlation matrix (Cited by: 6; self-citations: 1; by others: 5)
A principal component analysis method for interval-valued data samples is presented. Empirical descriptive statistics for interval data samples are first studied, including the univariate mean and variance and the bivariate covariance and correlation coefficient. Based on the empirical correlation matrix, an algorithm for interval principal component analysis is then given, whose output expresses the principal component scores in interval form. A numerical example shows that the method is simple to implement and overcomes the shortcomings of existing interval PCA methods.
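A sketch of the first step described above: empirical moments for interval-valued variables and an eigendecomposition of the resulting correlation matrix. The formulas follow one common convention from symbolic data analysis (each interval treated as uniform on [a, b], midpoints used for the cross moments); the paper's exact definitions and its interval-valued component scores may differ.

```python
import numpy as np

def interval_stats(lo, hi):
    """Empirical mean and variance of one interval variable (uniform-on-interval convention)."""
    mean = np.mean((lo + hi) / 2.0)
    var = np.mean((lo**2 + lo * hi + hi**2) / 3.0) - mean**2
    return mean, var

def interval_correlation(los, his):
    """Correlation matrix: midpoints for cross moments, interval variances on the diagonal."""
    p, n = los.shape[1], los.shape[0]
    mids = (los + his) / 2.0
    means = np.array([interval_stats(los[:, j], his[:, j])[0] for j in range(p)])
    varis = np.array([interval_stats(los[:, j], his[:, j])[1] for j in range(p)])
    cov = (mids - means).T @ (mids - means) / n
    np.fill_diagonal(cov, varis)
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)

rng = np.random.default_rng(5)
centers = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 3))     # correlated interval centers
half = np.abs(rng.normal(0.2, 0.05, size=(50, 3)))               # interval half-widths
los, his = centers - half, centers + half
eigval, eigvec = np.linalg.eigh(interval_correlation(los, his))
print("explained variance ratio:", np.round(eigval[::-1] / eigval.sum(), 3))
```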

8.
The state of statistical control of autocorrelated processes (Cited by: 13; self-citations: 0; by others: 13)
A basic assumption of statistical quality control is that the observations are statistically independent, yet autocorrelated observations are common in practice. After a brief introduction to residual control charts, this paper discusses what a state of statistical control means for an autocorrelated process, proposes a way to judge whether an autocorrelated process is in statistical control and to find a model of the process when it is, and illustrates the approach with examples. Methods for maintaining the state of statistical control of an autocorrelated process are given at the end.

9.
By treating service process control as a feedback system to control the "moments of truth" in the service process, and by applying statistical design of experiments to measure the values of the controlled outputs and the controllable variables, this work provides managers with a fast and effective way to control service processes, reconciles the needs of stakeholders in service organizations, and broadens the range of application and research of feedback systems.

10.
In oil refineries, many production units must switch production schemes to make different products as required, which creates multi-variety, small-batch production. For process control under these conditions, conventional statistical process control (SPC) techniques are not effective. The target control chart for "small-batch" production processes provides an effective tool for this purpose.
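A minimal sketch of the deviation-from-target idea behind such charts, plotting differences from each product's nominal value on one common chart; the data, nominal values and limits below are illustrative, not the paper's procedure.

```python
import numpy as np

def target_chart(measurements, targets, sigma, k=3.0):
    """Deviations from each run's target, pooled onto a single small-batch chart."""
    dev = np.asarray(measurements) - np.asarray(targets)
    signals = np.where(np.abs(dev) > k * sigma)[0]
    return dev, signals

# Three short runs of different products, each measured against its own nominal value.
meas    = [50.1, 49.9, 50.2,  80.4, 80.1,  120.0, 119.6, 120.3]
targets = [50.0, 50.0, 50.0,  80.0, 80.0,  120.0, 120.0, 120.0]
dev, signals = target_chart(meas, targets, sigma=0.12)
print("deviations:", dev, "signals at:", signals)
```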

11.
Traditional control charts mostly assume that the quality characteristic follows a normal distribution, but in many situations this assumption does not hold. Based on the maximum entropy distribution, this paper improves the Shewhart and CUSUM control charts from the perspectives of chart construction and chart evaluation, respectively. A maximum entropy Shewhart chart is first constructed according to an economic design principle, improving the Shewhart chart; an evaluation method for the CUSUM chart combining the maximum entropy distribution with a Markov chain approach is then proposed. Simulation results show that the improved maximum entropy Shewhart chart outperforms the original chart, and that the maximum entropy evaluation of the CUSUM chart yields results closer to the true situation. For monitoring shifts of different sizes, the maximum entropy Shewhart chart is better suited to large shifts when the distribution is unknown, while the adaptive CUSUM chart monitors small shifts more effectively.
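For reference, a plain tabular CUSUM in its standard form; the maximum entropy construction and the Markov chain evaluation the abstract describes are beyond this sketch, and the reference value k and decision interval h below are the usual textbook defaults.

```python
import numpy as np

def tabular_cusum(x, mu0, sigma, k=0.5, h=5.0):
    """Two-sided tabular CUSUM; k and h are in units of sigma."""
    cp = cm = 0.0
    signals = []
    for t, xt in enumerate(x):
        z = (xt - mu0) / sigma
        cp = max(0.0, cp + z - k)            # upper cumulative sum
        cm = max(0.0, cm - z - k)            # lower cumulative sum
        if cp > h or cm > h:
            signals.append(t)
            cp = cm = 0.0                    # restart after a signal
    return signals

rng = np.random.default_rng(6)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(0.8, 1, 50)])  # small shift at t = 150
print("CUSUM signals at:", tabular_cusum(x, mu0=0.0, sigma=1.0))
```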

12.
Run-length distributions for various statistical process-control charts and techniques for computing them recently have been reported in the literature. The real advantages of knowing the run-length distribution for a process-control chart versus knowing only the associated average-run length of the chart have not been exploited. Our purpose is to use knowledge of the run-length distribution as an aid in deciding if an out-of-control signal is a true signal or merely a false alarm. The ability to distinguish between true and false signals is important, especially in operations where it is costly to investigate the causes of out-of-control conditions. Knowledge of the run-length distribution allows us to compute likelihood ratios, which are simple to calculate and to interpret and which are used to determine the odds of obtaining an out-of-control signal at a particular run length when a shift in the process mean actually has occurred vis-a-vis no such shift. We extend our analysis in a Bayesian sense by incorporating prior information on the distribution of the shift size of the process mean, combined with the likelihood ratio obtained from the run-length distribution, to determine if a shift larger than a critical size has occurred. We give examples for the Shewhart chart, the exponentially weighted moving-average chart, and the special-cause control chart for processes with autocorrelated observations. The examples show that the current recommended usage of the average-run length alone as a guide for determining whether a signal is a false alarm or otherwise can be misleading. We also show that the performance of the traditional charts, in terms of their average-run length, can be enhanced in many instances by using the likelihood-ratio procedure.
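A small numerical illustration of the likelihood-ratio idea for the Shewhart X̄ chart with 3-sigma limits, where the run length is geometric; the shift size and run lengths below are illustrative values, not from the paper.

```python
from scipy.stats import norm

def signal_prob(shift, L=3.0):
    """P(point outside +/- L sigma limits) for a mean shift of `shift` sigmas."""
    return norm.sf(L - shift) + norm.cdf(-L - shift)

def run_length_pmf(r, p):
    """Geometric run-length distribution: probability that the first signal occurs at sample r."""
    return (1.0 - p) ** (r - 1) * p

def likelihood_ratio(r, shift):
    """Odds of a signal at run length r under a real shift vs. no shift (false alarm)."""
    return run_length_pmf(r, signal_prob(shift)) / run_length_pmf(r, signal_prob(0.0))

for r in (1, 10, 50, 200):
    print("run length", r, "-> LR:", round(likelihood_ratio(r, shift=1.0), 2))
```

Early signals yield likelihood ratios well above 1 (a shift is the more plausible explanation), while very late signals push the ratio below 1, which is the odds interpretation the abstract describes.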

13.
Non-normality has a significant effect on the performance of control charts for averages. The design considerations for a control chart for averages must include recognition of the degree of non-normality of the underlying data. The performance of a control chart may be judged on its ability to correctly identify the probabilities of assignable causes of variation and chance causes of variation in a process. This paper examines the effects of non-normality, as measured by skewness and kurtosis, on the performance, and hence the design, of control charts for averages and provides an alternative method of designing charts for averages of data with non-normal distributions.
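A quick Monte Carlo sketch of the effect being described: the false-alarm rate of conventional 3-sigma X̄ limits under skewed (standardized exponential) data versus normal data; the distribution and subgroup size are arbitrary choices for illustration.

```python
import numpy as np

def false_alarm_rate(sampler, n=5, m=200_000, L=3.0):
    """Fraction of in-control subgroup means outside +/- L * sigma / sqrt(n) limits."""
    x = sampler((m, n))
    xbar = x.mean(axis=1)
    return np.mean(np.abs(xbar) > L / np.sqrt(n))

rng = np.random.default_rng(7)
normal = lambda size: rng.standard_normal(size)
# Gamma(shape=1) shifted to mean 0, variance 1: strongly right-skewed.
skewed = lambda size: rng.gamma(1.0, 1.0, size) - 1.0

print("normal data :", false_alarm_rate(normal))   # close to the nominal 0.0027
print("skewed data :", false_alarm_rate(skewed))   # noticeably different under skewness
```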

14.

Two approaches for constructing control charts for quality assurance when the observations are in the form of linguistic data are presented. Both approaches are based on fuzzy set theory and use fuzzy subsets to model the linguistic terms used to describe product quality. They differ in the interpretation of the control limits and in the procedure used to reduce the fuzzy subsets to scalars for determining the chart parameters. The results obtained with simulated data suggest that, on the basis of sensitivity to process shifts, the control charts for linguistic data perform better than conventional p control charts. The number of linguistic terms used in classifying the observations was found to influence the sensitivity of these control charts. The transformation method used to obtain the representative values and the amount of fuzziness do not seem to affect the performance of either type of control chart.
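To illustrate one way linguistic ratings can be reduced to scalars before charting, here is a toy sketch that maps terms to triangular fuzzy numbers and uses their centroids as representative values; the terms, membership functions and charting rule are all assumptions for illustration, not the paper's definitions.

```python
import numpy as np

# Hypothetical triangular fuzzy numbers (a, b, c) for linguistic quality ratings.
TERMS = {"perfect": (0.0, 0.0, 0.25), "good": (0.0, 0.25, 0.5),
         "medium": (0.25, 0.5, 0.75), "poor": (0.5, 0.75, 1.0), "bad": (0.75, 1.0, 1.0)}

def representative(term):
    """Centroid of a triangular fuzzy number as its scalar representative value."""
    a, b, c = TERMS[term]
    return (a + b + c) / 3.0

def sample_means(samples):
    """Average representative value per sample of linguistic observations."""
    return np.array([np.mean([representative(t) for t in s]) for s in samples])

rng = np.random.default_rng(8)
terms = list(TERMS)
calibration = rng.choice(terms[:3], size=(30, 4), p=[0.5, 0.35, 0.15])  # in-control samples
means = sample_means(calibration)
center, spread = means.mean(), means.std(ddof=1)
ucl, lcl = center + 3 * spread, max(0.0, center - 3 * spread)
print("limits:", (round(lcl, 3), round(ucl, 3)))
print("new sample mean:", sample_means([["poor", "medium", "poor", "bad"]])[0])
```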

15.
Over the last few years, ‘benchmarking’ has become a key term in organizational development and change management. Originally a business tool for identifying the best practices that lead to superior performance, benchmarking has increasingly become common practice in non-profit and public institutions as well. Notably, the European Commission uses benchmarking as an instrument to monitor its employment guidelines. The radar chart approach is one of a number of special analytical tools developed in this connection. The paper discusses the advantages and limits of benchmarking labour market performance with radar charts, recommends broadening the scope by using the employment systems approach, and provides examples of application in both cases.

16.
The choice of both the high-frequency estimator of the covariance matrix and the forecasting model affects the covariance forecast and hence the performance of volatility-timing portfolio strategies. When the number of assets is large, constructing high-frequency covariance estimators discards a great deal of data because of non-synchronous trading, reducing the efficiency of information use. The KEM estimator, which makes full use of the intraday price information of the assets, is therefore used to estimate the high-dimensional covariance matrix of Chinese stock market assets and is compared with two commonly used covariance estimators. The three estimators are then used for out-of-sample forecasting with the multivariate heterogeneous autoregressive model, the exponentially weighted moving average model, and short-, medium- and long-horizon moving average models, and their economic value is compared under three risk-based portfolio strategies. An empirical study using tick-by-tick high-frequency data of 20 SSE 50 constituent stocks with different liquidity shows that: (1) in both calm and highly turbulent market periods, the long-horizon moving average model is the best modelling choice for high-dimensional covariance estimators, achieving the lowest cost and the highest return in all volatility-timing strategies; (2) in calm periods the KEM estimator is the best choice for high-dimensional covariance estimation and generally achieves the lowest cost and the highest return across the volatility-timing strategies, while in turbulent periods it retains its cost advantage but is not superior in returns; (3) in both periods the lowest cost is achieved with the equal risk contribution portfolio and the highest return with the minimum variance portfolio. The study is the first to examine the applicability of the KEM estimator in common volatility-timing strategies and the first to document empirically the superiority of the simplest long-horizon moving average model for forecasting high-dimensional covariance matrices, which is of practical significance for investment decision-making and risk management.
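A bare-bones sketch of the simplest pieces mentioned above: forecasting the covariance matrix as a long-horizon moving average of past daily realized covariance matrices and plugging the forecast into a global minimum variance portfolio. The KEM estimator itself and the other forecasting models are not reproduced; the horizon and data below are illustrative.

```python
import numpy as np

def ma_forecast(rcov, horizon=66):
    """Moving-average forecast: mean of the last `horizon` daily covariance matrices."""
    return rcov[-horizon:].mean(axis=0)

def gmv_weights(cov):
    """Global minimum variance weights: w = inv(S) 1 / (1' inv(S) 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

rng = np.random.default_rng(9)
# Stand-in for a history of daily realized covariance matrices of 20 assets.
A = rng.normal(size=(250, 20, 40))
rcov = A @ A.transpose(0, 2, 1) / 40 * 1e-4
forecast = ma_forecast(rcov)
print("GMV weights sum to 1:", round(gmv_weights(forecast).sum(), 6))
```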

17.

In any business process reengineering (BPR) project, a thorough understanding of the various tasks and activities of the organization is required. Very often this is captured using a simple flow chart or static representation diagram. The weakness is that process design complexity is not adequately represented by flow charts, which allow only limited human-computer interaction during process design and analysis. In this paper, we propose an enhanced flow chart approach, the activity-section flow chart, to support BPR; it combines the existing activity flow chart and section flow chart. Using this approach, a human-computer interactive model for BPR is developed. The model can identify unreasonable activity loops and excessive business rounds between sections by means of the adjacency and reachability matrices. Via human-computer interaction, the process can then be revised in light of human experience. The approach provides an efficient tool for BPR of large-scale systems. It has been applied to the material supply management system of an iron and steel works, and satisfactory results have been achieved.
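A compact sketch of the matrix step mentioned above: computing the reachability matrix from an activity adjacency matrix with Warshall's algorithm and reading activity loops off its diagonal. The example graph is made up for illustration.

```python
import numpy as np

def reachability(adj):
    """Transitive closure (reachability matrix) of a boolean adjacency matrix."""
    r = adj.copy().astype(bool)
    n = len(r)
    for k in range(n):                                   # Warshall's algorithm
        r = r | np.outer(r[:, k], r[k, :]).astype(bool)
    return r

# Hypothetical activity flow: 0 -> 1 -> 2 -> 3 with a rework loop 3 -> 1.
adj = np.zeros((4, 4), dtype=bool)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 1)]:
    adj[i, j] = True

r = reachability(adj)
loops = np.where(np.diag(r))[0]
print("activities involved in loops:", loops)   # activities 1, 2, 3 form a rework cycle
```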

18.
Intensified research on multivariate Poisson models offers new opportunities for the analysis of purchase quantities in market basket data. The investigation of positive or negative correlations in quantity decisions among product categories facilitates a deeper understanding of consumer purchase behavior. The applied multivariate log-normal Poisson model introduces interdependencies between categories with multivariate normal-distributed latent effects by means of a covariance matrix. As the size of this covariance matrix depends on the number of categories in the model, its estimation may become tedious. Furthermore, we assume that quantity decisions do not interact for all pairs of categories. That is why we propose to use covariance selection to derive a parsimonious representation of the correlation structure. For two market basket data sets, we show that the vast majority of off-diagonal elements in the covariance matrix are irrelevant. For a data set with product categories, the model with a partly restricted covariance matrix achieves a better fit to the holdout data than the model with full covariance matrix. For a data set with subcategories of the broader category beverage, the proposed model with restricted covariance outperforms the model with full covariance matrix even on the calibration data. We conclude that interactions of quantity decisions are overall the exception, even for complements-in-use.
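The covariance-selection step can be approximated in a frequentist way with a sparse inverse-covariance estimate; below is a small sketch using scikit-learn's GraphicalLasso on simulated latent effects, purely to show how irrelevant off-diagonal elements get zeroed out. The paper's Bayesian multivariate log-normal Poisson model is not reproduced, and the category structure is invented.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(10)
# Simulated latent effects for 8 product categories: only pairs (0,1) and (2,3) interact.
cov = np.eye(8)
cov[0, 1] = cov[1, 0] = 0.6
cov[2, 3] = cov[3, 2] = -0.4
Z = rng.multivariate_normal(np.zeros(8), cov, size=2000)

gl = GraphicalLasso(alpha=0.05).fit(Z)
nonzero = np.abs(gl.precision_) > 1e-6        # surviving conditional dependencies
np.fill_diagonal(nonzero, False)
print("retained off-diagonal pairs:", int(nonzero.sum() // 2))
```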

19.
We review the history of statistical process control research from its origins at Bell Laboratories with Shewhart in 1924 up to the present and integrate it with the history of the larger total quality management movement that emerged from these same statistical process control origins. The original research was very philosophical and very practical and is still implemented today. Our view is that the majority of the enormous research literature after Duncan's 1956 seminal paper on optimal design of control charts has had little practical relevance. The research formulations became more mechanical, less philosophical and less practical. We explore the reasons for this and make suggestions for new research directions. We also propose changes in the supporting industry-university relationships to facilitate a program of more relevant research in statistical process control.
