Found 20 similar documents (search time: 156 ms)
1.
Addressing quality problems in regional macroeconomic statistical data, this study develops technical methods for diagnosing the quality of macroeconomic statistics. After stating the theoretical assumptions of the research, it constructs a data-quality diagnosis model for macroeconomic data from static, dynamic, multidimensional, and systemic perspectives, provides quantitative diagnostic methods, criteria, and patterns, and runs simulation experiments on real data.
2.
Poor statistical data quality is one of the serious problems in China's statistical work. This paper explores using a vector autoregression (VAR) model to quantitatively assess the logical consistency of statistical data, based on the intrinsic relationships among statistical indicators, thereby enabling effective evaluation and control of statistical data quality.
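The VAR-based logical check described in entry 2 can be illustrated with a minimal sketch. This is my own construction, not the paper's method: fit a VAR(1) to a set of related indicators by ordinary least squares and flag time points whose residuals fall far outside the robust spread, since an observation that breaks the indicators' historical relationship is a candidate data-quality problem.

```python
import numpy as np

def var1_residual_check(data, z=3.0):
    """Fit a VAR(1) by least squares and flag time points whose residual
    norm exceeds the median by more than z robust (MAD-based) standard
    deviations.  data: (T, k) array of k related indicators."""
    Y = data[1:]                                       # responses x_t
    X = np.hstack([np.ones((len(Y), 1)), data[:-1]])   # intercept + x_{t-1}
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    score = np.linalg.norm(Y - X @ B, axis=1)          # residual norms
    med = np.median(score)
    mad = 1.4826 * np.median(np.abs(score - med))      # robust scale
    return np.where(score > med + z * mad)[0] + 1      # original indices

# toy example: two indicators that normally move together,
# with one deliberately corrupted observation at t = 30
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=60))
data = np.column_stack([x, 0.8 * x + rng.normal(scale=0.1, size=60)])
data[30, 1] += 5.0                     # inject a logical inconsistency
print(var1_residual_check(data))
```

The MAD-based threshold keeps the injected outlier itself from inflating the cutoff; a full VAR treatment would also choose the lag order and test residual diagnostics.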
3.
A combination model for assessing statistical data quality (Total citations: 1; self-citations: 1; other citations: 0)
Improving statistical data quality requires not only a sound statistical management system, advanced statistical methods, and a highly qualified statistical workforce, but also a scientific set of methods for evaluating and monitoring data quality. This paper explores a combination-model approach that quantitatively assesses the accuracy of statistical data from the time-series behavior of a given statistical indicator.
4.
In recent years, the National Bureau of Statistics has elevated the problem of statistical data quality from a latent risk to a matter that threatens the very life of statistical work. Nationwide, the problem has become serious. "Data accuracy is the life of statistical work" and "ensuring the accuracy of statistical data is the core of the Statistics Law"; with all sectors of society paying close attention to data quality, the statistical system itself should study the issue carefully and work to improve it. This paper offers several observations on statistical data quality from within the statistical system. I. Assessing current statistical data quality. Because statistical work is unified, standardized, and compulsory, and is especially influenced by the broader national environment, the forms that data-quality problems take in different regions…
5.
6.
7.
Improving statistical data quality is one of the central tasks of statistical work; among the "three improvements" it is the core, the top priority. To ensure data quality, this paper strengthens process management across the four stages of statistical work, namely statistical design, statistical survey, data processing, and data release, and systematically proposes quality-control standards and data-quality control techniques for each stage, so as to improve statistical data quality across the board.
8.
The history of statistics and of statistical law shows that, as the discipline of statistics emerged and developed, statistical legislation arose of inherent necessity. Controlling statistical data quality involves two domains: statistical techniques, which belong to statistical science, and legal norms, which belong to statistical jurisprudence; the latter is the subject of this paper. Statistical data quality and statistical law are closely intertwined; to examine the positive effects of statistical law on data quality, one must first understand the current state of data quality in China. Within the legal domain, this paper analyzes the causes of data-quality problems and then proposes several concrete approaches to improving quality. I. Analysis of the causes of statistical data-quality problems. 1. Interference from administrative discretion. Economic and social development is ultimately reflected in…
9.
10.
As is well known, the accuracy of statistical data is the life of statistical work, and improving data quality is its top priority. How to evaluate the statistical-error indicators of data quality and guard against statistical error is therefore a topic of real theoretical and practical significance for ensuring the accuracy of statistical material.
11.
To identify the factors that influence the quality of enterprise statistical data and to describe the relationships among them quantitatively, a structural equation model (SEM) was fitted with SPSS AMOS 18.0. The resulting structural model of enterprise statistical data quality comprises six factors: degree of enterprise informatization, degree of management attention, statistical staffing, statistical staff knowledge, statistical staff status, and degree of organizational formalization. The magnitude of each factor's influence on data quality and the correlations among the factors are compared, providing a basis for further improving enterprise statistical work.
12.
Wu J, Johnson TD, Galbán CJ, Chenevert TL, Meyer CR, Rehemtulla A, Hamstra DA, Ross BD 《Journal of the Royal Statistical Society. Series C, Applied Statistics》 2012, 61(1): 83-98
The prognosis for patients with high-grade gliomas is poor, with a median survival of 1 year. Treatment efficacy assessment is typically unavailable until 5-6 months post diagnosis. Investigators hypothesize that quantitative magnetic resonance imaging can assess treatment efficacy 3 weeks after therapy starts, thereby allowing salvage treatments to begin earlier. The purpose of this work is to build a predictive model of treatment efficacy by using quantitative magnetic resonance imaging data and to assess its performance. The outcome is 1-year survival status. We propose a joint, two-stage Bayesian model. In stage I, we smooth the image data with a multivariate spatiotemporal pairwise difference prior. We propose four summary statistics that are functionals of posterior parameters from the first-stage model. In stage II, these statistics enter a generalized non-linear model as predictors of survival status. We use the probit link and a multivariate adaptive regression spline basis. Gibbs sampling and reversible jump Markov chain Monte Carlo methods are applied iteratively between the two stages to estimate the posterior distribution. Through both simulation studies and model performance comparisons we find that we can achieve higher overall correct classification rates by accounting for the spatiotemporal correlation in the images and by allowing for a more complex and flexible decision boundary provided by the generalized non-linear model.
13.
Threshold selection for regional peaks-over-threshold data (Total citations: 1; self-citations: 0; other citations: 1)
A hurdle in the peaks-over-threshold approach for analyzing extreme values is the selection of the threshold. A method is developed to reduce this obstacle in the presence of multiple, similar data samples. This is for instance the case in many environmental applications. The idea is to combine threshold selection methods into a regional method. Regionalized versions of the threshold stability and the mean excess plot are presented as graphical tools for threshold selection. Moreover, quantitative approaches based on the bootstrap distribution of the spatially averaged Kolmogorov–Smirnov and Anderson–Darling test statistics are introduced. It is demonstrated that the proposed regional method leads to an increased sensitivity for too low thresholds, compared to methods that do not take into account the regional information. The approach can be used for a wide range of univariate threshold selection methods. We test the methods using simulated data and present an application to rainfall data from the Dutch water board Vallei en Veluwe.
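The mean excess plot mentioned in entry 13 is simple to compute. Below is a minimal single-site sketch of my own; the paper's contribution is a regionalized version that averages such diagnostics over multiple similar samples, which is not reproduced here. For a generalized-Pareto tail, the empirical mean excess e(u) = E[X − u | X > u] is approximately linear in u above a suitable threshold, and exactly constant for exponential data.

```python
import numpy as np

def mean_excess(x, thresholds):
    """Empirical mean excess e(u) = mean(X - u | X > u) for each u.
    Plotting e(u) against u is the classic graphical threshold-selection
    tool: choose the lowest u above which the plot is roughly linear."""
    x = np.asarray(x, dtype=float)
    return np.array([x[x > u].mean() - u for u in thresholds])

# exponential data: the true mean excess is constant (= the scale, 2.0),
# so the plot should be flat at every threshold
rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=100_000)
us = np.quantile(sample, [0.5, 0.7, 0.9])
print(mean_excess(sample, us))
```

A regionalized variant in the spirit of the paper would average `mean_excess` curves across sites before inspecting linearity.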
14.
15.
Xiaomo Jiang 《Journal of Applied Statistics》 2008, 35(1): 49-65
Multivariate model validation is a complex decision-making problem involving comparison of multiple correlated quantities, based upon the available information and prior knowledge. This paper presents a Bayesian risk-based decision method for validation assessment of multivariate predictive models under uncertainty. A generalized likelihood ratio is derived as a quantitative validation metric based on Bayes’ theorem and Gaussian distribution assumption of errors between validation data and model prediction. The multivariate model is then assessed based on the comparison of the likelihood ratio with a Bayesian decision threshold, a function of the decision costs and prior of each hypothesis. The probability density function of the likelihood ratio is constructed using the statistics of multiple response quantities and Monte Carlo simulation. The proposed methodology is implemented in the validation of a transient heat conduction model, using a multivariate data set from experiments. The Bayesian methodology provides a quantitative approach to facilitate rational decisions in multivariate model assessment under uncertainty.
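A stripped-down version of the likelihood-ratio idea in entry 15 can be sketched as follows. This is my own simplification under strong assumptions: known error covariance, a zero-mean "model valid" hypothesis tested against a data-estimated mean, and a decision threshold passed in directly rather than derived from decision costs and hypothesis priors as in the paper.

```python
import numpy as np

def gaussian_log_lr(errors, cov):
    """Log generalized likelihood ratio for H0: errors ~ N(0, cov) vs
    H1: errors ~ N(mean-hat, cov).  With known cov this reduces to
    -n/2 * ebar' cov^{-1} ebar; values far below 0 favor rejecting H0."""
    e = np.atleast_2d(errors)          # (n, k): validation data - prediction
    ebar = e.mean(axis=0)
    return -0.5 * len(e) * ebar @ np.linalg.solve(cov, ebar)

def validate(errors, cov, log_threshold):
    """Accept the model when the log likelihood ratio exceeds the
    decision threshold (in the paper, a function of costs and priors)."""
    return gaussian_log_lr(errors, cov) > log_threshold

# demo: unbiased prediction errors pass, systematically biased ones fail
rng = np.random.default_rng(2)
cov = np.eye(2)
unbiased = rng.normal(size=(50, 2))
biased = unbiased + np.array([1.0, 0.0])
print(validate(unbiased, cov, -8.0), validate(biased, cov, -8.0))
```

The paper additionally builds the likelihood ratio's sampling distribution by Monte Carlo; here the threshold is supplied by hand purely for illustration.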
16.
Capital strongly affects a region's economic development. Drawing on statistics for economic growth across China's regions, this paper analyzes the level of capital-flow risk in each region: indicators such as fixed-asset investment capture regional capital mobility, while GDP reflects regional economic disparities. A fixed-effects regression model is therefore used to analyze the capital-flow factors behind regional economic differences. The findings: fixed-asset investment and bank savings are not important factors in regional differences in the ability to attract capital, whereas provincial bank deposit-loan gaps and fiscal transfer payments are important factors in interregional capital-flow risk.
17.
18.
Data-quality evaluation is a key step in managing statistical data quality. Evaluation methods fall into six categories: logical-relationship checks, econometric-model analysis, re-estimation of accounting data, statistical-distribution tests, survey-error evaluation, and multidimensional evaluation. After a detailed discussion of each method's evaluation procedure, applicability, and strengths and weaknesses, the methods are reclassified by evaluation dimension and mode. Further research should focus on econometric-model analysis, the quantity-index re-estimation variant of accounting-data re-estimation, survey-error evaluation, and multidimensional evaluation.
19.
Data quality: A statistical perspective (Total citations: 1; self-citations: 0; other citations: 1)
We present the old-but-new problem of data quality from a statistical perspective, in part with the goal of attracting more statisticians, especially academics, to become engaged in research on a rich set of exciting challenges. The data quality landscape is described, and its research foundations in computer science, total quality management and statistics are reviewed. Two case studies based on an EDA approach to data quality are used to motivate a set of research challenges for statistics that span theory, methodology and software tools.
20.
The quality of government statistics is a topic of wide current concern, and rigorous diagnostic methods for scientifically evaluating China's statistical data have real practical significance. Robust regression keeps the estimated regression coefficients from being strongly influenced by outliers and identifies anomalous points more reliably. This paper is the first to apply an outlier-diagnosis method based on robust MM estimation: within a production-function framework, and using two different measures of labor input, it evaluates the quality of China's GDP data since the start of the reform era. The results show that the MM-based outlier diagnosis effectively avoids the masking of multiple outliers to which traditional methods are prone, and that China's GDP data since the reform era are relatively reliable.
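As an illustration of the robust-regression idea in entry 20, here is a minimal sketch using Huber M-estimation via iteratively reweighted least squares, a simpler relative of the MM estimation the paper actually uses (MM adds a high-breakdown S-estimation initialization). Points receiving small final weights are the outlier candidates.

```python
import numpy as np

def huber_irls(X, y, c=1.345, iters=50):
    """Robust linear regression by iteratively reweighted least squares
    with Huber weights.  Returns (beta, weights); observations with
    small final weights are outlier candidates rather than being
    allowed to drag the fit (the masking problem of OLS diagnostics)."""
    X1 = np.hstack([np.ones((len(X), 1)), X])     # add intercept column
    w = np.ones(len(y))
    for _ in range(iters):
        Xw = X1 * w[:, None]
        beta = np.linalg.solve(X1.T @ Xw, Xw.T @ y)   # weighted LS step
        r = y - X1 @ beta
        s = max(1.4826 * np.median(np.abs(r)), 1e-8)  # robust scale (MAD)
        u = np.abs(r / s)
        w = np.where(u <= c, 1.0, c / u)              # Huber weights
    return beta, w

# demo: a clean linear relation y = 2 + 3x with four gross outliers
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 80)[:, None]
y = 2 + 3 * x.ravel() + rng.normal(scale=0.5, size=80)
y[::20] += 15                       # corrupt indices 0, 20, 40, 60
beta, w = huber_irls(x, y)
print(beta.round(2), np.where(w < 0.3)[0])
```

The fitted coefficients stay close to the true (2, 3) despite the corruption, and the corrupted observations surface as the low-weight points.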