Similar Articles
20 similar articles found.
1.
The declining employment fortunes of Britain's non-white communities, relative to whites, are explained. To achieve this comparison, some historical official sources of unemployment data are reviewed. The earliest known official time series on unemployment of non-whites dates back to 1960. This historical context is explored, especially the reluctance to make the early data more widely known. An unemployment series for non-white males and females from 1970 to 1999 is derived in two separate ways by splicing together official sources. These series are compared with unemployment of whites to demonstrate a relative increase in unemployment of non-whites.

2.
This article presents some applications of time-series procedures to solve two typical problems that arise when analyzing demographic information in developing countries: (1) unavailability of annual time series of population growth rates (PGRs) and their corresponding population time series, and (2) inappropriately defined population growth goals in official population programs. Both problems require combining information from several population time series. First, we suggest the use of temporal disaggregation techniques to combine census data with vital statistics in order to estimate annual PGRs. Second, we apply multiple restricted forecasting to combine the official targets on future PGRs with the disaggregated series. We then propose a mechanism for evaluating the compatibility of the demographic goals with the annual data. We apply these procedures to data for the Mexico City Metropolitan Zone, divided into concentric rings, and conclude that the targets established in the official program are not feasible. Hence, we derive future PGRs that are in line both with the official targets and with historical demographic behavior. We conclude that population growth programs should be based on this kind of analysis in order to be supported empirically: specialized multivariate time-series techniques are used first to obtain an optimal estimate of a disaggregated vector of population time series and then to produce restricted forecasts consistent with the data-based population policies derived here.
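A rough idea of the first step can be conveyed with the simplest form of temporal disaggregation: pro-rata benchmarking, which spreads each annual total across quarters in proportion to a related indicator series. This is only a minimal stand-in for the techniques the article applies (and it omits the restricted-forecasting step entirely); the data and function name below are hypothetical.

```python
import numpy as np

def prorata_disaggregate(annual_totals, quarterly_indicator):
    """Distribute annual totals across quarters in proportion to an indicator.

    annual_totals       : array of length T (one value per year)
    quarterly_indicator : array of length 4*T (a related quarterly series)
    Returns a quarterly series whose yearly sums match annual_totals exactly.
    """
    annual_totals = np.asarray(annual_totals, dtype=float)
    indicator = np.asarray(quarterly_indicator, dtype=float).reshape(-1, 4)
    shares = indicator / indicator.sum(axis=1, keepdims=True)  # within-year shares
    return (shares * annual_totals[:, None]).ravel()

# Hypothetical example: annual population figures and a quarterly proxy series
annual = [100.0, 104.0, 110.0]
proxy = [24, 25, 25, 26, 25, 26, 26, 27, 27, 27, 28, 28]
quarterly = prorata_disaggregate(annual, proxy)
print(quarterly.reshape(3, 4).sum(axis=1))  # reproduces the annual totals
```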

3.
Why do the three quarterly GNP inflation measures differ so much when they are constructed from the same underlying price data? Algebraically and in tables using data for the second quarter of 1984, it is shown that these differences occur because of quarterly shifts in the composition of the nation's product. Disaggregation of the inflation contributions of the GNP components also makes it clear why, for quarterly analyses, the GNP chain price index is superior to both the implicit GNP deflator and the fixed-weight GNP price index. In particular, the implicit GNP deflator can give severely distorted inflation signals.
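The divergence comes down to the weighting scheme: the implicit deflator is a Paasche-type index (current-period quantities), the fixed-weight index is Laspeyres-type (fixed base-period quantities), and a chain-type index re-weights each period. A toy two-component example with made-up prices and quantities, not the 1984 figures, illustrates how a composition shift alone moves the measures apart; the Fisher form used here is one common chain formula, not necessarily the exact one used for the GNP chain price index.

```python
import math

# Hypothetical two-component economy: prices and quantities in base and current quarter
p0 = {"goods": 1.00, "services": 1.00}
q0 = {"goods": 60.0, "services": 40.0}
p1 = {"goods": 1.02, "services": 1.08}
q1 = {"goods": 45.0, "services": 55.0}   # composition shifts toward services

def value(p, q):
    return sum(p[k] * q[k] for k in p)

laspeyres = value(p1, q0) / value(p0, q0)   # fixed-weight price index (base-period quantities)
paasche   = value(p1, q1) / value(p0, q1)   # implicit deflator (current-period quantities)
fisher    = math.sqrt(laspeyres * paasche)  # Fisher ideal index, a common chain-type link

for name, idx in [("fixed-weight (Laspeyres)", laspeyres),
                  ("implicit deflator (Paasche)", paasche),
                  ("chain-type (Fisher)", fisher)]:
    print(f"{name:28s} inflation = {100 * (idx - 1):.2f}%")
```

The chain-type reading always lies between the other two, which is one way to see why it is less sensitive to quarter-to-quarter composition shifts.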

4.
Nonstationary time series are frequently detrended in empirical investigations by regressing the series on time or a function of time. The effects of detrending on tests for causal relationships in the sense of Granger are investigated using quarterly U.S. data. The causal relationships between nominal or real GNP and M1, inferred from the Granger–Sims tests, are shown to depend very much on, among other factors, whether or not the series are detrended. Detrending tends to remove or weaken causal relationships; conversely, failure to detrend tends to introduce or enhance causal relationships. The study suggests that we need a more robust test or a better definition of causality.
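A minimal sketch of the mechanics, using statsmodels on simulated data rather than the GNP/M1 series of the article: run the Granger test on two trending series as they stand, and again after removing a linear trend from each, and compare the resulting p-values.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 200
t = np.arange(n)
# Two trending but otherwise unrelated series (illustrative only)
x = 0.05 * t + 0.1 * rng.normal(size=n).cumsum()
y = 0.05 * t + 0.1 * rng.normal(size=n).cumsum()

def detrend(series):
    """Residuals from an OLS regression of the series on a linear time trend."""
    X = sm.add_constant(np.arange(len(series)))
    return sm.OLS(series, X).fit().resid

# Column order: test whether the 2nd column Granger-causes the 1st
raw = np.column_stack([y, x])
dtr = np.column_stack([detrend(y), detrend(x)])

res_raw = grangercausalitytests(raw, maxlag=4)
res_dtr = grangercausalitytests(dtr, maxlag=4)
print("raw p-value (lag 4):      ", res_raw[4][0]["ssr_ftest"][1])
print("detrended p-value (lag 4):", res_dtr[4][0]["ssr_ftest"][1])
```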

5.
A threshold autoregressive (TAR) model is an important class of nonlinear time series models that possess many desirable features such as asymmetric limit cycles and amplitude-dependent frequencies. Statistical inference for the TAR model encounters a major difficulty in the estimation of thresholds, however. This article develops an efficient procedure to estimate the thresholds. The procedure first transforms multiple-threshold detection to a regression variable selection problem, and then employs a group orthogonal greedy algorithm to obtain the threshold estimates. Desirable theoretical results are derived to lend support to the proposed methodology. Simulation experiments are conducted to illustrate the empirical performances of the method. Applications to U.S. GNP data are investigated.
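For intuition, a single threshold in a two-regime SETAR(1) model can be estimated by conditional least squares over a grid of candidate thresholds; the sketch below does exactly that on simulated data. This grid search is only a baseline illustration, not the group orthogonal greedy algorithm the article develops for the multiple-threshold case.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a two-regime SETAR(1): the AR coefficient depends on whether y[t-1] <= 0
n = 500
y = np.zeros(n)
for t in range(1, n):
    phi = 0.8 if y[t - 1] <= 0.0 else -0.4
    y[t] = phi * y[t - 1] + rng.normal(scale=0.5)

y_lag, y_cur = y[:-1], y[1:]

def ssr_at_threshold(r):
    """Sum of squared residuals from regime-wise AR(1) fits with threshold r."""
    total = 0.0
    for mask in (y_lag <= r, y_lag > r):
        x, z = y_lag[mask], y_cur[mask]
        phi_hat = (x @ z) / (x @ x)          # OLS slope, no intercept
        total += np.sum((z - phi_hat * x) ** 2)
    return total

# Candidate thresholds restricted to central order statistics of the lagged series
candidates = np.quantile(y_lag, np.linspace(0.15, 0.85, 141))
r_hat = min(candidates, key=ssr_at_threshold)
print(f"estimated threshold: {r_hat:.3f} (true value 0.0)")
```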

6.
Revisions of the early GNP estimates may contain elements of measurement errors as well as forecast errors. These types of error behave differently but need to satisfy a common set of criteria for well-behavedness. This article tests these criteria for U.S. GNP revisions. The tests are similar to tests of rationality and are based on the generalized method of moments estimator. The flash, 15-day, and 45-day estimates are found to be ill behaved, but the 75-day estimate satisfies the criteria for well-behavedness.
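A hedged sketch of the flavor of such tests, on simulated data: a Mincer–Zarnowitz-style regression of the final estimate on a preliminary one (intercept zero and slope one under well-behavedness) plus a check that the revision is unpredictable from the preliminary figure. The article's criteria are tested jointly by GMM, which this sketch does not reproduce.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 120
final = rng.normal(3.0, 2.0, size=n)              # final GNP growth (simulated)
prelim = final + rng.normal(0.3, 0.8, size=n)     # preliminary estimate with bias and noise

# Mincer-Zarnowitz style regression: final = a + b * prelim + error
X = sm.add_constant(prelim)
fit = sm.OLS(final, X).fit()
print(fit.params)                                  # criterion: a close to 0, b close to 1
print(fit.f_test("const = 0, x1 = 1"))            # joint test of the two restrictions

# Revisions should also be unpredictable from the preliminary estimate itself
rev_fit = sm.OLS(final - prelim, X).fit()
print(rev_fit.params, rev_fit.pvalues)
```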

7.
Two methods of using labor-market data as indicators of contemporaneous gross national product (GNP) are developed. The establishment survey data are used by inverting a partial-adjustment equation for hours. A second GNP forecast can be extracted from the household survey by using Okun's law. Using preliminary rather than final data adds about 0.2 to 0.4 percentage points to the expected value of the root mean squared errors and changes the weights that the pooling procedure assigns to the two forecasts. The use of preliminary rather than final data results in a procedure that assigns more importance to the Okun's-law forecast.
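The Okun's-law leg can be sketched in a few lines: estimate the coefficient linking real GNP growth to the change in the unemployment rate, then plug in the newly released unemployment figure. The numbers and the simple OLS setup below are illustrative only and do not reproduce the article's partial-adjustment or pooling machinery.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 160
du = rng.normal(0.0, 0.3, size=n)                      # quarterly change in unemployment rate
gnp_growth = 3.0 - 2.0 * du + rng.normal(0, 0.8, n)    # Okun's law with coefficient near -2

fit = sm.OLS(gnp_growth, sm.add_constant(du)).fit()
a, b = fit.params
print(f"estimated Okun relation: growth = {a:.2f} + ({b:.2f}) * d(unemployment)")

# Nowcast from the latest household-survey release (hypothetical value)
du_new = -0.2
print(f"implied GNP growth nowcast: {a + b * du_new:.2f}%")
```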

8.
A spectral decomposition method is described for obtaining an upper bound on the amount of measurement error in a time series. The method is applied to generated data and to M1b, real GNP, and the CPI. The bounds provide insight into both the amount of measurement error in these series and the stochastic specification of the errors.

9.
A bootstrap algorithm is proposed for testing Gaussianity and linearity in stationary time series, and consistency of the relevant bootstrap approximations is proven rigorously for the first time. Subba Rao and Gabr (1980) and Hinich (1982) have formulated some well-known nonparametric tests for Gaussianity and linearity based on the asymptotic distribution of the normalized bispectrum. The proposed bootstrap procedure gives an alternative way to approximate the finite-sample null distribution of such test statistics. We revisit a modified form of Hinich's test utilizing kernel smoothing, and compare its performance to the bootstrap test on several simulated data sets and two real data sets—the S&P 500 returns and the quarterly US real GNP growth rate. Interestingly, Hinich's test and the proposed bootstrapped version yield substantially different results when testing Gaussianity and linearity of the GNP data.

10.
We compare the forecast accuracy of autoregressive integrated moving average (ARIMA) models based on data observed with high and low frequency, respectively. We discuss how, for instance, a quarterly model can be used to predict one quarter ahead even if only annual data are available, and we compare the variance of the prediction error in this case with the variance if quarterly observations were indeed available. Results on the expected information gain are presented for a number of ARIMA models including models that describe the seasonally adjusted gross national product (GNP) series in the Netherlands. Disaggregation from annual to quarterly GNP data has reduced the variance of short-run forecast errors considerably, but further disaggregation from quarterly to monthly data is found to hardly improve the accuracy of monthly forecasts.

11.
施发启 (Shi Faqi), 《统计研究》 (Statistical Research), 1996, 13(1): 19–22
The paper makes both a quantitative and a qualitative forecast and analysis of the economic growth rate over the next 15 years, using the Harrod–Domar growth model and an economic cycle model. The results show that the goal of quadrupling GNP per capita by the end of 2000, proposed at the Fifth Plenary Session of the 14th Central Committee of the CPC, can be realized two years ahead of schedule, and that the goal of doubling the 2000 GNP by 2010 may be attained a year in advance.
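For reference, the Harrod–Domar warranted growth rate relates growth to the saving rate s and the incremental capital–output ratio v; the numbers below are purely illustrative, not those used in the paper.

```latex
% Harrod--Domar warranted growth rate, with saving rate s and
% incremental capital--output ratio v (illustrative numbers only):
g = \frac{s}{v}, \qquad
\text{e.g. } s = 0.35,\; v = 4 \;\Rightarrow\; g = 8.75\% \text{ per year},
\quad \text{doubling time} \approx \frac{\ln 2}{\ln 1.0875} \approx 8.3 \text{ years}.
```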

12.
Recently, Perron has carried out tests of the unit-root hypothesis against the alternative hypothesis of trend stationarity with a break in the trend occurring at the Great Crash of 1929 or at the 1973 oil-price shock. His analysis covers the Nelson–Plosser macroeconomic data series as well as a postwar quarterly real gross national product (GNP) series. His tests reject the unit-root null hypothesis for most of the series. This article takes issue with Perron's assumption that the Great Crash and the oil-price shock can be treated as exogenous events. A variation of Perron's test is considered in which the breakpoint is estimated rather than fixed. We argue that this test is more appropriate than Perron's because it circumvents the problem of data-mining. The asymptotic distribution of the estimated-breakpoint test statistic is determined, and the data series considered by Perron are reanalyzed using this statistic. The empirical results make use of the asymptotics developed for the test statistic as well as extensive finite-sample corrections obtained by simulation. The effect of fat-tailed and temporally dependent innovations on the empirical results is also investigated. In brief, by treating the breakpoint as endogenous, we find that there is less evidence against the unit-root hypothesis than Perron finds for many of the data series, but stronger evidence against it for several of the series, including the Nelson–Plosser industrial-production, nominal-GNP, and real-GNP series.
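An endogenous-breakpoint unit-root test in this spirit, the Zivot–Andrews test, is available in statsmodels; the sketch below applies it to a simulated trend-stationary series with a one-time level shift rather than to the Nelson–Plosser data.

```python
import numpy as np
from statsmodels.tsa.stattools import zivot_andrews

rng = np.random.default_rng(4)
n = 250
t = np.arange(n)
# Trend-stationary series with a one-time level shift ("crash") halfway through
y = 0.02 * t - 1.5 * (t >= n // 2) + rng.normal(scale=0.3, size=n)

# regression='c' allows a break in the intercept; the break date is chosen
# where the unit-root test statistic is most negative
stat, pvalue, crit, baselag, bpidx = zivot_andrews(y, regression="c", trim=0.15, autolag="AIC")
print(f"ZA statistic = {stat:.2f}, p-value = {pvalue:.3f}, estimated break index = {bpidx}")
```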

13.
The article begins by surveying the existing results on the new Divisia monetary aggregates. Charts display the differences in behavior between the Divisia aggregates and the Federal Reserve's official simple-sum monetary aggregates. The article then compares system-wide fit for the simple-sum and Divisia monetary aggregates when used as data in the joint estimation of a system of demand equations. The demand system is derived from a new Laurent expansion approximation to the reciprocal indirect utility function. The Laurent expansion provides a better-behaved remainder term than that of the more commonly used Taylor series. The results favor the Divisia aggregates.
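For orientation, a Divisia monetary aggregate grows at a user-cost-share-weighted average of its components' growth rates (the Törnqvist–Theil discrete approximation), whereas a simple-sum aggregate just adds the components. The components and user costs below are hypothetical, not the Federal Reserve data used in the article.

```python
import numpy as np

# Hypothetical monetary components: quantities and user costs in two adjacent periods
q0 = np.array([900.0, 600.0, 300.0])   # e.g. currency, checking, small time deposits
q1 = np.array([910.0, 650.0, 280.0])
uc0 = np.array([0.050, 0.030, 0.010])  # user costs (opportunity-cost "prices")
uc1 = np.array([0.052, 0.031, 0.011])

def shares(uc, q):
    e = uc * q
    return e / e.sum()

# Tornqvist-Theil discrete approximation to the Divisia quantity index
avg_share = 0.5 * (shares(uc0, q0) + shares(uc1, q1))
dlog_divisia = np.sum(avg_share * np.log(q1 / q0))

dlog_simple_sum = np.log(q1.sum() / q0.sum())
print(f"Divisia growth:    {100 * dlog_divisia:.2f}%")
print(f"Simple-sum growth: {100 * dlog_simple_sum:.2f}%")
```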

14.
Durairajan and Raman (1996a, b) studied the robustness of locally most powerful invariant (LMPI) tests for the compound normal model in control and treatment populations. In the present paper, locally most powerful (LMP) tests are constructed for no contamination in a normal mixture model, by testing the parameter of the mixture of distributions and the mixing proportion. The expected performance of the LMP tests is compared using Efron's statistical curvature, along the lines of Sen Gupta and Pal (1991). Locally most powerful similar (LMPS) tests for the equality of the control and treatment populations in the presence of nuisance parameters are also constructed. Further, the null and non-null distributions of the test statistics are derived and some power computations are made.

15.
This article presents a sequential scoring analysis of six econometric forecast distributions for the main components of the annual U.S. gross national product (GNP) accounts—nominal GNP, real GNP, and the implicit price deflator. Analysis of sequential forecasts is presented in terms of proper scoring rules. Computations relevant to the calibration and refinement properties of the forecast distributions are discussed. Annual data are studied for the period 1952–1982. The six forecast distributions are distinguished by the different stances they entail with respect to a subjectivist characterization of the rational-expectations hypothesis.

16.
Many statistical series that are available from official agencies, such as the Office for National Statistics in the UK and the Bureau of Economic Analysis in the USA, are subject to an extensive process of revision and refinement. This feature of the data is often not explicitly recognized by users even though it may be important to their use of the data. The starting point of this study is to conceptualize and model the data measurement process as it is relevant to the index of production (IOP). The IOP attracts considerable attention because of its timely publication and its importance as an indicator of the UK's industrial base. This study shows that there is one common stochastic trend (and one common factor in terms of observable variables) 'driving' 13 vintages of data on the IOP. Necessary and sufficient conditions are derived for the 'final' vintage of data on the IOP to be the permanent component of the series in the Gonzalo–Granger sense, and for the revisions to be the transitory components. These conditions are not satisfied for the IOP; hence, the permanent component is a function of all the published vintages.

17.
The article presents extensive results from testing for bias and serially correlated errors in a collection of time series of quarterly multiperiod forecasts for six variables, including real GNP growth, inflation, and unemployment. The analysis covers responses by 79 frequent participants in economic outlook surveys conducted regularly since 1968. It shows a much greater incidence of apparently systematic errors for inflation than for the other variables. Also, the tests are more favorable to composite group forecasts than to most of the individual forecast sets.
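A minimal sketch of the two checks for a single respondent's forecast errors, run on simulated errors rather than the survey panel: a zero-mean test with Newey–West standard errors for bias, and a Ljung–Box test for serial correlation.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(5)
n = 100
# Simulated forecast errors with a small bias and mild AR(1) persistence
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.15 + 0.4 * e[t - 1] + rng.normal(scale=0.5)

# Bias test: regress the errors on a constant with HAC (Newey-West) standard errors
bias_fit = sm.OLS(e, np.ones((n, 1))).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print("mean error:", bias_fit.params[0], " p-value:", bias_fit.pvalues[0])

# Serial-correlation test on the errors
print(acorr_ljungbox(e, lags=[4]))
```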

18.
We develop a continuous-time model for analyzing and valuing catastrophe mortality contingent claims based on stochastic modeling of the force of mortality. We derive parameter estimates from a 105-year time series of U.S. population mortality data using a simulated maximum likelihood approach based on a particle filter. Relying on the resulting parameters, we calculate loss profiles for a representative catastrophe mortality transaction and compare them to the “official” loss profiles that are provided by the issuers to investors and rating agencies. We find that although the loss profiles are subject to great uncertainties, the official figures fall significantly below the corresponding risk statistics based on our model. In particular, we find that the annualized incidence probability of a mortality catastrophe, defined as a 15% increase in aggregated mortality probabilities, is about 1.4%—compared to about 0.1% according to the official loss profiles.

19.
Series evaluation of Tweedie exponential dispersion model densities
Exponential dispersion models, which are linear exponential families with a dispersion parameter, are the prototype response distributions for generalized linear models. The Tweedie family comprises those exponential dispersion models with power mean–variance relationships. The normal, Poisson, gamma and inverse Gaussian distributions belong to the Tweedie family. Apart from these special cases, Tweedie distributions do not have density functions that can be written in closed form. Instead, the densities can be represented as infinite summations derived from series expansions. This article describes how the series expansions can be summed in a numerically efficient fashion. The usefulness of the approach is demonstrated, but full machine accuracy is shown not to be obtainable using the series expansion method for all parameter values. Derivatives of the density with respect to the dispersion parameter are also derived to facilitate maximum likelihood estimation. The methods are demonstrated on two data examples and compared with Box–Cox transformations and extended quasi-likelihood.
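For 1 < p < 2, a Tweedie variable is a Poisson sum of gamma variables, so its density at y > 0 is an infinite series of Poisson-weighted gamma densities. The sketch below evaluates that series by naive truncation; it illustrates the representation only, not the numerically tuned summation strategy the article develops.

```python
import numpy as np
from scipy.stats import poisson, gamma
from scipy.integrate import trapezoid

def tweedie_pdf(y, mu, phi, p, n_terms=200):
    """Tweedie density for 1 < p < 2 via its compound Poisson-gamma series.

    Y is a sum of N iid Gamma(alpha, scale) variables with N ~ Poisson(lam), where
    lam = mu**(2-p)/(phi*(2-p)), alpha = (2-p)/(p-1), scale = phi*(p-1)*mu**(p-1).
    Naive truncation at n_terms; efficient code would pick only the dominant terms.
    """
    lam = mu ** (2 - p) / (phi * (2 - p))
    alpha = (2 - p) / (p - 1)
    scale = phi * (p - 1) * mu ** (p - 1)
    if y == 0:
        return np.exp(-lam)            # probability mass at zero
    n = np.arange(1, n_terms + 1)
    terms = poisson.pmf(n, lam) * gamma.pdf(y, a=n * alpha, scale=scale)
    return terms.sum()

# Sanity check: the density plus the atom at zero should integrate to about 1
ys = np.linspace(1e-6, 30, 4000)
vals = np.array([tweedie_pdf(y, mu=2.0, phi=1.0, p=1.5) for y in ys])
print(trapezoid(vals, ys) + tweedie_pdf(0.0, 2.0, 1.0, 1.5))
```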

20.
In this paper, it is proposed to modify autoregressive fractionally integrated moving average (ARFIMA) processes by introducing an additional parameter, in response to the criticism of Hauser et al. (1999) that ARFIMA processes are not appropriate for the estimation of persistence because of the degenerate behavior of their spectral densities at frequency zero. When fitting these modified ARFIMA processes to the US GNP, it turns out that the estimated spectra are very similar to those obtained with conventional ARFIMA models, indicating that, in this special case, the disadvantage of ARFIMA models cited by Hauser et al. (1999) does not seriously affect the estimation of persistence. However, according to the results of a goodness-of-fit test applied to the estimated spectra, both the ARFIMA models and the modified ARFIMA models seem to overfit the data in the neighborhood of frequency zero.
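For context, the spectral density of a standard ARFIMA(p, d, q) process, whose behavior near frequency zero is the target of Hauser et al.'s (1999) criticism, has the form shown below; the paper's modification introduces an additional parameter to address that behavior.

```latex
% Spectral density of a standard ARFIMA(p, d, q) process with innovation variance \sigma^2:
f(\omega) = \frac{\sigma^2}{2\pi}\,
            \bigl(2\sin(\omega/2)\bigr)^{-2d}\,
            \frac{\lvert\theta(e^{-i\omega})\rvert^{2}}{\lvert\phi(e^{-i\omega})\rvert^{2}},
\qquad f(\omega) \sim C\,\omega^{-2d} \ \text{as } \omega \to 0 .
```

The limit makes the degeneracy explicit: the density diverges at frequency zero for d > 0 and vanishes for d < 0.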
