Similar Documents
A total of 20 similar documents were found.
1.
Complex time series often contain structural breaks and outliers, which tend to degrade the training of forecasting models and can produce extreme forecast values. To address this, this paper proposes a neural network ensemble forecasting method based on the trimmed average. The method first generates multiple training sets from the training data, then trains multiple neural network forecasting models, and finally combines the forecasts of the individual networks using a trimmed-average strategy. Compared with the simple average, the trimmed average is less sensitive to extreme values and yields a more robust ensemble forecast. In the empirical study, two ensemble forecasting models are constructed: the Trimmed Average based Bootstrap Neural Network Ensemble (TA-BNNE) and the Trimmed Average based Monte Carlo Neural Network Ensemble (TA-MCNNE). Both are applied to the NN3 competition data, and the results show that the trimmed-average strategy achieves better forecasting accuracy than the simple average on both the regular and the complex data sets. In addition, comparing the proposed ensemble models with the top ten NN3 entries shows that both models outperform the 6th-place entry on the full data set and outperform the 1st-place entry on the complex data set, further confirming the effectiveness of the proposed method.
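As a rough illustration of the combination step described above (not the authors' code; the trimming fraction, bootstrap resampling scheme, and the MLPRegressor settings are placeholder assumptions), a trimmed-average bootstrap ensemble in the spirit of TA-BNNE might look like this:

```python
import numpy as np
from scipy.stats import trim_mean
from sklearn.neural_network import MLPRegressor

def ta_bnne_forecast(X_train, y_train, X_test, n_models=20, trim=0.1, seed=0):
    """Bootstrap an ensemble of small neural nets and combine their forecasts
    with a trimmed mean instead of a simple average."""
    rng = np.random.default_rng(seed)
    n = len(y_train)
    preds = []
    for m in range(n_models):
        idx = rng.integers(0, n, size=n)      # bootstrap resample of the training set
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=m)
        net.fit(X_train[idx], y_train[idx])
        preds.append(net.predict(X_test))
    preds = np.vstack(preds)                  # shape: (n_models, n_test)
    # The trimmed mean discards the most extreme forecasts before averaging,
    # so a few wild networks cannot drag the ensemble forecast away.
    return trim_mean(preds, proportiontocut=trim, axis=0)
```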

2.
To capture the stylized facts of financial market returns, namely skewed and fat-tailed conditional distributions, asymmetric volatility, and long memory, this paper measures dynamic stock market risk with models such as ARFIMA-FIGARCH-SKST and examines the reliability of the risk measures through formal backtesting using the LRT and DQR methods. Several valuable empirical results are obtained: asymmetric risk models with and without the long-memory constraint show no substantial difference in their ability to measure dynamic risk in the Shanghai and Shenzhen stock markets; the ARFIMA-FIAPARCH-SKST model measures dynamic stock market risk accurately; and the asymmetry constraint, in particular, should not be dropped when measuring extreme stock market risk.

3.
Quantile regression (QR) fits a linear model for conditional quantiles just as ordinary least squares (OLS) fits a linear model for conditional means. An attractive feature of OLS is that it gives the minimum mean‐squared error linear approximation to the conditional expectation function even when the linear model is misspecified. Empirical research using quantile regression with discrete covariates suggests that QR may have a similar property, but the exact nature of the linear approximation has remained elusive. In this paper, we show that QR minimizes a weighted mean‐squared error loss function for specification error. The weighting function is an average density of the dependent variable near the true conditional quantile. The weighted least squares interpretation of QR is used to derive an omitted variables bias formula and a partial quantile regression concept, similar to the relationship between partial regression and OLS. We also present asymptotic theory for the QR process under misspecification of the conditional quantile function. The approximation properties of QR are illustrated using wage data from the U.S. census. These results point to major changes in inequality from 1990 to 2000.
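To make the quantile-regression objective concrete (a minimal sketch, not the estimators or census data used in the paper; the simulated heteroskedastic data and the Nelder-Mead optimizer are assumptions), linear QR can be obtained by minimizing the Koenker-Bassett check loss and compared with OLS:

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(beta, X, y, tau):
    """Koenker-Bassett check (pinball) loss for the tau-th quantile."""
    u = y - X @ beta
    return np.sum(u * (tau - (u < 0)))

def quantile_regression(X, y, tau=0.5):
    """Linear quantile regression by direct minimization of the check loss."""
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS coefficients as start values
    res = minimize(check_loss, beta0, args=(X, y, tau), method="Nelder-Mead")
    return res.x

# Simulated example: with heteroskedastic noise the 0.9-quantile fit
# differs visibly from the OLS (conditional-mean) fit.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y = 1.0 + 0.5 * x + rng.standard_normal(500) * (0.5 + 0.2 * x)
X = np.column_stack([np.ones_like(x), x])
print("OLS:     ", np.linalg.lstsq(X, y, rcond=None)[0])
print("QR(0.9): ", quantile_regression(X, y, tau=0.9))
```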

4.
We consider scheduling issues at Beyçelik, a Turkish automotive stamping company that uses presses to give shape to metal sheets in order to produce auto parts. The problem concerns the minimization of the total completion time of job orders (i.e., makespan) during a planning horizon. This problem may be classified as a combined generalized flowshop and flexible flowshop problem with special characteristics. We show that the Stamping Scheduling Problem is NP‐Hard. We develop an integer programming‐based method to build realistic and usable schedules. Our results show that the proposed method is able to find higher quality schedules (i.e., shorter makespan values) than both the company's current process and a model from the literature. However, the proposed method has a relatively long run time, which is not practical for the company in situations when a (new) schedule is needed quickly (e.g., when there is a machine breakdown or a rush order). To improve the solution time, we develop a second method that is inspired by decomposition. We show that the second method provides higher‐quality solutions—and in most cases optimal solutions—in a shorter time. We compare the performance of all three methods with the company's schedules. The second method finds a solution in minutes compared to Beyçelik's current process, which takes 28 hours. Further, the makespan values of the second method are about 6.1% shorter than the company's schedules. We estimate that the company can save over €187,000 annually by using the second method. We believe that the models and methods developed in this study can be used in similar companies and industries.

5.
This paper uses deep gated recurrent unit (GRU) neural networks to examine the nonlinear cointegration relationships implied by three monetary models of exchange rates (the flexible-price, forward-looking, and real interest rate differential models). GRU networks bring the advantages of deep learning, including gated memory, autonomous learning, and strong approximation ability. Using this technique, nonlinear Johansen cointegration tests are carried out on data from six countries with typical floating exchange rate regimes. The results show a nonlinear cointegration relationship between exchange rates and macroeconomic fundamentals, demonstrating the validity of the monetary models under nonlinearity and the advantages of modern deep learning tools for testing economic theory.
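For readers unfamiliar with the GRU mechanics mentioned above, a minimal NumPy sketch of a single GRU cell is given below (illustrative only; the paper's deep GRU networks and the nonlinear Johansen testing procedure are not reproduced, and the weight initialization is arbitrary):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """A single gated recurrent unit: update gate z, reset gate r, candidate state."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        def w(*shape):
            return 0.1 * rng.standard_normal(shape)
        self.Wz, self.Uz, self.bz = w(n_hidden, n_in), w(n_hidden, n_hidden), np.zeros(n_hidden)
        self.Wr, self.Ur, self.br = w(n_hidden, n_in), w(n_hidden, n_hidden), np.zeros(n_hidden)
        self.Wh, self.Uh, self.bh = w(n_hidden, n_in), w(n_hidden, n_hidden), np.zeros(n_hidden)

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h + self.bz)   # how much new content enters the state
        r = sigmoid(self.Wr @ x + self.Ur @ h + self.br)   # how much of the old state feeds the candidate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h) + self.bh)
        return (1.0 - z) * h + z * h_tilde                 # gated interpolation between old and new state
```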

6.
The high leverage of futures trading implies high risk in futures markets, and energy markets attract particular attention because of their strategic importance, so measuring the risk of energy futures markets is essential for both investors and regulators. This paper constructs four continuous price series for Shanghai fuel oil futures reflecting different delivery horizons, models volatility with the GARCH, GJR, and FIGARCH models to capture different stylized facts of financial markets, and measures dynamic Value-at-Risk (VaR) under normal, Student-t, and skewed Student-t (skst) conditional return distributions. Rigorous likelihood ratio (LR) tests and dynamic quantile regression (DQR) tests are then used to backtest the reliability of the risk measures, with the aim of extracting the stylized facts most useful for risk management. The findings are: (1) volatility models based on the skst distribution measure dynamic risk noticeably more accurately than the same models under other distributions; (2) the GJR model, which captures the leverage effect, and the FIGARCH model, which captures long memory, do not deliver higher accuracy than the plain GARCH model; (3) far-dated contracts earn higher average market returns, and their risk is measured more accurately than that of near-dated contracts.
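One common form of the LR backtest mentioned above is Kupiec's unconditional-coverage test. The sketch below is an illustration, not the paper's exact procedure, and it assumes that at least one but not every observation breaches the VaR forecast:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_lr_test(returns, var_forecasts, coverage=0.05):
    """Unconditional-coverage LR backtest: do VaR breaches occur at the nominal rate?
    `var_forecasts` are reported as positive loss quantiles."""
    breaches = returns < -var_forecasts          # breach: realized loss exceeds the VaR forecast
    n, x = len(returns), int(breaches.sum())
    pi_hat = x / n                               # observed breach rate (assumes 0 < x < n)
    # Log-likelihood under the nominal coverage vs. under the observed breach rate.
    ll_null = (n - x) * np.log(1 - coverage) + x * np.log(coverage)
    ll_alt = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
    lr = -2.0 * (ll_null - ll_alt)
    return lr, 1.0 - chi2.cdf(lr, df=1)          # reject the model if the p-value is small
```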

7.
This paper considers the problem of selection of weights for averaging across least squares estimates obtained from a set of models. Existing model average methods are based on exponential Akaike information criterion (AIC) and Bayesian information criterion (BIC) weights. In distinction, this paper proposes selecting the weights by minimizing a Mallows criterion, the latter an estimate of the average squared error from the model average fit. We show that our new Mallows model average (MMA) estimator is asymptotically optimal in the sense of achieving the lowest possible squared error in a class of discrete model average estimators. In a simulation experiment we show that the MMA estimator compares favorably with those based on AIC and BIC weights. The proof of the main result is an application of the work of Li (1987).
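A minimal sketch of the Mallows weight-selection idea follows. Continuous weights on the unit simplex found with SLSQP are used here for simplicity, whereas the paper studies discrete weight sets, and `sigma2` would typically be estimated from the largest candidate model:

```python
import numpy as np
from scipy.optimize import minimize

def mma_weights(y, fitted, dims, sigma2):
    """Mallows model averaging: choose weights on the unit simplex minimizing
    ||y - F w||^2 + 2 * sigma2 * k'w, where the columns of `fitted` are the
    candidate models' fitted values and `dims` holds their dimensions."""
    M = fitted.shape[1]
    crit = lambda w: np.sum((y - fitted @ w) ** 2) + 2.0 * sigma2 * (dims @ w)
    cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
    bounds = [(0.0, 1.0)] * M
    w0 = np.full(M, 1.0 / M)
    res = minimize(crit, w0, bounds=bounds, constraints=cons, method="SLSQP")
    return res.x
```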

8.
Resource scheduling for emergency relief operations is complex as it has many constraints. However, an effective allocation and sequencing of resources are crucial for the minimization of the completion times in emergency relief operations. Despite the importance of such decisions, only a few mathematical models of emergency relief operations have been studied. This article presents a bi-objective mixed integer programming (MIP) model that helps to minimize both the total weighted time of completion of the demand points and the makespan of the total emergency relief operation. A two-phase method is developed to solve the bi-objective MIP problem. Additionally, a case study of a hospital network in the Melbourne metropolitan area is used to evaluate the model. The results indicate that the model can successfully support the decisions required in the optimal resource scheduling of emergency relief operations.

9.
We study a variant of classical scheduling, which is called scheduling with “end of sequence” information. It is known in advance that the last job has the longest processing time. Moreover, the last job is marked, and thus it is known for every new job whether it is the final job of the sequence. We explore this model on two uniformly related machines, that is, two machines with possibly different speeds. Two objectives are considered: maximizing the minimum completion time and minimizing the maximum completion time (makespan). Let s be the speed ratio between the two machines; we consider the competitive ratios which are possible to achieve for the two problems as functions of s. We present algorithms for different values of s and lower bounds on the competitive ratio. The proposed algorithms are best possible for a wide range of values of s. For the overall competitive ratio, we show tight bounds of ϕ + 1 ≈ 2.618 for the first problem, and upper and lower bounds of 1.5 and 1.46557 for the second problem. The authors would like to dedicate this paper to the memory of our colleague and friend Yong He who passed away in August 2005 after struggling with illness. D. Ye: Research was supported in part by NSFC (10601048).

10.
Forecasting the stock market price index is a challenging task. The exponential smoothing model (ESM), autoregressive integrated moving average model (ARIMA), and the back propagation neural network (BPNN) can be used to make forecasts based on time series. In this paper, a hybrid approach combining ESM, ARIMA, and BPNN is proposed to exploit the advantages of all three models. The weights of the proposed hybrid model (PHM) are determined by a genetic algorithm (GA). The closing prices of the Shenzhen Integrated Index (SZII) and opening prices of the Dow Jones Industrial Average Index (DJIAI) are used as illustrative examples to evaluate the performance of the PHM. Numerical results show that the proposed model outperforms all traditional models, including ESM, ARIMA, BPNN, the equal weight hybrid model (EWH), and the random walk model (RWM).
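As a hedged illustration of the weight-search idea (the paper's exact GA configuration is not given here, so the population size, blend crossover, and mutation scale below are placeholders), convex combination weights for several candidate forecasts could be searched as follows:

```python
import numpy as np

def ga_combination_weights(y, forecasts, pop_size=50, n_gen=200, seed=0):
    """Search for convex combination weights of several candidate forecasts
    (e.g., ESM, ARIMA, BPNN columns in `forecasts`) that minimize in-sample MSE,
    using a tiny genetic algorithm: selection, blend crossover, Gaussian mutation."""
    rng = np.random.default_rng(seed)
    m = forecasts.shape[1]

    def normalize(w):
        w = np.clip(w, 1e-6, None)
        return w / w.sum()                       # keep weights positive and summing to one

    def fitness(w):
        return np.mean((y - forecasts @ w) ** 2)

    pop = np.array([normalize(rng.random(m)) for _ in range(pop_size)])
    for _ in range(n_gen):
        scores = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]   # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            children.append(normalize(0.5 * (a + b) + 0.05 * rng.standard_normal(m)))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(w) for w in pop])
    return pop[np.argmin(scores)]
```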

11.
This paper studies the necessity of transversality conditions for the continuous time, reduced form model. By generalizing Benveniste and Scheinkman's (1982) “envelope” condition and Michel's (1990) version of the squeezing argument, we show a generalization of Michel's (1990, Theorem 1) necessity result that does not assume concavity. The generalization enables us to generalize Ekeland and Scheinkman's (1986) result as well as to establish a new result that does not require the objective functional to be finite. The new result implies that homogeneity of the return function alone is sufficient for the necessity of the most standard transversality condition. Our results are also applied to a nonstationary version of the one‐sector growth model. It is shown that bubbles never arise in an equilibrium asset pricing model with a nonlinear constraint.

12.
The focus of this study is on the A+B transportation procurement mechanism, which uses the proposed cost (A component) and the proposed time (B component) to score contractors’ bids. Empirical studies have shown that this mechanism shortens project durations. We use normative models to study the effect of certain discretionary parameters set by state transportation agencies on contractors’ equilibrium bidding strategies, winner selection, and actual completion times. We model the bidding environment in detail including multi‐dimensional bids, contractors’ uncertainty about completion times, and reputation cost. The latter refers to a private penalty that accrues to tardy contractors from increased cost of posting bonds and reduced prospects of winning future projects. Our model explains why contractors may skew line‐item bids and why winners frequently finish earlier than bid. It has several policy implications as well. For example, we recommend that agencies set the daily incentive, disincentive, and road user cost to be equal and not cap incentives. This is a departure from current practice, where incentives are often capped and weaker than penalties. Furthermore, we show that agencies may be justified in setting daily road user cost strictly smaller than the true cost of traffic disruption during construction.

13.
Volatility modeling and estimation have traditionally relied on return information computed from closing prices, while volatility research based on the price range, which contains more information about intraday price movements, remains relatively scarce. This paper extends the classical conditional autoregressive range (CARR) model for the dynamics of the price range: following the modeling approach of stochastic volatility (SV) models, accounting for long memory in volatility, and using a Gamma distribution for the price-range innovations, it constructs a two-factor stochastic conditional range (2FSCR) model to describe the dynamics of the price range. A maximum likelihood estimation method based on a continuous particle filter is developed for the 2FSCR model, and Monte Carlo simulation experiments demonstrate the effectiveness of this estimation method. An empirical study using the Shanghai Composite Index (SSE), the Shenzhen Component Index (SZSE), the Hong Kong Hang Seng Index (HSI), and the S&P 500 Index (SPX) shows that the 2FSCR model fits the data better than both the CARR model and the single-factor SCR model. Further model diagnostics show that the 2FSCR model captures the tails of the price-range innovation distribution better than the CARR and SCR models and more fully captures the dynamic features of volatility (time variation, clustering, and long memory). Rolling-window volatility forecasts, evaluated against the price range and realized volatility as benchmarks, show that the 2FSCR model also forecasts volatility better than the CARR and SCR models.
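For orientation, the baseline CARR(1,1) recursion that the 2FSCR model extends can be simulated as below (a sketch only: the two latent factors, Gamma innovations, and particle-filter estimation of the 2FSCR model are omitted, and the parameter values are illustrative):

```python
import numpy as np

def simulate_carr(n, omega=0.05, alpha=0.2, beta=0.75, seed=0):
    """Simulate a CARR(1,1) process for the price range:
    R_t = lambda_t * eps_t with eps_t >= 0 and
    lambda_t = omega + alpha * R_{t-1} + beta * lambda_{t-1}.
    Here eps_t is drawn from a unit-mean exponential distribution."""
    rng = np.random.default_rng(seed)
    lam = np.empty(n)
    R = np.empty(n)
    lam[0] = omega / (1.0 - alpha - beta)      # unconditional mean as the starting value
    R[0] = lam[0] * rng.exponential(1.0)
    for t in range(1, n):
        lam[t] = omega + alpha * R[t - 1] + beta * lam[t - 1]
        R[t] = lam[t] * rng.exponential(1.0)
    return R, lam
```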

14.
The choice of high-frequency covariance matrix estimator and of forecasting model jointly determines the quality of covariance forecasts and hence the performance of volatility-timing portfolio strategies. When the number of assets is large, constructing a high-frequency covariance estimator can discard a great deal of data because of non-synchronous trading, lowering the efficiency of information use. This paper therefore applies the KEM estimator, which makes full use of intraday price information, to estimate high-dimensional covariance matrices of Chinese stocks and compares it with two commonly used covariance estimators. The three estimators are then fed into the multivariate heterogeneous autoregressive model, the exponentially weighted moving average model, and short-, medium-, and long-horizon moving average models for out-of-sample forecasting, and the economic value of the forecasts is compared under three risk-based portfolio strategies. An empirical study of tick-by-tick high-frequency data for 20 constituents of the SSE 50 index with different liquidity shows: (1) in both calm and highly volatile market periods, the long-horizon moving average model is the best forecasting choice for high-dimensional covariance estimators, achieving the lowest cost and highest return across all volatility-timing strategies; (2) in calm periods, the KEM estimator is the best choice for high-dimensional covariance estimation and generally achieves the lowest cost and highest return across strategies, while in highly volatile periods volatility timing with the KEM estimator still keeps a cost advantage but no longer dominates in returns; (3) in both calm and highly volatile periods, the lowest cost is achieved with the equal risk contribution portfolio and the highest return with the minimum-variance portfolio. This study is the first to examine the applicability of the KEM estimator in common volatility-timing strategies and the first to document empirically the advantage of the simplest long-horizon moving average model in forecasting high-dimensional covariance matrices, with important implications for investment decisions and risk management.
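A minimal sketch of the long-horizon moving-average covariance forecast and the minimum-variance weights discussed above is given below (the KEM estimator itself and the HAR/EWMA alternatives are not shown; `daily_cov_estimates` is assumed to be an array of daily covariance estimates of shape (T, N, N)):

```python
import numpy as np

def ma_covariance_forecast(daily_cov_estimates, horizon=60):
    """Long-horizon moving-average forecast: tomorrow's covariance is the plain
    average of the last `horizon` daily (e.g., high-frequency) covariance estimates."""
    return np.mean(daily_cov_estimates[-horizon:], axis=0)

def min_variance_weights(cov):
    """Unconstrained minimum-variance portfolio: w proportional to inv(Sigma) * 1,
    normalized so the weights sum to one."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()
```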

15.
Product development occurs in multiproject environments where preemption is often allowed so that critical projects can be addressed immediately. Because product development is characterized by time-based competition, there is pressure to make decisions quickly using heuristic methods that yield fast project completion. Preemption heuristics are needed both to choose activities for preemption and then to determine which resources to use to restart preempted activities. Past research involving preemption has ignored any completion time penalty due to the forgetting experienced by project personnel during preemption and the resulting relearning time required to regain lost proficiency. The purpose of this research is to determine the impact of learning, forgetting, and relearning (LFR) on project completion time when preemption is allowed. We present a model for the LFR cycle in multiproject development environments. We test a number of priority rules for activity scheduling, activity preemption, and resource assignment subsequent to preemption, subject to the existence of the LFR cycle, for which a single type of knowledge worker resource is assigned among multiple projects. The results of the simulation experiments clearly demonstrate that LFR effects are significant. The tests of different scheduling, preemption, and resource reassignment rules show that the choice of rule is crucial in mitigating the completion time penalty effects of the LFR cycle, while maintaining high levels of resource utilization. Specifically, the worst performing rules tested for each performance measure are those that attempt to maintain high resource utilization. The best performing rules are based on activity criticality and resource learning.

16.
We study the problem of optimally sequencing the creation of elements in a software project to optimize a time‐weighted value objective. As elements are created, certain parts of the system (referred to as “groups”) become functional and provide value, even though the entire system has not been completed. The main tradeoff in the sequencing problem arises from elements that belong to multiple groups. On the one hand, creating groups with common elements early in the project reduces the effort required to build later functionality that uses these elements. On the other hand, the early creation of such groups can delay the release of some critical functionality. We formulate the element sequencing problem and propose a heuristic to solve it. This heuristic is compared against a lower bound developed for the problem. Next, we study a more general version of the element sequencing problem in which an element requires some effort to be made reusable. When a reusable element is used in another group, some more effort is needed to specialize the element to work as desired in that group. We study reuse decisions under a weighted completion time objective (i.e., the sum of the completion time of each group weighted by its value is minimized), and show how these decisions differ from those under a traditional makespan objective (i.e., only the final completion time of the project is minimized). A variety of analytical and numerical results are presented. The model is also implemented on data obtained from a real software project. A key finding of this work is that the optimal effort on reuse is never increased (typically lowered) when a weighted completion time objective is used. This finding has implications for managing reuse in projects in which user value influences the order in which functionality is created.

17.
“Time‐to‐build” models of investment expenditures play an important role in many traditional and modern theories of the business cycle, especially for explaining the dynamic propagation of shocks. We estimate the structural parameters of a time‐to‐build model using annual firm‐level investment data on equipment and structures. For expenditures on equipment, we find no evidence of time‐to‐build effects beyond one year. For expenditures on structures, by contrast, there is clear evidence of such effects in the range of two to three years. The contrast between equipment and structures is intuitively reasonable and consistent with previous results. The estimates for structures also indicate that initial‐period expenditures are low and increase as projects near completion. These results provide empirical support for including “time‐to‐plan” effects for investment in structures. More generally, these results suggest a potential source of specification error for Q models of investment and production‐based asset pricing models that ignore the time required to plan, build, and install new capital. (JEL: D24, G31, C33, C34)

18.
We consider semiparametric estimation of the memory parameter in a model that includes as special cases both long‐memory stochastic volatility and fractionally integrated exponential GARCH (FIEGARCH) models. Under our general model the logarithms of the squared returns can be decomposed into the sum of a long‐memory signal and a white noise. We consider periodogram‐based estimators using a local Whittle criterion function. We allow the optional inclusion of an additional term to account for possible correlation between the signal and noise processes, as would occur in the FIEGARCH model. We also allow for potential nonstationarity in volatility by allowing the signal process to have a memory parameter d* ≥ 1/2. We show that the local Whittle estimator is consistent for d*∈(0,1). We also show that the local Whittle estimator is asymptotically normal for d*∈(0,3/4) and essentially recovers the optimal semiparametric rate of convergence for this problem. In particular, if the spectral density of the short‐memory component of the signal is sufficiently smooth, a convergence rate of n^(2/5−δ) for d*∈(0,3/4) can be attained, where n is the sample size and δ>0 is arbitrarily small. This represents a strong improvement over the performance of existing semiparametric estimators of persistence in volatility. We also prove that the standard Gaussian semiparametric estimator is asymptotically normal if d*=0. This yields a test for long memory in volatility.
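The local Whittle criterion itself is compact enough to sketch (shown here in its standard Gaussian semiparametric form without the paper's optional signal-noise correlation term; the bandwidth m and the search bounds are user choices):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle(x, m):
    """Local Whittle estimate of the memory parameter d from the first m
    Fourier frequencies of the series x (e.g., log squared returns)."""
    n = len(x)
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / n                                # Fourier frequencies
    w = np.fft.fft(x - x.mean())[1:m + 1]
    I = (np.abs(w) ** 2) / (2.0 * np.pi * n)                 # periodogram ordinates
    def objective(d):
        # R(d) = log( mean_j lam_j^{2d} I_j ) - 2d * mean_j log(lam_j)
        return np.log(np.mean(lam ** (2.0 * d) * I)) - 2.0 * d * np.mean(np.log(lam))
    res = minimize_scalar(objective, bounds=(-0.49, 0.99), method="bounded")
    return res.x
```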

19.
The primary mission of search and rescue (SAR) is the saving of lives. To assess SAR operations from a planning perspective, one must draw a connection between operations and the number of lives saved. Our approach is to model the probability that an incident results in at least one fatality, given the response time between the time of incident occurrence and time of rescue. We show that incidents involving air crashes, capsizing, foundering, grounding and other/unknown types of incidents tended to have higher probabilities of fatalities as response time increased. However, other emergency types did not exhibit the same overall tendency. These statistical results do not prove causality between faster response times and lower fatality incidence for the above-mentioned emergency types. They can be used, however, for estimating the average number of fatalities for a given distribution of response time, and ultimately the marginal savings in lives for a change in the mix of resources and locations.
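A hedged sketch of the modeling idea (the incident data below are entirely hypothetical, and a plain logistic regression stands in for whatever statistical model the paper actually estimates) might look like:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical incident records: response time in minutes and whether the
# incident involved at least one fatality (1) or none (0).
response_time = np.array([[15], [30], [45], [60], [90], [120], [180], [240]])
fatality = np.array([0, 0, 0, 1, 0, 1, 1, 1])

model = LogisticRegression().fit(response_time, fatality)
# Estimated probability of at least one fatality as a function of response time;
# integrating this over a response-time distribution gives expected fatalities.
print(model.predict_proba([[75]])[:, 1])
```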

20.
Cointegrated bivariate nonstationary time series are considered in a fractional context, without allowance for deterministic trends. Both the observable series and the cointegrating error can be fractional processes. The familiar situation in which the respective integration orders are 1 and 0 is nested, but these values have typically been assumed known. We allow one or more of them to be unknown real values, in which case Robinson and Marinucci (2001, 2003) have justified least squares estimates of the cointegrating vector, as well as narrow‐band frequency‐domain estimates, which may be less biased. While consistent, these estimates do not always have optimal convergence rates, and they have nonstandard limit distributional behavior. We consider estimates formulated in the frequency domain, that consequently allow for a wide variety of (parametric) autocorrelation in the short memory input series, as well as time‐domain estimates based on autoregressive transformation. Both can be interpreted as approximating generalized least squares and Gaussian maximum likelihood estimates. The estimates share the same limiting distribution, having mixed normal asymptotics (yielding Wald test statistics with χ2 null limit distributions), irrespective of whether the integration orders are known or unknown, subject in the latter case to their estimation with adequate rates of convergence. The parameters describing the short memory stationary input series are √n‐consistently estimable, but the assumptions imposed on these series are much more general than ones of autoregressive moving average type. A Monte Carlo study of finite‐sample performance is included.
