Similar Literature (20 similar records found)
1.
To handle problems such as negative correlation between sequence curves, and to make the relational measure satisfy certain desirable properties, this paper builds on the basic ideas of grey relational analysis and constructs a new grey absolute relational degree model from the average relative change trend of the sequence curves. The uniqueness, symmetry, similarity, parallelism, and consistency properties of the model are examined, and the new model is then used to measure the degree of order of the industrial structure of China as a whole and of its provinces, municipalities, and autonomous regions. The results show that the new model is simple to compute, has a small computational burden, and agrees better with reality.
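The abstract does not give the new model's formula. As a point of reference only, the sketch below implements the classical grey absolute degree of incidence (Liu's formulation), a standard baseline in grey relational analysis; it is not the paper's new average-relative-change model.

```python
import numpy as np

def grey_absolute_degree(x0, x1):
    """Classical grey absolute degree of incidence (Liu's formulation), not the
    paper's new model. x0, x1: equally spaced sequences of the same length."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.asarray(x1, dtype=float)
    # Zero-starting images: shift each sequence so it starts at 0.
    z0 = x0 - x0[0]
    z1 = x1 - x1[0]
    # |s_i| approximates the area between the zeroed curve and the time axis.
    s0 = abs(z0[1:-1].sum() + 0.5 * z0[-1])
    s1 = abs(z1[1:-1].sum() + 0.5 * z1[-1])
    s_diff = abs((z1[1:-1] - z0[1:-1]).sum() + 0.5 * (z1[-1] - z0[-1]))
    return (1 + s0 + s1) / (1 + s0 + s1 + s_diff)

# Example: two sequences with identical change trends give a degree of 1.
print(grey_absolute_degree([10, 12, 15, 19], [20, 22, 25, 29]))
```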

2.
The Term Structure of Interest Rates in the Shanghai Stock Exchange Treasury Bond Market and Its Information Value (cited 10 times: 0 self-citations, 10 by others)
We first use the Nelson-Siegel parametric estimation model to extract the term structure of interest rates implied by bond prices on the Shanghai Stock Exchange (SSE) bond market, and find that the implied term structure takes two typical shapes: an inverted term structure before 1996 and an upward-sloping term structure after 1996. We then empirically test the explanatory power of the expectations hypothesis for the SSE treasury bond market, find that the hypothesis does not hold, and find that the term structure can be used to forecast bond returns. Finally, we show empirically that when the information in the term structure is fully exploited, the predictability of bond returns can exceed 50%.
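As an illustration of the first step, here is a minimal sketch of fitting the Nelson-Siegel yield curve with SciPy; the maturities and spot rates below are invented, whereas the paper extracts them from SSE bond prices.

```python
import numpy as np
from scipy.optimize import curve_fit

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel zero-coupon yield as a function of maturity tau (years)."""
    x = tau / lam
    loading1 = (1 - np.exp(-x)) / x       # short-term (slope) loading
    loading2 = loading1 - np.exp(-x)      # medium-term (curvature) loading
    return beta0 + beta1 * loading1 + beta2 * loading2

# Hypothetical observed spot rates (maturity in years, yield in decimals).
maturities = np.array([0.5, 1, 2, 3, 5, 7, 10])
yields = np.array([0.021, 0.023, 0.027, 0.030, 0.034, 0.036, 0.038])

params, _ = curve_fit(nelson_siegel, maturities, yields,
                      p0=[0.04, -0.02, 0.01, 1.5], maxfev=10000)
beta0, beta1, beta2, lam = params
print("level=%.4f slope=%.4f curvature=%.4f lambda=%.2f" % (beta0, beta1, beta2, lam))
```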

3.
In the evaluation of chemical compounds for carcinogenic risk, regulatory agencies such as the U.S. Environmental Protection Agency and National Toxicology Program (NTP) have traditionally fit a dose-response model to data from rodent bioassays, and then used the fitted model to estimate a Virtually Safe Dose, the dose corresponding to a very small increase (usually 10^-6) in risk over background. Much recent interest has been directed at incorporating additional scientific information about the specific chemical under investigation into the risk assessment process, including biological mechanisms of cancer induction, metabolic pathways, and chemical structure and activity. Although regulatory agencies are currently poised to allow the use of nonlinear dose-response models, based on the concept of an underlying threshold, for nongenotoxic chemicals, there have been few attempts to investigate the overall relationship between the shape of dose-response curves and mutagenicity. Using data from a historical database of NTP cancer bioassays, the authors conducted a repeated-measures analysis of the estimated shape from fitting extended Weibull dose-response curves. It was concluded that genotoxic chemicals have dose-response curves that are closer to linear than those for nongenotoxic chemicals, though on average both types of compounds have dose-response curves that are convex, and the effect of genotoxicity is small.

4.
In this paper, we propose a new learning effect model in which the actual processing time of a job is a general function of the normal processing times of the jobs already processed and of its scheduled position. This model has the advantage that different learning curves, such as the plateau function, can be constructed easily. It is found that most of the models in the literature are special cases of the proposed model. Optimal sequences for some single-machine problems are then provided.
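The abstract leaves the learning function general; the sketch below shows one commonly used concrete instance (a power function of the accumulated normal processing time and the position, with assumed exponents), purely to illustrate how actual times shrink along a sequence.

```python
def actual_time(p_normal, prior_sum, position, a=-0.3, b=-0.2):
    """One concrete instance of a sum-of-processing-time plus position learning
    effect: actual time = normal time * (1 + sum of prior normal times)**a * position**b,
    with a, b <= 0 (hypothetical exponents; the paper allows a general function)."""
    return p_normal * (1 + prior_sum) ** a * position ** b

def schedule_times(normal_times):
    """Actual processing times when jobs are run in the given order."""
    actual, prior_sum = [], 0.0
    for r, p in enumerate(normal_times, start=1):
        actual.append(actual_time(p, prior_sum, r))
        prior_sum += p            # learning accumulates with processed normal time
    return actual

print([round(t, 2) for t in schedule_times([5, 3, 8, 2])])
```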

5.
A Three-Factor Generalized Gaussian Affine Model of the Term Structure of Interest Rates on the Shanghai Stock Exchange (cited 3 times: 0 self-citations, 3 by others)
Taking the term structure of interest rates implied by bond prices on the Shanghai Stock Exchange (SSE) as its data, this paper first applies principal component analysis to the changes in the term structure and finds that two to three state variables are required before an interest rate model can capture those changes. Since earlier work also found that the term structure has some degree of predictability, a three-factor generalized Gaussian affine model is chosen to describe the SSE term structure. The continuous-time three-factor generalized Gaussian affine model is estimated using the Kalman filter together with maximum likelihood estimation, and the estimated model can describe the relative changes in the SSE term structure.

6.
Scour (localized erosion by water) is an important risk to bridges, and hence to many infrastructure networks, around the world. In Britain, scour has caused the failure of railway bridges crossing rivers in more than 50 flood events. These events have been investigated in detail, providing a data set with which we develop and test a model to quantify scour risk. The risk analysis is formulated in terms of a generic, transferable infrastructure network risk model. For some bridge failures, the severity of the causative flood was recorded or can be reconstructed. These data are combined with the background failure rate, and with records of bridges that have not failed, to construct fragility curves that quantify the failure probability conditional on the severity of a flood event. The fragility curves generated are to some extent sensitive to the way in which these data are incorporated into the statistical analysis. The new fragility analysis is tested using flood events simulated from a spatial joint probability model for extreme river flows at all river gauging sites in Britain. The combined models appear robust in comparison with historical observations of the expected number of bridge failures in a flood event. The analysis is used to estimate the probability of single or multiple bridge failures in Britain's rail network. Combined with a model for passenger journey disruption in the event of bridge failure, we calculate a system-wide estimate of the risk of scour failures in terms of passenger journey disruptions and associated economic costs.
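A minimal sketch of the core idea, a fragility curve giving failure probability conditional on flood severity, fitted here by plain logistic maximum likelihood to made-up failure/survival records; the paper's treatment, which folds in background failure rates and partially reconstructed severities, is more elaborate.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical records: flood severity (standardized index) and whether
# the bridge failed (1) or survived (0).
severity = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 1.2, 2.8])
failed   = np.array([0,   0,   0,   1,   0,   1,   1,   1,   0,   1])

def neg_log_lik(theta):
    a, b = theta
    p = 1.0 / (1.0 + np.exp(-(a + b * severity)))   # logistic fragility curve
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(failed * np.log(p) + (1 - failed) * np.log(1 - p))

res = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = res.x
print("P(failure | severity=3) =", 1.0 / (1.0 + np.exp(-(a_hat + b_hat * 3.0))))
```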

7.
On Broadening Failure Rate Distributions in PRA Uncertainty Analyses (cited 1 time: 0 self-citations, 1 by others)
Several recent nuclear power plant probabilistic risk assessments (PRAs) have utilized broadened Reactor Safety Study (RSS) component failure rate population variability curves to compensate for such things as expert "overvaluation bias" in the estimates upon which the curves are based.
A simple empirical Bayes model with two components of variation is proposed for estimating the between-expert variability curve in the presence of such biases. Under certain conditions this curve is a population variability curve. Comparisons are made with the existing method.
The popular procedure appears to be generally much more conservative than the empirical Bayes method in removing such biases. In one case the broadened curve based on the popular method is more than two orders of magnitude broader than the empirical Bayes curve. In another case the maximum justifiable degree of broadening of the RSS curve is found to be an increase in α from 5% to 12%, significantly less than the 20% value recommended in the popular approach.

8.
Woody M. Liao, Decision Sciences, 1979, 10(1): 116-125
Learning curves have important implications for managerial planning and control. This paper considers the effect of learning on managerial planning models for product-mix problems that can be handled by a linear-programming formulation. An approach to incorporating learning effects into the planning model is proposed. The feasibility and superiority of the proposed approach over the traditional approach are discussed through the use of a linear-programming problem.

9.
10.
Pregnant CD-1 mice were exposed to cortisone acetate at doses ranging from 20 to 100 mg/kg/day on days 10-13 by oral and intramuscular routes. Multiple replicate assays were conducted under identical conditions to assess the reproducibility of the dose-response curve for cleft palate. The data were fitted separately for each route of exposure to the probit, logistic, multistage (Armitage-Doll), and Weibull dose-response models. The curves were then tested for parallel slopes (probit and logistic models) or coincidence of model parameters (multistage and Weibull models). The 19 replicate experiments yielded a wide range of slope estimates, wider for the oral than for the intramuscular experiments. For all models and both routes of exposure, the null hypothesis of equal slopes was rejected at a significance level of p < 0.001. For the intramuscular group of replicates, the rejection of slope equality could in part be explained by failure to maintain a standard dosing regimen. The rejection of equivalence of dose-response curves from replicate studies shows that it is difficult to reproduce the dose-response data of a single study within the limits defined by the dose-response model. This has important consequences for quantitative risk assessment, public health measures, and the development of mechanistic theories, which are typically based on a single animal bioassay.

11.
This note suggests the use of Bézier curves to model probability distributions on computers. This represents an approach completely different from current practice, which mostly employs parametric families or piecewise polynomials. Bézier curves combine simplicity and flexibility with an easy manipulation method, allowing curves to be fitted more accurately to data.
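A minimal sketch of the building block involved: evaluating a Bézier curve with de Casteljau's algorithm, used here to trace a smooth CDF-like curve from invented control points (the note itself describes a richer manipulation method).

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by de Casteljau's algorithm."""
    pts = np.array(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]   # repeated linear interpolation
    return pts[0]

# Hypothetical control points shaping a smooth, nondecreasing CDF on [0, 10]:
# x runs from 0 to 10, F from 0 to 1; the interior points are the "handles"
# an analyst would move to match empirical quantiles.
ctrl = [(0.0, 0.0), (2.0, 0.1), (4.0, 0.7), (10.0, 1.0)]

curve = np.array([de_casteljau(ctrl, t) for t in np.linspace(0, 1, 11)])
for x, F in curve:
    print(f"x={x:5.2f}  F(x)={F:4.2f}")
```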

12.
Data from a human feeding trial with healthy men were used to develop a dose-response model for 13 strains of Salmonella and to determine the effects of strain variation on the shape of the dose-response curve. Dose-response data for individual strains were fit to a three-phase linear model to determine minimum, median, and maximum illness doses, which were used to define Pert distributions in a computer simulation model. Pert distributions for the illness dose of individual strains were combined in an Excel spreadsheet using a discrete distribution to model strain prevalence. In addition, a discrete distribution was used to model dose groups and thus create a model that simulated human feeding trials. During simulation of the model with @Risk, an illness dose and a dose consumed were randomly assigned to each consumption event in the simulated feeding trial; if the illness dose was greater than the dose consumed, the model predicted no illness, otherwise it predicted that an illness would occur. To verify the dose-response model predictions, the original feeding trial was simulated. The dose-response model predicted a median of 69 (range 43-101) illnesses compared to 74 in the original trial; thus, its predictions agreed with the data used to develop it. However, the predictions of the model are valid only for eggnog, healthy men, and the strains and doses of Salmonella used to develop it. When multiple strains of Salmonella were simulated together, the predicted dose-response curves were irregular in shape. Thus, the sigmoid shape of dose-response curves in feeding trials with a single strain of Salmonella may not accurately reflect dose response in naturally contaminated food, where multiple strains may be present.
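A minimal sketch of this simulation logic in Python rather than @Risk: PERT illness doses per strain, a discrete strain-prevalence distribution, discrete dose groups, and an illness whenever the dose consumed reaches the illness dose. The strain parameters, prevalences, and dose groups below are invented, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

def pert(minimum, mode, maximum, size):
    """Sample a (standard) PERT distribution via its Beta representation."""
    alpha = 1 + 4 * (mode - minimum) / (maximum - minimum)
    beta = 1 + 4 * (maximum - mode) / (maximum - minimum)
    return minimum + (maximum - minimum) * rng.beta(alpha, beta, size)

# Hypothetical illness-dose parameters (log10 CFU) and prevalence for three strains.
strains = {
    "A": dict(lo=3.0, md=5.0, hi=7.0, prev=0.5),
    "B": dict(lo=4.0, md=6.0, hi=8.0, prev=0.3),
    "C": dict(lo=2.5, md=4.5, hi=6.5, prev=0.2),
}
dose_groups = np.array([4.0, 5.0, 6.0, 7.0])   # log10 dose consumed per group

n = 10_000
names = list(strains)
picked = rng.choice(names, size=n, p=[strains[s]["prev"] for s in names])
dose_consumed = rng.choice(dose_groups, size=n)

illness_dose = np.empty(n)
for s in names:                                 # draw each subject's illness dose
    idx = picked == s
    p = strains[s]
    illness_dose[idx] = pert(p["lo"], p["md"], p["hi"], idx.sum())

# Illness occurs when the dose consumed reaches the subject's illness dose.
ill = dose_consumed >= illness_dose
print("predicted attack rate: %.2f" % ill.mean())
```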

13.
While many IT security incidents result in relatively minor operational disruptions or minimal recovery costs, occasionally high-impact security breaches can have catastrophic effects on the firm. Unfortunately, measuring security risk and planning for countermeasures or mitigation is a difficult task. Past research has suggested risk metrics which may be beneficial in understanding and planning for security incidents, but most of these metrics are aimed at identifying expected overall loss and do not directly address the identification of, or planning for, sparse events which might result in high-impact loss. The use of an upper percentile value or some other worst-case measure has been widely discussed in the literature as a means of stochastic optimization, but has not been applied to this decision domain. A key requirement in security planning for any threat scenario, expected or otherwise, is the ability to choose countermeasures optimally with regard to tradeoffs between countermeasure cost and remaining risk. Most of the planning models in the literature are qualitative, and none that we are aware of allow for the optimal determination of these tradeoffs. Therefore, we develop a model for optimally choosing countermeasures to block or mitigate security attacks in the presence of a given threat level profile. We utilize this model to examine scenarios under both expected threat levels and worst-case levels, and develop budget-dependent risk curves. These curves demonstrate the tradeoffs which occur if decision makers divert budgets away from planning for ordinary risk in an effort to mitigate the effects of potential high-impact outcomes.

14.
Nonlinear hazard models are used to examine temporal trends in the age-specific mortality risks of chronic obstructive lung diseases for the U.S. population. These hazard functions are fit to age-specific mortality rates for 1968 and 1977 for four race/sex groups. Changes in the parameters of these models are used to assess two types of differences in the age pattern of the rates between 1968 and 1977. The first measure of trend in the age-specific mortality rates is the temporal change in the proportionality constant in the function used to model their age variation. By allowing only this proportionality parameter to vary between 1968 and 1977, it is possible to determine an age-constant percentage increase or decrease. The second measure reflects the absolute displacement in terms of years of life of the fitted mortality curves for the two time points. This second index can be interpreted as the acceleration or deceleration of mortality risks over the life span, i.e., the number of years that is needed for mortality rates to achieve the same level as in the comparison group. The analysis showed that the age changes in chronic obstructive lung disease mortality rates differed by race/sex group and for both measures of change over the period. Adjustment of the fitted curves for the effects of individual variability in risk was significant for three of four groups.

15.
This paper combines learning curves with a PERT network to produce a dynamic PERT model. The dynamic model recognizes that many projects are of a repetitive nature and that the network may vary between runs of the project through the addition or deletion of activities attendant to producing variations of a basic model. Thus, on any given run, the activities comprising a network will exhibit varying degrees of repetitiveness. The proposed model treats the estimated completion times of the activities comprising the network as a function of (1) the number of times the various activities have been repeated on prior runs of the project, and (2) the learning rate attendant to each activity. Thus, the estimated completion time for a run through the project changes as additional units are produced. A sixteen-event PERT network is simulated (using the proposed dynamic model) through twenty runs of a project. The simulation is conducted under three situations: with learning taking place on (1) only noncritical activities, (2) only critical activities, and (3) all activities. In all three cases the results are compared to the static PERT model. The implications of the proposed model for improved decision making are presented in the concluding remarks of the paper.
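The paper makes each activity's estimated duration a function of its prior repetitions and an activity-specific learning rate; here is a minimal sketch of the standard Wright-type learning curve that such a model might use (the 15-hour first run and 85% rate are assumptions).

```python
import math

def activity_time(first_time, repetitions, learning_rate):
    """Wright-style learning curve: each doubling of cumulative repetitions
    multiplies the activity time by the learning rate (e.g., 0.85 = 85% curve)."""
    if repetitions < 1:
        return first_time
    b = math.log(learning_rate, 2)            # log2 of the learning rate
    return first_time * repetitions ** b

# Estimated duration of one activity across runs of the project, assuming a
# hypothetical 15-hour first execution and an 85% learning curve.
for run in (1, 2, 5, 10, 20):
    print(f"run {run:2d}: {activity_time(15.0, run, 0.85):5.2f} h")
```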

16.
The common approach to balancing mixed-model assembly lines assumes that the line operators are well trained and that the learning effect is negligible. The assumption is that the line operates in steady state over a long period of time. Time-based competition and frequent design changes in many products make this assumption incorrect, and the effect of learning on mixed-model lines should not be neglected. We defined the start-up period and developed a model for line design during start-up. It can be used to evaluate a proposed line design or to develop a feasible line design and estimate its cost. The proposed model integrates mixed-model learning curves with aggregate planning under learning and mixed-model line design into a comprehensive framework designed to minimize the total cost of the line during the start-up period.

17.
We describe geometric invariants that characterize the shape of curves and surfaces in 3D space: curvature, Gauss integrals, and moments. We apply these invariants to neuroimaging data to determine whether they are useful for automatically classifying and parcellating cortical data. The curves of sulci and gyri on the cortical surface can be obtained by reconstructing cortical surface representations of the human brain from magnetic resonance imaging (MRI) data. We reconstructed gray matter surfaces for 15 subjects, traced 10 sulcal curves on each surface, and computed geometric invariants for each curve. These geometric features were used to classify the curves into sulcal and hemispheric classes. The best classification results were obtained when moment-based features were computed on the sulcal curves in native space. Gauss integral measures proved useful for differentiating the hemispheric location of a single sulcus. These promising results may indicate that moment invariants are useful for characterizing shape on a global scale, while Gauss integral invariants are potentially useful measures for characterizing cortical shape on a local, rather than global, scale. Gauss integrals have found biological significance in characterizing proteins, so it is worthwhile to consider their possible application to neuroscientific data.
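Of the invariants listed, curvature is the most direct to compute on a sampled 3D curve; a minimal sketch using finite differences, checked on a helix rather than on MRI-derived sulcal curves:

```python
import numpy as np

def discrete_curvature(points):
    """Curvature kappa = |r' x r''| / |r'|^3 of a sampled 3D curve,
    with derivatives approximated by finite differences."""
    r = np.asarray(points, dtype=float)
    d1 = np.gradient(r, axis=0)               # first derivative wrt sample index
    d2 = np.gradient(d1, axis=0)              # second derivative
    cross = np.cross(d1, d2)
    return np.linalg.norm(cross, axis=1) / np.linalg.norm(d1, axis=1) ** 3

# A helix sampled at 200 points: its true curvature is constant,
# a / (a**2 + c**2) with radius a = 1 and pitch factor c = 0.2.
t = np.linspace(0, 4 * np.pi, 200)
helix = np.column_stack([np.cos(t), np.sin(t), 0.2 * t])
kappa = discrete_curvature(helix)
print("mean curvature %.3f (analytic %.3f)" % (kappa[5:-5].mean(), 1 / (1 + 0.2 ** 2)))
```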

18.
A mathematical model of receptor-mediated gene expression that includes receptor binding of natural and xenobiotic ligands, protein synthesis and degradation, and metabolism of the xenobiotic ligand was created to identify the determinants of the shape of the dose-response profile. Values of the model's parameters were varied to reflect alternative mechanisms of expression of the protein. These assumptions had dramatic effects on the computed response to a bolus dose of the xenobiotic ligand. If all processes in the model exhibit hyperbolic kinetics, the dose-response curves can appear sigmoidal but actually be linear with a positive slope at low doses. The slope of the curve only approached zero at low dose, indicative of a threshold for response, if binding of the xenobiotic ligand to the receptor exhibited positive cooperativity (ligand binding at one site increases the affinity for ligand at another binding site on the receptor). Positive cooperativity in the rate-limiting step of protein synthesis produced dose-response curves which were "U-shaped" at low doses, also indicative of a threshold. Positive cooperativity in the metabolism of the xenobiotic ligand produced dose-response curves that increased more rapidly than linearly with increasing dose. The model illustrates the fact that response cannot be predicted from qualitative mechanistic arguments alone; any assessment of risk to health from xenobiotic chemicals must be based on a detailed quantitative examination of the kinetic behavior of each chemical species individually.
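The full model couples binding, synthesis, degradation, and metabolism; the sketch below isolates only the receptor-binding step with a Hill function (kd and n are illustrative values) to show why positive cooperativity (n > 1) drives the low-dose response toward a threshold-like shape.

```python
import numpy as np

def receptor_occupancy(dose, kd=1.0, n=1.0):
    """Hill-type fractional receptor occupancy; n = 1 is hyperbolic (non-cooperative),
    n > 1 models positive cooperativity. kd and n here are illustrative values."""
    return dose ** n / (kd ** n + dose ** n)

low_doses = np.array([1e-4, 1e-3, 1e-2])
for n in (1.0, 2.0):
    occ = receptor_occupancy(low_doses, n=n)
    # Local log-log slope: ~1 means response falls off linearly with dose,
    # ~2 means it falls off quadratically, i.e., an apparent threshold.
    slope = np.diff(np.log(occ)) / np.diff(np.log(low_doses))
    print(f"n={n}: low-dose log-log slope ~ {slope.mean():.2f}")
```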

19.
Ames et al. have proposed a new model for evaluating carcinogenic hazards in the environment. They advocate ranking possible carcinogens on the basis of the TD50, the estimated dose at which 50% of the test animals would get tumors, and extrapolating that ranking to all other doses. We argue that implicit in this methodology is a simplistic and inappropriate statistical model: all carcinogens are assumed to act similarly and to have dose-response curves of the same shape, differing only in the value of one parameter. We show by counterexample that the rank order of cancer potencies for two chemicals can change over a reasonable range of doses. Ames et al.'s use of these TD50 ranks to compare the hazards from low-level exposures to contaminants in our food and environment is wholly inappropriate and inaccurate, and their dismissal of public health concern for environmental exposures in general, based on these comparisons, is not supported by the data.
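The counterexample is easy to make concrete. The sketch below uses two invented one-parameter curves, one low-dose linear and one quadratic, whose potency ordering at the TD50 is the reverse of their ordering at a low environmental dose.

```python
import numpy as np

# Two hypothetical dose-response curves of different shape:
# chemical A is low-dose linear, chemical B is quadratic (threshold-like).
risk_A = lambda d: 1 - np.exp(-0.1 * d)        # extra risk, shallow but linear
risk_B = lambda d: 1 - np.exp(-0.5 * d ** 2)   # steep near its TD50, ~d^2 at low dose

td50_A = np.log(2) / 0.1                        # dose where risk = 50%  (~6.93)
td50_B = np.sqrt(np.log(2) / 0.5)               #                        (~1.18)
print("TD50 ranking: B is 'more potent' (TD50_B %.2f < TD50_A %.2f)" % (td50_B, td50_A))

low_dose = 0.01
print("risk at low dose: A %.1e  >  B %.1e" % (risk_A(low_dose), risk_B(low_dose)))
# The potency order at the TD50 reverses at low dose, which is the point:
# a TD50 ranking cannot be extrapolated to environmental exposure levels.
```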

20.
This paper proposes new explicit formulas for computing the Tate pairing on Jacobi quartic elliptic curves. We give the first geometric interpretation of the group law on Jacobi quartic curves by presenting the functions that arise in addition and doubling. We draw together the best available optimizations for efficiently evaluating the Tate pairing on Jacobi quartic curves. The resulting formulas are competitive with all published formulas for Tate pairing computation using short Weierstrass or twisted Edwards curves. Finally, we present several examples of pairing-friendly Jacobi quartic elliptic curves that provide an optimal Tate pairing.
