Similar Documents
20 similar documents found (search time: 514 ms)
1.
Obvious spatial infection patterns are often observed in cases associated with airborne transmissible diseases. Existing quantitative infection risk assessment models analyze the observed cases by assuming a homogeneous infectious particle concentration and ignore the spatial infection pattern, which may cause errors. This study aims at developing an approach to analyze spatial infection patterns associated with infectious respiratory diseases or other airborne transmissible diseases using infection risk assessment and likelihood estimation. Mathematical likelihood, based on binomial probability, was used to formulate the retrospective component with some additional mathematical treatments. Together with an infection risk assessment model that can address spatial heterogeneity, the method can be used to analyze the spatial infection pattern and retrospectively estimate the influencing parameters causing the cases, such as the infectious source strength of the pathogen. A Varicella outbreak was selected to demonstrate the use of the new approach. The infectious source strength estimated by the Wells‐Riley concept using the likelihood estimation was compared with the estimation using the existing method. It was found that the maximum likelihood estimation matches the epidemiological observation of the outbreak case much better than the estimation under the assumption of homogeneous infectious particle concentration. Influencing parameters retrospectively estimated using the new approach can be used as input parameters in quantitative infection risk assessment of the disease under other scenarios. The approach developed in this study can also serve as an epidemiological tool in outbreak investigation. Limitations and further developments are also discussed.  相似文献   
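For readers who want the flavour of the retrospective likelihood step, the sketch below maximises a binomial likelihood of a spatial case pattern over the quantum generation rate using the classical Wells-Riley equation. The per-zone case counts, ventilation rates, and breathing rate are hypothetical, and the paper's spatially resolved risk model is replaced here by simple per-zone well-mixed parameters.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical zone data: susceptibles, observed cases, and exposure settings.
zones = [
    # (susceptibles, cases, ventilation_rate_m3_per_h, exposure_time_h)
    (20, 8, 300.0, 2.0),   # zone containing the index case
    (25, 3, 600.0, 2.0),   # adjacent zone, better ventilated
    (30, 1, 900.0, 2.0),   # distant zone
]
p = 0.48   # pulmonary ventilation rate of a susceptible (m^3/h), assumed
I = 1      # number of infectors, assumed

def infection_prob(q, Q, t):
    """Wells-Riley probability of infection for quantum generation rate q."""
    return 1.0 - np.exp(-I * q * p * t / Q)

def neg_log_likelihood(q):
    """Binomial log-likelihood of the observed spatial case pattern."""
    ll = 0.0
    for n, c, Q, t in zones:
        P = infection_prob(q, Q, t)
        ll += c * np.log(P) + (n - c) * np.log(1.0 - P)
    return -ll

# Maximum likelihood estimate of the infectious source strength q (quanta/h).
res = minimize_scalar(neg_log_likelihood, bounds=(1e-3, 5000.0), method="bounded")
print(f"MLE of quantum generation rate: {res.x:.1f} quanta/h")
```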

2.
Recent concern with the potential for stray carbon fibers to damage electronic equipment and cause economic losses has led to the development of advanced risk-assessment methods. Risk assessment often requires the synthesis of risk profiles which represent the probability distribution of total annual losses due to a certain set of events or activities. A number of alternative probabilistic models are presented which the authors have used to develop such profiles. Examples are given of applications of these methods to assessment of risk due to conductive fibers released from aircraft or automobile fires. These assessments usually involve a two-stage approach: estimation of losses for several subclassifications of the overall process, and synthesis of the results into an aggregate risk profile. The methodology presented is capable of treating a wide variety of situations involving sequences of random physical events.  相似文献   
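A minimal sketch of how such a risk profile can be synthesised by Monte Carlo simulation, reflecting the two-stage approach: annual event counts per subclassification are drawn from Poisson distributions, event losses from lognormals, and the results are aggregated into an annual-loss exceedance curve. All rates and loss parameters are invented for illustration and do not come from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 20_000  # simulated years

# Hypothetical subclassifications (e.g., aircraft fires, automobile fires):
# annual event rate and lognormal loss-per-event parameters (in $).
subclasses = {
    "aircraft_fire":   {"rate": 0.8, "mu": 11.0, "sigma": 1.2},
    "automobile_fire": {"rate": 5.0, "mu": 8.5,  "sigma": 1.0},
}

annual_loss = np.zeros(n_years)
for params in subclasses.values():
    n_events = rng.poisson(params["rate"], size=n_years)        # events per year
    for i, k in enumerate(n_events):                            # sum event losses
        if k:
            annual_loss[i] += rng.lognormal(params["mu"], params["sigma"], k).sum()

# Risk profile: probability that total annual loss exceeds a threshold x.
for x in (1e4, 1e5, 1e6):
    print(f"P(annual loss > {x:,.0f}) = {np.mean(annual_loss > x):.4f}")
```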

3.

Vendor rating can be done using the Analytic Hierarchy Process (AHP) by a single decision maker or a group of decision makers. This approach may suffer from some drawbacks, including bias in the estimation process. The methodology proposed in this paper involves estimation by a group on an individual basis, following the principle of anonymity. A control chart is constructed with an upper control limit and a lower control limit. Implementation of this control chart takes into account the dynamic nature of vendor performance and can also be used for continuous monitoring of vendor performance. The procedure can be used for a single vendor as well as for multiple vendor rating.
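The sketch below illustrates the control-chart idea with hypothetical anonymous AHP priority scores, using X-bar-style 3-sigma limits; the article's exact limit construction may differ.

```python
import numpy as np

# Hypothetical anonymous AHP priority scores for one vendor, one score per
# decision maker, collected each review period (rows = periods).
scores = np.array([
    [0.32, 0.30, 0.35, 0.28],
    [0.31, 0.29, 0.33, 0.30],
    [0.25, 0.24, 0.27, 0.23],   # a possible performance dip
])

period_means = scores.mean(axis=1)
grand_mean = period_means.mean()
n = scores.shape[1]
sigma_hat = scores.std(axis=1, ddof=1).mean()   # pooled within-period spread

# 3-sigma control limits on the period mean (X-bar chart logic).
ucl = grand_mean + 3 * sigma_hat / np.sqrt(n)
lcl = grand_mean - 3 * sigma_hat / np.sqrt(n)

for t, m in enumerate(period_means, 1):
    flag = "out of control" if (m > ucl or m < lcl) else "in control"
    print(f"period {t}: mean score {m:.3f} ({flag}); UCL={ucl:.3f}, LCL={lcl:.3f}")
```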

4.
In general, due to inherently high complexity, carbon prices simultaneously contain linear and nonlinear patterns. Although the traditional autoregressive integrated moving average (ARIMA) model has been one of the most popular linear models in time series forecasting, the ARIMA model cannot capture nonlinear patterns. The least squares support vector machine (LSSVM), a novel neural network technique, has been successfully applied in solving nonlinear regression estimation problems. Therefore, we propose a novel hybrid methodology that exploits the unique strengths of the ARIMA and LSSVM models in forecasting carbon prices. Additionally, particle swarm optimization (PSO) is used to find the optimal parameters of LSSVM in order to improve the prediction accuracy. For verification and testing, two main carbon futures prices under the EU ETS were used to examine the forecasting ability of the proposed hybrid methodology. The empirical results obtained demonstrate the appeal of the proposed hybrid methodology for carbon price forecasting.
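A rough sketch of the hybrid idea, assuming statsmodels and scikit-learn are available: an ARIMA model captures the linear component and an RBF support vector regressor (a stand-in for LSSVM) models the residuals from their own lags. The PSO tuning of C and gamma and the real EU ETS futures data are omitted; the series is synthetic.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.svm import SVR

rng = np.random.default_rng(1)
# Synthetic "carbon price" series with a linear trend plus a nonlinear component.
t = np.arange(300)
price = 20 + 0.05 * t + 2 * np.sin(t / 12) + rng.normal(0, 0.5, t.size)
train, test = price[:280], price[280:]

# Stage 1: ARIMA captures the linear structure.
arima = ARIMA(train, order=(1, 1, 1)).fit()
linear_fit = arima.fittedvalues
residuals = train - linear_fit

# Stage 2: an RBF support vector regressor (stand-in for LSSVM) models the
# nonlinear residual dynamics from lagged residuals.
lags = 3
X = np.column_stack([residuals[i:len(residuals) - lags + i] for i in range(lags)])
y = residuals[lags:]
svr = SVR(kernel="rbf", C=10.0, gamma=0.1).fit(X, y)   # C, gamma would be PSO-tuned

# Hybrid forecast = ARIMA forecast + predicted residual correction.
linear_fc = arima.forecast(steps=len(test))
last_res = residuals[-lags:]
res_fc = []
for _ in range(len(test)):
    r_hat = svr.predict(last_res.reshape(1, -1))[0]
    res_fc.append(r_hat)
    last_res = np.append(last_res[1:], r_hat)   # roll the lag window forward
hybrid_fc = linear_fc + np.array(res_fc)
print("RMSE:", np.sqrt(np.mean((hybrid_fc - test) ** 2)))
```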

5.
Many approaches to estimation of panel models are based on an average or integrated likelihood that assigns weights to different values of the individual effects. Fixed effects, random effects, and Bayesian approaches all fall into this category. We provide a characterization of the class of weights (or priors) that produce estimators that are first‐order unbiased. We show that such bias‐reducing weights will depend on the data in general unless an orthogonal reparameterization or an essentially equivalent condition is available. Two intuitively appealing weighting schemes are discussed. We argue that asymptotically valid confidence intervals can be read from the posterior distribution of the common parameters when N and T grow at the same rate. Next, we show that random effects estimators are not bias reducing in general and we discuss important exceptions. Moreover, the bias depends on the Kullback–Leibler distance between the population distribution of the effects and its best approximation in the random effects family. Finally, we show that, in general, standard random effects estimation of marginal effects is inconsistent for large T, whereas the posterior mean of the marginal effect is large‐T consistent, and we provide conditions for bias reduction. Some examples and Monte Carlo experiments illustrate the results.  相似文献   

6.
This article argues that shareholder primacy cannot be defended on the grounds that there is something special about the position of shareholders that grounds a right to preferential treatment on the part of management. The notions of property and contract, traditionally thought to ground such a right, are now widely recognized as incapable of playing that role. This leaves shareholder theorists with two options: they can either abandon the project of arguing for their view on broadly deontological grounds and try to advance consequentialist arguments instead, or they can search for other morally relevant properties that could ground shareholder rights. The most sustained argument in the latter vein is Marcoux's attempt to show that the vulnerability of shareholders mandates that managers be their fiduciaries. I show that this argument leads to the unacceptable conclusion that it would be unethical for corporations to make incomplete contracts with nonshareholding stakeholders.

7.
Omitted variables create endogeneity and thus bias the estimation of the causal effect of measured variables on outcomes. Such measured variables are ubiquitous and include perceptions, attitudes, emotions, behaviors, and choices. Even experimental studies are not immune to the endogeneity problem. I propose a solution to this challenge: Experimentally randomized instrumental variables (ERIVs), which can correct for endogeneity bias via instrumental variable estimation. Such ERIVs can be generated in laboratory or field settings. Using perceptions as an example of a measured variable, I examine 74 recent articles from two top-tier management journals. The estimation methods commonly used exposed estimates to potential endogeneity bias; yet, authors incorrectly interpreted the estimated coefficients as causal in all cases. Then I demonstrate the mechanics of the ERIV procedure using simulated data and show how researchers can apply this methodology in a real experimental context.  相似文献   
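A toy illustration of the ERIV logic on simulated data: an experimentally randomized binary manipulation serves as an instrument for a measured perception that is confounded by an omitted variable, and the Wald/2SLS ratio recovers the causal effect that naive OLS misses. Variable names and effect sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
u = rng.normal(size=n)                 # omitted variable (unobserved)
z = rng.binomial(1, 0.5, size=n)       # experimentally randomized instrument (ERIV)
perception = 0.8 * z + 1.0 * u + rng.normal(size=n)   # measured variable of interest
y = 0.5 * perception + 1.0 * u + rng.normal(size=n)   # outcome; true effect = 0.5

# Naive OLS of y on the measured perception is biased by the omitted variable u.
beta_ols = np.polyfit(perception, y, 1)[0]

# IV (Wald/2SLS with a single instrument): Cov(z, y) / Cov(z, perception).
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, perception)[0, 1]

print(f"OLS estimate : {beta_ols:.3f} (biased upward)")
print(f"ERIV estimate: {beta_iv:.3f} (close to the true 0.5)")
```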

8.
This paper presents a solution to an important econometric problem, namely the root n consistent estimation of nonlinear models with measurement errors in the explanatory variables, when one repeated observation of each mismeasured regressor is available. While a root n consistent estimator has been derived for polynomial specifications (see Hausman, Ichimura, Newey, and Powell (1991)), such an estimator for general nonlinear specifications has so far not been available. Using the additional information provided by the repeated observation, the suggested estimator separates the measurement error from the “true” value of the regressors thanks to a useful property of the Fourier transform: The Fourier transform converts the integral equations that relate the distribution of the unobserved “true” variables to the observed variables measured with error into algebraic equations. The solution to these equations yields enough information to identify arbitrary moments of the “true,” unobserved variables. The value of these moments can then be used to construct any estimator that can be written in terms of moments, including traditional linear and nonlinear least squares estimators, or general extremum estimators. The proposed estimator is shown to admit a representation in terms of an influence function, thus establishing its root n consistency and asymptotic normality. Monte Carlo evidence and an application to Engel curve estimation illustrate the usefulness of this new approach.  相似文献   
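The following simulation conveys the core identification idea in a deliberately simplified linear setting (the paper's Fourier-based estimator covers general nonlinear models): with two independent error-ridden measurements of the same regressor, cross-moments identify moments of the unobserved true variable and remove the attenuation bias. All distributions are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
x_true = rng.normal(2.0, 1.0, n)                 # unobserved "true" regressor
x1 = x_true + rng.normal(0, 0.8, n)              # first noisy measurement
x2 = x_true + rng.normal(0, 0.8, n)              # independent repeated measurement
y = 1.0 + 2.0 * x_true + rng.normal(0, 1.0, n)   # outcome; true slope = 2

# Because the two measurement errors are independent, cross-moments of (x1, x2)
# identify moments of the unobserved x*: E[x1*x2] = E[x*^2].
print("naive  E[x1^2] :", np.mean(x1**2))        # inflated by the error variance
print("cross  E[x1*x2]:", np.mean(x1 * x2))      # ~ E[x*^2]
print("true   E[x*^2] :", np.mean(x_true**2))

# The same logic corrects the attenuation bias in a linear regression.
slope_naive = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)
slope_corrected = np.cov(x2, y)[0, 1] / np.cov(x2, x1)[0, 1]
print("attenuated OLS slope:", round(slope_naive, 3))
print("corrected slope     :", round(slope_corrected, 3))
```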

9.
At present, the weighted average is a commonly used method for aggregating satisfaction survey results, but it presupposes that the decision maker's preference structure satisfies additive independence; otherwise, a nonlinear aggregation method is required. Considering the case in which decision makers' preferences do not satisfy additive independence, this paper jointly treats the confidence levels and confidence intervals produced by sampling in user satisfaction surveys together with the uncertain evaluations given by respondents in the questionnaire, and applies an evidential reasoning approach whose mass function values are interval numbers to the problem of aggregating satisfaction survey results expressed as confidence intervals obtained from sampling. Finally, an empirical analysis is carried out using the user satisfaction survey of a network information center as an example.
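As a rough illustration of evidential aggregation, the sketch below combines two point-valued basic probability assignments over satisfaction grades with Dempster's rule; the paper's method uses interval-valued mass functions derived from sampling confidence intervals, which this simplification does not capture. The grades and masses are invented.

```python
from itertools import product

GRADES = ("poor", "fair", "good", "excellent")
THETA = frozenset(GRADES)           # full frame: unassigned belief / uncertainty

def combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments."""
    combined, conflict = {}, 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def single(grade, mass):
    return frozenset({grade}), mass

# Two hypothetical evidence sources (e.g., two survey questions); mass the
# respondent could not assign is left on THETA as uncertainty.
m_q1 = dict([single("good", 0.6), single("excellent", 0.2), (THETA, 0.2)])
m_q2 = dict([single("good", 0.5), single("fair", 0.3), (THETA, 0.2)])

for focal, mass in sorted(combine(m_q1, m_q2).items(), key=lambda kv: -kv[1]):
    label = "/".join(sorted(focal)) if focal != THETA else "uncertain"
    print(f"{label:>10}: {mass:.3f}")
```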

10.
This study illustrates a newly developed methodology, as a part of the U.S. EPA ecological risk assessment (ERA) framework, to predict exposure concentrations in a marine environment due to underwater release of oil and gas. It combines the hydrodynamics of underwater blowout, weathering algorithms, and multimedia fate and transport to measure the exposure concentration. Naphthalene and methane are used as surrogate compounds for oil and gas, respectively. Uncertainties are accounted for in multimedia input parameters in the analysis. The 95th percentile of the exposure concentration (EC95%) is taken as the representative exposure concentration for the risk estimation. A bootstrapping method is utilized to characterize EC95% and associated uncertainty. The toxicity data of 19 species available in the literature are used to calculate the 5th percentile of the predicted no observed effect concentration (PNEC5%) by employing the bootstrapping method. The risk is characterized by transforming the risk quotient (RQ), which is the ratio of EC95% to PNEC5%, into a cumulative risk distribution. This article describes a probabilistic basis for the ERA, which is essential from risk management and decision‐making viewpoints. Two case studies of underwater oil and gas mixture release, and oil release with no gaseous mixture are used to show the systematic implementation of the methodology, elements of ERA, and the probabilistic method in assessing and characterizing the risk.  相似文献   
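A compact sketch of the bootstrap step on simulated data: the 95th percentile of exposure concentrations and the 5th percentile of the species toxicity values are bootstrapped and combined into a risk-quotient distribution. The distributional parameters are illustrative, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: simulated naphthalene exposure concentrations (ug/L) from a
# fate-and-transport model, and chronic toxicity values for 19 species (ug/L).
exposure = rng.lognormal(mean=1.0, sigma=0.6, size=500)
toxicity = rng.lognormal(mean=4.0, sigma=0.8, size=19)

def bootstrap_percentile(data, q, n_boot=5000):
    """Bootstrap distribution of the q-th percentile of `data`."""
    idx = rng.integers(0, len(data), size=(n_boot, len(data)))
    return np.percentile(data[idx], q, axis=1)

ec95 = bootstrap_percentile(exposure, 95)    # exposure metric EC95%
pnec5 = bootstrap_percentile(toxicity, 5)    # species-sensitivity metric PNEC5%

# Risk quotient distribution and exceedance probability.
rq = ec95 / pnec5
print(f"median RQ = {np.median(rq):.3f}")
print(f"P(RQ > 1) = {np.mean(rq > 1):.3f}")
```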

11.
Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large‐scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards.
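A minimal sketch of the coupling idea, assuming SciPy: losses of two basins are linked through a (survival) Clayton copula so that extremes co-occur, and the aggregate tail quantiles are compared with the independence assumption. The dependence parameter and the marginal loss distributions are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 200_000
theta = 2.0   # Clayton dependence parameter (illustrative)

# Sample the Clayton copula by conditional inversion, then rotate it (survival
# copula) so that the dependence sits in the upper tail, where joint floods matter.
u1 = rng.uniform(size=n)
w = rng.uniform(size=n)
u2 = (u1 ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
v1, v2 = 1 - u1, 1 - u2

# Map the dependent uniforms through hypothetical marginal loss distributions.
loss_a = stats.lognorm.ppf(v1, s=1.0, scale=1e6)   # basin A annual flood loss
loss_b = stats.lognorm.ppf(v2, s=1.2, scale=8e5)   # basin B annual flood loss
coupled = loss_a + loss_b

# Benchmark: the same marginals combined under independence.
independent = (stats.lognorm.ppf(rng.uniform(size=n), s=1.0, scale=1e6)
               + stats.lognorm.ppf(rng.uniform(size=n), s=1.2, scale=8e5))

for q in (0.99, 0.999):
    print(f"{q:.1%} aggregate loss  coupled: {np.quantile(coupled, q):.3g}"
          f"   independent: {np.quantile(independent, q):.3g}")
```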

12.
The inclusion of deep tissue lymph nodes (DTLNs) or nonvisceral lymph nodes contaminated with Salmonella in wholesale fresh ground pork (WFGP) production may pose risks to public health. To assess the relative contribution of DTLNs to human salmonellosis occurrence associated with ground pork consumption and to investigate potential critical control points in the slaughter‐to‐table continuum for the control of human salmonellosis in the United States, a quantitative microbial risk assessment (QMRA) model was established. The model predicted an average of 45 cases of salmonellosis (95% CI = [19, 71]) per 100,000 Americans annually due to WFGP consumption. Sensitivity analysis of all stochastic input variables showed that cooking temperature was the most influential parameter for reducing salmonellosis cases associated with WFGP meals, followed by storage temperature and Salmonella concentration on contaminated carcass surface before fabrication. The input variables were grouped to represent three main factors along the slaughter‐to‐table chain influencing Salmonella doses ingested via WFGP meals: DTLN‐related factors, factors at processing other than DTLNs, and consumer‐related factors. The evaluation of the impact of each group of factors by second‐order Monte Carlo simulation showed that DTLN‐related factors had the lowest impact on the risk estimate among the three groups of factors. These findings indicate that interventions to reduce Salmonella contamination in DTLNs or to remove DTLNs from WFGP products may be less critical for reducing human infections attributable to ground pork than improving consumers’ cooking habits or interventions of carcass decontamination at processing.  相似文献   
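The sketch below shows a generic QMRA chain for a single serving (contamination, cooking reduction, ingested dose, beta-Poisson dose-response) with a one-at-a-time sensitivity check on cooking. All inputs, including the dose-response parameters, are illustrative rather than the model's fitted values, and deep tissue lymph nodes are not modelled separately here.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000   # simulated servings

# Hypothetical exposure chain for a ground-pork serving (all inputs illustrative).
log_conc = rng.normal(1.0, 1.0, n)             # log10 CFU/g on contaminated meat
serving_g = rng.triangular(50, 100, 200, n)    # serving size (g)
log_cook_red = rng.triangular(1, 3, 5, n)      # log10 reduction from cooking
dose = 10 ** (log_conc - log_cook_red) * serving_g   # ingested CFU

# Beta-Poisson dose-response (illustrative alpha/beta, not fitted values).
alpha, beta = 0.13, 50.0
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)
print(f"predicted illnesses per 100,000 servings: {p_ill.mean() * 100_000:.2f}")

# One-at-a-time sensitivity: better cooking (one extra log10 reduction).
dose_cooked = 10 ** (log_conc - (log_cook_red + 1)) * serving_g
p_cooked = 1.0 - (1.0 + dose_cooked / beta) ** (-alpha)
print(f"with +1 log10 cooking reduction:          {p_cooked.mean() * 100_000:.2f}")
```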

13.
We propose a framework for out‐of‐sample predictive ability testing and forecast selection designed for use in the realistic situation in which the forecasting model is possibly misspecified, due to unmodeled dynamics, unmodeled heterogeneity, incorrect functional form, or any combination of these. Relative to the existing literature (Diebold and Mariano (1995) and West (1996)), we introduce two main innovations: (i) We derive our tests in an environment where the finite sample properties of the estimators on which the forecasts may depend are preserved asymptotically. (ii) We accommodate conditional evaluation objectives (can we predict which forecast will be more accurate at a future date?), which nest unconditional objectives (which forecast was more accurate on average?), that have been the sole focus of previous literature. As a result of (i), our tests have several advantages: they capture the effect of estimation uncertainty on relative forecast performance, they can handle forecasts based on both nested and nonnested models, they allow the forecasts to be produced by general estimation methods, and they are easy to compute. Although both unconditional and conditional approaches are informative, conditioning can help fine‐tune the forecast selection to current economic conditions. To this end, we propose a two‐step decision rule that uses current information to select the best forecast for the future date of interest. We illustrate the usefulness of our approach by comparing forecasts from leading parameter‐reduction methods for macroeconomic forecasting using a large number of predictors.  相似文献   
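A compact sketch of a one-step-ahead conditional predictive ability test in the spirit of this framework, using a constant and the lagged loss differential as instruments. HAC adjustments for longer horizons and the paper's two-step decision rule are omitted, and the forecast losses are simulated.

```python
import numpy as np
from scipy import stats

def gw_test(loss1, loss2, lags=1):
    """Conditional predictive ability test for one-step-ahead forecasts.
    H0: E[d_{t+1} | h_t] = 0, where d = loss1 - loss2 and
    h_t = (1, d_t, ..., d_{t-lags+1})."""
    d = np.asarray(loss1, float) - np.asarray(loss2, float)
    # Instruments: constant and lagged loss differentials.
    h = np.column_stack([np.ones(len(d) - lags)] +
                        [d[lags - k - 1:len(d) - k - 1] for k in range(lags)])
    z = h * d[lags:, None]            # h_t * d_{t+1}
    n, q = z.shape
    zbar = z.mean(axis=0)
    omega = z.T @ z / n               # HAC correction omitted for one-step horizon
    stat = n * zbar @ np.linalg.solve(omega, zbar)
    pval = 1.0 - stats.chi2.cdf(stat, df=q)
    return stat, pval

# Hypothetical squared-error losses of two competing forecasts.
rng = np.random.default_rng(9)
e1, e2 = rng.normal(0, 1.0, 300), rng.normal(0, 1.1, 300)
stat, pval = gw_test(e1**2, e2**2)
print(f"conditional test statistic = {stat:.2f}, p-value = {pval:.3f}")
```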

14.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposures to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS‐PM) Model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emissions sources. A critical assessment was conducted of the methodology and data used in SHEDS‐PM for estimation of indoor exposure to ETS. For the residential microenvironment, SHEDS uses a mass‐balance approach, which is comparable to best practices. The default inputs in SHEDS‐PM were reviewed and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating. Data regarding the proportion of smokers and “other smokers” and cigarette emission rate were found to be important. SHEDS‐PM does not currently account for in‐vehicle ETS exposure; however, in‐vehicle ETS‐related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more. Therefore, a mass‐balance‐based methodology for estimating in‐vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updating of input data and algorithms related to ETS exposure in the SHEDS‐PM model. Interindividual variability for ETS exposure was quantified. Geographic variability in ETS exposure was quantified based on the varying prevalence of smokers in five selected locations in the United States.  相似文献   
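A minimal single-zone mass-balance sketch of the residential ETS calculation (not the SHEDS-PM implementation or its default inputs): indoor PM2.5 follows dC/dt = E/V - (a + k)C, integrated here with an explicit Euler step during and after a hypothetical smoking period.

```python
import numpy as np

# Single-zone mass balance for ETS PM2.5 (illustrative inputs, not SHEDS defaults):
#   dC/dt = E/V - (aer + k) * C
E = 10.0      # cigarette PM2.5 emission rate while smoking (mg/h), assumed
V = 50.0      # residential zone volume (m^3), assumed
aer = 0.7     # air exchange rate (1/h), assumed
k = 0.2       # deposition/removal rate (1/h), assumed

dt = 1.0 / 60.0                       # 1-minute time step (h)
t = np.arange(0, 4 + dt, dt)          # 4-hour simulation
smoking = t < 0.5                     # one smoking period in the first 30 minutes

C = np.zeros_like(t)                  # PM2.5 concentration (mg/m^3)
for i in range(1, len(t)):
    emission = E / V if smoking[i] else 0.0
    C[i] = C[i - 1] + dt * (emission - (aer + k) * C[i - 1])   # explicit Euler

print(f"peak PM2.5     : {C.max() * 1000:.0f} ug/m^3")
print(f"4-hour average : {C.mean() * 1000:.0f} ug/m^3")
print(f"steady state   : {E / (V * (aer + k)) * 1000:.0f} ug/m^3 (if smoking continued)")
```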

15.
Biological weapons are considered by the international community to be weapons of mass destruction. When the Biological Weapons Convention (BWC) was negotiated during the 1960s and early 1970s, the negotiators considered it unnecessary to establish an international organisation to supervise the implementation of the Convention’s provisions by the State parties. It is important to highlight that the international situation has changed significantly since the entry into force of the BWC. For this reason, the moment may have arrived to reconsider the proposal of establishing such an organisation within the framework of strengthening the BWC.

16.
We consider the scheduling of ground station support times to low Earth orbit (LEO) satellites with overlapping visibilities. LEO satellites typically complete a revolution around the Earth in less than four hours at an altitude of a few hundred miles and are part of the critical infrastructure for natural resource management, crop yield estimation, meteorology, flood control, communication, and space research. Because these satellites are quite expensive to launch and operate, utilizing them in the best possible manner is of paramount importance for the agencies that own them. A ground station provides support time to a satellite to perform a variety of tasks when the satellite is visible to the station over a prespecified planning horizon; the payoff from providing such support is a function of the support time. When two or more satellites pass over the ground station, their visibility time windows may overlap. Thus, under overlapping visibilities, a relevant problem is that of scheduling ground station support time for each satellite with the objective of maximizing the total utility generated from supporting the satellites. We propose four basic scheduling models to address a variety of scenarios and investigate their computational complexities. For each model, we also identify special cases that are polynomially solvable.  相似文献   
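As a toy illustration only (not one of the paper's four scheduling models), the sketch below discretizes the horizon into minutes and greedily assigns each minute of station time to the visible satellite with the highest marginal utility under a concave payoff. The visibility windows and payoff weights are invented.

```python
import numpy as np

# Hypothetical visibility windows (minutes from the start of the horizon) and
# payoff weights for three LEO satellites seen by one ground station.
windows = {"SAT-A": (0, 30), "SAT-B": (20, 55), "SAT-C": (45, 80)}
weights = {"SAT-A": 3.0, "SAT-B": 2.0, "SAT-C": 2.5}

def payoff(w, minutes):
    """Concave utility of support time: diminishing returns per extra minute."""
    return w * np.sqrt(minutes)

support = {s: 0 for s in windows}
for minute in range(0, 80):
    visible = [s for s, (a, b) in windows.items() if a <= minute < b]
    if not visible:
        continue
    # Give this minute to the satellite with the largest marginal utility.
    best = max(visible, key=lambda s: payoff(weights[s], support[s] + 1)
                                      - payoff(weights[s], support[s]))
    support[best] += 1

total = sum(payoff(weights[s], m) for s, m in support.items())
print(support, f"total utility = {total:.1f}")
```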

17.
In software project development, accurate software cost estimation plays an important supporting role in improving software quality and ensuring successful development. Some attributes in software project history databases are difficult to assign precise values at the early stage of a project (only fuzzy numbers can be given), and existing software cost estimation models do not handle such fuzzy information well. To address this, this paper integrates generalized fuzzy numbers into the case-based reasoning (CBR) model and proposes a CBR model based on generalized fuzzy numbers. A similarity measure based on generalized fuzzy numbers replaces the Euclidean distance and other similarity measures used in traditional CBR models, and fuzzy C-means (FCM) clustering is used to fuzzify the precise values in the historical project database so that they can be matched against the fuzzy numbers of a new project. Particle swarm optimization (PSO) is further used to optimize the attribute weights, yielding a weighted CBR model based on generalized fuzzy numbers. Finally, the Desharnais data set is used in experiments to test the validity of the constructed model. The empirical results show that, compared with the commonly used Euclidean-distance CBR model, the proposed weighted CBR model based on generalized fuzzy numbers effectively improves estimation accuracy, and optimizing the attribute weights with PSO further improves the model's estimation accuracy.
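A bare-bones sketch of the weighted fuzzy-CBR retrieval step, using triangular fuzzy numbers and a simple distance-based similarity as stand-ins for the paper's generalized fuzzy numbers; FCM fuzzification, PSO weight optimization, and the Desharnais data are omitted, and all values are invented.

```python
import numpy as np

# Each attribute value is a triangular fuzzy number (low, mode, high), already
# normalized to [0, 1]; the last field of a case is its known effort.
cases = [
    {"attrs": [(0.2, 0.3, 0.4), (0.5, 0.6, 0.7)], "effort": 3200},
    {"attrs": [(0.6, 0.7, 0.8), (0.1, 0.2, 0.3)], "effort": 5100},
    {"attrs": [(0.3, 0.4, 0.5), (0.4, 0.5, 0.6)], "effort": 3900},
]
new_project = [(0.25, 0.35, 0.45), (0.45, 0.55, 0.65)]
weights = np.array([0.6, 0.4])     # attribute weights (would be PSO-optimized)

def fuzzy_similarity(f1, f2):
    """Simple similarity of two triangular fuzzy numbers on [0, 1]."""
    return 1.0 - sum(abs(x - y) for x, y in zip(f1, f2)) / 3.0

def case_similarity(case_attrs, query_attrs, w):
    sims = [fuzzy_similarity(a, b) for a, b in zip(case_attrs, query_attrs)]
    return float(np.dot(w, sims))

# Retrieve the most similar cases and predict effort as a similarity-weighted
# average of their known efforts (k = 2).
scored = sorted(((case_similarity(c["attrs"], new_project, weights), c["effort"])
                 for c in cases), reverse=True)[:2]
estimate = sum(s * e for s, e in scored) / sum(s for s, _ in scored)
print(f"estimated effort: {estimate:.0f} person-hours")
```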

18.
As high-speed networks have proliferated across the globe, their topologies have become sparser due to the increased reliability of components and cost considerations. Reliability has been a traditional goal within network design optimization. An alternative design consideration, network resilience, has not been studied or incorporated into network designs nearly as much. The authors propose a methodology for the difficult estimation of traffic efficiency (TE), a measure of network resilience, and a hybrid genetic algorithm to design networks using this measure.  相似文献   
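A generic sketch of the approach's two ingredients, not the authors' hybrid algorithm or their TE measure: a tiny genetic algorithm searches over candidate link sets, scoring each topology by an average-inverse-shortest-path efficiency proxy minus a link-cost penalty.

```python
import random
from itertools import combinations

N_NODES = 8
CANDIDATE_EDGES = list(combinations(range(N_NODES), 2))
random.seed(0)

def efficiency(edge_mask):
    """Traffic-efficiency proxy: mean of 1/d(i, j) over node pairs (BFS distances)."""
    adj = {i: set() for i in range(N_NODES)}
    for bit, (u, v) in zip(edge_mask, CANDIDATE_EDGES):
        if bit:
            adj[u].add(v); adj[v].add(u)
    total = 0.0
    for s in range(N_NODES):
        dist, frontier = {s: 0}, [s]
        while frontier:                      # breadth-first search from s
            nxt = []
            for u in frontier:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    return total / (N_NODES * (N_NODES - 1))

def fitness(mask, cost_per_link=0.02):
    return efficiency(mask) - cost_per_link * sum(mask)   # favor sparse but efficient nets

# A bare-bones genetic algorithm: tournament selection, uniform crossover, mutation.
pop = [[random.randint(0, 1) for _ in CANDIDATE_EDGES] for _ in range(40)]
for _ in range(60):
    new_pop = []
    for _ in range(len(pop)):
        p1, p2 = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
        child = [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]
        child = [1 - g if random.random() < 0.02 else g for g in child]
        new_pop.append(child)
    pop = new_pop
best = max(pop, key=fitness)
print(f"best fitness {fitness(best):.3f} with {sum(best)} links")
```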

19.
This article describes the development of a generic loss assessment methodology, which is applicable to earthquake and windstorm perils worldwide. The latest information regarding hazard estimation is first integrated with the parameters that best describe the intensity of the action of both windstorms and earthquakes on building structures, for events with defined average return periods or recurrence intervals. The subsequent evaluation of building vulnerability (damageability) under the action of both earthquake and windstorm loadings utilizes information on damage and loss from past events, along with an assessment of the key building properties (including age and quality of design and construction), to assess information about the ability of buildings to withstand such loadings and hence to assign a building type to the particular risk or portfolio of risks. This predicted damage information is then translated into risk-specific mathematical vulnerability functions, which enable numerical evaluation of the probability of building damage arising at various defined levels. By assigning cost factors to the defined damage levels, the associated computation of total loss at a given level of hazard may be achieved. This developed methodology is universal in the sense that it may be applied successfully to buildings situated in a variety of earthquake and windstorm environments, ranging from very low to extreme levels of hazard. As a loss prediction tool, it enables accurate estimation of losses from potential scenario events linked to defined return periods and, hence, can greatly assist risk assessment and planning.  相似文献   
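A small sketch of the vulnerability-to-loss step: illustrative lognormal fragility curves give the probability of each damage state at a given wind intensity, and cost factors convert those probabilities into an expected loss ratio. The fragility parameters and building value are invented, not taken from the article.

```python
import numpy as np
from scipy import stats

# Illustrative lognormal fragility parameters for one building class under wind:
# (state name, median gust capacity in m/s, logarithmic std, repair-cost factor).
damage_states = [
    ("slight",   30.0, 0.35, 0.05),
    ("moderate", 42.0, 0.35, 0.25),
    ("severe",   55.0, 0.35, 0.60),
    ("collapse", 70.0, 0.35, 1.00),
]

def p_at_least(intensity, median, beta):
    """Lognormal fragility curve: P(damage >= state | hazard intensity)."""
    return stats.norm.cdf(np.log(intensity / median) / beta)

def expected_loss_ratio(intensity):
    """Expected repair cost as a fraction of building value at a given intensity."""
    exceed = [p_at_least(intensity, m, b) for _, m, b, _ in damage_states] + [0.0]
    return sum((exceed[i] - exceed[i + 1]) * damage_states[i][3]
               for i in range(len(damage_states)))

building_value = 250_000   # replacement value in USD, assumed
for gust in (25, 40, 60):  # e.g., intensities tied to different return periods
    lr = expected_loss_ratio(gust)
    print(f"gust {gust} m/s: loss ratio {lr:.2f} -> expected loss ${lr * building_value:,.0f}")
```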

20.
A combinatorial optimization problem, called the Bandpass Problem, is introduced. Given a rectangular matrix A of binary elements {0,1} and a positive integer B called the Bandpass Number, a set of B consecutive non-zero elements in any column is called a Bandpass. No two bandpasses in the same column can have common rows. The Bandpass Problem consists of finding an optimal permutation of the rows of the matrix that produces the maximum total number of bandpasses having the same given bandpass number across all columns. This combinatorial problem arises in the optimal packing of information flows on different wavelengths into groups to obtain the highest available cost reduction in designing and operating optical communication networks using wavelength division multiplexing technology. Integer programming models of two versions of the Bandpass Problem are developed. For a matrix A with three or more columns the Bandpass Problem is proved to be NP-hard. For matrices with two columns or one column, a polynomial algorithm solving the problem to optimality is presented. For the general case, fast heuristic polynomial algorithms are presented, which provide near-optimal solutions acceptable for applications. The high quality of the generated heuristic solutions has been confirmed in extensive computational experiments. As an NP-hard combinatorial optimization problem with important applications, the Bandpass Problem offers a challenge for researchers to develop efficient computational solution methods. To encourage further research, a Library of Bandpass Problems has been developed. The Library is open to the public and consists of 90 problems of different sizes (numbers of rows and columns, density of non-zero elements of matrix A, and bandpass number B), half of them with known optimal solutions and the other half without.
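For intuition, the sketch below counts non-overlapping bandpasses per column and brute-forces the row permutation on a tiny invented matrix; exhaustive search is obviously only viable for very small instances, consistent with the NP-hardness result for three or more columns.

```python
from itertools import permutations

B = 2   # bandpass number
A = [               # small binary matrix: rows = flows, columns = wavelengths
    [1, 0, 1],
    [1, 1, 0],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
]

def count_bandpasses(rows):
    """Non-overlapping groups of B consecutive ones, counted per column."""
    total = 0
    for col in range(len(rows[0])):
        run = 0
        for row in rows:
            if row[col]:
                run += 1
            else:
                total += run // B
                run = 0
        total += run // B
    return total

# Exhaustive search over row orderings -- only viable for tiny instances.
best = max(permutations(A), key=count_bandpasses)
print("max bandpasses:", count_bandpasses(best))
for row in best:
    print(row)
```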
