Similar Articles
Found 20 similar articles.
1.
This article investigates the use of dynamic laboratory simulations as a tool for studying decisions to prepare for hurricane threats. A prototype web‐based simulation named Stormview is described that allows individuals to experience the approach of a hurricane in a computer‐based environment. In Stormview, participants can gather storm information through various media, hear the opinions of neighbors, and indicate intentions to take protective action. We illustrate how the ability to exert experimental control over the information viewed by participants can provide insights into decision making that would be difficult to gain from field studies, such as how preparedness decisions are affected by the nature of news coverage of prior storms, how a storm's movement is depicted in graphics, and the content of word‐of‐mouth communications. Data from an initial application involving a sample of Florida residents reveal a number of unexpected findings about hurricane risk response. For example, participants who viewed forecast graphics containing track lines depicting the storm's most likely path had higher levels of preparation than those who saw graphics showing only uncertainty cones—even among participants living far from the predicted center path. Similarly, the participants who were most likely to express worry about an approaching storm and fastest to undertake preparatory action were, ironically, those who had never experienced one. Finally, external validity is evidenced by a close rank‐order correspondence between the patterns of information use revealed in the lab and those found in previous cross‐sectional field studies.

2.
The National Weather Service has adopted warning polygons that more specifically indicate the risk area than its previous county‐wide warnings. However, these polygons are not defined in terms of numerical strike probabilities (ps). To better understand people's interpretations of warning polygons, 167 participants were shown 23 hypothetical scenarios in one of three information conditions—polygon‐only (Condition A), polygon + tornadic storm cell (Condition B), and polygon + tornadic storm cell + flanking nontornadic storm cells (Condition C). Participants judged each polygon's ps and reported the likelihood of taking nine different response actions. The polygon‐only condition replicated the results of previous studies; ps was highest at the polygon's centroid and declined in all directions from there. The two conditions displaying storm cells differed from the polygon‐only condition only in having ps just as high at the polygon's edge nearest the storm cell as at its centroid. Overall, ps values were positively correlated with expectations of continuing normal activities, seeking information from social sources, seeking shelter, and evacuating by car. These results indicate that participants make more appropriate ps judgments when polygons are presented in their natural context of radar displays than when they are presented in isolation. However, the fact that ps judgments had moderately positive correlations with both sheltering (a generally appropriate response) and evacuation (a generally inappropriate response) suggests that experiment participants experience the same ambivalence about these two protective actions as people threatened by actual tornadoes.

3.
Research on the etiology of chronic obstructive pulmonary disease (COPD), an irreversible degenerative lung disease affecting 15% to 20% of smokers, has blossomed over the past half‐century. Profound new insights have emerged from a combination of in vitro and –omics studies on affected lung cell populations (including cytotoxic CD8+ T lymphocytes, regulatory CD4+ helper T cells, dendritic cells, alveolar macrophages and neutrophils, alveolar and bronchiolar epithelial cells, goblet cells, and fibroblasts) and extracellular matrix components (especially elastin and collagen fibers); in vivo studies on wild‐type and genetically engineered mice and other rodents; clinical investigation of cell‐ and molecular‐level changes in asymptomatic smokers and COPD patients; genetic studies of susceptible and rapidly‐progressing phenotypes (both human and animal); biomarker studies of enzyme and protein degradation products in induced sputum, bronchiolar lavage, urine, and blood; and epidemiological and clinical investigations of the time course of disease progression. To this rich mix of data, we add a relatively simple in silico computational model that incorporates recent insights into COPD disease causation and progression. Our model explains irreversible degeneration of lung tissue as resulting from a cascade of positive feedback loops: a macrophage inflammation loop, a neutrophil inflammation loop, and an alveolar epithelial cell apoptosis loop. Unrepaired damage results in clinical symptoms. The resulting model illustrates how to simplify and make more understandable the main aspects of the very complex dynamics of COPD initiation and progression, as well as how to predict the effects on risk of interventions that affect specific biological responses.
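The cascade of positive feedback loops described above (a macrophage loop, a neutrophil loop, and an epithelial apoptosis loop) lends itself to a compact dynamical illustration. The sketch below is a hypothetical toy model, not the authors' in silico model: every coupling, saturation term, and rate constant is an assumption. It shows the qualitative point that mutually reinforcing loops with slow repair make damage effectively self‐sustaining after exposure stops.

```python
# Toy three-loop feedback cascade; all constants are illustrative assumptions.
def step(state, smoke, dt=0.01):
    M, N, D = state                          # macrophages, neutrophils, damage
    f = lambda x: x / (1.0 + x)              # saturating activation
    dM = smoke + 1.5 * f(D) - 0.5 * M        # damage products recruit macrophages
    dN = 1.2 * f(M) + 0.8 * f(D) - 0.6 * N   # macrophage signals recruit neutrophils
    dD = 0.9 * f(M) + 1.1 * f(N) - 0.05 * D  # proteases damage tissue; repair is slow
    return (M + dM * dt, N + dN * dt, D + dD * dt)

def run(years, smoke, state=(0.0, 0.0, 0.0)):
    for _ in range(int(years / 0.01)):
        state = step(state, smoke)
    return state

after_20yr = run(20, smoke=1.0)                    # two decades of exposure
after_quit = run(20, smoke=0.0, state=after_20yr)  # then 20 smoke-free years
print("damage at year 20: %.1f; at year 40 despite quitting: %.1f"
      % (after_20yr[2], after_quit[2]))
```

Once the loops are active, setting the exposure to zero no longer returns the system to its healthy fixed point, which is the kind of irreversibility the abstract attributes to COPD.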

4.
Criminal justice system policy makers have recently begun to utilize sophisticated techniques of analysis to explore questions of resource allocation and program evaluation [1, 2, 6, 8, 10]. All too often, however, the benefits of such techniques have been denied to the large proportion of criminal justice units that lack access to extensive computer facilities and large technical staffs. This paper illustrates how one important property of crime data time series—seasonal variation—may be adjusted for and utilized with some fairly simple procedures. The police operations and planning implications of the presence of seasonality in crime data are then examined. The authors develop their discussion with reference to seasonal variation in monthly data on major crimes for a large United States urban area: Miami, Florida. The time period of the study is 1949–1972.
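The "fairly simple procedures" for seasonal adjustment can be sketched with ratio-to-mean monthly seasonal indices: compute each month's average share of the overall mean, then divide it out. A hedged sketch follows; the series below is synthetic, not the Miami 1949–1972 data.

```python
# Simple seasonal-index adjustment for a monthly crime series (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
years = 24                                                # e.g., 1949-1972
trend = np.linspace(100, 180, years * 12)                 # slowly rising crime level
seasonal = np.tile([0.90, 0.88, 0.95, 1.00, 1.05, 1.10,   # winter low ...
                    1.15, 1.12, 1.05, 1.00, 0.95, 0.85],  # ... summer high
                   years)
crimes = trend * seasonal + rng.normal(0, 5, years * 12)

monthly = crimes.reshape(years, 12)
index = monthly.mean(axis=0) / monthly.mean()   # seasonal index for each month
adjusted = crimes / np.tile(index, years)       # seasonally adjusted series

for m, s in enumerate(index, 1):
    print(f"month {m:2d}: seasonal index {s:.2f}")
```

The per-month index tells a planner how far each month sits above or below the annual average (useful for patrol staffing), while the adjusted series exposes the underlying trend for program evaluation.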

5.

E-commerce is growing rapidly, but a majority of Internet users are still hesitant to become involved in it. One big hurdle is lack of trust. This paper deals with trust in e-commerce and measures to increase it. It discusses one particular so-called web assurance service for business-to-consumer commerce in detail. It also analyses whether such services remain viable in an era of more mature e-commerce, and how they should be adjusted to the business-to-business environment.

6.
Discovering Group Users' Preferred Navigation Paths Based on an Ant Colony Algorithm
As e-commerce develops, accurately understanding how users browse a website has become a pressing problem. Web usage mining is an important approach to this problem, and discovering users' preferred navigation patterns is a key research area within it, as well as a fundamental way to optimize a site's structural design. In this paper, we treat Web users as artificial ants and apply an ant colony algorithm to discover their navigation patterns. We first build a model of the Web site; we then build a user navigation model from the ant colony algorithm and Web log data; finally, we design an algorithm that treats all visiting users as a collective to mine their preferred navigation paths. Experimental results show that the method accurately reflects users' browsing interests.
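A simplified sketch of this idea: replay each logged session as an "ant" depositing pheromone on the edges of a site graph (with evaporation), then follow the strongest trail to recover the group's preferred navigation path. The update rules and constants below are illustrative assumptions, not the authors' exact algorithm.

```python
# Pheromone-reinforcement sketch of navigation-path mining from Web logs.
from collections import defaultdict

sessions = [                      # page-view paths parsed from a Web log
    ["home", "catalog", "item", "cart"],
    ["home", "search", "item", "cart"],
    ["home", "catalog", "item", "reviews"],
    ["home", "catalog", "item", "cart", "checkout"],
]

RHO, Q = 0.1, 1.0                 # evaporation rate, pheromone budget per ant
pheromone = defaultdict(float)

for path in sessions:             # each session acts as one ant
    for edge in pheromone:        # evaporate existing trails
        pheromone[edge] *= (1.0 - RHO)
    for a, b in zip(path, path[1:]):
        pheromone[(a, b)] += Q / len(path)   # shorter paths reinforce more

# Greedily follow the strongest outgoing edge to read off the preferred path.
node, preferred, seen = "home", ["home"], {"home"}
while True:
    out = [(t, p) for (s, t), p in pheromone.items() if s == node and t not in seen]
    if not out:
        break
    node = max(out, key=lambda e: e[1])[0]
    preferred.append(node)
    seen.add(node)
print("preferred navigation path:", " -> ".join(preferred))
```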

7.
Protection motivation theory states that individuals conduct threat and coping appraisals when deciding how to respond to perceived risks. However, that model does not adequately explain today's risk culture, where engaging in recommended behaviors may create a separate set of real or perceived secondary risks. We argue for and then demonstrate the need for a new model accounting for a secondary threat appraisal, which we call secondary risk theory. In an online experiment, 1,246 participants indicated their intention to take a vaccine after reading about the likelihood and severity of side effects. We manipulated likelihood and severity in a 2 × 2 between‐subjects design and examined how well secondary risk theory predicts vaccination intention compared to protection motivation theory. Protection motivation theory performed better when the likelihood and severity of side effects were both low (R² = 0.30) than when both were high (R² = 0.15). In contrast, secondary risk theory performed similarly whether the likelihood and severity of side effects were both low (R² = 0.42) or both high (R² = 0.45). The latter figure is a large improvement over protection motivation theory, suggesting the usefulness of secondary risk theory when individuals perceive a high secondary threat.

8.
Contemporary studies conducted by the U.S. Army Corps of Engineers estimate probability distributions of flooding on the interior of ring levee systems by estimating surge exceedances at points along levee system boundaries, calculating overtopping volumes generated by the resulting surge surface, then passing the resulting volumes of water through a drainage model to calculate interior flood depths. This approach may not accurately represent the exceedance probability of flood depths within the system interior; a storm producing 100‐year surge at one point is unlikely to simultaneously produce 100‐year surge levels everywhere around the system exterior. A conceptually preferred approach estimates surge and waves associated with a large set of storms. Each storm is run through the interior model separately, and the resulting flood depths are weighted by a parameterized likelihood of each synthetic storm. This results in an empirical distribution of flood depths accounting for geospatial variation in any individual storm's characteristics. This method can also better account for the probability of levee breaches or other system failures. The two methods can produce different estimates of flood depth exceedances and damage when applied to storm surge flooding in coastal Louisiana. Even differences in flood depth exceedances of less than 0.2 m can still produce large differences in projected damage. This article identifies and discusses differences in estimated flood depths and damage produced by each method within multiple Louisiana protection systems. The novel coupled dynamics approach represents a step toward enabling risk‐based design standards.
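The conceptually preferred approach reduces to a weighted empirical distribution: run each synthetic storm through the interior drainage model, then weight each storm's interior depth by its annualized likelihood. A minimal sketch (storm depths and weights invented for illustration):

```python
# Weighted empirical exceedance curve for interior flood depth.
import numpy as np

depths  = np.array([0.1, 0.4, 0.7, 1.3, 2.2, 3.0])           # modeled depth (m) per storm
weights = np.array([0.20, 0.10, 0.05, 0.02, 0.008, 0.002])   # annual rate of each storm

def exceedance(depth_threshold):
    """Annual rate of interior flooding deeper than depth_threshold."""
    return weights[depths > depth_threshold].sum()

for d in (0.5, 1.0, 2.0):
    rate = exceedance(d)
    print(f"depth > {d:.1f} m: annual rate {rate:.3f} (~{1/rate:.0f}-yr event)" if rate
          else f"depth > {d:.1f} m: not exceeded in storm set")
```

Because each storm is evaluated whole, geospatial storm-to-storm variation carries through to the interior depth distribution, rather than assuming one storm drives every boundary point to its marginal exceedance level at once.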

9.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. The uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.

10.
A User Information Management System for a Campus Network
AMMSNet is a large campus network connected to the Internet; its users access the Internet through MS Proxy 2.0 acting as a proxy server, and MS Exchange 5.5 provides mail services. Under this model, maintaining and managing users' basic information, monitoring user access and retaining access records long-term, compiling user traffic statistics, and automating user management have long been primary problems for Internet service providers and enterprise network administrators. Taking a large campus network as an example, this paper describes an Internet user information management system developed with C++Builder and a SQL Server database to automate the management of users' Internet access.

11.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm struck the resort areas of Charlotte Harbor and moved across the densely populated central part of the state, with early poststorm estimates in the 28 to 31 billion dollar range, and final estimates converging at 15 billion dollars as the actual intensity at landfall became apparent. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has a great appreciation for the role of computer models in projecting losses from hurricanes. The FCHLPM contracts with a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a sophisticated computer model based on the Holland wind field. The sensitivity analyses presented in this article utilize standardized regression coefficients to quantify the contribution of the computer input variables to the magnitude of the wind speed.
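The standardized-regression-coefficient (SRC) analysis mentioned in the closing sentence is straightforward to sketch: sample the inputs, run the wind model, standardize inputs and output to zero mean and unit variance, and regress. Below, a toy response function stands in for the proprietary Holland-wind-field model; the variable names, distributions, and relative importances are illustrative assumptions.

```python
# SRC sensitivity analysis on a toy wind-speed response (not the Holland B model).
import numpy as np

rng = np.random.default_rng(1)
n = 5000
central_pressure = rng.normal(950, 15, n)     # hPa
radius_max_winds = rng.normal(40, 10, n)      # km
holland_b        = rng.normal(1.3, 0.2, n)    # shape parameter

# Toy response with known relative importances, plus noise.
wind = (0.8 * (1010 - central_pressure) + 0.1 * radius_max_winds
        + 15 * holland_b + rng.normal(0, 2, n))

X = np.column_stack([central_pressure, radius_max_winds, holland_b])
Xs = (X - X.mean(0)) / X.std(0)               # standardize inputs
ys = (wind - wind.mean()) / wind.std()        # standardize output
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["central pressure", "radius of max winds", "Holland B"], src):
    print(f"SRC({name}) = {c:+.2f}")          # squared SRCs ~ share of output variance
```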

12.
Leptospirosis is a preeminent zoonotic disease concentrated in tropical areas and prevalent in both industrialized and rural settings. Dose‐response models were generated from 22 data sets reported in 10 different studies. All of the selected studies used rodent subjects, primarily hamsters, with mortality as the predominant endpoint and the challenge strain administered intraperitoneally. Dose‐response models based on a single evaluation postinfection displayed median lethal dose (LD50) estimates that ranged between 1 and 10⁷ leptospirae, depending upon the strain's virulence and the period elapsed since the initial exposure inoculation. Twelve of the 22 data sets measured the number of affected subjects daily over an extended period, so dose‐response models with time‐dependent parameters were estimated. Pooling between data sets produced seven common dose‐response models and one time‐dependent model. These pooled common models spanned data sets with different test subject hosts, and with disparate leptospiral strains tested on identical hosts. Comparative modeling with parallel tests was done to assess the effect of changing a single variable (either strain or test host) and to quantify the difference by calculating a dose multiplication factor. The statistical pooling implies that the mechanistic processes of leptospirosis can be represented by the same dose‐response model across different experimental infection tests even though they may involve different host species, routes, and leptospiral strains, although the cause of this pathophysiological phenomenon has not yet been identified.
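A minimal example of the kind of dose-response fitting described: a one-parameter exponential model, P(death | dose d) = 1 − exp(−k·d), fit by maximum likelihood, with the LD50 read off as ln 2 / k. The dose groups and mortality counts below are invented; the pooled studies used their own data sets and candidate models.

```python
# Exponential dose-response fit and LD50 on invented rodent challenge data.
import numpy as np
from scipy.optimize import minimize_scalar

doses  = np.array([1e1, 1e2, 1e3, 1e4, 1e5])   # leptospirae administered
n_subj = np.array([10, 10, 10, 10, 10])
deaths = np.array([0, 1, 3, 7, 10])

def neg_log_lik(log_k):
    p = 1.0 - np.exp(-np.exp(log_k) * doses)    # binomial response probability
    p = np.clip(p, 1e-12, 1 - 1e-12)            # guard log(0)
    return -(deaths * np.log(p) + (n_subj - deaths) * np.log(1 - p)).sum()

res = minimize_scalar(neg_log_lik, bounds=(-20, 0), method="bounded")
k = np.exp(res.x)
print(f"k = {k:.2e}, LD50 = {np.log(2) / k:.0f} leptospirae")
```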

13.
In the summer of 2017, several European Union Member States were involved in a food alert caused by the presence of fipronil pesticide residues in chicken eggs. The food alert became major news and received wide coverage both in the mass media and on the Internet. This article describes a study that analyzed how Italian online information sources represented the fipronil alert, using web monitoring techniques and both manual and automatic content analysis methods. The results indicate that the alert was amplified because general news media could represent it within the frame of a political scandal and because different social actors exploited the case. However, online information sources correctly communicated that the risks for consumers were low, reporting mainly what was officially communicated by the Italian health authorities. The study provides empirical evidence on how online information sources represent food risks and food alerts and offers useful indications for health authorities in charge of the public communication of food risks.

14.
Yoke Heng Wong 《Risk analysis》2011,31(12):1872-1882
Road tunnels are vital infrastructures providing underground vehicular passageways for commuters and motorists. Various quantitative risk assessment (QRA) models have recently been developed and employed to evaluate the safety levels of road tunnels in terms of societal risk (as measured by the F/N curve). For a particular road tunnel, traffic volume and proportion of heavy goods vehicles (HGVs) are two adjustable parameters that may significantly affect the societal risk, and are thus very useful in implementing risk reduction solutions. To evaluate the impact the two contributing factors have on the risk, this article first presents an approach that employs a QRA model to generate societal risk for a series of possible combinations of the two factors. Some combinations may result in F/N curves that do not fulfill a predetermined safety target. This article thus proposes an “excess risk index” in order to quantify the road tunnel risk magnitudes that do not pass the safety target. The two‐factor impact analysis can be illustrated by a contour chart based on the excess risk. Finally, the methodology has been applied to Singapore's KPE road tunnel and the results show that in terms of meeting the test safety target for societal risk, the traffic capacity of the tunnel should be no more than 1,200 vehs/h/lane, with a maximum proportion of 18% HGVs.
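One plausible reading of an "excess risk index" is the area by which the computed F/N curve sits above the safety-target line in log-log space. The sketch below implements that reading with an invented curve and an assumed F·N ≤ 10⁻³ criterion line; both the data and the aggregation rule are illustrative assumptions, not the article's actual definitions.

```python
# Excess-risk-index sketch: integrate the F/N curve's excursion above a target.
import numpy as np

N = np.array([1, 2, 5, 10, 20, 50, 100])                        # fatalities
F = np.array([1e-3, 6e-4, 3e-4, 1.5e-4, 8e-5, 3e-5, 1.2e-5])    # freq of >= N per year
target = 1e-3 / N                                                # assumed F*N <= 1e-3 line

excess = np.log10(F / target).clip(min=0.0)          # decades above the target
excess_risk_index = np.trapz(excess, np.log10(N))    # area above target, log-log space
print(f"excess risk index = {excess_risk_index:.3f}"
      + (" (safety target met)" if excess_risk_index == 0 else " (target exceeded)"))
```

Evaluating this index over a grid of (traffic volume, %HGV) combinations yields exactly the kind of contour chart the abstract describes.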

15.
Despite improvements in forecasting extreme weather events, noncompliance with weather warnings among the public remains a problem. Although there are likely many reasons for noncompliance with weather warnings, one important factor might be people's past experiences with false alarms. The research presented here explores the role of false alarms in weather‐related decision making. Over a series of trials, participants used an overnight low temperature forecast and advice from a decision aid to decide whether to apply salt treatment to a town's roads to prevent icy conditions or take the risk of withholding treatment, which resulted in a large penalty when freezing temperatures occurred. The decision aid gave treatment recommendations, some of which were false alarms, i.e., treatment was recommended but observed temperatures were above freezing. The rate at which the advice resulted in false alarms was manipulated between groups. Results suggest that very high and very low false alarm rates led to inferior decision making, but that lowering the false alarm rate slightly did not significantly affect compliance or decision quality. However, adding a probabilistic uncertainty estimate in the forecasts improved both compliance and decision quality. These findings carry implications about how weather warnings should be communicated to the public.

16.
Self‐driving vehicles (SDVs) promise to considerably reduce traffic crashes. One pressing concern facing the public, automakers, and governments is “How safe is safe enough for SDVs?” To answer this question, a new expressed‐preference approach was proposed for the first time to determine the socially acceptable risk of SDVs. In our between‐subject survey (N = 499), we determined the respondents' risk‐acceptance rate of scenarios with varying traffic‐risk frequencies to examine the logarithmic relationships between the traffic‐risk frequency and risk‐acceptance rate. Logarithmic regression models of SDVs were compared to those of human‐driven vehicles (HDVs); the results showed that SDVs were required to be safer than HDVs. Given the same traffic‐risk‐acceptance rates for SDVs and HDVs, the associated acceptable risk frequencies of SDVs and HDVs were predicted and compared. Two risk‐acceptance criteria emerged: the tolerable risk criterion, which indicates that SDVs should be four to five times as safe as HDVs, and the broadly acceptable risk criterion, which suggests that half of the respondents hoped that the traffic risk of SDVs would be two orders of magnitude lower than the current estimated traffic risk. The approach and these results could provide insights for government regulatory authorities for establishing clear safety requirements for SDVs.
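The logarithmic-regression comparison can be sketched directly: fit acceptance rate against log risk frequency for each vehicle type, then invert both fits at the same acceptance rate. The data points below are invented for illustration; only the fit-and-invert mechanics follow the abstract.

```python
# Fit acceptance = a + b * log10(risk frequency) for SDVs and HDVs, then invert.
import numpy as np

freq = np.array([1e-6, 1e-5, 1e-4, 1e-3])        # fatality-frequency scenarios
accept_hdv = np.array([0.80, 0.60, 0.40, 0.20])
accept_sdv = np.array([0.65, 0.42, 0.22, 0.05])  # same risk is less acceptable in SDVs

def fit_log(freq, accept):
    b, a = np.polyfit(np.log10(freq), accept, 1)  # slope, intercept
    return a, b

def freq_at(acceptance, a, b):                    # invert the fitted line
    return 10 ** ((acceptance - a) / b)

(a_h, b_h), (a_s, b_s) = fit_log(freq, accept_hdv), fit_log(freq, accept_sdv)
f_h, f_s = freq_at(0.5, a_h, b_h), freq_at(0.5, a_s, b_s)
print(f"risk accepted by 50%: HDV {f_h:.1e}, SDV {f_s:.1e} "
      f"-> SDVs must be ~{f_h / f_s:.1f}x safer")
```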

17.
《Risk analysis》2018,38(4):724-754
A bounding risk assessment is presented that evaluates possible human health risk from a hypothetical scenario involving a 10,000‐gallon release of flowback water from horizontal fracturing of Marcellus Shale. The water is assumed to be spilled on the ground and to infiltrate into groundwater that is a source of drinking water; an adult and a child located downgradient drink the groundwater. Key uncertainties in estimating risk are given explicit quantitative treatment using Monte Carlo analysis. Chemicals that contribute significantly to estimated health risks are identified, as are key uncertainties and variables to which risk estimates are sensitive. The results show that hypothetical exposure via drinking water impacted by chemicals in Marcellus Shale flowback water, assumed to be spilled onto the ground surface, results in predicted bounds between 10⁻¹⁰ and 10⁻⁶ (for both adult and child receptors) for excess lifetime cancer risk. Cumulative hazard indices (HIcumulative) resulting from these hypothetical exposures have predicted bounds (5th to 95th percentile) between 0.02 and 35 for assumed adult receptors and between 0.1 and 146 for assumed child receptors. Predicted health risks are dominated by noncancer endpoints related to ingestion of barium and lithium in impacted groundwater; hazard indices above unity are largely related to exposure to lithium. Salinity taste thresholds are likely to be exceeded before drinking water exposures result in adverse health effects. The findings provide focus for policy discussions concerning flowback water risk management. They also indicate ways to reduce uncertainty in estimating health risks from drinking water impacted by a flowback water spill.
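The Monte Carlo treatment of uncertainty can be illustrated with a minimal hazard-index simulation. The distributions, the lithium-like reference dose, and all parameter values below are assumptions for illustration, not the study's inputs.

```python
# Monte Carlo bounds on a drinking-water hazard index (illustrative inputs only).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
conc_mg_L   = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)  # contaminant in well water
intake_L_d  = rng.triangular(1.0, 2.0, 3.0, size=n)               # drinking water, L/day
body_kg     = rng.normal(70, 10, size=n).clip(40, 120)            # adult body weight
rfd_mg_kg_d = 2e-3                                                # assumed reference dose

hazard_index = (conc_mg_L * intake_L_d / body_kg) / rfd_mg_kg_d
p5, p50, p95 = np.percentile(hazard_index, [5, 50, 95])
print(f"HI 5th/50th/95th percentiles: {p5:.2f} / {p50:.2f} / {p95:.2f}")
print(f"fraction of realizations with HI > 1: {(hazard_index > 1).mean():.1%}")
```

Reporting the 5th and 95th percentiles of the simulated distribution is exactly how the abstract's "predicted bounds" on HI are expressed.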

18.
Numerical uncertainty ranges are often used to convey the precision of a forecast. In three studies, we examined how users perceive the distribution underlying numerical ranges and tested specific hypotheses about the display characteristics that affect these perceptions. We discuss five primary conclusions from these studies: (1) substantial variation exists in how people perceive the distribution underlying numerical ranges; (2) distributional perceptions appear similar whether the uncertain variable is a probability or an outcome; (3) the variation in distributional perceptions is due in part to individual differences in numeracy, with more numerate individuals more likely to perceive the distribution as roughly normal; (4) the variation is also due in part to the presence versus absence of common cues used to convey the correct interpretation (e.g., including a best estimate increases perceptions that the distribution is roughly normal); and (5) simple graphical representations can decrease the variance in distributional perceptions. These results point toward significant opportunities to improve uncertainty communication in climate change and other domains.

19.
A new technique for deriving exogenous components of mortality risks from national vital statistics has been developed. Each observed death rate Dij (where i corresponds to calendar time (a year or interval of years) and j denotes the corresponding age group) was represented as Dij = Aj + Bi·Cj, and the unknown quantities Aj, Bi, and Cj were estimated by a special least-squares procedure. The coefficients of variation do not exceed 10%. It is shown that the term Aj can be interpreted as the endogenous component of the death rate and the term Bi·Cj as its exogenous component. The aggregate of endogenous components Aj can be described by a regression function corresponding to the Gompertz–Makeham law, A(τ) = γ + β·e^(ατ), where γ, β, and α are constants, τ is age, A(τj) ≈ Aj, and τj is the value of age τ in the jth age group. The coefficients of variation for this representation do not exceed 4%. An analysis of exogenous risk levels in the Moscow and Russian populations during 1980–1995 shows that after 1992 all components of exogenous risk in the Moscow population increased through 1994. The greatest contribution to the total level of exogenous risk came from lethal diseases, whose death rate was 387 deaths per 100,000 persons in 1994, i.e., 61.9% of all deaths. The dynamics of exogenous mortality risk during 1990–1994 in the Moscow population and in the Russian population excluding Moscow were identical: the risk increased, and its value in the Russian population was higher than in the Moscow population.
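The decomposition Dij = Aj + Bi·Cj can be estimated with an alternating-least-squares sketch, followed by a Gompertz–Makeham fit to the recovered endogenous component. The synthetic rates and the ALS loop below are illustrative; the authors' "special procedure" is not published in this abstract, and A and B·C are separable only up to scale and offset conventions, which a real procedure must pin down explicitly.

```python
# ALS estimate of D[i,j] = A[j] + B[i]*C[j], then Gompertz-Makeham fit to A.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)
ages = np.arange(5.0, 85.0, 10.0)                 # age-group midpoints tau_j
A_true = 5e-4 + 2e-5 * np.exp(0.09 * ages)        # Gompertz-Makeham endogenous part
B_true = 1.0 + 0.3 * rng.standard_normal(16)      # year-to-year exogenous level
C_true = 1e-3 * np.exp(-ages / 40.0)              # exogenous risk falls with age
D = A_true + np.outer(B_true, C_true)
D *= 1.0 + 0.02 * rng.standard_normal(D.shape)    # observation noise

B = D.mean(axis=1) / D.mean()                     # initial guess for year effects
for _ in range(200):
    X = np.column_stack([np.ones_like(B), B])     # per-age intercept and slope
    (A, C), *_ = np.linalg.lstsq(X, D, rcond=None)
    B = (D - A) @ C / (C @ C)                     # per-year loading given A, C
    B /= B.mean()                                 # fix the B*C scale ambiguity

gm = lambda t, g, b, a: g + b * np.exp(a * t)     # Gompertz-Makeham law
(g, b, a), _ = curve_fit(gm, ages, A, p0=(1e-4, 1e-5, 0.08), maxfev=20000)
print(f"Gompertz-Makeham fit to endogenous part: gamma={g:.1e}, beta={b:.1e}, alpha={a:.3f}")
```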

20.
《Risk analysis》2018,38(6):1128-1142
Lumber Liquidators (LL) Chinese‐manufactured laminate flooring (CLF) has been installed in >400,000 U.S. homes over the last decade. To characterize potential associated formaldehyde exposures and cancer risks, chamber emissions data were collected from 399 new LL CLF boards, and from LL CLF installed in 899 homes in which measured aggregate indoor formaldehyde concentrations exceeded 100 μg/m³, out of a total of 17,867 homes screened. Data from both sources were combined to characterize LL CLF‐associated formaldehyde emissions from new and installed boards. New flooring had an average (±SD) emission rate of 61.3 ± 52.1 μg/m²‐hour; boards installed for more than one year had ∼threefold lower emission rates. Estimated emission rates for the 899 homes and corresponding questionnaire data were used as inputs to a single‐compartment, steady‐state mass‐balance model to estimate residence‐specific TWA formaldehyde concentrations and potential resident exposures. Only ∼0.7% of those homes had estimated acute formaldehyde concentrations >100 μg/m³ immediately after LL CLF installation. The TWA daily formaldehyde inhalation exposure within the 899 homes was estimated to be 17 μg/day using California Proposition 65 default methods to extrapolate cancer risk (below the regulation's “no significant risk level” of 40 μg/day). Using a U.S. Environmental Protection Agency linear cancer risk model, the 50th and 95th percentile values of expected lifetime cancer risk for residents of these homes were estimated to be 0.33 and 1.2 per 100,000 exposed, respectively. Based on more recent data and verified nonlinear cancer risk assessment models, LL CLF formaldehyde emissions pose virtually no cancer risk to affected consumers.
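The single-compartment, steady-state mass-balance model named above reduces to one line: concentration = emission rate × source area / (volume × air-exchange rate). In the hedged sketch below, only the 61.3 μg/m²-hour mean emission rate comes from the abstract; the house parameters are assumptions, and the result is the acute just-installed screen rather than the (lower) lifetime TWA.

```python
# Steady-state mass-balance screen for indoor formaldehyde (assumed house inputs).
area_m2   = 100.0   # installed laminate area (assumed)
vol_m3    = 400.0   # house volume (assumed)
ach_per_h = 0.5     # air changes per hour (assumed)

for label, rate_ug_m2_h in (("new boards", 61.3),
                            ("boards after ~1 yr (threefold lower)", 61.3 / 3)):
    conc_ug_m3 = rate_ug_m2_h * area_m2 / (vol_m3 * ach_per_h)
    print(f"{label}: ~{conc_ug_m3:.0f} ug/m3 (screening level: 100 ug/m3)")
```

With these illustrative inputs the modeled concentrations stay below the 100 μg/m³ screening level, consistent with the small fraction of homes the abstract reports exceeding it.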
