Similar Documents
20 similar documents found (search time: 15 ms)
1.
An experiment examined the ability of five graphical displays to communicate uncertainty information when end users were under cognitive load (i.e., remembering an eight-digit number). The extent to which people could accurately derive information from the graphs, and the adequacy of decisions about optimal behaviors based on the graphs, were assessed across eight scenarios in which probabilistic outcomes were described. Results indicated that the load manipulation did not have an overall effect on derivation of information from the graphs (i.e., mean and probability estimation) but did suppress the ability to optimize behavioral choices based on the graph. Cognitive load affected people's use of some graphical displays (the basic probability distribution function) more than others. Overall, the research suggests that interpreting basic characteristics of uncertainty data is preserved under conditions of limited cognitive resources, whereas more deliberative processing is negatively affected.

2.
This paper argues the need for a back-to-basics approach in computer literacy education, and then presents such an approach. This approach has worked well in courses designed for majors in computer science and in information systems, as well as in a course designed for nonmajors. The current view of literacy, as indicated by courses taught at universities and in industry around the country, is that literacy means learning about the computer (lots of terminology) and learning to use microcomputer application packages. It might include a glimpse of programming, but the prevailing feeling seems to be that computer users do not need to understand more than packages. We contend that the current approach is not literacy and does not prepare users to use the computer effectively; that the function of the universities is not to teach skills; and that such skills are in fact taught more efficiently if understanding, which is true literacy, precedes the teaching of skills.

3.
The "psychometric paradigm" developed by Slovic, Fischhoff, and Lichtenstein was a landmark in research about public attitudes toward risks. One problem with this work, however, was that (at least initially) it did not attempt to distinguish between individuals or groups of people, except "experts" vs. "lay people." This paradigm produced a "cognitive map" of hazards, and the assumption seemed to be that the characteristics identified were inherent attributes of risk. This paper examines the validity of this assumption. A questionnaire survey similar to those designed by Slovic et al. was conducted, but the data were analyzed at both the aggregate level, using mean scores, and at the level of individuals ( N = 131 Norwich residents). The results reported here demonstrate that (1) individuals vary in their perception of the same risk issue; (2) individuals vary in their rating of the same risk characteristics on the same risk issue; and (3) some of the strong intercorrelations observed between risk characteristics at the aggregate level are not supported when the same data are analysed at the level of individuals. Despite these findings, the relationship between risk characteristics and risk perceptions inferred by the psychometric paradigm did hold true at the level of individuals, for most—but not all—of the characteristics. In particular, the relationship between "lack of knowledge to those exposed" and risk perceptions appears to be a complex one, a finding which has important implications for risk communication strategies.  相似文献   

4.
A Survey of Approaches for Assessing and Managing the Risk of Extremes
In this paper, we review methods for assessing and managing the risk of extreme events, where extreme events are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.
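
As a concrete instance of the extreme value theory the survey covers, the sketch below fits a generalized extreme value (GEV) distribution to a synthetic record of annual maxima and reads off a rare quantile. The data and parameters are assumptions for illustration, not an example from the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical record: 50 annual maxima of some load variable (synthetic data).
    annual_max = rng.gumbel(loc=100.0, scale=15.0, size=50)

    # Fit a GEV distribution and read off a rare quantile of the annual maximum.
    shape, loc, scale = stats.genextreme.fit(annual_max)
    level_100yr = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
    print(f"estimated 100-year level: {level_100yr:.1f}")  # exceeded with prob 0.01 per year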

5.
Bayesian Forecasting via Deterministic Model
Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on the model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the ensemble forecasting technique; neither of these can alone produce a probabilistic forecast that quantifies the total uncertainty, but each can serve as a component of the BFS.
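
A minimal sketch of the BPF idea under assumed Gaussian forms: with a normal (climatological) prior on the predictand and a calibrated linear-Gaussian likelihood for the deterministic model output, the posterior is available in closed form. All numbers and the linear bias model are illustrative assumptions; the paper's BPF is more general.

    # Assumptions (not from the paper): predictand W ~ N(m0, s0^2) a priori, and the
    # deterministic model output X given W = w is N(a + b*w, t^2), with a, b, t
    # estimated offline from historical (output, observation) pairs.
    m0, s0 = 10.0, 4.0       # prior (climatological) mean and sd of the predictand
    a, b, t = 1.0, 0.9, 2.0  # assumed calibration of the deterministic model
    x = 12.5                 # today's deterministic model output

    prec = 1.0 / s0**2 + b**2 / t**2                 # posterior precision
    mean = (m0 / s0**2 + b * (x - a) / t**2) / prec  # posterior mean
    print(f"probabilistic forecast: W | X=x ~ N({mean:.2f}, {1.0/prec:.2f})")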

6.
7.
For a Boolean function \(f\) given by a Boolean formula (or a binary circuit) \(S\), we discuss the problem of building a Boolean formula (binary circuit) of minimal size which computes a function \(g\) equivalent to \(f\), or \(\varepsilon\)-equivalent to \(f\), i.e., \(\Pr[f \ne g] \le \varepsilon\). In this paper we prove that if \(\text{P} \ne \text{NP}\), then this problem cannot be approximated with a good approximation ratio by a polynomial-time algorithm.
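
The \(\varepsilon\)-equivalence notion is easy to make concrete: compare the two functions on all \(2^n\) inputs and take the disagreement fraction as the error. A small sketch with hypothetical example functions:

    from itertools import product

    # Hedged sketch: eps-equivalence checked by exhaustive truth-table comparison.
    # Both example functions are hypothetical, chosen only to illustrate the definition.
    n = 3
    inputs = list(product((0, 1), repeat=n))
    f = {x: (x[0] & x[1]) | x[2] for x in inputs}  # f = (x0 AND x1) OR x2
    g = {x: x[0] | x[2] for x in inputs}           # a smaller candidate circuit

    # g is eps-equivalent to f iff Pr[f(x) != g(x)] <= eps under a uniform random input
    err = sum(f[x] != g[x] for x in inputs) / len(inputs)
    print(f"Pr[f != g] = {err}")  # 0.125 here: f and g disagree only on input (1, 0, 0)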

8.
The author suggests a Weberian methodology, based on theories of democracy and organization, for assessing the normative implications of public organizations. How different organizational models contribute to (re)creating democracy and legitimacy is scrutinized with reference to a Swedish IT program. The conclusion is that a system management organization will be an appropriate choice for dealing with tame problems, but it will at the same time promote an elitist democratization. In contrast, a development organization will be more appropriate for dealing with complex problems, and it will most likely promote discursive democratization.

9.
Let \(G = (V,E)\) be a simple graph with vertex set \(V\) and edge set \(E\). A signed mixed Roman dominating function (SMRDF) of \(G\) is a function \(f: V\cup E\rightarrow \{-1,1,2\}\) satisfying the conditions that (i) \(\sum _{y\in N_m[x]}f(y)\ge 1\) for each \(x\in V\cup E\), where \(N_m[x]\), called the mixed closed neighborhood of \(x\), consists of \(x\) and the elements of \(V\cup E\) adjacent or incident to \(x\); and (ii) every element \(x\in V\cup E\) for which \(f(x) = -1\) is adjacent or incident to at least one element \(y\in V\cup E\) for which \(f(y) = 2\). The weight of an SMRDF \(f\) is \(\omega (f)=\sum _{x\in V\cup E}f(x)\). The signed mixed Roman domination number \(\gamma _{sR}^*(G)\) of \(G\) is the minimum weight of an SMRDF of \(G\). In this paper we initiate the study of the signed mixed Roman domination number and present bounds for this parameter. In particular, we determine this parameter for some classes of graphs.
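
The definition can be checked mechanically. The hedged sketch below brute-forces \(\gamma _{sR}^*(P_3)\) for the path on three vertices by enumerating all assignments of \(\{-1,1,2\}\) to its three vertices and two edges; the graph and its encoding are illustrative choices, not taken from the paper.

    from itertools import product

    # Path P3: vertices 0, 1, 2; edges encoded as frozensets of endpoints.
    V = [0, 1, 2]
    E = [frozenset({0, 1}), frozenset({1, 2})]
    elems = V + E

    def N_m(x):
        """Mixed closed neighborhood: x plus every element adjacent or incident to it."""
        out = {x}
        for y in elems:
            if x == y:
                continue
            if isinstance(x, int) and isinstance(y, int):   # two vertices
                if frozenset({x, y}) in E:
                    out.add(y)
            elif isinstance(x, int) or isinstance(y, int):  # a vertex and an edge
                v, e = (x, y) if isinstance(x, int) else (y, x)
                if v in e:
                    out.add(y)
            elif x & y:                                     # two edges sharing an endpoint
                out.add(y)
        return out

    def is_smrdf(f):
        cond1 = all(sum(f[y] for y in N_m(x)) >= 1 for x in elems)
        cond2 = all(any(f[y] == 2 for y in N_m(x)) for x in elems if f[x] == -1)
        return cond1 and cond2

    gamma = min(sum(vals)
                for vals in product((-1, 1, 2), repeat=len(elems))
                if is_smrdf(dict(zip(elems, vals))))
    print("signed mixed Roman domination number of P3:", gamma)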

10.
When individual statistics are aggregated through a strictly monotone function to an aggregate statistic, common knowledge of the value of the aggregate statistic does not imply, in general, that the individual statistics are either equal or constant. This paper discusses circumstances where constancy and equality both hold. The first case arises when partitions are independently drawn, and each individual's information is determined by their own partition and some public signal. In this case common knowledge of the value of the aggregator function implies (with probability one) that the individual statistics are constant, so that in the case where the individual statistics have the same expected value, they must all be equal. The second circumstance is where private statistics are related: affiliation of individual statistics and a lattice condition imply that the individual statistics are equal when the value of the aggregate statistic is common knowledge.

11.
This paper argues that modularity of knowledge and technologies has important implications for the locus of inventive activities. This is because modularity allows for a separation of the innovation process into two main activities: the production of basic (standardised) modules, and their combination to produce variants of technologies or product designs that are better suited to the special needs of individual users or markets. This gives rise to a division of labour whereby the production of modules will be performed by specialised upstream suppliers (who enjoy economies of scale), while the combination of modules will be performed by firms further downstream or by the users themselves. We then suggest that this pattern can explain a variety of phenomena, such as why users co-produce their innovations, and how small regions can support innovative activity despite the apparent efficiency advantage of larger regions.

12.
An analysis of the uncertainty in guidelines for the ingestion of methylmercury (MeHg) due to human pharmacokinetic variability was conducted using a physiologically based pharmacokinetic (PBPK) model that describes MeHg kinetics in the pregnant human and fetus. Two alternative derivations of an ingestion guideline for MeHg were considered: the U.S. Environmental Protection Agency reference dose (RfD) of 0.1 μg/kg/day derived from studies of an Iraqi grain poisoning episode, and the Agency for Toxic Substances and Disease Registry chronic oral minimal risk level (MRL) of 0.5 μg/kg/day based on studies of a fish-eating population in the Seychelles Islands. Calculation of an ingestion guideline for MeHg from either of these epidemiological studies requires calculation of a dose conversion factor (DCF) relating a hair mercury concentration to a chronic MeHg ingestion rate. To evaluate the uncertainty in this DCF across the population of U.S. women of child-bearing age, Monte Carlo analyses were performed in which distributions for each of the parameters in the PBPK model were randomly sampled 1000 times. The 1st and 5th percentiles of the resulting distribution of DCFs were a factor of 1.8 and 1.5 below the median, respectively. This estimate of variability is consistent with, but somewhat less than, previous analyses performed with empirical, one-compartment pharmacokinetic models. The use of a consistent factor of 1.5 in both guidelines for pharmacokinetic variability in the DCF, keeping all other aspects of the derivations unchanged, would result in an RfD of 0.2 μg/kg/day and an MRL of 0.3 μg/kg/day.
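
The structure of such a Monte Carlo analysis can be sketched with a classical one-compartment dose conversion, which is not the paper's PBPK model; every distribution and constant below is an assumed placeholder, chosen only to show how the median-to-percentile ratios of the DCF are computed.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000  # matches the abstract's 1000 Monte Carlo samples

    # Assumed placeholder distributions (illustrative only, NOT the paper's PBPK model).
    R  = rng.lognormal(np.log(250.0), 0.25, n)  # hair:blood ratio, (ug/g hair)/(mg/L blood)
    b  = rng.lognormal(np.log(0.014), 0.20, n)  # elimination rate constant (1/day)
    V  = rng.normal(5.0, 0.6, n)                # blood volume (L)
    f  = rng.normal(0.05, 0.007, n)             # fraction of absorbed dose in blood
    A  = 0.95                                   # gastrointestinal absorption fraction
    bw = rng.normal(65.0, 10.0, n)              # body weight (kg)

    # DCF: steady-state intake (ug/kg/day) per unit hair concentration (ug/g).
    c_blood = 1000.0 / R                  # ug/L of blood per ug/g of hair
    dcf = c_blood * b * V / (A * f * bw)

    med = np.median(dcf)
    p1, p5 = np.percentile(dcf, [1, 5])
    print(f"median DCF = {med:.3f} ug/kg/day per ug/g;"
          f" median/5th = {med/p5:.2f}, median/1st = {med/p1:.2f}")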

13.
This paper considers the minimax regret vertex 2-sink location problem in a dynamic path network with positive edge lengths and uniform edge capacity. Let \(P\) be an undirected path graph of \(n\) vertices, where the weight (initial supply) of every vertex is known only as an interval. The problem is to find two vertices \(x\) and \(y\) to serve as sinks on the path such that all the weights can evacuate to \(x\) and \(y\) with minimum regret in evacuation time, in case of an emergency, for any possible weight distribution. We present an \(O(n^3\log n)\) time algorithm.
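
The minimax-regret objective itself can be illustrated with a brute-force toy, sketched below. This is emphatically not the paper's \(O(n^3\log n)\) algorithm: the cost is simplified to the maximum weighted distance to the nearer sink rather than a dynamic-flow evacuation time, and only extreme (interval-endpoint) weight scenarios are enumerated, a restriction commonly made in the minimax-regret location literature.

    from itertools import product

    pos  = [0.0, 2.0, 3.0, 7.0, 9.0]   # vertex positions along the path (assumed)
    w_lo = [1, 0, 2, 1, 0]             # interval lower bounds on weights (assumed)
    w_hi = [3, 2, 2, 4, 1]             # interval upper bounds on weights (assumed)
    n = len(pos)

    def cost(x, y, w):
        """Toy cost of sinks (x, y) under scenario w: max weighted distance to nearer sink."""
        return max(wi * min(abs(p - pos[x]), abs(p - pos[y])) for p, wi in zip(pos, w))

    # Enumerate extreme (interval-endpoint) scenarios only.
    scenarios = set(product(*zip(w_lo, w_hi)))
    pairs = [(x, y) for x in range(n) for y in range(x, n)]
    opt = {s: min(cost(x, y, s) for x, y in pairs) for s in scenarios}

    # Pick the sink pair minimizing the worst-case regret over all scenarios.
    best = min(pairs, key=lambda xy: max(cost(*xy, s) - opt[s] for s in scenarios))
    print("minimax-regret sinks at vertices:", best)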

14.
The findings indicated that the economic environment and the cultural and religious orientations of managers in Saudi Arabia significantly influenced their scores on Machiavellianism and the relationships between their needs and leadership styles. In comparison to the U.S. norms, the Saudi Arabian managers were found to be lower on Machiavellianism. Need for achievement was found to be positively related to need for power and to the structure dimension of leadership. The findings also showed Machiavellianism to be positively related to need for power and negatively related to the consideration dimension of leadership. The findings are discussed in the context of a fast-growing economy and a highly religious and traditional society.

15.
Health care professionals are a major source of risk communications, but their estimation of risks may be compromised by systematic biases. We examined fuzzy-trace theory's predictions of professionals' biases in risk estimation for sexually transmitted infections (STIs) linked to: knowledge deficits (producing underestimation of STI risk, re-infection, and gender differences), gist-based mental representation of risk categories (producing overestimation of condom effectiveness for psychologically atypical but prevalent infections), retrieval failure for risk knowledge (producing greater risk underestimation when STIs are not specified), and processing interference involving combining risk estimates (producing biases in post-test estimation of infection, regardless of knowledge). One hundred seventy-four subjects (experts attending a national workshop, physicians, other health care professionals, and students) estimated the risk of teenagers contracting STIs, re-infection rates for males and females, and condom effectiveness in reducing infection risk. Retrieval was manipulated by asking estimation questions in two formats: a specific format that "unpacked" the STI category (infection types) and a global format that did not provide specific cues. Processing biases were assessed by requesting estimates of infection risk after the relevant knowledge was directly provided, isolating processing effects. As predicted, all groups of professionals underestimated the risk of STI transmission, re-infection, and gender differences, and overestimated the effectiveness of condoms, relative to published estimates. However, when questions provided better retrieval support (the specific format), estimation bias decreased. All groups of professionals also suffered from the predicted processing biases. Although knowledge deficits contribute to estimation biases, the research showed that biases are also linked to fuzzy representations, retrieval failures, and processing errors. Hence, interventions that are designed to improve risk perception among professionals must incorporate more than knowledge dissemination; they should also provide support for information representation, effective retrieval, and accurate processing.

16.
The study attempts to identify the most essential components contributing to clients' satisfaction with the service quality of three federal institutions in the United Arab Emirates public sector. The study reveals that a clean office, job knowledge, respect for the client, and clarity of regulations were the most important dimensions determining service quality in the surveyed institutions. Suggestions and recommendations are included in the study.

17.
This article employs an institutional perspective in formulating predictions about the ethical futures of privatization partnerships. Although this paper focuses on ethical concerns in the U.S. public sector, it incorporates a multinational dimension in (a) comparing the meaning of privatization among societies and (b) probing privatization financing in the global economy. Five assumptions that flow from institutional reasoning are made explicit as supports for subsequent predictions. The institutional logic shifts privatization conversation away from conventional debate about competition and efficiency toward centralizing forces in both sectors in response to globalization. In that regard, this study identifies the systemic erosion of (local) community integrity as the key privatization problem of the future.

18.
Living microbes are discrete, not homogeneously distributed in environmental media, and the form of the distribution of their counts in drinking water has not been well established. However, this count may "scale" or range over orders of magnitude over time, in which case data representing the tail of the distribution, and governing the mean, would be represented only in impractically long data records. In the absence of such data, knowledge of the general form of the full distribution could be used to estimate the true mean accounting for low-probability, high-consequence count events and provide a basis for a general environmental dose-response function. In this article, a new theoretical discrete growth distribution (DGD) is proposed for discrete counts in environmental media and other discrete growth systems. The term growth refers not to microbial growth but to a general abiotic first-order growth/decay of outcome sizes in many complex systems. The emergence and stability of the DGD in such systems, defined in simultaneous work, are also described. The DGD is then initially verified versus 12 of 12 simulated long-term drinking water and short-term treated and untreated water microbial count data sets. The alternative Poisson lognormal (PLN) distribution was rejected for 2 (17%) of the 12 data sets with 95% confidence and, like other competitive distributions, was not found stable (in simultaneous work). Sample averages are compared with means assessed from the fitted DGD, with varying results. Broader validation of the DGD for discrete counts arising as outcomes of mathematical growth systems is suggested.
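
For intuition about why short records misrepresent the mean of scaling counts, the sketch below samples from the Poisson lognormal, the alternative the paper rejects (the DGD's form is not reproduced in this abstract); the parameters are assumptions. With a heavy tail, sample averages from short records fall well below the analytic mean.

    import numpy as np

    rng = np.random.default_rng(2)
    mu, sigma = 0.0, 2.0                    # assumed lognormal rate parameters
    true_mean = np.exp(mu + sigma**2 / 2)   # exact mean of the mixed Poisson count

    # Heavy-tailed counts: short records rarely contain the rare large counts
    # that govern the mean, so sample averages sit below the analytic mean.
    for record_len in (30, 300, 30000):
        lam = rng.lognormal(mu, sigma, record_len)   # per-sample Poisson rates
        counts = rng.poisson(lam)                    # Poisson-lognormal counts
        print(f"n={record_len}: sample mean {counts.mean():.2f} vs true mean {true_mean:.2f}")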

19.
The worst-case behavior of the critical path (CP) algorithm for multiprocessor scheduling with an out-tree task dependency structure is studied. The out-tree is not known in advance, and the tasks are released on-line over time (each task is released at the completion time of its direct predecessor task in the out-tree). For each task, the processing time and the remainder (the length of the longest chain of future tasks headed by this task) become known at its release time. The tight worst-case ratio and absolute error are derived for this strongly clairvoyant on-line model. For out-trees with a specific simple structure, essentially better worst-case ratios and absolute errors are derived. Our bounds are given in terms of \(t_{\max}\), the length of the longest chain in the out-tree, and it is shown that the worst-case ratio asymptotically approaches 2 for large \(t_{\max}\) when the number of processors is sufficiently large. A non-clairvoyant on-line version (without knowledge of a task's processing time and remainder at its release time) is also considered, and it is shown that the worst-case behavior of width-first search is better than or the same as that of depth-first search.
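
A compact sketch of the CP rule on a toy out-tree, assuming the model described in the abstract: a task is released when its direct predecessor completes, and whenever a processor is idle the released task with the largest remainder is started. The tree and the processor count are hypothetical.

    import heapq

    # Toy out-tree: node -> (processing_time, children); node 0 is the root (hypothetical).
    tree = {0: (2, [1, 2]), 1: (3, [3, 4]), 2: (1, []), 3: (2, []), 4: (4, [])}

    def remainder(v):
        """Length of the longest chain headed by v: its time plus the longest child chain."""
        p, children = tree[v]
        return p + max((remainder(c) for c in children), default=0)

    def cp_schedule(m):
        """CP rule: whenever a processor idles, start the released task of largest remainder."""
        t, idle = 0, m
        busy = []                                 # heap of (finish_time, node)
        ready = [(-remainder(0), 0)]              # root is released at time 0
        while ready or busy:
            while ready and idle > 0:             # start highest-priority released tasks
                _, v = heapq.heappop(ready)
                heapq.heappush(busy, (t + tree[v][0], v))
                idle -= 1
            t, v = heapq.heappop(busy)            # advance to the next completion
            idle += 1
            for c in tree[v][1]:                  # children are released on completion
                heapq.heappush(ready, (-remainder(c), c))
        return t                                  # makespan

    print("CP makespan on 2 processors:", cp_schedule(2))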

20.
Currently, a binary alarm system is used in the United States to issue deterministic warning polygons in case of tornado events. To enhance the effectiveness of the weather information, a likelihood alarm system, which uses a tool called probabilistic hazard information (PHI), is being developed at the National Severe Storms Laboratory to issue probabilistic information about the threat. This study aims to investigate the effects of providing uncertainty information about a tornado occurrence through the PHI's graphical swath on laypeople's concern, fear, and protective action, as compared with providing the warning information with the deterministic polygon. Displays of color-coded swaths and deterministic polygons were shown to subjects. Some displays had a blue background denoting the probability of any tornado formation in the general area. Participants were asked to report their levels of concern, fear, and protective action at randomly chosen locations within each of seven designated levels on each display. Analysis of a three-stage nested design showed that providing the uncertainty information via the PHI appropriately increased recipients' levels of concern, fear, and protective action in highly dangerous scenarios, with a more than 60% chance of being affected by the threat, as compared with deterministic polygons. The blue background and the color-coding type did not have a significant effect on people's cognition of the threat and reaction to it. This study shows that using a likelihood alarm system leads to more conscious decision making by the weather information recipients and enhances system safety.
