Similar Documents
20 similar documents found (search time: 154 ms)
1.
An analysis of the uncertainty in guidelines for the ingestion of methylmercury (MeHg) due to human pharmacokinetic variability was conducted using a physiologically based pharmacokinetic (PBPK) model that describes MeHg kinetics in the pregnant human and fetus. Two alternative derivations of an ingestion guideline for MeHg were considered: the U.S. Environmental Protection Agency reference dose (RfD) of 0.1 μg/kg/day derived from studies of an Iraqi grain poisoning episode, and the Agency for Toxic Substances and Disease Registry chronic oral minimal risk level (MRL) of 0.5 μg/kg/day based on studies of a fish-eating population in the Seychelles Islands. Calculation of an ingestion guideline for MeHg from either of these epidemiological studies requires calculation of a dose conversion factor (DCF) relating a hair mercury concentration to a chronic MeHg ingestion rate. To evaluate the uncertainty in this DCF across the population of U.S. women of child-bearing age, Monte Carlo analyses were performed in which distributions for each of the parameters in the PBPK model were randomly sampled 1000 times. The 1st and 5th percentiles of the resulting distribution of DCFs were a factor of 1.8 and 1.5 below the median, respectively. This estimate of variability is consistent with, but somewhat less than, previous analyses performed with empirical, one-compartment pharmacokinetic models. Using a consistent factor of 1.5 for pharmacokinetic variability in the DCF in both guidelines, and keeping all other aspects of the derivations unchanged, would result in an RfD of 0.2 μg/kg/day and an MRL of 0.3 μg/kg/day.
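The Monte Carlo step can be illustrated with a minimal sketch. The parameter values and lognormal spreads below are hypothetical stand-ins for the paper's PBPK parameter distributions, and a simple one-compartment hair-to-intake relation replaces the full pregnancy model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000  # Monte Carlo samples, matching the analysis described above

# Hypothetical lognormal/normal parameter distributions for a toy
# one-compartment model (stand-ins for the full PBPK parameters).
R  = rng.lognormal(np.log(250),   0.25, n)  # hair:blood concentration ratio
b  = rng.lognormal(np.log(0.014), 0.20, n)  # elimination rate constant, 1/day
V  = rng.lognormal(np.log(5.0),   0.10, n)  # blood volume, L
A  = rng.lognormal(np.log(0.95),  0.03, n)  # fraction of ingested dose absorbed
f  = rng.lognormal(np.log(0.05),  0.20, n)  # fraction of absorbed dose in blood
bw = rng.normal(65.0, 8.0, n)               # body weight, kg

# DCF: chronic intake (ug/kg/day) corresponding to 1 ug/g hair mercury
dcf = (1000.0 / R) * b * V / (A * f * bw)

median = np.median(dcf)
p5, p1 = np.percentile(dcf, [5, 1])
print(f"median DCF {median:.3f} ug/kg/day per ug/g hair")
print(f"median/p5 = {median / p5:.2f}, median/p1 = {median / p1:.2f}")
```

The two printed ratios correspond to the "factor below the median" summaries reported for the 5th and 1st percentiles.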

2.
Putzrath, Resha M.; Wilson, James D. Risk Analysis, 1999, 19(2): 231–247.
We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the NAS paradigm. Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard (Safety Indices such as Acceptable Daily Intake, Reference Dose, and so on) usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example under California's Proposition 65, where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: one that characterizes risk as the likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, as in EPA's regulation of conventional air pollutants. These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how that practice is depicted.
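The safety characterization described above (a Safety Index divided by an exposure estimate) is simple arithmetic, sketched below with purely illustrative numbers, not from any regulatory decision:

```python
def safety_quotient(safety_index: float, exposure: float) -> float:
    """Safety Index (e.g. an RfD, ug/kg/day) divided by an exposure
    estimate in the same units; per the characterization above, a
    quotient greater than one implies the situation may be
    considered safe."""
    if exposure <= 0:
        raise ValueError("exposure must be positive")
    return safety_index / exposure

# Purely illustrative numbers:
rfd = 0.1        # ug/kg/day
exposure = 0.04  # ug/kg/day estimated intake
print(safety_quotient(rfd, exposure) > 1)  # True -> may be considered safe
```

Note that this binary yes/no outcome is exactly what the authors argue limits the method: it supports a two-alternative choice, not a weighing of many decision options.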

3.
Context in the Risk Assessment of Digital Systems
As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on whether software failures can be modeled probabilistically. This paper describes a context-based approach to software risk assessment that explicitly recognizes that the behavior of software is not probabilistic. The perceived uncertainty in its behavior arises from the input to the software as well as from the application and environment in which the software operates. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing randomly. The paper elaborates on the concept of error-forcing context as it applies to software. It also illustrates a methodology that uses event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify error-forcing contexts for software in the form of fault tree prime implicants.

4.
Breakpoint graph decomposition is a crucial step in all recent approximation algorithms for SORTING BY REVERSALS, which is one of the best-known algorithmic problems in computational molecular biology. Caprara and Rizzi recently improved the approximation ratio for breakpoint graph decomposition from 3/2 to 33/23 + ε ≈ 1.4348 + ε, for any positive ε. In this paper, we extend the techniques of Caprara and Rizzi and incorporate a balancing argument to further improve the approximation ratio to about 1.4193 + ε, for any positive ε. These improvements imply improved approximation results for SORTING BY REVERSALS for almost all random permutations.

5.
Health Risk Assessment of a Modern Municipal Waste Incinerator
During the modernization of the municipal waste incinerator (MWI, maximum capacity of 180,000 tons per year) of Metropolitan Grenoble (405,000 inhabitants), in France, a risk assessment was conducted, based on four tracer pollutants: two volatile organic compounds (benzene and 1,1,1-trichloroethane) and two heavy metals (nickel and cadmium, measured in particles). A Gaussian plume dispersion model, applied to maximum emissions measured at the MWI stacks, was used to estimate the distribution of these pollutants in the atmosphere throughout the metropolitan area. A random-sample telephone survey (570 subjects) gathered data on time-activity patterns, according to demographic characteristics of the population. Life-long exposure was assessed as a time-weighted average of ambient air concentrations. Inhalation alone was considered because, in the Grenoble urban setting, other routes of exposure are not likely. A Monte Carlo simulation was used to describe probability distributions of exposures and risks. The median of the distribution of life-long personal exposures to MWI benzene was 3.2·10⁻⁵ μg/m³ (20th and 80th percentiles = 1.5·10⁻⁵ and 6.5·10⁻⁵ μg/m³), yielding a 2.6·10⁻¹⁰ carcinogenic risk (1.2·10⁻¹⁰–5.4·10⁻¹⁰). For nickel, the corresponding life-time exposure and cancer risk were 1.8·10⁻⁴ μg/m³ (0.9·10⁻⁴–3.6·10⁻⁴ μg/m³) and 8.6·10⁻⁸ (4.3·10⁻⁸–17.3·10⁻⁸); for cadmium, they were respectively 8.3·10⁻⁶ μg/m³ (4.0·10⁻⁶–17.6·10⁻⁶) and 1.5·10⁻⁸ (7.2·10⁻⁹–3.1·10⁻⁸). Inhalation exposure to cadmium emitted by the MWI represented less than 1% of the WHO Air Quality Guideline (5 ng/m³), while there was a margin of exposure of more than 10⁹ between the NOAEL (150 ppm) and exposure estimates for trichloroethane. Neither dioxins nor mercury, a volatile metal, was measured; this could lessen the estimated attributable life-long risks. These minute (VOCs and cadmium) to moderate (nickel) exposure and risk estimates are, however, in accord with other studies of modern MWIs meeting recent emission regulations.
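The exposure-to-risk arithmetic can be sketched minimally. All distributions and the unit risk factor below are hypothetical illustrations (lognormal ambient concentrations stand in for the dispersion-model output, and a fixed unit risk stands in for the dose-response step):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo draws over the population

# Hypothetical modelled ambient benzene from the MWI at home and work
# locations (ug/m3), lognormal to mimic a dispersion-model output.
c_home = rng.lognormal(np.log(3e-5), 0.5, n)
c_work = rng.lognormal(np.log(5e-5), 0.6, n)

# Time-activity pattern: fraction of lifetime spent at each location,
# as gathered by the telephone survey (hypothetical distributions).
f_home = rng.uniform(0.55, 0.75, n)
f_work = 1.0 - f_home

# Life-long exposure as a time-weighted average, inhalation only
exposure = f_home * c_home + f_work * c_work

unit_risk = 7.8e-6  # per ug/m3, illustrative inhalation unit risk
risk = exposure * unit_risk

p20, p50, p80 = np.percentile(risk, [20, 50, 80])
print(f"median risk {p50:.1e} (20th-80th percentiles: {p20:.1e}-{p80:.1e})")
```

With these toy inputs the risks land around 10⁻¹⁰, the same order as the benzene estimates reported above.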

6.
This article reports results of a study of some 200 college-aged students at California State University. Ethical values are measured using a subset of the well-known and frequently used Rokeach Value Survey. Using nonparametric statistical analysis, four value measures, and four consistent tests of significance and probability, the research data reveal, perhaps disappointingly for many observers including the authors, that there is no relationship between college grade point average and student ethics. Statistical analysis was done on GPA splits of less than 3.0 versus 3.0 or more, and also on GPA data of 2.5 or less versus 3.5 or more. In all cases, there are no significant relationships between high or low grade point averages and scores on ethical value rankings.

7.
The paper examines the factors that influence socially responsible decision making by individuals. The study identified four social responsibility styles: Playing it Safe, Weather the Storm, Problem to Solve, and Hope it Goes Away. These styles describe individuals on the basis of decision style, propensity for risk, and coping style, and they explain why people with different values might come to the same decision in the same circumstances.

8.
Ethylene oxide (EO) research has increased significantly since the 1980s, when regulatory risk assessments were last completed on the basis of the animal cancer chronic bioassays. In tandem with the new scientific understanding, there have been evolutionary changes in regulatory risk assessment guidelines that encourage flexibility and greater use of scientific information. The results of an updated meta-analysis of the findings from 10 unique EO study cohorts from five countries, including nearly 33,000 workers and over 800 cancers, are presented, indicating that EO does not cause increased risk of cancers overall, or of brain, stomach, or pancreatic cancers. The findings for leukemia and non-Hodgkin's lymphoma (NHL) are inconclusive. Two studies with the requisite attributes of size, individual exposure estimates, and follow-up are the basis for dose-response modeling and added lifetime risk predictions under environmental and occupational exposure scenarios and a variety of plausible alternative assumptions. A point-of-departure analysis, with various margins of exposure, is also illustrated using human data. The two datasets produce remarkably similar leukemia added-risk predictions, orders of magnitude lower than prior animal-based predictions under conservative, default assumptions, with risks on the order of 1 × 10⁻⁶ or lower for exposures in the low-ppb range. Inconsistent results for lymphoid tumors, a non-standard grouping using histologic information from death certificates, are discussed. This assessment demonstrates the applicability of the current risk assessment paradigm to epidemiological data.
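The margin-of-exposure arithmetic in a point-of-departure analysis can be sketched as follows. All numbers are hypothetical, not the paper's estimates:

```python
def margin_of_exposure(pod_ppm: float, exposure_ppm: float) -> float:
    """Ratio of a point of departure to an exposure estimate, both in
    the same units (ppm here); larger margins indicate lower concern."""
    if exposure_ppm <= 0:
        raise ValueError("exposure must be positive")
    return pod_ppm / exposure_ppm

# Hypothetical: a 1 ppm point of departure compared against an
# environmental (low-ppb) and an occupational exposure scenario.
environmental = margin_of_exposure(1.0, 0.0002)  # 0.2 ppb ambient
occupational = margin_of_exposure(1.0, 0.05)     # 50 ppb workplace
print(f"environmental MOE ~ {environmental:.0f}")
print(f"occupational MOE ~ {occupational:.0f}")
```

The contrast between the two margins is the point of running both exposure scenarios, as the abstract describes.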

9.
Inter-service rivalry over budget allocations between the Japanese Imperial Navy and the Imperial Army played a crucial role in the genesis of World War Two in the Pacific. The adoption of a nanshin (southward advance) strategy by the Navy may be explained as an attempt to maximize its budget, leading directly to the attack on Pearl Harbor in 1941. To date, this argument has been presented as historical narrative without any explanatory theoretical framework. The present paper seeks to place inter-service budgetary rivalry within the context of public choice theory to enhance understanding of this historical perspective.

10.
Given a set of points P in a metric space, let ρ_k(P) denote the ratio of lengths between the shortest k-edge-connected Steiner network and the shortest k-edge-connected spanning network on P, and let r_k = inf_P ρ_k(P) for k ≥ 1. In this paper, we show that in any metric space, r_k ≥ 3/4 for k ≥ 2, and that there exists a polynomial-time α-approximation for the shortest k-edge-connected Steiner network, where α = 2 for even k and α = 2 + 4/(3k) for odd k. In the Euclidean plane, sharper values of r_k and α are obtained.

11.
In this paper the problem of high-level nuclear waste disposal is viewed as a five-stage, cascaded decision problem. With the first four of these decisions essentially made, the work of recent years has focused on the fifth stage, which concerns the specifics of the repository design. The probabilistic performance assessment (PPA) work is viewed as the outcome prediction for this stage, and the site characterization work as the information-gathering option. This brief examination of the proposed Yucca Mountain repository through a decision analysis framework resulted in three conclusions: (1) a decision theory approach to the process of selecting and characterizing Yucca Mountain would enhance public understanding of the issues and solutions to high-level waste management; (2) engineered systems are an attractive alternative to offset uncertainties in the containment capability of the natural setting and should receive greater emphasis in the design of the repository; and (3) a strategy of waste management, as opposed to waste disposal, should be adopted, as it allows for incremental confirmation and confidence building toward a permanent solution to the high-level waste problem.

12.
13.
We study one of the most basic online scheduling models: online one-machine scheduling with delivery times, where jobs arrive over time. We provide the first randomized algorithm for this model, show that it is 1.55370-competitive, and show that this analysis is tight. The best possible deterministic algorithm is 1.61803-competitive. Our algorithm is a distribution between two deterministic algorithms. We show that any such algorithm is no better than 1.5-competitive. To our knowledge, this is the first lower-bound proof for a distribution between two deterministic algorithms.

14.
O'Connor, Robert E.; Bord, Richard J.; Fisher, Ann. Risk Analysis, 1999, 19(3): 461–471.
The research reported here examines the relationship between risk perceptions and willingness to address climate change. The data are a national sample of 1225 mail surveys that include measures of risk perceptions and knowledge tied to climate change, support for voluntary and government actions to address the problem, general environmental beliefs, and demographic variables. Risk perceptions matter in predicting behavioral intentions. Risk perceptions are not a surrogate for general environmental beliefs, but have their own power to account for behavioral intentions. There are four secondary conclusions. First, behavioral intentions regarding climate change are complex and intriguing. People are neither nonbelievers who will take no initiatives themselves and oppose all government efforts, nor believers who promise both to make personal efforts and to vote for every government proposal that promises to address climate change. Second, there are separate demographic sources for voluntary actions compared with voting intentions. Third, recognizing the causes of global warming is a powerful predictor of behavioral intentions, independent of believing that climate change will happen and have bad consequences. Finally, the success of the risk perception variables in accounting for behavioral intentions should encourage greater attention to risk perceptions as independent variables. Risk perceptions and knowledge, however, share the stage with general environmental beliefs and demographic characteristics. Although related, risk perceptions, knowledge, and general environmental beliefs are somewhat independent predictors of behavioral intentions.

15.
This paper uses a Rokeach Value Survey methodology to ask again, now in the mid-1990s, whether business student ethics differ from non-business student ethics. Additionally, the paper addresses whether a course can alter or change student ethics and values during a semester. Third, this paper attempts to operationalize and empirically test and measure the new ethical concepts of moral management and moral maximization.

16.
We present a few comments on the paper "Attacking the market split problem with lattice point enumeration" by A. Wasserman, published in the Journal of Combinatorial Optimization, vol. 6, pp. 5–16, 2002.

17.
This paper is a challenge from a pair of lifelong technical specialists in risk assessment to the risk-management community to better define social decision criteria for risk acceptance versus risk control in relation to the issues of variability and uncertainty. To stimulate discussion, we offer a variety of straw-man proposals about where we think variability and uncertainty are likely to matter for different types of social policy considerations in the context of a few different kinds of decisions. In particular, we draw on recent presentations of uncertainty and variability data offered by EPA in the context of the consideration of revised ambient air quality standards under the Clean Air Act.

18.
For a Boolean function f given by a Boolean formula (or a binary circuit) S, we discuss the problem of building a Boolean formula (binary circuit) of minimal size which computes a function g equivalent to f, or ε-equivalent to f, i.e., one that differs from f on at most an ε fraction of inputs. In this paper we prove that if P ≠ NP, then this problem cannot be approximated with a good approximation ratio by a polynomial-time algorithm.

19.
Decision Support for Airline Schedule Planning
Since the 1950s, the operations research community has developed a large number of computer models to aid in the solution of airline scheduling problems. One notable characteristic of these contributions is that each algorithm was developed with its own input and output structures, user interface, and hardware and software requirements. The result is that many of these contributions are under-utilized because they are cumbersome to use, not integrated with the airline's other systems, and not connected across all functions of the airline (from planning to operations control). What was needed to make these contributions effective was a scheduling environment combining systematic human interaction, standardized databases across all functions of the airline, powerful desktop workstations for decision support, a standardized interactive graphical user interface for schedule editing, and operations research techniques for optimization. This paper reports on the application of the integration of computer science and operations research in a decision support system for airline schedule planning. The application integrates a graphical user interface and the database with the schedule optimization algorithms.

20.
For a multigraph G = (V, E), let s ∈ V be a designated vertex of even degree, and let λ_G(V − s) denote min{c_G(X) | ∅ ≠ X ⊂ V − s}, where c_G(X) denotes the size of the cut X. Splitting two adjacent edges (s, u) and (s, v) means deleting these edges and adding a new edge (u, v). For an integer k, splitting two edges e₁ and e₂ incident to s is called (k, s)-feasible if λ_G′(V − s) ≥ k holds in the resulting graph G′. In this paper, we prove that, for a planar graph G and an even k or k = 3 with k ≤ λ_G(V − s), there exists a complete (k, s)-feasible splitting at s such that the resulting graph G′ is still planar, and we present an O(n³ log n) time algorithm for finding such a splitting, where n = |V|. However, for every odd k ≥ 5, there is a planar graph G with a vertex s which has no complete (k, s)-feasible and planarity-preserving splitting. As an application of this result, we show that for an outerplanar graph G and an even integer k, the problem of optimally augmenting G to a k-edge-connected planar graph can be solved in O(n³ log n) time.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号