Similar Articles (20 results)
1.
The use of benchmark dose (BMD) calculations for dichotomous or continuous responses is well established in the risk assessment of cancer and noncancer endpoints. In some cases, responses to exposure are categorized in terms of ordinal severity effects such as none, mild, adverse, and severe. Such responses can be assessed using categorical regression (CATREG) analysis. However, while CATREG has been employed to compare the benchmark approach and the no‐adverse‐effect‐level (NOAEL) approach in determining a reference dose, the utility of CATREG for risk assessment remains unclear. This study proposes a CATREG model to extend the BMD approach to ordered categorical responses by modeling severity levels as censored interval limits of a standard normal distribution. The BMD is calculated as a weighted average of the BMDs obtained at dichotomous cutoffs for each adverse severity level above the critical effect, with the weights being proportional to the reciprocal of the expected loss at the cutoff under the normal probability model. This approach provides a link between the current BMD procedures for dichotomous and continuous data. We estimate the CATREG parameters using a Markov chain Monte Carlo simulation procedure. The proposed method is demonstrated using examples of aldicarb and urethane, each with several categories of severity levels. Simulation studies show that the BMD and BMDL (lower confidence bound on the BMD) estimated with the proposed method are quite compatible with the corresponding estimates from the existing methods for dichotomous and continuous data; the difference depends mainly on the choice of cutoffs for the severity levels.
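The weighted-average step described in this abstract can be sketched numerically. The sketch below is illustrative only: the BMDs and expected losses are hypothetical placeholders for quantities the article estimates by MCMC under its censored-normal model.

```python
# Illustrative sketch (not the authors' code): combine cutoff-specific BMDs
# into a single categorical BMD, with weights proportional to the reciprocal
# of the expected loss at each severity cutoff. All numbers are hypothetical.

def weighted_bmd(bmds, expected_losses):
    """Weighted average of cutoff-specific BMDs; weight_k is 1/expected_loss_k."""
    weights = [1.0 / loss for loss in expected_losses]
    return sum(w * b for w, b in zip(weights, bmds)) / sum(weights)

# Hypothetical BMDs (mg/kg-day) at the "adverse" and "severe" cutoffs; the
# severe cutoff carries the larger expected loss and hence the smaller weight.
print(weighted_bmd([0.10, 0.25], [0.02, 0.08]))
```

Because weights shrink as expected loss grows, the combined BMD here stays close to the BMD of the less-severe cutoff.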

2.
R. Webster West & Ralph L. Kodell. Risk Analysis, 1999, 19(3): 453-459
Methods of quantitative risk assessment for toxic responses that are measured on a continuous scale are not well established. Although risk-assessment procedures that attempt to utilize the quantitative information in such data have been proposed, there is no general agreement that these procedures are appreciably more efficient than common quantal dose–response procedures that operate on dichotomized continuous data. This paper points out an equivalence between the dose–response models of the nonquantal approach of Kodell and West(1) and a quantal probit procedure, and provides results from a Monte Carlo simulation study to compare coverage probabilities of statistical lower confidence limits on dose corresponding to specified additional risk based on applying the two procedures to continuous data from a dose–response experiment. The nonquantal approach is shown to be superior, in terms of both statistical validity and statistical efficiency.
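A toy version of the kind of Monte Carlo coverage check this abstract describes is shown below, applied to a one-sided lower confidence limit on a normal mean rather than on dose; the sample size, parameters, and critical value are all hypothetical.

```python
# Sketch of a Monte Carlo coverage study: simulate many samples, form a
# one-sided lower 95% confidence limit each time, and record how often it
# falls below the true value. Settings are hypothetical, not from the paper.
import math
import random
import statistics

def coverage(n_sims=2000, n=30, mu=1.0, sigma=0.5, z=1.645, seed=7):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        xbar = statistics.fmean(sample)
        se = statistics.stdev(sample) / math.sqrt(n)
        if xbar - z * se <= mu:  # lower limit covers the true mean
            hits += 1
    return hits / n_sims

print(coverage())  # should land near the nominal 0.95
```

Comparing such empirical coverage rates across procedures is exactly how the abstract distinguishes statistical validity from efficiency.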

3.
Essential elements such as copper and manganese may demonstrate U‐shaped exposure‐response relationships due to toxic responses occurring as a result of both excess and deficiency. Previous work on a copper toxicity database employed CatReg, a software program for categorical regression developed by the U.S. Environmental Protection Agency, to model copper excess and deficiency exposure‐response relationships separately. This analysis involved the use of a severity scoring system to place diverse toxic responses on a common severity scale, thereby allowing their inclusion in the same CatReg model. In this article, we present methods for simultaneously fitting excess and deficiency data in the form of a single U‐shaped exposure‐response curve, the minimum of which occurs at the exposure level that minimizes the probability of an adverse outcome due to either excess or deficiency (or both). We also present a closed‐form expression for the point at which the exposure‐response curves for excess and deficiency cross, corresponding to the exposure level at which the risk of an adverse outcome due to excess is equal to that for deficiency. The application of these methods is illustrated using the same copper toxicity database noted above. The use of these methods permits the analysis of all available exposure‐response data from multiple studies expressing multiple endpoints due to both excess and deficiency. The exposure level corresponding to the minimum of this U‐shaped curve, and the confidence limits around this exposure level, may be useful in establishing an acceptable range of exposures that minimize the overall risk associated with the agent of interest.
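The crossing point has a simple closed form under one common parameterization. The sketch below assumes (this is my assumption, not necessarily the article's model) that deficiency risk falls and excess risk rises log-linearly on the logit scale; equating the two logits then gives the crossover exposure directly.

```python
# Sketch under an assumed parameterization (not the article's exact model):
# logit P_def(x) = a_def - b_def*log(x), logit P_exc(x) = a_exc + b_exc*log(x).
# Setting the logits equal gives log(x*) = (a_def - a_exc) / (b_def + b_exc).
import math

def crossover_exposure(a_def, b_def, a_exc, b_exc):
    """Exposure at which deficiency and excess risks are equal."""
    return math.exp((a_def - a_exc) / (b_def + b_exc))

# Hypothetical coefficients: the crossover is exp(1.5), about 4.48 exposure units.
print(crossover_exposure(-1.0, 2.0, -7.0, 2.0))
```

The minimum of the combined U-shaped curve need not coincide with this crossing point, but both are natural candidates for the center of an acceptable exposure range.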

4.
Weight of Evidence: A Review of Concept and Methods
Douglas L. Weed. Risk Analysis, 2005, 25(6): 1545-1557
"Weight of evidence" (WOE) is a common term in the published scientific and policy-making literature, most often seen in the context of risk assessment (RA). Its definition, however, is unclear. A systematic review of the scientific literature was undertaken to characterize the concept. For the years 1994 through 2004, PubMed was searched for publications in which "weight of evidence" appeared in the abstract and/or title. Of the 276 papers that met these criteria, 92 were selected for review: 71 papers published in 2003 and 2004 (WOE appeared in abstract/title) and 21 from 1994 through 2002 (WOE appeared in title). WOE has three characteristic uses in this literature: (1) metaphorical, where WOE refers to a collection of studies or to an unspecified methodological approach; (2) methodological, where WOE points to established interpretative methodologies (e.g., systematic narrative review, meta-analysis, causal criteria, and/or quality criteria for toxicological studies) or where WOE means that "all" rather than some subset of the evidence is examined, or rarely, where WOE points to methods using quantitative weights for evidence; and (3) theoretical, where WOE serves as a label for a conceptual framework. Several problems are identified: the frequent lack of definition of the term "weight of evidence," multiple uses of the term and a lack of consensus about its meaning, and the many different kinds of weights, both qualitative and quantitative, which can be used in RA. A practical recommendation emerges: the WOE concept and its associated methods should be fully described when used. A research agenda should examine the advantages of quantitative versus qualitative weighting schemes, how best to improve existing methods, and how best to combine those methods (e.g., epidemiology's causal criteria with toxicology's quality criteria).

5.
Toxicologists are often interested in assessing the joint effect of an exposure on multiple reproductive endpoints, including early loss, fetal death, and malformation. Exposures that occur prior to mating or extremely early in development can adversely affect the number of implantation sites or fetuses that form within each dam and may even prevent pregnancy. A simple approach for assessing overall adverse effects in such studies is to consider fetuses or implants that fail to develop due to exposure as missing data. The missing data can be imputed, and standard methods for the analysis of quantal response data can then be used for quantitative risk assessment or testing. In this article, a new bias-corrected imputation procedure is proposed and evaluated. The procedure is straightforward to implement in standard statistical packages and has excellent operating characteristics when used in combination with a marginal model fit with generalized estimating equations. The methods are applied to data from a reproductive toxicity study of Nitrofurazone conducted by the National Toxicology Program.
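The basic imputation idea can be sketched in a few lines. This is a deliberately naive version, not the bias-corrected procedure the article proposes: implants lost to exposure are treated as missing adverse outcomes and imputed back into the litter, and the expected litter size here is a hypothetical control-group average.

```python
# Naive sketch of the imputation idea (NOT the article's bias-corrected
# procedure): the shortfall from the expected litter size is counted as
# adverse, then a quantal proportion is formed for standard quantal analysis.

def imputed_adverse_proportion(affected, observed_size, expected_size):
    """Quantal response after imputing missing implants as adverse."""
    missing = max(0, expected_size - observed_size)
    return (affected + missing) / expected_size

# Dam with 12 expected implants, 9 observed, 2 of those malformed:
# 2 observed adverse + 3 imputed adverse out of 12.
print(imputed_adverse_proportion(2, 9, 12))
```

The bias correction in the article addresses the fact that naive imputation like this can overstate risk when litter-size variation has causes other than exposure.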

6.
Deriving weights from pairwise comparison matrices (PCM) is a highly researched topic. The analytic hierarchy process (AHP) traditionally uses the eigenvector method for the purpose, and numerous other methods have also been suggested. A distinctive feature of all these methods is that they associate a quantitative meaning with the judgemental information given by the decision-maker. In contrast, the verbal scale used in AHP to capture judgements does not carry such a quantitative meaning. Though this issue of treating judgements qualitatively is recognized in the extant literature on multi-criteria decision making, there has so far been no such research effort in the AHP literature. Deriving motivation from the application of data envelopment analysis (DEA) for deriving weights, it is proposed in this paper that DEA models developed to deal with a mix of qualitative and quantitative factors can be used to derive weights from PCMs by treating judgements as qualitative factors. The qualitative DEA model is discussed and illustrated.

7.
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high‐throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline‐based meta‐regression can be used to integrate data across multiple assay replicates to generate a concentration–response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk‐specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta‐regression, may allow risk assessors to identify points of departure and risk‐specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods.
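Deriving a risk-specific concentration amounts to inverting a fitted concentration-response curve at a target response level. In the sketch below a simple Hill curve stands in for the article's fitted natural spline, and all parameter values are hypothetical.

```python
# Sketch (hypothetical parameters): invert a fitted concentration-response
# curve by bisection to obtain a risk-specific in vitro concentration. A Hill
# function stands in for the bootstrap meta-regression spline of the article.

def hill(conc, top=1.0, ec50=0.75, n=2.0):
    """Monotone Hill concentration-response curve."""
    return top * conc**n / (ec50**n + conc**n)

def risk_specific_conc(target, lo=1e-6, hi=100.0, tol=1e-9):
    """Bisection solve for the concentration where hill(c) == target."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if hill(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Concentration producing a 50% response recovers the EC50 of 0.75 uM.
print(risk_specific_conc(0.5))
```

In the article this inversion is repeated over bootstrap replicates, which is what yields confidence in the point of departure rather than a single number.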

8.
Quantitative Approaches in Use to Assess Cancer Risk

9.
Economic systems are increasingly prone to complexity and uncertainty. Therefore, making well-informed decisions requires risk analysis, control and mitigation. In some areas such as finance, insurance, crisis management and health care, the importance of considering risk is largely acknowledged and well-elaborated, yet rather heterogeneous concepts and approaches for risk management have been developed. The increased frequency and the severe consequences of past supply chain disruptions have resulted in an increasing interest in risk. This development has led to the adoption of the risk concepts, terminologies and methods from related fields. In this paper, existing approaches for quantitative supply chain risk management are reviewed by setting the focus on the definition of supply chain risk and related concepts.

10.
Novel materials with unique or enhanced properties relative to conventional materials are being developed at an increasing rate. These materials are often referred to as advanced materials (AdMs) and they enable technological innovations that can benefit society. Despite their benefits, however, the unique characteristics of many AdMs, including many nanomaterials, are poorly understood and may pose environmental safety and occupational health (ESOH) risks that are not readily determined by traditional risk assessment methods. To assess these risks while keeping up with the pace of development, technology developers and risk assessors frequently employ risk‐screening methods that depend on a clear definition for the materials that are to be assessed (e.g., engineered nanomaterial) as well as a method for binning materials into categories for ESOH risk prioritization. The term advanced material lacks a consensus definition and associated categorization or grouping system for risk screening. In this study, we aim to establish a practitioner‐driven definition for AdMs and a practitioner‐validated framework for categorizing AdMs into conceptual groupings based on material characteristics. Results from multiple workshops and interviews with practitioners provide consistent differentiation between AdMs and conventional materials, offer functional nomenclature for application science, and provide utility for future ESOH risk assessment prioritization. The definition and categorization framework established here serve as a first step in determining if and when there is a need for specific ESOH and regulatory screening for an AdM as well as the type and extent of risk‐related information that should be collected or generated for AdMs and AdM‐enabled technologies.

11.
Polycyclic aromatic hydrocarbons (PAHs) have been labeled contaminants of concern due to their carcinogenic potential, insufficient toxicological data, environmental ubiquity, and inconsistencies in the composition of environmental mixtures. The Environmental Protection Agency is reevaluating current methods for assessing the toxicity of PAHs, including the assumption of toxic additivity in mixtures. This study was aimed at testing mixture interactions through in vitro cell culture experimentation, and modeling the toxicity using quantitative structure‐activity relationships (QSAR). Clone‐9 rat liver cells were used to analyze cellular proliferation, viability, and genotoxicity of 15 PAHs in single doses and binary mixtures. Tests revealed that many mixtures have nonadditive toxicity, with varying mixture effects depending on the mixture composition; many mixtures displayed antagonism, similar to other published studies. QSARs were then developed using the genetic function approximation algorithm to predict toxic activity both in single PAH congeners and in binary mixtures. Effective concentrations inhibiting 50% of the cell populations were modeled for the three endpoints, with R2 = 0.90, 0.99, and 0.84, respectively. The QSAR mixture algorithms were then adjusted to account for the observed mixture interactions as well as the mixture composition (ratios) to assess the feasibility of QSARs for mixtures. Based on these results, toxic additivity is improbable, and environmental PAH mixtures are therefore likely to exhibit nonadditive responses when complex interactions occur between components. Furthermore, QSAR may be a useful tool to help bridge the data gaps surrounding the assessment of human health risks associated with PAH exposures.

12.
The last few decades have seen increasingly widespread use of risk assessment and management techniques as aids in making complex decisions. However, despite the progress that has been made in risk science, there still remain numerous examples of risk-based decisions and conclusions that have caused great controversy. In particular, there is a great deal of debate surrounding risk assessment: the role of values, ethics, and other extra-scientific factors; the efficacy of quantitative versus qualitative analysis; and the role of uncertainty and incomplete information. Many of the epistemological and methodological issues confronting risk assessment have been explored in general systems theory, where techniques exist to manage such issues. However, the use of systems theory and systems analysis tools is still not widespread in risk management. This article builds on the Alachlor risk assessment case study of Brunk, Haworth, and Lee to present a systems-based view of the risk assessment process. The details of the case study are reviewed and the authors' original conclusions regarding the effects of extra-scientific factors on risk assessment are discussed. Concepts from systems theory are introduced to provide a mechanism with which to illustrate these extra-scientific effects. The role of a systems study within a risk assessment is explained, resulting in an improved view of the problem formulation process. The consequences regarding the definition of risk and its role in decision making are then explored.

13.
Beliefs about risks associated with two risk agents, AIDS and toxic waste, are modeled using knowledge-based methods and elicited from subjects via interactive computer technology. A concept net is developed to organize subject responses concerning the consequences of the risk agents. It is found that death and adverse personal emotional and sociological consequences are most associated with AIDS. Toxic waste is most associated with environmental problems. These consequence profiles are quite dissimilar, although past work in risk perception would have judged the risk agents as being quite similar. Subjects frequently used causal semantics to represent their beliefs and "% of time" instead of "probability" to represent likelihoods. The news media is the most prevalent source of risk information although experiences of acquaintances appear more credible. The results suggest that "broadly based risk" communication may be ineffective because people differ in their conceptual representation of risk beliefs. In general, the knowledge-based approach to risk perception representation has great potential to increase our understanding of important risk topics.

14.
A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game‐theory‐determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed.
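For two risk factors the equal-weight allocation the abstract describes is easy to state concretely; the numbers below are hypothetical excess risks.

```python
# Two-factor sketch of the equal-weight apportionment: the interaction term
# (r12 - r1 - r2) is split equally, which for two factors coincides with the
# game-theoretic (Shapley-style) allocation the abstract refers to.
# The excess-risk values used in the demo are hypothetical.

def apportion_two_factors(r1, r2, r12):
    """Allocate total excess risk r12 between two interacting risk factors."""
    interaction = r12 - r1 - r2
    share1 = r1 + interaction / 2
    share2 = r2 + interaction / 2
    return share1, share2

# Individual excess risks 0.2 and 0.3 with joint excess risk 0.6: the 0.1
# interaction splits equally, giving shares 0.25 and 0.35.
print(apportion_two_factors(0.2, 0.3, 0.6))
```

The shares always sum to the total risk, which is the accounting property the axiomatic approach is built around.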

15.
Various methods for risk characterization have been developed using probabilistic approaches. Data on Vietnamese farmers are available for the comparison of outcomes for risk characterization using different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose‐response data and probabilistic techniques obtained from a case study with rice farmers in Vietnam. Urine samples were collected from farmers and analyzed for trichloropyridinol (TCP), which was converted into absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose‐adverse health response relationships. The health risk of chlorpyrifos was quantified using hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1. The MCS method indicated less than 0.05% of the population would be affected while the ORP method indicated that less than 1.5% of the population would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5. The risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by ORP method was 33%. The MCS and ORP methods have advantages in risk characterization due to use of the full distribution of data exposure as well as dose response, whereas HQ methods only used the exposure data distribution. These evaluations indicated that single‐event spraying is likely to have adverse effects on Vietnamese rice farmers.
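Two of the three characterizations compared above can be sketched briefly. The reference dose, exposure distribution, and step dose-response below are all hypothetical stand-ins, not the Vietnamese field data.

```python
# Sketch of HQ and an ORP-style calculation with hypothetical inputs:
# HQ divides a dose by a reference dose; ORP averages a dose-response
# probability over the full exposure distribution.
import random

RFD = 0.003  # hypothetical reference dose, mg/kg-day

def hazard_quotient(dose):
    return dose / RFD

def overall_risk_probability(doses, dose_response):
    """Average the dose-response probability across the exposure sample."""
    return sum(dose_response(d) for d in doses) / len(doses)

rng = random.Random(1)
doses = [rng.lognormvariate(-6.0, 1.2) for _ in range(10000)]  # hypothetical exposures
exceeds = lambda d: 1.0 if d > 0.02 else 0.0                   # crude step dose-response

print(hazard_quotient(0.0213))                        # HQ of 7.1 at this dose
print(overall_risk_probability(doses, exceeds))       # fraction of population at risk
```

The contrast the abstract draws is visible here: HQ uses a single dose and threshold, while ORP integrates over the whole exposure distribution.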

16.
Existing DEMATEL methods for determining indicator weights are mostly based on individual decision making and do not account for inconsistent evaluation scales in group decision making. To address this, a new group DEMATEL weighting method based on a three-dimensional density operator is proposed. First, conversion functions between different evaluation scales are defined and used to harmonize the group DEMATEL matrices. Second, a clustering method for group DEMATEL matrices is given, on the basis of which the matrices are aggregated using the three-dimensional density operator. Finally, the centrality (prominence) and cause degree of each indicator are identified via the DEMATEL method, and the weight of each indicator is computed. An application example verifies the feasibility and effectiveness of the proposed method. The results show that, because the method resolves the scale-inconsistency problem in group decision making and fully accounts for the degree of consensus among the group, it yields indicator weights that are more objective, reasonable, and reliable.
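The final DEMATEL step (prominence, cause degree, and weights from a direct-influence matrix) can be sketched for a single aggregated matrix. The scale conversion, clustering, and density-operator aggregation of the article are omitted here, and the 3x3 matrix is hypothetical.

```python
# Sketch of the standard DEMATEL computation (not the article's group
# aggregation): normalize the direct-influence matrix D, form the total
# relation matrix T = D (I - D)^-1, then derive prominence, cause degree,
# and normalized weights. The example matrix is hypothetical.

def gauss_jordan_inverse(A):
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]

def dematel_weights(direct):
    n = len(direct)
    s = max(sum(row) for row in direct)  # normalize by largest row sum
    D = [[v / s for v in row] for row in direct]
    I_minus_D = [[(1.0 if i == j else 0.0) - D[i][j] for j in range(n)] for i in range(n)]
    inv = gauss_jordan_inverse(I_minus_D)
    T = [[sum(D[i][k] * inv[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
    R = [sum(T[i][j] for j in range(n)) for i in range(n)]  # influence given
    C = [sum(T[i][j] for i in range(n)) for j in range(n)]  # influence received
    prominence = [r + c for r, c in zip(R, C)]              # centrality
    cause = [r - c for r, c in zip(R, C)]                   # cause/effect degree
    total = sum(prominence)
    return [p / total for p in prominence], cause

weights, cause = dematel_weights([[0, 3, 2], [1, 0, 2], [1, 2, 0]])
print(weights, cause)
```

Weights derived this way are proportional to prominence, one common convention; the article's contribution is in how the group matrices are made commensurable before this step.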

17.
Concerning the essence of risk, we suggest a new definition: a scene in the future associated with some adverse incident. In many cases, risks are rather fuzzy to our perception because of a shortage of knowledge or information about the systems that determine the adverse incidents. We introduce a concept of fuzzy risk based on this new definition and on fuzzy sets, and we suggest a fuzzy average algorithm for updating a fuzzy risk that retains all information from the original data. To illustrate the algorithm, we update a soft risk map of flooding in which the fuzzy risks are calculated using the interior-outer-set model.
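A very loose sketch of the averaging idea is given below; it represents a fuzzy risk as membership grades over discrete risk levels and averages the stored history pointwise. This is my simplification, not the interior-outer-set model itself, and the membership grades are hypothetical.

```python
# Loose sketch (not the interior-outer-set model): a fuzzy risk is a vector of
# membership grades over discrete risk levels, and updating averages the full
# stored history so no original information is discarded. Grades are hypothetical.

def fuzzy_average(history):
    """Pointwise average of a list of membership vectors over risk levels."""
    n = len(history)
    return [sum(vec[i] for vec in history) / n for i in range(len(history[0]))]

history = [
    [0.1, 0.7, 0.2],  # membership over (low, medium, high) risk, year 1
    [0.3, 0.5, 0.2],  # year 2
]
print(fuzzy_average(history))
```

Keeping the history rather than only the running average is what lets the update "store all information from the original data."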

18.
Modeling for Risk Assessment of Neurotoxic Effects
The regulation of noncancer toxicants, including neurotoxicants, has usually been based upon a reference dose (allowable daily intake). A reference dose is obtained by dividing a no-observed-effect level by uncertainty (safety) factors to account for intraspecies and interspecies sensitivities to a chemical. It is assumed that the risk at the reference dose is negligible, but no attempt generally is made to estimate the risk at the reference dose. A procedure is outlined that provides estimates of risk as a function of dose. The first step is to establish a mathematical relationship between a biological effect and the dose of a chemical. Knowledge of biological mechanisms and/or pharmacokinetics can assist in the choice of plausible mathematical models. The mathematical model provides estimates of average responses as a function of dose. Secondly, estimates of risk require selection of a distribution of individual responses about the average response given by the mathematical model. In the case of a normal or lognormal distribution, only an estimate of the standard deviation is needed. The third step is to define an adverse level for a response so that the probability (risk) of exceeding that level can be estimated as a function of dose. Because a firm response level often cannot be established at which adverse biological effects occur, it may be necessary to at least establish an abnormal response level that only a small proportion of individuals would exceed in an unexposed group. That is, if a normal range of responses can be established, then the probability (risk) of abnormal responses can be estimated. In order to illustrate this process, measures of the neurotransmitter serotonin and its metabolite 5-hydroxyindoleacetic acid in specific areas of the brain of rats and monkeys are analyzed after exposure to the neurotoxicant methylene-dioxymethamphetamine. These risk estimates are compared with risk estimates from the quantal approach in which animals are classified as either abnormal or not depending upon abnormal serotonin levels.
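The second and third steps described above can be sketched with a normal response distribution. The numbers below are hypothetical, not the serotonin data: "abnormal" is set at the 1st percentile of the unexposed group, and the risk at a dose is the probability of falling below that cutoff.

```python
# Sketch of the outlined procedure with hypothetical numbers: given a
# model-predicted mean response at a dose and a response SD, define "abnormal"
# as falling below a low percentile of the unexposed distribution, then report
# the probability (risk) of an abnormal response at that dose.
from statistics import NormalDist

def risk_of_abnormal(mean_at_dose, sd, control_mean, control_sd, pctl=0.01):
    cutoff = NormalDist(control_mean, control_sd).inv_cdf(pctl)  # abnormal level
    return NormalDist(mean_at_dose, sd).cdf(cutoff)              # P(response < cutoff)

# Unexposed group: mean 100, SD 10, so background risk is 1% by construction.
# A dose shifting the mean down to 85 raises the risk of abnormality sharply.
print(risk_of_abnormal(100, 10, 100, 10))
print(risk_of_abnormal(85, 10, 100, 10))
```

Repeating this at each dose traces out the risk-versus-dose curve the abstract contrasts with the quantal approach.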

19.
Quantitative risk analysis is being extensively employed to support policymakers and provides a strong conceptual framework for evaluating decision alternatives under uncertainty. Many problems involving environmental risks are, however, of a spatial nature, i.e., containing spatial impacts, spatial vulnerabilities, and spatial risk‐mitigation alternatives. Recent developments in multicriteria spatial analysis have enabled the assessment and aggregation of multiple impacts, supporting policymakers in spatial evaluation problems. However, recent attempts to conduct spatial multicriteria risk analysis have generally been weakly conceptualized, without adequate roots in quantitative risk analysis. Moreover, assessments of spatial risk often neglect the multidimensional nature of spatial impacts (e.g., social, economic, human) that are typically occurring in such decision problems. The aim of this article is therefore to suggest a conceptual quantitative framework for environmental multicriteria spatial risk analysis based on expected multi‐attribute utility theory. The framework proposes: (i) the formal assessment of multiple spatial impacts; (ii) the aggregation of these multiple spatial impacts; (iii) the assessment of spatial vulnerabilities and probabilities of occurrence of adverse events; (iv) the computation of spatial risks; (v) the assessment of spatial risk mitigation alternatives; and (vi) the design and comparison of spatial risk mitigation alternatives (e.g., reductions of vulnerabilities and/or impacts). We illustrate the use of the framework in practice with a case study based on a flood‐prone area in northern Italy.  
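Steps (i)-(iv) of the framework can be sketched as a toy computation: per spatial cell, aggregate the impact dimensions with additive weights, scale by vulnerability and event probability, and sum. The weights, cells, and additive aggregation are my simplifying assumptions, not the article's calibrated multi-attribute utility model.

```python
# Toy sketch of steps (i)-(iv): weighted additive aggregation of multiple
# impact dimensions per cell, scaled by vulnerability and event probability,
# summed over the study area. Weights and cell values are hypothetical.

WEIGHTS = {"social": 0.3, "economic": 0.5, "human": 0.2}

def cell_risk(prob, vulnerability, impacts):
    """Expected aggregated impact for one spatial cell."""
    aggregated = sum(WEIGHTS[k] * impacts[k] for k in WEIGHTS)
    return prob * vulnerability * aggregated

def total_spatial_risk(cells):
    return sum(cell_risk(**c) for c in cells)

cells = [
    {"prob": 0.02, "vulnerability": 0.8, "impacts": {"social": 10, "economic": 50, "human": 5}},
    {"prob": 0.02, "vulnerability": 0.3, "impacts": {"social": 2, "economic": 20, "human": 1}},
]
print(total_spatial_risk(cells))
```

Mitigation alternatives (steps v-vi) can then be compared by recomputing this total with reduced vulnerabilities or impacts.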

20.
The research described here is part of a larger risk assessment project to aid the U.S. Environmental Protection Agency (EPA) in its review of the primary National Ambient Air Quality Standard for lead. The methodology can be applied to many situations in which a policy decision about a toxic substance is required in the face of incomplete data. Numerical results are presented for three potentially adverse lead-induced effects of interest to EPA: elevated erythrocyte protoporphyrin (EP), hemoglobin (Hb) decrement, and intelligence quotient (IQ) decrement.
