Similar Documents
20 similar documents found (search time: 31 ms)
1.
2.
Various methods for risk characterization have been developed using probabilistic approaches. Data on Vietnamese farmers are available for the comparison of outcomes for risk characterization using different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose-response data and probabilistic techniques obtained from a case study with rice farmers in Vietnam. Urine samples were collected from farmers and analyzed for trichloropyridinol (TCP), which was converted into absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose-adverse health response relationships. The health risk of chlorpyrifos was quantified using hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1. The MCS method indicated less than 0.05% of the population would be affected, while the ORP method indicated that less than 1.5% of the population would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5. The risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by the ORP method was 33%. The MCS and ORP methods have advantages in risk characterization because they use the full distributions of both exposure and dose-response data, whereas the HQ method uses only the exposure data distribution. These evaluations indicated that single-event spraying is likely to have adverse effects on Vietnamese rice farmers.
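The contrast between the HQ and MCS approaches described above can be sketched in a few lines. This is a minimal illustration with assumed lognormal distributions and an assumed reference dose; the parameter values are placeholders, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed illustrative values, not the study's actual data.
RFD = 0.0003  # reference dose, mg/kg-day (assumed)
exposure = rng.lognormal(mean=np.log(2e-4), sigma=0.8, size=100_000)      # absorbed daily dose
effect_dose = rng.lognormal(mean=np.log(5e-3), sigma=0.5, size=100_000)   # dose causing adverse response

# Hazard quotient: ratio of exposure to a single reference dose,
# so only the exposure distribution enters the calculation.
hq = exposure / RFD

# Monte Carlo simulation: fraction of paired draws where the sampled
# exposure exceeds the sampled adverse-effect dose, using the full
# distributions of both exposure and dose-response data.
frac_affected = np.mean(exposure > effect_dose)

print(f"HQ 5th-95th percentile: {np.percentile(hq, 5):.2f}-{np.percentile(hq, 95):.2f}")
print(f"MCS fraction affected: {frac_affected:.4f}")
```

Because the MCS step samples both distributions, a population can show HQ values above 1 for some individuals while the simulated fraction affected remains small, which mirrors the divergence between methods reported in the abstract.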

3.
Listeria monocytogenes is a leading cause of hospitalization, fetal loss, and death due to foodborne illnesses in the United States. A quantitative assessment of the relative risk of listeriosis associated with the consumption of 23 selected categories of ready-to-eat foods, published by the U.S. Department of Health and Human Services and the U.S. Department of Agriculture in 2003, has been instrumental in identifying the food products and practices that pose the greatest listeriosis risk and has guided the evaluation of potential intervention strategies. Dose-response models, which quantify the relationship between an exposure dose and the probability of adverse health outcomes, were essential components of the risk assessment. However, because of data gaps and limitations in the available data and modeling approaches, considerable uncertainty existed. Since publication of the risk assessment, new data have become available for modeling L. monocytogenes dose-response. At the same time, recent advances in the understanding of L. monocytogenes pathophysiology and strain diversity have warranted a critical reevaluation of the published dose-response models. To discuss strategies for modeling L. monocytogenes dose-response, the Interagency Risk Assessment Consortium (IRAC) and the Joint Institute for Food Safety and Applied Nutrition (JIFSAN) held a scientific workshop in 2011 (details available at http://foodrisk.org/irac/events/). The main findings of the workshop and the most current and relevant data identified during the workshop are summarized and presented in the context of L. monocytogenes dose-response. This article also discusses new insights on dose-response modeling for L. monocytogenes and research opportunities to meet future needs.

4.
《Risk analysis》2018,38(5):1052-1069
This study investigated whether, in the absence of chronic noncancer toxicity data, short-term noncancer toxicity data can be used to predict chronic toxicity effect levels by focusing on the dose-response relationship instead of a critical effect. Data from National Toxicology Program (NTP) technical reports have been extracted and modeled using the Environmental Protection Agency's Benchmark Dose Software. Best-fit, minimum benchmark dose (BMD), and benchmark dose lower limits (BMDLs) have been modeled for all NTP pathologist identified significant nonneoplastic lesions, final mean body weight, and mean organ weight of 41 chemicals tested by NTP between 2000 and 2012. Models were then developed at the chemical level using orthogonal regression techniques to predict chronic (two years) noncancer health effect levels using the results of the short-term (three months) toxicity data. The findings indicate that short-term animal studies may reasonably provide a quantitative estimate of a chronic BMD or BMDL. This can allow for faster development of human health toxicity values for risk assessment for chemicals that lack chronic toxicity data.
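The benchmark-dose workflow referenced above can be illustrated with a toy calculation. This sketch fits a one-parameter exponential model to an assumed quantal dataset by a crude grid-search maximum likelihood (the EPA's Benchmark Dose Software uses a suite of models and proper optimizers) and then solves for the dose giving 10% extra risk; the dataset and model choice are illustrative assumptions.

```python
import numpy as np

# Hypothetical quantal dataset (dose, number tested, number responding).
dose = np.array([0.0, 10.0, 30.0, 100.0])
n    = np.array([50, 50, 50, 50])
resp = np.array([1, 5, 12, 35])

def nll(k):
    """Negative log-likelihood of the one-parameter exponential model
    P(d) = 1 - exp(-k d); background response is ignored for simplicity."""
    p = np.clip(1.0 - np.exp(-k * dose), 1e-9, 1 - 1e-9)
    return -np.sum(resp * np.log(p) + (n - resp) * np.log(1.0 - p))

# Crude 1-D maximum-likelihood fit by grid search.
grid = np.linspace(1e-5, 0.1, 20_000)
k_hat = grid[np.argmin([nll(k) for k in grid])]

# Benchmark dose at 10% extra risk: solve 1 - exp(-k * BMD) = 0.10.
bmd10 = -np.log(0.90) / k_hat
print(f"k_hat = {k_hat:.5f}, BMD10 = {bmd10:.1f}")
```

A BMDL would additionally require a lower confidence bound on the fitted curve (e.g., by profile likelihood), which is omitted here to keep the sketch short.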

5.
Drawing on upper echelons theory, we argue that there will be an inverted U-curve-shaped relationship between the top management team's (TMT's) level of international experience and a firm's internationalization speed. Accounting for the role of executive job demands highlighted in upper echelons theory, we further suggest that competitive pressure, product diversification and geographic scope moderate the relationship between TMT international experience and internationalization speed by increasing the demands of TMT managers' jobs. Using data on the international expansion of 91 retailers between 2003 and 2012, we find empirical support for the inverted U-curve-shaped effect of TMT international experience and the moderating role of competitive pressure. We find no moderating effect of product diversification or geographic scope.

6.
This article presents a regression-tree-based meta-analysis of rodent pulmonary toxicity studies of uncoated, nonfunctionalized carbon nanotube (CNT) exposure. The resulting analysis provides quantitative estimates of the contribution of CNT attributes (impurities, physical dimensions, and aggregation) to pulmonary toxicity indicators in bronchoalveolar lavage fluid: neutrophil and macrophage count, and lactate dehydrogenase and total protein concentrations. The method employs classification and regression tree (CART) models, techniques that are relatively insensitive to data defects that impair other types of regression analysis: high dimensionality, nonlinearity, correlated variables, and significant quantities of missing values. Three types of analysis are presented: the regression tree (RT), the random forest (RF), and a random-forest-based dose-response model. The RT shows the best single model supported by all the data and typically contains a small number of variables. The RF shows how much variance reduction is associated with every variable in the data set. The dose-response model is used to isolate the effects of CNT attributes from the CNT dose, showing the shift in the dose-response caused by the attribute across the measured range of CNT doses. It was found that the CNT attributes that contribute the most to pulmonary toxicity were metallic impurities (cobalt significantly increased observed toxicity, while other impurities had mixed effects), CNT length (negatively correlated with most toxicity indicators), CNT diameter (significantly positively associated with toxicity), and aggregate size (negatively correlated with cell damage indicators and positively correlated with immune response indicators). Increasing CNT N2-BET-specific surface area decreased toxicity indicators.
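The variance-reduction criterion at the heart of CART, mentioned above, can be shown with a single-split example. This is a minimal sketch on synthetic data (feature names and the split rule are assumptions for illustration): it exhaustively searches every feature and threshold for the split that most reduces the sum of squared errors of a toxicity indicator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a toxicity indicator driven mostly by a single "impurity" feature.
X = rng.uniform(0, 1, size=(200, 3))  # columns: impurity, length, diameter (assumed)
y = 2.0 * (X[:, 0] > 0.5) + 0.1 * rng.standard_normal(200)

def best_split(X, y):
    """Exhaustive CART-style search for the (feature, threshold) pair that
    maximizes variance reduction, i.e., minimizes the weighted child SSE."""
    base_sse = np.sum((y - y.mean()) ** 2)
    best = (None, None, 0.0)  # (feature index, threshold, gain)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            sse = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
            gain = base_sse - sse
            if gain > best[2]:
                best = (j, t, gain)
    return best

feature, threshold, gain = best_split(X, y)
print(f"split on feature {feature} at {threshold:.3f} (variance reduction {gain:.1f})")
```

A regression tree applies this search recursively to each child node; a random forest averages many such trees built on bootstrap samples with random feature subsets, and the per-variable variance reduction summed across trees gives the importance ranking the abstract refers to.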

7.
The U.S. Environmental Protection Agency's (EPA) Integrated Risk Information System (IRIS) database, the authoritative source of U.S. risk assessment toxicity factors, currently lacks an oral reference dose (RfD) for copper. In the absence of such a value, various health-based reference values for copper are available for use in risk assessment. We summarize the scientific bases and differences in assumptions among key reference values for ingested copper to guide selection of appropriate values for risk assessment. A comprehensive review of the scientific literature best supports the oral RfD of 0.04 mg/kg body weight/day derived by EPA from their Drinking Water Action Level. This value is based on acute gastrointestinal effects but is further supported by broader analysis of copper deficiency and toxicity.

8.
The dose-response analyses of cancer and noncancer health effects of aldrin and dieldrin were evaluated using current methodology, including benchmark dose analysis and the current U.S. Environmental Protection Agency (U.S. EPA) guidance on body weight scaling and uncertainty factors. A literature review was performed to determine the most appropriate adverse effect endpoints. Using current methodology and information, the estimated reference dose values were 0.0001 and 0.00008 mg/kg-day for aldrin and dieldrin, respectively. The estimated cancer slope factors for aldrin and dieldrin were 3.4 and 7.0 (mg/kg-day)^-1, respectively (i.e., about 5- and 2.3-fold lower risk than the 1987 U.S. EPA assessments). Because aldrin and dieldrin are no longer used as pesticides in the United States, they are presumed to be a low priority for additional review by the U.S. EPA. However, because they are persistent and still detected in environmental samples, quantitative risk assessments based on the best available methods are required. Recent epidemiologic studies do not demonstrate a causal association between aldrin and dieldrin and human cancer risk. The proposed reevaluations suggest that these two compounds pose a lower human health risk than currently reported by the U.S. EPA.
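The two mechanics mentioned above, dividing a point of departure by uncertainty factors to get an RfD and scaling animal doses by body weight to the 3/4 power, are simple arithmetic worth making explicit. All input values below are assumed for illustration, not the paper's fitted numbers.

```python
# Reference dose: point of departure divided by the product of uncertainty factors.
pod_mg_kg_day = 0.025  # e.g., a BMDL from an animal study (assumed value)
uf_interspecies, uf_intraspecies, uf_database = 10, 10, 3
rfd = pod_mg_kg_day / (uf_interspecies * uf_intraspecies * uf_database)

# Body-weight-3/4 scaling of an animal dose (mg/kg-day) to a human-equivalent
# dose: HED = animal dose * (BW_animal / BW_human) ** (1/4), reflecting
# cross-species scaling guidance for oral doses.
bw_animal_kg, bw_human_kg = 0.25, 70.0  # rat and human body weights (assumed)
animal_dose = 1.0                       # mg/kg-day (assumed)
human_equiv_dose = animal_dose * (bw_animal_kg / bw_human_kg) ** 0.25

print(f"RfD = {rfd:.2e} mg/kg-day; HED = {human_equiv_dose:.3f} mg/kg-day")
```

Note the exponent 1/4 on the body-weight ratio: doses expressed per kg of body weight scale as BW^(3/4)/BW = BW^(-1/4), which is why the smaller animal's mg/kg-day dose shrinks when converted to a human equivalent.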

9.
This paper studies a shape-invariant Engel curve system with endogenous total expenditure, in which the shape-invariant specification involves a common shift parameter for each demographic group in a pooled system of nonparametric Engel curves. We focus on the identification and estimation of both the nonparametric shapes of the Engel curves and the parametric specification of the demographic scaling parameters. The identification condition relates to the bounded completeness and the estimation procedure applies the sieve minimum distance estimation of conditional moment restrictions, allowing for endogeneity. We establish a new root mean squared convergence rate for the nonparametric instrumental variable regression when the endogenous regressor could have unbounded support. Root-n asymptotic normality and semiparametric efficiency of the parametric components are also given under a set of "low-level" sufficient conditions. Our empirical application using the U.K. Family Expenditure Survey shows the importance of adjusting for endogeneity in terms of both the nonparametric curvatures and the demographic parameters of systems of Engel curves.

10.
We examine the role of knowledge diversity among unit members in an organizational unit's productivity. Utilizing a proprietary data set of corrective maintenance tasks from a large software-services firm, we investigate the impact of two key within-unit diversity metrics: interpersonal diversity and intrapersonal diversity. We analyze the independent influence of interpersonal diversity and the interactive influence of interpersonal diversity and intrapersonal diversity on an organizational unit's productivity. Finally, we examine how diversity moderates the productivity of an organizational unit when employee turnover occurs. Our analysis reveals the following key insights: (a) interpersonal diversity has an inverted U-shaped effect on an organizational unit's productivity; (b) intrapersonal diversity moderates the influence of interpersonal diversity on organizational-unit productivity; (c) at higher levels of interpersonal diversity, the rate of decrease in an organizational unit's productivity due to turnover is higher. We discuss the resulting theoretical and managerial insights associated with these findings.

11.
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high-throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline-based meta-regression can be used to integrate data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk-specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta-regression, may allow risk assessors to identify points of departure and risk-specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods.

12.
13.
Quantitative risk assessments for physical, chemical, biological, occupational, or environmental agents rely on scientific studies to support their conclusions. These studies often include relatively few observations, and, as a result, models used to characterize the risk may include large amounts of uncertainty. The motivation, development, and assessment of new methods for risk assessment is facilitated by the availability of a set of experimental studies that span a range of dose-response patterns that are observed in practice. We describe construction of such a historical database focusing on quantal data in chemical risk assessment, and we employ this database to develop priors in Bayesian analyses. The database is assembled from a variety of existing toxicological data sources and contains 733 separate quantal dose-response data sets. As an illustration of the database's use, prior distributions for individual model parameters in Bayesian dose-response analysis are constructed. Results indicate that including prior information based on curated historical data in quantitative risk assessments may help stabilize eventual point estimates, producing dose-response functions that are more stable and precisely estimated. These in turn produce potency estimates that share the same benefit. We are confident that quantitative risk analysts will find many other applications and issues to explore using this database.
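One simple way to turn a historical database into a prior, in the spirit of the approach described above, is to fit the same dose-response model to each historical dataset and treat the spread of fitted log-potencies as a normal prior for new analyses. The sketch below uses three invented quantal datasets and a one-parameter exponential model; both are assumptions for illustration, not the 733-dataset database or the paper's actual prior construction.

```python
import numpy as np

# Hypothetical historical database: each entry is (doses, n tested, responders).
historical = [
    (np.array([0, 5, 20, 80]),   np.array([40] * 4), np.array([0, 3, 10, 30])),
    (np.array([0, 10, 40, 160]), np.array([50] * 4), np.array([1, 4, 15, 40])),
    (np.array([0, 2, 8, 32]),    np.array([30] * 4), np.array([0, 2, 6, 20])),
]

def fit_k(dose, n, resp):
    """Grid-search MLE of k in P(d) = 1 - exp(-k d)."""
    grid = np.linspace(1e-4, 1.0, 5000)
    def nll(k):
        p = np.clip(1 - np.exp(-k * dose), 1e-9, 1 - 1e-9)
        return -np.sum(resp * np.log(p) + (n - resp) * np.log(1 - p))
    return grid[np.argmin([nll(k) for k in grid])]

# The empirical distribution of log-potency across historical studies
# summarizes "what plausible potencies look like" and can serve as a
# normal prior on log k in a new Bayesian dose-response fit.
log_ks = np.log([fit_k(*d) for d in historical])
prior_mu, prior_sigma = log_ks.mean(), log_ks.std(ddof=1)
print(f"prior on log k: Normal({prior_mu:.2f}, {prior_sigma:.2f}^2)")
```

In a full analysis one would use all available historical datasets and a richer model family, but the stabilizing effect the abstract describes comes from exactly this mechanism: the prior pulls poorly identified parameters toward historically plausible values.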

14.
This study examines how time spent in problem definition affects problem solving in projects such as Six Sigma projects. Our hypotheses are tested using data collected from 1558 Six Sigma projects in a company. The results show evidence of a U-shaped relationship between the amount of time spent in the Define phase and project duration. This finding suggests that spending too little time on problem definition potentially causes poor problem formulation, which leads to deficient problem solving and lengthens overall project time. On the other hand, too much time spent on problem definition can lead to unneeded delays in project completion due to diminishing returns on problem definition efforts. Furthermore, the optimal balance between spending too little and too much time depends on prior project experience and project complexity. Prior project experience reduced project completion time and weakened the U-shaped effect. Conversely, complex projects took longer and appeared to show some evidence of a stronger U-shaped effect; this suggests balancing the time spent in the Define phase was more challenging for complex projects. Our study also underscores the importance of managing project duration, as projects that were completed faster tended to be associated with higher project savings.

15.
The U.S. electric power system is increasingly vulnerable to the adverse impacts of extreme climate events. Supply inadequacy risk can result from climate-induced shifts in electricity demand and/or damaged physical assets due to hydro-meteorological hazards and climate change. In this article, we focus on the risks associated with the unanticipated climate-induced demand shifts and propose a data-driven approach to identify risk factors that render the electricity sector vulnerable in the face of future climate variability and change. More specifically, we have leveraged advanced supervised learning theory to identify the key predictors of climate-sensitive demand in the residential, commercial, and industrial sectors. Our analysis indicates that variation in mean dew point temperature is the common major risk factor across all three sectors. We have also conducted a statistical sensitivity analysis to assess the variability in the projected demand as a function of the key climate risk factor. We then propose the use of scenario-based heat maps as a tool to communicate the inadequacy risks to stakeholders and decisionmakers. While we use the state of Ohio as a case study, our proposed approach is equally applicable to all other states.

16.
The effect of bioaerosol size was incorporated into predictive dose-response models for the effects of inhaled aerosols of Francisella tularensis (the causative agent of tularemia) on rhesus monkeys and guinea pigs, with bioaerosol diameters ranging between 1.0 and 24 μm. Aerosol-size-dependent models were formulated as modifications of the exponential and β-Poisson dose-response models, and model parameters were estimated using maximum likelihood methods and multiple data sets of quantal dose-response data for which aerosol sizes of inhaled doses were known. The F. tularensis dose-response data were best fit by an exponential dose-response model in which a power function of particle diameter substitutes for the rate parameter k scaling the applied dose. The pathogen-specific aerosol-size-dependence equation and models represented the observed dose-response results better than estimates derived from applying the model developed by the International Commission on Radiological Protection (ICRP, 1994), which relies on differential regional lung deposition for human particle exposure.
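The model structure described above, an exponential dose-response whose rate parameter is a power function of particle diameter, is compact enough to write out directly. The functional form follows the abstract; the parameter values and the sign of the diameter exponent are assumptions chosen so that larger aerosols are less potent per inhaled organism.

```python
import math

# Assumed illustrative parameters, not the study's fitted values:
# K0 is the potency at 1 um diameter; B is the diameter exponent.
K0, B = 0.005, 0.9

def p_response(dose_cfu, diameter_um):
    """Aerosol-size-dependent exponential model:
    P(response) = 1 - exp(-k(d) * dose), with k(d) = K0 * d**(-B)."""
    k = K0 * diameter_um ** (-B)
    return 1.0 - math.exp(-k * dose_cfu)

# Under these assumed parameters, the same inhaled dose is far less
# likely to cause a response when delivered in large aerosols.
p_small = p_response(1000, 1.0)
p_large = p_response(1000, 12.0)
print(f"P(1 um) = {p_small:.3f}, P(12 um) = {p_large:.3f}")
```

Fitting K0 and B jointly by maximum likelihood across datasets with known aerosol sizes, as the study does, lets one dose-response surface cover the whole 1.0 to 24 μm range instead of fitting a separate curve per particle size.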

17.
Microbial food safety risk assessment models can often be simplified by eliminating the need to integrate a complex dose-response relationship across a distribution of exposure doses. This is possible if exposure pathways lead to exposure doses that consistently have a small probability of causing illness. In this situation, the probability of illness will follow an approximately linear function of dose. Consequently, the predicted probability of illness per serving across all exposures is linear with respect to the expected value of dose. The majority of dose-response functions are approximately linear when the dose is low. Nevertheless, what constitutes "low" depends on the parameters of the dose-response function for a particular pathogen. In this study, a method is proposed to determine an upper bound of the exposure distribution for which the use of a linear dose-response function is acceptable. If this upper bound is substantially larger than the expected value of exposure doses, then a linear approximation for probability of illness is reasonable. If conditions are appropriate for using the linear dose-response approximation, for example, the expected value for exposure doses is two to three log10 units smaller than the upper bound of the linear portion of the dose-response function, then predicting the risk-reducing effectiveness of a proposed policy is trivial. Simple examples illustrate how this approximation can be used to inform policy decisions and improve an analyst's understanding of risk.
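The idea of an upper bound on the linear region can be made concrete with the single-hit exponential model, for which the low-dose slope equals the per-organism infection probability. This sketch (the model choice, slope value, and 10% tolerance are all assumptions for illustration) scans upward in dose until the linear approximation drifts more than 10% from the exact curve.

```python
import math

R = 0.005  # assumed low-dose slope (per organism) of an exponential model

def p_exact(dose):
    """Exact single-hit probability of illness."""
    return 1.0 - math.exp(-R * dose)

def p_linear(dose):
    """Low-dose linear approximation: risk = slope * dose."""
    return R * dose

# Scan doses geometrically to find the upper bound of the region where
# the linear approximation stays within 10% of the exact probability.
dose, upper_bound = 1e-3, None
while dose < 1e6:
    if (p_linear(dose) - p_exact(dose)) / p_exact(dose) > 0.10:
        break
    upper_bound = dose
    dose *= 1.1

print(f"linear approximation acceptable up to ~{upper_bound:.1f} organisms")
```

If the mean of the exposure distribution sits two to three log10 units below this bound, the expected risk per serving reduces to slope times mean dose, which is what makes evaluating a dose-reducing policy "trivial" in the abstract's sense.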

18.
Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity of operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods will be demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management.

19.
Upper Confidence Limits on Excess Risk for Quantitative Responses (cited 8 times: 0 self-citations, 8 by others)
The definition and observation of clear-cut adverse health effects for continuous (quantitative) responses, such as altered body weights or organ weights, are difficult propositions. Thus, methods of risk assessment commonly used for binary (quantal) toxic responses such as cancer are not directly applicable. In this paper, two methods for calculating upper confidence limits on excess risk for quantitative toxic effects are proposed, based on a particular definition of an adverse quantitative response. The methods are illustrated with data from a dose-response study, and their performance is evaluated with a Monte Carlo simulation study.
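To make the quantal-vs-quantitative distinction concrete: one common way to define an "adverse" continuous response is a value falling below an extreme quantile of the control distribution, after which excess risk and its confidence limit can be computed. The sketch below uses invented organ-weight data, a normal model, and a nonparametric bootstrap; all of these are illustrative assumptions, not the paper's two proposed methods.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)

# Hypothetical continuous-response data: organ weights for a control
# group and one dosed group (values assumed for illustration).
control = rng.normal(10.0, 1.0, size=30)
dosed   = rng.normal(9.2, 1.0, size=30)

def p_below(x, cutoff):
    """Normal-model probability of a response falling below the cutoff."""
    mu, sd = x.mean(), x.std(ddof=1)
    return 0.5 * (1 + erf((cutoff - mu) / (sd * sqrt(2))))

def excess_risk(ctrl, trt):
    """Adverse response defined as falling below the control mean minus
    2.326 SD (roughly the control 1st percentile under normality)."""
    cutoff = ctrl.mean() - 2.326 * ctrl.std(ddof=1)
    return p_below(trt, cutoff) - p_below(ctrl, cutoff)

# Nonparametric bootstrap for an upper confidence limit on excess risk.
boots = [excess_risk(rng.choice(control, 30), rng.choice(dosed, 30))
         for _ in range(2000)]
ucl95 = float(np.percentile(boots, 95))
print(f"excess risk {excess_risk(control, dosed):.3f}, 95% UCL {ucl95:.3f}")
```

The cutoff converts the continuous endpoint into an implicit quantal one, which is exactly why a definition of "adverse" must come first: without it, neither excess risk nor its upper confidence limit is well defined.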

20.
This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter, rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels (<100), estimation error is small for the fractional Poisson model; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures.
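The fractional Poisson structure described above, infection probability as the product of the probability of nonzero exposure and the fraction of susceptible hosts, fits in a few lines. The functional form follows the abstract's description; the susceptible fraction and mean aggregate size below are assumed placeholder values, not the fitted NoV parameters.

```python
import math

F_SUSCEPTIBLE  = 0.72   # fraction of perfectly susceptible hosts (assumed)
MEAN_AGGREGATE = 10.0   # mean virions per aggregate (assumed)

def p_infection(mean_dose):
    """Fractional Poisson model: P(inf) = f * P(at least one aggregate ingested),
    with Poisson-distributed aggregate counts of mean dose / aggregate size."""
    p_nonzero_exposure = 1.0 - math.exp(-mean_dose / MEAN_AGGREGATE)
    return F_SUSCEPTIBLE * p_nonzero_exposure

# Infection probability rises with dose and saturates at the susceptible
# fraction f, since immune hosts never become infected at any dose.
low, high = p_infection(1.0), p_infection(1e6)
print(f"P(inf) at dose 1: {low:.4f}; at dose 1e6: {high:.4f}")
```

The saturation at f rather than at 1 is what distinguishes this model from the exponential and beta-Poisson forms, and it is why a single susceptibility parameter can substitute for the beta-Poisson's two-parameter susceptibility distribution.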

